Essential Guide

How to develop software the secure, Gary McGraw way


McGraw: Software [in]security and scaling automated code review

Gary McGraw and Jim Routh talk through the pitfalls of scaling static source code review and offer some potential process improvements.

Coauthored by Jim Routh, CISO, Global Information Security function leader, Aetna


Not too many years ago, most firms tackling software security were concerned with whether to automate secure code review. No longer. Today, leading firms know they must automate code review; their concern now is how best to scale that automation.

If your firm has yet to adopt a static analysis code review tool, you are way behind. On the other hand, if you already have a tool or two and you are trying to scale, well, we're here to help.

Describing the state of the practice

Figure 1: 11 activities make up the BSIMM code review practice.

As a descriptive model of software security, the Building Security in Maturity Model (BSIMM) provides a window on many practices in software security, including code review. Today we know for a fact that there are 26 software vendors that take software security seriously enough to measure the maturity of their software security initiatives, joined by more than 25 financial service firms doing the same. Even better, we know which software security activities (out of the 112 documented in BSIMM) are utilized by exactly how many of the 67 firms in the BSIMM Community. And we know there are at least 272,358 developers benefiting from real-world software security initiatives.

So, what does the BSIMM have to say about code review as a practice? Glad you asked -- have a look at figures 1 and 2.

Figure 2: Of the 67 firms in the BSIMM study, over 50 take advantage of automated code review.

These hard data from the BSIMM can be humanized by citing particular stories and experience.  One of us (Jim Routh) has been directly involved in leading five distinct software security initiatives in five different firms, from American Express to JP Morgan Chase to Aetna. Let's take a look at the evolution of code review through the lens of his experience.

How code review has changed

Back in his American Express days a decade ago, Jim was starting from scratch with little to go on. Resources for improving and measuring software security were extremely limited, and tools were rudimentary, if they existed at all. You could count the firms automating code review back then on one hand.

Things have changed. Today Jim has the benefit of the "scar tissue" (aka practical experience) gained by deploying software security in five different organizations. As a result, he is coming to know what effective practices look like, not to mention which approaches to avoid based on previous failures. 

A note on code review

As we wrote this article (using Microsoft Word), we sometimes encountered words with a red squiggly underline indicating a spelling error (especially when McGraw was typing). When we saw such a squiggle, we'd go back and correct the error as we completed the sentence.

Microsoft Word was alerting us to defects as we wrote the document, enabling us to fix the spelling errors so that you had a fighting chance of understanding the words and ultimately grokking what we had to say.

We would love to have a code review tool that enables application developers to see bugs in a similar fashion. The idea is to identify bugs as developers are writing code, so they can both correct them and learn how to avoid them going forward. Several of the tool vendors in the code review space have lightweight tools that attempt to offer that kind of approach, Cigital's SecureAssist being one specific example. In this article, we call those tools "IDE-based code review tools."
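To make the squiggly-underline analogy concrete, here is a minimal sketch (ours, not taken from SecureAssist or any other product) of the kind of bug such an IDE-based tool typically flags as a developer types, along with the fix its inline guidance would steer the developer toward. The class and method names are hypothetical.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Hypothetical example of the kind of bug an IDE-based checker flags inline.
public class UserLookup {

    // BAD: user input concatenated into SQL -- a classic injection bug that
    // lightweight static analysis can flag the moment it is typed.
    public ResultSet findUserUnsafe(Connection conn, String username) throws SQLException {
        String sql = "SELECT * FROM users WHERE name = '" + username + "'";
        return conn.createStatement().executeQuery(sql);
    }

    // BETTER: the parameterized form the tool's inline guidance would steer
    // the developer toward, fixing the bug and teaching the pattern.
    public ResultSet findUserSafe(Connection conn, String username) throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT * FROM users WHERE name = ?");
        ps.setString(1, username);
        return ps.executeQuery();
    }
}

The point is the feedback loop: the developer sees the finding while the code is still in front of her, fixes it, and is less likely to write the same bug in the next sprint.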

As an example, Jim knows that today, developers all over the world embrace the opportunity to learn software security techniques, whereas only 10 years ago they challenged the very idea with a "you do your job, I'll do mine" attitude. Jim has learned a great deal about why developers prefer to leverage frameworks and tools to help them understand how to build secure software in the first place. It turns out that developers don't just want their bugs identified after the fact by security weenies. Rather, they want both to fix bugs as early as possible in the development process and to avoid creating bugs in future code.

This realization has influenced how Jim views the integration of code review tools into the development process.

Industrial-strength code review tools, such as HP Fortify, IBM AppScan, Coverity, Checkmarx and others, are much more mature and robust than the code review tools available 10 years ago. There are several distinct ways to integrate these types of tools into the development process and several factors to take into consideration (not least whether the development team relies on an agile SDLC or a more conventional one). As the tools have matured to cover a broad range of vulnerabilities, they have in general evolved for integration into a build process on a big build server. That means in some cases they may not be feasible for use at the developer desktop. Simply put, the industrial-strength tech eats a workstation alive.

Because of their size and their technical approach to the problem, industrial-strength tools have issues running in an IDE on a development server, workstation or VM. If a developer has to tie up her development workstation for two to three hours to run a scan on a single build component, her productivity diminishes while she waits for results. Centralizing code review on big iron can help, but it requires coordinating scan results with particular code globs.

This problem becomes severe with smallish agile sprints launching one after another in short time windows -- matching scan results with sprint versions can be a real pain. And there is a further issue. Sadly, all of the industrial-strength code review tools require output to be reviewed by someone knowledgeable in order to weed out false positives and prioritize bug information by severity. That doesn't scale either. To boil it all down, in some cases, industrial-strength code review tools can become a bottleneck, gating your development efficiency and making developers unhappy.
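To illustrate why matching scan results with sprint versions gets painful, here is a small, hypothetical sketch of the bookkeeping a centralized scanning setup ends up needing. None of the tools named above exposes this exact structure; the types and names below are ours.

import java.time.Instant;
import java.util.List;

// Hypothetical bookkeeping for centralized scan output: each batch of findings
// is pinned to the commit and sprint it was scanned from, so reviewers can tell
// which sprint's code a hours-old result actually describes.
public record ScanBatch(String component,
                        String commitHash,
                        String sprintId,
                        Instant scannedAt,
                        List<Finding> findings) {

    public record Finding(String ruleId, String file, int line, Severity severity) {}

    public enum Severity { CRITICAL, HIGH, MEDIUM, LOW }

    // True if these results still describe the commit a sprint is about to ship.
    public boolean matches(String candidateCommit) {
        return commitHash.equals(candidateCommit);
    }
}

Every sprint that ships before its scan results come back adds another batch to reconcile, which is exactly the bottleneck described above.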

Getting the developers to do it

A decade ago, Jim's original vision for code review (which remains unchanged today) was for a developer to use a tool without requiring a security specialist to look over her shoulder interpreting and prioritizing results. (For anyone wondering, the ITS4 tool McGraw wrote about in 1999 was integrated directly into Emacs and was intended for use by developers.)

Developers can help make code review scale by running scans as they code. The dilemma for CISOs and software security group leaders is to provide the flexibility that developers want with a robust approach for reducing information security risk. Tools integrated into an IDE are great stuff, but they can't find everything. So the question is how to remove the most common bugs prior to production through iterative testing and remediation while aligning with agile development.
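As a hypothetical illustration of that limit -- our example, not drawn from any particular tool -- consider the difference between a bug an IDE-based checker can flag from the single file in front of it and a design flaw that requires knowledge of the whole system:

// Hypothetical illustration of the limits of IDE-based checking.
public class PaymentService {

    // A local, pattern-matchable bug: an IDE-based tool can flag a hardcoded
    // secret in the file the developer currently has open.
    private static final String API_KEY = "sk_live_example_do_not_ship";

    // A design flaw: nothing in this method checks that the caller is allowed
    // to refund someone else's order. Spotting that requires knowledge of the
    // application's authorization model, which is spread across the codebase --
    // exactly the kind of defect the other controls are meant to catch.
    public void refund(String orderId, double amount) {
        // ... issue the refund ...
    }
}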

Four and a half history lessons

How has Jim approached code review over his five software security initiatives?  Like this:

Figure 3

The current operating premise for Jim's approach number 5 goes like this: an easy-to-use code review tool that teaches developers how to eliminate security bugs is easier to adopt broadly across diverse development teams, educates developers as they work, and provides a reasonable, measurable governance capability. Jim's most recent approach may eventually require adjustment to address bugs that can't be discovered using IDE-based code analysis.

The working hypothesis (which remains unproven) is that combining the IDE-based approach to code review with other controls will address the majority of software defects (bugs and flaws) and significantly reduce application portfolio risk, without the capital-intensive investment in a centralized code review scanning factory. To prove (or disprove) this hypothesis, Jim and his team are currently capturing data on the initiative's effectiveness.

The table below enumerates the other controls Jim is putting in place.

Figure 4

A separate set of controls is used for mobile application development.

Evolving and learning with code review that scales

Over the next year, Jim will be determining whether the investment mix for the controls above is appropriate and sufficient to meet future business needs. The reallocation of capital investment away from a centralized code review factory running an industrial-strength tool and toward controls that are in some sense less mature (open source scanning, easy-to-use IDE-based code review, network-based Web application security enforcement) is a work in progress. Determining the effectiveness of this set of controls is critical, because our goal is ultimately understanding how best to invest limited resources in software security.

Now that (almost) everyone understands the necessity of applying static analysis code review in a software security initiative, we are all busy optimizing performance and scalability tradeoffs. There is no doubt that code review tools will continue to evolve, resulting in simpler deployment models for developers. It is even likely that the industrial-strength tools will evolve to be more "agile friendly" and computationally less intensive, ultimately resulting in a combination of IDE-based tools and industrial-strength factories that is more attractive as a package.

In fact, much of software security is evolving. We have moved well past the "should we do it?" stage as a field and are now in the throes of making everything scale as efficiently as possible. As activities and practices evolve, the BSIMM evolves right along with them. Measurement, scalability and a focus on efficiency -- what's not to like about modern software security?

This was first published in January 2014
