Feature

BSIMM4 measures and advances secure application development


Co-authored by Sammy Migues and Jacob West

BSIMM4 is the fourth iteration of the Building Security In Maturity Model (BSIMM) project, a tool used to measure an organization's software security activities.

The fourth major release of the BSIMM project was published in September 2012. The original study (March 2009) includes nine firms and nine distinct measurements. The second study (May 2010) includes 30 firms and 42 distinct measurements (some firms include very large subsidiaries that are independently measured). BSIMM3 includes 42 firms, 11 of which were measured a second time, for a total set of 81 distinct measurements. The current fourth study includes 51 firms, 13 of which have been measured again for a total set of 95 distinct measurements.

Our work with the BSIMM model shows that measuring a firm's software security initiative is both possible and extremely useful. Enterprises can use BSIMM measurements to plan, structure and execute the evolution of a software security initiative. Over time, firms participating in the BSIMM project show measurable improvement in their software security initiatives.

BSIMM member organizations

  • Adobe
  • Aon
  • Bank of America
  • Box
  • Capital One
  • The Depository Trust & Clearing Corporation (DTCC)
  • EMC
  • F-Secure
  • Fannie Mae
  • Fidelity
  • Goldman Sachs
  • Intel
  • Intuit
  • JPMorgan Chase & Co.
  • Mashery
  • McKesson
  • Microsoft
  • Nokia
  • Nokia Siemens Networks
  • QUALCOMM
  • Rackspace
  • Salesforce
  • Sallie Mae
  • SAP
  • Scripps Networks
  • Sony Mobile
  • Standard Life
  • SWIFT
  • Symantec
  • Telecom Italia
  • Thomson Reuters
  • Vanguard
  • Visa
  • VMware
  • Wells Fargo
  • Zynga

BSIMM4 basics

The Building Security In Maturity Model (BSIMM, pronounced "bee simm") is an observation-based scientific model describing the collective software security activities of 51 software security initiatives. Thirty-six of the 51 firms we studied have graciously allowed us to use their names -- they are listed under "BSIMM member organizations" in the sidebar.

BSIMM4 is used as a measuring stick for software security. As such, it is useful for comparing software security activities observed in a target firm to those activities observed among the 51 firms (or various subsets of the 51 firms). A direct comparison using the BSIMM is an excellent tool for devising a software security strategy.

We measure an organization by conducting a series of in-person interviews with the executive in charge of the software security initiative and those directly responsible for day-to-day efforts. We convert what we learn during the interviews into a BSIMM scorecard by identifying each of the 111 BSIMM activities the organization carries out. For the record, no firm carries out all 111.
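To make the bookkeeping concrete, here is a minimal sketch of how such a measurement might be represented in code. The activity identifiers follow the real BSIMM naming convention (practice abbreviation, level, index), but the data structures and the build_scorecard helper are our own illustration, and only a handful of the 111 activity IDs are shown.

    # A firm's measurement boils down to the set of BSIMM activities observed.
    # ALL_ACTIVITIES would hold all 111 IDs from the BSIMM document; only a
    # few examples appear here.
    ALL_ACTIVITIES = ["SM1.1", "SM1.2", "CR1.1", "CR2.2", "PT1.2", "T3.5"]  # ...

    def build_scorecard(observed):
        """Mark each BSIMM activity 1 if observed at the firm, else 0."""
        return {activity: int(activity in observed) for activity in ALL_ACTIVITIES}

    observed_in_interviews = {"SM1.1", "CR1.1", "PT1.2"}
    scorecard = build_scorecard(observed_in_interviews)
    raw_score = sum(scorecard.values())   # the firm's raw activity score

This raw activity score, simply the count of observed activities, is the number behind the top-ten ranking and the [9, 93] score range discussed below.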

BSIMM4 facts

BSIMM4 describes the work of 978 Software Security Group (SSG) members, all full-time software security professionals, working with a collection of 2,039 other people in their firms (the "satellite") to secure the software developed by 218,286 developers. On average, the 51 participating firms have practiced software security for four years and two months (with the newest initiative being about one year old and the oldest initiative being 17 years old when the model was published in September 2012).

All 51 firms agree that the success of their program hinges on having an internal group devoted to software security -- the SSG. On average, an SSG has 19.5 members (smallest 1, largest 100, median 7.5), with a satellite of about 41 other developers, architects and people in the organization directly engaged in and promoting software security (smallest 0, largest 350, median 6). The average number of developers among our targets was 4,455 (smallest 11, largest 30,000, median 1,500), yielding an average percentage of SSG to development of about 1.95%.

That means in our study population, on average, there are two SSG members for every 100 developers. The largest SSG was 10% of the size of its development organization; the smallest was 0.05%.
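Note that the 1.95% figure is consistent with averaging each firm's own SSG-to-developer ratio rather than dividing the average head counts (19.5 / 4,455 would give only about 0.4%). A sketch with invented head counts:

    from statistics import mean, median

    # Invented stand-in data -- one entry per firm, not the real BSIMM4 set.
    ssg_sizes = [1, 7, 8, 100]
    dev_sizes = [11, 400, 1500, 30000]

    print(mean(ssg_sizes), median(ssg_sizes))   # average and median SSG size
    # Average of per-firm ratios -- how a "percentage of SSG to development"
    # figure like 1.95% arises:
    avg_ratio = mean(s / d for s, d in zip(ssg_sizes, dev_sizes))
    print(f"{avg_ratio:.2%}")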

BSIMM4 describes 111 activities organized into the twelve practices of our Software Security Framework (SSF). During the study, we kept track of how many times each activity was observed across the 51 firms. The BSIMM document describes each activity with examples, and Figure 1 shows the results. Note that each of the 111 activity descriptions was updated between BSIMM3 and BSIMM4.

Chart showing number of times each of the BSIMM-monitored activities was seen
Figure 1.

Twelve of the 111 activities are highlighted. These are the most commonly observed activities in each practice.

The BSIMM data yields very interesting analytical results. Figure 2 displays two "spider charts" showing average maturity level reached over some number of organizations for the twelve practices. The first chart shows data from all 51 BSIMM4 firms. The second chart shows data from the top ten firms, determined by raw activity score.

Spider charts showing performance in twelve key categories for entire BSIMM sample group and for top ten of that group
Figure 2.

Spider charts are created by noting the highest level of activity in a practice for a given firm (a "high water mark") and then averaging those scores over a group of firms, resulting in twelve numbers (one for each practice). The spider chart has twelve spokes, one for each SSF practice. Note that in all of these charts, level 3 (outside edge) is considered more mature than level 0 (inside point). Other more sophisticated analyses are possible, of course, and we continue to experiment with weightings by level, normalization by number of activities and other schemes.
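Here is a sketch of that computation, assuming only that an activity ID such as "CR2.2" encodes its practice ("CR") and level (2); the helper names are ours, not part of the BSIMM:

    import re
    from statistics import mean

    # The twelve SSF practice abbreviations.
    PRACTICES = ["SM", "CP", "T", "AM", "SFD", "SR",
                 "AA", "CR", "ST", "PT", "SE", "CMVM"]

    def practice_and_level(activity):
        practice, level = re.match(r"([A-Z]+)(\d)", activity).groups()
        return practice, int(level)

    def high_water_marks(observed):
        """Highest activity level reached in each practice (0 if none)."""
        marks = dict.fromkeys(PRACTICES, 0)
        for activity in observed:
            practice, level = practice_and_level(activity)
            marks[practice] = max(marks[practice], level)
        return marks

    def average_marks(firms):
        """Average the per-practice high water marks over a group of
        firms -- one number per spoke of the spider chart."""
        per_firm = [high_water_marks(f) for f in firms]
        return {p: mean(m[p] for m in per_firm) for p in PRACTICES}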

The spider charts above are also useful for comparing groups of firms from particular industry verticals or geographic locations, or even business units within firms. Figure 3 shows data from 19 financial services firms and 19 independent software vendors charted together. On average, the two groups have similar maturity in nearly all practices. We are not entirely surprised to see financial services put more effort into Compliance & Policy because their industry is more heavily regulated. Similarly, we are not surprised to see ISVs putting more effort into testing in the development cycle (the Security Testing practice).

Spider chart comparing performance in twelve key BSIMM activities of 19 financial services organizations and 19 ISVs
Figure 3.

By computing a score for each firm in the study, we can also observe relative maturity and average maturity for one firm against the others. To date, the range of observed scores is [9, 93].

We are pleased that the BSIMM study continues to grow -- the data set has increased by 20% since publication of BSIMM3 and by nearly 500% since the original publication. Once the sample exceeded thirty firms, we began applying statistical analyses that yield statistically significant results. The model adjusts according to the data and the math.

Measuring your firm with BSIMM4

The most important use of the BSIMM is as a measuring stick to determine where your firm's software security initiative currently stands relative to other firms. Do this by noting which activities you already have in place, using "activity coverage" to determine the level reached in each practice, and building a scorecard. In our own work using the BSIMM to make measurements, we found that the spider-graph-yielding "high water mark" approach (based on the three levels per practice) is sufficient to get a low-resolution feel for maturity, especially when working with data from a particular vertical, geography or set of business units.

Spider chart comparing performance in twelve key BSIMM activities between entire pool and a hypothetical firm
Figure 4.

One meaningful comparison is to chart your own maturity high water mark against the averages we have published to see how your initiative stacks up. In Figure 4, we have plotted data from a (fake) FIRM against the BSIMM Earth spider graph.

A direct comparison of all 111 activities is perhaps the most obvious use of the BSIMM. You can accomplish this by building a scorecard using the data displayed above.

The scorecard you see in Figure 5 depicts a (fake) firm performing 41 BSIMM activities (1s in the FIRM columns), including eight of the twelve most commonly observed activities (purple boxes). On the other hand, the firm is not performing the other four most commonly observed activities (red boxes) and should take some time to determine whether these are necessary or useful to its overall software security initiative (they may not be). The BSIMM Firms column shows the number of observations (currently out of 51) for each activity, allowing the firm to understand the general popularity of an activity among the 51 BSIMM participants.

A BSIMM4 scorecard
Figure 5.
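The box-coloring logic in Figure 5 reduces to a set comparison. In this sketch, MOST_COMMON stands for the twelve most commonly observed activities (one per practice, the highlighted rows of Figure 1); the IDs listed are placeholders rather than the actual twelve:

    # Placeholder IDs -- the real list is the twelve highlighted in Figure 1.
    MOST_COMMON = {"SM1.4", "CP1.2", "T1.1", "CR1.4"}   # ... twelve in total

    def classify(firm_activities):
        """Purple boxes: common activities the firm performs.
        Red boxes: common activities it does not."""
        purple = MOST_COMMON & firm_activities
        red = MOST_COMMON - firm_activities
        return purple, red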

Once you determine where you stand with particular activities, you can devise a plan to enhance practices with other activities suggested by the BSIMM.

By providing actual measurement data from the field, the BSIMM makes it possible to build a long-term plan for a software security initiative and track progress against that plan. Once again, for the record, there is no inherent reason to adopt all activities in every level for each practice. Adopt only those activities that make sense for your organization and ignore those that don't.

BSIMM as a longitudinal study

Thirteen of the 51 firms have been measured twice using the BSIMM and one firm has been measured three times, for a total of 14 re-measurements to date. On average, the time between measurements is 20 months. Though individual activities among the twelve SSF practices come and go as shown in the longitudinal scorecard in Figure 6, re-measurement over time shows a clear trend of increased maturity thus far. The activity score went up in ten of the twelve firms, by an average of 11.5 points (a 25% average increase). Software security initiatives mature over time.

BSIMM longitudinal scorecard
Figure 6.

Here are two ways of thinking about the change represented by the longitudinal scorecard. We see the biggest changes in activities such as CR1.1 (top N bugs), where seven firms began executing this activity between assessments, and SR1.3 (compliance put into requirements) and PT1.2 (defect loop to developers) where six firms began executing these activities. There are six other activities that five of the firms undertook for the first time.

Less obvious from the scorecard is the "churn" among activities. For example, while the count of firms remained the same for T3.5 (office hours), four firms started this activity while it was no longer observed in four firms. Similarly, five started SM2.1 (publish data) while it was no longer observed in four; four started CMVM2.3 (ops app inventory) while it was no longer observed in three; and three started CR2.2 (enforce coding standards) while it was no longer observed in three.
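Churn of this kind is straightforward to compute from the re-measurement data. A sketch, assuming each re-measured firm is represented as a (before, after) pair of activity sets:

    from collections import Counter

    def churn(remeasured):
        """Count, per activity, the firms that newly adopted it and the
        firms where it was no longer observed between measurements."""
        started, stopped = Counter(), Counter()
        for before, after in remeasured:
            started.update(after - before)
            stopped.update(before - after)
        return started, stopped

    # For T3.5 (office hours) above, started["T3.5"] == 4 and
    # stopped["T3.5"] == 4 -- the count of firms stays flat while the
    # underlying population shifts.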

The BSIMM Community

The 51 firms participating in the BSIMM Project make up the BSIMM Community. A moderated private mailing list allows SSG leaders participating in the BSIMM to discuss solutions with those who face the same issues, discuss strategy with someone who has already addressed an issue, seek out mentors from those further along a career path, and band together to solve hard problems (e.g., forming a BSIMM mobile security working group).

The BSIMM Community also hosts an annual private conference where up to three representatives from each firm gather together in an off-the-record forum to discuss software security initiatives. In 2012, more than 30 firms participated in the BSIMM Community Conference in New Jersey. During the conference, we ran workshops on Software Security and Fraud, Vendor Management Software Security Controls, and Agile Methods and SSDLs.

In 2013, Cigital Inc. hosted a BSIMM Europe Community Conference in London, where we discussed the software security situation on that side of the pond.

The BSIMM website, under BSIMM Community, posts information from the conferences, working groups, and mailing-list-initiated studies.

A comparison of observed activities across the four BSIMM releases
Figure 7.

BSIMM over time

Observations between the four BSIMM releases are shown side by side in Figure 7. The original model has retained almost all of its descriptive power even as the data set has multiplied by a factor of 9. We did make minor changes in the model between each release in order to remain true to our "data first, model second" approach. These changes involved "promoting" or "demoting" activities as appropriate. For example, a level 1 activity became a level 2 activity or vice versa.

BSIMM4 marks the first time we have added new activities to the model: CR3.4 (use automation to find malicious code) and CMVM3.3 (simulate software crisis), both of which are explained in the BSIMM document.

Readers interested in participating in the BSIMM5 project should contact us through the BSIMM website. We have already measured 12 firms, bringing our current data set to 63 firms.


This was first published in May 2013
