Creating meaningful information security metrics

Learn how to develop an effective information security metrics program, and the pitfalls to avoid along the way.


Security budgets have proven to be more resistant to the recession than many areas of IT, but they haven't been completely recession-proof. Security spending, which rose like a rocket ship with double-digit increases from 2002 through 2007, started to sputter about two years ago. Organizations report that discretionary security projects have been delayed or "sent back to the lab" for further evaluation. For 2010, Forrester Research expects that overall security budgets will rise less than 5 percent over 2009 -- higher than in the previous year, but not by much.

The reluctance to increase security budgets places increased pressure on security managers to justify their projects. Security, sadly, is one of those professions where victories are taken for granted and go unnoticed, but failures are embarrassingly public. To the untrained eye, security staff, technologies and processes cost a lot of money but produce little tangible output on a day-to-day basis, other than a vaguely satisfied feeling that "nothing bad happened" today. As a result, smart security managers, sensing sudden vulnerability in their budgets, seek better ways to measure and prove the value of what they do every day.

But before plunging into a security metrics program, there are a number of issues to keep in mind. We'll look at some of the missteps that can lead to frustration and failure and the ingredients for an effective program.

METRICS MISTAKES

Some enterprises, daunted by the challenge of "measuring nothing," simply haven't made metrics a priority yet. Other organizations start ambitious security metrics programs but are tripped up by three major pitfalls, especially in the early stages of program development:

  • Try to boil the ocean. Faced with the pressure of not wanting to miss something important, enterprises try to measure "everything" they can think of: every threat and vulnerability class, dozens of operational metrics ranging from patching to spam to identity management, and multiple takes on how to quantify application risks and defects.
  • Pick convenient metrics, rather than meaningful ones. Nearly every security product enterprises own has some sort of reporting feature that generates numbers the vendor deems important. It's easy to use these as a starting point, and in some cases they make good metrics. But most are just statistics that you can file in the "fun facts of the day" folder. Your boss does not care how many spam e-mails your gateway blocked or the number of "policy violations" your desktops have -- whatever those are.
  • Miss the forest for the trees. Good security metrics should have five qualities. Most organizations know how to pick metrics that satisfy the first four qualities: namely, that they are expressed as numbers, have one or more units of measure, are measured in a consistent and objective way, and can be gathered cheaply. But only a few pick metrics that satisfy the most important criterion: contextual relevance. That is, the metrics must help someone -- usually the boss -- make a decision about an important security or business issue. Too many organizations use metrics to erect Byzantine temples to "security-ness" that measure the minutiae of what they understand rather than what the boss needs to know. Failure to pass the "so-what" test makes a metric potentially interesting but not insightful. (A sketch of a metric definition that bakes in these five qualities follows this list.)
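To make the five qualities concrete, here is a minimal sketch, assuming Python and a home-grown checklist; the class name, field names and the sample metric are illustrative assumptions, not part of any standard or product.

    from dataclasses import dataclass

    # Hypothetical checklist that forces each candidate metric through the five
    # qualities before it earns a place on the dashboard.
    @dataclass
    class MetricDefinition:
        name: str                # what is being measured
        unit: str                # expressed as a number with a unit of measure
        collection_method: str   # gathered in a consistent, objective way
        cost_to_gather: str      # cheap metrics come from data you already have
        decision_supported: str  # contextual relevance: what call does this inform?

        def passes_so_what_test(self) -> bool:
            # A metric with no decision attached is a fun fact, not a metric.
            return bool(self.decision_supported.strip())

    patch_latency = MetricDefinition(
        name="Average time to patch a workstation",
        unit="days",
        collection_method="weekly export from the patch management system",
        cost_to_gather="low -- the data is already collected",
        decision_supported="Do we change the patching SLA or add staff?",
    )

    print(patch_latency.passes_so_what_test())  # True

The point of the exercise is the last field: if you cannot name the decision a metric supports, it fails the "so-what" test before it ever reaches the boss.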

LESSONS LEARNED

The truth of the matter is that setting up a security metrics program is not easy. But it need not be stressful either. Putting together a metrics program means having the right perspective. Here are four lessons drawn from the experiences of enterprise security leaders:

  • Clarity and context ease acceptance. The meanings of some security metrics are very clear. It's easy to understand what the metric "average time to patch a workstation" means and how it might be derived; the units of measure are clearly expressed, and the words "patch" and "workstation" need no explanation. But what about an "application risk score" of 93? How much better is it than a score of, say, 80? In these cases, experienced program managers make a point of explaining how the scores are derived. Their exhibits and dashboards clearly and succinctly explain what went into their less-obvious formulas, and how readers should interpret the results.
  • Insights flow from comparisons. Yale Professor Edward Tufte wrote in his superb book Envisioning Information, "If the numbers are boring, then you've got the wrong numbers." One of the best ways to gain real insights about the health of the security program is to stop treating the organization as a monolith. When you slice security metrics by business unit, division, manager or geography, revealing patterns always pop out. Which of your divisions are stars, and which are "cowboys" or renegades? By comparing different groups against each other, the metrics you are measuring become a lot more interesting, and the insights readily apparent; the sketch after this list shows the idea.
  • Less is more. In the computing and consumer electronics world, fans of Apple's products appreciate their minimalist, clean designs and streamlined user interfaces. What makes Apple's products special is not what the company puts in, but what it leaves out. Similarly, New England Patriots coach Bill Belichick's weekly game plans ask team members to excel at just a few things. "If you do these three or four things, you will win," he tells his players. Successful security measurement programs work the same way. Many, many factors go into making security organizations work well. But effective measurement programs focus teams by restraining the number of measures they have to worry about.
  • Balanced Scorecards keep everything in perspective. Nearly 20 years ago, Robert Kaplan of Harvard Business School and consultant David Norton developed a concept called the "Balanced Scorecard." Invented as a better way to measure company performance, the Balanced Scorecard sets up four complementary perspectives that are critical to predicting long-term success: Financial, Customer, Internal Processes, and Learning and Growth. Adapted to security, the Balanced Scorecard helps bridge the gap between information security and management. Response to the Balanced Security Scorecard concept in Forrester workshops has been electric.
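As a concrete illustration of the patch metric and the slicing idea above, here is a minimal sketch, assuming Python and a handful of invented per-workstation records; the field names and business units are hypothetical, not drawn from any real tool's export format.

    from collections import defaultdict

    # Invented per-workstation patch-latency records, as they might be exported
    # from a patch management tool (one row per patched workstation).
    patch_records = [
        {"business_unit": "Retail", "days_to_patch": 4},
        {"business_unit": "Retail", "days_to_patch": 6},
        {"business_unit": "Manufacturing", "days_to_patch": 21},
        {"business_unit": "Manufacturing", "days_to_patch": 17},
        {"business_unit": "Corporate", "days_to_patch": 3},
    ]

    # Stop treating the organization as a monolith: group the raw numbers by
    # business unit before averaging.
    by_unit = defaultdict(list)
    for record in patch_records:
        by_unit[record["business_unit"]].append(record["days_to_patch"])

    # Rank the units so the stars and the laggards are obvious at a glance.
    averages = {unit: sum(days) / len(days) for unit, days in by_unit.items()}
    for unit, avg in sorted(averages.items(), key=lambda item: item[1]):
        print(f"{unit}: average time to patch a workstation = {avg:.1f} days")

The same few lines of grouping logic work for any sliceable dimension -- division, manager or geography -- which is what makes the comparison cheap to produce once the raw data is in hand.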

CREATING A BALANCED SECURITY SCORECARD

What goes into a security scorecard -- "balanced" or otherwise? Because every organization is different, the composition and number of metrics depend very much on the business context and priorities of each company. That said, successful scorecards are concise, clear and comprehensive. They don't bore readers with their length or baffle them with mystery terms. And they include enough key performance indicators so that the totality of the security program's activities is covered. When starting a security measurement program, an organization should:

  • Take cues from management. The security organization, and the senior management team it reports to, has a set of principles that shapes what the security program does and the impulses it responds to. These principles might be concerned with protecting information: "Be a good custodian of our patients' medical records" or "Protect our innovations no matter what." They might be concerned with reputation ("just keep our company out of the papers"), service excellence or cost control. These principles influence which metrics to select for the Financial and Customer perspectives in particular. Anticipating which hot-button issues the management team responds to is key.
  • Pick a small number of metrics for each perspective. The four perspectives of the Balanced Scorecard enforce an ordering principle on the composition of the metrics you select for your dashboard. You can't load up on the Internal Processes perspective, for example, without skimping on Learning and Growth. But at the same time, too many metrics overall make the scorecard too difficult to comprehend. What works best is to have three or four metrics for each of the four perspectives. Figure 1 shows a sample of the kinds of metrics a security manager might want to use, and a sketch of such a scorecard in code follows this list.
  • Mix perennials and seasonal metrics. Building a metrics program is like tending a garden: To keep it fresh and interesting, seed it with a base of metrics that you can always rely on (the perennials), but sprinkle in additional metrics for short periods of time. "Perennial" metrics should reflect the long-term managerial priorities for the security organization, such as keeping tabs on staffing levels, tracking compliance with benchmarks, and monitoring risk assessment scores. "Seasonal" metrics should be added to shine a spotlight on operational areas that need near-term improvement, such as data leakage, application security or abuse of social media.
  • Use sunshine to create peer pressure. As mentioned earlier, slicing and dicing metrics by business units or geographies is a terrific way to make the data more interesting. But it has another side effect: When you share data on a cross-section basis, you create a subtle form of peer pressure that motivates the laggards to perform more like the leaders. Nobody wants to be in last place. One company several years ago made a game of it: Each manager received a T-shirt with his or her application vulnerability score printed on it. You can imagine the fun at the meeting as managers introduced themselves by their numbers instead of their names ("Hi, I am '53'. What's your score?") to break the ice. Then they swapped lessons learned about why they had scored comparatively well or poorly. Now in truth, a T-shirted "sunshine policy" may not be appropriate for every organization. The key, though, is to share cross-sectional performance information in a judicious and non-judgmental way.
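The bullets above come together in a minimal sketch of a Balanced Security Scorecard, assuming Python, the four classic perspectives, and three metrics apiece; the metric names and the "seasonal" flags are illustrative assumptions, not a prescribed set.

    # Hypothetical Balanced Security Scorecard: four perspectives, a handful of
    # metrics each, with seasonal metrics flagged so they can be rotated out.
    scorecard = {
        "Financial": [
            {"metric": "Security spend as a share of IT budget", "seasonal": False},
            {"metric": "Cost per incident handled", "seasonal": False},
            {"metric": "Audit remediation spend", "seasonal": True},
        ],
        "Customer": [
            {"metric": "Business units with current risk assessments", "seasonal": False},
            {"metric": "Customer-facing incidents disclosed", "seasonal": False},
            {"metric": "Social media policy exceptions granted", "seasonal": True},
        ],
        "Internal Processes": [
            {"metric": "Average time to patch a workstation (days)", "seasonal": False},
            {"metric": "Systems compliant with the configuration benchmark (%)", "seasonal": False},
            {"metric": "Applications scanned before release (%)", "seasonal": True},
        ],
        "Learning and Growth": [
            {"metric": "Security staff per 1,000 employees", "seasonal": False},
            {"metric": "Staff holding relevant certifications (%)", "seasonal": False},
            {"metric": "Developers trained in secure coding (%)", "seasonal": True},
        ],
    }

    # "Less is more": flag any perspective that has drifted past four metrics.
    for perspective, metrics in scorecard.items():
        perennials = sum(1 for m in metrics if not m["seasonal"])
        status = "OK" if len(metrics) <= 4 else "too many -- trim it"
        print(f"{perspective}: {len(metrics)} metrics "
              f"({perennials} perennial) -- {status}")

Keeping the seasonal metrics in the same structure as the perennials makes it easy to swap them in and out each quarter without disturbing the long-term baseline.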

GET GOING

If you don't have a metrics program already, setting one up from scratch might seem hard to imagine, and even harder to implement. But measurement is a lot like physical fitness: visible progress won't be instant, but there is no excuse for not starting today. Start by figuring out what data sources you have, what your team (and bosses) value, and what the scorecard ought to look like when you are done. Use the Balanced Scorecard as an organizing principle. There is no time like the present, and the payoff will be invaluable.

Andrew Jaquith is a senior analyst at Forrester Research, Inc. covering client and data security. He is the author of Security Metrics: Replacing Fear, Uncertainty and Doubt. Send comments on this article to feedback@infosecuritymag.com.

This was first published in March 2010
