"How am I doing?"
There's no easy way to answer that question, and there certainly isn't a generally accepted standard if you're trying to measure the effectiveness of your information security program. You can't take the same approach Supreme Court Justice Potter Stewart adopted for hard-core porn, for example: "I shall not today attempt further to define the kinds of material I understand to be embraced. . . . but I know it when I see it."
Recognizing the dilemma, the Center for Internet Security (CIS), which has made its mark by developing secure configuration benchmarks for operating systems, network devices and a growing list of heavily used applications, is taking a crack at identifying and defining key information security metrics.
"Security professionals often say they are really not sure how to define success," said CIS CEO Bert Miuccio. "We're creating very specific, unambiguous metric definitions for those indicators of security status that everyone agrees are among the most important to measure."
CIS is starting with eight measurement criteria reached by a consensus of 85 security experts and others from government, business and academia. Miuccio said CIS will define the criteria later this fall, along with a metrics service to help organizations make sense of their data.
Wading through the myriad good choices for criteria confounds enterprises, and it challenged the CIS project participants as well.
"There are many sources that list thousands of things that are good to measure to help you determine security status in the enterprise," said Miuccio. "If you try to use all that information, what results most often is paralysis by analysis."
Andrew Jaquith, an independent analyst, said CIS' record of developing configuration benchmarks makes them "a pretty good clearinghouse" to provide security metrics. Some companies, he said, are in real need of this kind of guidance.
"There are two kinds of enterprises," said Jaquith. "There are barnstorming innovators that do a fair amount of measurement already, and, in many cases, they view those metrics as competitive advantage and don't share a lot of it."
"Then, you have the other class of enterprises that want to be told what to measure. CIS is in a good position to, shall we say, define metrics for the rest of us."
Miuccio expects to expand the list to 20 or more within the next year or so, but said these eight key areas are considered high priorities.
Now comes the hard part.
Now, said Miuccio, comes the really tough part of defining each criterion. For example, if you're measuring mean time between security incidents, when does an incident start? What constitutes an incident? How do you define recovery?
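Those definitional questions matter because the arithmetic itself is trivial once you settle them. CIS hasn't published its definitions yet; below is a minimal sketch of one possible interpretation, assuming an incident "starts" at its first recorded detection timestamp. The incident dates are hypothetical, and real programs would pull them from a ticketing or SIEM system.

```python
from datetime import datetime, timedelta

# Hypothetical incident start times (the hard part is agreeing on
# what counts as an incident and when it "starts").
incident_starts = [
    datetime(2008, 1, 4, 9, 30),
    datetime(2008, 2, 11, 14, 0),
    datetime(2008, 3, 2, 8, 15),
]

def mean_time_between_incidents(starts):
    """Average gap between consecutive incident start times."""
    starts = sorted(starts)
    gaps = [later - earlier for earlier, later in zip(starts, starts[1:])]
    return sum(gaps, timedelta()) / len(gaps)

mtbi = mean_time_between_incidents(incident_starts)
print(mtbi.days)  # whole days in the average gap between incidents
```

Note that even this toy version embeds choices CIS would have to make explicit: whether to measure start-to-start or recovery-to-start, and whether near-simultaneous events count as one incident or several.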
CIS will issue definitions that are as specific and unambiguous as possible, while recognizing that criteria vary with circumstances. Miuccio's experience with the group's configuration benchmarks shows that "there were many instances in which the answer wasn't yes or no; it wasn't zero or a one. It was a zero in some cases and a one in other cases."
"You need to have very crisp definitions that are clear and unambiguous where the methodology can be reproduced," said Jaquith. "A metric isn't a metric unless there is a standard way of doing it."
Miuccio said that CIS chose its first eight criteria, in part, because they are based on data that organizations with mature security programs are already collecting, though they may not be making effective use of that data. A software service will help member organizations develop reports, show trending and measure how they are doing against other organizations' security performance.
The bottom line is to help organizations decide where to focus their time, money and energy.
"The primary reason we were working in the area of information security metrics is to help organizations correlate the practices and processes in which they invest and engage with the outcomes that are produced," Miuccio said. "It sounds simple, but it's very hard work. There's a lot of heavy lifting going on."