Opinion: DBIR, other computer security statistics paint tricky picture

Verizon's annual breach report caps a spate of new security research reports. Overall conclusions from them, however, are hard to come by.

There's something of a spring season in computer security reports.

This is in part because many reports aim to summarize the prior year, so they appear early in the following one. It's also because many vendors time the release of their reports to coincide with the RSA Conference. Over the past couple of months we've seen reports of various sizes and gravity from Veracode, Trustwave, Cisco, Solutionary, the Ponemon Institute, Symantec, Fortinet, Mandiant, Prolexic, Vormetric and Arbor Networks, to name a few.

The publication of the Verizon Data Breach Investigations Report (DBIR) Monday forms something of a natural close to this busy spring season. One can't help but wonder whether all this research, taken as a whole, creates any coherent overall view. Are there "big takeaways" from all this tallying?

The short answer is that consistent takeaways are hard to come by. For one thing, there are some contradictions among the reports.

Take distributed denial-of-service (DDoS) attacks, for example. Prolexic, in its Quarterly Global DDoS Attack Report, stated that average attack bandwidth totaled 48.25 Gbps in the first quarter of 2013, a 718% increase over the previous quarter. Arbor, on the other hand, has used its ATLAS program to compile Internet traffic data from over 250 service providers, tracking a whopping 42 Tbps of traffic and finding that the average attack size in Q1 2013 was 1.77 Gbps, compared with 1.48 Gbps in 2012. Without putting too fine a point on it, these answers are vastly and improbably different in scale.
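To see just how far apart those figures sit, here's a quick back-of-envelope sketch. The numbers are the ones quoted above; interpreting Prolexic's "718% increase" as quarter-over-quarter growth, and the comparison itself, are my own.

```python
# Back-of-envelope comparison of the two vendors' DDoS averages quoted above.
# The figures come from the Prolexic and Arbor reports as cited; reading
# "718% increase" as (new - old) / old is an assumption on my part.

prolexic_q1_2013_gbps = 48.25   # Prolexic: average attack bandwidth, Q1 2013
prolexic_increase_pct = 718     # Prolexic: increase vs. the previous quarter
arbor_q1_2013_gbps = 1.77       # Arbor ATLAS: average attack size, Q1 2013
arbor_2012_gbps = 1.48          # Arbor ATLAS: average attack size, 2012

# Implied Prolexic average for Q4 2012, working the 718% increase backward
prolexic_q4_2012_gbps = prolexic_q1_2013_gbps / (1 + prolexic_increase_pct / 100)

# Arbor's own year-over-year growth, and the gap between the two vendors
arbor_growth = arbor_q1_2013_gbps / arbor_2012_gbps - 1
vendor_gap = prolexic_q1_2013_gbps / arbor_q1_2013_gbps

print(f"Implied Prolexic Q4 2012 average: {prolexic_q4_2012_gbps:.1f} Gbps")  # ~5.9 Gbps
print(f"Arbor's growth, 2012 to Q1 2013: {arbor_growth:.0%}")                 # ~20%
print(f"Prolexic vs. Arbor, Q1 2013: {vendor_gap:.0f}x apart")                # ~27x
```

However you read the percentages, one vendor's "average attack" is more than 25 times the size of the other's.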

Or consider SQL injection. The Veracode State of Software Security report, arguing that knowledge of prevalent software flaws predicts prevalent attack vectors, noted that 32% of applications Veracode analyzed contain SQL injection flaws. "Knowing that," the report states, "you should not be surprised that Trustwave reported that SQL injection was the attack method for 26% of all reported breaches in 2012." Yes, but by the same logic we should perhaps be rather surprised indeed to learn that this year's Verizon DBIR reports SQL injection was involved in 8% of breaches involving "hacking," which it says are 52% of incidents overall. In other words, SQL injection weighs in at about 4% of breaches overall.
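The "about 4%" figure is simply the product of the two DBIR percentages; a minimal sketch of the arithmetic (the percentages are the DBIR's, the multiplication is mine):

```python
# Rough arithmetic behind the "about 4%" figure from the 2013 DBIR.
hacking_share_of_breaches = 0.52  # DBIR: breaches involving "hacking"
sqli_share_of_hacking = 0.08      # DBIR: SQL injection within hacking breaches

sqli_share_overall = hacking_share_of_breaches * sqli_share_of_hacking
print(f"SQL injection share of all breaches: {sqli_share_overall:.1%}")  # ~4.2%
```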

There's an even bigger potential problem with Veracode's findings for those of a cynical predisposition. The Verizon DBIR asserts that software flaws per se are mostly too "fancy" for most attackers to bother to exploit. Attackers go for what might be called ridiculously low-hanging fruit. They use stolen credentials and brute force, or phishing and malware. As the report puts it, "While some may argue that we are dealing with an intelligent and adaptive adversary, the data tells us that adaptation isn't necessary for many of these attackers."

If attackers are mostly just asking you nicely to hand over your password, looking at software vulnerabilities isn't going to tell you much about actual attacks. Veracode's report makes a pretty good case, though, that software developers continue to make the same amateur security blunders they've made for the past several years.

Maybe it is hard to draw a straight line from software vulnerabilities to attack vectors in the wild, but that doesn't necessarily justify taking reports from the field as a framework for tying various research results together. For one thing, reports about damages are inextricably beholden to their underlying assumptions, some of which the report authors are fairly, well, eager to assume. A recent report the Ponemon Institute prepared and Venafi sponsored concluded that each of the Global 2000 enterprises worldwide was "projected to lose $35m over the next 24 months due to improper trust technology management; this estimate is based on a total possible cost exposure of $400 million."

When the survey was released, I asked the institute's founder and chair, Larry Ponemon, about how "expected value" worked in compiling dollar values for estimates of financial risk. Ponemon said the survey "required respondents to estimate both exposure value and probability of occurrence to nine separate scenarios relating to operational issues, compliance and cyberattack incidents all relating to key and certificate compromise." The exposure value was based on estimating four categories of potential loss. "The probability of exposure," Ponemon said, was "estimated from a number line between 0 to 10."
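Ponemon didn't publish a formula, but the mechanics he describes amount to summing exposure-times-probability across the scenarios. The sketch below is purely illustrative: the scenario labels, the dollar amounts and the assumption that a 0-to-10 answer maps linearly onto a probability are mine, not the survey's, and the survey used nine scenarios rather than three.

```python
# Illustrative expected-value calculation in the spirit of the survey's
# method: respondents estimate an exposure value and a probability of
# occurrence (on a 0-to-10 number line) for each scenario. The scenarios
# and dollar figures below are invented for illustration; only the
# "exposure times probability, summed" shape mirrors Ponemon's description.

scenarios = [
    # (label, exposure value in $, probability on a 0-to-10 scale)
    ("operational outage from an expired certificate", 40_000_000, 2),
    ("compliance failure tied to weak key management", 25_000_000, 3),
    ("cyberattack exploiting a stolen certificate",    60_000_000, 1),
]

def expected_loss(exposure_usd, probability_0_to_10):
    """Treat the 0-to-10 answer as a linear probability -- an assumption."""
    return exposure_usd * (probability_0_to_10 / 10)

total = sum(expected_loss(value, p) for _, value, p in scenarios)
print(f"Projected loss across scenarios: ${total:,.0f}")
```

The point is only that the headline dollar figure is the product of two self-reported estimates, which is why it is so sensitive to the assumptions behind them.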

Surveys have to take these kinds of "scale of 1 to 10" shortcuts, to be sure. And it's often the case that broader categories ("Is it bigger or smaller than a breadbox?") provide paradoxically better estimates. On the other hand, the headline value is derived directly from an estimated risk, and that estimate doesn't necessarily square with the risks one might determine by examining, say, Verizon's numbers, which are based on actual field observations. Verizon's information on the prevalence of stolen credentials in successful breaches, for instance, is clear and compelling.

Yes, but then again there's a skew in the incident data at Verizon's disposal that is quite possibly not reflected in other reports. One could argue that the DBIR should really be called the "ATM and Point of Sale, Along with Some Servers, Breach Incident Report," because ATMs and POS device-related breaches are heavily represented in Verizon's 2013 dataset. The incidence of "user device" attacks is skewed high, Verizon senior analyst Kyle Maxwell said in a prerelease interview with SearchSecurity.com editors, "because we categorize ATMs as user devices, which sometimes seems a little bit controversial to people. In the end we decided that since the device's main focus is interaction with an end user -- in this case a bank customer -- the ATM attacks get thrown in there."

More generally, the fact that vendors issue or sponsor these reports means not so much that the results are unreliable as that conclusions are drawn from niches and lines of inquiry so different in their definitions and underlying assumptions that no meaningful comparison can be made from one report to the next. Isn't it problematic, for instance, that Verizon's report doesn't treat cloud-related attacks as a separate category?

None of this is to say that there aren't important, thought-provoking findings lurking in these reports. But most of the insight comes within an intentionally narrow focus. Tucked within the Veracode report, for example, is a fascinating look at how code quality improves as the number of rounds of vulnerability testing, fixing and resubmission increases.

The Verizon DBIR is riveting (and alarming) on the subject of how long it takes organizations to find out they've been pwned. (Short answer -- they remain thoroughly clueless till someone outside the organization bothers to tell them.) But is there a coherent summary finding from all that research going on out there? Not this time around.

This was first published in April 2013
