Last week, a news headline on this very site read, "FBI says attacks succeeding despite security investments." Upon reading the article, I discovered that the FBI conducted its own study and found -- Quick! Stop the presses! -- that many companies with firewalls still experience cybercrime.
Let me give some personal background. I have had significant training in statistics, with a particular focus on examining the statistical validity of studies. I know how statistics can be manipulated and misused. For example, did you know that the early CSI/FBI studies -- often referred to as "the most quoted studies in the field," but not related to this survey -- actually stated that their findings were not statistically valid? Later versions of the study weren't statistically valid either, and they never mentioned how the small response size creates such a large margin of error. Of course, these invalid results are regularly quoted in sales presentations.
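To see why response size matters so much, here is a rough illustration (the sample sizes below are hypothetical, not taken from any of these surveys): the standard 95% margin of error for a surveyed proportion shrinks only with the square root of the sample size, so small samples carry surprisingly wide error bars.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random sample.

    Uses the worst case p = 0.5 (widest interval); z = 1.96 is the
    standard 95% critical value for a normal approximation.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical sample sizes, for illustration only
for n in (100, 500, 2000):
    pts = margin_of_error(n) * 100  # in percentage points
    print(f"n={n}: +/-{pts:.1f} points")
```

Note that this formula assumes a genuine random sample. A self-selected respondent pool, as in these surveys, carries biases that no sample size can cure -- which is part of the problem.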
Why is statistical validity important? Without it, the study design is fundamentally flawed. It means the results will be used and abused by reporters lacking the know-how to challenge the numbers, and exploited to perpetuate fear, uncertainty and doubt by salespeople and speakers at conferences.
Also, while the report cites a list of supposedly esteemed advisors, the results were improperly portrayed. For example, the term "attacks" was broadly defined, and there was an implication that the technologies in use were meant to stop the broadly defined attacks.
Worse yet, the survey unfairly portrays security technologies, and will likely be used to question requests by security managers for larger budgets to acquire new security technologies. The survey states that 84% of companies experienced virus incidents, while 98% of companies use antivirus software. Hence, it could be interpreted that antivirus software is useless. The survey, however, failed to ask whether the respondents properly maintained their AV software. Similar implications arise with the spyware questions.
Likewise, more than 50% of the respondents were from companies with less than $5 million in annual revenue. Can the results therefore be generalized to the Fortune 2000? Of course not. Statistically, it is possible that the results can't even be generalized to companies with less than $5 million in annual revenue. Frankly, GE or GM alone would report more incidents than the entire study combined.
Regardless of its validity, the survey is out there, and we have to know how to answer management concerns about the uselessness of security technology. The best way to do that is by demonstrating how much the current FBI study differs from past CSI/FBI research. For example, in the FBI study, 3% of respondents belong to FBI's InfraGard program -- the information-sharing and research effort supported by the IT industry -- while in the CSI study 32% of respondents belong to InfraGard. More relevant is that in the FBI study 23% of respondents had IDS/IPS systems, while in the CSI study 72% of companies used IDS systems and 35% of companies used IPS systems. These discrepancies demonstrate major problems in generalizing and accepting the survey results as hard facts, and most importantly, quoting the survey's results as proven facts. The fact that the current FBI survey has both a larger sampling rate and a larger number of responses than even past statistically invalid CSI/FBI studies is completely irrelevant.
I asked the FBI why it decided to do the survey, and I learned that individual field offices were mandated to do so. It does not appear that they were given any resources to properly execute a study; they relied on random people whose names they happened upon. While a couple of notable people advised the FBI, they were apparently used only for implied legitimacy, not to provide a scientific foundation and real legitimacy to the study itself.
For example, security countermeasures that a knowledgeable practitioner (myself, at least) would recommend be added to the study include:
Without those, you have no idea whether those "worthless" firewalls and security software suites are being implemented properly.
Likewise, there was no mention in the analysis that only 23% of responding companies had an IDS or IPS, and only 60% of companies kept logs (and the survey never asked how frequently those logs are reviewed), so the number of incidents is likely severely underreported. There was also no breakdown of results by organization size or industry.
Perhaps the greatest shortfall of the entire survey is that there are no questions related to how effective security technology has been. For example, why didn't the FBI ask how many incidents had been prevented, or detected and mitigated, as a result of security technology? If the average company only loses $15,000, security technology is clearly doing something right.
While the media may be focusing on the apparent uselessness of security technology, a number of other completely valid headlines from the study could have been:
Yes, the last bullet can accurately represent the study, but I seriously hope you know that this is only a sign that the companies don't know where to look, or have nothing of any value.
To the FBI's credit, the agent in charge of the study is well-meaning, and he acknowledged that there are likely some problems and that the study is a work in progress. However, that doesn't negate the fact that it is grossly irresponsible to release a study that is so fundamentally flawed. This study was a gross waste of taxpayer dollars.
Let's face it: surveys for the security industry are created primarily for marketing purposes. This one is doing a disservice both to the FBI and the IT community as a whole. It would have been valid for the FBI to try to identify ways of getting companies to report crimes, but that was not the focus of the survey. Individual field offices should be spending their time performing law enforcement, not wasting the time of the bureau's invaluable special agents.
Ira Winkler is president of the Internet Security Advisors Group. He has more than 20 years of experience in the intelligence and security fields, has worked for the National Security Agency, and consults for a wide variety of Fortune 50 corporations. The author of Spies Among Us, he is an occasional contributor to Security Wire Perspectives.