Many IT managers have been lulled into a false sense of security by the "coin-flip principle," which goes like this: Let's say you flip a coin nine times, and it comes up heads every time. What are the chances it will come up heads again on the 10th flip? You may be tempted to say one in 1,000, one in 100,000, or one in 1 million.
But the correct answer is disarmingly obvious: one in two.
Every time you flip a coin, the chance of it coming up heads is 50/50. In this case, history has no impact on the future; it's only our flawed thinking about probabilities that makes us assume otherwise.
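For the skeptical, the independence argument is easy to check empirically. The short Python sketch below (a hypothetical illustration, not anything from the column) simulates coin-flip sequences, keeps only those that open with nine straight heads, and measures how often the 10th flip is also heads. The estimate lands near one in two, not one in 1,000.

```python
import random

def conditional_tenth_flip(runs=2_000, seed=1):
    """Estimate P(heads on the 10th flip | the first nine flips were heads).

    Because each flip is independent, the estimate should come out
    near 0.5 -- the streak of nine heads tells us nothing about flip ten.
    """
    rng = random.Random(seed)
    qualifying = 0   # sequences whose first nine flips were all heads
    tenth_heads = 0  # of those, how many also got heads on flip ten
    while qualifying < runs:
        flips = [rng.random() < 0.5 for _ in range(10)]
        if all(flips[:9]):
            qualifying += 1
            tenth_heads += flips[9]
    return tenth_heads / qualifying
```

Run with the defaults, the function returns a value hovering around 0.5, which is exactly the "disarmingly obvious" answer above.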
While the analogy is inexact, the coin-flip principle can be applied to some forms of cyberattack. Because they haven't happened yet, we assume they won't. Perhaps the best example of this is the notion of a combined physical/cyber attack on our national infrastructure.
Like most people, I've always pooh-poohed the idea of a "cyber-Pearl Harbor" because, for all the hype about it, it hasn't happened. But a recent conversation I had with Dan Geer radically changed my mind. Geer is perhaps best known as a coauthor of last year's white paper on the perils of a Microsoft monoculture. Suffice it to say, he's no stranger to controversial positions that, when you think about them, make perfect sense.
Geer suggests that the only reason we've avoided a combined physical/cyber attack is sheer dumb luck. He points to 9/11 and the Nimda worm.
As you may recall, Nimda appeared one week after the 9/11 terrorist attacks. Using multiple exploit vectors, the worm rampaged through the Internet, causing massive network outages. Nimda also left a backdoor on infected systems that, in theory, could be exploited by its creators. The backdoor, of course, could also be exploited by a "chaser" program written by someone else.
Enter the E911 virus. Back in March 2000, some 18 months prior to 9/11, AV experts began tracking a low-level virus that caused modems on infected computers to endlessly dial 911, wait for an answer, and then hang up. The evil genius of this program was that it exploited the unique functionality of the 911 emergency response system. In ordinary telephone calls, the caller controls the connection -- once he hangs up, the switch drops the call. But in 911 systems, the switch works in reverse: Only the 911 console can drop the connection. That way, emergency services can trace the call even if the caller hangs up.
If some malicious opportunist had reprogrammed the E911 virus to exploit Nimda's backdoor, and then released it as a chaser on Sept. 19, millions of infected computers would have DoS'd the nation's 911 systems. If you tried to call 911 during that time, you'd get a busy signal.
Such an attack, Geer correctly surmises, would have caused a "grand mal seizure" on the nation's already fragile psyche and, worse yet, resulted in needless deaths of people waiting for emergency services.
Geer's point with this story isn't to scare people, but to bring the coin-flip principle into an all-too-real cyberspace context. An E911 chaser wouldn't have been a "coordinated" physical/cyber attack in the sense that Al-Qaeda would have orchestrated it one week after 9/11. But, like 9/11 itself, it would have come out of the blue, totally unexpected, with devastating effect on the national infrastructure.
In the wake of 9/11, we'll never again be complacent about the threat of terrorism on any level. One can only hope that we aren't similarly complacent today about the threat of a combined physical/cyber attack. One also hopes it doesn't take an actual E911 event to wake us up.
About the author:
Andrew Briney, CISSP, is editor-in-chief of Information Security magazine and editorial director of the TechTarget Security Media Group.
Note: This column originally appeared in the August 2004 issue of Information Security.