Information Security

Defending the digital infrastructure



Schneier-Ranum Face-Off on the dangers of a software monoculture

Security experts Bruce Schneier and Marcus Ranum debate the impact of a software monoculture on computer security.

Point: Bruce Schneier

In 2003, a group of security experts -- myself included -- published a paper saying that 1) software monocultures are dangerous and 2) Microsoft, being the largest creator of monocultures out there, is the most dangerous. Marcus Ranum responded with an essay that basically said we were full of it. Now, eight years later, Marcus and I thought it would be interesting to revisit the debate.

The basic problem with a monoculture is that it's all vulnerable to the same attack. The Irish Potato Famine of 1845--49 is perhaps the most famous monoculture-related disaster. The Irish planted only one variety of potato, and the genetically identical potatoes succumbed to a rot caused by Phytophthora infestans. Compare that with the diversity of potatoes traditionally grown in South America, each one adapted to the particular soil and climate of its home, and you can see the security value in heterogeneity.

Similar risks exist in networked computer systems. If everyone is using the same operating system or the same applications software or the same networking protocol, and a security vulnerability is discovered in that OS or software or protocol, a single exploit can affect everyone. This is the problem of large-scale Internet worms: many have affected millions of computers on the Internet.

If our networking environment weren't homogeneous, a single worm couldn't do so much damage. We'd be more like South America's potato crop than Ireland's. Conclusion: monoculture is bad; embrace diversity or die along with everyone else.

This analysis makes sense as far as it goes, but it suffers from three basic flaws. The first is the assumption that our IT monoculture is as simple as the potato's. When the particularly virulent Storm worm hit, it affected only 1 million to 10 million of its billion-plus possible victims. Why? Because some computers were running updated antivirus software, or were inside locked-down networks, or whatever. Two computers might run the same OS or application software, but they'll sit inside different networks with different firewalls, IDSes and router policies; they'll have different antivirus programs, different patch levels and different configurations; and they'll be in different parts of the Internet, connected to different servers running different services. As Marcus pointed out back in 2003, they'll be a little bit different themselves. That's one of the reasons large-scale Internet worms don't infect everyone -- as is the network's ability to quickly develop and deploy patches, new antivirus signatures, new IPS signatures, and so on.
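The arithmetic behind that first point is easy to sketch. The percentages below are invented for illustration, assuming (simplistically) that each defensive layer fails independently; the point is only that several imperfect, heterogeneous defenses multiply together, so even a "monoculture" worm reaches a small fraction of its potential victims:

```python
import random

# Toy model of why a worm targeting a dominant OS still can't infect
# everyone: hosts differ in patch level, AV coverage, and firewalling.
# All percentages are invented for illustration, not measured data.

random.seed(1)
HOSTS = 100_000

def vulnerable():
    # Each layer is modeled as an independent coin flip (a simplification).
    same_os   = random.random() < 0.90  # runs the targeted OS
    unpatched = random.random() < 0.40  # missing the relevant patch
    no_av     = random.random() < 0.50  # AV lacks a signature
    exposed   = random.random() < 0.30  # firewall doesn't block the port
    return same_os and unpatched and no_av and exposed

infected = sum(vulnerable() for _ in range(HOSTS))
print(f"{infected / HOSTS:.1%} of hosts infected")  # roughly 5%, not 90%
```

Multiplying the layers (0.90 x 0.40 x 0.50 x 0.30) gives about 5.4% expected infection, despite 90% of hosts running the "monoculture" OS.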

The second flaw in the monoculture analysis is that it downplays the cost of diversity. Sure, it would be great if a corporate IT department ran half Windows and half Linux, or half Apache and half Microsoft IIS, but doing so would require more expertise and cost more money. It wouldn't cost twice the expertise and money -- there is some overlap -- but there are significant economies of scale that result from everyone using the same software and configuration. A single operating system locked down by experts is far more secure than two operating systems configured by sysadmins who aren't so expert. Sometimes, as Mark Twain said: "Put all your eggs in one basket, and then guard that basket!"

The third flaw is that you can only get a limited amount of diversity by using two operating systems, or routers from three vendors. South American potato diversity comes from hundreds of different varieties. Genetic diversity comes from millions of different genomes. In monoculture terms, two is little better than one. Even worse, since a network's security is primarily the minimum of the security of its components, a diverse network is less secure because it is vulnerable to attacks against any of its heterogeneous components.
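The "minimum of its components" point can be made with a back-of-the-envelope probability sketch. The failure rates below are made up for illustration, and the independence assumption is a simplification; the sketch only shows that if a breach of any platform exposes the shared network, a second (differently flawed) platform widens the attack surface:

```python
# Toy calculation of Schneier's third point. If an attacker can get in
# through *any* vulnerable component, adding a second platform with its
# own flaws increases, not decreases, the chance of compromise.
# The probabilities are invented for illustration.

def p_compromised(p_flaws):
    """Probability that at least one component class is exploited,
    assuming independent flaws -- a simplifying assumption."""
    p_safe = 1.0
    for p in p_flaws:
        p_safe *= (1.0 - p)
    return 1.0 - p_safe

# One well-hardened OS with a 5% chance of a serious exploitable flaw:
mono = p_compromised([0.05])

# Two OSes, each hardened a bit less expertly (say 8% each), where a
# breach of either exposes the shared network:
duo = p_compromised([0.08, 0.08])

print(f"monoculture: {mono:.3f}")   # 0.050
print(f"two-OS mix:  {duo:.3f}")    # 0.154
```

The numbers flip only if each platform in the diverse mix is hardened at least as well as the single platform would have been, which is exactly the expertise-and-cost problem raised above.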

Some monoculture is necessary in computer networks. As long as we have to talk to each other, we're all going to have to use TCP/IP, HTML, PDF, and all sorts of other standards and protocols that guarantee interoperability. Yes, there will be different implementations of the same protocol -- and this is a good thing -- but that won't protect you completely. You can't be too different from everyone else on the Internet, because if you were, you couldn't be on the Internet.

Species basically have two options for propagating their genes: the lobster strategy and the avian strategy. Lobsters lay 5,000 to 40,000 eggs at a time, and essentially ignore them. Only a minuscule percentage of the hatchlings live to be four weeks old, but that's sufficient to ensure gene propagation; from every 50,000 eggs, an average of two lobsters is expected to survive to legal size. Conversely, birds produce only a few eggs at a time, then spend a lot of effort ensuring that most of the hatchlings survive. In ecology, this is known as r/K selection theory. In either case, each of those offspring varies slightly genetically, so if a new threat arises, some of them will be more likely to survive. But even so, extinctions happen regularly on our planet; neither strategy is foolproof.

Our IT infrastructure is a lot more like a bird than a lobster. Yes, monoculture is dangerous and diversity is important. But investing time and effort in ensuring our current infrastructure's survival is even more important.

Bruce Schneier is chief security technology officer of BT Global Services and the author of Schneier on Security. For more information, visit his website at

Counterpoint: Marcus Ranum

"Yawn! The death of the Net predicted" ...

Eight years later, monoculture remains a poor and misleading comparison. Why do we need to analogize about computers as if they were biological systems? We ought to be perfectly capable of assessing them on their own terms. We have a rich vocabulary of security terminology, based on a set of commonly understood principles, so why do we feel it's important or useful to squint hard and say, "Computers are kind of sort of like biological organisms; therefore, they're likely to fail in similar ways"? Computers fail like computers, and organisms fail like organisms -- any resemblances between the two are largely coincidental.

Let me illustrate how silly these analogies can get with a simple thought experiment. Suppose for a few minutes we pretend that a network plus a bunch of computers is an organism. We can construct one analogy that sounds pretty scary by saying, "Computers, of course, don't have an immune system." Or we can construct another by saying, "The system administration team, plus the combined security researchers at all the antivirus/antimalware vendors, plus configuration management software, is the immune system." See what I mean? Arguing over which analogy is better is a waste of time. It makes more sense to talk about computer security problems in the language of computer security, which is rich enough even if you exclude the marketing buzzwords.

In fact, the monoculture concept only seems to carry zing because the biological metaphors obscure the basic silliness of the idea. Translated into the language of computer security, what the monoculture fearmongers are saying (trying to be fair) is something like: "Too many computers share a common operating system, and therefore share its common flaws; consequently, at a certain point a shared vulnerability could be used to cause massive, cascading failures of critical infrastructure. Therefore, be very afraid."

However, in the real world we observe that:

  • The first part of that scenario has already happened; in fact, it has happened about once a week for the last 15 years.
  • The second part of that scenario hasn't happened, nor has anything close to it.

Why not? Because every computer/network out there is managed differently, patched differently; has different addressing and routing schemes, different firewall rules, different configuration management practices, different diagnostic and analytic capabilities, and different system administrators. If you don't get blinded by the shiny analogy, you realize pretty quickly why the monumental collapse scenarios haven't happened since Robert Morris, Jr., took down a small but significant percentage of the nascent Internet for several hours, back in 1988.

There are large numbers of systems that are managed and configured in lock-step -- for example, smartphones, certain point-of-sale terminals, and ATMs. Generally they tend to be special-purpose systems, "walled gardens," or consumer-oriented systems that demand zero system administration. In fact, many of those systems run Microsoft Windows -- the very stuff the monoculture paper warned us about. But there haven't been meltdowns, beyond the occasional application-specific load-out (one particular bank's ATM network, say, or one wireless provider's smartphones) toppling over briefly. What we see is exactly what we'd expect to see if the monoculture idea were absolutely wrong: whenever a new vulnerability is discovered, some systems topple, some are immune, some quickly react with workarounds, and home users wonder why their personal computers have suddenly gotten a bit slower.

A more formal explanation of why monoculture isn't a problem can be found in Charles Perrow's book Normal Accidents (first published in 1984), in which he analyzes failures in terms of the complexity and interdependence of systems. In Perrow's worldview, a system is "tightly coupled" if the correct function of one component depends subtly on a second, which in turn depends on a third. The greater the degree to which components are interdependent, the more likely they are to experience complex, unpredictable accidents -- accidents that Perrow says are easy enough to understand in hindsight but nearly impossible to model predictively, because the interdependencies are not discoverable in advance of the accident.

Now, consider modern networks, systems, and software in that light: some pieces are interdependent and others aren't. Yes, a lot of systems depend on components such as DNS, but the upper layers "understand" that it's a piece of the system that fails, and try to fail gracefully along with it. You won't, however, see one service provider building deliberate interdependencies with a competitor unless it's angling for a featured spot on FAIL Blog. The systems and networks we depend on are exactly as wobbly and unreliable as they possibly can be, and yet still function; failure is a built-in fact of the environment, and that's why "belt and suspenders" remains the byword of geek chic.

The monoculture argument was, barely concealed, nothing more than an extended whine about Microsoft's market dominance -- and I happen to know that the main authors were all Mac users. I suspect that security was less the real issue than the frustration Mac users felt a decade ago at being blown off by corporate IT. But look what's happened: the technology landscape has changed, and now there are two completely different operating system/application stacks -- neither of which has yet toppled in a catastrophic failure.

That's partly because of market dynamics; it seems that when one vendor gains a sufficiently strong lock on a market, it over-prices and under-innovates until a cheaper, cooler, shinier alternative becomes attractive. The entire history of the computer industry is a swirling jumble in which one company dominates enough to become scary and thereby creates its own competitors -- the way IBM's lock on business computing in the 1970s triggered the departmental computing revolution of the 1980s, and the way "big IT" and system administration in the 1990s justified the "cloud computing" backlash.

Monoculture won't happen because every vendor needs to differentiate its products in the marketplace if there is still room to innovate. The "all the eggs in one basket" scenario you're worrying about is a natural reaction to the vendor-inspired technology fragmentation of the 1980s; it's just the normal ebb and flow of the market.

Marcus Ranum is the CSO of Tenable Network Security and is a well-known security technology innovator, teacher and speaker. For more information, visit his website at

This was last published in November 2010
