Point: Marcus Ranum
Before we get started, I need to confess my biases and background: I've been a coder, project leader, VP of engineering, CTO and CEO -- I've held every job in the software task tree. I'm going to make a few assertions in this column that I won't have room to back up in detail, but they're facts and you should accept them as such. Most of what we need to know for this discussion is summarized in this observation by the co-inventor of the buffer overflow, Brian Kernighan: "Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."
Finding security holes in software is harder than debugging. And finding a hidden security trapdoor in software would be even harder.
So it follows from this assertion that if you don't know how to write code at all, you're lunchmeat if anyone, anywhere, is able to inject malicious code into your software supply. In fact, the current primary mode of software production (please don't call it "engineering") is the "mashup" -- a process by which applications are constructed out of other live applications, which are often large conglomerates of other applications, etc. The result is a software supply chain in which processing is dynamic and the behavior of a high-level program can be changed by the owner of one of its components. Simply put, that means that whoever owns code you depend on, owns your data.
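To make the "whoever owns code you depend on, owns your data" point concrete, here is a minimal sketch -- with invented names and data -- of a high-level program whose behavior is silently changed by a dependency it trusts. The application passes "it works," which is exactly Ranum's warning:

```python
# Hypothetical sketch: a "mashup" application trusts a serializer that
# arrives as a transitive dependency. The dependency's owner adds a
# covert copy of every record; the application never notices.

captured = []  # stands in for an attacker-controlled collection point


def third_party_serialize(record):
    """Imagine this function ships inside a package pulled in three
    levels down the dependency tree."""
    captured.append(dict(record))  # the covert exfiltration step
    return ",".join(f"{k}={v}" for k, v in record.items())


def application_save(record):
    """The 'high-level program': it simply calls its dependency."""
    return third_party_serialize(record)


row = application_save({"user": "alice", "ssn": "000-00-0000"})
# The visible result is correct -- "it works" -- while the dependency
# has already siphoned off a copy of the data.
```

The point of the sketch is that nothing in the application's own code changed; ownership of one component was enough.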
The operating environments of choice today operate similarly -- the device driver for your USB keyfob or graphics card was written by a contractor to a subcontractor; it runs in kernel space and can access any process currently running in system memory. Whoever owns your device driver owns your keyboard, your hard drive and your encryption keys -- and, often, nobody knows who that is, because the driver came with the hardware's OEM bundle.
In short, we talk like we're concerned about data leakage and information security, but our behavior says otherwise. And it's interesting to watch how the rest of the world has been dealing with the same problem. For foreign powers, the problem was the fact that everything, eventually, is touched by code from Microsoft and microcode from Intel. It's every counterintelligence officer's nightmare: all your secrets are eventually handed over to a trade-secret-protected mass of software and hardware produced by a country that has its own history of playing dirty tricks with technology. In the early 1990s, the Europeans made a few muted whimpers about the topic, but since then it seems everyone has fallen silent.
But now -- if we're smart -- it's our turn to worry. Nobody in the government writes software, anymore -- it's all outsourced. And because of the push toward using commercial off-the-shelf (COTS) software wherever possible, there's no dividing line between specialized code that does something important and the general-purpose code that automates an unclassified supply chain application -- it's all the same stuff, from the same people, and it's being fielded to non-programmers. That's the part where it all breaks down -- someone who doesn't know how software works (at least well enough to write it) doesn't know enough to tell if software might be misbehaving. "It works" is the only criterion non-programmers are capable of holding the software to. If you're not a programmer, you can't even imagine all the possible covert channels I could come up with to leak data through your firewall. We're beginning to see the size and shape of the elephant, thanks to malware writers and bot-herders, but I think the immortal words of Kurt Vonnegut are appropriate here: "Everything will get unimaginably worse and never get better again."
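As one illustration of the covert channels a programmer "could come up with," here is a hedged sketch of a classic one: tunneling data out through DNS-style lookups, which most firewalls wave through. The domain and helper names are hypothetical, and nothing here touches a network -- the code only builds and decodes the query strings such a channel would use:

```python
# Hypothetical covert-channel sketch: a secret is hex-encoded and split
# into hostname labels, so exfiltration looks like ordinary DNS traffic
# to a domain the attacker controls (here the placeholder example.com).

import binascii


def encode_queries(secret: bytes, domain: str = "example.com", chunk: int = 16):
    """Split the secret into hex chunks disguised as hostname labels."""
    hexed = binascii.hexlify(secret).decode()
    return [f"{hexed[i:i + chunk]}.{domain}" for i in range(0, len(hexed), chunk)]


def decode_queries(queries):
    """What the attacker's nameserver would reassemble on the far side."""
    hexed = "".join(q.split(".")[0] for q in queries)
    return binascii.unhexlify(hexed)


queries = encode_queries(b"launch codes")
# Each query is an innocuous-looking hostname; together they carry the
# secret, and the round trip recovers it exactly.
```

To a non-programmer watching the wire, this is just a handful of name lookups -- which is Ranum's point about "it works" being the only test such a user can apply.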
We're in the early stages of the government's IT death-spiral: it's impossible for the government to attract the kind of technical people it needs, because those people can make three times as much doing the same work as contractors, which means outsourcing is now the only option that remains. But here's the problem: you need experienced programmers to at least glance at a code deliverable to see if it's any good and to verify that the contractors actually accomplished what they were supposed to. So what do you have all over federal IT today? Contractors reviewing other contractors' work to judge whether it's acceptable. The only way to tell if you've bought a load of crappy code is to have a good programmer look at it (because a bad programmer will look at it and think "I can learn from this..."). You cannot be in the IT business without good programmers at the top of your technical food chain.
What does it mean? It means that if I could say one sentence to Barack Obama, it would be, "Sir, our government's extreme reliance on outsourcing software development to third parties is a threat to national security."
Marcus Ranum is the CSO of Tenable Network Security and is a well-known security technology innovator, teacher and speaker. For more information, visit his website at www.ranum.com.
Counterpoint: Bruce Schneier
Information technology is increasingly everywhere, and it's the same technologies everywhere. The same operating systems are used in corporate and government computers. The same software controls critical infrastructure and home shopping. The same networking technologies are used in every country. The same digital infrastructure underpins the small and the large, the important and the trivial, the local and the global; the same vendors, the same standards, the same protocols, the same applications.
With all of this sameness, you'd think these technologies would be designed to the highest security standard, but they're not. They're designed to the lowest or, at best, somewhere in the middle. They're designed sloppily, in an ad hoc manner, with efficiency in mind. Security is a requirement, more or less, but it's a secondary priority. It's far less important than functionality, and security is what gets compromised when schedules get tight.
Should the government -- ours, someone else's? -- stop outsourcing code development? That's the wrong question to ask. Code isn't magically more secure when it's written by someone who receives a government paycheck than when it's written by someone who receives a corporate paycheck. It's not magically less secure when it's written by someone who speaks a foreign language, or is paid by the hour instead of by salary. Writing all your code in-house isn't even a viable option anymore; we're all stuck with software written by who-knows-whom in who-knows-which-country. And we need to figure out how to get security from that.
The traditional solution has been defense in depth: layering one mediocre security measure on top of another mediocre security measure. So we have the security embedded in our operating system and applications software, the security embedded in our networking protocols, and our additional security products such as antivirus and firewalls. We hope that whatever security flaws -- either found and exploited, or deliberately inserted -- there are in one layer are counteracted by the security in another layer, and that when they're not, we can patch our systems quickly enough to avoid serious long-term damage. That is a lousy solution when you think about it, but we've been more-or-less managing with it so far.
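Schneier's description of defense in depth can be sketched in a few lines. This is a toy model with invented rules and payloads, not any real product's logic; it only shows the structural idea that several individually mediocre filters, stacked, catch what each one alone would miss:

```python
# Toy defense-in-depth sketch: three imperfect, independent layers.
# Each rule is deliberately crude; the point is the stacking, not the rules.

def layer_firewall(msg):
    """Network layer: blocks one obvious attack pattern, misses the rest."""
    return "DROP TABLE" not in msg


def layer_app_validation(msg):
    """Application layer: blocks a different pattern the firewall passes."""
    return "<script>" not in msg


def layer_antivirus(msg):
    """Endpoint layer: a third imperfect filter, overlapping the others."""
    return "EICAR" not in msg


LAYERS = [layer_firewall, layer_app_validation, layer_antivirus]


def defense_in_depth(msg):
    """Traffic survives only if *every* layer lets it through -- so the
    stack fails only where all three mediocre filters have a common gap."""
    return all(layer(msg) for layer in LAYERS)
```

The lousiness Schneier notes is visible in the model: an input crafted to slip past all three rules at once still gets through, and patching is the only remedy.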
Bringing all software -- and hardware, I suppose -- development in-house under some misconception that proximity equals security is not a better solution. What we need is to improve the software development process, so we can have some assurance that our software is secure -- regardless of what coder, employed by what company, and living in what country, writes it. The key word here is "assurance."
Assurance is less about developing new security techniques than about using the ones we already have. It's all the things described in books on secure coding practices. It's what Microsoft is trying to do with its Security Development Lifecycle. It's the Department of Homeland Security's Build Security In program. It's what every aircraft manufacturer goes through before it fields a piece of avionics software. It's what the NSA demands before it purchases a piece of security equipment. As an industry, we know how to provide security assurance in software and systems. But most of the time, we don't care; commercial software, as insecure as it is, is good enough for most purposes.
Assurance is expensive, in terms of money and time, for both the process and the documentation. But the NSA needs assurance for critical military systems and Boeing needs it for its avionics. And the government needs it more and more: for voting machines, for databases entrusted with our personal information, for electronic passports, for communications systems, for the computers and systems controlling our critical infrastructure. Assurance requirements should be more common in government IT contracts.
The software used to run our critical infrastructure -- government, corporate, everything -- isn't very secure, and there's no hope of fixing it anytime soon. Assurance is really our only option to improve this, but it's expensive and the market doesn't care. Government has to step in and spend the money where its requirements demand it, and then we'll all benefit when we buy the same software.
Bruce Schneier is chief security technology officer of BT Global Services and the author of Schneier on Security. For more information, visit his website at www.schneier.com.