- Chris Paget, Contributor
As a security consultant, it's rare that I get to talk about success stories. The infosecurity press is usually full of tales of failure: bugs that have been found, new and interesting techniques for breaking in, and talks cancelled when vendors fail to fix problems. However, shortly before the Black Hat Briefings in Las Vegas in July, an NDA I had signed five years earlier expired -- an NDA that had granted me an unprecedented level of access to the source code, developers and documentation for Windows Vista. Despite its failure in the marketplace, Vista was a real success story in the security world; I couldn't pass up the rare opportunity to talk about that success as a model for others to follow.
Before I begin though, a confession: I don't like Windows. I avoid it wherever I can, preferring Linux on my PCs and smartphones, and BSD on my servers. Of the dozens of computers I own, only two run Windows -- a machine I use to play games, and another for running various bits of test equipment that don't have open source drivers. There are too many things Windows can't do for me, and in my opinion, Microsoft goes too far in making things “usable” for average users at the expense of the real power that lets technical folks like me do our jobs. It's not for me and I avoid it -- but that's not really relevant here. Whether Vista and its descendants were liked is not the point; the question is whether they were secure.
It's a well-known cliché in the security industry that security is a process, not a product. It's an evolving, living thing that needs constant care and feeding; it's not a magical box you can simply drop into a network to make the problems go away. The process that Microsoft designed for securing Windows is called the Security Development Lifecycle (SDL), and my involvement was in the final stage of that process: the FSR, or Final Security Review. This was a final opportunity to catch any nasty bugs before the product shipped; a once-over before release. Let me be clear: SDL was (and continues to be) the gold standard of security processes. I've spent lots of time since Redmond teaching SDL to others; Microsoft used to have a very poor reputation for security, and SDL is how it has turned that around to become one of the best.
One of the reasons Microsoft was able to pay for this review was that it had done some internal work to figure out how much security bugs actually cost. Factoring in the cost of analyzing the flaw, fixing it, running QA tests on the patch, deploying the patch, and all of the other hidden costs is no easy task; Microsoft had arrived at a figure of $250,000 per bug. This made the company's calculations very easy: if we (as expensive consultants) were finding enough bugs that our fee was less than the cost of the bugs we found, then Microsoft was coming out ahead.
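The arithmetic behind that decision is simple enough to sketch. The $250,000 per-bug figure comes from the article; the engagement fee and bug count below are invented purely for illustration:

```python
# Hypothetical illustration of the cost-benefit math described above.
# The $250,000 figure is from the article; the other numbers are made up.
COST_PER_BUG = 250_000  # Microsoft's estimated all-in cost of one shipped security bug

def review_pays_off(bugs_found, consulting_fee):
    """The review pays for itself when the bugs it prevents cost more than the fee."""
    return bugs_found * COST_PER_BUG > consulting_fee

# e.g. a hypothetical $2M engagement that finds 40 bugs averts $10M in downstream cost
print(review_pays_off(40, 2_000_000))  # True
print(review_pays_off(5, 2_000_000))   # False: $1.25M averted < $2M fee
```

On these invented numbers, tripling the consultants (as Microsoft did) makes obvious sense as long as the bug-finding rate holds up.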
After the initial few months of review it turned out that Microsoft was coming out so far ahead that it tripled the number of consultants, tripled the duration of the assessment (I ended up spending about nine months in Redmond), and delayed Vista to facilitate our testing. We all got T-shirts printed with “I Delayed Windows Vista,” a few of which were even autographed by Brian Valentine, the vice president in charge of Vista's development.
FOLLOWING THE RAT HOLES
Before I get into the details of SDL, a little context: When Microsoft hired me (and many others) as consultants to review Vista, the infosecurity world was a very different place. We had seen Code Red, Nimda and Blaster; we hadn't seen Conficker or Stuxnet, or any of the more insidious modern malware that can survive formatting of your hard drive. Ransomware was just emerging as a concept. The idea of malware that encrypts your files and makes you pay to decrypt them hadn't yet caught on, nor had threatening to knock people's websites offline unless they paid up. Lulzsec and Anonymous hadn't appeared; in fact the entire concept of “hacktivism” had yet to reach any significant level of awareness. The TJX, Google and RSA attacks hadn't happened; the term APT hadn't yet been coined. Infosecurity stories were rarely seen in the mainstream press, and yet Microsoft had the foresight to see where the world was heading, and to get a head start on the security problems that were, in many ways, still emerging.
So what was SDL? To begin, by the time my team and I arrived onsite, Microsoft had already completed a high-level risk assessment and software security review. By dividing Vista into features (encompassing small areas of functionality) and asking simple questions such as, “Does this feature handle credentials?” or, “Does this feature process network traffic?”, Microsoft was able to create a list of targets sorted by risk. If your feature processed usernames and passwords, it was higher risk than a feature that didn't; this gave us a great handle on where to start. Every feature also had to provide documentation (such as a threat model and dataflow diagrams) so we knew roughly what we were dealing with; we were each handed a list of high-risk features to review, and the documentation was a great foothold into what each component was supposed to do.
After reviewing the docs, we started the real meat of each feature review. We started by interviewing the project managers, architects and developers; anyone we thought would be useful to us was made available. If we didn't like the answers we were getting (and after a while it became easy to spot the people who were hiding things), then we would dive into the code itself, all the while filing bugs as we went. This gave Microsoft a great level of insight into how the review was proceeding, both in terms of how many features had been reviewed and how many bugs had been filed. Our scope of assessment was “anything new since XP”; this didn't include all of the older legacy code where many of Windows’ more serious problems lay, although later reviews (such as that for Windows 7) were extended to include the entire codebase. As I understand it, this was done purely to limit the size of the engagement; SDL had been gaining momentum within Microsoft for some time and it was felt that incremental progress was the best way forward; I don't disagree. While our scope was intended to cover only the new code, toward the end of the engagement some of my colleagues did venture into some of the legacy code. Let me assure you, there's some real unpleasantness lurking in Windows; I became convinced that a “Windows Light” without any of the legacy content would be a much faster, safer product if Microsoft ever chose to make it.
In general, we were encouraged to “follow the rat holes,” chasing down potential problems for as long as it took to be sure of whether or not problems existed and where they ultimately lay. The official shirt we were given at the end of the engagement reflected this mentality -- after much debate, the quote chosen for the back of the shirt was from H.P. Lovecraft: “Searchers after horror haunt strange, far places.” It nicely reflects how deeply we delved into the code at times, searching for the ultimate answers to our questions; these quests into the innards of Windows often proved fruitful and many serious bugs were identified and fixed as a result.
THE BUG BAR
Ultimately, all of our findings were being ranked against the “Bug Bar” -- essentially a way for Microsoft to consistently classify the severity of any discovered flaw. Each bug was broken down in several different ways -- whether it was local or remote, affected users or administrators, whether it applied in a default configuration, and other such decision points. This became a very handy tool for us; it was easy to demonstrate the properties of each flaw we found, making it easy to prove the severity of the flaw in a consistent manner. It didn't always work though; Microsoft had drawn a line on the bug bar, essentially saying, “Anything above this line must be fixed before we ship the product,” so we still ended up arguing sometimes. In one particular case, the argument came down to “is a Bluetooth vulnerability local or remote?” A remote bug meant it would be fixed, while a local flaw would not be. While it was difficult to prove whether this particular bug was even exploitable (not your average buffer overflow, for sure), after much back-and-forth it was decided the 20-foot range of Bluetooth was “local” and its priority was dropped as a result.
When using a bug bar it's important to remember not every bug will fit neatly into it. New techniques always emerge that don't work the way you expect; it's important to keep some flexibility so you can respond to these kinds of situations. The best advice I can offer is to leave “other” as an option; if you're ever forced to choose it, then you know you have something new on your hands, and that your bug bar has to be updated to account for it. In general, it's a very powerful tool (we certainly appreciated having it), but it's important to keep some flexibility in there for the situations you don't expect.
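A bug bar like the one described above can be thought of as a small, explicit classifier. The decision points (local/remote, user/admin, default configuration) come from the article, but the scoring and threshold below are invented for illustration -- the real bar was Microsoft's and far more detailed. Note the “other” escape hatch:

```python
# Hypothetical bug-bar sketch. The decision points are from the article;
# the scoring and the "must-fix" threshold are invented for illustration.
from dataclasses import dataclass

@dataclass
class Bug:
    vector: str           # "remote", "local", or "other"
    victim: str           # "admin", "user", or "other"
    default_config: bool  # reachable in a default install?

def severity(bug):
    # If a bug doesn't fit the bar, don't force-fit it: flag it and update the bar.
    if "other" in (bug.vector, bug.victim):
        return "needs-triage"
    score = 0
    score += 2 if bug.vector == "remote" else 0
    score += 1 if bug.victim == "admin" else 0
    score += 1 if bug.default_config else 0
    # The "line on the bar": anything at or above it must be fixed before shipping.
    return "must-fix" if score >= 3 else "can-ship"

print(severity(Bug("remote", "admin", True)))  # must-fix
print(severity(Bug("local", "user", False)))   # can-ship
```

The Bluetooth argument in the anecdote above is exactly a fight over which value of `vector` applies -- reclassifying “remote” as “local” moves a bug below the line.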
What did Microsoft get out of the review? Most importantly, it got a lot of very serious bugs fixed. We weren't involved in fixing the flaws we found (although in some cases we did assist with verifying the fixes), which allowed us to cover a lot of ground very quickly, focusing solely on the offensive bug hunting. I don't recall exactly how many bugs we found (and the outstanding terms of my NDA would prevent me from disclosing that number even now), but I do remember multiplying the number by a quarter million dollars per bug, and comparing the result to the cost of our consulting. Microsoft got spectacular value out of the review when looked at this way, and a much more secure product as a result. The company also got some great documentation; development teams had to complete their threat models and other documentation, or risk having their features pulled from the DVD. As consultants, we reviewed all of these threat models (for some of the team who were doing so full time, this was a very onerous task), but I have no doubt the effort was worthwhile.
Further, as a result of many of the bugs we found, Microsoft got to write new signatures and rules for its automated code review tools. The company makes extensive use of automated review, so much of our knowledge about vulnerability classes was translated into rules and applied across the entire codebase. Finally, Microsoft got to validate that its process works -- SDL had proven itself in battle, so to speak, and garnered the company a lot of knowledge about which areas of Windows were less secure than others; a key part of SDL is applying that knowledge to figure out which of your developers need better training, or what that training needs to cover.
Some of the personal insights I took away from the engagement were also very valuable. A few years prior, I released a white paper about a new class of vulnerabilities in Windows, leveraging GDI messages for local privilege escalation. I called these “Shatter Attacks” since they broke Windows; Vista included an attempt to fix these problems, but I wasn't allowed to spend time looking at it. Essentially Microsoft realized it needed to do something about the problem, but couldn't really fix the root issue (since it was largely up to developers to do the right thing). I was told the company knew the fix didn't really work properly, but it was intended as an additional hurdle to cross.
As it turned out, Microsoft was right; Internet Explorer was the main consumer of the technology, and it was broken shortly after release. I did, however, get to meet the developer who wrote the initial fix for my case study example; I had been forced to release a zero-day flaw since Microsoft initially refused to accept the problem, and it turned out my fear of repercussions from Redmond was almost exactly mirrored in the fear of the developer who had to make sure the fix was correct on an extremely short timeline. As it turned out, nobody walked away from that issue with any sense of satisfaction, although I'm proud of the fact that the comment for the fix includes, “We have to check this value because there are very bad people out there.” According to the Windows source code, I'm a “very bad person.” I was disappointed when Microsoft rejected my request to print that one-line comment on a T-shirt.
We were also given a chance to see some of the internal tools used by Microsoft, many of which were being put to extremely good use. The Fuzzer Common Library, for example, was an automated tool used to generate broken input for all manner of file formats and network protocols; if parsers were unable to handle many thousands of test cases from FCL, then they had to be fixed. Other surprises included a remarkable lack of profanity in the source code (in contrast to many closed source code bases that I've looked at); we found it much more interesting to look for “BUGBUG” or “FIXME” comments and read things like “We add 1 here. We don't know why we add 1 here but it breaks if we don't, so we'll add 1 and leave it be.” We ended up with an internal competition to find the most difficult-to-read code and the funniest comments; we found some real gems, and again, it was all valuable insight for Microsoft to look at things after the fact.
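The Fuzzer Common Library itself is internal to Microsoft and I can't reproduce it, but the idea it embodies -- mechanically generating thousands of broken inputs and requiring parsers to survive them -- is easy to sketch generically. Everything below (the `mutate`/`fuzz` names, the mutation strategy) is my own illustration, not Microsoft's tool:

```python
# Minimal mutation-fuzzer sketch illustrating the concept behind tools like
# the Fuzzer Common Library described above; generic, not Microsoft's code.
import random

def mutate(data, flips=4, seed=None):
    """Return a copy of `data` with a few randomly corrupted bytes."""
    rng = random.Random(seed)
    buf = bytearray(data)
    for _ in range(min(flips, len(buf))):
        buf[rng.randrange(len(buf))] = rng.randrange(256)
    return bytes(buf)

def fuzz(parser, sample, cases=1000):
    """Feed many broken variants of `sample` to `parser`; collect any that crash it."""
    crashers = []
    for i in range(cases):
        case = mutate(sample, seed=i)  # seeded, so failures are reproducible
        try:
            parser(case)
        except Exception:
            crashers.append(case)
    return crashers
```

In a real harness the “parser” would be a native component run under a debugger or crash monitor rather than a Python callable, and any input in `crashers` becomes a bug report with a reproducible test case attached.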
RAISING THE BAR
In general, Microsoft did an incredible amount of work making sure Vista was as secure as it could be. True enough, people have continued to find security holes in it (and its descendants), but those bugs are getting harder to find and harder to exploit because of SDL; any sufficiently complex system will always have bugs, and Windows is a big target for attackers. SDL went a long way to raising the bar; I can't speak highly enough of either the process itself or the people who implemented it, and I feel privileged to have been a part of it.
In the end, was Vista secure? That's a hard question to answer, but the results speak for themselves; the bottom line is that vastly fewer bugs were present in Vista because of Microsoft's efforts. SDL has significantly improved things and it keeps getting better; if security really is a process, then Microsoft is nailing it. Bringing in a bunch of hackers and giving us access to their source code was a huge gamble for Microsoft, but it was the right thing to do, and it's paying dividends. There are a lot of companies out there that could learn a few things from Microsoft; it’s come a long way, and the industry-wide adoption of SDL is proof of its success.
Chris Paget is Chief Hacker at Recursion Ventures. Send comments on this article to email@example.com.