This article can also be found in the Premium Editorial Download "Information Security magazine: Compliance vs. security: Prevent an either-or mentality."
Software developers are an intriguing mix of artist and engineer. This is what makes them so creative, but is also the root cause of an epidemic in today's applications--poorly engineered code.
The problem is that software development is not treated like other engineering disciplines. The core concepts are taught, but the rigor is not required prior to becoming a professional. Universities teach courses on cryptography and algorithms, but almost none teach quality or security as part of their software engineering or computer science majors.
To exacerbate the problem, we as an industry do not demand certifications, engineer-in-training (EIT) status or residency programs the way other engineering disciplines do. As a result, we build houses of straw, just like the three little pigs in that old fairy tale. We are arrogant enough to think the house of straw will stand when the Big Bad Wolf comes a-blowing. Our software developers lack security training and discipline, and we've let them get away with it.
And tools are not going to solve this problem.
In fact, they won't even help--not until we know how to use them. I didn't become a better mechanical design engineer because I learned how to use AutoCAD; the tool was simply something that helped me do my job more efficiently--the job I was trained to do properly. Even then, I wasn't allowed to lead a project until I had passed an industry-endorsed certification exam and studied under a certified engineer for five years.
I know, you think we can't wait that long to roll out well-qualified software engineers, right? The rate of technology adoption and demand for developers compresses the time-to-market window for new recruits--and that is exactly why we have such low-quality, insecure software today.
We've got a start on good software development methodologies. Programs like the Capability Maturity Model (CMM) aim to measure how repeatable a software development effort is and how well it documents its process. Unfortunately, there is no correlation between CMM level and the security of the code produced--whether the code is good or bad, the only thing you can be confident of is that it will be consistently good or bad.
Microsoft's Security Development Lifecycle (SDL) is also a step in the right direction, providing specific activities for a development team to integrate security into their process. But this still lacks practical implementation guidelines for how organizations can meld it with the way they create code today.
Neither CMM nor SDL--nor any other methodology--addresses the core problem: our developers lack training. They need to be trained on the job and off, in school and as part of their continuing education after graduation.
The Big Bad Wolf is not the hacker who gets press by doing things like breaking into Paris Hilton's cell phone. It is terrorist organizations that can wreak substantial havoc on national infrastructure, utilities and transportation systems. Hackers actually help raise awareness and identify problems by tripping the land mines that poorly written software leaves behind.
This is not to praise criminal activity, but companies should learn a lesson from hackers: your software is insecure, and you need to do something about it. And remember, it was your poorly developed software that let the hackers in to begin with.
This was first published in March 2007