I think there are two ways to apply the Top 25. When you are testing, before you deliver or deploy the software, make sure you have a testing process that can check for all the defects or bugs in the Top 25, whether through software development tools, a manual process or other technologies; automation is always better. On the developer side, both the architects designing the software and the developers building it should look at the Monster Mitigation list and make sure they are using those techniques, because they will prevent many different flaws from getting into the software. You have to be preventative, and you have to test to see if anything got in.

Does the issue of speed and getting a project completed on time make it difficult to apply both preventative techniques and testing?
The biggest roadblock to secure software is actually just doing the work in the process. We have these lists, we have the information about what not to do, we have the mitigation list to prevent security errors, and there are good static and dynamic testing tools out there. The challenge has been trying not to disrupt the development process and not to make it take longer or require more people to do the job. To me, the challenge is: how can we insert this information and testing into the development process in a lightweight way and still have the software done on time? That's really the challenge facing the industry right now.
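As a concrete illustration (my example, not from the interview), here is the kind of Top 25 defect such testing aims to catch: SQL injection (CWE-89), shown alongside the standard parameterized-query mitigation, sketched in Python with sqlite3.

```python
import sqlite3

def find_user_unsafe(conn, name):
    # CWE-89: attacker-controlled input concatenated into SQL.
    # A value like "x' OR '1'='1" makes the WHERE clause always true.
    return conn.execute(
        "SELECT id FROM users WHERE name = '" + name + "'"
    ).fetchall()

def find_user_safe(conn, name):
    # Mitigation: a parameterized query; the driver binds the value
    # as data, so it can never change the query's structure.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "alice"), (2, "bob")])

payload = "x' OR '1'='1"
print(len(find_user_unsafe(conn, payload)))  # 2 -- injection leaks every row
print(len(find_user_safe(conn, payload)))    # 0 -- treated as a literal name
```

Both static analyzers (spotting the string concatenation) and dynamic tests (sending the payload) can catch the unsafe version, which is why the interview stresses having both.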
Asking them to eliminate all 25 programming errors is going to be a challenge. Veracode Inc. doesn't have a black-and-white rating system for software. Only for the highest-assurance software, which we call level-five, business-critical software, do we say you need to eliminate all 25. Certainly there's a need for lesser standards for software that is not business critical, that is not running the stock exchange or flying a plane. You can't have a one-size-fits-all list. So I recommend that the target get progressively easier for vendors to hit as the software becomes less critical to life or limb.

A few years back, when Hewlett-Packard Co. acquired SPI Dynamics Inc., IBM Internet Security Systems acquired Watchfire Corp. and Microsoft began really pushing secure software development, did the environment change at all?
When I started at Veracode three and a half years ago, we told potential customers that they needed to do static analysis in their software development lifecycle, and a lot of people didn't know what that was. They didn't understand why they needed to be thinking about security as they were building the software. Some companies got it, like Microsoft and some of the other large product companies, but 95% of the people building software looked at me like I had two heads. That has changed a lot. I think the OWASP Top 10, the Top 25 and other ways of describing the problem that focus on talking to developers have made them aware that if they don't do something as part of the software development process, they're going to end up with these vulnerabilities in their products. I also think it has made customers aware, so they can ask developers to do something about it.
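To give a flavor of what static analysis means here (my illustration, not Veracode's technology), an analyzer inspects source code without executing it and flags risky constructs. A toy version in Python, using the standard-library `ast` module to flag calls to `eval` and `exec`:

```python
import ast

# Names whose calls we treat as risky (e.g. CWE-95, code injection).
# A real analyzer tracks data flow; this sketch just matches call names.
RISKY_CALLS = {"eval", "exec"}

def find_risky_calls(source):
    """Return (line_number, function_name) for each risky call found."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in RISKY_CALLS):
            findings.append((node.lineno, node.func.id))
    return findings

sample = "x = input()\nresult = eval(x)\n"
print(find_risky_calls(sample))  # [(2, 'eval')]
```

Because the check runs on the syntax tree rather than on a running program, it slots into the build or code-review step, which is the "lightweight insertion into the development process" the interview describes.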
It has in certain verticals. We've seen it in financials and in government. Whether those organizations are building software internally, outsourcing it or buying some sort of custom software, they're really aware that they need to put requirements on the development teams to focus on security, because if they don't, they just know it won't be [secure].

Is there another issue where older legacy software is the cause of many major problems? Development teams are forced to go back and take a look at the software, and some of them don't have the source code.
That is a major problem. Michael Howard at Microsoft has said that the highest correlation to the number of security bugs in a piece of software is its age. The older the software, the more likely it is problematic, because it was built in a way people didn't understand, whether at the platform level, at the language level or in the libraries used, and no one was thinking about security. So the older the software gets, the worse it is.

That is a big challenge, because the way we build software is to reuse old code. No one builds software from scratch; that's pretty rare unless it's for a completely new platform or something entirely new. In most businesses there's a lot of reuse: shared libraries, or routines that have been around for a while. If you put in a process to secure your application, you can't ignore the fact that it's not just the new code. You can't put something in a developer's hands and say that whenever you write a line of code you'll make sure it's secure, because that's only going to solve half the problem. You have to solve the problem of all the old code that's being reused.

The other part of it is that there are whole packages of software running in organizations that haven't been touched in years. That is a ticking time bomb, a problem that companies are going to have to address eventually.