This content is part of SearchSecurity's RSA Conference 2015 special coverage: news, analysis and video.

McGraw: Software security testing is increasingly automated

One of the challenges of software security, Gary McGraw, CTO of Cigital Inc., recently said, is that "we kind of know what to do in some sense, but scaling that activity across a huge enterprise can be a challenge." The key to progress, said McGraw, a top expert in the field of secure coding, is "automating a pretty standard approach across the entire portfolio."

In this interview, recorded at the 2015 RSA Conference, SearchSecurity editorial director Robert Richardson sat down with McGraw to discuss the prospects for automation. "Both dynamic black-box testing and guided testing using scripts and fairly straightforward simple code review can all be automated," McGraw said. The people charged with carrying out the reviews need to be "directed to automate everything that can possibly be automated."

Increasingly, McGraw argued, testing has to cover all of the apps in an enterprise's application portfolio. "Too often the first approach is to [ask] 'what are my highest risk applications?' and look at those alone. But if all of your low-risk applications have a million vulnerabilities, that's going to take you down."

McGraw acknowledged that looking for bugs that cause software security problems was much more tractable as an automation project than trying to grapple with vulnerabilities caused by flaws in the application's underlying design and architecture: "Trying to scale anything like threat modeling or architectural analysis has been a massive, massive problem."

That said, McGraw noted that "there are some aspects of the threat modeling process and architecture analysis process that you can automate. For example, if in your portfolio tech stack you always rely on a similar set of COTS or similar set of modules or libraries with certain sets of flaws, you can do a very simple-minded look and say well, there are three apps that use that library, that library is susceptible to this kind of flaw, therefore I'm going to look for it."
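The portfolio check McGraw describes can be sketched very simply: cross-reference each application's dependency list against a list of libraries with known flaws, and flag the apps that need a look. The library names, flaw data and function names below are purely illustrative assumptions, not a real inventory or a real tool.

```python
# Hypothetical sketch of the "simple-minded look" McGraw describes:
# flag every app whose tech stack includes a library with a known flaw.
# All names and data here are made up for illustration.

KNOWN_FLAWS = {
    "libxml-1.2": ["XXE injection"],
    "authkit-0.9": ["session fixation"],
}

PORTFOLIO = {
    "billing-app": ["libxml-1.2", "ormlib-2.0"],
    "hr-portal": ["authkit-0.9"],
    "static-site": ["templater-3.1"],
}

def apps_to_review(portfolio, known_flaws):
    """Return {app: [(library, flaw), ...]} for apps using flawed libraries."""
    findings = {}
    for app, libs in portfolio.items():
        hits = [(lib, flaw)
                for lib in libs if lib in known_flaws
                for flaw in known_flaws[lib]]
        if hits:
            findings[app] = hits
    return findings

for app, hits in apps_to_review(PORTFOLIO, KNOWN_FLAWS).items():
    for lib, flaw in hits:
        print(f"{app}: {lib} is susceptible to {flaw}; schedule a review")
```

A real version of this check would pull dependency data from build manifests and flaw data from a vulnerability feed, but the cross-referencing step itself is exactly this mechanical, which is why it automates well.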

Join the conversation



Do you use automated testing to help ensure that your software is secure?
Automate black box testing and code reviews? What? The entire purpose of those activities is to have a human thoughtfully perform them.
This is the key phrase: "The people charged with carrying out the reviews need to be 'directed to automate everything that can possibly be automated.'"

Critical thinking, analysis, investigation and judgment are impossible to automate.

But now let's look at the other aspect: cost versus risk. The business might want to limit spending on security testing (and automation) because it is comfortable with the risks.
I think we need to be very careful when we use the word "automate" here with security. There are tools that can help; however, most of them are static analyzers, which read code and flag common problems. Those are fairly easy to find, and the findings should be reviewed periodically by the devs so they can learn not to create the flawed code in the first place.
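As an illustration of the kind of "common code problem" a static analyzer can flag mechanically, here is a minimal sketch using Python's ast module to find calls to eval(), a well-known injection risk. The sample source and the check itself are hypothetical, far simpler than any real analyzer.

```python
# Minimal static-analysis sketch (illustrative only): walk a Python AST
# and report the line numbers of calls to eval(), a common injection risk.
import ast

SOURCE = '''
def handler(user_input):
    return eval(user_input)  # dangerous: evaluates untrusted input
'''

def find_eval_calls(source):
    """Return the line numbers of every direct eval() call in the source."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "eval"):
            findings.append(node.lineno)
    return findings

print(find_eval_calls(SOURCE))
```

This is exactly the pattern-matching commenters describe: it catches a known bad construct reliably, but it has no judgment about whether a novel design is actually unsafe.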

However, there are also other tools that let you get a peek at what's going on during normal use: a dynamic test. I'm afraid that too many are buying this myth that you can automate away all of your security testing.

Automation is a tool, but like any tool, it won't nail anything if a human doesn't operate it.
Yes, trying to. Are there specific test cases or tools that can be used for this?

I have in the past, but now I am in a small shop and management will not put out the cash for those types of things. Right now I am fighting to get a change management system in place. They do not see the ROI on the software purchases.

A bit of a side note: testing in itself cannot ensure that the software is secure unless the bugs found are fixed and retested.
Automation also cannot identify newly appearing weaknesses.
I'm not sure I understand. How exactly do you automate dynamic black box testing? I thought part of the problem was that many of the exploits that are zero day haven't been discovered yet. You can't write code to anticipate strange new quirks introduced by some third-party library's upgrade, just to give an example. I agree that automated checks can do a lot, but I'm sorry to say, developer training and penetration testing are likely going to be more important than automation. (And it's NOT testing. It's an automated check of some pattern that can be identified. It cannot have that aha moment and recognize a new pattern, at least not yet. Maybe when we have AIs doing work for us.)
Veretax hit the nail on the head... that was my first thought exactly! Automate black box testing and code reviews? What? The entire purpose of those activities is to have a human thoughtfully perform them.
Sorry -- I didn't see these comments before now. I can't really speak for Gary, but I don't *think* he'd say that the need for human analysis goes away, just that it won't scale to all the applications in a big organization. Whether automated black box testing is enough to cover lower-risk apps is plenty debatable, but on the other hand, it's more than most organizations are doing at present.
I may agree in the sense of checking for known problems and simulating "brute force" attacks. However, no testing, whether assisted by tools or not, can prove the absence of defects.
No argument from me on that.