WASHINGTON -- Not many topics in the security community have been discussed, chewed on and dissected as often as vulnerability disclosure. It's become the "Brown Eyed Girl" of security conferences, the old standard that's trotted out when all else fails.
But a trio of vulnerability researchers on a panel at the Gartner IT Security Summit here today managed to put a new spin on the discussion by suggesting that it's time for enterprises to shoulder some of the responsibility for testing applications. By doing their own research before they decide to buy or deploy a new product, IT departments could put pressure on vendors to pay more attention to security, the panelists said.
"One of the ways that we make vendors more accountable is we let them know that we're going to do our due diligence before we deploy a product," said Thomas Ptacek, principal and founder of Matasano Security, a consultancy based in New York. "Enterprises should be using security as one of the criteria in product testing and the purchasing process."
Both security researchers and customers have complained for years about how long it takes many vendors to fix security vulnerabilities once they're notified. Most large software companies, such as Microsoft Corp. and Oracle Corp., have established guidelines for dealing with new flaws, and researchers know roughly how long it will be before a patch is ready. But many smaller vendors have no such processes in place, and it's not uncommon for them to take six months or more to fix a vulnerability.
This has led some researchers to publicly disclose details of a vulnerability before a patch is ready, or to threaten to do so, in order to pressure the vendor into acting more quickly. The practice is less common now than it was a few years ago; many researchers abide by some form of disclosure policy under which they release a few details about the flaw and wait until a fix is available to disclose the rest. This responsible disclosure philosophy has placated many vendors, but some in the research community still don't think much of it.
David Maynor, co-founder and chief technology officer of Errata Security, likened vulnerability disclosure to noticing that the front door to a neighbor's house is open. If you call and tell the neighbor, and tell no one else, that's no disclosure. But if you call your friends, go in, have a party and swim in the pool, that's full disclosure, he said.
"The equivalent of responsible disclosure is you go in, eat some food and try on some of their clothes and then you tell them their door is open," Maynor said. "I'm not a big fan of trying on other people's clothes to be honest."
Chris Wysopal, chief technology officer of Veracode Inc., and a long-time vulnerability researcher, said things aren't always that black and white in the real world.
"Some things take more than a few weeks to fix," he said. "I've waited over a year for things to be fixed because they were serious design flaws."
Wysopal also agreed with Ptacek's contention that enterprises should be more demanding of the vendors they deal with. While everyone is worried about the disclosure policies of individual researchers, he said, who is holding the vendors accountable for the way they handle bugs?
"Some of these vendors are irresponsible. They have to actually communicate with the researchers," he said. "Customers need to hold vendors accountable. Ask your vendors what their policy is when they get a vulnerability notice from a researcher."
Most large enterprises do some level of testing of new products during their buying process, but much of it is focused on the performance of the application and whether it works well with the company's existing infrastructure. That, the panelists said, needs to change if customers expect software makers to sit up and take notice.
"I guarantee you that if an enterprise finds a vulnerability in a product it's about to deploy, they're not going to wait two years for a fix," Ptacek said.