RSA 2017: Special conference coverage
SAN FRANCISCO -- Security practitioners have a difficult job when it comes to talking about cyber incidents and vulnerabilities because "loose lips sink ships," according to one expert.
Wendy Nather, principal security strategist at Duo Security, told an audience at RSA Conference 2017 there can be tension between what a researcher might expect from a company after disclosing a vulnerability and a CISO's reality in the security remediation process.
"It used to be that none of the talks you saw [at a security conference] were speakers coming up here and saying, 'This is how we're doing something and it's working really well,'" Nather said, because it would make that company a target. "Then we started seeing more discussions, but they tended to be in the past tense. ... Now we're starting to see more leaders coming forward and really talking about what they're doing now on an ongoing basis and how it's working."
Nather noted that some companies -- including Google and Facebook -- have been more transparent about security remediation processes, while others -- like Target -- have been forced to speak more openly because of serious breaches. But, Nather credited Target for becoming a good role model for retail after its breach and said it now even gives security operations center tours.
"There's more and more discussion now about how things are working in defense and whether they're working well, but at the same time, there's still a lot of risk," Nather said. "And, who's benefiting from taking one for the team? Who is benefiting from getting up here and saying, 'This is how we're doing it'?"
Behind the scenes of security remediation
Nather said there is almost always more going on at an organization when it comes to remediation or secure design than one might expect.
The answer to, "What idiot did this!?" is almost always, "A smart, well-intentioned person making tradeoffs you hadn't even considered." — Jason Specland (@jayspectech) December 14, 2016
"This is especially prevalent in government because they are sitting on a huge base of legacy code going back in some cases hundreds of years. If you're talking about legislation as code, they're sitting on legacy code that they have to make work even though it was written hundreds of years ago," Nather said. "That's just as hard to do and they have to make things work despite the legacy code they're sitting on."
Beyond issues of legacy code, Nather noted legacy systems were designed without anticipating the need for complex passwords.
"Speaking of legacy practices, does anyone remember why we have the silly rule about changing passwords every 90 days? It's because somebody very smart calculated at the time that it would take about 90 days on average to crack your average password," Nather said. "That became so glued into our systems that we're just now starting to push back against it. There is no point in making thousands or millions of users change their passwords on a regular basis just because at some point in the past there was a valid risk scenario."
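The reasoning Nather describes is a simple back-of-envelope calculation: average cracking time is roughly half the keyspace divided by the attacker's guess rate. The sketch below illustrates it with hypothetical figures (the charset, password length and guess rate are assumptions for illustration, not numbers from the talk):

```python
# Back-of-envelope estimate of the average brute-force time for a password,
# the style of calculation behind the old 90-day rotation rule.
# The guess rate and password model are illustrative assumptions only.

def avg_crack_days(charset_size: int, length: int, guesses_per_sec: float) -> float:
    """Average brute-force time: half the keyspace divided by the guess rate."""
    keyspace = charset_size ** length
    seconds = (keyspace / 2) / guesses_per_sec
    return seconds / 86_400  # seconds per day

# An 8-character lowercase password against a hypothetical rate of
# 10,000 guesses per second works out to roughly four months:
print(round(avg_crack_days(26, 8, 10_000)))  # → 121
```

As hardware got faster, the same formula yields ever-shorter cracking times, which is one reason a rotation interval pinned to a decades-old estimate stopped making sense.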
Frequent forced password rotation is a security worst practice. It usually indicates an organization runs on cargo cult security. — Perry E. Metzger (@perrymetzger) February 21, 2017
Nather said mainframes and other legacy systems are likely to stick around for a long time because they are too expensive to replace and generally, the older the legacy system, the more inertia it has. "If it's still in use, it's generally business critical," Nather said.
Additionally, Nather said security design needs to take into account a huge number of variables. For example, when designing multifactor authentication, you need to account for users who lack a cell phone or smartphone, a stable internet connection, or an understanding of how the technology works. A robust design has to cover each of these cases.
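One way to make those design variables concrete is a fallback chain that still leaves a second factor for users without a smartphone, connectivity or technical confidence. This is a hypothetical sketch, not a design from the talk; the factor names and ordering are illustrative assumptions:

```python
# Illustrative MFA fallback selection. Each branch covers a user
# constraint Nather mentioned: no smartphone, no stable internet,
# or no phone at all. Names and ordering are hypothetical choices.

from dataclasses import dataclass

@dataclass
class UserContext:
    has_smartphone: bool
    has_cellphone: bool
    reliable_internet: bool

def pick_second_factor(user: UserContext) -> str:
    if user.has_smartphone and user.reliable_internet:
        return "push-approval app"        # best UX when everything is available
    if user.has_smartphone:
        return "TOTP app"                 # works offline once enrolled
    if user.has_cellphone:
        return "SMS one-time code"        # weakest option, but widely usable
    return "hardware token or printed backup codes"  # no phone at all

# A user with only a basic cell phone and no stable internet:
print(pick_second_factor(UserContext(False, True, False)))  # → SMS one-time code
```

The point of the sketch is that every branch is a support and UI cost the design has to absorb, not an edge case that can be waved away.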
CISOs may also not take into account the hidden cost of bad UI, Nather said, which will increase the need to handle support calls in the future.
"It is a cost to your organization every time you have to do support, and that's a very real business cost," Nather said. "Whereas security, maybe it'll happen, maybe something bad won't happen. You can argue the probability there, but there is a probability of one that if you get a lot of support calls it's going to cost you money."
Disconnect between researchers and CISOs
Security researchers are often critical of organizations because of how long it takes them to do security remediation, Nather said. But, researchers don't always take into account why a company or government entity may need more time to fix an issue.
"CIOs do not have people just sitting around waiting to fix something. They have all their resources tightly planned; every single person is usually at least full time, if not double time with tasks already, especially with things that were planned two years out," Nather said. "You have no idea what's going to happen in five years, but you have to plan for it anyway. Those things are tightly scheduled and you cannot just bring in contractors, especially in government."
Nather noted that legislative mandates can also impose tasks that cannot be delayed in favor of security remediation. Nather said researchers often ignore the fact that "security remediation is more than a code change."
"Researchers have no idea how long it really takes to fix code," Nather said. "They know what it would take them to do the fix, but there's so much more in the process that has to happen after that."
The code change needed to remediate a security issue may only take a relatively short time, according to Nather, but after the code change the organization needs time to schedule quality assurance, to identify any issues those changes introduce, to develop more fixes for those issues and to do more QA testing. In some cases, code changes may take about six weeks, but deploying the changes to production may take up to one year.
Additionally, Nather said security researchers' views of the probability of attack and the risk of remaining vulnerable can differ sharply from those of the organizations they critique. Organizations often end up with "cheeseburger risk management," Nather said, where an enterprise accepts the risk of remaining vulnerable (ignores the health risk of eating cheeseburgers) until something bad occurs (a heart attack), or until another organization suffers the "heart attack" and that event can be used to justify security remediation costs.
"Unfortunately, this is a model that is practiced a lot in security today. There's nothing to say this isn't a reasonable business model, because if it would cost you $1 million a year to do a security program, but you don't get breached until your second year and it costs you $500,000, you came out ahead. So from a business standpoint you can't really argue with cheeseburger risk management in some cases."
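Nather's arithmetic can be checked directly. The sketch below uses only the figures from her example ($1 million per year for a security program, a $500,000 breach in year two) and assumes, as her example implicitly does, that the program would have prevented the breach:

```python
# The cost comparison behind "cheeseburger risk management," using
# Nather's figures: $1M/year for a security program versus accepting
# the risk and absorbing a $500K breach in year two.

def total_cost(program_cost_per_year: int, years: int, breach_costs: list) -> int:
    """Cost over a horizon: annual program spend plus any breach losses."""
    return program_cost_per_year * years + sum(breach_costs)

horizon = 2  # years
with_program = total_cost(1_000_000, horizon, breach_costs=[])        # assumes breach prevented
without_program = total_cost(0, horizon, breach_costs=[500_000])      # breach in year two

print(with_program, without_program)  # → 2000000 500000
```

Over that two-year horizon, accepting the risk comes out $1.5 million ahead, which is exactly why Nather says the model can't always be dismissed on business grounds; the gamble is on the breach's timing and size.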
Learn more about overcoming the challenges of working with legacy code.
Find out why government compliance-based security remediation is failing.
Get info on why defenses may focus more on damage control.