A new survey conducted by the SANS Institute and sponsored by Illumio Inc. broke down common attack vectors and pain points in cybersecurity strategy. Illumio's takeaway was that cybersecurity needs to get more dynamic and adaptive. Experts tend to agree on that, but don't agree on how to achieve adaptability.
The survey, "The State of Dynamic Data Center and Cloud Security in the Modern Enterprise," polled 430 security and risk professionals across a range of business sizes. It found that 63% of respondents had experienced at least one breach resulting in data loss over the past 24 months, and that 44% of those who had experienced a breach and were able to share their experience suffered the loss of sensitive data.
Illumio found that fears about attacks didn't always match the reality of attacks. While 68% of respondents feared attacks that took advantage of access management vulnerabilities, only 18% blamed access management as the actual root cause of compromise in breaches.
Sumit Agarwal, co-founder and vice president of product at Shape Security Inc., in Mountain View, Calif., said the fear was likely more a fear of public embarrassment than a fear of compromise.
"It's understandable that IAM has the attention of many in the security community," Agarwal said. "Many of the most highly publicized breaches over the last couple of years were traced to poor IAM hygiene. When these errors make the news, it is highly embarrassing for those responsible."
Topping the list of attack vectors that led to compromise were application vulnerabilities, with 50% of respondents placing blame there.
Alan Cohen, chief commercial officer for Illumio in Sunnyvale, Calif., said this was due in large part to the speed of development.
"Increasingly, people build applications in a fast, agile manner," Cohen said. "On one hand, you get DevOps and you get really fast application development, but the security technology to protect that has not kept pace. The ability to write code is a lot faster than the ability to protect code and protect data."
Michael Taylor, applications and product development lead for Indianapolis-based Rook Security Inc., said that secure development has to be emphasized from the beginning.
"Attempting to shoehorn security into an application as an afterthought or post-breach is inefficient from a cost perspective in development time," Taylor said. "Thorough testing of applications developed in-house utilizing the same methods that an attacker would use should also be included as a standard operating procedure for companies that value security."
Regardless of attack vector, the survey showed that enterprises still struggle when it comes to containing an attack and remediating vulnerabilities afterward. Only 37% of respondents were able to contain an attack within eight hours of detection, a figure the survey attributed to the relative speed with which IT could quarantine "application-centric or malware-based infections."
Cohen said that while this number could be seen as recognition that security professionals have made improvements and are more vigilant today, the real challenge is in the 63% who couldn't contain the attack in the first eight hours. But he also acknowledged that the moment an attack is first detected may be nowhere near when the attack actually began.
"When do you recognize you have a cold -- the minute the cold virus first infected your system or when you first show symptoms?" Cohen asked. "Let's say a piece of malware gets on a server; it may sit there for six months, and in that six months, it may not be doing any harm. It's just effectively dormant." He said that "when it wakes up and starts to take action is probably the best you can hope for to be able to do something about it."
According to Cohen, "east-west" traffic will be the new battleground for containment.
"On one hand, these things need to communicate with each other to actually do their job as a multi-tier application. On another, you must have the ability to control it. If you don't lock down the communications between the application servers, once they're breached, everything in that application is wide open."
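Cohen's point about locking down communications between application servers can be illustrated with a minimal default-deny allowlist model. The tier names, ports and rules below are hypothetical, chosen only to show the idea: each tier may talk to explicitly permitted peers, and everything else is blocked by default.

```python
# Sketch of east-west segmentation as a default-deny allowlist.
# Tier names, ports and flows are illustrative assumptions.

# Each entry: (source tier, destination tier, destination port) that is allowed.
ALLOWED_FLOWS = {
    ("web", "app", 8080),   # web tier may call the app tier's API
    ("app", "db", 5432),    # app tier may query the database
}

def is_allowed(src_tier: str, dst_tier: str, dst_port: int) -> bool:
    """Default deny: only explicitly allowlisted flows pass."""
    return (src_tier, dst_tier, dst_port) in ALLOWED_FLOWS

# A compromised web server trying to reach the database directly is blocked,
# even though it sits "inside" the same application:
print(is_allowed("web", "app", 8080))  # True
print(is_allowed("web", "db", 5432))   # False
```

Under this model, breaching one server no longer leaves "everything in that application wide open" -- lateral movement has to pass the same policy check as any other flow.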
However, the survey also showed that containment was not the only area in which IT departments were moving slowly. When it came to changing security controls, 56% of respondents said it took less than two weeks to have a security change control configured, approved and applied, while 35% said it took longer than two weeks and 9% had no idea how long it took.
Cohen said this proved security controls change too slowly, and that the best option would be to re-architect security so that it adapts to change.
"Most data center security technologies are based on chokepoints in the network. They are manually configured and managed. As computing becomes more dynamic and distributed -- e.g., containers, cloud -- it is managed through orchestration and automation," Cohen said. "Security must follow a parallel model, and have the ability to detect and respond to computer power spinning up, down or moving. Moreover, traditionally, security is managed by a dedicated team in a silo, separately from the development and operations/infrastructure teams. As DevOps speeds up the application development and onboarding process, security must operate at the same pace."
While experts agreed that cybersecurity strategy needs to change in order to be faster, more dynamic and more adaptive, how to best do that was a point of disagreement.
Robert Brown, director of services for Verismic Software, advocated for increased use of cloud and subscription services.
"I have seen evidence of the vast infrastructure these companies use, like Dropbox and Office365, which have the crème de la crème of resources and technology working to keep your data and service as safe as possible. I'll encourage everyone to use the cloud, and I see very little to sway me otherwise."
Agarwal and Taylor both said the trust models that security teams have been using need to be rethought. Agarwal said the move to more virtualization should make resource use and reconfiguration more dynamic, while Taylor suggested not trusting any network traffic by default.
"Instead of building a walled network in which everyone inside is trusted, you inherently mistrust all network traffic and design your systems accordingly," Taylor said. "This prevents the overreliance on firewalls and IDS/IPS solutions, and encourages application developers to create more robust and security-conscious code. Too frequently, developers assume that because their application will only be accessed from within the 'trusted' network that they do not need to think about securing it from attacks coming from that vector."
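One way to picture Taylor's "mistrust all network traffic" model is to require every request, internal or external, to carry a verifiable credential. This is a minimal sketch using HMAC signatures from Python's standard library; the shared key and message layout are illustrative assumptions, not a production key-management scheme.

```python
import hashlib
import hmac

SHARED_KEY = b"example-secret"  # hypothetical; real systems use managed, per-service keys

def sign(message: bytes, key: bytes = SHARED_KEY) -> str:
    """Produce an HMAC-SHA256 signature for a request body."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(message: bytes, signature: str, key: bytes = SHARED_KEY) -> bool:
    """Accept a request only if its signature checks out -- regardless of
    whether it arrived from 'inside' or 'outside' the network perimeter."""
    return hmac.compare_digest(sign(message, key), signature)

body = b'{"action": "read", "resource": "orders"}'
sig = sign(body)
print(verify(body, sig))        # True: credential checks out
print(verify(b"tampered", sig)) # False: rejected even if it came from the LAN
```

The design point is that the check depends on the credential, not on where the packet came from, so a foothold inside the "trusted" network confers no automatic access.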
Brad Hibbert, CTO at Phoenix-based BeyondTrust Inc., said a layered security model will continue to be essential, but each layer should not be viewed in isolation and threat analytics will be key.
"Advanced persistent threats often go undetected because traditional security analytics solutions are unable to correlate diverse data to discern hidden threats. Seemingly isolated events are written off as exceptions, filtered out or lost altogether in a sea of data," Hibbert said. "The intruder continues to traverse the network, and the damage continues to multiply. Taking a more holistic view of security enables information and context to be shared across the various layers, enabling tightened and more dynamic security controls, while, at the same time, minimizing impact on the business. The goal is to enable IT and security staff to reveal previously overlooked cases of user, account and asset risk, and take appropriate actions -- faster."
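Hibbert's argument -- that seemingly isolated events only reveal a threat when correlated across layers -- can be sketched as simple per-asset risk aggregation. The event stream, layer names and scores below are hypothetical, invented to illustrate the idea: no single event crosses the alert threshold, but their sum for one asset does.

```python
from collections import defaultdict

# Hypothetical event stream: (asset, security layer, risk score).
# Each event alone looks like a minor exception.
EVENTS = [
    ("server-7", "endpoint", 2),  # odd process launch
    ("server-7", "network", 3),   # unusual outbound connection
    ("server-7", "identity", 3),  # off-hours privileged login
    ("server-9", "network", 2),   # a one-off anomaly elsewhere
]

ALERT_THRESHOLD = 6  # illustrative tuning value

def correlated_risk(events):
    """Sum risk per asset across layers instead of judging events in isolation."""
    totals = defaultdict(int)
    for asset, _layer, score in events:
        totals[asset] += score
    return {asset: score for asset, score in totals.items()
            if score >= ALERT_THRESHOLD}

print(correlated_risk(EVENTS))  # {'server-7': 8}
```

Filtering each event on its own would write all four off as noise; correlating them surfaces server-7 as the asset an intruder is traversing.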