Breach shows poor website security testing

Experts say the latest security breach of the website was caused by a lack of security process maturity and by administrators downplaying the importance of website security testing.

According to experts, the latest security breach of the website underscores a continuing lack of adequate penetration testing and vulnerability assessment on the troubled online health exchange by federal IT managers and their contractors.

Breach recap

The most recent breach, first detected in late August and reported on Sept. 4, was described by federal administrators as "an intrusion on a test server" involving the installation of malware developed to initiate a denial-of-service (DoS) attack on other websites.

A spokesman for the Centers for Medicare and Medicaid Services in Baltimore, which runs the website, told The New York Times that the test server did not contain consumers' personal information, no data was transmitted and the website was not specifically targeted.

Nevertheless, the agency reportedly acknowledged that the test server should not have been connected to the Internet, the server manufacturer's default password had not been changed and administrators had failed to conduct standard security scans of the server.
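One of the misconfigurations the agency acknowledged was an unchanged vendor default password. A minimal sketch of the kind of check a standard security scan would include is below; the list of default credentials and the function name are illustrative, not drawn from any particular scanning tool.

```python
# Hypothetical sketch: flag accounts still using vendor default passwords,
# one of the misconfigurations cited in the test server breach.

# A small sample of well-known vendor default credentials; a real audit
# would pull from a maintained database, not a hard-coded set.
KNOWN_DEFAULTS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "toor"),
    ("tomcat", "tomcat"),
}

def find_default_credentials(accounts):
    """Return the (username, password) pairs that match known vendor defaults.

    `accounts` is an iterable of (username, password) tuples harvested
    from a configuration audit.
    """
    return [pair for pair in accounts if pair in KNOWN_DEFAULTS]

if __name__ == "__main__":
    audit = [("admin", "admin"), ("deploy", "s3cr3t-r0tated")]
    for user, _ in find_default_credentials(audit):
        print(f"WARNING: account '{user}' still uses a vendor default password")
```

Even a check this simple, run routinely against every server before it is exposed to the Internet, would have caught the unchanged default password administrators later admitted to.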

The incident is thought to have been part of a broader DoS attack effort in which hackers managed to upload malware to a development server used to test code. The server was not configured properly and was not supposed to be connected to the Internet.

The latest breach, which follows several past security incidents, is particularly worrisome given that security experts have warned the site is a prime target for hackers trolling for personal information. Moreover, U.S. cybersecurity standards now require continuous monitoring of federal systems for vulnerabilities and possible exploits.

Web security testing, process maturity lacking

"It all comes down to process maturity," insisted security analyst Dave Shackleford, principal consultant at Voodoo Security. Along with a lack of adequate security processes, particularly in regard to change management, he said the healthcare site lacks adequate review of new applications and security testing.

"It's amazing they haven't had a larger [breach]," Shackleford said bluntly. "The right hand doesn't know what the left hand is doing and vice versa."

Eric Cowperthwaite, vice president of advanced security and strategy with Boston-based Core Security Inc., noted that the breach and other high-profile incidents over the last 18 months have involved relatively mundane aspects of network security.

"This is an across-the-board issue, not just one for the healthcare exchange," stressed Cowperthwaite, who previously served as chief security officer for Providence Health & Services, a Seattle-based healthcare delivery organization.

A key difference in the case of the healthcare exchange is that millions of dollars were spent to build it. "It should have better security, better processes than a $10 million system," Cowperthwaite added. "You need to take both of those things into account" when drawing conclusions from the recent test server breach.

Given the sheer size of the federal healthcare site, Cowperthwaite worries that system complexity and "data overload" are overwhelming security administrators. "We're almost to the point where the human factor can't solve the [security] problem," he warned. "The data overload problem is reaching the point of no return."

That means systems administrators may have to start relying more on data analytics to monitor areas like network configurations, vulnerability and threat data, and known exploits. They would then fuse that analysis to batten down network security.

In response to recent attacks, federal network security standards have been tightened in recent months to require continuous monitoring of systems for vulnerabilities, possible attacks and exploits.

According to reports, the test server intrusion was launched with a malware upload as far back as July. The intrusion was not detected until Aug. 25. "One month to detect an intrusion does not sound like continuous monitoring" as now required under the Federal Information Security Management Act, Cowperthwaite stressed.

Added Shackleford, "It's not like they are making catastrophic [security] errors, but they have to be right every time. The hacker only has to be right once."

Next Steps

Expert Michael Cobb offers advice on how to test an e-commerce website's security.

Cobb also provides his Web application penetration testing best practices.


Join the conversation


What lessons has your organization learned from the security breaches?
Great point! I can't tell you how often I see people proclaim that they have done everything that's needed to ensure the security of their network and application environment without spending one iota of effort properly testing their Web applications. Some people believe that a basic vulnerability scan using a network vulnerability scanner (not even a Web vulnerability scanner) is all that's needed. Furthermore, and the thing that bugs me the most, are the canned responses of "Our vendor hosts this so we're guessing they're doing the proper testing" or "Our Web application is hosted in a secure data center and we've reviewed their SOC 2 report and everything looks good" like I wrote about here:
You know what they say about assuming things...