Mention information security regulation and large companies start spending big bucks to lobby Congress against...
In December 2003, four industry groups, including the Information Technology Association of America and the U.S. Chamber of Commerce, hosted the National Cyber Security Summit. The summit was convened in response to Rep. Adam Putnam's proposal, which would have required companies to implement information security measures. About 30 high-profile industry members -- largely information security companies, including RSA Security and Entrust, Inc. -- were invited to demonstrate the industry's commitment to stronger cybersecurity precautions and to prove that no federal regulation was needed. Nearly a year has passed with no substantial summit-related cybersecurity improvements.
Frankly, I challenge anyone to show me that Internet security self-regulation will work. Sure, the summit representatives may already implement strong security practices. But, even if those 30 companies represent 1% of Internet-connected computers, the remaining insecure PCs create the bulk of information security problems -- we're talking millions of machines that aren't under the summit members' control.
One top argument I hear against regulated information security is that small companies can't "afford" to adhere to regulations. Should we drop vehicle safety standards because small rental companies can't afford to maintain their vehicles? If maintaining proper information security is cost-prohibitive, perhaps small companies shouldn't do business that puts corporate data and other people's information at risk. It's not fair to the organizations and individuals implementing good security that unsecured systems are used as launching points for attacks.
Another argument is that regulations will be poorly written, outdated or will make the situation worse. We've all seen how poorly written regulations cause problems. For that reason, I've designed my regulatory proposal to address the bulk of information security problems. In fact, Richard Clarke (the former cybersecurity czar, who was never a fan of regulation) and Dave Cullinane, president of the Information Systems Security Association and CISO of Washington Mutual (also not a fan of regulation), have heard the details of my proposal and consider it sound. So, here's the "Winkler Act" in brief:
- Computers attached to external networks must be installed and hardened per the vendor's, Center for Internet Security's and/or National Institute of Standards and Technology's security baselines.
- Vendor patches must be applied to systems attached to external networks within a specified time period based on the criticality of the associated vulnerability -- for example, 30 days for a critical vulnerability and quarterly for low-risk vulnerabilities.
- Vulnerability assessment tools that identify the presence of known threats must be run at least quarterly.
- System administrators -- or anyone responsible for maintaining operating systems and applications -- must take a security course approved by the vendor or a legitimate certifying authority.
- Vendors must have an established and documented software test program in place that accounts for common security problems and represents a measurable component of the development process. In lieu of an oversight group, a quantifiable measure written into law would place the burden on vendors to prove proper security testing of their software.
- Internet service providers must implement software to detect systems used to launch well-known attacks -- such as denial-of-service attacks and the distribution of spam and viruses -- and deactivate those systems until they are fixed.
- The civil liability resulting from the failure to implement these rules (i.e., the monetary loss from malicious system use by a third party) would ensure regulatory enforcement; criminal liability may even apply in some situations. Clearly, this part of the law would be specific, to include penalties, oversight groups, language that addresses commerce crossing state boundaries, etc.
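To make the patching rule above concrete, here is a minimal sketch of how an organization might encode the proposed deadlines as a compliance check. The severity tiers and the 60- and 90-day windows for non-critical vulnerabilities are my own illustrative assumptions; the proposal itself specifies only the 30-day critical example and a quarterly floor.

```python
from datetime import date, timedelta

# Hypothetical patch windows keyed by vulnerability severity. Only the
# 30-day critical window and the quarterly (90-day) floor come from the
# proposal; the intermediate tiers are illustrative assumptions.
PATCH_WINDOWS = {
    "critical": timedelta(days=30),
    "high": timedelta(days=60),
    "medium": timedelta(days=90),
    "low": timedelta(days=90),   # quarterly
}

def patch_deadline(disclosed: date, severity: str) -> date:
    """Date by which a patch must be applied for a given severity."""
    return disclosed + PATCH_WINDOWS[severity.lower()]

def is_compliant(disclosed: date, severity: str, patched_on: date) -> bool:
    """True if the system was patched within the required window."""
    return patched_on <= patch_deadline(disclosed, severity)
```

A tool like this could run against an asset inventory each night, flagging any system whose outstanding vulnerabilities have passed their deadline -- exactly the kind of quantifiable measure the proposal envisions writing into law.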
Everyone owns the information security problem. While we can always expect miscreants to attack us, either maliciously or for profit, studies by the Defense Information Systems Agency and the Computer Emergency Response Team indicate that more than 97% of successful attacks are preventable. We can't continue to ignore the fact that we're negligently enabling the attackers.
My proposal isn't perfect, yet the recommendations address the major sources of vulnerabilities -- poorly configured systems, badly written software and inaction on the part of people who provide access. The "Winkler Act" would allow enterprises, SMBs, ISPs and vendors to appreciate that enacting and documenting strong security measures would protect them from liability, as well as reduce the likelihood and impact of attacks.
I'd be happy to present the "Winkler Act" to Congress, but I don't have a staff of lobbyists behind me, and as an advocate for federally regulated information security, it's unlikely that companies with lobbying power will invite me to testify. So, the "Winkler Act" remains on the shelf, while Congressional staffers rely on company-sponsored lobbyists.
About the author
Ira Winkler, CISSP, CISM, has almost 20 years of experience in the intelligence and security fields and has consulted for many of the largest corporations in the world. He is also the author of the forthcoming book, Spies Among Us.