Audit failure: How one lab raised IT security awareness and its audit grade

Learn how Argonne National Lab raised IT security awareness and its audit grade from 'F' to 'A'.

ISM May 2004

Year after year, security audits of Argonne National Laboratory were, in a word, abysmal. On a network that lacked even basic firewall protection, every desktop and server was essentially open to the outside world. Breaking in was a cinch.

The appalling reviews -- regardless of who conducted them -- were hardly surprising at the Department of Energy's oldest lab, located in suburban Chicago and devoted to everything from nanotechnology to supercomputing. When it came to investing in science or security, security always got the short end. Argonne's scientific community had little interest in protecting assets under a decentralized system that did little to foster cooperation.

Finally, in early 2001, management that had for so long paid lip service to security decided it was time to stop taking so much lip.

The lab was determined to overhaul its Cyber Security Program Plan -- a flawed, largely ignored document that was at the heart of its audit failures. Too often, Argonne's security policy mimicked the loose language of DOE mandates, failing to clearly outline specific steps needed to protect its IT infrastructure.

Determined to earn "A's" on their audits, IT staffers carefully analyzed systems, beefed up each section of the 40-page document (minus a growing list of appendices) with specific instructions and revamped the network architecture for compliance. It was a gigantic, two-year effort that created a new security culture.

Open Science, Open Networks

Argonne is a popular target for port scans and intrusion attempts by crackers worldwide who are unaware that the center's 5,000 scientists work mainly on unclassified projects. The lab is akin to a giant college campus -- sans students -- which means Argonne operates with open networks to allow collaboration, and with high bandwidth to accommodate the heavy flow of data. In addition, each division, such as physics and chemistry, has its own IT staff and security designee.

"Scientists were so used to having an open-type network forum that they didn't have to worry about this or that protocol not working," explains Mike Skwarek, associate cyber-security program manager. "They were still using Telnet and FTP -- cleartext applications. At the time, nobody was mandating that this was not a good thing to do.

"Every flavor of every operating system ever created existed here," he adds. "Every desktop, prior to us installing a firewall, had every port and every protocol open to the world. We had a number of incidents and compromises because you just couldn't patch all of the machines with all of the vulnerabilities that were out at the time. It was really a cat-and-mouse game."

Past audit teams sent by the General Accounting Office and DOE's inspector general pinpointed the cause of Argonne's poor security posture. In addition to few network security resources, such as a decent firewall, there were few written, enforceable rules regarding safe computing practices; those that did exist were routinely ignored.

"The scientific community at the lab had no strong desire for security," recalls Remy Evard, a computer scientist who was recruited to help lead a massive overhaul of the security program in the spring of 2001.

The reason for resistance: In the eyes of Argonne's end users, shutting down systems and outlets in the name of security meant shunting aside research opportunities, maybe even a Nobel Prize. Working against that mind-set was mounting political pressure following bad publicity for security lapses at other high-profile DOE research facilities, such as Sandia, Los Alamos and Livermore.

"It's very difficult to write down a policy that makes sense and can be implemented and not detrimentally impact science," explains Gene Rackow, a senior security administrator for the Mathematics and Computer Science Division at Argonne.

It's an especially difficult task when that science is built upon hardware and operating systems that functioned for years on a user- (and hacker-) friendly network.

A Monumental Task

Key to the security turnaround at Argonne was creating a set of policies and procedures to meet federal mandates without compromising the experimentation that is the lab's lifeblood. Finding such a balance meant everyone had to be sold on security.

"We needed to work with users in finding ways of supporting their research, but give them a method of being able to access their machines securely," says Rackow.

So, a team of about two dozen IT members -- including representatives from each research division -- spent the summer gathering user input at weekly town meetings and analyzing network data, reshaping the lab's architecture to make it far less prone to security breaches.

To create the security blueprint, lab officials needed to know exactly how each scientific sector operated.

"If I had one piece of advice, it's know thy network," says Scott Pinkerton, the lab's network solutions manager.

Pinkerton spent weeks analyzing each division's data flow so his team would know which tunnels needed to remain open and what applications might break after a stateful inspection firewall was introduced. Such intense inspections were cumbersome but crucial in deciding which ports to use, which IP addresses behaved more like servers and what custom firewall rules were needed.
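
The article doesn't describe the team's actual tooling, but a minimal sketch of that kind of flow analysis might look like the following; the flow-log format, address prefix and "server-like" threshold are invented for illustration, not taken from Argonne's network.

```python
# Illustrative only: the CSV flow-log format (src_ip, dst_ip, dst_port),
# the local address prefix and the threshold below are assumptions for
# this sketch, not Argonne's actual data.
import csv
from collections import defaultdict

LOCAL_PREFIX = "10.0."          # placeholder for the lab's address space
SERVER_CLIENT_THRESHOLD = 25    # hypothetical cutoff: distinct remote clients

def classify_hosts(flow_log_path):
    """Group inbound flows by (local host, port) and flag server-like hosts."""
    clients = defaultdict(set)  # (local_ip, dst_port) -> remote IPs seen
    with open(flow_log_path, newline="") as fh:
        for row in csv.DictReader(fh):
            if row["dst_ip"].startswith(LOCAL_PREFIX):
                clients[(row["dst_ip"], int(row["dst_port"]))].add(row["src_ip"])

    server_like, desktop_like = [], []
    for (ip, port), remotes in sorted(clients.items()):
        bucket = server_like if len(remotes) >= SERVER_CLIENT_THRESHOLD else desktop_like
        bucket.append((ip, port, len(remotes)))
    return server_like, desktop_like

if __name__ == "__main__":
    servers, _desktops = classify_hosts("flows.csv")
    print("Candidate firewall exceptions (host, port, distinct clients):")
    for ip, port, n in servers:
        print(f"  {ip}:{port}  {n} clients")
```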

In the following months, networks were reclassified and segmented as thousands of machines' IP addresses were changed and brought in line with similar systems, each classified by access level. The "green" network allows outside access; "orange" provides intranet collaboration between main campus buildings and "Argonne West" in Idaho. Machines that don't need to communicate outside a division are on a "yellow" network; contractors, visitors and other outside users are "violet."
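
The zone names above come from the lab, but how the policy was encoded isn't described; the reachability rules in this toy model are illustrative assumptions only.

```python
# Toy model of the color-coded zones described above. The zone names come
# from the article; the reachability rules are invented for illustration.
ZONES = {
    "green":  {"inbound_from_internet": True,  "note": "outside access allowed"},
    "orange": {"inbound_from_internet": False, "note": "intranet links between campuses"},
    "yellow": {"inbound_from_internet": False, "note": "traffic stays inside a division"},
    "violet": {"inbound_from_internet": False, "note": "contractors, visitors, outside users"},
}

def placement_check(host_zone, needs_internet_clients):
    """Flag hosts whose zone doesn't match how they are actually used."""
    allowed = ZONES[host_zone]["inbound_from_internet"]
    if needs_internet_clients and not allowed:
        return f"move to 'green': the '{host_zone}' zone blocks outside clients"
    if not needs_internet_clients and allowed:
        return "consider a more restrictive zone; no outside clients observed"
    return "placement OK"

print(placement_check("yellow", needs_internet_clients=True))
```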

The team mapped out how to bring each VLAN behind a firewall while keeping "friendly fire casualties to a minimum." But finding a firewall that was robust enough to meet the unusually high bandwidth requirements was a project in itself. The team eventually settled on Cisco PIX.

"There's not a great marketplace for a 10-gigabit firewall," Pinkerton says.

Argonne's IT staff spent eight months manually tuning signatures on 12 Cisco IDS sensors distributed across the campus backbone; staff-written scripts helped minimize false positives. Argonne used Microsoft and Unix tools to scan for vulnerabilities and St. Bernard Software's UpdateEXPERT to automate patching.
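
The staff-written tuning scripts aren't detailed in the article; a rough sketch of that style of false-positive suppression, with an invented alert format and whitelist, might look like this.

```python
# Sketch of alert triage along the lines the article describes. The alert
# format, signature IDs and whitelist entries are invented for illustration;
# they are not Argonne's actual IDS data.
import json

# (signature_id, source_prefix) pairs assumed benign in this environment
WHITELIST = {
    ("2000345", "10.0.12."),   # hypothetical: scanner host triggers sweep alerts
    ("2001219", "10.0.40."),   # hypothetical: cluster traffic mistaken for a flood
}

def triage(alerts):
    """Drop alerts matching the whitelist; everything else goes to an analyst."""
    keep = []
    for alert in alerts:
        suppressed = any(
            alert["sig_id"] == sig and alert["src_ip"].startswith(prefix)
            for sig, prefix in WHITELIST
        )
        if not suppressed:
            keep.append(alert)
    return keep

with open("sensor_alerts.json") as fh:
    remaining = triage(json.load(fh))
print(f"{len(remaining)} alerts left for review")
```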

The People Part

The trick to making all the architectural changes stick, though, was changing the way Argonne's scientists viewed security. From the start, the ad hoc security team involved the entire Argonne community in the project.

"In a distributed system or large organization like this, it's important to keep all of the organization involved in setting policy and implementing solutions," says Evard, who normally is focused on high-performance computing. "Management has to be on board, but so does everyone else."

The town meetings were instrumental in gaining cooperation and expanding understanding of the importance of security. When end users have a chance to provide input, they gain a greater appreciation for the importance of security and the security team's work. It also gives them a stake in the process, which encourages cooperation.

"Let's face it, what a user does or doesn't do is really what protects the organization," Skwarek says.

Among the changes that needed to be embraced were weaning everyone from using cleartext password applications and standardizing on secure tools and protocols, like SSH. Users had to be patient with network downtime, which the team minimized by doing installations and upgrades during off-hours. It also created a Web-based system that allowed each department's IT staff to track the source of application breakdowns during configuration changes.
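
The article doesn't say how holdout cleartext services were tracked down; one way to audit for them, sketched here with placeholder hosts and ports, is a simple connection check like the one below, run only against systems you are authorized to scan.

```python
# Sketch of a cleartext-service audit in the spirit of the migration
# described above. The host inventory and timeout are placeholders.
import socket

CLEARTEXT_PORTS = {21: "FTP", 23: "Telnet"}   # protocols the lab moved away from
HOSTS = ["10.0.5.10", "10.0.5.11"]            # placeholder inventory

def still_cleartext(host, timeout=1.0):
    """Return the cleartext services a host still appears to offer."""
    found = []
    for port, name in CLEARTEXT_PORTS.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                found.append(name)
        except OSError:
            pass  # closed, filtered, or unreachable
    return found

for host in HOSTS:
    leftovers = still_cleartext(host)
    if leftovers:
        print(f"{host}: still offering {', '.join(leftovers)}; schedule SSH/SFTP migration")
```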

As one project neared completion, another nascent program would gain momentum. This kept the security program in the forefront and on schedule. The first six months, everyone admits, were crazy.

Making the Grade

For national security reasons, details of the Department of Energy's network audits are classified, and visits aren't announced to the labs far in advance. The team had no idea when the next audit would be, so it raced to have as much of the new policy in place as possible and invited peers from other facilities to review its work. Their colleagues were impressed with the transformation -- as were the auditors who had repeatedly shaken their heads in disgust.

This time, in April 2003, auditors nodded in approval when they reviewed documents, conducted external and internal scans, and quizzed users.

The final assessment: "Effective," the equivalent of an "A" in government audit books, according to security team members.

"It was like night and day," Skwarek says. The living security plan now includes details of how the network is laid out and how people must configure new software and hardware. If, for instance, someone wants to add a new Windows machine to the network, there's a "cookbook" with detailed deployment instructions.

To make the process repeatable, the policy includes an annual Web-based self-assessment, where each division's security designee verifies what exists on his network and if the risk levels have changed. A summary is delivered to Skwarek's department and, eventually, to the lab director.
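
The self-assessment system itself isn't described in the article; a toy roll-up of division submissions, with invented field names and sample data, might look like this.

```python
# Toy roll-up of annual self-assessment submissions. Field names and sample
# data are invented; the article only says each division verifies its
# inventory and risk levels and that a summary reaches the lab director.
submissions = [
    {"division": "Physics",   "hosts_verified": 412, "risk_changed": False},
    {"division": "Chemistry", "hosts_verified": 388, "risk_changed": True},
]

def summarize(subs):
    """Condense division submissions into the summary passed up the chain."""
    return {
        "divisions_reporting": len(subs),
        "hosts_verified": sum(s["hosts_verified"] for s in subs),
        "risk_level_changes": [s["division"] for s in subs if s["risk_changed"]],
    }

print(summarize(submissions))
```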

Skwarek says the education of end users and the coming together of the lab's IT community made all the difference in creating guidelines that actually could be followed.

"We have to try to get people to understand that security is now as mainstream as the safety classes you have to take to work with, say, ladders," he says. "It's going to be here, and it's not going away."


ANNE SAITA is senior editor at Information Security and news director of TechTarget's Security Media Group.

This was first published in April 2004
