Revamped FISMA requirements aim to improve federal security

An automated reporting tool and a mandate for continuous monitoring seek to move federal information security efforts beyond paper compliance.

Compliance with the Federal Information Security Management Act of 2002 (FISMA) has long been a thorn in the side of government agencies. Failing grades from the Government Accountability Office have been commonplace, leading to increased scrutiny of government security and the state of data security within individual agencies.

"FISMA was never implemented by measuring security effectiveness, it was only used to justify wasteful exercises in compliance," says Alan Paller, director of research at the SANS Institute.

FISMA, often considered an ineffective paper exercise, has since undergone something of an overhaul. The introduction of an automated reporting tool and mandates for continuous monitoring are aimed at moving agencies beyond data collection to risk management and ultimately, better information security. The road to streamlined FISMA requirements has its challenges, though.


In October 2009, seven years and some $40 billion in compliance costs after FISMA was enacted, Federal CIO Vivek Kundra unveiled CyberScope. The automated FISMA reporting tool replaces the costly annual paper reports, which were compiled manually. Aside from saving the government time and money, the Web-based reporting process aims to enable continuous monitoring and a deeper understanding of federal cybersecurity.

Developed by the Justice Department, CyberScope is overseen by the Department of Homeland Security (DHS). “We define what gets reported into CyberScope and what CyberScope actually collects,” says Matthew Coose, director of federal network security for DHS’s National Cyber Security Division. The data CyberScope collects differs from what was gathered for past annual FISMA reports: in response to new guidelines from the Office of Management and Budget, the focus is now on continuous monitoring.

These efforts are also supported by legislation proposed this spring by U.S. Representative James Langevin (D-Rhode Island). The Executive Cyberspace Coordination Act would, among other things, create the National Office of Cyberspace and require agencies to obtain third-party audits of their information systems to confirm their compliance with FISMA. Perhaps most importantly, the bill would make continuous monitoring a requirement for FISMA compliance.   


The call for continuous monitoring and deeper insight into the state of information systems all boils down to shifting operational efforts away from reporting for compliance’s sake to reporting for risk management. The State Department was a trailblazer for the government in its attempt to adopt this “new” approach to cybersecurity.

“We began doing continuous monitoring even while the FISMA of 2002 reporting requirements were still happening. No one asked us to, but we thought it would help us at the State Department be better prepared for attacks. We were not sure the information we were collecting under FISMA was helping. In fact, we were pretty sure that it wasn’t, so we began looking for something better,” says John Streufert, Deputy CIO for information assurance and CISO at the State Department.

“The State Department did something with [FISMA] data that no one else did,” says Paller. “They used it to calculate a composite risk score and grade every element of the State Department and managed that at a higher level. The result was a more than 90 percent reduction in risk in less than 12 months. They have hard measurements that show they were able to respond to the Aurora threat at 85 percent in under 10 days, where other agencies were at 65 percent after one month. That’s what led people to say ‘we want continuous monitoring’.”
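The composite scoring Paller describes can be sketched in a few lines. The sketch below is purely illustrative: the hosts, finding categories, and severity weights are hypothetical, since the State Department's actual scoring formula is not described in this article.

```python
# Illustrative composite risk scoring, in the spirit of the approach Paller
# describes. All hosts, categories, and weights below are hypothetical.

# Hypothetical per-host findings: counts of open issues by category.
hosts = {
    "web-01":  {"missing_patches": 4, "bad_configs": 2, "stale_av": 0},
    "db-02":   {"missing_patches": 1, "bad_configs": 0, "stale_av": 1},
    "mail-03": {"missing_patches": 0, "bad_configs": 5, "stale_av": 2},
}

# Hypothetical severity weights: one point value per issue category.
WEIGHTS = {"missing_patches": 10.0, "bad_configs": 3.0, "stale_av": 6.0}

def host_score(findings):
    """Weighted sum of open findings; a higher score means more risk."""
    return sum(WEIGHTS[cat] * count for cat, count in findings.items())

# Score every host, then rank so remediation targets the worst first.
scores = {name: host_score(f) for name, f in hosts.items()}
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for name, score in ranked:
    print(f"{name}: {score:.1f}")
```

The value of this kind of scoring is less in the arithmetic than in the ranking: a single comparable number per host (or per bureau) lets managers direct fixes to the riskiest systems first and track the total score downward over time.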

A handful of other agencies have joined the State Department in implementing continuous monitoring and a risk-based approach to improving security, but many more are just beginning the move to continuous monitoring with the mandate to file FISMA reports via CyberScope. And for them, this is a drastic change.

Prescott Winter, chief technology officer for the public sector at ArcSight, an HP company, and former CIO and CTO of the National Security Agency, describes the situation: “Many agencies are still relying on what I consider old-fashioned perimeter-oriented security monitoring and managing processes. Increasingly, you have to move to a risk management framework that allows you to see where your risks are. Putting that in place in most of the agencies is in its early phases. Many are not as far along as they should be in the previous generation of security and now we’re asking them to move to the threat management model with logs, better command of the correlation process, and on top of that they have to figure out how to bundle it all up and send it to the OMB.”


Paller describes four stages, or levels, of FISMA compliance maturity: having access to data about information systems, having the tools to collect that data, using those tools to feed data into CyberScope and, finally, using that data to minimize threats and manage risk.

Potential challenges lie at each of these stages, beginning with access to the data.  FISMA reports require two sets of information, explains Paller. One set can be automatically generated by the information systems themselves and the other set is provided by humans. Some CIOs don’t have the right to monitor their agencies’ systems for the status of patches, configurations, antivirus updates, etc. “Often the agencies just won’t let the CIO look at everything unless the CIO has a lot of power over them,” says Paller.

These CIOs must rely on other people to provide data that could otherwise be automated, in addition to the data that must be human-generated. If the CIO is given the data, there is no guarantee that it is reliable. This was a problem with the former FISMA reporting process, says Paller.

Of the CIOs who have the right to access the data on their systems, the issue becomes a matter of having software installed on the systems to collect the information. “The right to the systems doesn’t give you the data. The right to the systems only gives you the ability to collect the data,” says Paller.

Agencies have myriad security tools such as antivirus, configuration management and remote vulnerability testers, says Paller. “But they don’t have an agent on each of the machines that feeds the actual configuration,” he said. They could use a vulnerability scanner to identify weaknesses, and “they all have one. But it takes a long time to run across a big agency and uses a lot of resources,” says Paller.

Agencies that are successful at risk management use agent-based endpoint management tools, such as IBM Tivoli Endpoint Manager (formerly BigFix). While these tools are present in many agencies, few have them deployed agency-wide. “Some are saying ‘that’s a show-stopper for me’,” says Paller. As a result, these agencies are stuck at level two of Paller’s maturity model.

The Department of Housing and Urban Development is one such agency. “A few agencies are providing automated data feeds, but HUD is not part of that elite community,” says Marian Cody, HUD’s CISO.

Two challenges are preventing HUD from submitting automated feeds, according to Cody: The lack of clarity around reporting requirements and a lack of automated tools. “We don’t know the full requirements yet. The DHS specs are not stable. Number two, you’ve got to have automated tools in place in order to put the automated feeds in place. We have a few of the automated tools, but we haven’t worked through all the processes needed to understand exactly what has to be sent,” Cody says.


Understanding the reporting requirements has proven to be a hang-up for agencies and vendors alike. Some agencies are waiting for vendors to make their products integrate with CyberScope, and the vendors are waiting for the National Institute of Standards and Technology (NIST) and  DHS to finalize the guidelines for continuous monitoring.

 “Vendors are important because we’re looking to them to solve the technical problems,” Cody says.

“We’re waiting to get enough detail on what the reporting process is like and final details on what goes into it,” says HP’s Winter. “In many cases it would be easy to add a reporting module to baselines and operational processes. The question is making sure we get a well-defined technical specification for how it’s supposed to be formatted, and for flow control: where it goes and what happens once it gets there.”

But, according to Paller, feeding the data into CyberScope is a non-issue. “It takes a couple of days to write the feed to integrate with CyberScope. All vendors supply data in readable formats. It’s not a showstopper… anyone who has been in the software business for a couple years knows that generally someone has to write a translator so that the two software items can work together,” says Paller.
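A translator of the sort Paller describes can be quite small. The sketch below is hypothetical: it converts one scanner's CSV export into a simple XML feed with placeholder element names, since the actual CyberScope schema is defined by DHS and is not reproduced in this article.

```python
# Hypothetical translator of the kind Paller describes: read one tool's CSV
# export and emit XML. The element names here are placeholders, not the real
# DHS-defined CyberScope schema.
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical scanner export (CSV with a header row).
SCANNER_CSV = """host,cve,severity
web-01,CVE-2011-0001,high
db-02,CVE-2011-0042,medium
"""

def csv_to_feed(csv_text):
    """Translate scanner CSV rows into an XML vulnerability feed."""
    root = ET.Element("vulnerabilityFeed")  # placeholder root element
    for row in csv.DictReader(io.StringIO(csv_text)):
        finding = ET.SubElement(root, "finding")
        for field in ("host", "cve", "severity"):
            ET.SubElement(finding, field).text = row[field]
    return ET.tostring(root, encoding="unicode")

print(csv_to_feed(SCANNER_CSV))
```

The point of the sketch is Paller's: once the target schema is fixed, mapping an existing tool's output onto it is routine glue code, not a product redesign.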

Coose at the DHS agrees. “If you’re making product development decisions, you don’t want to change the product to accommodate a moving target. But that’s not to say you can’t accommodate the schema with a much less expensive means. A lot of vendors have developed external utilities . . . and when they know what the final requirements are, they can make some bigger investments,” he says.

Coose says fully defining the guidelines for continuous monitoring is a complex process.  “DHS is working with NIST on the standards piece. There’s a lot of moving parts we’re trying to orchestrate. It’s going to have to happen in waves, regardless of how we’d like it to happen,” he says.

However, Coose says the DHS is aware of the difficulty in meeting changing reporting requirements. “On our monthly feeds, when we do turn those on, it will match exactly what we asked for in 2010,” Coose said. “We’re trying to stay consistent. FISMA metrics were changing too frequently in the past and agencies couldn’t establish themselves in one direction before they were changed.”

The reporting requirements of 2010 are a testament to that, as some agencies are still working through the process of obtaining new data sets. “Finding information has been a challenge,” says HUD’s Cody. “If you don’t have business processes in place that have generated the level of detail that is now created in FISMA, it takes some time to get yourself oriented. Under the old FISMA report, we knew how many systems we had. Now we have different data points. It takes a while to change and begin collecting new [information].”

Agencies are currently required to submit quarterly reports via CyberScope and, according to Coose, DHS has not required quarterly data that could be automated. Nor will the DHS require manual data when monthly reporting begins. “It’s not worth people’s time and money to collect data manually. We would rather have them spend their time improving security,” Coose says. While CyberScope currently accepts both manual and automated feeds, eventually agencies will be required to submit monthly automated feeds.


And that is the crux of it all. CyberScope is doing much more than making the FISMA reporting process more efficient. It is forcing agencies to put the tools in place that are required to fight today’s threats, Paller says.

 “Security was done manually through the last 10 years because it was primarily a compliance function. The attacks have radically escalated in the past three years and manual methods can’t keep up,” says Paller. “There is no substantive defense against most current attacks if people don’t have these tools in place, so the need for the tools is completely independent of CyberScope.  But they don’t think about it this way because only a few of them think of themselves as security people where their job is to minimize vulnerabilities. Most of them think of themselves as compliance people because their bosses tell them that they’ll find someone new if they fail an audit. The big job is to minimize vulnerabilities, and the only way to do that is with automated systems.”

 “CyberScope is pushing the agencies to get the tools in place so they have a chance to minimize vulnerability. And that’s good – great, actually. And, it’s having the desired effect. DHS is having enormous success in moving people from [level] two to three, and you can’t get to [level] four without [level] three. Is it successful in the agencies that have responded to it? Yes,” says Paller.

Crystal Bedell is an award-winning freelance writer based in Spokane, Wash.


This was first published in May 2011
