Information Security

Defending the digital infrastructure

Fostering an information security culture: CISOs share their best practices

Security pros often talk about establishing an information security culture. These C-level execs explain how to make it happen.

Participants:

MICHAEL ASSANTE
Chief Security Officer, American Electric Power (AEP)

JIM ROBBINS
President, Electronic Warfare Associates (EWA Canada) and Canadian CERT

LINDA STUTSMAN
Chief Information Security Officer, Xerox

ANDREW BRINEY (moderator)
Editorial Director, Information Security
 

ANDREW BRINEY, INFORMATION SECURITY MAGAZINE (ISM): Lots of security pros talk about the importance of making security "relevant" in their company. Define what that means in your organizational context.

MICHAEL ASSANTE, AMERICAN ELECTRIC POWER (AEP): For us, it's an awareness of the importance of security across the enterprise. Post-9/11, critical infrastructure companies have to look at homeland security issues. For AEP, the emphasis is on security around the generation and transmission of power. We've got a number of different protection models to struggle with, as well as traditional corporate security concerns. The criteria for us range from how we protect our customer data and manage the risks of outages and downtime, all the way to the risk to our operations, especially those tied to digital control systems.

ISM: Some practitioners will scoff at the notion of security relevance, assuming it's impossible to achieve. Any advice for them?

LINDA STUTSMAN, XEROX: It's said time and time again, but it's absolutely true: You have to get to the point where risk management becomes part of the way you work. That starts with good policies driven by the business, not by security. Communication is absolutely the top factor, through policies and training programs. Then it's determining the few significant metrics that you need to measure.

ISM: A recent ISM survey showed that only about half of IT security decisions are guided by security policy. Is that the case in your organization?

ASSANTE: Not all of our actions are guided by policy. Policy establishes best practices for the business units. We use risk management to evaluate the risks and make decisions, sometimes outside of policy. We take a look at what actions are critical from the perspective of our contribution to the energy grid as well as the value to AEP. If these actions fall outside of policy, we identify additional remediation or protective measures.

STUTSMAN: All of our security decisions are guided by policies, though we always have exceptions for business-justified reasons. Then we look at compensating controls, at other ways to manage that risk. I'm less interested in how compliant I am with a policy and more interested in how secure I am. I can have pockets where baseline policy compliance is sufficient, and high-risk pockets where we need a higher level of control.

ISM: How do you make sure that you are communicating with all of the business units, and that there's consistency in their application of security?

ASSANTE: We have an executive security committee chaired by the chief risk officer. The senior VPs of the various business units are committee members, as well as representatives from significant corporate functions like legal, corporate communications and audit. I provide metrics reviews, and we look at our overall risk map and controls and review significant incidents. The committee is also a clearinghouse for approving policy. If there are requests for exceptions to policy, we ask for risk acceptance by responsible managers. If they can't accept the risk, or if it will put other business units at risk, our executive VPs will resolve it.

It's really important to make sure you have direct access to the senior leadership and, at the same time, have a business unit security support group aligned with the business unit. That way, you establish relationships at strategic, tactical and operational levels.

STUTSMAN: We have a similar approach: an information risk council, composed of very senior-level management from every business unit globally, and they do basically the same thing. They look at and accept pockets of risk within Xerox. They drive policy. They set the expectations for the security organization and share corporate objectives.

JIM ROBBINS, ELECTRONIC WARFARE ASSOCIATES: It's clear that Xerox and AEP are doing an excellent job, but I don't think that reflects the rest of the world. It's certainly exceptional for senior managers to have policies in place that are actually implemented from a technical perspective. Typically, we find technologists and systems administrators are doing so many things on their own that are totally irrelevant to policy. You're talking about an education awareness program that goes right across the entire business. You're talking about risk management committees. This sounds too good to be true.

ASSANTE: I'll be the first to tell you that to engineer compliance to policy, you have to have the culture in place, as well as the awareness and training. I'm not saying we're totally there yet, but the business unit heads are responsive and aware of the importance of risk management.

ISM: For those organizations that don't have a culture of security, how do they get there? Who or what drives change? Customers? Industry standards? Law and regs? Vendors?

ASSANTE: Typically, advocacy organizations propose legislation, and then establish regulatory requirements. California's new law [SB 1386] is a good example, where companies are now being held accountable for security issues. That helps drive change.

ROBBINS: The regulatory environment is probably more applicable to the critical infrastructure organizations. On the commercial side, it's an entirely different set of change agents. They're more motivated by the real business issues: the bang for the buck and what I need to do to keep my business operating.

The lessons we've learned from all the talk about theft, fraud and technology issues haven't really done that much to improve overall security. But now that the regulatory environment is having a greater impact, we've actually seen a change in how these things are being adopted.

ASSANTE: Education is also an important tool for CSOs or CISOs to drive change. Senior managers are under a lot of pressures. They've got goals and objectives that often have nothing to do with security. Security is something that's very difficult to keep on their radar screen, and it typically only comes into focus when an incident has had a direct effect on them.

It's very important for CSOs to be the security advocates, to educate the business managers on how security loss or realized risk affects what they do. We need to help them understand how their organizations and their processes are permissive to risk or loss, and we need to help them develop models that allow them to see how security will benefit what they're doing.

ISM: Let me play devil's advocate for a moment. If I'm a business manager, and you start spewing grandiose concepts about security mindsets and permissive risk models, my eyes are going to roll back in my head.

ASSANTE: That's a fair comment. One way around that is to approach it as an opportunity-driven event. You have an incident or loss, which gives you an opportunity for a one-on-one conversation: "Here's why this happened and what we should do about it going forward." But sometimes it has to be on a larger scale. We give the business unit both the rationale for security requirements and an explanation of how they'll apply to the business models they're designed to protect.

ROBBINS: It's really the auditors who are getting management's attention these days. The IT security governance documents from ISACA focus on the metrics that the boards of directors and the CEOs have to worry about. These are starting to appear in auditor reports, and that's doing a heck of a lot more to grab executive attention than the incidents themselves. As soon as the auditors start reporting on these things, you've got boardroom-level attention, and that's the driving point.

ISM: What about the people who are actually doing the work in the business units? How do you communicate with them and effectively implement change?

STUTSMAN: We integrated security requirements into our project management methodology. Also, we have an information risk coordinator network in each business unit: some 40 to 50 people who report up through the information risk council. They're the front line, ensuring that security is communicated within the business units, keeping an eye on projects to be sure security requirements are being met, and performing risk assessments.

ISM: Do you have a formalized process for server or application code review that applies across the company?

STUTSMAN: We have consistent assessment methodologies as part of our project methodology. Having said that, my organization is one of governance and oversight. We develop common checklists, we give common tools to the organization and provide consulting services to those major projects.

ISM: Describe these tools and checklists.

STUTSMAN: Policies are very high-level. Then we drive down to standards, which carry requirements. The checklist takes that down another level. For example, what are your standard firewall configurations? If you're developing a new application, what sort of controls do you need? What do you use for remote access authentication?

ISM: Do you have policies and procedures and checklists in place for how custom apps are developed?

ASSANTE: We work very closely with the CIO to discourage customized application development. Even so, I know that more custom apps will surface, especially around mobile computing. So, we're trying to work on big picture application security controls and training and awareness sessions for development.

At a utility, many testing applications grow up around digital control systems, and these typically fall outside IT for support and development. In one case, we had to specially develop appropriate change-control procedures, production management controls and more focused controls on production networks.

ISM: Do you impose any security requirements on suppliers who connect to your network?

STUTSMAN: At Xerox, every connection with a supplier and external partner is reviewed to ensure that they meet our requirements. By contract, our suppliers must comply with our security policies here. The checklists and tools I referred to earlier are made available to them as well.

ASSANTE: Our third parties typically fall into two types. First, there are those who need to interact with our energy management systems because they're calling for power or are going to send power into our system. In those cases, we've had to redesign that network with DMZs and restricted access to information. We've had to do a lot of work there, and it's been very costly, but the risk from a critical infrastructure perspective was well understood.

The other type falls into our support structure for all of our assets in the field. It's very difficult, but we're working to get a view of all maintenance contracts we have with people who are supporting gear in the field and want out-of-band connections to equipment. Policy only goes so far, but it can help set expectations and standards for third-party connections. We also have contracting language, but given that other companies have their own ways of doing business, it's sometimes very difficult to change those behaviors.

ISM: Jim, you've written a lot about metrics in a security context, about the characteristics of metrics in terms of what, why and who. Can you describe briefly what you've developed?

ROBBINS: I was involved for several years in a project to develop the Systems Security Engineering Capability Maturity Model. The whole question of whether process maturity even produced anything of value always gave us concern. We tried to define specific metrics for basic business practices, and developed a number of papers on those, including metrics for CIOs. We were probably about three or four years ahead of our time in terms of organizations ready to accept the whole notion of security engineering maturity. Most organizations were still in the checklist days, and the notion of then looking at how mature those processes were just never happened. Beyond that, you had to get to a certain level to even consider metrics. I don't think this has been followed up to the extent that it could have been.

ISM: Linda, tell us a little bit about metrics that you've developed in an IT risk management framework: how you measure them, how you develop them.

STUTSMAN: Typically, our metrics have been focused in operational areas: antivirus, vulnerabilities, patch management. We're evolving to more of a proactive approach: What is the state of security and managed risk across Xerox? We're just in our infancy on this.

ISM: For AV metrics, do you define a baseline of what's an "acceptable" level of viruses that get past your scanners, or acceptable annualized losses due to viruses, measured year over year, or what?

STUTSMAN: We haven't gotten to that level. We're measuring against policy at this point; for example, ensuring that clients and servers are running the latest approved antivirus software. We're measuring incidents in a separate category, year over year, with whatever costs we can derive. It's very difficult to derive accurate cost figures, though.
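
Measuring against policy in this way comes down to simple arithmetic once the inventory and incident data exist. As a rough sketch only, in Python with entirely hypothetical endpoint records, an assumed approved antivirus version and made-up incident figures, the calculation might look like this:

    # Sketch of a "measure against policy" antivirus metric.
    # All data here is hypothetical; a real report would pull from an
    # asset inventory and an incident-tracking system.
    APPROVED_AV_VERSION = "4.2.1"  # assumed approved baseline version

    endpoints = [
        {"host": "srv-01", "av_version": "4.2.1"},
        {"host": "srv-02", "av_version": "4.1.0"},
        {"host": "wks-17", "av_version": "4.2.1"},
    ]

    compliant = sum(1 for e in endpoints if e["av_version"] == APPROVED_AV_VERSION)
    print(f"AV policy compliance: {compliant / len(endpoints):.0%}")

    # Year-over-year incident comparison, with whatever costs can be derived.
    incidents = {2002: {"count": 41, "est_cost": 120_000},
                 2003: {"count": 33, "est_cost": 95_000}}
    delta = incidents[2003]["count"] - incidents[2002]["count"]
    print(f"Incident count change year over year: {delta:+d}")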

ISM: So how do you measure what you would consider to be successful incident response, or incident containment, year over year?

STUTSMAN: We're measuring in the areas of response to those emergencies, and in terms of downtime or lost productivity. It's difficult to measure, because it depends on how many potential incidents you have in any given year, and how well you've responded against them.

ISM: If you're developing metrics around things like incident response and AV success, don't you have to factor in the probability that your tracking mechanisms aren't wholly reliable?

ASSANTE: Let's look at traditional physical security issues, like loss prevention. When you measure loss, you assume that maybe 38 percent of what you suffer from theft or vandalism or lost assets is actually being reported to you. In the information security world, when you go after network events, you're looking at a very small percentage. You're probably seeing only about 25 percent of the total.

ISM: Do you use that in your metrics calculation?

ASSANTE: We're experimenting with that number, and it clearly includes things that wouldn't be evaluated as security events. And I'm not trying to say we set limits at 25 percent; we monitor as much as we can. As we fine-tune monitoring and alert processes, the actionable percentage increases. And, if there's a security reason behind an event, the odds increase proportionately that it will show up on my radar screen.
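
The arithmetic behind that kind of adjustment is straightforward: divide what you observe by the fraction of events you believe you're seeing. A minimal sketch, assuming the 25 percent detection rate discussed above and invented event counts:

    # Extrapolate from observed security events to an estimated total,
    # given an assumed detection rate. Counts are invented for illustration.
    ASSUMED_DETECTION_RATE = 0.25  # fraction of real events believed to be seen

    observed_events = {"malware": 120, "unauthorized_access": 8, "policy_violation": 45}

    for category, seen in observed_events.items():
        estimated_total = seen / ASSUMED_DETECTION_RATE
        print(f"{category}: observed {seen}, estimated total ~{estimated_total:.0f}")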

ISM: Would it be fair to say that you're basing your risk management decisions on what you do know, assuming that those are going to be applicable to what you don't know?

ASSANTE: We sample, analyze the threat, draw conclusions and deploy security in terms of policy or operational security protection strategies. Only in areas where the threat is really high, or where we have to understand system vulnerabilities at a more granular level, do we actually get resources and develop custom protection strategies.

ROBBINS: That's a stopgap measure. What you want to learn is how you gather these metrics about processes and effectiveness as you design new applications.

ASSANTE: There's got to be a standard process, and right now we're working toward that goal.

ISM: In a security setting, metrics are very difficult to institutionalize: create, implement, measure against. What do we do about that?

ASSANTE: It's almost overwhelming. Looking purely at the networked world, I hate to say it, but in some cases you have to wait for technology to catch up with requirements to appropriately measure things. Take application events, for example. When someone says, "Turn on the logging capability of a server," I'll say, "Great, who's going to monitor it? How are we going to process that information? How are we going to look at events and correlate activity?" Without those elements, we're unable to do real security information management. And right now, there's nothing we can buy that will satisfy all those requirements.
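
The correlation step Assante describes is conceptually simple, even if the tooling to do it at scale wasn't readily available. As a toy sketch only, with invented log records rather than anything from AEP, "correlating activity" can be as basic as grouping related events by source:

    # Toy illustration of correlating activity: group failed-login events
    # by source address and flag sources above a threshold.
    # The log records are invented for the example.
    from collections import Counter

    events = [
        {"type": "login_failure", "source": "10.0.0.5"},
        {"type": "login_failure", "source": "10.0.0.5"},
        {"type": "login_failure", "source": "10.0.0.9"},
        {"type": "login_failure", "source": "10.0.0.5"},
    ]

    THRESHOLD = 3
    failures = Counter(e["source"] for e in events if e["type"] == "login_failure")
    for source, count in failures.items():
        if count >= THRESHOLD:
            print(f"Possible brute-force activity from {source}: {count} failures")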

ROBBINS: The maturity model we're developing includes something on the order of 128 base practices. For most of those, there are identified work products that you would expect to see if these things were actually being done correctly. We did a number of pilot projects and found in many cases that organizations were doing it or understood what they needed to be looking for to do it. However, they didn't have a formal program, because the model was just being developed. But they certainly understood the value. We did work in the aerospace industry, the security and intelligence field, and in major software development companies. We tried to capture the things that seemed to be relevant to those organizations.

ISM: Carnegie Mellon's Capability Maturity Model defines five levels, from initial processes through repeatable processes up to level five, optimizing processes. Have you thought about trying to apply that type of thinking to a security operations context?

STUTSMAN: Yes. The problem I'm running into is that it becomes very subjective. Say I place us at a "3" in a particular area; so what? I've asked our auditors if they agree with this scoring, but again, so what? It's my opinion and their opinion, against what? Against the standard in the industry? Who says what's a repeatable process in information security? Which process? There are hundreds of things that we deal with on a regular basis. Some of them may be level five, some of them may be down at level two. Do I average them? Do I weight them? If, for example, the application security process is lower than, say, the awareness and training process, then perhaps they need to be on different scales.
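
One common answer to the "average or weight" question is a risk-weighted average of per-process maturity scores. A minimal sketch, with process names, levels and weights invented purely for illustration:

    # Risk-weighted average of per-process maturity scores (levels 1-5).
    # Process names, scores and weights are purely illustrative.
    processes = {
        "application security":   (2, 0.4),  # (maturity level, risk weight)
        "awareness and training": (4, 0.2),
        "network security":       (4, 0.3),
        "incident response":      (3, 0.1),
    }

    weighted = sum(level * weight for level, weight in processes.values())
    total_weight = sum(weight for _, weight in processes.values())
    print(f"Risk-weighted maturity: {weighted / total_weight:.1f} out of 5")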

ASSANTE: When we identify a process that works, even if we rate it a "3" or "4," I always worry about whether it's aligned with the risk. Also, some processes are OK at a "3," but I need to examine ones that don't even have repeatable processes on the scale. For example, we've got incredible network security control, and we've established a great perimeter, but we need to leverage those resources to increase our overall security posture. We need to put application security controls in place, and we need to design better administrative processes. Contracting, background checks: these are all areas that need to be constantly developed. And they aren't easy, quick wins.

This was last published in September 2003
