None of us relishes an audit--outsiders poking around for the holes in our systems. When someone says "audit," you probably think of the surprise inspections your company's auditors pull to try to expose IT weaknesses (see "Incomplete Audits").
But you're the one on the hot seat if your organization gets hacked. If you're responsible for information security, you should want--you should insist--on thorough annual audits. In some cases, you may have no choice. Financial institutions, for example, are required to have external auditors certify compliance with regulations such as the Gramm-Leach-Bliley Act (GLBA). Your own organization's audit department may require it. Or potential partners or customers may insist on seeing the results of a security audit before they do business with your company and put their own assets at risk.
So you bring the auditors in. But what if the auditors fail to do their job correctly? You're still the one feeling the heat after an attacker brings your Web site down or steals your customers' financial information.
Don't let this happen to you. And it won't, if you know how to intelligently evaluate the ultimate deliverable--the auditor's report.
An audit can be anything from a full-scale analysis of business practices to a sysadmin monitoring log files. The scope of an audit depends on its goals. The basic approach to performing a security assessment is to gather information about the targeted organization, research security recommendations and alerts for the platform, test to confirm exposures and write a risk analysis report. It sounds simple, but it can become quite complex.
Establish a Security Baseline
How to manage a successful audit
- Establish a security baseline through annual audits.
- Spell out your objectives.
- Choose auditors with "real" security experience.
- Involve business unit managers early.
- Make sure auditors rely on experience, not just checklists.
- Insist that the auditor's report reflects your organization's risks.
Your security policies are your foundation. Without established policies and standards, there's no guideline to determine the level of risk. But technology changes much more rapidly than business policies and must be reviewed more often. Software vulnerabilities are discovered daily. A yearly security assessment by an objective third party is necessary to ensure that security guidelines are followed.
Security audits aren't a one-shot deal. Don't wait until a successful attack forces your company to hire an auditor. Annual audits establish a security baseline against which you can measure progress and evaluate the auditor's professional advice. An established security posture will also help measure the effectiveness of the audit team. Even if you use different auditors every year, the level of risk discovered should be consistent or even decline over time. Unless there's been a dramatic overhaul of your infrastructure, the sudden appearance of critical security exposures after years of good reports casts a deep shadow of doubt over previous audits.
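That year-over-year expectation can be checked mechanically. A minimal sketch, using hypothetical finding counts and an arbitrary "more than doubled" threshold:

```python
# Annual audits give you a trend line. Critical findings should hold
# steady or decline; a sudden spike after years of good reports is a
# red flag about either the infrastructure or the earlier audits.
# The counts below are hypothetical illustrations.

critical_findings_by_year = {2008: 9, 2009: 6, 2010: 5, 2011: 14}

years = sorted(critical_findings_by_year)
for prev, curr in zip(years, years[1:]):
    a = critical_findings_by_year[prev]
    b = critical_findings_by_year[curr]
    if b > a * 2:  # arbitrary threshold: findings more than doubled
        print(f"{curr}: critical findings jumped from {a} to {b} -- "
              f"question this audit or the previous ones")
```

The threshold is a judgment call; the point is that the comparison is only possible if you have prior audits to compare against.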
If you don't have years of internal and external security reviews to serve as a baseline, consider using two or more auditors working separately to confirm findings. It's expensive, but not nearly as expensive as following bad advice. If it isn't practical to engage parallel audit teams, at least seek a second opinion on audit findings that require extensive work.
Objectives: Know What You Want
Spell out what you're looking for before you start interviewing audit firms. If there's a security breach in a system that was outside the scope of the audit, it could mean you did a poor or incomplete job defining your objectives.
Let's take a very limited audit as an example of how detailed your objectives should be. Let's say you want an auditor to review a new Check Point firewall deployment on a Red Hat Linux platform. You would want to make sure the auditor plans to:
- Review and document the security mechanisms configured on the Check Point firewall and the Check Point Management Station.
- Review the Check Point firewall configuration to evaluate possible exposures to unauthorized network connections.
- Review the Red Hat Linux OS configuration to harden it against security exposures.
- Review router configuration and logging procedures.
- From a security perspective, certify the firewall and OS for production.
- Document disaster recovery procedures for the firewall and OS and "good housekeeping" procedures for Check Point's Object Management.
- Perform a penetration test once the firewall and OS are in production.
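Parts of the OS-hardening review above lend themselves to scripting. A minimal sketch that compares a host's listening TCP ports against an approved baseline, assuming `ss -tln`-style output (the sample output and the approved-port set are hypothetical):

```python
# Compare a host's listening TCP ports against an approved baseline.
# On a real audit you would capture `ss -tln` (or `netstat -tln`) from
# the target host; the sample below stands in for that output.

SAMPLE_SS_OUTPUT = """\
State   Recv-Q  Send-Q  Local Address:Port   Peer Address:Port
LISTEN  0       128     0.0.0.0:22           0.0.0.0:*
LISTEN  0       128     0.0.0.0:256          0.0.0.0:*
LISTEN  0       100     127.0.0.1:25         0.0.0.0:*
"""

APPROVED = {22, 256}  # hypothetical: SSH plus Check Point management traffic

def listening_ports(ss_output: str) -> set:
    """Extract local port numbers from ss-style output."""
    ports = set()
    for line in ss_output.splitlines()[1:]:  # skip the header row
        fields = line.split()
        if fields and fields[0] == "LISTEN":
            ports.add(int(fields[3].rsplit(":", 1)[1]))
    return ports

unapproved = listening_ports(SAMPLE_SS_OUTPUT) - APPROVED
for port in sorted(unapproved):
    print(f"unapproved listening port: {port}")
```

A script like this documents the check itself, so the same review can be repeated exactly at the next annual audit.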
In general, when we talk about audits--especially by outside auditors--we're talking about security assessment reviews. A complete security assessment includes penetration testing of internal and external systems, as well as a review of security policies and procedures. It's a cooperative, rather than adversarial, exercise to learn about the security risks to your systems and how to mitigate those risks.
There are other kinds of audits that have a much narrower focus and are of far less value. In the worst-case scenarios, they can do more harm than good:
Black Box Audits:
Some IT managers are enamored with "black box" auditing--attacking the network from the outside with no knowledge of the internal design. After all, if a hacker can perform digital reconnaissance to launch an attack, why can't the auditor?
A black box audit can be a very effective mechanism for demonstrating to upper management the need for increased budget for security. However, there are some drawbacks in emulating the actions of malicious hackers. Malicious hackers don't care about "rules of engagement"--they only care about breaking in. They have plenty of time to gather information and have no concern about what they break in the process. Who owns the first router into the network, the client or a service provider? A malicious hacker wouldn't care. Try hacking an ISP and altering a site's DNS records to break into a network--and maybe get a visit from the FBI.
A black box audit is a view from a single perspective--it can be effective when used in conjunction with an internal audit, but is limited on its own.
Surprise Inspections:
Audit departments sometimes like to conduct "surprise inspections," hitting an organization without warning. The rationale behind this approach is to test an organization's response procedures. In reality, it's usually an attempt to catch someone with their pants down rather than a proactive effort to improve an organization's security posture.
Surprise inspections can backfire badly if critical work is interrupted by such a "fire drill." Think of a trading floor getting flooded with port scans during prime business hours. Some auditors seem to believe an organization will take extra security measures if they know an audit is pending. In reality, even if the organization performs a quick cleanup, it won't disguise embedded security problems. Surprise inspections run the risk of causing as much service interruption as an actual hacker attack.
Hiring an Auditor
You may be tempted to rely on an audit by internal staff. Don't be. Keeping up with patches, making sure OSes and applications are securely configured, and monitoring your defense systems is already more than a full-time job. And no matter how diligent you are, outsiders may well spot problems you've missed.
Technical audits identify risks to the technology platform by reviewing not only the policies and procedures, but also network and system configurations. This is a job for computer security professionals. Consider these points in the hiring process:
Look at the auditing team's real credentials. Don't be influenced by an alphabet soup of certification letters. Certifications don't guarantee technical competence. Make sure the auditor has actual work experience in the security field acquired by years of implementing and supporting technology.
Résumés of the auditors should detail security projects--not just audits--they have worked on, including references. Real-world experience implementing and supporting security technology gives an auditor insight into subtle issues that could reveal serious security exposures. Any published works should be included to demonstrate the auditor's expertise.
And don't be impressed by people who call themselves "ethical hackers." Many so-called ethical hackers are just script-kiddies with a wardrobe upgrade. Do your homework. Network with people you know and trust in the industry. Find out what they know about prospective auditing firms. See if you can track down clients who have used the firms but are not on their reference list.
Find the right fit. Meet with a range of auditing firms. Consider the small firms specializing in security, along with the Big 4 accounting firms to see which best meets your needs. An auditing firm needs to know if this is a full-scale review of all policies, procedures, internal and external systems, networks and applications, or a limited scope review of a specific system.
Smaller firms may choose not to bid on a large-scale project, and larger companies may not want to bother with a review of one system, because they're reluctant to certify a system without looking at the entire infrastructure.
Insist on the details. Some firms may be reluctant to go into great detail about their methods without a contract. They may simply slide a sales brochure across the table and say, "Our record speaks for itself." Don't be hoodwinked by this; while it's nice to know they have a combined 200 years of security expertise, that doesn't tell you a lot about how they plan to proceed with the audit.
If they're serious about bidding for your business, the auditors will put together a statement of work (SOW), which details how they plan to meet your objectives--the methodologies and deliverables for the engagement. The devil is in the details, and a good SOW will tell you a lot about what you should expect. The SOW will be the basis for a project plan.
The SOW should include the auditor's methods for reviewing the network. If they balk, saying the information is proprietary, they may simply be trying to hide poor auditing methods, such as simply running a third-party scanner with no analysis. While auditors may protect the source of any proprietary tools they use, they should be able to discuss the impact a tool will have and how they plan to use it. Most good auditors will freely discuss their methods and accept input from your organization's staff. Basic methodology for reviewing systems includes research, testing and analysis.
Agree on the appropriate payment plan. The bottom line for the bid is how much it will cost and what you're getting for your money. Some auditing firms quote a flat rate in return for a report detailing their findings and recommendations. Others may estimate the number of days an audit will take, with both sides agreeing to a flexible cost, within limits.
For a complex audit of an entire company, many unanticipated issues could arise requiring extensive time from the auditors, making a flat rate more attractive for the contracting organization. If the organization has good documentation or if the scope is limited, a flexible rate may be more economical.
Prepare to Be Audited
Auditors must make certain assumptions when bidding on a project, such as having access to certain data or staff. But once the auditor is on board, don't assume anything--everything should be spelled out in writing, such as receiving copies of policies or system configuration data. These assumptions should be agreed to by both sides and include input from the units whose systems will be audited.
Nobody likes surprises. Involve the business and IT unit managers of the audited systems early on. This will smooth the process and perhaps flag some potential "Gotchas!", such as a dispute over the auditor's access.
Consider the case of one respected auditing firm that requested that copies of the system password and firewall configuration files be e-mailed to them. One of the targeted organizations flatly refused. In fact, they thought the request was a social engineering test. Their security policy prohibited external release of any files requiring privileged access to read. If the audited organizations had been involved in the process from the start, problems like this might have been avoided.
So, set the ground rules in advance:
1. Your managers should specify restrictions, such as time of day and testing methods, to limit impact on production systems. Most organizations concede that denial-of-service or social engineering attacks are difficult to counter, so they may restrict these from the scope of the audit.
2. Make sure the auditors conform to your policy on handling proprietary information. If the organization forbids employees from communicating sensitive information through nonencrypted public e-mail, the auditors must respect and follow the policy. The audit report itself contains proprietary data and should be handled appropriately--hand delivered and marked proprietary and/or encrypted if sent through e-mail.
3. Give the auditors an indemnification statement authorizing them to probe the network. This "get out of jail free card" can be faxed to your ISP, which may become alarmed at a large volume of port scans on their address space.
As part of this "prep work," auditors can reasonably expect you to provide the basic data and documentation they need to navigate and analyze your systems. This will obviously vary with the scope and nature of the audit, but will typically include:
- Copies of all relevant policies and procedures. Policies may include end-user policies (password expiration, virus scanning, acceptable use); privacy (for internal users and client data); privileged access (sysadmins) and incident handling. Some of the procedures to review are data backup, disaster recovery, incident response and system administration.
- A list of OSes.
- Network topology, specifying target IP ranges.
- External security devices (firewall software, IDS).
- List of application software.
The entire process of analyzing and then testing your systems' security should be part of an overall plan. Make sure the auditor details this plan up front and then follows through. For instance, using the Check Point/Red Hat example cited above, a general outline would include analyzing and then testing vulnerabilities:
- For the OS: Directory structure, application packages installed, logging capabilities and services available for the Linux OS.
- For the firewall and management console: system configuration and authentication mechanisms, in addition to logging capabilities and available services.
The auditor should begin by reviewing all relevant policies to determine the acceptable risks. They should check for unauthorized implementations such as rogue wireless networks or unsanctioned use of remote access technology.
The auditor should next confirm that the environment matches management's inventory. For example, the auditor may have been told all servers are on Linux or Solaris platforms, but a review shows some Microsoft servers. If the auditing team was selected for Unix expertise, they may not be familiar with Microsoft security issues. If this happens, you'll want the auditor to get some Microsoft expertise on its team. That expertise is critical if auditors are expected to go beyond the obvious.
Auditors often use security checklists to review known security issues and guidelines for particular platforms. Those are fine, but they're just guides. They're no substitute for platform expertise and the intuition born of experience.
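The inventory check reduces to a set comparison: what management claims to run versus what discovery actually found. A sketch with hypothetical hostnames and platforms:

```python
# Reconcile the platforms management claims to run against what a
# discovery scan actually found. All hostnames and OS labels here are
# hypothetical illustrations.

claimed = {
    "web01": "Linux",
    "web02": "Linux",
    "db01": "Solaris",
}

# e.g., distilled from OS-fingerprinting results of a discovery scan
discovered = {
    "web01": "Linux",
    "web02": "Linux",
    "db01": "Solaris",
    "files01": "Windows",  # not in management's inventory
}

surprises = {h: os_ for h, os_ in discovered.items() if claimed.get(h) != os_}
for host, os_name in sorted(surprises.items()):
    print(f"{host}: found {os_name}, inventory says "
          f"{claimed.get(host, 'nothing')}")
```

Anything in `surprises` is exactly the case described above: a platform the audit team may not have been staffed to review.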
The auditor will use a reputable vulnerability scanner to check OS and application patch levels against a database (see cover story, "How Vulnerable?") of reported vulnerabilities. Require that the scanner's database is current and that it checks for vulnerabilities in each target system. While most vulnerability scanners do a decent job, results may vary with different products and in different environments. The auditor should use several tools (see "The Auditor's Toolbox") and methods to confirm his findings--most importantly, his own experience. For example, a sharp auditor with real-world experience knows that many sysadmins "temporarily" open system privileges to transfer files or access a system. Sometimes those openings don't get closed. A scanner might miss this, but a cagey auditor would look for it.
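Confirming results across several tools can be as simple as intersecting their findings and flagging anything only one tool reported for manual review. A sketch with placeholder CVE identifiers:

```python
# Cross-check findings from two vulnerability scanners. A finding
# reported by only one tool isn't necessarily a false positive, but it
# deserves hands-on confirmation by the auditor. The CVE IDs below are
# placeholders, not real findings.

scanner_a = {"CVE-2001-0001", "CVE-2001-0002", "CVE-2001-0003"}
scanner_b = {"CVE-2001-0002", "CVE-2001-0003", "CVE-2001-0004"}

confirmed = scanner_a & scanner_b      # both tools agree
needs_review = scanner_a ^ scanner_b   # only one tool reported it

print("confirmed:", sorted(confirmed))
print("confirm by hand:", sorted(needs_review))
```

The manual-review pile is where the auditor's experience earns its fee; the tools only narrow the search.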
Discovering security vulnerabilities on a live production system is one thing; testing them is another. Some organizations require proof of security exposures and want auditors to exploit the vulnerabilities. This can be dangerous. A successful system compromise may be a graphic way to convince management of the dangers of the exposure, but are you prepared to risk compromising or even bringing down a live system?
The SOW should specify parameters of testing techniques. And the auditor should coordinate the rules of engagement with both your IT people and the business managers for the target systems. If actual testing isn't feasible, the auditor should be able to document all the steps that an attacker could take to exploit the vulnerability. For example, if the system password file can be overwritten by anyone with specific group privileges, the auditor can detail how he would gain access to those privileges, but not actually overwrite the file. Another method to prove the exposure would be to leave a harmless text file in a protected area of the system. It can be inferred that the auditor could have overwritten critical files.
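The harmless-marker technique fits in a few lines. In this sketch a temporary directory stands in for the exposed system directory named in the finding, so the example is self-contained:

```python
# Demonstrate a write exposure without damaging anything: instead of
# overwriting a critical file, drop a harmless, clearly labeled marker
# in the directory the auditor should not have been able to write to.
# A real test would target the exposed directory from the finding.

import os
import tempfile
from typing import Optional

def prove_write_access(target_dir: str) -> Optional[str]:
    """Return the marker path if target_dir is writable, else None."""
    marker = os.path.join(target_dir, "AUDIT-PROOF.txt")
    try:
        with open(marker, "x") as f:  # mode 'x' creates; it never overwrites
            f.write("Written by the audit team to demonstrate write access.\n")
        return marker
    except OSError:
        return None

demo_dir = tempfile.mkdtemp()  # stand-in for the exposed directory
print(prove_write_access(demo_dir) is not None)  # prints True
```

Opening with mode `"x"` is the point of the design: the proof fails safely rather than clobbering anything that already exists.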
The Audit Report
The audit's done, and you look at the report. Did you get your money's worth? If the findings follow some standard checklist that could apply to any organization, the answer is "no." If you see pages of reports generated by a vulnerability scanner, but no independent analysis, the answer is, again, "no."
While some commercial vulnerability scanners have excellent reporting mechanisms, the auditor should prove his value-added skills by interpreting the results based on your environment and a review of your organization's policies.
That analysis should reflect your organization's risks. Tools lack analytical insight and often yield false positives. You hired expert people, not tools, to audit your systems. So, how do you know if the auditor's risk assessment is accurate? For starters, have your IT staff review the findings and testing methods and provide a written response.
The auditor's analysis should follow established criteria, applied to your specific environment. This is the nitty-gritty and will help determine the remedies you implement. Specifically, the report should outline:
- The source of the threat--from internal users or the public Internet.
- The probability of exploitation. Have other sites suffered intrusions because of this exposure?
- The impact of the exposure. Bottom line, how much money--or loss of reputation, etc.--will it cost the organization if this exposure is exploited?
- Recommended actions to fix problems. Is it an amendment to the policy, stating something like, "all software must be licensed appropriately," applying patches or a redesign of the system architecture? Fix a problem when its risk outweighs the cost of repair. A low-risk problem, like not displaying warning banners on servers, is easily fixed at virtually no cost. Using an application with a history of repeated security problems may be a higher risk, but it may be more costly to integrate a more secure application. The most secure application may not be the best business application. Security is a balance of cost vs. risk.
- Potential legal liability. Could your systems become a repository for contraband (e.g., child porn, pirated software)? For example, a Web server may have an exposure that would permit an outsider to post files to it, though not overwrite content. This may not seem like a big issue, but people who trade in contraband look for untraceable storage locations for their data.
- The risk of service interruption, such as a DoS attack.
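The cost-vs-risk balance running through these report items can be made explicit with a simple expected-loss calculation (the familiar annualized loss expectancy: probability times impact). All figures below are hypothetical illustrations:

```python
# Weigh each exposure's expected annual loss (probability x impact)
# against the cost to fix it -- the cost-vs-risk balance described
# above. All probabilities and dollar figures are hypothetical.

findings = [
    # (name, annual probability of exploitation, impact $, fix cost $)
    ("missing warning banners",          0.05,   1_000,      10),
    ("unpatched web application",        0.40, 250_000,  20_000),
    ("legacy app with bad track record", 0.30, 500_000, 400_000),
]

for name, prob, impact, fix_cost in findings:
    expected_loss = prob * impact
    verdict = "fix" if expected_loss > fix_cost else "accept/monitor"
    print(f"{name}: expected loss ${expected_loss:,.0f} "
          f"vs fix ${fix_cost:,} -> {verdict}")
```

Note how the numbers track the examples in the report items: the warning banners are worth fixing because the repair is nearly free, while replacing the troublesome legacy application may cost more than the risk it carries.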
The auditor's report should include a brief executive summary stating the security posture of the organization. An executive summary shouldn't require a degree in computer science to be understood.
A statement such as "fingerd was found on 10 systems" doesn't convey anything meaningful to most executives. Information like this should be in the details of the report for review by technical staff and should specify the level of risk.
Finally, there are occasions when auditors will fail to find any significant vulnerabilities. Like tabloid reporters on a slow news day, some auditors inflate the significance of trivial security issues.
What do you say if there's nothing to say? Rather than inflate trivial concerns, the auditors should detail their testing methods and acknowledge a good security posture. To add value, they could point out areas for future concern or suggest security enhancements to consider.
However, it should be clear that the audited system's security health is good and not dependent on the recommendations. Remember, the purpose of the audit is to get an accurate snapshot of your organization's security posture and provide a road map for improving it. Do it right, and do it regularly, and your systems will be more secure with each passing year.
About the author:
Carole Fennelly is a partner in Wizard's Keys, a security consultancy in the New York City area. She has more than 20 years experience in Unix system administration, primarily focused on security.