How would you know if your organization has been breached? It's a simple question, and the answer is often an assertion of some degree of incident detection capability. However, as one CIO wryly told me during a network assessment, if he spent the time and money to build a detection architecture that actually worked, it might somehow prove he has security problems. Unfortunately, I don't think he was entirely kidding.
Despite significant advances in detection technologies, many organizations remain woefully behind in building robust capabilities to identify genuine incidents. Detection is not simply a technical toolset but a complex capability, one that ideally includes well-defined technical and process domains managed by competent staff. Weakness in any one domain severely diminishes detection effectiveness.
Unfortunately, in many organizations, detection is simply not viewed as a strategic security capability. The result is that activities are limited to deploying signature-based network intrusion detection system (IDS) sensors that track rather obvious, simplistic, and typically non-threatening actions, such as port scans. It may be worthwhile to track port scans to see who is interested in mapping your asset landscape, but these out-of-the-box IDS techniques are not significant detective capabilities against current threats. Moreover, many companies lack the analytical capability to correlate data from multiple gathering points to detect broad attack patterns or complex attacks, despite the wide availability of technologies aimed at solving this very problem.
Some choose to outsource the management of technical detection devices. However, many who do so virtually ignore the data their service provider returns to them, all the while proudly displaying their managed services contracts to regulators and auditors as supposed evidence of organizational detection capability. While such contracts seem to have satisfied many regulators and auditors, a proclamation of compliance by an auditor should not be taken by the organization as evidence of a functional capability.
Weak detection capability presents multiple problems. Most obviously, such limitations can leave organizations victimized by network intruders for extended periods, resulting in significant losses of information and perhaps affecting the availability of computing resources. Another problem is the inability to conduct an adequate forensic investigation after a breach is discovered. Sophisticated detection technologies typically provide mechanisms for capturing historical data that can be valuable in post-incident analysis; analysis results may also be combined into metrics that support the improvement of controls. Sparse event data about incidents can likewise weaken civil and criminal cases the organization may wish to pursue. Finally, failure to develop a strong detection capability could indicate negligence in technology management and therefore a failure to exercise appropriate "due care."
The creation of effective detection capabilities requires a comprehensive architecture, including both technical and process components, that works to detect activities that actually threaten organizational assets. This includes not only technical elements such as intrusion detection/prevention and security information and event management systems but supporting oversight activities as well. A technical toolset, however sophisticated, is irrelevant in the absence of sufficient operational monitoring and response processes; it must not be allowed to become the sole emphasis of detection design efforts.
Periodic network penetration testing is sometimes misinterpreted as a valid test of detection capabilities. Technical techniques such as pen testing are only effective when closely correlated with response actions, evaluating both technical detection (does the IDS identify the test attack?) and appropriate defense (did the owner of the IDS notice the automated alert and, subsequently, react appropriately?). One approach I have employed during security assessments is the use of test techniques of successively increasing severity. In other words, we knock on the door with increasing strength and watch for (a) the moment the knocking is noticed and (b) the appropriateness of the response actions relative to the attack type. This sort of testing supports evaluation of overall detection effectiveness.
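The escalation approach described above can be sketched in code. The following is a minimal, hypothetical illustration only: the probe names, severity scale, and the detection and response-grading callbacks are all assumptions for the sake of the sketch, not a real assessment tool or any specific vendor's API.

```python
# Hypothetical sketch of escalation testing: probes of increasing
# severity run until the defender's monitoring notices one, and the
# response is recorded against the probe that triggered it.
# Probe names, severities, and callbacks are illustrative assumptions.

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Probe:
    name: str
    severity: int  # 1 = benign noise, 5 = overtly hostile

def run_escalation_test(
    probes: list[Probe],
    was_detected: Callable[[Probe], bool],
    response_grade: Callable[[Probe], str],
) -> Optional[dict]:
    """Run probes in order of increasing severity; stop at first detection."""
    for probe in sorted(probes, key=lambda p: p.severity):
        if was_detected(probe):
            return {
                "detected_at": probe.name,
                "severity": probe.severity,
                "response": response_grade(probe),
            }
    return None  # nothing was noticed -- a finding in itself

# Example: a defender whose monitoring only notices severity >= 4
probes = [
    Probe("port scan", 1),
    Probe("service enumeration", 2),
    Probe("credential guessing", 3),
    Probe("exploit attempt", 4),
]
result = run_escalation_test(
    probes,
    was_detected=lambda p: p.severity >= 4,
    response_grade=lambda p: "escalated to incident response",
)
print(result)
```

The useful output of such a test is not a pass/fail grade but the severity threshold at which detection first occurred and whether the ensuing response matched the attack type.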
Organizations face a clear choice in many realms of security that can be reduced to one very basic question: Is the goal of the effort to satisfy auditors, or is it to actually identify threats in progress in order to better protect information? Those that take minimalist approaches, such as subscribing to low-quality managed detection services without designing appropriate internal processes to act on the data, are kidding themselves, their regulators, or both.
Paul Rohmeyer is a faculty member in the graduate school at Stevens Institute of Technology. He provides technology risk management guidance to firms in the financial services industry, and previously held management positions in the financial services, telecommunications and pharmaceutical industries. Send comments on this column to firstname.lastname@example.org