
Monitoring program data and internal controls for risk management

It's sad but true: Some employees are going to leak or even steal sensitive data. So what are the best ways to mitigate that risk? Learn how to create internal controls for risk management that keep your data where it belongs.


In the end, regardless of what you do, a sufficiently funded and skilled attacker will get the data he or she wants. Or, as the old Yiddish saying goes: "To a thief, there is no lock." That's not to say, however, that there's nothing that can be done. In this tip, we'll discuss ways to prevent the theft of sensitive enterprise data, specifically by insiders.

Step one: Hire trustworthy personnel
Insiders are tricky to deal with because they have and need legitimate access to a lot of critical data -- whether it be Personally Identifiable Information (PII), Personal Health Information (PHI) or corporate Intellectual Property (IP) -- in order to perform their day-to-day jobs.

As traditional background checks may be relatively ineffective at predicting future behavior, I prefer to rely on reference checks to get a better idea of whether the employee will fit in with the corporate culture and take security seriously. Though in theory references will only give positive reviews, a talented interviewer can learn a lot about an applicant's character from what the reference doesn't say.

Step two: Trust, but verify access rights
Next, discover who has access to what data, determine whether those people need it; then change access rights accordingly. For example, there are few employees outside of the finance department who need access to the day-to-day financial data of the company and, similarly, employees outside of the legal group don't need much (if any) access to data from the corporate law department. On the other hand, however, just about everyone will need access to internal portals or email.
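The compare-access-to-need step above can be sketched as a simple entitlement review. The role-to-resource matrix and the grant lists here are illustrative inventions; a real review would pull them from your directory and access-management systems:

```python
# Sketch of an access-rights review: flag grants that exceed what an
# employee's role justifies. ROLE_ENTITLEMENTS is a hypothetical matrix;
# in practice this data would come from HR and IAM systems.
ROLE_ENTITLEMENTS = {
    "finance": {"financial_data", "email", "intranet"},
    "legal": {"legal_docs", "email", "intranet"},
    "engineering": {"source_code", "email", "intranet"},
}

def excess_grants(employee_role, actual_grants):
    """Return the grants not justified by the employee's role."""
    # Everyone gets the baseline (email, intranet), per the article.
    allowed = ROLE_ENTITLEMENTS.get(employee_role, {"email", "intranet"})
    return sorted(set(actual_grants) - allowed)

# An engineer holding finance data access would be flagged for review:
print(excess_grants("engineering", ["source_code", "financial_data", "email"]))
```

The output of a review like this is a worklist for the business units, who decide whether each flagged grant is genuinely needed.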


Step three: Consider DLP, mandate data monitoring
Consider investigating data leak prevention (DLP) technologies. At a base level, a DLP product is essentially a sniffer on steroids. It can monitor files moving across a network or within an operating system and can be configured to look for particular types of data. It is popularly used to protect PII, PHI and IP. When used at the network level, sensors (generally an appliance of some sort) monitor all of the network traffic on a particular network segment or set of segments. These sensors can be deployed in passive mode -- where they monitor the network using a span port or tap -- or active mode -- where the sensor sits "on the wire" and all of the traffic flows through the device in real time. Active mode has the advantage of being able to stop leaks as they happen, but it requires redundancy so the sensor doesn't become a single point of failure.
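The core of the content inspection described above can be illustrated with a minimal sketch. The pattern names and regular expressions are assumptions for illustration; commercial DLP products use far richer techniques (document fingerprinting, exact data matching) than simple regexes:

```python
import re

# Minimal sketch of DLP-style content inspection: scan text for patterns
# that look like sensitive data. These two patterns are illustrative only.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # US SSN shape
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # crude PAN shape
}

def scan(text):
    """Return the names of the sensitive-data patterns found in text."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(text)]

print(scan("Employee SSN 123-45-6789 attached"))  # ['ssn']
```

A real sensor would run logic like this over reassembled network streams or files at rest, then block or alert depending on passive versus active deployment.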

Keep in mind, however, that DLP deployments typically help thwart accidental disclosure as opposed to deliberate disclosure, so the DLP network data "sniffing" sensors are generally placed where they can monitor things like outbound email. This isn't a bad choice, given that accidental disclosures seem to far outnumber deliberate ones, but it does mean there are other routes through which determined attackers can extract data, such as custom software that passes data over the Internet inside encrypted tunnels, or portable storage devices.

Similarly, look at database activity monitoring (DAM) technology. This is a relatively young area of infosec, but it is well worth investigating. Basically, it is an advanced form of database auditing. The monitoring can be done via several methods, including network sniffing, reading of database audit logs and/or system tables, and even memory scraping. Regardless of the methodology chosen, the tools correlate the collected data to build a more accurate picture of what's going on within the database. This correlation makes it possible to detect attacks as well as to provide forensic evidence in the case of an actual breach.
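A toy version of that correlation step might look like the following. The audit-log format, the sensitive-table list and the alert threshold are all assumptions; real DAM tools correlate across sniffed traffic, audit logs and memory rather than a single event list:

```python
from collections import Counter

# Sketch of DAM-style correlation: count queries against sensitive tables
# per user and alert when a user exceeds a threshold. All names and the
# threshold are illustrative.
SENSITIVE_TABLES = {"customers", "payroll"}
THRESHOLD = 2  # sensitive-table queries allowed before an alert fires

def correlate(audit_events):
    """audit_events: (user, operation, table) tuples. Return users to alert on."""
    hits = Counter(
        user for user, operation, table in audit_events
        if operation == "SELECT" and table in SENSITIVE_TABLES
    )
    return sorted(user for user, count in hits.items() if count > THRESHOLD)

events = [
    ("alice", "SELECT", "customers"),
    ("alice", "SELECT", "payroll"),
    ("alice", "SELECT", "customers"),
    ("bob", "SELECT", "orders"),
]
print(correlate(events))  # ['alice']
```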

When trying to prevent the loss of sensitive data, the ability to monitor data flow is an important -- if not the most important -- tool in your arsenal. The technology needed to monitor the data, however, may fall into several buckets, ranging from network flow analysis (e.g. Nagios, NetFlow, Argus, etc.) to DLP, DAM and others. In many cases (especially network flow analysis) there are open source solutions that can provide substantive cost savings. While some of this technology may require a new investment, be sure your corporate management understands that this kind of data flow insight -- the ability to recognize what is and isn't normal -- can help identify a data theft in progress, whether that's an employee copying far more data than usual to his or her machine, or data suddenly flowing from an internal server to the Internet in an encrypted tunnel. It is also a great source of potential forensic evidence for a data breach investigation, as well as a way to demonstrate an organization's compliance status to auditors.
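The "recognize what is and isn't normal" idea can be sketched as a simple baseline comparison over per-host outbound byte counts. The flow records and the three-times-baseline rule here are illustrative assumptions; collectors such as Argus or a NetFlow pipeline would supply the real numbers:

```python
from statistics import mean

# Sketch of flow-based anomaly detection: flag hosts whose outbound volume
# today exceeds a multiple of their historical average. The factor of 3 is
# an arbitrary illustrative threshold.
def anomalous_hosts(baseline, today, factor=3):
    """baseline: host -> list of daily outbound byte counts; today: host -> bytes."""
    flagged = []
    for host, history in baseline.items():
        if today.get(host, 0) > factor * mean(history):
            flagged.append(host)
    return sorted(flagged)

baseline = {"10.0.0.5": [100, 120, 110], "10.0.0.9": [400, 380, 420]}
today = {"10.0.0.5": 900, "10.0.0.9": 410}
print(anomalous_hosts(baseline, today))  # ['10.0.0.5']
```

A spike like the one flagged here is exactly the pattern described above: an employee copying far more data than usual, or an unexpected encrypted tunnel to the Internet.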

About the author:
As CSO-in-Residence, David Mortman is responsible for Echelon One's research and analysis program. Formerly the Chief Information Security Officer for Siebel Systems, Inc., David and his team were responsible for Siebel's worldwide IT security infrastructure, both internal and external. He also worked closely with Siebel's product groups and the company's physical security team, and led Siebel's product security and privacy efforts. A CISSP, Mr. Mortman sits on a variety of advisory boards, including those of Qualys, Applied Identity and Reflective, amongst others. He holds a BS in Chemistry from the University of Chicago.

The recent Verizon Business data breach investigations report is quite interesting. What stands out most is that the vast majority of the data reported stolen was taken by outsiders (74%) as opposed to insiders (20%). That's not to say you shouldn't be concerned with insiders: as the report shows, when insiders were involved, the impact to the organization was significantly higher (by approximately a factor of three!).

Access rights should be verified regularly (approximately every three to six months) to ensure they are current and appropriate. As individual employees change roles within the organization, their access rights should also be checked and changed as necessary. The infosec team should implement these changes, but the brunt of the work must be done in concert with the business units, as they are the ones with insight into what constitutes appropriate access for each employee.
This was last published in July 2009
