Every day, compliance professionals make determined efforts to satisfy the litany of laws, regulations, and policy mandates that make up the information security world, all while fending off relentless attacks from faceless adversaries -- often with limited resources and varying degrees of institutional support. And yet, in this eternal pursuit of the two elusive (if not entirely theoretical) goals of data security and security compliance, there are common mistakes that tend to trip people up.
For those who are subject to PCI compliance requirements, PCI DSS at least provides some specific, even prescriptive, requirements for internal systems and structures. But it doesn’t provide the framework for a security program. There are practical ways for organizations to build on those technical specifications and focus their efforts on satisfying their compliance mandate while also building a robust, comprehensive information security program.
One of the most important points to realize is that “compliant” does not mean “secure.” This is often said, but it bears repeating in almost every context in which we discuss security compliance. Any given standard, including PCI DSS, should be considered the minimum standard, not the maximum. The blessing and the curse of PCI in particular is that it is very technology focused, almost to a fault. For example, the first validation section starts with firewalls, when everything should start with policy. Organizations need to go beyond those minimum standards towards something much broader by developing a data inventory, conducting a risk assessment, implementing technical controls, and having a vendor management program. The simple and indisputable fact is, the stronger you build the program, the easier compliance gets.
There are three key principles that are common to all data security laws and regulations, including PCI:
- Data inventory: What and where are my information assets?
- Risk inventory: What exactly could go wrong?
- Risk controls: What exactly are we prepared to do about it?
Most of what is covered in PCI and similar standards, such as the ISO 27000 series, tends to focus on the third bullet -- risk controls. But, to truly build robust, dynamic programs, we need to focus heavily on all three areas.
BUILD A DATA INVENTORY
A universal problem for organizations is that they don’t know where their data is. They know where most of it is, or maybe some of it, but not all of it. This is a serious problem. The fact is, unless you can truly identify your data assets (including applications, databases, and systems) you’re chasing a ghost, and compliance, let alone true security, becomes a frustrating, moving target. Imagine an army general attempting to engage in battle without knowing exactly what troops and technological resources were available to him or what his enemies possessed. How could he know how to strategize? And yet, organizations do it every day based on only a general knowledge of where data is kept while living with a gut instinct that there is data in places they don’t even know about. Here is a harsh truth: Data security does not mean securing most of the data, it means securing all of the data.
PCI in particular is very clear on this point. In the “Getting Started” section of the standard, it states: “To assess is to take an inventory of your IT assets and business processes.” And yet, oddly enough, many organizations skip over this step, instead going right towards vulnerability assessments.
You might be thinking about how much work a data inventory would take. But ask yourself: how did data get there without your knowing about it in the first place? Take PCI section 7 as an example (“Restrict access to cardholder data by business need-to-know”). How could you possibly claim compliance if you don’t even know all of your data assets? You can’t simply point to a controlled process and say, “Well, we’ll just manage access on a go-forward basis.”
Here are some practical suggestions for moving towards better data inventories, which, in turn, will translate to better compliance. If you don’t have a central inventory system, create one. Start at the highest level and work toward more granular ones as you gather more data. If necessary, start at a facility level, then move down to server groups, then applications and databases. Once you’re down to the application level, identify business owners and administrators. Data sources need to be classified by the type of data they hold and its level of sensitivity.
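The hierarchy described above can be sketched as a simple record structure. This is a minimal, hypothetical layout -- the level names, fields, and sample assets are illustrative, not drawn from PCI DSS or any inventory product:

```python
from dataclasses import dataclass, field
from typing import Optional, List

# Hypothetical record for a central data inventory, moving from the
# facility level down to applications and databases as data is gathered.
@dataclass
class DataAsset:
    name: str
    level: str                       # "facility" | "server_group" | "application" | "database"
    parent: Optional[str]            # next level up in the hierarchy
    business_owner: Optional[str]    # identified once we reach application level
    administrator: Optional[str]
    data_types: List[str] = field(default_factory=list)  # e.g. "cardholder", "PII"
    sensitivity: str = "unclassified"                     # e.g. "public" .. "restricted"

# Start at the highest level and refine downward.
inventory = [
    DataAsset("HQ-East", "facility", None, None, None),
    DataAsset("pay-cluster", "server_group", "HQ-East", None, None),
    DataAsset("billing-app", "application", "pay-cluster",
              "A. Owner", "B. Admin", ["cardholder"], "restricted"),
]

# A quick report: every restricted asset and its business owner.
restricted = [(a.name, a.business_owner)
              for a in inventory if a.sensitivity == "restricted"]
```

Even a flat structure like this supports the reporting that matters: once every asset carries an owner and a classification, questions such as “who owns every system holding cardholder data?” become a query rather than a research project.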
Once you have your base of data, you can start mapping other elements to it. What are the related internal controls for access rights, change management, planned obsolescence, data retention, fault tolerance, and perimeter controls? How many people (internally and externally) have touch-points to the data? The bigger the number, the bigger the risk. How close is each data asset to the Internet or other external points of connectivity? What monitoring controls are in place for major data assets? Use this information to validate assumptions about separation of duties (e.g., total separation between development, testing, and production regions). Going beyond the big -- and really obvious -- systems, what are users putting in personal and shared network drives? How freely are analytics groups allowed to replicate data and move it into other regions? What, exactly, is being shared with third parties?
Developing this type of inventory is not easy and takes time. But if organizations did nothing else but this, they would discover a whole new level of confidence in the strength of their program. Once you know where the gold is, you can strategize how to keep it safe.
PERFORM A RISK ASSESSMENT
The second biggest problem with data security compliance is the area of risk assessments. The fact is that information security cannot be managed to perceived risk; it must be managed to “assessed” risk. It’s too easy to spend time focusing on controls that are intuitive and just “feel” like they should be in focus. An organization’s biggest risks often are never in exactly the same place they are guessed to be. Beyond the technical and structural aspects of risk assessments, there are a few related points to consider.
The first is cultural. Many organizations struggle with a cultural willingness to acknowledge that risk exists, as if not talking about it somehow makes it not so. The fact is, because confidential data must be used by staff and trusted third parties, we live with a great deal of residual risk. It’s critical for management to understand these risks so they can make informed decisions about how much risk to accept. For organizations that do struggle with a cultural challenge, the path to acceptance is to take an incremental approach. Start with new project initiatives and develop an assessment around the incremental risk that the project represents.
Another step is to develop risk assessments around emerging threats. This not only gives management an appreciation for the scope of some of these threats, but it gives the information security staff a forum to talk about the preventative and monitoring controls that are in place, as well as the residual risk that must be accepted. Over time, management will learn to appreciate these assessments and understand why they should always be the basis of strategic decision making and a key part of rationalizing information security spending.
A sound risk assessment methodology is particularly critical when it comes to the more prescriptive standards, such as PCI. There may be a section that the organization chooses to opt out of or approach differently, and that decision must be backed up by a detailed risk assessment. Unfortunately, many people see risk assessments as a waste of time because they “already know” where their risks are. Mature organizations make them a part of everyday life.
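To make “assessed” rather than perceived risk concrete, here is a minimal likelihood-times-impact scoring sketch. The scale, scenario names, and scores are purely illustrative assumptions, not taken from PCI or any formal methodology:

```python
# Illustrative 3-point scale; real methodologies use richer scales and criteria.
SCALE = {"low": 1, "medium": 2, "high": 3}

def risk_score(likelihood: str, impact: str) -> int:
    """Simple likelihood x impact product for ranking purposes."""
    return SCALE[likelihood] * SCALE[impact]

# Hypothetical scenarios with assessed (not guessed) ratings.
assessments = {
    "third-party file transfer": risk_score("medium", "high"),
    "unencrypted laptop":        risk_score("low", "high"),
    "internal test data copy":   risk_score("high", "medium"),
}

# Rank so that controls and spending follow assessed risk.
ranked = sorted(assessments.items(), key=lambda kv: kv[1], reverse=True)
```

The value of even a crude model like this is that it forces the ranking conversation into the open: an intuitive favorite that scores low must be defended with evidence, and a documented score gives management a basis for accepting (or funding the reduction of) each residual risk.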
DEPLOY DATA OBFUSCATION
Now we come to the subject of data obfuscation, including encryption, tokenization, and sanitization. Building on the data asset inventory and risk assessment, this is a process of looking at how data is stored, accessed, and transported, and then mapping strategies to restrict that access to those with a clearly defined “need to know” requirement. Unfortunately, too many times (particularly with third-party data sharing) convenience trumps protection, but that isn’t an acceptable model for data governance.
Hopefully, we are moving towards a model where all data, whether at rest or in motion, remains encrypted right up until the point it is used by an authorized party. PCI and most industry-specific guidelines are clear about mandating encryption of data outside the perimeter (either in motion to remote offices or trusted third parties or at rest on laptops, etc.) and continue to raise the bar with data at rest. Fortunately, advances in appliance-based encryption are supporting this trend. While continuing to pursue a strategy of encrypting everything possible, the analysis needs to be risk-based. If you’re going to allow clear text, you had better be able to rationalize it.
Tokenization, replacing sensitive key values with representative but meaningless strings, and sanitization, removing sensitive fields altogether, are strong, effective strategies for minimizing data exposure, even with trusted individuals. Too many times, companies provide live key fields to third parties, largely because it was the convenient approach. This is clearly not acceptable. There is good emerging technology for data tokenization and worth considering for any organization that has to extend data to third parties without the need for key fields. Again, this must be based on the data inventory and the risk assessment.
Part of that assessment is identifying not just who receives the data, but what data they need. Often this analysis ends at the table level for convenience, when in reality this should be done at the field level. Every field that is disclosed must be justified and documented. Where a key field is needed, a token should be used unless the actual key, such as account number, is needed in order to perform a service. Where sensitive fields are not needed at all, they should be completely removed (sanitized). There is no question that this is added work, but it simply must be done if the data is to be properly protected. Remember one of the few absolute truths in data security: The fewer instances of the data out there, the less risk.
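The field-level discipline above can be sketched in a few lines. This is an illustrative example only: it uses a keyed HMAC as a stand-in for a deterministic token, and the key, field names, and helper are all hypothetical. A production system would use a vaulted tokenization service with proper key management rather than an in-code secret:

```python
import hmac
import hashlib

SECRET = b"rotate-me-in-a-real-key-store"  # hypothetical key; never hard-code in practice

def tokenize(value: str) -> str:
    """Replace a sensitive key value with a representative but meaningless string."""
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]

def prepare_for_vendor(record: dict, keep: set, token_fields: set) -> dict:
    """Keep only justified fields; tokenize key fields; sanitize (drop) everything else."""
    out = {}
    for name, value in record.items():
        if name in token_fields:
            out[name] = tokenize(value)
        elif name in keep:
            out[name] = value
        # any other field is sanitized: it is simply never emitted
    return out

record = {"account_number": "4111111111111111", "zip": "19103", "ssn": "123-45-6789"}
shared = prepare_for_vendor(record, keep={"zip"}, token_fields={"account_number"})
# 'ssn' never leaves; 'account_number' becomes a stable token; only 'zip' is clear text.
```

Because the token is deterministic, the third party can still join records on the account field across files without ever seeing a real account number, which is exactly the “key value for record identification” case described above.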
Organizations should also ensure production data is never used in development, quality control, or test environments unless it has been fully obfuscated (via tokenization or sanitization, as appropriate). This may require some work, but it is not negotiable.
The bottom line here is that gaining access to sensitive data should be an excruciating process. If not, something is wrong. This is the essence of data security compliance.
STRENGTHEN VENDOR MANAGEMENT
An effective information security program -- and true security compliance -- means actively and aggressively managing the information flow with trusted third parties. The fact remains that while they are telling you all the right things in your due diligence questionnaires, you can’t be there day in and day out to monitor their operations. But this is a key area where we can apply governance principles of a data inventory and risk assessment in a profound way.
We start with data inventory: Exactly what data does each third party have and why? It’s so much easier to send a whole data set to a third party instead of sorting out what fields that third party requires to provide its product or perform a service, and nothing more. To provide more in the name of simplicity is just bad management. At any given point in time, an organization should be able to run a report that indicates exactly what third parties have confidential data, what they have, and for how long. Not having this inventory is inexcusable.
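The report described above needs nothing more than a ledger of every data-sharing arrangement. The structure, vendor names, and fields below are hypothetical, a minimal sketch of what such a ledger could look like:

```python
from datetime import date

# Hypothetical ledger: one row per third-party data-sharing arrangement.
sharing_ledger = [
    {"vendor": "PrintCo",  "fields": ["name", "address"],
     "purpose": "statement printing", "retain_until": date(2025, 6, 30)},
    {"vendor": "Analytix", "fields": ["account_token", "spend"],
     "purpose": "spend analytics", "retain_until": date(2024, 12, 31)},
]

def vendors_holding(field_name: str) -> list:
    """Report every third party currently holding a given field."""
    return [row["vendor"] for row in sharing_ledger if field_name in row["fields"]]

def ledger_report() -> list:
    """The 'at any given point in time' report: who has what, and for how long."""
    return [(row["vendor"], row["fields"], row["retain_until"])
            for row in sharing_ledger]
```

With this in place, “run a report that indicates exactly what third parties have confidential data” is one function call, and every row doubles as the field-level justification the vendor risk assessment should reference.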
Next is the third-party risk assessment: Every third party that receives any level of confidential data (or provides a key service) should have an associated risk assessment performed by the appropriate business unit. What attestation has the third party given on its internal controls? What monitoring controls are in place? What did the analysis of the due diligence conclude? What would be the impact of a data breach within a given third party? These are key questions that should be analyzed and updated annually, at least.
And finally, there’s third-party data sanitization. Data in motion to and from a third party must be encrypted; no exceptions. (I hate to tell you this, but somebody in your organization is emailing the information to a third party in a zip file or password-protected spreadsheet and thinks he’s in compliance. I guarantee it. A little tip: your gateway appliance can catch this.) Data that requires key values for record identification, but not an actual account number, must be tokenized. Yes, this takes a lot of work, but it has to be done. Finally, data that does not require key fields (account numbers, card numbers, etc.) must be scrubbed (all key values removed).
Part of the vendor management process, therefore, needs to include not just documenting what data will be exchanged, but why, how, how often, by whom, and for how long. This information must be re-evaluated at least once a year to see if the scope of the relationship has changed. On contract termination, a written attestation should be received from the third party confirming all sensitive data has been destroyed unless the contract provides otherwise.
WRITE IT DOWN
The final area involves documentation -- a notoriously under-addressed area. Many of the relevant laws and regulations are vague on documentation, although the most recent PCI update did expand on this area with regard to testing materials. The fact is that you can never have enough documentation, and the rule needs to be: “If it isn’t written down, you don’t get credit.”
While organizations have gotten better at documenting internal controls (largely driven by Sarbanes-Oxley), I am always amazed within any given program how much remains undocumented or under-documented, including:
- Asset inventories
- Risk assessment methodologies and findings
- Risk assumptions
- Security standards (particularly in source code)
- Testing methodologies
- Process flows
- Data handling protocols (typically based on data classifications)
- Lessons learned from errors and incidents
The simple fact is that an organization is not healthy until it develops an almost cult-like devotion to clear, comprehensive documentation. This not only makes the organization stronger, but makes the process of evidencing compliance status infinitely easier. It’s disappointing to hear a professional say, “We just don’t have the budget in the project for proper documentation.” Truly mature organizations make the time and do whatever it takes.
While security standards like PCI DSS and ISO provide some excellent guidance in specific areas to validate, we can never lose sight of the need for a larger program that involves detailed inventories, comprehensive risk assessments, and active data governance that goes well beyond the standards’ prescriptive provisions. Taking these steps helps ensure organizations build strong, resilient -- and compliant -- information security programs.
Eric Holmquist is President of Holmquist Advisory, LLC, which provides consulting to the financial services industry in risk management, operations, information technology, information security and business continuity planning. Send comments on this article to firstname.lastname@example.org.