Analysis: PCI Tokenization Guidelines offer clarity, but questions remain

Expert Diana Kelley says the new PCI Tokenization Guidelines pave the way for CDE tokenization, but some technical specifications remain unclear.

For years, security experts have touted the value of credit card tokenization for limiting PCI scope. The National Retail Federation (NRF) listed tokenization in its January 2009 “Key PCI Best Practices” document, and Gartner Inc. analysts John Pescatore and Avivah Litan explained how tokenization can be used to reduce PCI scope in their August 2009 research note, “Using Tokenization to Reduce PCI Compliance Requirements.”

Now, following the long-awaited release of its PCI Tokenization Guidelines in August 2011, the PCI Security Standards Council (SSC) has made it official: tokenization can reduce scope for PCI audits. Organizations that were waiting for the council’s opinion can now forge ahead with implementations, knowing that credit card tokenization is approved for use in a PCI DSS-compliant cardholder data environment (CDE). That in itself will be welcome news to many merchants. In this tip, we’ll cover the basics of PCI tokenization, what the guidelines do and don’t address, and how merchants can move forward despite the gaps that remain.

The basics of PCI tokenization
Tokenization isn’t unique to payments. Generically, it is the process of taking one data string, such as a 16-digit primary account number (PAN), and replacing it with a different string that is unusable for the original purpose. In the case of credit cards, this means replacing a valid card number with an invalid one that can’t be reused as a valid card number if stolen. The replacement string can have the same format as the original so it works with existing systems, but it can’t be used the way the original can: the token may look like a valid PAN, but it can’t be used to purchase anything. The benefit of tokens is that since they aren’t PANs, they don’t need to be protected like PANs. For example, they can be stored in cleartext and -- as long as they’re stored outside of the CDE -- they aren’t in scope for the PCI audit.
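To make the idea concrete, here is a minimal sketch, in Python, of how a tokenization system might swap a PAN for a format-preserving random token. The in-memory vault and function names are illustrative assumptions, not part of the PCI guidelines or any vendor’s product:

```python
import secrets

# Minimal sketch of a token vault. The in-memory dict is for illustration
# only; a real card data vault is a hardened, segregated system.
_vault = {}  # token -> PAN mapping, known only to the vault

def tokenize(pan):
    """Swap a 16-digit PAN for a random token with the same format."""
    while True:
        # Preserve the BIN (first six) and last four digits so card-shaped
        # fields keep working; randomize the middle six digits.
        middle = "".join(secrets.choice("0123456789") for _ in range(6))
        token = pan[:6] + middle + pan[-4:]
        if token != pan and token not in _vault:
            _vault[token] = pan
            return token

def detokenize(token):
    """Recover the original PAN -- possible only inside the vault."""
    return _vault[token]
```

Because the token is random rather than derived from the PAN, it reveals nothing about the original number; some implementations also ensure tokens fail a Luhn check so they can’t be mistaken for live card numbers.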

Though tokenization is a great way to reduce PCI DSS audit scope, it’s not a way around PCI compliance overall or PCI validation. Organizations need to comply with the standard whether they’re using tokenization or not. What tokenization brings to the table is the ability to make both of these tasks easier by reducing the size of the in-scope audit surface.

What’s in the PCI Tokenization Guidelines?
The PCI tokenization guidelines do a good job of spelling out the basics of tokenization and how it works in a payment ecosystem. There is coverage of the different components of a tokenization system and graphic representations of how the process works. The guidelines are clear that if a merchant or retailer can recover the PAN, then the merchant’s entire CDE is in scope for PCI DSS compliance. This is a critical point for companies looking to tokenization to reduce PCI scope. To maximize scope reduction, review the proposed solution and implementation plan to make sure PANs are not retrievable by any component in your CDE, and that the card data vault where PANs are stored is on a separate network, such as in the payment processor’s cloud. If a vendor solution can’t limit component access to PANs and separate out the card data vault, look for another vendor.

The guidelines also do a nice job of showing how to reduce PCI scope by contracting with a token service provider (TSP), and they clarify the TSP relationship by explaining the PCI-related roles and responsibilities of merchants and TSPs. By outsourcing some of the tokenization work, a merchant can transfer part of the responsibility for PCI compliance to the TSP. Again, this isn’t a “free pass” out of PCI DSS compliance; rather, it’s a question of how limited the PCI compliance scope and audit can become. A merchant that never views or stores a PAN, and doesn’t have systems or access to systems that can view or store the PAN, will have a greatly reduced audit surface.
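As a rough sketch of what the merchant side of that arrangement might look like, here is the kind of call a merchant system could make so that only the TSP ever stores the PAN. The endpoint, field names, and authentication scheme below are hypothetical, not any real TSP’s API:

```python
import requests  # third-party HTTP client (pip install requests)

# Hypothetical TSP tokenization endpoint -- illustrative only.
TSP_URL = "https://tsp.example.com/v1/tokenize"

def tokenize_via_tsp(pan, api_key):
    """Send the PAN to the TSP and keep only the returned token."""
    resp = requests.post(
        TSP_URL,
        json={"pan": pan},  # hypothetical request field
        headers={"Authorization": "Bearer " + api_key},
        timeout=10,
    )
    resp.raise_for_status()
    # The merchant persists this token, never the PAN itself.
    return resp.json()["token"]
```

The design point is that the merchant handles the PAN only transiently during the payment flow; everything that stores or retrieves card data afterward sees only the token.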

What the PCI tokenization guidelines don’t cover
The PCI tokenization guidelines are also notable for what they don’t include. The document doesn’t spell out the technical specifications for a PCI-compliant tokenization implementation, which is a shame, since the Council itself admits that “with many evolving technologies, there is currently a lack of industry standards for implementing secure tokenization solutions in a payment environment.” The Council also has no plans for a certification program like the existing ones for payment applications (PA-DSS) and PIN Transaction Security (PTS) devices, or the proposed validation requirements for point-to-point encryption products (P2PE). So, for now, merchants and retailers can use the guidelines as a baseline, but they must perform their own due diligence on tokenization vendors and work with their auditors or QSAs to ensure the solutions are implemented in a compliant manner.

The other piece of information that would’ve been nice to see in the guidelines is a better definition of what a token is. This may sound a bit counterintuitive, but it’s actually an important point, because not all tokenization offerings are created equal. The tokenization guidelines themselves confuse the issue a bit by listing one-way hashes and mathematically reversible, cryptographically derived values as tokens. An underlying tenet of token security is that the token is irreversible; in other words, it can’t be used to derive the original data value. Reversibility is different from de-tokenization (the process of recovering the original value using the tokenization server): a token is reversible if the original value can be derived from the token alone. Token systems that use easily reversible methods (such as overly simplistic one-way hashing schemes) to generate tokens are akin to poor “encryption” mechanisms (such as XOR) that allow attackers to easily recover the original cleartext from the ciphertext. Arguably, if a “token” is reversible, it’s not a token at all, and it shouldn’t be used to protect cardholder data. It is disappointing that the Council wasn’t explicit on this point.
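To see why this matters, consider a hypothetical “token” built from an unsalted SHA-256 hash of the PAN. An attacker who knows the BIN (first six digits) and the last four digits only has to guess the six digits in between, so the hash can be reversed by brute force in seconds; the values below are made up for illustration:

```python
import hashlib

# Why an unsalted hash makes a weak "token": with the BIN and last four
# digits known, only six digits are secret, so a million guesses suffice.
def weak_token(pan):
    return hashlib.sha256(pan.encode()).hexdigest()

def reverse(token, bin6, last4):
    for middle in range(10 ** 6):          # enumerate the unknown digits
        candidate = "%s%06d%s" % (bin6, middle, last4)
        if weak_token(candidate) == token:
            return candidate               # original PAN recovered
    return None

pan = "4111111234565678"                   # fabricated example PAN
assert reverse(weak_token(pan), pan[:6], pan[-4:]) == pan
```

A randomly generated token, by contrast, has no mathematical relationship to the PAN, so there is nothing to brute-force.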

While many tokenization vendors are doing the right thing, the merchant is responsible for evaluating the cryptosystem, technically vetting the implementation for acceptability, and making an appropriate product choice, all in the absence of a specific technical reference standard. If that sounds hard, it is: It’s hard for trained evaluators in the business of reviewing cryptosystems and their implementations, and it’s harder still for merchants whose information security staff, if they have any, are already overloaded. For maximum protection, look for vendors that generate tokens using strong, irreversible methods.

But don’t let the lack of a validation program or vendor certification scare you. Tokenization is an excellent way to reduce scope and can greatly reduce the audit burden for many retailers and merchants. Just be sure to follow the Council’s guidance when assessing vendors’ tokenization products, and pick one that can be implemented in accordance with PCI requirements. Done correctly, tokenization provides strong card data protection, shrinks the audit surface, and can save time and money during the audit process -- and that’s a good thing.

About the author:
Diana Kelley is a partner with Amherst, N.H.-based consulting firm SecurityCurve. She formerly served as vice president and service director with research firm Burton Group. She has extensive experience creating secure network architectures and business solutions for large corporations and delivering strategic, competitive knowledge to security software vendors.

This was first published in September 2011
