
PCI Council issues long-awaited PCI tokenization compliance guidance

Robert Westervelt, News Director

Using tokenization technology to eliminate credit card data can reduce the scope of a Payment Card Industry Data Security Standard (PCI DSS) assessment, but merchants must be careful to avoid many pitfalls associated with the technology, according to a new report issued today by the PCI Security Standards Council (PCI SSC).

The long-awaited PCI DSS Tokenization Guidelines (.pdf) outline how tokens can be used in merchant systems and ways to properly deploy the technology, which substitutes tokens in place of primary account numbers (PANs) to limit the movement of cardholder data in the environment. A properly deployed system in certain merchant environments can “potentially” reduce the merchant’s effort to implement PCI DSS requirements, according to the report.
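
To make the substitution concrete, here is a minimal, purely illustrative sketch (in Python) of the vault-style approach the guidelines describe: the PAN is swapped for a random surrogate value, and only a tightly controlled vault component can map the token back. The function names, token format and in-memory vault are assumptions made for illustration, not anything prescribed by the Council.

    import secrets

    # Hypothetical in-memory "token vault" mapping tokens back to PANs.
    # In practice this mapping lives in a hardened store that stays fully
    # in scope for PCI DSS, as the guidance notes.
    _vault = {}

    def tokenize(pan: str) -> str:
        """Replace a primary account number (PAN) with a surrogate token."""
        # Random token that keeps only the last four digits for display;
        # it cannot be reversed without access to the vault.
        token = "tok_" + secrets.token_hex(8) + "_" + pan[-4:]
        _vault[token] = pan
        return token

    def detokenize(token: str) -> str:
        """Recover the original PAN; only the vault component should do this."""
        return _vault[token]

    # Downstream merchant systems store and pass around only the token.
    token = tokenize("4111111111111111")
    print(token)              # e.g. tok_5a1f..._1111, usable for lookups and reporting
    print(detokenize(token))  # 4111111111111111 -- requires vault access

Because the token carries no exploitable card data on its own, the value of a stolen token depends entirely on how well the vault and the detokenization interface are locked down -- which is why, as Russo notes below, the token itself must be rendered unusable if it is stolen.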

In its tokenization guidance, the PCI Council stresses that an enterprise’s entire tokenization system -- and all the components connected to and working with the system -- remain within scope of PCI and require a full assessment against all of the controls of PCI DSS. Larger organizations with more complex payment systems and processes may not see a dramatic reduction in scope if tokenization is introduced, said Bob Russo, general manager of the PCI SSC. On-premises tokenization systems have the least potential for reducing scope, while hybrid and outsourced systems can reduce the PCI scope dramatically.

“By buying one or more of these technologies, you don’t negate the need for PCI DSS compliance,” said Russo. “When you are scoping these things out, you need to consider the fact that the token itself must be rendered unusable if it is stolen.”

The tokenization document mirrors the Visa Best Practices for Tokenization (.pdf) report, which was issued last summer. Tokens used within merchant analytical systems and payment applications may not need the same level of security protection as the cardholder data they replace. Still, Russo said, a properly deployed system needs a minimum level of access controls and monitoring to ensure its integrity.

“Having the Council say tokenization ‘can’ reduce scope matters,” said Diana Kelley, founder and partner of Amherst, N.H.-based consultancy SecurityCurve. “My surprise is how long it took for the Council to come out and say it. I know they need to weigh each statement carefully, but it really did take a long time; a full year after the Visa best practices came out.”

There are no plans to create a certification program for tokenization systems, although Russo wouldn’t rule one out. The council oversees validation of payment applications under its Payment Application Data Security Standard (PA-DSS). It also validates PIN pad devices to help merchants better assess software and devices.

The tokenization report also sheds light on ways to evaluate the plethora of tokenization vendors on the market. Enterprises need to evaluate how a vendor secures its system, the support it provides and the token algorithm it uses, since every vendor uses its own formula, according to Troy Leach, CTO of the PCI SSC, who oversaw the special interest group (SIG) that created the guidelines.

Leach said the document took a long time to iron out because participants in the SIG, made up of a few merchants and many tokenization vendors, were at loggerheads over certain portions of the proposal. Disagreements ran the gamut, he said, because every vendor has a different way to tackle the same problem.

Vendors had difficulty coming to agreement on supported token formats and the cryptography of tokens. All the vendor systems were using the term “tokenization” but, Leach said, “they were so far apart that at times we questioned whether or not we were wrestling with one domain of methodology or the technologies were so diverse that they were actually addressing the same issue – trying to render cardholder information unreadable – but so differently that it was two or three domains.” 

The SIG had four feedback cycles, and the final document was updated as late as Monday, said Leach, who noted there were “significant conflicts at times” between various SIG members during the process of creating the document.

“The vendors left a lot of big chunks, saying they just can’t come to agreement on certain areas and they left it up to the PCI Council to resolve issues and come to some common ground,” he said. “We weren’t going to get unanimous decisions on everything, but the document shows a lot of dedication from all the participants.”

