The Payment Card Industry Security Standards Council (PCI SSC) is nearing completion of a new PCI tokenization guidance document, outlining how merchants can use the fledgling technology in the payment process.
The guidance is expected to somewhat mirror a recent tokenization best practices document issued in July by Visa Inc. Gary Palgon, who leads the PCI SSC Tokenization Working Group, one of four working groups in the PCI SSC's Scoping Special Interest Groups, said the new guidance will be similar to the recently issued point-to-point encryption and EMV guidance papers. The PCI DSS tokenization guidance will eventually be followed by validation requirements, which can be used by qualified security assessors when evaluating a merchant's environment, he said.
In an interview with SearchSecurity.com, Palgon, who is vice president of product management at Atlanta, Georgia-based tokenization vendor nuBridges Inc., said the council could move faster in codifying some of the available guidance, but instead it is moving in small stages so that future updates to PCI DSS are well vetted.
"There is a distancing of the actual standard from the guidance to give merchants, service providers and others different methodologies from a technology standpoint to comply with the standards that are out there," Palgon said.
The PCI Council is releasing guidance documents around emerging technologies, including tokenization. Once the validation requirements are in place, will the guidance eventually be integrated into the standards, or is the goal to keep them out of the standards and issue guidance documents moving forward?
Gary Palgon: One of the changes they announced, which will be released as part of the updated standards at the end of October, is a move from a two-year review cycle to a three-year cycle, and they attribute that change to the maturity of the standards. With the 2.0 updates they're issuing for PA-DSS as well as PCI DSS, they're not increasing the number of requirements, but they are clarifying existing ones.
There is still a lot of ambiguity or interpretation to be made, and that's where they're hoping these guidance documents help. The first couple of guidance documents to come out -- the point-to-point encryption and EMV documents -- are guidance documents to be followed by a validation document. From what they've said, they don't intend to codify the technology level into the actual PCI requirements. They are leaving the requirements as principles or objectives, and leaving the technology level as methodologies to address them. There is a distancing of the actual standard from the guidance to give merchants, service providers and others different methodologies from a technology standpoint to comply with the standards that are out there.
As the person leading one of these special interest groups, how do you feel about the fact that the standard won't likely be updated with specific information on some of these emerging technologies?
Palgon: The overall goal of the payment card industry's initiative is to reduce risk and secure the information. From a technology standpoint, they've called them emerging technologies, but the fact of the matter is that even tokenization has been around for several years, and organizations are using it successfully to increase the level of security beyond that of the actual PCI DSS requirements. I think the guidance gives merchants options. From a security standpoint, companies need to take it a step further and really understand how to best protect their organizations. The PCI Council could go further and faster in codifying or approving some of the decisions that are out there. We made great progress this past year, especially in the last two and a half months, when the technical working group -- the group that actually releases the standards and these guidelines -- began working in tandem with the special interest groups. From a personal standpoint, even from a nuBridges standpoint, we would like the council to move quicker in approving and blessing these solutions and technologies. I think they're just taking it one step at a time.
The recent guidance documents around point-to-point encryption and EMV technologies laid out that although encryption itself is mature and has been used for years, the new products hitting the market are immature in terms of implementation. Do you agree with that, and how does it relate to tokenization? Some of these new products have a tokenization piece in them.
Palgon: Tokenization solutions obfuscate the data by mapping it to a surrogate value. You are still storing the credit card number and handling key management as part of that, but you remove the mathematical relationship between the token and the information being stored. So encryption is definitely a part of all the tokenization solutions out there. It used to be framed as end-to-end encryption versus tokenization, but we're now seeing a convergence.
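The distinction Palgon draws can be sketched in code: the token is generated randomly, so nothing about it can be mathematically reversed, and only a vault lookup recovers the card number. This is a minimal illustration, not any vendor's implementation; the class and method names are hypothetical, and a real vault would also encrypt the stored card number and manage the keys, which is elided here.

```python
import secrets

class TokenVault:
    """Toy token vault: tokens are random, so there is no mathematical
    relationship between a token and the card number it stands for.
    A real vault would also encrypt the stored card number (PAN) and
    handle key management; that part is omitted in this sketch."""

    def __init__(self):
        self._token_to_pan = {}
        self._pan_to_token = {}

    def tokenize(self, pan: str) -> str:
        # Reuse an existing token so analytics keyed on tokens stay stable.
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        token = secrets.token_hex(8)  # random, not derived from the PAN
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"                    # surrogate, not the PAN
assert vault.detokenize(token) == "4111111111111111"  # vault lookup recovers it
```

Because the token is drawn from a random source rather than computed from the card number, stealing tokens alone reveals nothing about the underlying card data; the security of the scheme rests entirely on protecting the vault.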
Another area outlined in the recent point-to-point encryption guidance was a warning to merchants that there is a possibility of vendor lock-in. Is vendor lock-in an issue for tokenization?
Palgon: From the PCI Council's standpoint, they're saying they don't intend to radically change the actual requirements year after year, but they also want to caution merchants that when they choose a technology, they don't get locked in. There are specific circumstances to consider. For instance, with tokenization, if you select a single outsourced tokenization provider for your payment processing and then decide to change providers later on, you should know the ramifications of that. When a merchant decides to change processors, the merchant has tokens in data warehouses and loss prevention systems that it has been able to use for analytical purposes. What happens when you switch providers? A new provider won't know what values those tokens align with. From a tokenization perspective, it's important to caution merchants that they need to ensure they have the ability to export the relationship between the tokens and the ciphertext if they want to change providers.
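The export requirement Palgon describes might look something like the following. This is a hypothetical illustration only -- the vault contents, token names and `export_token_map` function are invented for the example -- and in practice the mapping would be transferred under strict encryption and key-management controls, never as plaintext CSV.

```python
import csv
import io

# Hypothetical in-memory token vault held by the outgoing provider:
# token -> stored card number. Values here are invented test data.
vault = {
    "tok_9f3a": "4111111111111111",
    "tok_17c2": "5500005555555559",
}

def export_token_map(vault: dict) -> str:
    """Serialize the token-to-card mapping so a successor provider can
    re-import it and keep the merchant's existing tokens meaningful."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["token", "pan"])
    for token, pan in sorted(vault.items()):
        writer.writerow([token, pan])
    return buf.getvalue()

dump = export_token_map(vault)
assert "tok_9f3a" in dump and "4111111111111111" in dump
```

Without such an export path, every token sitting in the merchant's data warehouses and loss prevention systems becomes an opaque, unresolvable value the moment the old provider is dropped -- which is exactly the lock-in the guidance warns about.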
When the supplemental guidance on tokenization is released and the validation requirements are put in place, do you anticipate certain tokenization technologies being codified in PCI DSS?
Palgon: They're hoping for an end-of-November or early December delivery of the tokenization guidance document. We don't have a date for the actual validation document. It will be my group's responsibility to begin pulling that together. They're very much interested in us collaborating with them to produce that. We're going to put together what we think are the right check boxes to ensure validation while they go off in the short term and improve the guiding principles we put in place there. For point-to-point encryption, they've said it will be out for comment in 2011. If I had my druthers, not only would it be out for comment, but it would be validated toward the latter stages of 2011, and moving forward companies could move down that validation checklist for EMV and point-to-point encryption to ensure they meet everything from both a requirement standpoint and a validation standpoint.
As an example of the lack of maturity with tokenization, some experts have pointed out certain architectural deficiencies, such as the lack of application programming interfaces (APIs) to work with certain payment applications. Is that an issue right now?
Palgon: From an overall application standpoint, I think the answer is yes. When we first started dealing with PCI DSS and encryption was the only game in town, we went through and dealt with probably more than 15 different vendors, and with each one you had to learn to exchange keys with point-of-sale and loss prevention systems. Now application vendors are looking to change their applications in order to remove scope or accept tokens. I think we're in a second wave here where it's a lot easier, because they are dealing with tokens and a simple interface as opposed to exchanging keys. The encryption model is more difficult.
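The "simple interface" Palgon contrasts with key exchange can be illustrated with a hypothetical tokenization API: the payment application codes against a two-method contract and never touches key material. The names here (`TokenizationAPI`, `InMemoryTokenService`, `charge`) are invented for this sketch and do not correspond to any real vendor's API.

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class TokenizationAPI(Protocol):
    """Hypothetical contract a payment application would code against.
    Note there is no key exchange, rotation, or storage to negotiate --
    the integration burden Palgon attributes to the encryption model."""
    def tokenize(self, pan: str) -> str: ...
    def detokenize(self, token: str) -> str: ...

class InMemoryTokenService:
    """Toy stand-in for a tokenization provider, for illustration only."""
    def __init__(self) -> None:
        self._vault: dict[str, str] = {}

    def tokenize(self, pan: str) -> str:
        token = f"tok_{len(self._vault):04d}"
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

def charge(service: TokenizationAPI, pan: str) -> str:
    # Downstream systems (data warehouse, loss prevention) only ever
    # see the token returned here, keeping them out of PCI DSS scope.
    return service.tokenize(pan)

svc = InMemoryTokenService()
assert isinstance(svc, TokenizationAPI)   # structural conformance check
```

Swapping providers then means supplying a different object that satisfies the same two-method contract, rather than re-implementing a vendor-specific key-exchange handshake in every point-of-sale and back-office system.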