The PCI Council has issued its long-awaited guidance supplement addressing the use of tokenization technology to eliminate primary account numbers from merchant systems. Despite the release of the report, the tokenization market continues to be plagued by a number of problems, according to Ulf Mattsson, chief technology officer of Stamford, Conn.-based tokenization vendor Protegrity Inc. Mattsson said a myriad of token formats have created vendor lock-in, limiting merchants' ability to change systems or use multiple payment processors. Other tokenization systems have scalability issues and cannot handle other types of sensitive data, Mattsson said.
There was a lot of turmoil that delayed the finalizing of the document.
Ulf Mattsson, CTO, Protegrity Inc.
Mattsson, a member of the PCI tokenization special interest group, which helped develop the recent PCI tokenization guidance document, "PCI Tokenization Guidelines," said it took about a year to work out the details of the document. Tokenization vendors have long disagreed on a number of details, including the supported token formats and the methods used to create tokens. In this interview, Mattsson explains that the PCI guidance document is a good first step, but the industry needs to iron out longstanding disagreements before the technology is wholeheartedly adopted.
How significant is the new PCI guidance document on tokenization?
Ulf Mattsson: I think it’s much needed. It’s a little bit late. It’s the first acknowledgement from the council about tokenization and it’s only the first step on the path of validating tokenization from a PCI point of view. We need to add a lot more steps beyond this supplement. Unfortunately, the supplement contains a lot of disclaimers. The council and the major card brands added the disclaimers at the last minute. The council also has no plans to create any certification at this point.
How difficult was it to come up with a working document?
Mattsson: It was very difficult, even though Visa was able to put out recommendations about a year ago. The council said it wanted to take the lead because there were a lot of disagreements. There was a lot of turmoil that delayed the finalizing of the document.
What is missing from the document?
Mattsson: I think in the final stages a lot of controversial issues were left out. [The guidance document] is starting to give some weight to the technology, but it raises more questions than we had before. I think it's missing the vendor lock-in issue. When you start to do tokenization it's much harder to replace tokens. For example, if you do tokenization with an outsourcing partner or a payment gateway, you are really stuck with that partner. These tokens are highly customized and they're all over your applications and databases. They are sometimes deeply integrated into your applications because some applications will not tolerate the token format, so there has to be some customization and translation. There should be some words about that lock-in aspect in the document.
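The format problem Mattsson describes is the root of the customization: many legacy applications validate card-number syntax, so token servers generate surrogates that mimic it. A minimal sketch of a format-preserving token, under an assumed scheme that keeps the length and last four digits (hypothetical, not any vendor's actual method):

```python
import secrets
import string

def format_preserving_token(pan: str) -> str:
    """Build a surrogate that looks like a card number -- all digits,
    same length, same last four -- so legacy applications that
    validate PAN format keep working. Illustrative only: a real
    scheme would also handle collisions and vault storage."""
    randomized = "".join(secrets.choice(string.digits)
                         for _ in range(len(pan) - 4))
    return randomized + pan[-4:]

token = format_preserving_token("4111111111111111")
```

Because each vendor picks its own format rules like these, tokens issued by one provider rarely validate or translate cleanly in another's system, which is the lock-in Mattsson points to.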
Why haven’t the vendors in this market been able to come together and create an industry standard to enable companies to change token providers?
Mattsson: I'm working in the ASC X9 standards body that is now addressing tokenization. We're moving in the right direction, but some vendors are pushing their model. For example, one vendor is using a form of encryption and they call it tokenization. That's an extreme example, but we've seen progress. Both Visa's best practices and the PCI Council's supplement clearly said that is not an acceptable way of generating a token if you want to be out of scope. The council said that is simply an encrypted [primary account number] and an encrypted PAN is within the PCI scope.
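The distinction the council draws can be shown in a few lines (a hypothetical in-memory vault sketch, not any vendor's implementation): a true token is a random surrogate whose only link to the PAN is a lookup table, whereas an encrypted PAN is recoverable by anyone holding the key, and therefore stays in scope.

```python
import secrets

# Hypothetical in-memory vault; a real token server would be a
# hardened, access-controlled datastore.
_vault = {}

def tokenize(pan: str) -> str:
    """Return a random surrogate with no mathematical
    relationship to the PAN it stands in for."""
    token = secrets.token_hex(8)
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    """Only the vault can map a token back to its PAN."""
    return _vault[token]
```

Nothing about the token itself can be "decrypted" back to the card number; compromise of the merchant systems that hold only tokens does not expose PANs, which is the basis for the scope reduction.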
What are some other issues holding up standards?
Mattsson: There are other controversial points about outsourcing tokenization. Some vendors are arguing that if you outsource your tokenization, you really have a vendor lock-in situation. So how can you do this if you need two payment gateways? How can you deal with switching from one payment solution provider to another? Larger merchants want to have that function in-house. If we look at upcoming data breach regulations, it's not just about the PAN. This is an issue about PII data, PHI data and cardholder data, and merchants have similar needs to tokenize many of those data types. The only model that will work for them is to have the tokenization server in-house.
The guidance document says there’s a possibility to reduce scope with tokenization, but, as you say, there are a lot of disclaimers. How do you adequately segment components from the tokenization system and the cardholder data environment?
Mattsson: If you take the disclaimers out for a second, you see that your token server simply needs to be segmented. If you have an in-house tokenization server, you need to put it into a separate network segment. That is how many QSAs are looking at what is in scope and out of scope.