Tokenization has been billed as a cure-all for minimizing access to sensitive payment data and for making PCI compliance easier. However, skeptics say it’s unable to really do either. Does the truth lie somewhere in between?
At the 2011 Gartner Security & Risk Management Summit in Washington DC, SearchSecurity.com Senior Site Editor Eric Parizo chatted with Gartner IT1 Research Director Ramon Krikken about tokenization vs. encryption, PCI tokenization to reduce audit scope, lagging tokenization standards and what the future holds for tokenization.
Read the full transcript from this video below:
Ramon Krikken on tokenization vs. encryption, PCI tokenization
Eric Parizo: Hi. I'm Eric Parizo with SearchSecurity.com. Thanks for watching.
It's great to have you with us. Joining us today is Ramon Krikken,
research director for Gartner IT1. Ramon, thanks so much for being with us.
Ramon Krikken: You're very welcome.
Eric Parizo: Let's talk about tokenization and PCI. Tokenization
has been billed as a cure-all for minimizing access to sensitive
payment data and for making PCI compliance easier. Skeptics say
it's unable to really do either. Does the truth lie somewhere in between?
Ramon Krikken: I guess it depends on how you look at it. If you look at the
concept of tokenization, we're replacing a credit card number, in
this case, with a surrogate number that isn't the real credit card
number. If we're able to use that surrogate number
throughout all of the applications that don't really need the
actual card number, we can certainly say that those applications
are no longer in the scope of PCI. Of course, caveat that with
the fact that the PCI Council hasn't really come up with
much guidance around that. It is certainly the case
that tokenization can help in that case and reduce the number of
systems that are subject to PCI compliance requirements.
With that, of course, comes the idea that you have simply
replaced a credit card number with something else. Now, that
something else, to some extent, needs to be protected as well.
So, let's say you use tokenization and these tokens float around
in all of your production systems. There are still things an
attacker can do with those tokens. For example, in some
implementations, a token could perhaps be used to fraudulently
rebuild a credit card number.
The real credit card numbers, of course, live somewhere.
Somebody has to have that mapping between the token and the real
credit card number. So, if you outsource it, it's going to be a
vendor who has that. They, then, have responsibility for keeping
all that secure but, if something happens to that, you may still
feel some of the effects of that.
If you have it in-house, as a product, you hold that vault, with
all of the translations and mappings. So, some systems will
still have the real credit card numbers in them. So, the
protection of those systems becomes critically important.
Finally, you need to move the credit card number through
different applications and processes to complete payment
anyway. So, the number of systems that are touched is maybe
reduced, but it isn't necessarily easy to say, "Okay. We're
going to completely take everything out of scope." If you take
all of that together, perhaps these skeptics are responding in
some way to the hype around tokenization. It is one of those
cases where, yes, there are great benefits, or potentially great
benefits, but there are still things you need to worry about. It
doesn't necessarily completely solve all your problems.
Eric Parizo: Tokenization is often compared to encryption. Are they even in the same category?
Ramon Krikken: Again, yes and no. Both are meant to protect data, protect
confidentiality of your data and so you can say both encryption
and tokenization do that. The difference comes in that
tokenization basically allows you to take the replacement of
the credit card number, so the token, and use it in all sorts of
business processes, right? If the format stays the same, and
especially if the check digit in the credit card number checks
out, then an application isn't any the wiser as to whether it uses the
real card number, or the surrogate.
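To make that check-digit point concrete, a token generator can draw a random surrogate and append a Luhn check digit, so downstream card-number validation still passes. A minimal Python sketch, with function names that are purely illustrative rather than from any real tokenization product:

```python
import secrets

def luhn_check_digit(partial: str) -> str:
    """Compute the Luhn check digit for a digit string.

    The rightmost digit of `partial` is doubled first, since the
    check digit will be appended after it.
    """
    total = 0
    for i, ch in enumerate(reversed(partial)):
        d = int(ch)
        if i % 2 == 0:          # every second digit from the right gets doubled
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def make_token() -> str:
    """Generate a random 16-digit surrogate that passes Luhn validation."""
    body = "".join(str(secrets.randbelow(10)) for _ in range(15))
    return body + luhn_check_digit(body)
```

Because the output is sixteen digits with a valid check digit, an application that only format-checks card numbers cannot tell the surrogate from a real one, which is exactly the property Krikken describes.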
Encryption, on the other hand, in most cases, will produce some
kind of binary output. So, you put the credit card number in
and binary data comes out. And so that is
not usable by an application. So, any time you want to access
that data, you have to decrypt it. In reality, although a lot of
laws and regulations that deal with PII and PCI data and health
data kind of give you this "get out of jail free" card if you
just encrypt information, you'll find that encryption's kind of
a second choice to technologies like tokenization and data
masking, that transform the data in a different way, namely, in
such a way that it's no longer sensitive, but you can still use
it in your applications.
So, they kind of complement each other and, in fact, you could
say that encryption complements whatever data masking and
tokenization can't do. So, your first option, your first choice
would be to say, "Let's not use the real data at all. Let's use
substitute data," and then in cases where we have to use the
real data . . . so the token vault would be a great example,
right? The real credit card number still lives in the token
vault. So, that's where you would, for example, use encryption.
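One way to picture the token vault he describes is as a mapping from surrogate token to the real card number, with the stored number encrypted at rest. The sketch below is illustrative only: `encrypt_at_rest` is a stand-in stub (a toy XOR, not real cryptography) for whatever AES/KMS-backed encryption layer a real vault would use, and the class and method names are assumptions, not any vendor's API.

```python
import secrets

# Placeholder for real at-rest encryption (e.g., AES-GCM with keys from a
# key-management system). A toy XOR keeps this sketch self-contained;
# never use this for real data.
def encrypt_at_rest(pan: str, key: bytes) -> bytes:
    keystream = key * (len(pan) // len(key) + 1)
    return bytes(b ^ k for b, k in zip(pan.encode(), keystream))

class TokenVault:
    """Maps surrogate tokens to encrypted real card numbers (PANs)."""

    def __init__(self, key: bytes):
        self._key = key
        self._vault = {}    # token -> encrypted PAN
        # A real vault would index by an HMAC of the PAN rather than the
        # plaintext PAN itself; plaintext here keeps the sketch short.
        self._by_pan = {}   # PAN -> existing token, so repeat cards reuse one token

    def tokenize(self, pan: str) -> str:
        if pan in self._by_pan:
            return self._by_pan[pan]
        token = "".join(str(secrets.randbelow(10)) for _ in range(16))
        self._vault[token] = encrypt_at_rest(pan, self._key)
        self._by_pan[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # XOR is symmetric, so "decryption" reuses the same stub.
        enc = self._vault[token]
        keystream = self._key * (len(enc) // len(self._key) + 1)
        return bytes(b ^ k for b, k in zip(enc, keystream)).decode()
```

The point of the sketch is the division of labor: tokens circulate freely in downstream systems, while only the vault, and whoever can call `detokenize`, ever touches the real number, and that one store is where encryption earns its keep.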
Eric Parizo: So, just to set the record straight, you wouldn't necessarily see a
tokenization implementation without encryption to accompany it.
Ramon Krikken: I don't think it can, because you would, in fact, kind of, be
in violation of all sorts of laws and regulations, even outside
of credit card payments. Ultimately, the real data needs to live
somewhere, and that data needs to be protected. You can't
tokenize the token. There are only so many steps you can take.
That's why you can't really do one without the other.
Furthermore, even if you just forget about the tokens and the
real credit card data when it's at rest, if I need to accept a
credit card number in a certain application before it is
tokenized, I somehow need to get that credit card number to the
token server, because I need to get my token back. So, in
order to protect that data while it is in transit, to the token
server, you would use data in-motion encryption for that.
So one can't live without the other. That's why, I think, it
makes a lot of sense that there are PCI special interest groups
concurrently running, one on tokenization and one around
end-to-end encryption, again, because the two have to work together,
somehow, to make things work.
Eric Parizo: According to a March 2011 survey conducted by SearchSecurity.com, 57%
of respondents -- enterprise information security pros -- said that
they don't currently use tokenization to reduce audit scope. Is
that a wise decision, or a missed opportunity?
Ramon Krikken: So, if they indeed have credit card information and they want
to specifically reduce PCI scope, rather than some other
regulatory scope in dealing with personal information, I would
say it's somewhat of a missed opportunity. I will caveat that,
though. Even though tokenization preserves the format of the data,
it's not like you can just drop it in place and say, "Okay. We're
good to go."
There are integration challenges: how do I get the system to work
with the various applications that store credit card data, process
credit card data, and accept credit card data? So you need to tie
all of that into what you have, into the tokenization system.
So, it becomes an exercise in integration. So, we really have to
look at that trade-off of security versus the amount of effort
you need to expend. Now, I believe in the long run that
tokenization will become more accepted as a very general-purpose
mechanism, and you will find it pretty much . . .
Everybody who deals with payment processing will use it one way
or another. But, I think, to some extent, that people are still
kind of testing the waters on what is possible. I still get a
lot of questions around, "Should we do this in-house or should
we outsource it to a vendor?" People are still at that stage of
the decision-making process. So, there are a lot of moving parts
and it's still relatively new.
Eric Parizo: In the past, you've discussed the need for greater tokenization
standards. Where does the tokenization standards effort stand today?
Ramon Krikken: They're still pretty much nonexistent. Unfortunately, it is
something where we still need to make some progress. There are
some folks working on it, some folks in the cryptography and key
management world, who build tokenization systems as a natural
extension to what they already have, and also the participants
in the PCI special interest group, a lot of whom are, in fact,
in cryptography and key management.
The biggest things around that are, how do you build
tokenization systems and architectures that are secure and
also, what are the actual mathematical algorithms that you use,
the mathematical functions, to go from a credit card number to a
secure token. The security properties around that are not very
well understood yet. It would be most straightforward to say
that every time you get a credit card number, you generate a
completely random number and replace the card number with that.
But as your database grows, and as you have distributed systems
that all need to do this at the same time, you run into problems
there. So, there is still some fundamental research going on there.
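The "completely random number" approach runs into exactly the collision problem he mentions: as the vault grows, a freshly drawn surrogate may already be in use, and in a distributed deployment the check-and-insert has to be atomic. A single-node sketch of the retry loop, with illustrative names and an in-memory dict standing in for the vault database:

```python
import secrets

def issue_token(vault: dict, pan: str, digits: int = 16, max_tries: int = 100) -> str:
    """Draw random surrogates until one is unused, then store the mapping.

    In a distributed system, the membership check and the insert would need
    to be a single atomic operation (e.g., an insert guarded by a unique
    constraint) to stop two nodes issuing the same token.
    """
    for _ in range(max_tries):
        candidate = "".join(str(secrets.randbelow(10)) for _ in range(digits))
        if candidate not in vault:   # collision check; costlier as the vault grows
            vault[candidate] = pan
            return candidate
    raise RuntimeError("token space too saturated; widen the format or change schemes")
```

Even this toy version shows why the research he mentions matters: the loop gets slower as the token space fills, which is part of what pushes vendors toward alternatives such as format-preserving encryption.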
Eventually, I think we will get there but, I think that one of
the obstacles is that although we have a very well-developed
vocabulary when we deal with cryptography, we don't have the
exact words and terms, yet, to describe what we're trying to
achieve in the tokenization world. There are a lot of parallels
but all of the connections haven't been made yet. But we are
working toward them.
Eric Parizo: Finally, what does the future hold for tokenization, from a
technology standpoint, and what guidance do you predict or hope
the PCI Security Standards Council will put forth regarding tokenization?
Ramon Krikken: I think that the way that we're going right now, technology has
already developed quite a bit. We see most of the major payment
gateway vendors and payment processors offer some kind of
tokenization service, as an add-on. So, that's good news for all
of the merchants who can't do it themselves, and particularly,
for the smaller merchants, right? So, if you're a "mom and pop"
store, on the street corner, you're not going to put your own
tokenization system in-house. So, that's really where the
service vendors come in, and really help out a lot.
On the product side, we are seeing some developments as well
but, of course, we have to remember that it's those product
vendors who, kind of, supply the others with their underlying
technology. So, aside from perhaps the standardization aspect,
and perhaps things like management interfaces and all sorts of
extensions, the basic technology is there, at least for credit
card numbers.
Now, when we go to different types of information, so let's say
telephone numbers, which are numbers, so it could still be
relatively easy, but what if you wanted to tokenize a name? How
exactly do you do that? How do you keep names so that they
actually still make sense, so that it doesn't become some random
kind of strange thing? In protecting these other types of data,
things other than credit card numbers, I think we still need to
prove the technology, and whether it's applicable.
I've said for a while now that credit card processing is, sort of,
the ideal use case for tokenization because, the number of times
that you actually need the real credit card number is very
small. So, you can, kind of, put it into the tokenization
system, get your token out and then for most purposes, you can
use that token. Only during charge-backs and recurring charges
do you need something else. But, is that going to be the same
if, for example, you tokenize driver's license numbers or Social
Security numbers? Medical information, how would you even do
that? But the vendors are looking at that.
So I think that we should really look forward to see what
they're going to develop, what some of their initial customers
might be able to do there. The advice I would give right now, is
if you have a pressing PCI problem, do, certainly, look at it
because it is mature enough and, on top of that, I don't believe
that the PCI Council will come out with guidance that would
invalidate all of these things that have currently been
developed and installed. So, my hope would be that they do take
a good look at how we do this securely but also be broad enough
to say that we can encompass a lot of these current
implementations in it.
For the other use cases, I would say take a good look at what the
business use of your data is, and consider tokenization as one
arrow in the quiver of your protection mechanisms.
Eric Parizo: All right. Ramon Krikken, research director for Gartner IT1. Thank you very much.
Ramon Krikken: Thank you.
Eric Parizo: And thank you for joining us as well. For more information security
videos, don't forget, you can always check out
SearchSecurity.com. I'm Eric Parizo. Stay safe out there.