The future of PCI DSS encryption requirements? Tokenization for PCI

Can tokenization help reduce the scope of PCI DSS? How does tokenization interact with PCI DSS encryption requirements? Learn more about this technology and whether it's right for your enterprise.

Tokenization allows information to be reformatted in a fashion that renders it useless to outsiders. It may even replace encryption as the preferred method to protect credit card data.

Imagine a television commercial: A person comes on the screen and says, "Hi, my name is Bob Jones. My Visa number is 4567 8903 2890 4598 and the expiration date is 12/12." Frightening, huh? Now imagine he goes on to say, "I should also mention that these numbers are only valid if you are a processor or issuer because they've been tokenized, so good luck using them!"

These two sentences sum up why tokenization is invaluable to any organization that handles credit card data and must meet PCI DSS encryption requirements. Tokenization allows information to be reformatted in a fashion that renders it useless to outsiders -- thus reducing the risk of unauthorized access to, or breaches of, the information -- while still allowing organizations, their business partners and their underlying applications to use this sensitive information to process transactions in the format they expect, with little modification to business workflows. In fact, in the not-too-distant future, it may even replace encryption as the preferred method to protect credit card data.

However, before we explore tokenization further, let's discuss why its counterpart -- encryption -- has had issues, and why modifications are needed to deploy encryption technologies and meet PCI encryption requirements.

Encryption for applications
Today's business flows for processing information may require a large number of systems to work in lockstep with one another. With encryption, each application must be modified to decrypt incoming information and then encrypt outgoing results. Modifying these systems requires creating new modules, writing new scripts, testing and, in some cases, upgrading to newer systems that can take advantage of encryption technologies -- even if the in-line applications only pass the information along for processing.
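
To make the per-system burden concrete, here is a minimal sketch of the two code changes every in-line application would need. It assumes Python's cryptography package; the payload format and system names are hypothetical.

# Minimal sketch of why encryption touches every system in a workflow.
# Assumes Python's "cryptography" package; payload format and system
# names are hypothetical illustrations.
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()   # in practice, issued by a key management system
cipher = Fernet(shared_key)

def system_hop(encrypted_record: bytes) -> bytes:
    """Each in-line system must decrypt, handle, then re-encrypt the record,
    even if it only passes the data along for processing."""
    record = cipher.decrypt(encrypted_record)   # code change 1: decrypt incoming data
    # ... existing business logic operates on the clear-text record ...
    return cipher.encrypt(record)               # code change 2: encrypt outgoing results

# Data entry encrypts once; every downstream system repeats the work.
payload = cipher.encrypt(b"PAN=5009908787935642;EXP=03/11")
for _system in ("system 2", "system 3", "system 4", "system 5"):
    payload = system_hop(payload)               # every hop needs the key and both changes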

Encryption keys and standards
In order to encrypt and decrypt information, a software key -- whether symmetric or asymmetric -- must be used. These keys have to be issued, revoked and eventually, expired. Since manually managing large numbers of encryption keys can be cumbersome, enterprises need to deploy a key management system to ensure that production workflows don't stop due to a system's encryption key becoming invalid.
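
A workflow step might consult the key management system before each decryption, so an expired or revoked key stops processing cleanly instead of silently corrupting the flow. The registry layout below is a hypothetical sketch, not any particular product's API.

# Hypothetical key-lookup step; real key management systems define their
# own interfaces -- this only illustrates the issue/revoke/expire lifecycle.
from datetime import datetime, timezone

KEY_REGISTRY = {
    "billing-app-2010-01": {
        "key": b"<key material elided>",
        "revoked": False,
        "expires": datetime(2011, 1, 1, tzinfo=timezone.utc),
    },
}

def fetch_active_key(key_id: str) -> bytes:
    entry = KEY_REGISTRY[key_id]
    if entry["revoked"] or datetime.now(timezone.utc) >= entry["expires"]:
        raise RuntimeError(f"key {key_id} is revoked or expired; rotate it before processing")
    return entry["key"]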

There are many standards for encryption -- e.g., IBE, PKI, PGP and OpenPGP -- so enterprises must select a common set of encryption standards for themselves and their partners to ensure interoperability.

Tokenization for PCI DSS compliance, data protection
The Payment Card Industry Data Security Standard (PCI DSS) encryption requirements mandate that credit card data be protected as it is processed, and existing processes must be modified to incorporate these protections.

Because deployment is difficult and the required modifications are costly, encryption still struggles to gain a foothold in many organizations. This has led organizations that have tried to deploy encryption to look at tokenization for PCI DSS compliance and data protection.

With tokenization, the idea is not to encrypt the data, but to use known algorithms to render key components of the data useless to outsiders while still retaining the data's value and format for processing. For example, if a birth date is 1/1/1989, tokenization can apply an algorithm that increments the month by one, the day by 10 and decreases the year by 15 (with wraparound for the month and day when the results are invalid). This turns the birth date into 2/11/1974. Unlike encryption, where the results may look like "#$%^%$$@#", applications in the processing path can confirm that the fields contain valid values (to ensure the record hasn't been corrupted during transmission) as the workflow passes the information along, and then hand it to the application that will eventually reverse the tokenization and process it.
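
A minimal sketch of this kind of reversible date shift follows; the offsets match the example above, and the wraparound rule is one reasonable interpretation of it.

# Reversible date-shift tokenization, matching the birth-date example above.
# The wraparound handling is one possible interpretation; a real scheme
# would define it precisely so the shift can be reversed exactly.
import calendar
from datetime import date

def tokenize_birth_date(d: date) -> date:
    month = d.month % 12 + 1                       # increment month by 1, wrapping past December
    year = d.year - 15                             # decrease year by 15
    day = d.day + 10                               # increment day by 10 ...
    if day > calendar.monthrange(year, month)[1]:  # ... wrapping into the next month if invalid
        day -= calendar.monthrange(year, month)[1]
        month += 1
        if month > 12:
            month, year = 1, year + 1
    return date(year, month, day)

print(tokenize_birth_date(date(1989, 1, 1)))       # 1974-02-11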

When an application receives tokenized data it needs to process, it has two choices: it can have the tokenization algorithms added to its code so it can reverse the process itself or, more commonly, it can contact a secured corporate token server that tokenizes information on a large scale and maintains the original values -- no key management required. In the latter case, the application sends the predetermined tokenized data elements to the token server over secure communications channels and receives the original information in return. While encryption may seem to require a similar level of effort, comparing the number of systems that merely pass sensitive data along with the number that must actually process it shows that tokenization is much more efficient. See the figure below.
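
For illustration, a detokenization call might look like the sketch below; the endpoint, authentication and field names are assumptions, not any particular token server's API.

# Hypothetical detokenization request to a corporate token server.
# The URL, authentication scheme and JSON fields are illustrative
# assumptions; an actual token server defines its own interface.
import json
import urllib.request

def detokenize(token: str) -> str:
    request = urllib.request.Request(
        "https://tokens.example.internal/v1/detokenize",   # internal, TLS-protected channel
        data=json.dumps({"token": token}).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer <service credential>"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["pan"]                   # original primary account number

# Only the system that actually processes the payment makes this call;
# every other system in the flow passes the token along untouched.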

------------------------------------------------------------------------------------------------------------
Encryption flow
Encrypt (E)
Decrypt (D)
Process (P)

Data entry -- system 1 (E) -- (D) system 2 (E) -- (D) system 3 (E) -- (D) system 4 (E) -- (D) system 5 (E) -- (D) system 6 (P)
------------------------------------------------------------------------------------------------------------
Tokenization flow
Tokenize (T)
Request Data from Tokenization Server (or algorithms are known) (R)
Process (P)

Data entry -- system 1 (T) -- system 2 -- system 3 -- system 4 -- system 5 -- system 6 (R) (P)
------------------------------------------------------------------------------------------------------------

As you can see from the workflows above, each system must be modified for encryption, while only the initial entry system and processor must be modified for tokenization. If dozens or hundreds of systems are involved in the workflow, an organization can realize substantial savings using tokenization over encryption.

Tokenization also provides a number of ways to protect information. These include:

  • Object replacement: This is a direct replacement of some or all of the data in the data set with similar data types (usually from tables), e.g., given name, surname, street name, city, state, etc., so the name 'Joe' might become 'John' and 'Smith' might become 'Jones.'
  • Character replacement: This is a replacement of some or all of the data in the data set using known reversible algorithms, e.g., the primary account number (PAN), CVV, etc.
  • Masking: This is the replacement of some or all of the data in the data set with a single character. For example: 5009 9087 8793 5642 (original credit card number); xxxx xxxx xxxx 5642 (masked credit card number).
  • Randomizers: This is the replacement of some or all of the data in the data set with ranges of similar data types, e.g., "increase the expiration date by two to six months" or "decrease the birth year by 10 to 15 years." (A short sketch of masking and a randomizer follows this list.)
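
As a minimal sketch of two of these methods -- masking a PAN and randomizing an expiration date -- using the ranges from the examples above:

# Minimal sketches of masking and a randomizer, matching the examples above.
import random

def mask_pan(pan: str) -> str:
    """Replace every digit except the last four with 'x'."""
    digits = pan.replace(" ", "")
    masked = "x" * (len(digits) - 4) + digits[-4:]
    return " ".join(masked[i:i + 4] for i in range(0, len(masked), 4))

def randomize_expiration(month: int, year: int) -> tuple[int, int]:
    """Shift the expiration date forward by a random two to six months."""
    total = year * 12 + (month - 1) + random.randint(2, 6)
    return total % 12 + 1, total // 12

print(mask_pan("5009 9087 8793 5642"))   # xxxx xxxx xxxx 5642
print(randomize_expiration(3, 11))       # e.g. (7, 11)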

Keep these things in mind when deciding which method to use to protect sensitive information: Does the tokenization process for the information need to be reversed for processing? What rules are needed to ensure the information remains valid?

When it comes to processing, data sets that have been tokenized by object and character replacement -- and, in some cases, masking, in conjunction with other data set elements -- can be reversed. However, randomized data cannot be reversed without a tokenization server, which maintains a mapping between the original data and the tokenized output. One must also be careful about the rules used, especially for development or testing activities.

Instead of using production data in development or testing environments, tokenization can be used to rapidly create test data that simulates production data sets but has been modified to protect the sensitive information. This not only reduces the level of security required to protect non-production environments, but it also eliminates the risk of insiders leaking sensitive information from these systems. As mentioned above, however, the rules used shouldn't invalidate testing of how the information is processed. For example, tokenizing a "state" value may invalidate your test workflow if the maximum credit interest charged varies by state. An example of production data tokenized into similar test data is shown below.

             Production Data         Test Data
Given name   James                   Bob
Surname      Smith                   Jones
Acnt. #      5009 9087 8793 5642     4567 8903 2890 4598
Exp. date    3/11                    12/12
CVV2         567                     342
Birth date   3/12/78                 3/12/67

In this case, randomizing tokenization algorithms were used to ensure the test data can never be traced back to the production data.
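
The sketch below shows how such a test record might be generated with object replacement and randomizers; the replacement tables and ranges are illustrative assumptions, not a specific product's rules.

# Illustrative test-data generation using object replacement and randomizers.
# The name tables and value ranges are assumptions made for this sketch.
import random

GIVEN_NAMES = ["Bob", "Alice", "Carol", "Dan"]
SURNAMES = ["Jones", "Brown", "Clark", "Davis"]

def build_test_record(production: dict) -> dict:
    """Keep production formats, but never production values."""
    month, day, year = production["birth_date"].split("/")
    return {
        "given_name": random.choice(GIVEN_NAMES),                                  # object replacement
        "surname": random.choice(SURNAMES),                                        # object replacement
        "account": " ".join(f"{random.randint(0, 9999):04d}" for _ in range(4)),   # randomizer
        "exp_date": f"{random.randint(1, 12)}/{random.randint(11, 14)}",           # randomizer
        "cvv2": f"{random.randint(0, 999):03d}",                                   # randomizer
        "birth_date": f"{month}/{day}/{int(year) - random.randint(10, 15):02d}",   # randomizer
    }

production = {"given_name": "James", "surname": "Smith",
              "account": "5009 9087 8793 5642", "exp_date": "3/11",
              "cvv2": "567", "birth_date": "3/12/78"}
print(build_test_record(production))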

Tokenization issues: Security standards adoption
So why isn't tokenization more widespread? The main issue is security standards adoption. For example, the current version of PCI DSS only formally recognizes encryption technologies for protecting credit card transactions. While the PCI Security Standards Council (SSC) has recognized the value of tokenization -- and is studying its capabilities -- it has yet to include tokenization in PCI DSS as an acceptable substitute for encryption. This means organizations that must comply with PCI DSS, and with similar standards requiring the protection of information in storage or transit, must weigh the value of tokenization and whether it increases the security of their processing and storage architectures.

Tokenization will be the way of the future for protecting credit card and other sensitive data; the question is: Has the future arrived for you?

About the author:
Randall Gamby is an enterprise security architect for a Fortune 500 insurance and finance company who has worked in the security industry for more than 20 years. He specializes in security/identity management strategies, methodologies and architectures.



This was first published in January 2010
