Tokenization is the process of replacing sensitive data with unique identification symbols that retain all the essential information about the data without compromising its security. Because it minimizes the amount of sensitive data a business needs to keep on hand, tokenization has become a popular way for small and mid-sized businesses to bolster the security of credit card and e-commerce transactions while reducing the cost and complexity of compliance with industry standards and government regulations.
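The core idea can be sketched in a few lines of Python. This is a minimal illustration, not a production design: an in-memory dictionary stands in for the provider's secure token vault, and the class and method names are invented for the example.

```python
import secrets

class TokenVault:
    """Illustrative token vault: maps random tokens back to the
    sensitive values they replace. A real provider would use a
    hardened, access-controlled data store, not a dict."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # Replace the sensitive value with a random token that
        # carries no information about the original data.
        token = secrets.token_hex(8)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault holder can map a token back to the real data;
        # a stolen token is useless without access to the vault.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
original = vault.detokenize(token)
```

The security benefit follows from the lookup being one-way for everyone except the vault holder: systems downstream of the tokenizer store and exchange only the token.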
Payment Card Industry (PCI) standards do not allow credit card numbers to be stored on a retailer's point-of-sale (POS) terminal or in its databases after a transaction. To be PCI compliant, merchants must either install expensive end-to-end encryption systems or outsource their payment processing to a service provider that offers a tokenization option. The service provider handles the issuance of the token value and bears the responsibility for keeping the cardholder data locked down.
In such a scenario, the service provider issues the merchant a driver for the POS system that converts credit card numbers into randomly generated values (tokens). Since the token is not a primary account number (PAN), it can't be used outside the context of a specific unique transaction with that particular merchant. In a credit card transaction, for instance, the token typically retains only the last four digits of the actual card number. The rest of the token consists of alphanumeric characters that represent cardholder information and data specific to the transaction underway.
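A token of that shape can be sketched as follows. This is a simplified illustration, assuming a 16-digit PAN and a random alphanumeric fill; real payment tokens are generated by the provider and may encode transaction-specific fields rather than purely random characters.

```python
import secrets
import string

def make_token(pan: str) -> str:
    """Build an illustrative token that preserves only the last four
    digits of the card number, replacing the rest with random
    alphanumeric characters (a hypothetical format for this example)."""
    last_four = pan[-4:]
    alphabet = string.ascii_uppercase + string.digits
    # Fill every position except the last four with random characters.
    prefix = "".join(secrets.choice(alphabet) for _ in range(len(pan) - 4))
    return prefix + last_four

token = make_token("4111111111111111")
```

Keeping the last four digits lets the merchant display a recognizable receipt line (e.g., "ending in 1111") while the token itself cannot be replayed as a card number.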
Tokenization makes it more difficult for hackers to gain access to cardholder data, as compared with older systems in which credit card numbers were stored in databases and exchanged freely over networks. Tokenization technology can, in theory, be used with sensitive data of all kinds including bank transactions, medical records, criminal records, vehicle driver information, loan applications, stock trading and voter registration.
See also: PAN truncation