NuBridges Inc. has released an updated version of its enterprise tokenization product, seeking to eliminate a key pain point for large companies implementing tokenization: coordinating the issuing of tokens among multiple data centers.
The Atlanta-based vendor today announced Token Manager 2.0. The on-premises software generates format-preserving tokens and inserts them in place of sensitive data in databases and applications. The new release enables unique tokens to be generated in multiple data center locations simultaneously, adds configuration options to support tokenization of personally identifiable information (PII) and protected health information (PHI), and supports encryption of payment data at the point of sale.
Tokenization has drawn interest from merchants in particular because the process replaces payment card data with a unique surrogate value, or token, after a transaction authorization takes place. It supplements data encryption and in some cases limits the networks and systems that fall within the scope of the often onerous Payment Card Industry Data Security Standard (PCI DSS).
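The article doesn't describe NuBridges' internals, but the basic mechanism can be illustrated with a minimal sketch: a vault issues a random, format-preserving token for each card number and is the only place the mapping back to the real value exists. The class and method names here are hypothetical, for illustration only.

```python
import secrets

class TokenVault:
    """Minimal illustration of tokenization: sensitive values are swapped
    for random, format-preserving tokens, and only the vault can map back."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, pan: str) -> str:
        # By default, reuse the existing token for a known value (1:1 mapping).
        if pan in self._value_to_token:
            return self._value_to_token[pan]
        # Format-preserving: same length, all digits, last four kept,
        # so downstream systems that expect a card-like value still work.
        while True:
            token = "".join(secrets.choice("0123456789")
                            for _ in range(len(pan) - 4)) + pan[-4:]
            if token not in self._token_to_value and token != pan:
                break
        self._token_to_value[token] = pan
        self._value_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only authorized callers with vault access can recover the value.
        return self._token_to_value[token]
```

Because the token is random rather than derived from the card number, a stolen token reveals nothing without access to the vault, which is what takes downstream systems out of PCI DSS scope.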
Gary Palgon, vice president of product management for NuBridges, said large enterprises typically operate multiple data centers, making coordinated token generation across sites a common requirement.
Token Manager 2.0 enables up to 10 data centers to generate unique tokens simultaneously. It ensures each data center uses its own set of tokens, and tracks the issuing of all tokens via a centralized data vault.
In addition to ensuring a scalable tokenization process for a large infrastructure, Palgon said the feature supports disaster recovery and business continuity and preserves centralized key management.
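The article doesn't say how Token Manager coordinates the sites, but one common way to guarantee that independently generated tokens never collide is to partition the token space, for example by reserving a leading digit per data center. The sketch below is an assumption-laden illustration of that idea, not NuBridges' actual design.

```python
import secrets

class DataCenterTokenizer:
    """Illustrative sketch (not NuBridges' design): give each of up to 10
    data centers a reserved leading digit, so tokens generated independently
    at each site can never collide, while a central vault later records
    every token issued."""

    def __init__(self, dc_id: int):
        assert 0 <= dc_id <= 9, "this scheme supports up to 10 data centers"
        self.dc_id = dc_id
        self.issued = {}  # local record, synced to the central data vault

    def tokenize(self, value: str) -> str:
        # The first digit identifies the issuing data center; the rest is random.
        body = "".join(secrets.choice("0123456789")
                       for _ in range(len(value) - 1))
        token = str(self.dc_id) + body
        self.issued[token] = value
        return token

dc_east = DataCenterTokenizer(0)
dc_west = DataCenterTokenizer(1)
```

With disjoint ranges, no site has to ask another before issuing a token, which is what makes simultaneous generation and disaster recovery workable.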
The new PII/PHI feature adds configuration options to break the 1:1 relationship between a token and its actual value. By default, all instances of a value are replaced by the same token, which Palgon said in some instances might make it possible to infer a token's value.
"For instance, if I'm a DBA, and I know my own salary, I can query the database, find the token for my salary, and then I can look for that token elsewhere to know who else has the same salary," Palgon said. "Now we offer the opportunity to further obfuscate, so that there might be multiple tokens to represent the same figure."
He added that the new product allows field-by-field configuration, so highly sensitive fields such as dates of birth or health data can receive unique tokens, while less sensitive data types can reuse tokens to improve the system's efficiency.
Finally, Token Manager 2.0 also allows data to be encrypted at the point of capture or point of sale, eliminating the risk of exposure while the data is in transit from the point of capture to Token Manager. Palgon said the data is decrypted in the data center in order to be tokenized, but that decryption happens only within the memory of the tokenization server, so the cleartext value is never written anywhere else.
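The data flow just described can be sketched end to end: encrypt immediately at the point of sale, and decrypt only inside the tokenization step, swapping the cleartext for a token before anything is persisted. The cipher below is a deliberately toy XOR keystream standing in for real encryption such as AES, and every name here is an illustrative assumption, not NuBridges' API.

```python
import hashlib
import secrets

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for a real cipher (e.g. AES) -- NOT secure; it exists
    # only to make the capture -> decrypt-in-memory -> tokenize flow runnable.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

POS_KEY = secrets.token_bytes(32)  # shared by the POS and tokenization server

def capture_at_pos(pan: str) -> bytes:
    # Card data is encrypted the moment it is captured at the point of sale,
    # so it is never exposed in transit to the tokenization server.
    return _keystream_xor(POS_KEY, pan.encode())

def tokenize_ciphertext(ciphertext: bytes, vault: dict) -> str:
    # Decryption happens only here, in the tokenization server's memory;
    # the cleartext is immediately replaced by a token and never persisted.
    pan = _keystream_xor(POS_KEY, ciphertext).decode()
    token = secrets.token_hex(8)
    vault[token] = pan
    return token
```

Downstream applications only ever see the token, which is the property that keeps them out of PCI DSS scope.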
NuBridges released Token Manager last spring. The vendor said it has more than 100 enterprise tokenization customers. Palgon said customers commonly seek to apply tokenization to payment data, Social Security numbers, drivers' licenses and passport data, but in recent months there's been a flurry of interest among customers and prospects regarding PHI and how the technology can help with HIPAA and HITECH compliance.
John Pescatore, vice president and research fellow at Stamford, Conn.-based research firm Gartner Inc., said that NuBridges was a first-mover in supporting enterprise tokenization, but most of the major key management vendors have since added tokenization capabilities.
"That early lead has given them a head start on other features that large enterprises look for," Pescatore said via email, "and the idea of token coordination across different physical locations is one of those features."
Pescatore noted that Gartner continues to see a high level of interest in tokenization among organizations, primarily as a way of reducing the scope and costs of the PCI DSS audit process.
"However, we are also seeing companies that do implement tokenization for PCI reasons start to look at being able to offer encryption and tokenization as internal services, to deal with sensitive data (mostly forms of PII) where encryption will be needed and tokenization augments encryption," Pescatore said. "In general, we see steadily increasing demand for data encryption and see tokenization as a needed capability for the majority of uses of server-side encryption."