Understanding tokenization: What is tokenization and when to use it
This article is part of the April 2012 issue of Information Security magazine
What's the best way to protect sensitive data from being stolen? Remove it entirely: if data is not present in a system, it can't be stolen. A steady stream of data breaches demonstrates that IT systems are under attack and underscores how even firms with good security knowledge get it wrong. Just ask Sony, RSA, HBGary Federal, or Stratfor.

Information security is hard to do, and the complexity of IT operations hinders our ability to protect data. So why don't we just get rid of sensitive data? No, I am not saying you should delete all your data in order to protect it. I am saying replace sensitive information with something that, if stolen, doesn't matter. That's precisely what tokenization does.

Tokenization technology removes sensitive data and replaces it with a worthless token. IT systems use the token placeholder as a reference, continuing to function as before, but the risk of leaking information is greatly reduced. There has been quite a buzz around tokenization in heavily regulated industries such as payment processing....
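The idea can be illustrated with a minimal sketch of vault-based tokenization. This is an assumption about one common design, not any particular vendor's product: the sensitive value is stored only in a protected vault, and downstream systems handle a random token that carries no information about the original data.

```python
import secrets

class TokenVault:
    """Illustrative token vault: sensitive values are swapped for random
    tokens; only the vault can map a token back to the real value."""

    def __init__(self):
        self._vault = {}    # token -> sensitive value
        self._reverse = {}  # sensitive value -> token, so repeat values
                            # always map to the same token

    def tokenize(self, value: str) -> str:
        if value in self._reverse:
            return self._reverse[value]
        token = secrets.token_hex(8)  # random; reveals nothing if stolen
        self._vault[token] = value
        self._reverse[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")    # sample card number
assert token != "4111-1111-1111-1111"            # token carries no data
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

Because the token is generated randomly rather than derived from the original value, a breach of the systems that store only tokens yields nothing useful; the attacker would also have to compromise the vault itself.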
Features in this issue
UTMs aren’t just for SMBs anymore. Here are four requirements for enterprise-grade UTM.
Tokenization protects sensitive data to reduce the compliance burden.
PKI components in smart grid and AMI infrastructure introduce new hazards.
Accuvant analysis and hacking contests illustrate browser security improvements.
Columns in this issue
It’s easy to be cynical about the latest security buzzword, but don’t be so quick to dismiss it.
Security expert Marcus Ranum talks with Peter Kuper, a partner with In-Q-Tel focused on funding compelling startups to accelerate innovation for the intelligence community.
New technologies and business models are rapidly changing the role of the security pro.