Understanding tokenization: What is tokenization and when to use it
This article is part of the April 2012 issue of An expert guide to tokenization and using it effectively
What's the best way to protect sensitive data from being stolen? Remove it entirely: if data is not present in a system, it can't be stolen. A steady stream of data breaches demonstrates that IT systems are under attack and underscores how even firms with good security knowledge get it wrong. Just ask Sony, RSA, HBGary Federal, or Stratfor. Information security is hard to do, and the complexity of IT operations hinders our ability to protect data.

So why don't we just get rid of sensitive data? No, I am not saying you should delete all your data in order to protect it. I am saying replace sensitive information with something that, if stolen, doesn't matter. That's precisely what tokenization does. Tokenization technology removes sensitive data and replaces it with a worthless token. IT systems use the token placeholder as a reference, continuing to function as before, but the risk of leaking information is greatly reduced.

There has been quite a buzz around tokenization in heavily regulated industries such as payment processing. Credit card numbers are a ...
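The substitution the article describes can be sketched in a few lines. This is a minimal, illustrative vault-based tokenizer, not any particular vendor's product: the TokenVault class, its method names, and the random hex token format are all assumptions for the sake of the example. The key property is that the token is random, so it reveals nothing about the original value, and only the vault can map it back.

```python
import secrets


class TokenVault:
    """Minimal sketch of vault-based tokenization (illustrative, not production code)."""

    def __init__(self):
        # In a real deployment these mappings live in a hardened, access-controlled store.
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, sensitive_value: str) -> str:
        # Reuse the existing token if this value was tokenized before,
        # so downstream systems can still match records on the token.
        if sensitive_value in self._value_to_token:
            return self._value_to_token[sensitive_value]
        # The token is random, so it carries no information about the original value.
        token = secrets.token_hex(8)
        self._token_to_value[token] = sensitive_value
        self._value_to_token[sensitive_value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can recover the real value; systems holding
        # just the token have nothing worth stealing.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")  # well-known test card number
assert token != "4111111111111111"
assert vault.detokenize(token) == "4111111111111111"
```

Applications store and pass around the token in place of the card number; only the narrowly scoped systems allowed to call detokenize ever see the real data, which is what shrinks the breach and compliance surface.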
Unified threat management devices for the enterprise
by Joel Snyder, Contributor
UTMs aren’t just for SMBs anymore. Here are four requirements for enterprise-grade UTM.
AMI networks: PKI security considerations
by Seth Bromberger, Contributor
PKI components in smart grid and AMI infrastructure introduce new hazards.
Understanding tokenization: What is tokenization and when to use it
by Adrian Lane, Contributor
Tokenization protects sensitive data to reduce the compliance burden.
Web browser security features make attacks harder
by Robert Westervelt, News Director
Accuvant analysis and hacking contests illustrate browser security improvements.
Don’t turn security Big Data analysis into a forgettable cliché
by Michael S. Mimoso, Editorial Director
It’s easy to be cynical about the latest security buzzword, but don’t be so quick to dismiss it.
Marcus Ranum chat: Security startups and security innovation
by Marcus Ranum
Security expert Marcus Ranum talks with Peter Kuper, a partner with In-Q-Tel focused on funding compelling startups to accelerate innovation for the intelligence community.
Information security roles and technology shifts
by Paul Rohmeyer, Contributor
New technologies and business models are rapidly changing the role of the security pro.