In essence, Chip and PIN was meant to turn every credit card into a smart card and enable strong "two-factor" authentication. The PIN would serve as the second authentication factor: the card being "what you have" and the PIN being "what you know."
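The two-factor idea above can be sketched in code: a transaction is approved only when both factors check out. This is a minimal, hypothetical illustration -- the card IDs, PINs, and the `authorize` function are invented for this sketch and have nothing to do with how EMV actually verifies a PIN on the chip:

```python
import hashlib

# Hypothetical issuer records: card ID -> salted hash of the enrolled PIN.
# (Illustrative only; real EMV PIN verification happens on the chip itself.)
ISSUED_CARDS = {
    "card-1234": hashlib.sha256(b"salt:4921").hexdigest(),
}

def authorize(card_id: str, pin: str) -> bool:
    """Approve only when both factors match:
    'what you have' (a card the issuer recognizes) and
    'what you know' (the PIN enrolled for that card)."""
    expected = ISSUED_CARDS.get(card_id)  # factor 1: the card
    if expected is None:
        return False
    offered = hashlib.sha256(f"salt:{pin}".encode()).hexdigest()
    return offered == expected            # factor 2: the PIN

print(authorize("card-1234", "4921"))  # both factors correct -> True
print(authorize("card-1234", "0000"))  # right card, wrong PIN -> False
print(authorize("card-9999", "4921"))  # unknown card -> False
```

The point of the sketch is simply that either factor alone fails: presenting the card without the PIN, or the PIN without the card, does not authorize the transaction.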
Chip and PIN was also meant to replace the magnetic stripe currently found on credit cards. In practice, however, the stripe remains on the card as a backup, reserved for transactions in which the chip can't be read properly. The technology was first rolled out in the UK in 2003, and within three years the UK government required all cardholders to use only their PIN.
Similar technology -- based on the EMV chip card standard -- has been successful in France, where credit card fraud has fallen by about 80%, but the UK program has had problems from the start. The cost of deploying smart card readers at the point of sale has been an obstacle for smaller businesses, which is one reason Chip and PIN cards continue to include magnetic stripes. Research has also suggested that Chip and PIN cards aren't any more secure than traditional cards, since PINs can be stolen and readers can be tampered with.
Would the Chip and PIN system work in the U.S.? Perhaps, but its security kinks would first need to be resolved. Remember, a PIN isn't inherently more secure than a signature; it's simply another credential that can be stolen -- or "nicked," as the British would say. There are also cultural issues. While smart cards have been adopted within enterprises, they have yet to reach the broader American market, which tends to be more resistant to these types of technologies.
This was first published in April 2007
Security Management Strategies for the CIO