This article can also be found in the Premium Editorial Download "Information Security magazine: Security Readers' Choice Awards 2008."
Disclosure fails as an incentive to protect data.
For as long as data has been portable, it has gone missing. Laptops and tapes have always disappeared, and undoubtedly, Mesopotamian scribes worried over lost cuneiform tablets. Certainly, the more convenient a storage medium and the greater its capacity, the more likely it is that large amounts of information will be misplaced.
The unfortunate impression given by the general media, if not government officials, is that all of this missing data is ending up in the hands of identity fraudsters and data brokers. But outside of a few well-documented remote hacks, there is no real evidence to suggest that ordinary people are being harmed by any of the so-called "breaches" that are so stridently written about in newspapers and security vendor press releases. Still, worried friends and family want to know if they are at risk, and all we can say is, "Well, they should have encrypted it."
All these "breach" disclosures seem like a backward way to provide enterprises with an incentive to secure personal data. And today's expectation that any affected persons should be warned of such data disappearances is a messy way to deal with the problem. Individuals who receive a worrisome notification that their data has been lost don't know what to think. How are they supposed to evaluate the relative risk to themselves, and just what are they supposed to do about it?
The cynic in me says institutions are just dumping their responsibility on the little guy, making an empty gesture of apology and then considering the matter closed.
When I was 9, I broke a pop bottle in the street, and the mailman's personal car ended up with a flat tire. When my mom learned about it, she marched me down the street, made me apologize in person, and I had to do chores to earn the money to pay for repairs. It was embarrassing all around, but I learned something about personal responsibility. Apparently, our institutions must be forced to publicly apologize until they stop taking a childish approach to data protection.
Will the social pressure of embarrassment actually produce real change in institutional data controls? Hopefully, media encryption will soon be considered a normal practice, but I'm seeing some pretty silly gut-level responses. If hastily applied encryption efforts cause more harm than good, and budget holders realize that most data losses didn't actually result in personal harm, will the pendulum swing back?
Carefully protecting personal data should be considered a normal corporate responsibility, without the need for exaggeration and hype. Unfortunately, initiatives based on FUD and hysteria tend to look a lot less important once the dust clears. Let's hope that permanent changes are made before we all become so fatigued with the breach of the week that losing someone else's data is no longer considered a shameful act.
This was first published in April 2008