Face-Off: Schneier, Ranum debate security regulations

Do federal security regulations help?

Bruce Schneier

Point: Regulation is about economics. Here's the theory: In a capitalist system, companies make decisions based on self-interest. This is good; we don't want companies acting as public charities, we want them acting as for-profit entities. But some effects of a company's decisions are not borne by the company itself; these are "externalities."

Companies don't always take externalities into account because, well, they're someone else's problem. If we want externalities to factor into company decisions, we have to make externalities internal. Then, the natural engine of capitalism will take over.

An easy example: A company pollutes a river, and people downstream die. No one in the company lives downstream, no customer lives downstream, so the company doesn't care. It's a classic externality. If society wants the company not to pollute the river, it has to remove the externality. Liabilities (allowing the people who live downstream to sue) and regulations (making it illegal to pollute the river) do that. A rational company will spend more money so as not to pollute the river.

What does this have to do with computer security? Everything.

If ChoicePoint has lousy security and someone steals our identity information, we are harmed. But to ChoicePoint, it's an externality. ChoicePoint isn't a charity, and it's not going to improve security out of the goodness of its heart. If we want ChoicePoint to protect our data, we're going to have to force it. We need to raise the cost of its having lousy security so it'll be cheaper for the company to have good security.
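
To make that cost comparison concrete, here is a minimal sketch of the logic in Python; the dollar figures, breach probabilities and penalty amount are illustrative assumptions, not figures from the column:

```python
# Toy model of internalizing an externality (all numbers are illustrative).
# A rational company compares its own expected costs under each choice.

def expected_cost(security_spend, breach_probability, penalty):
    """Cost the company itself bears: security spending plus the
    expected penalty if a breach occurs."""
    return security_spend + breach_probability * penalty

# Assumed figures: good security costs more up front but lowers breach odds.
GOOD = dict(security_spend=2_000_000, breach_probability=0.05)
LOUSY = dict(security_spend=500_000, breach_probability=0.40)

for penalty in (0, 10_000_000):  # no regulation vs. a hypothetical $10M penalty
    good = expected_cost(penalty=penalty, **GOOD)
    lousy = expected_cost(penalty=penalty, **LOUSY)
    choice = "good" if good < lousy else "lousy"
    print(f"penalty=${penalty:,}: good=${good:,.0f}, lousy=${lousy:,.0f}"
          f" -> rational choice: {choice} security")
```

Without a penalty, the cheap option wins; once the expected penalty exceeds the difference in security spending, good security becomes the rational choice.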

At least that's the idea behind regulation. Unfortunately, the devil is in the details.

Take disclosure laws: On their face, they're smart. By forcing companies to make data breaches public, we raise the cost of a breach. Unfortunately, that cost consists largely of public shaming, especially in the press; as more companies lose data, the press becomes less interested and the shaming diminishes. Good idea, but a temporary one.

Or take Sarbanes-Oxley: I'm not sure how it pertains to computer security, but everyone seems to think it does, and companies have poured money into compliance; the cost is still cheaper than the potential liability. Some of that money has gone into computer security, but most has gone to large auditing firms that produce reports useful mainly for defending against liability claims. Good idea, but expensive for what you get.

A much better example is the credit card law that limits a cardholder's personal liability for fraud to $50. Before the law, fraud losses were an externality to the credit card companies because cardholders bore them, so the companies didn't do all that much to improve security. Once the law shifted those losses onto the companies, we got online verification terminals, card-activation systems and data-mining systems that detect fraudulent spending patterns.

So what are the characteristics of good regulations?

  • They're targeted at a specific externality.
  • The penalties are large enough to make compliance cheaper than noncompliance.
  • They put the entity that is able to fix the problem in charge of the problem.

Federal regulations help if they're well written. Unfortunately, that's the exception; I prefer liability to regulation as the mechanism for reducing externalities.

Marcus Ranum

Counterpoint: Regulations are a good idea, but they need teeth: serious consequences for noncompliance, not just cheerful slaps on the wrist. Every year the Department of Whatever gets written up with a D- in FISMA compliance, but someone quickly points out that a D- is a huge improvement over last year's F.

I guess we're supposed to be impressed that taxpayer dollars are achieving marginal improvement, but I'm not. The idea of regulation is to establish a minimum consistent practice. I'm sorry if I sound like a hard case, but "attaboys" should not be handed out for compliance with a remedial baseline. This isn't a politically correct feel-good game in which every child wins a prize. Agencies are spending serious dollars, and moving from an F to a D- is not evidence of accomplishment; it is evidence of incompetence, mismanagement and waste.

Moving toward liability is attractive, but you can't train an animal by punishing it into doing the right thing. Holding companies, agencies and individuals liable is simply punishment; you still need a clearly specified minimal baseline that you can communicate effectively.

Unfortunately, that baseline may be headed lower. One of the clouds on the infosecurity horizon is the idea that the already watered-down FISMA, Sarbanes-Oxley and HIPAA are likely to get watered down further. One IT executive opined that ignoring HIPAA for a couple of years would have been a successful strategy, because compliance costs have dropped faster than the downside of noncompliance has risen. It's naïve to believe that people who think like that are going to be held liable; they're smarter than the regulators and have a superior understanding of how to arbitrage and deflect risk.

On the federal side, when the message to agency IT managers and executives is "Comply, or we'll, um, tell you to comply some more!" you can see why the entire regulatory exercise has resulted in a toothless, clawless paper tiger.

I'm in favor of federal IT regulation. In fact, I'd love to help write some. Unfortunately, mine would make people cry with rules like: "If your agency has to admit that 10-plus terabytes of data has left your network headed for China, and you just noticed, every manager in your IT organization from the CIO down gets a pink slip." I know, nobody ever gets fired from a government job—no matter how incompetent—but maybe that has something to do with why things are such a mess. We need federal IT security regulation that reads as if it were written by Napoleon Bonaparte and enforced by Vlad the Impaler, not "What, me worry?" Alfred E. Neuman.

The balance between regulation and liability is where I disagree with Bruce most: Regulation is about economics, but so is liability. The problem with adopting an economic perspective on security is that it encourages people to believe there are trade-offs where, perhaps, none exist.

Allowing security to be driven by liability means you've still turned it into an economic problem, only now the economics are under the control of lawyers and liability quants. None of the folks who want to approach security as an economic problem get it: intelligence warfare imposes costs that may not be measurable at all, or may be measurable only on a generational scale, through the fall of a republic. You just can't put a price tag on that.

My feelings about federal security regulation mirror Gandhi's famous comment regarding western civilization: "I think it would be a good idea." But, please, let's make sure the regulations we enact have sharp teeth.


Please send your comments on this column to feedback@infosecuritymag.com

Coming in January: Does secrecy help protect personal information?

This was first published in November 2006
