Information Security

Defending the digital infrastructure

Coder hubris: Learning security before practicing it

Microsoft demonstrated progressive security practices by reeducating its coders; others should follow Microsoft's lead and learn security before putting it into practice.

I can't remember the last time I used Microsoft as an example of good security practices. But the recent news that Bill Gates is sending 8,000 of his coders back to security school demonstrates his understanding of a fundamental programming principle: Being good doesn't always mean you're good enough.

There's something about being handy with a computer that tends to go to your head -- Gates more than anyone probably recognizes this. The power of being able to manipulate all those bits makes you think you're invincible. Such bright and ambitious coders are just the type Microsoft likes to hire. By sending all his young techno-bellybumpers back to school, Gates is finally admitting that the process of creating secure code is not intuitively obvious, but must be learned.

The phenomenon of "coder hubris" takes many forms. Some attempt to create things before they've acquired the necessary expertise. Others try to reinvent the wheel rather than learning from the mistakes of the past.

My first observation of coder hubris came very close to home. Seventeen years ago, my dad decided to create an electronic record of our family genealogy. When he couldn't find the right software, Dad -- a PC power user since the early '80s -- bought an expensive database program, explaining that he would simply develop his own program from scratch.

I humbly suggested that his needs were probably not uncommon, and if a commercial product didn't meet the requirements, it was probably because it was harder to create than he thought. I advised him to wait for the market to solve his problem; instead, he spent several months tinkering with it before giving up in frustration.

A decade later, I was working for a security software firm when an inventor pitched his "revolutionary" idea for an unbreakable security product. An hour into the pitch, it became clear that his invention was a one-time pad system that allowed both the sender and the receiver to generate their pads on the fly, thus eliminating the problem of securely sharing randomly generated keys that are as long as the messages they encrypt.

Of course, if two different computers derive the same key, that key cannot possibly be considered the result of a random operation; this "breakthrough" was essentially the cryptographic equivalent of a perpetual motion machine. Yet, in spite of an obvious lack of mathematical experience, the inventor had worked for years on this project, assuming he had the competence to make it work.
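To make the objection concrete, here is a minimal sketch in Python (my own illustration, not the inventor's actual design). A genuine one-time pad XORs the message with a truly random pad of equal length, used exactly once. Any scheme in which two computers can independently "generate" the same pad must be expanding a short shared seed, which reduces the system's total entropy to that of the seed -- a stream cipher, not a one-time pad.

```python
import hashlib
import secrets

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    """XOR the message with the pad. For a true one-time pad, the pad
    must be truly random, as long as the message, and never reused."""
    assert len(pad) == len(message)
    return bytes(m ^ k for m, k in zip(message, pad))

message = b"attack at dawn"

# A genuine one-time pad: unpredictable bits from the OS entropy source.
pad = secrets.token_bytes(len(message))
ciphertext = otp_encrypt(message, pad)
assert otp_encrypt(ciphertext, pad) == message  # XOR is its own inverse

# The inventor's "on the fly" pad: both ends expand a short shared seed
# into pad material. The output may pass statistical tests, but the whole
# system now contains only as much entropy as the seed. That is a stream
# cipher whose security rests entirely on the seed, and the one-time
# pad's information-theoretic guarantee is gone.
def derived_pad(seed: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]
```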

Coder hubris is often embarrassingly apparent in the proprietary cryptosystems used by some security products. How many times have you heard a vendor promote its "super-secure" in-house encryption algorithm? In the cryptology world, robustness is the result of years of peer review by some of the smartest cryptographers on the planet. Claiming that a privately developed, untested algorithm is strong is a classic case of "wishing don't make it so."
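To illustrate why such claims invite skepticism, consider a toy "proprietary" cipher of a kind vendors have actually shipped: XOR with a short repeating key. This sketch and its names are hypothetical, but the attack is real -- a single known-plaintext fragment leaks the key outright.

```python
def xor_repeating(data: bytes, key: bytes) -> bytes:
    """A toy 'proprietary' cipher: XOR with a short repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"ACME!"  # the vendor's unpublished, "super-secure" key
ciphertext = xor_repeating(b"Invoice #1234: PAID", secret_key)

# Known-plaintext attack: if an attacker can guess any aligned fragment
# (invoices always start with "Invoice"), XORing the guess against the
# ciphertext exposes the key bytes directly.
guess = b"Invoice"
leaked = bytes(c ^ p for c, p in zip(ciphertext, guess))
print(leaked)  # b'ACME!AC' -- the repeating key falls out

# Recover the full message with the leaked key.
print(xor_repeating(ciphertext, leaked[:5]))  # b'Invoice #1234: PAID'
```

Peer-reviewed algorithms survive precisely this kind of scrutiny for years before anyone trusts them; a secret algorithm has survived nothing.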

A recent, highly visible example of this is the entertainment industry's DVD copy protection scheme, CSS (the Content Scramble System). Designed to protect billions of dollars of intellectual property, CSS was created by security amateurs with little knowledge of the process of developing and testing an encryption algorithm -- and it was famously broken by the DeCSS utility. A bit of reading on the subject would have made it clear that multiple failures are inevitable, and that robust encryption implementations are only built with the support of a large team of experts who are constantly testing each other's assumptions.

The common thread is that the process of creating secure code isn't intuitively obvious at all; unlike many other programming tasks, it can't be mastered by enthusiasm and trial and error alone.

T.S. Eliot once wrote, "Immature poets imitate; mature poets steal." The lesson here is that creative people don't start from scratch; they stand on the shoulders of their predecessors. The only way to avoid costly mistakes is first to assume that there may be important things you need to learn, and then have your work repeatedly tested by security experts. One thing mature programmers know is that they don't know everything.

About the author:
Columnist Jay Heiser works for a large European bank in London. His most recent book is Computer Forensics: Incident Response Essentials (Addison-Wesley, 2001).


Join the conversation

The big problem, I think, is that coders are taught the syntax of coding and a few patterns; they aren't taught things like security, performance, or even emerging design practices in college. Many of these things have to be picked up once someone enters the workforce.