Does an EULA make it truly illegal to decompile software?

Michael Cobb explores a legal minefield: the legality of software decompilation.

Are all of the vulnerabilities being found these days located via software decompilation or disassembly? Every end-user license agreement (EULA) I've ever read states that these practices violate the terms of the agreement, punishable by any and all means of nastiness the vendor thinks its lawyers can uphold in court. Does this mean security researchers must stick to de-obfuscating JavaScript and leave binary disassembly to those who have a "get outta jail free" card? Or does the industry simply look the other way when such a vulnerability is found, preferring to give the hallowed few a gratuitous "atta boy!" and let them go about their business?
Your question touches on an interesting area, and one that is a legal minefield. Put simply: is it legal to decompile software?

Most computer programs, other than those issued under an open source agreement, are covered by copyright laws, giving the author various rights to the program, including the right to make copies. This makes decompilation, or the conversion of program code into a readable, high-level programming language, illegal without the copyright holder's permission; decompilation, after all, involves making a copy.

Copyright laws in the United States and Europe, however, allow for limited decompilation of a copyrighted program. Decompilation can be used to gain an understanding of a program's unprotected functional elements when it's necessary to achieve software interoperability. In 1992, for example, Sega Enterprises Ltd. lost a case against video game developer Accolade Inc., which was deemed to have lawfully engaged in decompilation in order to circumvent the software-locking mechanism used by Sega's game consoles.

End-user license agreements (EULAs) are a different story, though. The violation of a EULA is the violation of a contract, and most EULAs state something along the lines of "agree not to reverse engineer or decompile, decrypt, disassemble or otherwise reduce the software to human-readable form . . ."

Breaking such an agreement isn't going to deter software pirates or hackers who decompile software, but it can deter valid research and responsible discovery and disclosure by security experts. If someone found a flaw in software running a nuclear power plant, I'm sure we wouldn't want them to keep quiet for fear of prosecution.

Other laws, such as the Digital Millennium Copyright Act (DMCA), have also stifled legitimate research. One security consultant, for example, declined to publish information he had discovered about vulnerabilities in an Intel secure computing scheme for fear that he would be arrested under the DMCA when he traveled to the U.S. Interestingly, all contracts, including EULAs, are subject to being declared void if they are "against public policy."

Many vulnerabilities are found through black-box analysis rather than decompilation of a program. Take Windows XP, for example. Its source code is estimated at more than 40 million lines, and no decompiler or debugger can make much headway against a program that large. But by systematically feeding a program malformed or unexpected inputs, hackers, both good and bad, can probe whether those inputs have the potential to crash it or evade its security. This process is certainly not illegal; the danger lies in how knowledge of any new flaw is used.
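The black-box approach described above is essentially fuzzing: mutate known-good inputs and watch for crashes, without ever seeing the source code. Here is a minimal sketch of that idea; `fragile_parser` is a hypothetical stand-in for a closed-source input handler, not any real component, and the deliberate crash condition is invented for illustration.

```python
import random

def fragile_parser(data: bytes) -> str:
    """Toy stand-in for a closed-source input handler (hypothetical)."""
    # Simulated flaw: a specific malformed header triggers a crash --
    # the kind of bug black-box testing surfaces without source access.
    if data[:2] == b"\xff\xfe" and len(data) > 8:
        raise MemoryError("simulated crash")
    return data.decode("latin-1")

def mutate(seed: bytes, rng: random.Random) -> bytes:
    """Flip a few random bytes in a known-good sample input."""
    buf = bytearray(seed)
    for _ in range(rng.randint(1, 4)):
        buf[rng.randrange(len(buf))] = rng.randrange(256)
    return bytes(buf)

def fuzz(target, seed: bytes, trials: int = 10_000) -> list:
    """Record every mutated input that makes the target raise."""
    rng = random.Random(1)  # fixed seed for repeatability
    crashes = []
    for _ in range(trials):
        sample = mutate(seed, rng)
        try:
            target(sample)
        except Exception:
            crashes.append(sample)
    return crashes

crashes = fuzz(fragile_parser, b"\xff\xfeHEADER-payload-0123456789")
print(f"{len(crashes)} crashing inputs found")
```

Real fuzzers such as AFL++ refine this loop with coverage feedback and smarter mutation, but the legal point stands: nothing here requires decompiling the target.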

The law certainly doesn't encourage software decompilation for piracy or other illegal purposes, but there are strong public policy reasons to allow security researchers to analyze code to find vulnerabilities. I'd recommend obtaining professional legal advice before contemplating any form of reverse engineering.

This was last published in August 2009
