Narcissistic vulnerability pimp: Baker on researchers and bug bounties

Date: Jun 03, 2011

In a blog post, Verizon Director of Risk Intelligence Wade Baker proposed a new title for a security researcher who seeks attention by releasing bug information before a patch is available: narcissistic vulnerability pimp.

In this video, Baker defends his position and discusses bug bounties and the real purpose of vulnerability disclosure.

Read the full text transcript from this video below. Please note the full transcript is for reference only and may include limited inaccuracies. To suggest a transcript correction, contact editor@searchsecurity.com.    

Narcissistic vulnerability pimp: Baker on researchers and bug bounties

Eric Parizo: Hi, I'm Eric Parizo from SearchSecurity.com. It's great to have you with us. Joining me today is Wade Baker, Director of Risk Intelligence for Verizon. Wade, thanks so much for being with us today.

Wade Baker: Absolutely. Glad to.

Eric Parizo: Wade, let's talk about responsible disclosure for a few minutes. You wrote a widely read and highly provocative blog post last year about your disdain for the overuse of the term "security researcher." Tell me about that.

Wade Baker: Alright. You are right, it was widely read, and widely hated in some circles, I think, but some people liked it. You didn't hear about the likes as much as the hate, but I guess that's the way things always go, right? I honestly can't claim sole authorship of this blog post; we published it as an admin post.

I wrote some of this blog post, but it has a lot of input from the rest of the team, and in general it does state our position on the matter. This has been almost a religious discussion in security communities for 10 or 15 years now: how should disclosure be done, who should the players be, how should it be handled, and all that kind of stuff. But the term "security researcher" is what we were picking at, and some of it was kind of tongue in cheek, but I think it's a valid question, right? If someone is finding information that's potentially harmful to users of computing resources, and they're publishing it at will, in many cases for their own glory, they want to go talk about it, they're not really interested in fixing the problem. I know that's not all of them, I'm just using an example, but is "researcher" the right term to use there? Why do we do that? Why not something else, you know?

Eric Parizo: Now in that post you focused on the term "narcissistic vulnerability pimp," which you defined as, and I quote, "One who solely for the purpose of self-glorification and self-gratification harms business and society by irresponsibly disclosing information that makes things less secure or increases risk." Do you regret making that statement, and how do you feel about NVPs today?

Wade Baker: So again, I'm not trying to back out of this, but this is a term that existed before. Marcus Ranum was a colleague of ours at one time, and I think the first time I ever heard it was from him, so he might have coined the term, and it stayed in our group. I think it's an interesting description. I can't say that I regret it, and I don't know if I should, but no, I don't regret it. It's obviously a tongue-in-cheek description, but there's some seriousness in it. I think a lot of this is driven by narcissism, more than an actual desire to make security better.

Eric Parizo: To play devil's advocate with you for a minute, some would say the public disclosure of a vulnerability is the best way to alert those who may be at risk and to encourage the vendor to address it promptly. Are there times when public disclosure is necessary and appropriate?

Wade Baker: Honestly, it depends on how it's handled. This is not my area of constant research, but I have some opinions on this matter, and my view is pretty simple. The rule is: if you're going to release information, whether it's a vulnerability or anything else, then right when you release that information, something else needs to be done so that the information doesn't increase risk.

If I release, "Hey, there's a security hole in whatever piece of software!" and there are millions of people across the Internet using that software, there's no possible way the vendor can create a patch and deliver it to protect all of those people right when that information is released. And as soon as it's released, all of the attackers have a head start on creating exploits if they want to. It's not a good way to do things in general. If the release of information increases the likelihood that an attacker will target that vulnerability, or increases their likelihood of success in being able to do so, I can't see how it doesn't increase risk.

Eric Parizo: What's your take on bounty programs? Does it essentially depend on how they're managed?

Wade Baker: Yeah, I think it does. It seems like some of those certainly work better than others. And there is absolutely nothing wrong with searching for flaws in software; companies should pay more people to do that. They should hire more software engineers. They should hire more people to do that for their own software before they release it. I'm absolutely in favor of that. Those bounty programs, if you want to view them that way, are in some cases just paying contractors, right? It's a business decision at that point, and the company is willing to make it. Where I disagree, and where I'm not convinced, is this: I'm a data guy, and I haven't seen data that makes me believe our current method of disclosure is working.

So let's take an example, getting away from the bounty programs: when I release information, and I just choose to do that, I'm saying, "This vulnerability that I've found is more important for that vendor to fix than anything else they're working on right now. They need to concentrate on this. They need to stop what they're doing. They need to fix this problem."

I don't think someone has the right to do that, and it's probably counterproductive. What if that vendor is fixing other, more important flaws at that particular point in time that the world doesn't know about yet, and they're honestly trying to do a good job? I know often they're not, but you see my point. It's forcing them to concentrate on this right now, and as often as that's done today, there are sometimes hundreds or thousands of these interruptions, so there's probably a better way to do it. But those bug bounty programs are at least an acknowledgement, and a program started by the vendors, where they're wanting to participate in that. Right?

Eric Parizo: And finally, if you could change one thing about the security research community or even how it's perceived, what would that be?

Wade Baker: I think it has got to be that we need to stop chasing flashy new products and answers that are not there. We have this history of believing some new technology is going to save us and make security work in the real world. We've been doing that for years and years now, it hasn't worked yet, and I don't think it ever will. The one realization would be that if companies, no matter who they are or what they are, seriously concentrated on the basics of security, made sure of them, triple-checked them, all of these kinds of things, we'd have a healthier Internet and healthier companies in terms of security, and I honestly think that is where it is. It's like quality management: a daily process of checking the same old stuff, making sure that what we say in policy is actually reflected in what we do. That sounds very simple, and it's not that we don't need new products or innovation; we absolutely need those things. But as far as all the data I've ever looked at, that's the one thing that I think would really improve matters.

Eric Parizo: Wade Baker, Director of Risk Intelligence for Verizon. Thank you so much for joining us today.

Wade Baker: Absolutely. Thank you.

Eric Parizo: And thank you for joining us as well. For more videos remember you can always visit SearchSecurity.com. I'm Eric Parizo. Stay safe out there.
