SearchSecurity recently asked its members to comment on the issue of full disclosure and to let us know where they stand: Do you believe that fully disclosing vulnerabilities only arms crackers with the information they need to create havoc? Or do you believe that full disclosure of details forces vendors into quick action and, ultimately, safer software?
We got an overwhelming response, and many comments were used in a news story on the topic. Below are the remaining comments:
"Lets face it, when a hacker does a number on you because of a well known or a recently discovered flaw, the company usually loses a bundle trying to find out what happened and control the damage. I say... if you sell me software it better be tested for buffer overflows, etc. I say this because most of the problems seem to be BUFFER OVERFLOWS which sets up a condition to let a hacker do his/her thing. I truly believe that vendors should be held accountable for their buggy software and the problems that arise as long as the application has been set up according to their (the vendors) documentation. I think this would be a good place to start fixing these issues at the source, ie., the software vendor."
-- John Blank, network services, North Arundel Hospital, Maryland
"The only problem in withholding vulnerability information is the lead window allows for extended exposure for the IT community as a whole. If a vulnerability existed that could not be quickly addressed, the local (organizational) IT department must have the all-pertinent information to make an informed decision to protect their information assets. All the cool-off period (lead window) does is allow those who are aware of said vulnerabilities to potentially exploit those vulnerabilities while the IT world as a whole is unaware that any vulnerabilities exist. Full immediate disclosure is paramount to the protection of our information assets."
-- Bob Waldron, manager information services, Arrow Terminals
"Maybe some paranoid software marketing groups should get their beta code to these people and pay them consulting fees to Q&A their product. We all realize that software vendors no longer have good Q&A departments of their own, as they put out bad code and full of holes on a continuous basis. These problems are beta testing problems and should never make it to Release 1. Software vendors are letting greed run their life instead of responsibility. "Get it out the door and get the bucks ...we will worry about fixing it later", unfortunately is the order of the day."
-- Mark Bruch
"I think that this double-edged sword can be turned so that it doesn't cut so deeply on either end. First, I think a protocol should be adopted where the bug discoverer notifies the software developer with the specifics of the flaw. If after a couple of weeks, the software developer does not at least update this person on their status regarding the flaw, the discoverer goes to defcon 2. This means they announce publicly that they have discovered this flaw in this particular vendor's software, yet they don't divulge any specifics. They could also state that they will announce the specific vulnerability at such and such a date. This might give the vendor a little more impetus to address the issue, while still keeping the true nature of the vulnerability from falling into the wrong hands. And if after this gentle nudge, the vendor still refuses to acknowledge the flaw or at least to contact the flaw discoverer about this discovery, by all means, disclose everything to everyone. Unfortunately, we have learned that it is the desire to avoid bad PR that drives some, rather than the aspiration to do right. In these cases, there isn't a whole lot left to do, short of a class action lawsuit."
-- Bryan MacLeod, Policy-Studies, Inc.
"Though I can see both sides of this issue, there are several things that concern me: 1. Depending on the level of vulnerability, one week may not be enough time for the vendor to fully test and certify a patch. We could end up having the patch create a hole somewhere else in the vendor's code. 2. If a vendor has a history of responding to vulnerabilities, then we have no reason to change. 3. If a vendor has a history of dragging their feet on posting patches, I think David Litchfield's comments are valid. My biggest fear in trying to force vendors to respond in 7 days is that we end up with shoddy work. I would also not be adverse to going public with a list of vendors who have in the past made a habit of not being responsive to repairing vulnerabilities."
-- Eric Willemstein, Systems Implementation Manager, FASCOR, Inc.
"After years working in information and physical security I would first weigh the severity of the problem. If the impact of a breach in the security could cause catastrophic damage, I would not hesitate to inform all that would be affected so they might take immediate actions to prevent a problem. I find that far too many people/businesses base actions on immediate monetary impact and not for the protection of their customer. Knowing that decent hackers are everywhere and many excellent hackers are always looking for a challenge, I would put out the warning immediately. If the vendor wants to become a hero then let them put out a patch fast enough to get those customers back up on their software quickly. If the data that could be compromised is not that important just ask the board of directors that stand to loose because of the breach. Bottom line, I would say that if the damage form negative publicity is more important than the customer's data vulnerability then just possibly the customer needs to consider another vendor? I would advise clients this way as well."
-- Jesse Wilcox
"First, this is an extreme idea, however, I believe a "happy medium" may occur from taking the two extreme sides of an argument and moderating the output to something acceptable. Manufacturers Responsibility. It is the manufacturers responsibility or the responsibility of the development community (if open source) to develop patches for their product. By holding the manufacturer accountable is the only way you can achieve compliance to "acceptable behavior". My idea is to create a "registrar" of manufacturers that are agreeable to produce a proper patch within "X" number of days after a vulnerability is found for a particular piece of software. Not producing this patch is subject to penalties (pre-determined based on the classification of the security threat and the overall characteristics of the software as it relates to how wide spread the software is used and what exactly the product does, ie. "Networking", "Accounting" and/or if the software is used by the general Internet community). The registration of the software/manufacturer would be on a voluntary basis, the manufacturer would pay a fee to be registered and the monies acquired would be used to cover the expenses of the registrar (setup as a not-for-profit agency). Any and all penalties paid would be used to further the cause of the registrar. Users, Administrators, Developers, etc, that are looking to purchase a particular piece of software would go to the registrar, do a simple search for either the name of the software or the name of the manufacturer and cross reference to see if the manufacture/software is registered.
Penalties would accrue once the software manufacturer fails to provide a valid, workable patch that eliminates the vulnerability within 'X' number of days. Furthermore, for every day the patch remains unavailable, additional penalties would be applied. The manufacturer and all responsible parties, including the person reporting the vulnerability, would be asked to notify the registrar of the vulnerability. The registrar would be restricted from divulging the information for an acceptable amount of time, giving the software vendor a chance to patch. If the patch isn't produced within 'X' days, the registrar would be free to publish the information concerning the vulnerability, proof of concept, test results, and possibly the code used to exploit the vulnerability.
Basically, this would set the registrar up as a 'third party' enforcing policy on contracted vendors. The vendors would be seen as more reliable because of their acceptance of the registrar's policies, and users would be more at ease purchasing software from vendors contracted to provide proper patches. The registrar would be seen as a non-partisan, non-profit company providing a strict service, with no big-business interests influencing its bottom line.
In the United States, we have consumer protection agencies, Underwriters Laboratories (for electrical appliances), buyers' guides, etc., to protect purchasers. Why don't we already have the same thing for software?
One week should be sufficient. I see no reason whatsoever that a software manufacturer shouldn't be able to provide a patch within that time frame ... and if they can't, I would think the software is of poor manufacture to begin with, and it's probably about time to find something new anyway :-) I know I did. I got tired of Microsoft's problems, switched my servers to Unix, and I haven't looked back."
-- John Holstein
"Prompt (certainly within a week!) disclosure that a vulnerability exists must be made; however, the full mechanics of how the vulnerability was exploited should not be openly published. A week in the computer software business, especially regarding destructive viruses, is a very long time! To make full disclosure will only encourage and enable those criminal vandals who refer to themselves as "hackers" to improve their arsenal of weapons to use against the innocent, especially against those who do not update their anti-virus software on a daily basis (most of us).
Along with the disclosure should be published information about any new files (e.g., *.exe file names and sizes) created by the virus (Trojan, etc.), how to check for their existence, and how to correct the infestation (usually by deleting the executable file). At present, the great majority of computer users are more or less totally at the mercy of anti-virus software vendors who claim to be protecting their systems. Having appropriate correction information would enable most of the computer-using public to check their own machines and take appropriate corrective action: it is simply unwise to assume that anti-virus software has completely taken care of all problems.
The above would revolutionize, in a very practical sense, the defensibility of computer users' property and enable a lot of small businesses to continue to survive against the unending onslaughts of malicious vandals, worldwide."
-- Lindsey V. Maness Jr., Chairman, Resources & Technology Symposia of Colorado, LLC
"If a bug finder fails to report what he knows, who is being protected? Over the past couple of decades, I've reported numerous bugs to various vendors, often without even an acknowledgement, let alone a fix. Bug reports seem to go into a big recycle bin and when the vendors feel like issuing a fix, they do. Personally, I'm tired of paying premium $$$ to vendors like Microsoft, Symantec, etc., for buggy, poorly tested products. And just try calling their tech support dept without a credit card firmly in fist."
-- Glen Moulder, U.S. Navy
"Will those who reveal the security flaws be subject to prosecution under the DMCA and UCITA? A SW vendor could (and some have) call the flaw a feature and claim under DCMA and UCITA that the reporting party is disparaging their software without permission and that they must have reverse engineered parts of the software to discover and understand the flaw? The vendors may yet be able to silence the critics!"
-- Rick Burke, Senior Network Administrator, Defense Microelectronics Activity
"It seems unreasonable to assume that a vulnerability in software could be fixed in as little as one week. Depending on the application you might have to change half the code. To list the vulnerability to soon would give the bad guys a open door. Anyone who irresponsibly would do that should be held at least partly accountable for any damages resulting from their irresponsible disclosure. This all comes down to accountability. It is my opinion that some accountability for any media sharing that information to quickly should be held accountable also. You may not be robbing the bank, but you left the vault open. You then broadcast to the world you did."
-- Ron Hizey, Potlatch Corp.
"The debate about vulnerability disclosure focuses the searchlight on the point of failure, rather than its root cause. One root cause of failures in code for security reasons can be traced to the cavalier way in which modern systems have been developed. We have abandoned analysis and design in favor of extreme development (never mind the overall system -- just get this bit working and don't bother documenting it). Known problems - buffer overruns - segregation of stored data - control of executing code, continue to be the problems reported in CERTS time and again. History - old code we are still running - also catches up with us. Layered architectures have left us with more layers to fix than skins on the onion. The forbidding of layers to talk to each other (thank you OSI), and therefore to know each other's security contexts and security results, merely adds fuel to the fire.
Quality control that finds it acceptable to let through systems that are then shown to have major errors 'out of the box,' coupled with the arrogance of saying that users will do 'whatever I want,' also has its part to play. Security 'experts' may also find themselves in this group.
Let us recall that the most powerful weapon of all is enlightened self-interest. Litchfield berates the manufacturers for lack of diligence, but whilst there is a cloud of secrecy over the whole affair it matters little to either the manufacturer or the customer. A better-organized approach may be to run a league table.
Bugs can be reported to manufacturers, initially with xx days grace. Their response speed can be monitored (perhaps they want to publish back how quickly they respond) with respect to the user perceived severity of the bug. Whilst they fix within schedule the number of days does not decrease (except on some overall quality scheme, say by x days per annum on some stated basis). If they do not fix in time their number of days grace decreases by x days and their position in the league table moves accordingly. The table can be published monthly without fear of litigation.
Hopefully enlightenment will achieve a state of grace. Whilst all this happens, others must look to their laurels and fix the underlying nasties that keep coming back to bite us."
-- Steve Mathews
"Security through obscurity doesn't work. Never has, never will. We need to collectively tell software vendors they need to fix security bugs before they leave the `factory.' This is the only way do to it after the fact. My hope is that announcements of the bugs are sent far and wide.
-- Ted Frohling, security incident response team, University of Arizona
"Hey, there's a new game in town. Now as a hacker I don't have to spend hours finding holes in programs and then if I do find a hole exploit it. The new game will be a race between el hacker and el software company(and anyone using the program). Can el hacker exploit the now known hole before el software company and users fix it. Basically, I think it's a bad idea and vendors should be given a longer lead time. Could Litchfield be a closet Linux(or its variants) guy and just be wanting to slam Microsoft a little harder/quicker?"
-- Tony Kasyan, Director of Information Systems, Individualized Care Management, Inc.