Threat Level: Green | Handler on Duty: Xavier Mertens

SANS ISC InfoSec Forums

Responsible Disclosure or Full Disclosure?

The Google Online Security Blog posted a brief article with their opinion on the full vs. responsible disclosure debate, likely in the wake of the controversy over one of their researchers publishing a security vulnerability.  The debate on publishing security vulnerabilities has been, and remains, a hot one.  Almost all vendors support "responsible disclosure" (a term that I absolutely detest), where a researcher discloses the bug only to the software vendor, who then (hopefully) patches it.  Full disclosure is publishing the vulnerability publicly once it is discovered (or, in some cases, once a PR firm has been hired to manage the hype).

There are pros and cons to both approaches.  Responsible disclosure really only works when there is responsible software development.  However, if the good guys have the vulnerability, the bad guys have it and at least 12 more.  With the exception of the few vendors that buy vulnerabilities, responsible disclosure does not allow the security community to develop countermeasures to protect against the threat while a patch is being developed.  For instance, it took about a week for software to be developed to detect the LNK vulnerability, and there are still problems with it.  On the other hand, full disclosure hands the details to the bad guys in public so they can immediately exploit the vulnerability.  It does, however, get vendors and researchers to move quickly.

What are your thoughts on how disclosure should be handled? 

John Bambenek
bambenek at gmail /dot/ com


262 Posts
ISC Handler
Jul 27th 2010
Regarding the term itself, Microsoft's MSRC has introduced a new one: Coordinated Vulnerability Disclosure.

I would like to see a sensible meet-in-the-middle approach where the researchers you would call "the good guys" disclose to the vendor first and give a deadline by which they will move to full disclosure. This gives developers lead time to get the fix into the patching cycle. If the developers find there are extenuating circumstances and that full disclosure is likely to cause significant disruption, they should be able to request a reasonable delay, and "the good guys" should honor it (but only as a temporary postponement).
There really needs to be some 'whistleblower'-type legislation to address this, including bounties.

My personal experience is that most vendors, when you "responsibly disclose" a bug, ignore or threaten you. In those cases, I choose to be "irresponsible" and opt for full disclosure, releasing all information about the vulnerability and a working exploit. Sometimes they patch, sometimes not...
See rain forest puppy's October 2000 RFPolicy 2.0.
Quoting from "The Purpose of this Policy":
This policy exists to establish a guideline for interaction between a researcher and software maintainer. It serves to quash assumptions and clearly define intentions, so that both parties may immediately and effectively gauge the problem, produce a solution, and disclose the vulnerability.

See also the 2002 IETF draft (expired) Responsible Vulnerability Disclosure Process. Especially pay attention to Section 5, and the community commentary references within.

The arbitrary upper bound suggested by Google may not be desirable or helpful. Google's post misses rfp's essential point: contact by a researcher is the opening of a conversation about a unique set of circumstances. RFPolicy's goal is a framework for frequent and honest communication between the parties, one that recognizes the good faith of the researcher and the complexities the vendor may face.

The frustrating thing about Google's post is that it raises no new issues. The community has been down this road before. Google's post, despite citing Bruce Schneier's excellent 2001 Crypto-Gram, reflects an ignorance of prior quality work on this topic.

4 Posts
Best practice: lead by example and policy.

160 Posts
I have to agree with Shinnai. I've been burned too many times trying to be helpful, and it's just not worth the pain or the professional trouble. I post the vulnerability and the exploit, and if I can work up a patch or a mitigating measure for the vulnerability, I post that, too.
No Love.

37 Posts
I share Shinnai's and No Love's sentiment. As an example, on one occasion a big software developer, which had had 3 previous vulnerabilities published, refused to fix a 4th identified by a pen test. Not until I got the vendor on the phone with the pen tester and told the latter to publish the vulnerability was any progress made. There seems to be a built-in arrogance nowadays on the part of vendors, who either don't care or view fixes as inconvenient. I'm in the middle of yet another situation right now where the same issue has popped up again. Disclose!
I have mixed feelings about this subject. Full disclosure does make some vendors take vulnerabilities more seriously; they work faster to get a fix out. On the other hand, I work in an EDU environment that is decentralized, and it is VERY difficult to get systems patched or workarounds implemented quickly. Your average user at home is more at risk than, say, a large company that is using centralized management.
Dana Taylor

5 Posts
I am with JW, somewhere in the middle with "Here is the bug. If you do not patch by xyz date, I will disclose publicly."

If, as in Shinnai's scenario, the company ignores or threatens you, I do not agree that full disclosure is the answer. Shinnai even states "sometimes they patch, sometimes not".

In this case, I say disclose the bug to the security community and let them build their various patches for it. If you disclose publicly, the software company is STILL NOT GOING TO PATCH if they threatened you in the first place... and now the bug is out there with no patch coming. Keep it quiet and fix it yourself.

Now this is where the whistleblower part comes in: the company that threatened the researcher or security expert should absolutely be held responsible in some way for its *irresponsible* approach to the bug.

23 Posts
I believe the problem is that the researchers get placed in a spot where they must disclose eventually if they want to protect the online community. The system as it stands is totally flawed.

What should be done is this: the vulnerability should be submitted to CERT or a similar entity, who would then contact the vendor to initiate a fix and make a vague public statement that there IS a vulnerability. This would let the public know immediately that something should be done, and force the vendor to fix the problem.

If the problem is not fixed within a reasonable time, the vendor should be responsible for notifying ALL of its customers who have registered the product, and those customers should then be allowed to sell back the software if no fix is available.

Watch how fast things get fixed, and how fast you get real software writers again. It is true, the cost of software would go up a bit, but the world would be a lot safer.

A lot of the software written today is done by inexperienced writers. You get what you pay for, but it is still not right! I worked on Sendmail a loooong time ago, so I am fully aware bugs exist and need to be squashed. We had a few :-). Yes I am a dinosaur, but still loving it!

Just my two cents.. -Al
Al of Your Data Center

80 Posts
I agree with the idea that a hybrid approach is needed, and I especially like what "Al - Your Data Center" has proposed.

Simply relying on the vendor, with no assurance of a timeline, is not enough. But blasting the details of the vulnerability to the planet is not in everyone's best interest either, as it favors the bad guys, who have more time to spare than the rest of us.

Having a neutral but influential authority like CERT (or an alternative) involved, providing mutually agreed-upon timelines by which a fix or workaround must be in place before further disclosure might be pursued, is a good course.

And, of course, this should be accompanied by some generic information which indicates a potential problem, so that customers can start getting prepared to mitigate an issue.

I also like husaragi's suggestion of a whistleblower penalty...
Al of Your Data Center
10 Posts
Full disclosure, "plain and simple". We would all enjoy a world where code was written flawlessly, but it isn't and never will be. Why? Because of the $$$. The faster the production, the faster the profits, so attention to detail is sacrificed to release dates and profitability.

Arrogance has been in the tech industry since well before the "Legion of Doom", and it will never stop. Some folks are phenomenal at what they do in tech, and their ego is that of a seven-year-old with a new bike. We're talking about tech folks, people... not Rico Suave.

Openly posting vulnerabilities, known exploits, and other related specifics can threaten the profits of some companies (MS and Intel excluded). If the results end up in the televised media (not those boring Nightline editorials), then the hype can produce quicker results once the company starts seeing its income threatened. Let's face it: the majority of upper management doesn't focus on a more refined product, but on a higher profit yield through less maintenance of the original product over time. Yes, it's sad... especially when some third-party company is hired to write the apology letter to the masses... but work can begin rather quickly to resolve the actual issue in these circumstances... some of the time. If the company doesn't stand to lose that much from one area of its overall production, it can discontinue the product or service, or sell it to another company, more cheaply than bearing the added salary and wage costs of overtime or third-party contracting. It's the age-old question: "Is the risk worth the reward?" And in today's business landscape... no one takes "real" risks anymore.

And some will believe that once these vulnerabilities are openly published, we all endure it in our respective areas of expertise... and we all search frantically for a solution. In my experience, the same thing happens after any type of computer convention, industry "hype" over new tech in the marketplace, or even on illegal downloading sites. If it's a new exploit, the usual happens: 1) very few, if anyone, knows; 2) all hell breaks loose; 3) we turn to the manufacturer; 4) we're instantly disappointed when no fix results; and 5) we do whatever is necessary to protect our systems and wait for a fix, "or" 5a) do whatever it takes to find our own solutions and screw the manufacturer. I'm not saying that they shouldn't know... but chances are, if you know and there are topics in Google searches, they know, too.

Internet hype always depends on mass-media hype, so posting 40 topics in the forums of the respective "vulnerable" company can leave you watching posts from other folks as upset as you are, instead of resolving the situation or finding a solution. Most times you get useless rhetoric from some site admin, copied from the company's excuse manual, in a message that could be apologizing for vulnerable code or a broken washing machine; the jargon usually applies to either. I'm not saying it's not valuable to us as admins and users, but the vast majority of folks are not looking for technical solutions on the internet. It's either porn, Facebook, or illegal downloading. For those of us who manage hundreds or thousands of those users and machines, the battle can be compared to kicking water uphill. (Anyone remember the MySpace chat program? /cry)

Personally, I don't think trying to bargain with the manufacturers to find some "neutral ground" is a solution. Some host their own creation in-house and others subcontract the writing of the code. If I'm paying for a product from a manufacturer, and for all the accompanying support, then I want results as soon as possible, like everyone else. My clients like their services and all the bells and whistles that come with them. But if I buy the newest car on the market and in ten days the injector pump goes out... I still need to get where I need to go. I don't have time for them to schedule a meeting for next Thursday to sit down and discuss what they think needs to be done... I need what I paid for: a working car. It's not like one guy sits in a room and does this all by himself... there are many employees taking on multiple roles in the different areas of the product... so it's not as if there isn't someone already responsible for it. We can take the defensive and explain that there is no comparison, but if you pay for something, no matter its technical merit, you expect it to work as advertised. If we can manage numerous locations, software titles, services, and networks, and implement whatever the market asks for, then so should our providers... "plain and simple".
Al of Your Data Center
1 Posts
Great post, skinny.

There's another facet not yet mentioned, namely retribution. Perhaps I'm alone in remembering people being arrested and having their houses raided for reporting gross flaws in various bank websites, or things as trivial as iPhone email-address exposures.

Responsible disclosure is little more than suicide, and you truly are a moron if you do it; if you're lucky, you'll only be charged with attempted extortion. And in this environment, which the apologists completely ignore, most of your options for reporting demand that you screw them before they can screw you: either anonymous full disclosure, or a 3rd-party broker who cannot know your identity.

"While Auernheimer’s arrest for drug charges is obviously warranted[...], it’s hard to escape the fact that the Feds shouldn’t have even been at his house."

Go ahead, tell me YOU will ever report a flaw to AT&T, knowing that within 72 hours you'll need to replace your front door and rear door, and, by the way, every computing asset you own is gone, which includes any backups (since those are assets too, right?). Got a business? That's gone, too. You have no recourse, and they have ZERO accountability for their abuse.

Vendor contact is almost NEVER an option, and no amount of PR will change that. Plain and Simple.

42 Posts
Well, I will disagree on that one a little. Granted, the big boys will melt you, but a small business may welcome the information.

I remember back when the Internet was all text based and was just starting to have graphics and browsers. At that time I was working for a company which will remain nameless, on Long Island. Basically doing what I do now..

We received a call one day. The caller said "I am in your BSDI systems now". He then gave us the names of a few files and their contents. It was true.

Guess what? We hired him!!

Al of Your Data Center

80 Posts
Oh, a little more info.. That hacker graduated MIT and now designs processors for Intel!
Al of Your Data Center

80 Posts
When I've reported problems, I have been threatened or ignored far more often than thanked. I no longer report problems, since I was insulted, harassed, and threatened in one call. If it weren't almost frightening, it would have been funny (being threatened for "dimka violations", i.e. "DMCA" for everyone who doesn't spell it out).

18 Posts
Well, I can't hide that each time I sent information about a bug, I imagined two people reading it and saying:

"The question is: what is a manàmanà?"
"The question is: who cares?"

Obviously I'm joking :-)
I once submitted an issue to Microsoft where I felt an out-of-cycle security patch only partially corrected a vulnerability. I was emailed back and told that to pursue my issue further, I would need to go through PAID SUPPORT, because the operating system in question was on extended support. So you want me to pay for you to fix a security vulnerability properly? Huh?
Fortunately an additional patch on the next Patch Tuesday addressed my concerns.
Ken B

4 Posts
Steven: I've gotten a bit of retribution for reporting bugs in the 'responsible' manner and I really don't feel like dealing with that unique pain again. The messenger gets shot for trying to do the right thing all too often.

doj8: Yep, the DMCA sledgehammer gets used all too often. Official cease-and-desist orders a bit less so, but they still happen. I still cringe every time I receive an envelope with a law firm's return address in the corner.
No Love.

37 Posts
