[My comments below are in response to “Full Disclosure is Irresponsible” by Andy The IT Guy, who I still respect, just disagree with on this one]
“Full Disclosure is irresponsible and usually hurts more people than it helps and I still believe that is the case.”
What evidence do you have to support the above statement? First of all, define “hurt.” Who does it really hurt, and how? If a vendor makes a mistake and has to feel a little “hurt” in order to fix it, that is a good thing. What if the vendor never intended to fix it and the public never found out? Isn’t that worth a little bit of hurt? Of course, take this on a case-by-case basis; I don’t think we can treat all vulnerabilities and vendors the same, as there are so many variables.
“Most vendors, especially the big ones, will work with the researcher and release a patch in a timely manner.”
Let’s not forget the vendors that threaten researchers with lawsuits, launch smear campaigns, and flat-out ignore researchers. What about them?
“Releasing vulnerability details puts people in danger of having their lives screwed with by others in ways that can drastically impact them in negative ways.”
You are completely ignoring the positive effects and trade-offs. Having more details about a vulnerability allows workarounds to be published, IDS/IPS signatures to be developed, and potentially even more problems in the same code to be uncovered. So, there are two sides to the coin.
“Also the argument that many in IT use saying that by knowing the details prior to a patch allows them to be able to test their systems and put controls in place doesn’t hold much water either. Why? Because many if not most companies don’t do this.”
I totally disagree (and wonder where you got the above information). I’ve personally participated in collaborative efforts, spanning multiple organizations, to develop workarounds and signatures prior to a patch.