Websites are doing a better job of keeping vulnerabilities classified as “serious” out of their code bases, according to WhiteHat Security’s annual study released Thursday.
But improvement is relative. While the number of serious flaws per website fell from 79 in 2011 to 56 in 2012, 86 percent of the tens of thousands of sites analyzed still contained at least one such bug. The sites that were evaluated belonged to 650 organizations that are customers of WhiteHat, which makes website risk management solutions.
Serious vulnerabilities are defined as those whereby an attacker could take control of a website, compromise user accounts, access sensitive information or breach compliance obligations.
Of the serious weaknesses that were discovered, 61 percent were eventually fixed, though it took organizations an average of 193 days to patch the issue. When they did plug the hole, compliance, perhaps surprisingly considering the havoc that vulnerabilities can wreak, served as the primary driver.
The most common security defect was information leakage, found in 55 percent of websites, followed by cross-site scripting, present in 53 percent. Noticeably absent from the top of the list was SQL injection, which declined from No. 8 to No. 14.
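To illustrate the class of flaw behind SQL injection’s decline, here is a minimal sketch (not drawn from the study; the table and column names are hypothetical) showing how concatenating user input into a query lets an attacker bypass a filter, and how a parameterized query prevents it:

```python
import sqlite3

# Hypothetical example database (not from the WhiteHat study).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", 0), ("bob", 1)])

# Attacker-supplied input crafted to break out of the quoted string.
malicious = "nobody' OR '1'='1"

# Vulnerable: the input is spliced directly into the SQL text, so the
# injected OR clause matches every row in the table.
leaked = conn.execute(
    f"SELECT name FROM users WHERE name = '{malicious}'"
).fetchall()

# Safe: the driver binds the input as data rather than SQL, so the
# literal string "nobody' OR '1'='1" matches no user.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)
).fetchall()

print(len(leaked), len(safe))
```

Run against this two-row table, the injected query returns both users while the parameterized one returns none, which is why parameterized queries are the standard remediation for this bug class.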
The study also investigated whether commonly used preventative and reactive processes and technology, such as training, code review or web application firewalls (WAFs), actually help to limit bugs.
While instructor-led or computer-based software security training may have had an effect (organizations that employed such education experienced 40 percent fewer vulnerabilities and fixed them faster), the same can’t be said for static code analysis and WAFs. Organizations that performed code checks experienced 15 percent more flaws and remediated them more slowly, while those that implemented a WAF had 11 percent more vulnerabilities and were also slow to patch the issues.