Monday, September 1, 2008

WhiteHat website vulnerabilities stats

| Armando Romeo |
WhiteHat Security showed some numbers based on stats collected from real-world assessments they carried out on 687 custom-coded websites (so no "known", publicly available vulnerability stats here).

Jeremiah's webinar was quite interesting, and sharing such numbers is not common practice either.
So before any comments, kudos to Jeremiah.


The most interesting part of the stats is the breakdown of vulnerability types found, which I'd like to add my comments to.


67% of sites suffer from Cross site scripting
Not a big surprise here. I'd have expected even bigger numbers. The .NET framework's XSS prevention probably played some role here.
XSS is still at the top of the OWASP Top 10. A lot of research has been done, and a lot of discussion on XSS prevention is going on, but developers still fail to sanitize and encode input/output.

The only solution I see here is to prevent developers from writing buggy code through the use of robust frameworks, since years of tutorials, conferences and best practices don't seem to have worked.
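To make "encode output" concrete, here is a minimal sketch (Python standard library, purely illustrative and not from the webinar) of escaping user-supplied text before it is written into an HTML response:

    import html

    def render_comment(comment: str) -> str:
        # Encode user-supplied text before placing it inside HTML, so that
        # characters like <, >, & and quotes cannot break out into markup.
        return "<p>" + html.escape(comment, quote=True) + "</p>"

    # A classic payload is neutralized into inert text:
    print(render_comment('<script>alert("xss")</script>'))
    # <p>&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>

Robust frameworks essentially do this kind of encoding by default in their templating layers, which is exactly why they help where tutorials haven't.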


17% of sites suffer from SQL Injection
SQL injection is easier to prevent. The remediation/development units of an organization tend to prioritize this kind of vulnerability because its danger is perceived as more real compared to XSS.

An attack on an organization's data has more "emotional impact" on management and executives than XSS.
XSS takes higher priority when it is persistent, anyway.

SQL injection prevention is also better understood from a remediation report: with prepared statements or proper input sanitization, the vulnerability goes away.
Forcing developers to use prepared statements would drop this percentage close to zero.
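As an illustration, here is a minimal sketch of a parameterized query (Python's built-in sqlite3, purely as an example; any driver that supports bound parameters works the same way):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice')")

    user_input = "alice' OR '1'='1"  # a classic injection attempt

    # Vulnerable pattern: string concatenation lets the input rewrite the query.
    # conn.execute("SELECT * FROM users WHERE name = '" + user_input + "'")

    # Prepared/parameterized query: the value is bound, never parsed as SQL.
    cursor = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,))
    print(cursor.fetchall())  # [] -- the injection string matches nothing

The fix is mechanical and easy to verify, which is part of why this finding gets remediated faster than XSS.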

As Jeremiah said, CSRF made its appearance in the stats for the first time.
But honestly speaking, an 8% figure for this vulnerability seemed suspicious to me.


Someone asked for more information on this before I did.

Jeremiah says explicitly that these numbers are based on the effectiveness of the automated tools they use.
And he adds that the real number for CSRF is approaching 75%.

This is indeed what I thought at first.

Right now, all companies are relying upon automated tools that are not able to cover these kinds of vulnerabilities, which require manual testing to be discovered.
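For context, here is a minimal sketch (Python, purely illustrative) of the per-session token check that protects against CSRF; a scanner has to figure out which requests change state and whether a token like this is actually validated server-side, which is hard to do without manual testing:

    import secrets

    def issue_csrf_token(session: dict) -> str:
        # Generate a per-session token to embed in every state-changing form.
        token = secrets.token_hex(32)
        session["csrf_token"] = token
        return token

    def is_valid_csrf(session: dict, submitted_token: str) -> bool:
        # Reject the request unless the submitted token matches the one in
        # the session, compared in constant time.
        expected = session.get("csrf_token", "")
        return bool(expected) and secrets.compare_digest(expected, submitted_token)

    session = {}
    form_token = issue_csrf_token(session)
    print(is_valid_csrf(session, form_token))      # True
    print(is_valid_csrf(session, "forged-value"))  # False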

This is where the industry is stuck.
Relying upon tools that are not able to catch them all (logic flaws included) and keeping manual testing at a minimum in order to keep prices low and win the competition.

Not that this is a sin on the part of security services vendors, but the 8% figure for CSRF is a symptom of a commonly accepted failure of such tools.

Accepted by companies that cannot afford manual reviews (when feasible anyway), and accepted by security service providers who earn money with much less effort.

My question is: how could a tool have discovered a logic flaw like the latest big Joomla exploit?
Probably only a source code audit could have uncovered it (probably). But who is auditing source code anymore when you have tools?
Smells like the old "false sense of security" here.
