
XSS bugs, information leakage top list of website vulnerabilities

Companies are moving more rapidly to correct errors by feeding virtual patches into Web application firewalls, according to WhiteHat founder and CTO Jeremiah Grossman.

Cross-site scripting (XSS) continues to top the list of vulnerabilities plaguing websites, according to the latest trend report from website vulnerability assessment vendor, WhiteHat Security Inc.


WhiteHat said about 70% of the websites it scans are likely to have at least one critical vulnerability, while 63% are likely to have less severe flaws that still need attention.

The security vendor found that the websites it scans have a 65% chance of containing XSS bugs, followed by information leakage (47%) and content spoofing errors (30%). The firm said business logic website vulnerabilities, which enable hackers to abuse the legitimate functionality of a site, occupied more than half of the top spots. Other errors in its top ten list, to be released tomorrow, include insufficient authorization, SQL injection, predictable resource location, session fixation, cross-site request forgery, insufficient authentication and HTTP response splitting.
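To illustrate the class of bug topping the list, here is a minimal, hypothetical sketch (not taken from WhiteHat's report) of a reflected XSS flaw and its standard mitigation, escaping untrusted input before it reaches the page:

```python
import html

def render_search_page_unsafe(query: str) -> str:
    # Vulnerable: untrusted input is echoed into the HTML verbatim,
    # so a query like <script>...</script> executes in the visitor's browser.
    return f"<p>Results for: {query}</p>"

def render_search_page_safe(query: str) -> str:
    # Mitigated: html.escape() neutralizes markup characters
    # in the input before it is interpolated into the page.
    return f"<p>Results for: {html.escape(query)}</p>"

payload = "<script>alert('xss')</script>"
print(render_search_page_unsafe(payload))  # script tag survives intact
print(render_search_page_safe(payload))    # script tag is rendered inert
```

Real applications would rely on a templating engine with auto-escaping rather than hand-built strings, but the underlying fix is the same.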


"These are real, live production websites that showed a whole range of errors," said Jeremiah Grossman, founder and chief technology officer at WhiteHat Security.

The WhiteHat Website Security Statistics Report pulls together statistics based on more than 1,000 websites the vendor scans with its Web-based Sentinel vulnerability scanning software. The latest report contains data collected between January 1, 2006 and March 31, 2009.

Social networking sites topped the list of most vulnerable websites, with an 82% chance of having an urgent, critical or high severity vulnerability. They were followed by education websites, with a 76% chance of containing flaws; IT websites came in a close third at 75%.

Grossman said the state of website security is improving as companies with high-profile websites use scanning tools to find flaws and deploy Web application firewalls to apply virtual patches quickly to defend against cyberattacks.

"When you are able to assess on a weekly basis you can see what's working and what's not and adjust accordingly," Grossman said. "Virtual patches are an effective tool to address serious vulnerabilities quickly." 

The security vendor said it took companies about 58 days on average to correct an XSS vulnerability. It took firms 85 days to correct website information leakage errors and about 71 days to close content spoofing holes. Insufficient authentication, likely found in about 10% of the websites it scans, takes the longest to correct at about 125 days. Virtual patches, which allow companies to shield vulnerabilities through a Web application firewall, can significantly reduce the time it takes to mitigate a critical hole.
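A virtual patch is a blocking rule applied at the Web application firewall rather than in the application's own code, shielding a known hole until developers can fix it properly. As a rough, hypothetical sketch (the rule, the `id` parameter and the pattern are illustrative, not from the report), a WAF-style filter for a known SQL injection flaw might look like:

```python
import re

# Hypothetical virtual-patch rule: flag requests whose 'id' parameter
# contains common SQL metacharacters or keywords, shielding a known
# injection flaw until the application code itself is fixed.
SQLI_PATTERN = re.compile(r"['\";]|--|\b(union|select)\b", re.IGNORECASE)

def virtual_patch_blocks(params: dict) -> bool:
    """Return True if the WAF rule should block this request."""
    value = params.get("id", "")
    return bool(SQLI_PATTERN.search(value))

print(virtual_patch_blocks({"id": "42"}))                       # benign request passes
print(virtual_patch_blocks({"id": "1 UNION SELECT password"}))  # attack is blocked
```

Production WAFs express such rules in their own configuration languages rather than application code, and a narrowly scoped rule like this trades some risk of false positives for immediate protection.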

WhiteHat labels content spoofing, insufficient authorization, HTTP response splitting, directory traversal and SQL injection flaws as needing the most urgent attention. The vendor said it uses the Web Application Security Consortium (WASC) Threat Classification as a baseline for classifying vulnerabilities and the Payment Card Industry Data Security Standard (PCI-DSS) severity system to rate vulnerability severity.

The vendor will hold a webinar on Tuesday at 2 p.m. ET to discuss the study's findings. Grossman said the firm takes two approaches: how to treat sites that were not built under a mature software development lifecycle, and how companies can secure websites already in full production.
