We have seen tremendous growth in two areas: cross-site scripting and cross-site request forgery have both exploded in the Web 2.0 world. That's because Web 2.0 and Ajax rich internet applications require JavaScript to be turned on, which makes cross-site scripting a much more viable threat. SQL injection is an old attack that has been around for many years, but it's being given new life by the way Web 2.0 applications are deployed. Before, you had large, single, controllable transactions. In Web 2.0 that attack surface has fragmented. Instead of a single transaction, there might be three or four asynchronous transactions, one after another. So the application receives input in many different places, and each of those places interacts with the database; that leads to SQL injection vulnerabilities and all kinds of other vulnerabilities associated with that fragmented attack surface. What is the risk of these threats? It seems like it would take a sophisticated attacker to exploit these vulnerabilities.
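The fragmented-attack-surface point above can be made concrete with a minimal sketch. The names and schema here are illustrative, not from the interview: imagine one of those asynchronous transactions is a user-lookup endpoint. If it builds SQL by string concatenation, attacker-controlled input is parsed as SQL; a parameterized query binds the same input as data.

```python
import sqlite3

# Toy in-memory database standing in for the application's backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

def lookup_vulnerable(name):
    # String concatenation: the input becomes part of the SQL text itself.
    cur = conn.execute("SELECT email FROM users WHERE name = '" + name + "'")
    return cur.fetchall()

def lookup_safe(name):
    # Parameterized query: the input is bound as data, never parsed as SQL.
    cur = conn.execute("SELECT email FROM users WHERE name = ?", (name,))
    return cur.fetchall()

payload = "' OR '1'='1"
print(lookup_vulnerable(payload))  # classic injection: returns every row
print(lookup_safe(payload))        # returns [] -- payload treated as a literal name
```

With three or four such endpoints feeding the same database, each one concatenating input is an independent injection point, which is the "fragmented attack surface" described above.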
At the beginning of any era like this, it does take a skilled attacker to put together a complex and sophisticated attack kit. But what happens over time is that these packages are produced and distributed on malware sites and in the underground, and script kiddies pick them up. The barrier to entry keeps getting lower.
One thing that we can't forget is that we talk about all these vulnerabilities in Web 2.0, but attackers are not focusing on them. We're not seeing it, anyway, in the exploits out there. We're not seeing attackers focused on Web 2.0 right now because there are so many problems in the original Web 1.0 world, without Ajax, SOAP, XML, and remote procedure calls. We're still seeing attacks against Web 1.0 because it is so porous from an attacker's point of view. So, at least for now, the bad guys are still having fun in the Web 1.0 world and haven't even gotten to Web 2.0 vulnerabilities yet?
That's correct. If my mother is still willing to respond to an email and give out her banking information, then there's no need for them to take the time to develop a very complex exploit kit. As those lower-level attacks become closed off and organizations become more aware, there's no doubt at all in my mind that attackers will shift toward Web 2.0-type attacks, and now is the time to address that. How has Watchfire's research group changed since it was acquired and integrated into IBM?
Since the acquisition we have gained considerable focus, looking specifically at Web applications, Ajax and Web 2.0 technologies. … we had a number of issues in the hopper that had to do with Web applications, some that had to do with networks and some with hosts. We have passed some of the network- and host-based research off to the ISS team, and in reverse they have handed off to us some of the Web application security research that they had in their hopper. The information that we are providing feeds the broader set of security issues that IBM looks for.
Our focus is being very heavily involved in solutions for the Web application security space. What is changing for us quite rapidly is our very deep integration into the IBM Rational product set. Instead of just doing end-to-end security risk analysis, now what we want to do is provide a comprehensive platform for governing software quality. Security issues are really just about the quality of the software being delivered, so we're focused on building software security analysis into the software development lifecycle. With the whole movement toward a service-oriented architecture, Web services are being introduced into many company environments, and it seems like attackers are taking advantage of this. Is that the case?
The organizations we see are focusing on Web 2.0 and Web-based technologies primarily because of four factors. The first is social demand: there are demands on an organization for more collaboration, where everyone is both a producer and a consumer. Next is market pressure: on the business-to-business front there are demands. For example, if I were hosting a real estate application, there would be a demand to integrate it with mapping technology. There is also the competitive edge: if my competitor is offering a very rich interactive interface, then I need to keep up with that same experience or I could lose my clients. Then lastly, the technologies: we now have these new, very rich interactive frameworks that provide an experience through the browser very much like the desktop experience, something so dynamic that it was not available to us in the past. If a company is going to be using some of these Web 2.0 technologies, what should it be considering?
One of the interesting things about applications, which is very different from the networking infrastructure, is that we assume an application is secure unless it's proven otherwise. We assume software is secure until we hear there's a vulnerability, and then we patch it and it's secure again. It's kind of a backwards mentality. If we're talking about networks and infrastructure, we assume they can be broken until it's proven they have been built in a very solid, methodical way that has been documented. Organizations need to approach the different ways that applications are hosted differently. If it is a commercial off-the-shelf product, I would like to see some level of assurance that the product is secure; that is going to be driven by customer demand. If you're talking about software-as-a-service or software that has been outsourced or offshored, then it becomes imperative that the organization consider service-level agreements that not only factor in security, but also include some level of assurance, with both metrics and methodology, to prove security was considered during the development lifecycle and the build process. For software built in-house, analysis has to be done earlier in the lifecycle; you can do it incrementally and iteratively, and you come out with better software.