These attacks are harder to detect; they're more stealthy, they don't generate a large network bandwidth but they're equally capable of taking down a network.
Carlos Morales, vice president for global sales engineering and operations, Arbor Networks
The single largest attack Arbor Networks recorded in its 2010 survey reached 100 Gbps. This is the first time attack bandwidth broke that barrier, and it represents a 102% increase over the largest attack in 2009, according to the report. It also represents a 1,000% increase in attack size since Arbor Networks began its survey in 2005, according to Carlos Morales, vice president for global sales engineering and operations at Chelmsford, Mass.-based Arbor Networks.
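A back-of-the-envelope calculation (ours, not Arbor's) shows what those percentages imply about the earlier peaks:

```python
# Hypothetical check of the reported growth figures; the earlier peak
# sizes below are implied by the percentages, not taken from the report.
peak_2010_gbps = 100.0

# A 102% increase over 2009 implies the 2009 peak was:
peak_2009_gbps = peak_2010_gbps / (1 + 1.02)   # roughly 49.5 Gbps

# A 1,000% increase since 2005 implies the 2005 peak was:
peak_2005_gbps = peak_2010_gbps / (1 + 10.0)   # roughly 9.1 Gbps

print(f"Implied 2009 peak: {peak_2009_gbps:.1f} Gbps")
print(f"Implied 2005 peak: {peak_2005_gbps:.1f} Gbps")
```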
Cybercriminals can generate a large volume of traffic due to the proliferation of technology, Morales said. Between smartphones with 3G or 4G capabilities and wired broadband networks, the equipment available for botnet exploitation ranges in the billions of devices and represents far more than the amount of bandwidth available to most network operators, Morales said.
Application-layer DDoS attacks, similar to the attacks used in the WikiLeaks debacle by members of the "Anonymous" group, are also becoming more prevalent. These are more difficult to detect because they use sophisticated tools to generate traffic that resembles legitimate application requests.
Arbor Networks is tracking an increase in application-layer attacks against critical infrastructure. HTTP and DNS servers are the primary victims; however, these attacks also target SMTP and VoIP infrastructure and are much more serious, Morales said.
"The challenge with [application-layer attacks] is these attacks are harder to detect; they're more stealthy, they don't generate a large network bandwidth but they're equally capable of taking down a network," Morales said.
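Because application-layer attacks stay under bandwidth-based thresholds, defenders typically look for behavioral anomalies instead. The sketch below illustrates one such heuristic, flagging clients whose request rate stands out even when total traffic volume is low; the threshold and log format are made-up assumptions, not a description of Arbor's products:

```python
# Illustrative sketch of an application-layer detection heuristic: flag
# clients whose per-minute request rate is anomalous even though the
# attack generates little network bandwidth. Threshold is hypothetical.
from collections import Counter

REQUESTS_PER_MINUTE_LIMIT = 120  # assumed threshold for this example

def flag_suspects(request_log):
    """request_log: list of (client_ip, path) tuples seen in one minute."""
    per_client = Counter(ip for ip, _ in request_log)
    return {ip for ip, n in per_client.items() if n > REQUESTS_PER_MINUTE_LIMIT}

# One scripted client hammering a single endpoint hides inside normal traffic:
log = [("10.0.0.5", "/search")] * 500 + [("10.0.0.%d" % i, "/") for i in range(50)]
print(flag_suspects(log))  # {'10.0.0.5'}
```

Real detection systems weigh many more signals (request patterns, session behavior, protocol anomalies), but the principle is the same: volume per client, not aggregate bandwidth, gives the attack away.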
The "threat to defense gap" is also as wide as it has been since the inception of DDoS, Morales said. This is largely due to the current practice of defenders using traditional means, such as firewalls, to defend their servers and data centers.
Firewalls protect against infiltration attacks by blocking unsolicited connections. While this is useful to a consumer or a business, in a server or data center environment firewalls become chokepoints: every request arriving at a public server is, by definition, unsolicited. Tracking all of those connections consumes the firewall's resources, which causes it to fail under the load of the attack, Morales said.
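The failure mode Morales describes can be modeled with a toy state table; this is an illustrative sketch only, and real firewalls are far more complex:

```python
# Toy model of why a stateful firewall becomes a chokepoint under DDoS.
class StatefulFirewall:
    def __init__(self, table_size):
        self.table_size = table_size   # max connections it can track
        self.connections = set()

    def accept(self, conn_id):
        """Track a new unsolicited connection; refuse when the table fills."""
        if len(self.connections) >= self.table_size:
            return False               # table exhausted: from here on,
                                       # legitimate traffic is dropped too
        self.connections.add(conn_id)
        return True

fw = StatefulFirewall(table_size=10_000)
# A flood of unsolicited connections -- exactly what a public server
# receives by design -- fills the state table.
results = [fw.accept(i) for i in range(15_000)]
print(results.count(False))  # 5000 connections dropped under load
```

The point of the model: the firewall fails not because the attack traffic is malicious-looking, but because tracking every unsolicited connection is exactly what it was built to do.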
"In 2002 the largest single attack was 400 Mbps, the largest single attack in 2010 was 100 Gbps, that represents several orders of magnitude growth in terms of attack size," Morales said. "What that projects to if you look at 2015 or 2020 is just astronomical … so you have to kind of project out what the future may lay from this and say that this is something that is going to have come to a head and we're going to have to take steps to resolve."
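Morales' projection can be made concrete with a simple compound-growth extrapolation; the figures below are our hypothetical calculation from his two data points, not Arbor forecasts:

```python
# Extrapolating from Morales' two data points (hypothetical projection):
# 400 Mbps in 2002 -> 100 Gbps in 2010 is 250x growth over 8 years.
growth_factor = 100_000 / 400           # 100 Gbps in Mbps, over 400 Mbps
annual_rate = growth_factor ** (1 / 8)  # compound rate: roughly 2x per year

peak_2015 = 100 * annual_rate ** 5      # Gbps, projected
peak_2020 = 100 * annual_rate ** 10     # Gbps, projected

print(f"Annual growth: {annual_rate:.2f}x")
print(f"Projected 2015 peak: {peak_2015:,.0f} Gbps")
print(f"Projected 2020 peak: {peak_2020:,.0f} Gbps")
```

At that rate the peak attack roughly doubles every year, landing in the terabit-per-second range within a few years, which is the "astronomical" trajectory Morales is describing.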