Distributed denial-of-service (DDoS) attacks saw a significant uptick among Akamai customers in the fourth quarter of 2013, with the number of attacks growing 23 percent over the previous quarter and 75 percent over the same quarter in 2012, according to the company’s latest “State of the Internet Report.”

Using its Akamai Intelligent Platform, the company gathered data on attack traffic and mobile connectivity as well as broadband adoption and internet connection speeds and found evidence of 346 DDoS attacks, up from 281 the quarter before.

“I think that we’ve come to expect that, for better or worse, DDoS and similar types of attacks will continue to be a significant issue, and unfortunately, will likely continue to increase in volume going forward,” David Belson, senior director of industry and data intelligence, told SCMagazine.com in an email correspondence. “As the cost and complexity of executing such attacks approaches zero, DDoS remains firmly entrenched as the weapon of choice for malicious individuals and organizations, including so-called activists.”

Escalation of attacks in the Asia Pacific region helped push up the overall number of DDoS events, with many executed in retaliation to the Singapore government’s decision to create an internet licensing framework.

“This framework would require some sites to register themselves if they have more than 50,000 unique visitors, and the requirement there has met with extreme resistance from internet activists,” Belson said.

Attacks rose slightly in the Americas, just three percent over third-quarter 2013 figures, and dropped slightly in Europe, the Middle East and Africa (EMEA).

Once again, China topped the list for attack traffic, accounting for 43 percent of the total, more than twice the share generated in the U.S.

The report results also cast web vulnerability scanning tools, and the way they are used to target companies for attack, into the spotlight. Akamai noticed a flurry of attacks aimed at the financial industry that leveraged the Vega and Skipfish scanners to find vulnerabilities in targeted companies’ websites. Typically used by security professionals to assess their own sites, the scanners were employed by attackers to search for Remote File Inclusion (RFI) vulnerabilities, identified by the string www.google.com/humans.txt.
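The probing technique described above can be sketched in a few lines. This is a hypothetical illustration, not code from either scanner: the idea is that a scanner submits a benign, publicly hosted file (here, Google’s humans.txt, as noted in the report) as the value of an injectable parameter, then checks whether that file’s text is echoed back in the response. All names, URLs, and the marker text below are assumptions for the sketch.

```python
# Hypothetical sketch of the RFI probe pattern (illustrative only).
# A scanner supplies a benign remote URL as an "include" parameter;
# if the page echoes that file's contents back, the remote file was
# fetched and included, so the parameter is flagged as vulnerable.

PROBE_URL = "http://www.google.com/humans.txt"

# Text assumed to appear in the probe file; a real scanner would
# fetch the file itself and derive an exact marker from it.
PROBE_MARKER = "Google is built by a large team"

def build_probe(base_url: str, param: str) -> str:
    """Construct the test request URL for one injectable parameter."""
    return f"{base_url}?{param}={PROBE_URL}"

def looks_vulnerable(response_body: str) -> bool:
    """Flag the parameter if the probe file's text shows up in the page."""
    return PROBE_MARKER in response_body
```

Because the probe file is harmless, the check itself does no damage; it simply reveals which parameters would accept an attacker-controlled URL in a later, malicious request.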

“The idea is that those responsible for the security of a website would use these scanners to identify vulnerabilities in them, so that they can be patched before they are exploited by the bad guys. Unfortunately, not every web site/application owner is as responsible, so many publish sites and applications that have vulnerabilities that can be exploited,” Belson said. “The bad guys are using these tools to find those vulnerable sites, attacking them in an attempt to exploit the known vulnerabilities.” 

An RFI vulnerability is created when a site accepts a URL from another domain and then loads its contents into the page — for example, when a site owner doesn’t validate which URLs are allowed to be loaded. Attackers exploit the flaw by loading a malicious URL into the site, and users, believing they are on a trusted site, may give out personal information.
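The validation step whose absence creates the flaw can be sketched as a simple host allowlist. This is a minimal illustration with hypothetical names (the handler functions and the allowlisted host are assumptions, not anything from the report):

```python
from urllib.parse import urlparse

# Hypothetical allowlist: the only host this site trusts for includes.
ALLOWED_HOSTS = {"cdn.example.com"}

def is_allowed_include(url: str) -> bool:
    """Return True only if the URL's host is on the allowlist."""
    return urlparse(url).hostname in ALLOWED_HOSTS

def vulnerable_include(url: str) -> str:
    # RFI-prone: fetches and renders whatever URL the client supplied,
    # e.g. ?page=http://evil.example/payload, with no check at all.
    return f"<fetched contents of {url}>"

def safe_include(url: str) -> str:
    # Safer: reject any URL whose host is not explicitly trusted.
    if not is_allowed_include(url):
        raise ValueError("include host not allowed")
    return f"<fetched contents of {url}>"
```

The contrast between the two handlers is the whole vulnerability: the first serves attacker-controlled content under the trusted site’s domain, while the second refuses any host the owner hasn’t explicitly approved.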

The attacks observed by Akamai turned out to be part of a larger threat, with more than two million RFI attempts made over a two-week span in the last month of the quarter.

To better protect their data assets, Belson urges companies to “maintain a proactive security posture, which includes making sure that all server and application software is patched to current levels, as well as leveraging IP and Web Application Firewalls to limit inbound traffic to only that which is necessary. In addition, leveraging scanning tools like Skipfish and Vega can help them find (and ultimately close) holes in their perimeter before they can be exploited.”