Researchers simulate a botnet of 1 million zombies
Researchers announced this week that they have created a simulated botnet consisting of more than one million machines.
Researchers at Sandia National Laboratories in California, led by Ron Minnich and Don Rudish, booted more than one million kernels, the central component of most operating systems, as virtual machines in a massive botnet simulation. Previously, researchers had only been able to create simulated botnets of up to 20,000 nodes.
Right now the foundation of the botnet has been created, but researchers have not actually run it yet, Minnich told SCMagazineUS.com on Friday. In the next phase of the project, which will begin Oct. 1 and run for three years, researchers will fire up the botnet they have created.
“At a minimum it would be nice to find out how to slow them down or stop them,” Minnich said.
Minnich added, perhaps ruefully, “The people who write botnets are just brilliant people in my view and they put a lot of work into making these things very hard to detect.”
The task of analyzing botnets is difficult since infected computers are distributed all over the world, researchers said.
“The more kernels that can be run at once, the more effective cybersecurity professionals can be in combating the global botnet problem,” Minnich said in a statement.
Researchers used Linux kernels for the simulation. To build the virtual botnet, the scientists used virtual machine (VM) technology, which emulates real machines in software, running on a powerful supercomputing cluster at Sandia called Thunderbird. Essentially, they were able to spin up a million VMs on one supercomputer.
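As a rough illustration of the scale involved (the node count below is an assumption for a Thunderbird-class cluster, not a figure reported by the researchers), reaching one million VMs means packing hundreds of guest kernels onto every node of the cluster:

```python
import math

TARGET_VMS = 1_000_000   # simulated botnet size reported in the article
CLUSTER_NODES = 4_000    # assumed node count; the real Thunderbird figure may differ

# Each node must host enough guest kernels that the whole
# cluster collectively reaches the one-million-VM target.
vms_per_node = math.ceil(TARGET_VMS / CLUSTER_NODES)
print(vms_per_node)  # 250 guest kernels per node under these assumptions
```

At that density, each guest has to be extremely lightweight, which is why booting bare kernels as VMs, rather than full desktop-style virtual machines, is what makes the experiment feasible.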
And now that researchers have established a simulated botnet of one million, they are setting their sights on loftier goals.
“Eventually, we would like to be able to emulate the computer network of a small nation, or even one as large as the United States, in order to ‘virtualize’ and monitor a cyberattack,” Minnich said.
If their next goal is met, researchers should be able to construct models of parts of the internet to better understand and analyze overall processes.