Researchers on Wednesday said a so-called “nameless,” previously undetected malware strain amassed a cloud database containing some 1.2 terabytes of files, cookies, and credentials stolen from 3.2 million Windows-based computers.
In a blog post, NordLocker said the malware exfiltrated 6 million files grabbed from desktop and Downloads folders. Screenshots taken by the malware revealed that it spread via pirated Adobe Photoshop software, Windows cracking tools, and pirated games. The malware also photographed the user if the device had a webcam.
According to NordLocker, the hacker group responsible accidentally disclosed the database’s location, and the cloud provider hosting the data was notified so it could be taken down. The data, which included 2 billion cookies, was stolen between 2018 and 2020.
Malware has dominated the threat landscape in one form or another for decades, and yet the same story of poor security hygiene, improperly configured security controls, and a general lack of user awareness remains the main problem, said Vishal Jain, co-founder and chief technology officer of Valtix.
“With cloud computing growing at 40%, the malware problem has shifted to target this new frontier,” Jain said. “All security controls are ultimately fallible. As the saying goes, if there were perfect defenses you would have security vaults, but no security guards and auditors since the vault is perfect. Organizations need to focus on defense-in-depth at the network layer. The network is common ground across all these attacks and exploits. Some of these network defense concepts like anti-virus, DLP, and firewalling are pretty well understood and still applicable in the public cloud.”
Sean Nikkel, senior cyber threat analyst at Digital Shadows, said we will continue to see exposed data as long as people fail to use the sound security practices at their disposal. If companies store critical data in the cloud, he said, there are numerous cloud-native security options from every large cloud provider, as well as third-party vendor solutions.
“The question should also be asked if that data is even necessary or if it should be stored in perpetuity,” Nikkel said. “Tie any data stored to a specific time-to-live based on need or compliance, and audit the environment regularly for access and vulnerabilities. At the very least, build databases with secure coding principles and other best practices, and test them periodically. Also, patch the servers regularly.”
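Nikkel’s time-to-live suggestion boils down to a retention check: anything stored longer than its approved window gets flagged for deletion. A minimal sketch in Python, where the 365-day window and the record layout are illustrative assumptions rather than anything from the NordLocker report:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window; the real value should come from business
# need or a compliance requirement, as Nikkel suggests.
RETENTION = timedelta(days=365)

def expired_records(records, now=None):
    """Return records whose storage age exceeds the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["stored_at"] > RETENTION]

records = [
    {"id": 1, "stored_at": datetime(2020, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "stored_at": datetime.now(timezone.utc)},
]
stale = expired_records(records)
print([r["id"] for r in stale])  # the 2020-era record is past its time-to-live
```

In practice the same idea is usually enforced by the storage layer itself (for example, object-lifecycle or row-expiration policies), so records age out without a manual sweep.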
Law Floyd, director of cloud services at Telos, added that security pros should apply strict access controls to any database and ensure its open inbound ports are restricted to the absolute minimum needed. Floyd also advised creating strict written policies and ensuring personnel are properly educated on them.
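Floyd’s port-restriction advice amounts to an allowlist check: any inbound rule not on the minimum-needed list is a finding. A minimal Python sketch — the port numbers, CIDR ranges, and rule format are assumptions for illustration, not a real environment:

```python
# Hypothetical minimum-needed set: the database port, reachable only
# from the application subnet.
ALLOWED_INBOUND = {(5432, "10.0.1.0/24")}  # (port, source CIDR) pairs

def violating_rules(rules):
    """Return inbound rules that fall outside the approved allowlist."""
    return [r for r in rules if (r["port"], r["source"]) not in ALLOWED_INBOUND]

rules = [
    {"port": 5432, "source": "10.0.1.0/24"},  # app tier -> database: allowed
    {"port": 5432, "source": "0.0.0.0/0"},    # database open to the internet: flag
    {"port": 22,   "source": "0.0.0.0/0"},    # SSH from anywhere: flag
]
for r in violating_rules(rules):
    print(f"flag: port {r['port']} open to {r['source']}")
```

Running a check like this against exported firewall or security-group rules during the regular audits Nikkel describes would catch a database left open to the internet before an attacker does.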
“A quickly thrown together security plan is the first step in a failed security implementation,” Floyd said. “Take the time to properly analyze key vulnerabilities and create an in-depth security plan that mitigates these vulnerabilities, as well as strengthens the overall security of the environment.”