In the aftermath of the massive WannaCry ransomware attack last month, the American Civil Liberties Union (ACLU) chastised the U.S. government for its malware management policies, which it said put users around the world at risk.
“This happened in no small part because of U.S. government decisions that prioritized offensive capabilities — the ability to execute cyberattacks for intelligence purposes — over the security of the world's computer systems,” Daniel Kahn Gillmor, senior staff technologist, and Leigh Honeywell, technology fellow, both at the ACLU Speech, Privacy, and Technology Project, wrote in a blog post. “The decision to make offensive capabilities the priority is a mistake. And at a minimum, this decision is one that should be reached openly and democratically.”
The duo noted that while a bill to improve oversight, the Protecting our Ability to Counter Hacking (PATCH) Act, has been proposed, “oversight alone may not address the risks and perverse incentives created by the way they work.”
Acknowledging that all software has flaws, some of which are considered vulnerabilities, Gillmor and Honeywell said that the National Security Agency (NSA) had long known of the flaw in Microsoft Windows that WannaCry exploited and chose not to report it to Microsoft, but rather “developed or purchased an exploit, [known as EternalBlue], that could take advantage of the vulnerability.”
It “allowed the NSA to turn their knowledge of the vulnerability into access to others' systems,” the two wrote. “During the years that they had this weapon, the NSA most likely used it against people, organizations, systems, or networks that they considered legitimate targets, such as foreign governments or their agents, or systems those targets might have accessed.”
Each time the NSA used it, the agency “ran the risk of their target noticing their activity by capturing network traffic — allowing the target to potentially gain knowledge of an incredibly dangerous exploit and the unpatched vulnerability it relied on,” the blog explained.
The two compared the societal risks of stockpiling exploits to the balancing of “risk with biological weapons and public health programs.”
Like creating a vaccine for a disease, developing a patch for a software vulnerability takes time, and it then requires further “time and logistical work to deploy the patch once developed.”
And like a disease-generating micro-organism, a vulnerability can be weaponized. “A vaccinated (or ‘patched’) population isn't vulnerable to the bioweapon anymore,” Honeywell and Gillmor wrote.
“Our government agencies are supposed to protect us. They know these vulnerabilities are dangerous,” they said. “Do we want them to delay the creation of vaccine programs, just so they can have a stockpile of effective weapons to use in the future?”
“Someone exposed to a germ can culture it and produce more of it. Someone exposed to malware can make a copy, inspect it, modify it, and re-deploy it. Should we accept this kind of activity from agencies charged with public safety?” they asked. “Unfortunately, this question has not been publicly and fully debated by Congress, despite the fact that several government agencies stockpile exploits and use them against computers on the public network.”
The NSA, they said, “failed as stewards of America's — and the world's — cybersecurity, by failing to disclose the vulnerability to Microsoft to be fixed until after their fully weaponized exploit had fallen into unknown hands.” Honeywell and Gillmor contended that “policy around how these decisions are made should not be developed purely by the executive branch behind closed doors, insulated from public scrutiny and oversight.”
The PATCH Act would establish a Vulnerabilities Equities Review Board, composed of representatives from the NSA, the Department of Homeland Security (DHS), and a host of other government agencies, that would evaluate vulnerabilities and recommend whether they should be disclosed or remain secret.
“If the government plans to retain a cache of cyberweapons that may put the public at risk, ensuring that there is a permanent and more transparent deliberative process is certainly a step in the right direction,” the two wrote. But it is only one step: “The government must also take steps to ensure that any such process fully considers the duty to secure our globally shared communications infrastructure, has a strong presumption in favor of timely disclosure, and incentivizes developers to patch known vulnerabilities,” according to the blog post.