Security researchers are preparing for the new normal they will soon face in light of the cybersecurity legislation signed by President Obama last week as part of the omnibus spending deal. In recent years, security researchers have faced an escalation of threats, including legal injunctions and corporate censorship.
Now, researchers are bracing for the new challenge that an environment of automated information sharing would bring to an already challenging role. Sean Tierney, vice president of threat intelligence at IID, the cybersecurity company that runs ActiveTrust, a commercial cyberthreat data exchange, told SCMagazine.com that threat sharing is most successful when it becomes automated. “Human nature is that if it requires three extra steps to share threat information, people won't always do it,” he said.
“The machine to machine part of this legislation is essential,” he said. In that light, researchers are concerned that metadata identification would lead to researchers being falsely flagged as threats.
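As a purely hypothetical illustration of what machine-to-machine sharing involves (the field names and scrubbing rules below are invented for this sketch and do not come from the legislation, ActiveTrust, or any real exchange format), an automated pipeline might normalize an observed indicator and strip identifying fields before submitting it, with no human in the loop:

```python
import json
from datetime import datetime, timezone

# Hypothetical field names treated as personally identifying information;
# a real exchange would define its own schema and scrubbing policy.
PII_FIELDS = {"reporter_email", "internal_hostname", "employee_id"}

def build_indicator(observed_ip, context):
    """Normalize an observation into a shareable record, dropping
    context fields that look like personally identifying information."""
    shared_context = {k: v for k, v in context.items() if k not in PII_FIELDS}
    return {
        "type": "indicator",
        "value": observed_ip,
        "observed_at": datetime.now(timezone.utc).isoformat(),
        "context": shared_context,
    }

def serialize(indicator):
    # Machine-to-machine exchanges trade structured blobs like this one;
    # a real pipeline would then push it to the exchange's ingestion endpoint.
    return json.dumps(indicator, sort_keys=True)

record = build_indicator(
    "203.0.113.7",
    {"campaign": "credential-phishing", "reporter_email": "alice@example.com"},
)
print(serialize(record))  # the reporter_email field is scrubbed before sharing
```

The automation is exactly what worries researchers: a pipeline like this classifies and forwards activity without anyone judging whether the source was a scanner run by an attacker or by a researcher.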
Wesley Wineberg, a senior security research engineer at Synack, Inc., told SCMagazine.com he expects there will be many cases in which a security researcher's activity will look very similar to that of a malicious attacker. “If it's an automated system,” he said, “they can't know good versus evil, in most cases.”
These concerns come as industry professionals are already recoiling from targeted government surveillance. In October, DARPA set off a firestorm after the Electronic Frontier Foundation published a screenshot of a $500,000 contract awarded to Virginia-based Kudu Dynamics for the ICEWARD (“Internet Cyber Early Warning of Adversary Research and Development”) project, to gain “actionable insight into the intent of a cyber adversary.” The program description, which has since been scrubbed, mentioned that “vulnerability researchers make use of public information and resources (such as search engines and websites) that are relevant to their missions, targets, and techniques in such a way that it is possible to glean part of their intent.”
When SCMagazine.com contacted the agency in October, a DARPA representative responded by email that the goal was “not to ‘spy’ on security researchers.” He wrote, “But we do think it would be smart to see if it's possible to recognize online vulnerability-search behaviors that typically precede a hack.”
Meanwhile, researchers and the security firms they work for already face increasing threats of legal action from companies displeased with their discoveries or methods. Wesley Wineberg wrote in a blog post last week that Facebook CSO Alex Stamos called Synack CEO Jay Kaplan to threaten legal action over a disputed million-dollar Instagram bug that Wineberg discovered. (Stamos disputes this claim. He wrote in a Facebook post, “I did not threaten legal action against Synack or Wes nor did I ask for Wes to be fired.”)
Similarly, in September, FireEye filed an injunction against the German firm ERNW when it appeared that the security consulting company was about to reveal intellectual property in a vulnerability disclosure.
Of course, the work that security researchers engage in has never been particularly popular. However, companies are reacting with increasingly aggressive tactics.
“Security researchers do not have full immunity,” said Yehuda Lindell, chief scientist at Dyadic, in speaking with SCMagazine.com. “Yet, many companies threaten legal action to make it go away.”
Scenarios like these create an environment in which researchers do not always display their most diplomatic nature. While the bold methods of car hackers Charlie Miller and Chris Valasek and of independent researcher Samy Kamkar – who all share something of a ‘bad-boy’ reputation in the infosec community – attract unwanted attention, companies are definitely motivated to at least try to fix vulnerabilities that are announced in a public, and often sensational, manner.
Still, researchers who go full throttle against corporate targets often discover that it is an uphill battle, and some are taking heed. A troublesome pattern has emerged in which research firms announce vulnerabilities without naming the specific companies in which the vulnerabilities were found.
This situation is “very disturbing,” according to Lindell at Dyadic. “There is an information disbalance,” he said. “My assumption is that the attackers will always find the flaw, and by not disclosing this information, this is effectively saying, ‘This drug has some very serious side effects, but I'm not telling you what they are.'”
Unfortunately, developments like these may increase under the new cybersecurity legislation; the final text grants companies more liability relief in data sharing scenarios, such as exemption from antitrust laws. The legislation will also provide government entities – including non-federal organizations – with greater access to user information. U.S. agencies, meanwhile, have not demonstrated an ability to secure the private user information that they store. For instance, a survey published by Ponemon Institute and IID last month found that 47% of companies and government agencies had been breached in the last two years.
The new legislation requires government entities to make “reasonable efforts to protect the distribution of PII (personally identifying information) unless that information is relevant to the cybersecurity purpose,” although IID's Sean Tierney told SCMagazine.com, “I don't know who will define ‘reasonable efforts.’”
Morey Haber, vice president of technology at BeyondTrust, compared the program to SCAP (Security Content Automation Protocol), an exchange created for government agencies, military branches, and government contractors to share vulnerability data. “That data could not be sanitized back then,” he said. “Now we're expanding the data set outside of just vulnerabilities.”
Concerns over being targeted under the cyber legislation may cause researchers to rethink their online habits – and may affect career choices. “I think researchers will want to move to a system where they're not doing any research from their home IP address,” Wineberg said. He said that it is safer to only work through a system set up by a company that a researcher is working with.
“It adds a lot of steps and is a burden to everyone,” Wineberg added.