When Microsoft temporarily doubled its maximum bug bounty prize to $30,000 earlier this month, it was hard not to notice the timing. After all, the software giant had just been burned twice by Google Project Zero researchers who publicly disclosed Windows vulnerabilities before they could be patched.

This was not the first time Google played hardball with software developers who failed to patch flaws within the company’s strict 90-day disclosure window. So it would certainly be understandable if Microsoft’s newfound generosity was intended – at least in part – to incentivize ethical hackers to discreetly find future Windows vulnerabilities before Project Zero and like-minded research groups can, to say nothing of black-hat actors with truly dishonorable intentions.

But Microsoft is far from the only enterprise feeling the pressure. After all, bug bounties have not only been validated as an effective cybersecurity tool, but they are outright changing the way cybersecurity is practiced, as companies are now financially competing for white-hat hackers’ time and resources.

The stakes and rewards have been raised

Like Microsoft, Google also recently upped its bug bounty rewards – specifically for finding remote code execution flaws and unrestricted file system or database access issues. Josh Armour, Google’s security program manager, writes in a company security blog that the increase was an acknowledgement that “high-severity vulnerabilities have become harder to identify over the years” and “researchers have needed more time to find them.”

Even Apple, after stubbornly sitting out of the bug bounty game for years, went “all in” in August 2016, announcing that it would begin offering up to $200,000 in rewards.

“Recent trends have shown a rapid increase in bug bounty participation in the private and public sector alike,” says Johnathan Hunt, vice president of information security at InVision. “As organizations continue to realize a company can be breached even with the largest of security teams and budgets, a mature program and strong security posture, they will begin looking toward more non-traditional approaches to strengthening their security practice.”

A solution provider for product design, InVision has so far awarded over $100,000 in payouts to over 500 different researchers through bug bounty company Bugcrowd, rewarding hackers for finding weaknesses in third-party plugins, integrated SaaS tools, DNS, “and even the most obscure application functionality not picked up by traditional security tools,” Hunt says.

One of the most surprising recent bug bounty developments came when the U.S. Department of Defense approved its own bug bounty program, the first in the history of the federal government. Until then, the notion of a U.S. agency openly encouraging researchers to probe its networks would have sounded absurd.

“Initiatives like ‘Hack the Pentagon’ will happen more and more frequently,” says Belgian security researcher Inti De Ceukelaire. “It could be extra interesting for the government to offer payouts for vulnerabilities their political opponents could use against them.”

Another bellwether moment, one that augurs the near future of bug bounties, came in July 2016, when FCA US, the American subsidiary of Fiat Chrysler Automobiles, became the first full-line automaker to offer financial rewards for the discovery of vehicle vulnerabilities. (Tesla notably rewards hackers for their research as well.)

Indeed, connected cars and other Internet of Things (IoT) products will likely be the catalyst for a whole new slew of entries into the bug bounty landscape, as already evidenced by such companies as Fitbit. Hunt says that bug bounty scopes, in general, “will continue to evolve beyond traditional application vulnerabilities to all areas of this global system of network connectivity we call the internet, including IoT, IIoT, M2M and more.”

Gurkirat Singh, a software graphics engineer with an independent security research team, has a similar vision: “The majority of the bug bounty programs today are geared toward web security and it is going to stay that way at least for the next few years. But as we see more and more IoT devices come into the hands of the consumers, a stronger switch from website to mobile app/network security will be required and bounty programs will have to cater to those needs,” Singh told SC Media.

De Ceukelaire predicts that within a decade, IoT device manufacturers will begin to include bug bounty policies in their instruction manuals. “It’ll become a new standard,” he says. “Sure, there’ll always be exceptions, but having a bug bounty program will be considered a pro when it comes to product comparison.”

Renwei Ge, senior director, product security at Qualcomm, a semiconductor and telecom equipment company, agrees that having a bug bounty program in place could have a bottom-line impact on sales.

“Consumers are becoming more aware of the security of the devices they depend on every day, be it a smartphone, home router or connected home camera,” says Ge, whose company uses the HackerOne bug bounty platform to crowdsource its vulnerability testing. “In such a connected world with so many products to choose from, security is becoming one of the leading differentiators when it comes to the purchase decision.”

Bug bounties even appear to be impacting how business is conducted at its highest levels. “We’re talking to more and more CISOs who are reporting their bug bounty findings to the board,” Casey Ellis, founder and CEO of Bugcrowd, says. “It won’t be long before this is an expectation. And with cyber insurance seeing more momentum…it’s only a matter of time before risk assessment becomes mandatory for businesses at all levels: fundraising, during M&A, and prior to filing S-1. Bug bounties will be essential to providing this data.”

Still, not every industry is necessarily ready to lay out the welcome mat for hackers. For instance, says Singh, “Financial and health institutions don’t want researchers tinkering with their system because it is far more than a nuisance for them.”

Singh adds that internet-connected infrastructure has spread to such industries as transportation, coal, agriculture, health and manufacturing, “but you won’t see a bug bounty program for these any time soon due to product accessibility and risk factor.”

Therefore, Singh explains, “only the companies that are willing to take some risk, have their core focus on web security, and [whose] product is easily accessible by others will be the ones shaping bug bounty programs in the future.”

Optimizing your program

Organizations must take into account their own specific needs when developing their bug bounty policies in order to optimize resources and ensure the best possible results from their participants.

“In order to have a successful bug bounty program, companies will need to make sure they have the resources and ability to successfully validate, triage and fix issues within a reasonable timeframe,” says Vivek Raman, engineering manager, security, at Yelp, which worked with HackerOne to take its private bug bounty program public in September 2016. Six months later, the company reported that the relaunched program had resolved 52 bugs and paid out $17,200 in bounties, with an average disclosure response time of two days and an average resolution time of one month.

“Bug bounties, especially on launch, can create a significant workload for security and engineering teams, so being prepared for that is critical,” Raman says.

Efficiency, says Qualcomm’s Ge, is an especially key objective. “It is a prerequisite to have a well-oiled incident response process that is able to handle [a] large amount of incoming vulnerability reports,” says Ge. “An established secure development lifecycle within the organization is also needed to turn reactive work into proactive vulnerability prevention and early detection.”

Qualcomm launched its program in November 2016, not long after researchers discovered that Android devices containing Qualcomm chipsets were affected by a group of serious vulnerabilities known as QuadRooter. In its first three months, the company dished out around $80,000 in rewards.

Among the important decisions companies must make when offering bug bounties is the openness and inclusiveness of the program. Qualcomm decided to make its program invitation-only, though it has not set a cap on the number of researchers it can usher into the program.

Qualcomm chose a private program in order to filter the pool of researchers to include only those with specialized knowledge in embedded devices. Also, “we wanted to keep the noise low,” explains Ge. “Any public bug bounty program will have quite a large number of unrelated or out-of-scope submissions.”

These reports still require manual work to sift through, Ge says. “But by working with talented and trusted security researchers, we can minimize the noise and increase our own efficiencies.”

“Turning the program public too soon can cause an influx of reports you are not equipped to respond to swiftly,” Michiel Prins, co-founder of HackerOne, tells SC Media. “I recommend everyone considering a bug bounty program to start small in a private environment and slowly scale the program as you get a hold of things. This private stage allows you to gain experience in working with hackers, work out any unforeseen kinks in your internal processes, and keep the influx of bug reports at a volume you are equipped to handle.”

Starting with a controlled, private program also allows companies to test uncertain waters. “The idea of opening your system to hackers can be frightening,” says Bugcrowd’s Ellis. “A private program allows organizations to harness the power of a team of external security researchers without inviting the full crowd, allowing time to clear that trust hurdle and grow comfortable interacting with the researcher community.”

Some companies won’t even publicly acknowledge the existence of a bug bounty program. “This can be beneficial for a company that is just starting out and trying to refine its new program without ultimately staining its reputation,” says Singh.

Companies must also carefully consider and clearly communicate their policies pertaining to both program scope and rewards. “A lot of the companies eventually learn by experience that the majority of the bugs that are submitted to them are not potentially in their scope of interest,” says Singh. “Setting up strict guidelines over what kind of bugs are eligible for consideration is very important for both the company and the security researcher.” Singh once found himself in a dispute with Facebook over the severity of a vulnerability he discovered, and the resulting compensation he received.

As for rewards, “Finding bugs and figuring out a way to exploit them takes a lot of time and creativity, hence every individual wants to achieve the most utility out of their work,” adds Singh. “Being conservative and paying below standard compensation for a specific category of bug will not make researchers flock to help make your infrastructure better.”

GM, for instance, opened itself up to criticism last year when it created a bug bounty program that lacked any monetary awards.

De Ceukelaire advises companies to invest time in their policies and their researchers. “The rules need to be clear and not to be misunderstood,” he says. “Expect researchers to write a decent report. Set up a baseline. Ask researchers how they would tackle a certain bug and ask for more information if needed.”

“I personally like companies that establish an informal [or] friendly relationship with their hackers,” De Ceukelaire says, noting that some companies will go the extra mile, sending thank-you notes and gifts. “When I was in Vegas last summer, I got a Samsung Galaxy S7 out of the blue – a gift because my other phone broke the day before.”

Companies also face the question of whether their programs should be self-managed or outsourced to a bug bounty platform provider. InVision originally ran its program in-house, but quickly found itself overwhelmed.

“We were unprepared to manage the number of submissions, the validation, the communication with the researchers, and the ability to effectively assess and prioritize the findings,” says Hunt. “We talked about hiring additional security staff, but past experience has taught me urgent internal needs always take priority and eventually your bug program decays, which is worse because you’ve set expectations with your customers now not being met.”

Placing InVision’s bug bounty program in the hands of Bugcrowd, however, allowed the company to accept and process thousands of submissions without straining company resources.

In-house versus outsourced, public versus private, big scope versus small scope. Presumably, a great many more companies will be weighing these same options in the coming months and years. Because it’s starting to look like not having a bug bounty program to crowdsource your vulnerability testing may be the biggest vulnerability of all.