Facebook and other companies have turned to the research community for help finding flaws, reports Angela Moscaritolo.
Standing on stage at the Facebook F8 developer's conference in September, founder and CEO Mark Zuckerberg boasted that the social media site he invented in his Harvard dorm room back in 2004 – the same site which now has more than 800 million users – recently hit a milestone: Half a billion people used Facebook in a single day.
There is no denying that the behemoth that is Facebook has become ingrained in users' everyday lives. But even giants can fall. If members believe the information they post on Facebook is unsafe, they will move on – plain and simple.
This reality is not lost on those who work for the company. In fact, it's quite the opposite. Within the walls of Facebook's headquarters in Palo Alto exists a culture dedicated to providing users with a secure experience, says Joe Sullivan, the company's chief security officer.
“Trust is fundamental,” Sullivan says. “That's something we think about every day. There is never a situation where the company trades off security for something else. If there is a security issue, we drop everything and deal with it.”
One of the necessities in running a web presence used by hundreds of millions of people each day is ensuring its code is free of errors – security vulnerabilities – that could allow an attacker to gain access to private accounts. By any measure, such coding errors are extremely prevalent, not just in websites spanning the internet, but also in commercial computing products and custom-developed systems.
“Vulnerabilities are dangerous, and people outside of the [computer security] industry aren't aware of how many latent vulnerabilities there are in products they use every day,” says Dino Dai Zovi, an independent security consultant who started bug hunting to find such issues in 1999, and who has disclosed flaws in products made by Apple and Sun Microsystems (now owned by Oracle).
While Sullivan estimates that hundreds of employees across Facebook work on security issues, there are two primary groups dedicated to preventing, finding and fixing vulnerabilities. The platform integrity team, within the software engineering department, works to ensure that every single engineer in the company follows secure-coding practices. Then, the six-person product security team, which is part of the security department Sullivan manages, works to “poke holes” in the code that has been created, scouring it for vulnerabilities.
In addition to the internal bug finders, the company also calls on external auditors to review code for weaknesses before it is released online.
And, to ramp up its efforts to find holes that could be abused by attackers, Facebook recently followed the lead of several other major web companies – including Google and Mozilla – to launch a so-called “bug bounty” program. Such initiatives offer independent researchers monetary incentives for the private disclosure of vulnerabilities and exploits.
Since rolling out the program in July, Facebook has already doled out $70,000 to researchers around the world for the discreet disclosure of 72 vulnerabilities, all of which have since been fixed, Sullivan says.
“I think it is a good thing to have more people testing our site, and I believe that because we launched the program we have encouraged more people with expertise in security issues to help us,” he says.