With black hat brokers able to outbid even the likes of Google and Apple for vulnerabilities, Davey Winder explores the possibility that the bug bounty model is fundamentally flawed
Apple recently joined the growing number of corporates to launch a vulnerability reward program, better known as a bug bounty scheme. Initially limited to a couple of dozen researchers already known to Apple, it will pay as much as $200,000 for a critical security vulnerability, which sounds like a lot, until you learn that a small private firm called Exodus Intelligence offers as much as $500,000 for zero-day vulnerabilities in iOS.
While Exodus says its customers - who pay subscriptions starting from $200,000 per year to access intel on these vulnerabilities - are defensive rather than offensive, the security industry needs to consider whether bug bounty programs are a broken concept needing regulation.
Show Me the Money
Apple isn’t alone in offering financial reward to bug hunters who uncover vulnerabilities, although its maximum payout is at the upper end of the bounty program spectrum. The days of insulting researchers with t-shirts by way of reward (yes, Yahoo, we’re looking at you) are long gone, but what are the bounties out there, and how do they differ between large corporates and smaller businesses?
The US Department of Defense launched a pilot ‘Hack the Pentagon’ program through HackerOne earlier this year and paid out $70,000 in bounties to 58 researchers for 134 vulnerabilities. The first was found within only 15 minutes.
Facebook launched a bounty program in 2011 and has paid out $4.3m to 800 researchers in 127 countries, with $1m of that being paid in 2015 alone.
Microsoft offers a top bounty of $100,000, while Uber has a maximum payout of $10,000. Pornhub sits in the middle with a $25,000 cap, and HackerOne itself will pay $10,000 for severe vulnerabilities found.
As Ken Munro, a partner at ethical hacking outfit Pen Test Partners, told us: “Details of bug bounty programs are readily available on sites such as HackerOne and BugCrowd, and since launching its bug bounty program in 2010, Google has paid over $6m to security researchers. Rather than rest on its laurels, Google recently doubled the reward for its Chromebook from $50,000 to $100,000, with no maximum reward pool, to further incentivize researchers after realizing that no exploits had been found.”
The key is that companies offering bounty rewards generally do so based on the size of their business, so smaller companies will have lower caps. Unfortunately, zero-day brokers selling to the highest bidder will typically pay more than even the largest company can justify, especially if the bug has a very high Common Vulnerability Scoring System (CVSS) score. The CVSS is the standard for assessing security vulnerability severity, calculated on a formula including metrics to approximate ease and impact of exploit. “Those with a very high CVSS score that impact a large number of devices, or have specific use-cases of interest to nation states, typically start at $100,000, and could be sold for many times that,” says Neil Cook, chief security architect at Open-Xchange.
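To make the scoring concrete, the CVSS v3.0 base score can be sketched in a few lines of Python. This is a simplified illustration covering only the scope-unchanged case, with metric weights taken from the published specification; the function name and structure are our own, not an official API.

```python
import math

# Metric weights from the CVSS v3.0 specification (scope-unchanged case only).
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.20}   # Attack Vector
AC = {"L": 0.77, "H": 0.44}                          # Attack Complexity
PR = {"N": 0.85, "L": 0.62, "H": 0.27}               # Privileges Required
UI = {"N": 0.85, "R": 0.62}                          # User Interaction
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}               # C/I/A impact weights

def roundup(x):
    """CVSS-style 'round up to one decimal place'."""
    return math.ceil(x * 10) / 10

def cvss3_base_score(av, ac, pr, ui, c, i, a):
    # Impact sub-score: how badly confidentiality, integrity and
    # availability are affected by the vulnerability.
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * iss
    # Exploitability sub-score: how easy the bug is to trigger.
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    return roundup(min(impact + exploitability, 10))

# A remotely exploitable, low-complexity bug with high impact across the
# board scores 9.8 ("Critical") -- exactly the kind of vulnerability that
# commands the highest broker prices.
print(cvss3_base_score("N", "L", "N", "N", "H", "H", "H"))  # 9.8
```

The ease-of-exploit metrics (vector, complexity, privileges, interaction) and the impact metrics combine into a 0-10 score, which is why a remote, no-privileges, high-impact bug lands near the top of the scale.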
However, these kinds of vulnerabilities are pretty rare. “Individuals working entirely on the dark side might spend a very long time before they find one,” Cook adds. “It’s not the gold rush of popular imagination.” Even so, given that such brokers can typically outbid the likes of Apple and Google, we have to ask if the bug bounty concept as it exists is therefore flawed.
A Broken Concept?
David Gibson, VP of strategy & market development at Varonis, certainly doesn’t think bug bounty programs are pointless. “Many SMEs don’t have the staff required to search for vulnerabilities quickly enough, or with the necessary breadth of techniques,” he explains. “It’s effective to crowdsource instead of, or in addition to, hiring a firm or penetration tester, who typically charge a flat fee rather than you paying for what they find.”
Cody Mercer, senior threat research analyst at NSFOCUS (International Business Division), told Infosecurity about his experience of running a bug bounty program at a previous manufacturer. “We had a few findings discovered by researchers that led to a major flaw found in one of our product lines,” he said. “Had this vulnerability gone unnoticed, it could have resulted in an APT with dire consequences if left unchecked. The ROI far exceeded the budget needed to run this program, so it proved to be invaluable.” Also, as Gavin Millard, EMEA technical director with Tenable Network Security, reminds us: “By motivating researchers and developers through bug bounty programs, more issues will be discovered and dealt with, leading to a reduced attack surface.”
Not everyone agrees that monetary-based bounty programs have a positive effect on responsible vulnerability disclosure though. “It sets the bar for the lowest available price for vulnerabilities and legitimizes the further selling of such vulnerabilities to other third parties that offer higher compensation,” argues Amichai Shulman, CTO at Imperva. While not agreeing that the concept is broken, Thomas Richards, senior security consultant at Cigital, said zero-day brokers existed long before the current programs offered by companies, and as such they have “a long uphill battle to compete against the zero-day broker market.”
Of course, whether brokers are a bad thing per se depends on your definition of a broker in the first place. Just because someone sits between bounty hunter and impacted vendor, party to a trade, that does not make them a bad actor. In fact it could be the very opposite, ensuring that both parties get a fair deal. Which raises a further question: does the bounty hunting industry need an official regulator?
Regulation Matters
Neil Cook looks to the likes of HackerOne, a conduit between researchers and companies, as an argument for voluntary rather than compulsory regulation. “As part of the rules of the scheme, companies publish vulnerabilities found through a process of responsible disclosure; it’s an open, transparent system.” Individual companies determine the bounty rates, not HackerOne, and researchers are well aware of the rules before choosing to sign up.
What we do need, perhaps, is a change in the way disclosure happens at a national level. In the US, where disclosure affecting customer data is mandatory in 47 out of 50 states, there is much less incentive to brush an incident under the carpet. “Disclosure practices there have fuelled bug bounty programs, which is why so many American companies have them,” argues Munro. “Foster a more open culture here in the UK, which hopefully will happen when EU GDPR or its post-Brexit equivalent comes into force in 2018, and we should see bug bounty programs receive far wider acceptance as a consequence.”
Will these need to be adjudicated? “As a commercial undertaking,” he adds, “this type of practice is best self-regulated by the industry.”
Interview: Marten Mickos, HackerOne CEO
Created by former security leaders from Facebook, Google and Microsoft, HackerOne was the first vulnerability coordination and bug bounty platform. Infosecurity spoke to CEO Marten Mickos to find out more.
Infosecurity: How much have HackerOne customers paid out?
Marten Mickos: HackerOne customers have rewarded over $10,000,000 for 30,000 resolved issues. In the first 100 days, Uber’s public bug bounty program has awarded $345,120.48. Google spent $1.5m on Chrome and Google bug bounties in 2014, and Facebook has paid more than $4m since launching in 2011. Companies on HackerOne are offering bounty rewards as high as $50,000 for a severe issue. The most active customers on HackerOne’s platform spend about $1m per year in bounties. Small customers get by with as little as $50,000 per year, and some programs run purely as vulnerability coordination programs where no bounties are paid for any type of submission.
Infosecurity: So are the motivations of the bug bounty hunters purely financial?
MM: Bug bounty programs are competitive but do not aim to compete with the black market directly. The vast majority of hackers participate for recognition and purpose, as much as for financial reasons. HackerOne just released the ‘2016 Bug Bounty Hacker Report’ which found that while 72% reported they hack for money, 70% said they hack for fun, 66% reported hacking to be challenged, 64% hack to advance their career and 51% reported hacking to do good in the world. Money is a key driver in bug bounty programs but it is not everything and this helps explain why 57% of bug bounty hackers reported participating in programs that do not offer bounty rewards.
Infosecurity: What about regulation of the market?
MM: The great thing with bug bounty programs is that it is a voluntary market. Companies set whatever bounties they like and hackers participate in whatever programs they like. The bug bounty market is a self-regulating system. It is in the interest of companies to maintain the trust and loyalty of every hacker so they will continue to find vulnerabilities. It is in the hacker’s best interest to maintain trust and loyalty with companies so their reputation grows, qualifying them for future bug bounty invitations. When hackers submit valid reports and companies pay for them, we are all winning. Any process or regulation that would slow down vulnerability disclosure and resolution has the potential to put all of us at risk.