In an effort to prevent Cambridge Analytica-type scandals from happening again, Facebook has launched a new initiative: a data abuse bounty. Internet sleuths who discover a company abusing its access to Facebook data could net themselves a payday of up to $40,000.
Facebook outlined its data abuse bounty program on a new page on its website Tuesday, coincidentally the same day its CEO Mark Zuckerberg was testifying before Congress over how the social media network failed to protect the data of millions of users.
For a data abuse situation to qualify for a monetary reward, the issue must involve at least 10,000 Facebook users. It must also be a “definitive abuse of data”—not just data aggregation. The reward only applies to cases that Facebook itself is unaware of and not yet actively investigating.
Facebook notes that some scenarios are not covered by the bounty, including data scraping, malware (or otherwise tricking users into installing malicious apps), scenarios that rely heavily on social engineering, and cases that occur on Facebook’s other properties, such as WhatsApp or Instagram, rather than on Facebook itself.
One more stipulation that’s common to bounty programs is that you give Facebook time to investigate the issue itself before revealing your findings publicly.
If the situation meets all these criteria (and then some), whistleblowers could get anywhere from $500 to $40,000.
Such bounty programs aren’t unusual. Google has a well-known bug bounty program for those who find bugs or malware in its apps and services, including third-party Google Play apps. In 2015, United Airlines began a bug bounty program to ensure its customer and company data was secure. And in 2016, Instagram paid a 10-year-old $10,000 for spotting a particularly nasty bug in its app that allowed users to delete other people’s comments.
Facebook’s data abuse bounty program comes a little late for those affected by the Cambridge Analytica scandal—or perhaps just in time, if congressional representatives ask how Facebook plans to mitigate the threat of data abuse in the future.
H/T Mashable