Researchers at Facebook are using artificial intelligence to “de-identify” individuals in videos by slightly altering their appearance.
The technique, which can be used to thwart facial recognition systems, was outlined on Sunday in a paper from the Facebook AI Research team.
“Face recognition can lead to loss of privacy and face replacement technology may be misused to create misleading videos,” the paper says. “Recent world events concerning the advances in, and abuse of face recognition technology invoke the need to understand methods that successfully deal with de-identification.”
The machine learning system works not only on pre-recorded footage but also on live video, producing a "quality that far surpasses the literature methods."
The alterations are meant to be just subtle enough that humans can still recognize the subject while facial recognition systems cannot.
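The paper's own system is more involved, but the general idea of de-identifying a face without destroying its appearance can be illustrated with a much simpler, unrelated technique: perturbing the image so that a face-recognition model's embedding of it drifts away from the original identity, while keeping the pixel changes tiny. The sketch below is purely illustrative, not Facebook's method; the encoder is a toy, untrained stand-in for a real face-recognition model, and names such as ToyFaceEncoder, deidentify, and eps are hypothetical.

```python
# Illustrative sketch only (not the paper's method): nudge a face image so that
# a face-recognition encoder no longer matches it to the original identity,
# while limiting every pixel change so a human still recognizes the person.
import torch
import torch.nn as nn


class ToyFaceEncoder(nn.Module):
    """Toy, untrained stand-in for a real face-recognition encoder."""

    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return nn.functional.normalize(self.fc(h), dim=1)


def deidentify(image: torch.Tensor, encoder: nn.Module,
               steps: int = 50, eps: float = 0.03, lr: float = 0.005) -> torch.Tensor:
    """Return a copy of `image` whose embedding is far from the original identity,
    with each pixel changed by at most `eps` so the face still looks the same."""
    original_embedding = encoder(image).detach()
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        perturbed = (image + delta).clamp(0.0, 1.0)
        # Minimizing cosine similarity pushes the perturbed face away from
        # the original identity in the encoder's embedding space.
        loss = nn.functional.cosine_similarity(
            encoder(perturbed), original_embedding).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        # Keep the perturbation small so a human still recognizes the subject.
        with torch.no_grad():
            delta.clamp_(-eps, eps)
    return (image + delta).detach().clamp(0.0, 1.0)


if __name__ == "__main__":
    encoder = ToyFaceEncoder().eval()
    face = torch.rand(1, 3, 112, 112)  # placeholder for an aligned face crop
    protected = deidentify(face, encoder)
    sim = nn.functional.cosine_similarity(encoder(face), encoder(protected)).item()
    print(f"cosine similarity to original identity after perturbation: {sim:.3f}")
```

In a real setting the toy encoder would be replaced by a trained face-recognition model, and Facebook's approach differs in that it is designed to run on live video rather than optimizing each frame individually.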
Speaking with VentureBeat, Facebook AI Research engineer Lior Wolf added that the technique could be used to alter other identifying characteristics such as an individual’s voice.
Facebook’s research is set to be presented at the International Conference on Computer Vision (ICCV) in South Korea next week. Facebook, however, currently has no plans to deploy the system in any of its apps.
The de-identification research is somewhat ironic given that Facebook is currently facing a class-action lawsuit over its use of facial recognition on users without their consent.
READ MORE:
- Amazon’s facial recognition misidentified Boston athletes as criminals
- U.S. blacklists Chinese facial recognition startups for human rights violations
- Google used BET Awards, homeless communities to diversify facial recognition
H/T VentureBeat