Updated: Jul 19, 2021
Intelligence agencies have long used facial recognition software to identify people. The danger now is that this power is becoming available to everyone.
A company called Clearview AI has provided US law enforcement agencies with a facial recognition tool that matches photos of people against images scraped from the internet to uncover their identity. The company claims a database of 3 billion images sourced from social media, news sites and other websites.
The war against facial recognition software
It’s not all gloom and doom, though. There is a way to fight this new wave of AI-trained facial recognition software. Emily Wenger, a computer science student at the University of Chicago, co-developed Fawkes, a tool that helps protect you in a digital world. It works by subtly shifting the pixels in your photos to confuse AI-based facial recognition systems.
According to Fawkes’ creators, “At a high level, Fawkes ‘poisons’ models that try to learn what you look like, by putting hidden changes into your photos, and using them as Trojan horses to deliver that poison to any facial recognition models of you. Fawkes takes your personal images and makes tiny, pixel-level changes that are invisible to the human eye, in a process we call image cloaking. You can then use these ‘cloaked’ photos as you normally would, sharing them on social media, sending them to friends, printing them or displaying them on digital devices, the same way you would any other photo. The difference, however, is that if and when someone tries to use these photos to build a facial recognition model, ‘cloaked’ images will teach the model a highly distorted version of what makes you look like you.”
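The mechanism the quote describes can be sketched with a toy example: a pixel perturbation bounded by a small budget (so it stays invisible) that pushes the image's feature representation toward a different "decoy" identity. Everything below is illustrative and assumed, not Fawkes' actual code — a random linear map stands in for the deep feature extractors Fawkes really optimizes against, and the single FGSM-style step stands in for its iterative optimization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a face-recognition feature extractor. Fawkes targets deep
# networks; a random linear map is enough to illustrate the mechanics.
W = rng.normal(size=(8, 64))       # maps a tiny 8x8 "image" to an 8-dim feature vector

def extract(img):
    return W @ img.ravel()

def cloak(image, decoy_features, epsilon=0.03):
    """One FGSM-style step: change every pixel by at most epsilon so the
    image's features drift toward a decoy identity's features."""
    # Descent direction on the distance between our features and the decoy's.
    gradient = W.T @ (decoy_features - extract(image))
    delta = epsilon * np.sign(gradient).reshape(image.shape)
    return np.clip(image + delta, 0.0, 1.0)

image = rng.uniform(0.2, 0.8, size=(8, 8))                     # photo being protected
decoy_features = extract(rng.uniform(0.2, 0.8, size=(8, 8)))   # a different "face"

cloaked = cloak(image, decoy_features)
pixel_change = float(np.abs(cloaked - image).max())   # stays within the epsilon budget
dist_before = float(np.linalg.norm(extract(image) - decoy_features))
dist_after = float(np.linalg.norm(extract(cloaked) - decoy_features))
```

After the step, `pixel_change` never exceeds the epsilon budget, yet the image's features have moved measurably closer to the decoy's — which is exactly the "teach the model a distorted version of you" effect the creators describe.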
Fawkes vs Clearview
So far, Fawkes has been downloaded about half a million times. There is also hope that a social media giant like Facebook could adopt it and cloak the images its users upload. In an interview with The Verge, Ben Zhao (a computer science professor who contributed to the tool) said, “Adoption by larger platforms, e.g. Facebook or others, could in time have a crippling effect on Clearview by basically making (their technology) so ineffective that it will no longer be useful or financially viable as a service.”
Clearview’s CEO Hoan Ton-That, for his part, had no kind words for the creators of Fawkes. “There are billions of unmodified photos on the internet, all on different domain names. In practice, it’s almost certainly too late to perfect a technology like Fawkes and deploy it at scale,” he told The New York Times.
When Microsoft Azure saw through Fawkes
According to a report in MIT Technology Review, Microsoft Azure’s facial recognition service at one point became robust to Fawkes’ cloaking. Perhaps so many people had started using Fawkes that Azure’s models learned to see through the cloaked images.
Fawkes’ team got to work and released a version that worked against Microsoft Azure too. Like much else on the internet, it is a cat-and-mouse game. “There’s always going to be a disconnect between what is legally acceptable and what people actually want. Tools like Fawkes fill that gap,” Wenger told MIT Technology Review.
LowKey – another tool that protects you
There’s another tool, called LowKey, that does something similar but works differently. It was built by a team of mathematics and computer science researchers at the University of Maryland and the US Naval Academy.
In their paper, Valeria Cherepanova and six others wrote, “We design a black-box adversarial attack on facial recognition models. Our algorithm moves the feature space representations of gallery faces so that they do not match corresponding probe images while preserving image quality. We interrogate the performance of our method on commercial black-box APIs, including Amazon Rekognition and Microsoft Azure Face, whose inner workings are not publicly known.”
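The gallery/probe setup the paper refers to can be sketched in a few lines: a recognition system enrolls "gallery" faces as feature vectors and matches a new "probe" photo to the nearest one, and the attack moves a gallery face's representation so the match breaks. All the names and numbers below are invented for illustration — a real system derives these embeddings from a deep network, and LowKey finds small, quality-preserving changes in image space rather than editing embeddings directly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy gallery: 5 enrolled identities with (hypothetical) unit feature vectors.
# A real system would get these embeddings from a deep face-recognition network.
gallery = np.eye(5, 16)
probe = gallery[0] + 0.01 * rng.normal(size=16)   # a new photo of identity 0

def identify(probe, gallery):
    """Match a probe to the nearest enrolled face by cosine similarity."""
    sims = gallery @ probe / (np.linalg.norm(gallery, axis=1) * np.linalg.norm(probe))
    return int(np.argmax(sims))

match_before = identify(probe, gallery)   # correctly retrieves identity 0

# LowKey's idea, sketched at the embedding level: perturb the *gallery* image so
# its feature representation no longer sits near probes of the same person.
# (Flipping the embedding here is an extreme stand-in for that shift.)
poisoned = gallery.copy()
poisoned[0] = -probe

match_after = identify(probe, poisoned)   # identity 0 is no longer retrieved
```

Before the attack the probe matches its own gallery entry; afterwards the nearest match is some other identity, which is the failure mode the authors demonstrate against commercial APIs like Amazon Rekognition and Azure Face.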
The Urruda take
The very thought of an AI system scanning our personal photos for information scares us. We believe there should be more awareness of software like Fawkes, which can cloak our images from prying algorithms.
When we post an image on social media, we want our friends and relatives to see it, not an AI-based system to scan it. In this digital world, we are losing something very precious to us – privacy. Even though there are tricks to keep things hidden, most people don’t know about them and give away their data indiscriminately. The internet’s aim may have been to disseminate information easily, but it has also shortened our attention spans, dissuaded us from physical activity and restricted our lives to a screen.
1. How to stop AI from recognizing your face in selfies (MIT Technology Review)
2. Clearview AI
3. This Tool Could Protect Your Photos From Facial Recognition (The New York Times)