Intelligence agencies have long used facial recognition technology (FRT). Now, though, this capability is being extended to Indian Railways, the Airports Authority of India and public-sector utilities. Offices can even use it to track employee attendance, and colleges can use it to monitor student attendance.
FRT works by deriving a mathematical signature from your face’s geometry. The distance from your forehead to your chin, the depth of your eye sockets, the length of your nose and various other measurements are combined to create a unique faceprint. Think of it as a digital version of your thumbprint.
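To make the idea concrete, here is a minimal sketch of how geometric measurements could be turned into a comparable faceprint. The measurement names and values are illustrative assumptions, not a real FRT schema; production systems use learned embeddings rather than a handful of hand-picked distances.

```python
import math

def faceprint(measurements):
    """Normalise raw geometric measurements (e.g. in mm) to a unit-length
    vector so faceprints are comparable across image scales.
    The choice and number of measurements here are purely illustrative."""
    norm = math.sqrt(sum(m * m for m in measurements))
    return [m / norm for m in measurements]

def distance(a, b):
    """Euclidean distance between two faceprints; smaller means more alike."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical measurements: forehead-to-chin, eye-socket depth, nose length, eye spacing
alice = faceprint([121.0, 28.5, 52.0, 63.2])
alice_again = faceprint([122.0, 28.0, 52.5, 63.0])  # same person, new photo
bob = faceprint([110.0, 31.0, 48.0, 70.0])          # a different person
```

Because the vectors are normalised, two photos of the same person land close together even if one image is larger than the other, while different people stay further apart.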
The technology helps law enforcement agencies find wanted criminals quickly. In an interview with The Indian Express, Andrei Telenkov (CEO of NtechLab) said, “Our video analytics technology employs high-precision, real-time face recognition mode, in the video stream. Images are compared with a database of wanted individuals. If there is a match, it notifies law enforcement immediately. The entire process, from the appearance of the person in front of a camera to law enforcement receiving a signal, takes less than three seconds. This enables a fast response to situations as they develop.”
NtechLab is a Russian company commissioned by Western Railways to supply 470 video cameras, which will be certified by the Research Designs and Standards Organisation (RDSO). Thirty stations are already using them.
The technology can also be used to track missing people.
Without a doubt, FRT has several advantages, but there is a caveat: as the technology becomes more widely available, it becomes easier to misuse. Currently, there is no law specific to FRT; it falls under the general purview of the Information Technology Act, 2000. According to experts, consent is not required to use the technology, which is a major concern. Ideally, there should be a law governing the data collected and stored by FRT systems, with penalties for police officers who misuse it.
Facial recognition technology is changing the game in several sectors, and recruitment is one of them. Many companies now use AI to analyse prospective employees in job interviews. One such system, provided by HireVue, uses facial and linguistic information collected from previous interviews with successful employees to help decide whether a candidate should be hired.
Critics say that such algorithms encode biases and prejudices, and could weed out candidates who are nervous in an interview but would be great at the job.
A little while ago, we published an article on how you can stop AI from reading your face online. All you have to do is download a tool called Fawkes, which cloaks your images: your photos look the same, but AI algorithms can no longer analyse them effectively. “Fawkes takes your personal images and makes tiny, pixel-level changes that are invisible to the human eye, in a process we call image cloaking. You can then use these ‘cloaked’ photos as you normally would, sharing them on social media, sending them to friends, printing them or displaying them on digital devices, the same way you would any other photo,” the tool’s creators, researchers at the University of Chicago, wrote in their paper.
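As a toy illustration of what a “tiny, pixel-level change” looks like, the sketch below nudges each pixel value by a few units out of 255. This is NOT the Fawkes algorithm, which carefully optimises its perturbation against a face-recognition feature extractor rather than using random noise; the point is only that changes this small are invisible to the eye while still altering the numbers a model sees.

```python
import random

def cloak(pixels, budget=3):
    """Toy pixel-level perturbation (not the real Fawkes method):
    shift each 0-255 channel value by at most `budget`, clamping to the
    valid range. `budget` is an assumed bound on the perturbation size."""
    rng = random.Random(0)  # deterministic for the example
    return [max(0, min(255, p + rng.randint(-budget, budget))) for p in pixels]

image = [120, 121, 119, 200, 198, 40]  # a handful of greyscale pixel values
cloaked = cloak(image)
```

A bound of 3 out of 255 is roughly a 1% brightness change per pixel, well below what a human would notice in a photo, yet every perturbed pixel feeds into the model’s feature vector.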
Is it going to be the new normal?
The FRT movement has its share of supporters too. John Mears, Vice President at Leidos, argues that privacy is not the same as anonymity: unless you have committed a crime, cameras and facial recognition systems do not really invade your privacy, that is, your right to be left alone.
He added that when Eastman Kodak launched the Brownie camera, some people feared it would invade people’s privacy, and it was banned in many places. With time, the technology evolved to the point that every phone now carries a high-quality camera.
“Technology evolves, laws and policy follow, and society adapts,” he said in an interview published on Leidos’ official website.
At Urruda, we are all for the installation of cameras in public places to deter crime. That said, sharing this data with organisations that have nothing to do with law enforcement could prove risky. A law governing this technology should be passed as soon as possible to prevent misuse.