Safety over Privacy? Lucknow Police Plan to “Track” Women in Distress with AI Cameras

Illustration: Robin Chakraborty

Women’s safety is a major concern in India. Crimes against women, already high in volume, have risen further over the course of the coronavirus pandemic. Tragic stories of sexual assault or murder, whether in print, on TV, or on social media, are an almost daily occurrence. The pressure on administrators has mounted with this sharp increase in crime, and with it, some questionable ideas have been floated to tackle the issue.

The Lucknow Police is set to equip public places with artificial intelligence (AI) enabled cameras that will click pictures of women in distress, based on their facial expressions, and alert the nearest police station. Lucknow Police Commissioner DK Thakur stated that the force has identified 200 hotspots where the movement of girls is highest and from where most complaints are received. “We will set up five AI-based cameras which will be capable of sending an alert to the nearest police station. These cameras will become active as soon as the expressions of a woman in distress change. Before she takes out the phone and dials 100 or UP 112 for help, an alert will reach the police,” he said.

The decision has set off alarm bells among experts on multiple levels. Firstly, concerns have been raised about privacy, and about the tracking and surveillance of women through tech tools whose data can be easily misused. There are multiple global cases in which data captured by companies or agencies was compromised, and such data can be put to all manner of devious uses. Furthermore, people have questioned the wisdom of photographing and tracking women, the potential victims of crime, rather than the men who might commit it. When the system itself has consistently displayed insensitivity in dealing with crimes against women, the potential for technology of this kind to be misused for harassment or quid pro quo is frightening.

AI scientists have pointed out that facial expressions reveal nothing about a person’s internal mental state, and that machine-learning systems which depend on facial data violate constitutional rights. Subject matter experts argue that the idea of using AI cameras simply does not work at any practical scale. It is just one in a series of bad ideas recently advanced to tackle the problem.

A couple of weeks ago, Madhya Pradesh Chief Minister Shivraj Singh Chauhan advocated that women leaving their homes for work register at a police station, so that law enforcement and the state could track them to keep them safe. Female users on social media were horrified by the suggestion, and many stated outright that they would not register for anything of the sort. Not only does such a scheme invade privacy; why must the state track working women instead of focusing on the men who commit these heinous crimes? A measure directed at women that they do not embrace, and instead feel threatened by, is a bad idea.

India has a long way to go when it comes to making its public and private spaces safe for women. While it is necessary to devise policies that make our streets safer, it is equally important that the primary stakeholders, women themselves, feel safe and are an integral part of the process. A top-down approach that alienates the supposed beneficiaries is bound to fail. And this is a battle we simply cannot afford to lose.