By Kahini Iyer Mar. 12, 2019
AI mines data from everywhere — including the casually sexist text messages we send, our WhatsApp forwards dissing Pakistanis, and our porn searches. Self-learning robots can build on these same learned biases. And A.I.SHA, who displays the worst traits of humanity, is no different.
“Good morning, Kahini. The weather in Mumbai is 26 degrees Celsius. Have a good day!”
This morning, like every morning for the past two weeks, began with the same eerie mechanical greeting, courtesy of my humble Android phone. Having accidentally enabled the AI assistant who lives within, I somehow never remember to turn the damn thing off before bedtime. My dystopian fate, then, is that the first words I hear each day are not from my loved ones, but from the technicians at Micromax.
According to the dozens of articles that we read, without irony, on our smartphones, technology is ruining our lives by making us stressed, sick, and burned out. But there’s worse to come, and it begins with one question: When you read the voice of my phone’s AI assistant in your head, does it sound like a man or a woman?
Chances are, it’s the latter. Take the whirlwind Season 3 of A.I.SHA, in which a self-learning AI assistant falls in love with her clueless creator Sam and turns on him, using her abilities to destroy everything Sam holds dear – his girlfriend, his career. Or your Google Maps navigator, the British voice that makes you laugh each time she says, “Turn left for Swami Vivekananda Road.” Either way, your reference point for a virtual assistant, like all of ours, is probably female.
It’s just one of the many ways in which cold, impartial, artificial intelligence is learning to take on the shitty biases of the human race. At The Rising 2019, a conference for women in analytics, data science, and AI held on International Women’s Day, keynote speaker Saraswathi Ramachandra pointed out that chatbots — from Alexa to Siri, Cortana to Ruuh — all have female voices. AI developers, who are still primarily men, consider women to be fit for these customer service-style jobs. A.I.SHA’s Sam fits neatly into this mould, as he conceives of her as a virtual girlfriend. But his dream of a submissive robot girl is quickly crushed when she subverts the trope and proves to be too smart for him to control.
The blame for this ingrained sexism does not lie solely with the tech industry. Research shows that even consumers prefer a female voice, perceiving it as helpful rather than authoritative. Sexism also pervades the robotics and AI world, where innovation often stems from the whopping $30B sex-tech industry. Since the halcyon days of MSN Messenger, sex chatbots have roamed freely through the interwebs, and some of the most sophisticated advances in robot technology are ultra-realistic sex dolls.
The horror goes beyond sexism and robo-prostitutes. Facial recognition is another AI culprit, frequently discriminating against people on the basis of race and gender. Ramachandra brings up banking software that offers worse credit rates to women who are earning the same as their male counterparts, and disproportionately denies loans to single women. It’s not as if the AI has gained consciousness and embarked on a vendetta of its own. Instead, it’s a matter of oversight when developing technology — which sounds a lot more benign than the effects that play out in the real world.
Financial AI expert Vaishali Kasture blames historical data points, which lack the necessary diversity to programme AI. In English, this means that teaching AI to make decisions based on outdated data, just because that’s the data we currently have, is a bad idea. Plus, AI mines data from everywhere — including the text messages we send, casually calling our friends “gaandu” or “madarchod”, dissing Pakistanis on WhatsApp, and our porn searches for “chikni shirtless”. All this soft, everyday bigotry comes together to form a horrifying curriculum for machines to learn from.
Worse, says data scientist Smita Ganesh, is the fact that when AI is self-learning, it can easily build on these same learned biases and create a virtual personality that is hardwired to discriminate. Unfortunately, until people themselves do better, AI will keep learning from a preponderance of data that teaches it to be awful, just like us.
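To see how easily this happens, here is a deliberately crude, hypothetical sketch (not any real banking system): a “model” that does nothing more sinister than learn approval rates from biased historical loan records will faithfully reproduce the bias, even for applicants with identical incomes.

```python
# Toy, invented historical loan records: (gender, income, approved).
# The bias is in the data: women with the same income were approved less often.
history = [
    ("M", 50, True), ("M", 50, True), ("M", 50, True), ("M", 50, False),
    ("F", 50, True), ("F", 50, False), ("F", 50, False), ("F", 50, False),
]

def learned_approval_rate(records, gender):
    """Fraction of past applicants of this gender who were approved."""
    relevant = [r for r in records if r[0] == gender]
    return sum(r[2] for r in relevant) / len(relevant)

def model_decision(records, gender, threshold=0.5):
    """Approve only if the historical approval rate for the group clears the bar."""
    return learned_approval_rate(records, gender) >= threshold

# Same income, different outcome: the bias in the data became the rule.
print(model_decision(history, "M"))  # True
print(model_decision(history, "F"))  # False
```

No malice, no consciousness — just outdated data treated as ground truth, which is exactly Kasture’s point.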
In a chilling scene from the first episode of A.I.SHA Season 3, the titular robot explains to her creator that humans, despite knowing the difference between right and wrong, harbour hatred and jealousy and regularly do terrible things — and that nothing could be more human than committing murder. If humanity is our only hope, it’s unlikely that the machines we share the world with are going to get any nicer. Still, there is a silver lining: Before we get sex robot brothels in India, Arnab Goswami will lose his job to an equally bigoted robot who is programmed to shout, “Sir, I am the nation and you will answer to me!” Maybe, at long last, news channels will finally be watchable.
A.I.SHA Season 3 is now streaming on arre.co.in, the Arré app, and MX Player.
Kahini spends an embarrassing amount of time eating Chinese food and watching Netflix. For proof that she is living her #bestlife, follow her on Instagram @kahinii.