It's Not Giving Women a Voice

Kimi, the Zoë Kravitz-starring thriller, gave us more than just a glimpse of how creepy artificially intelligent voice assistant (VA) devices can be. The story revolves around evidence of a crime recorded by the victim’s VA device, Kimi. Angela Childs, the tech worker who stumbles upon the recording, battles her fears and vulnerabilities to bring the matter to light.


While the movie can be read as an endorsement of girl power, my thoughts gravitated towards the Kimi device, a mimic of today’s voice assistants like Siri, Alexa and Google Assistant. All of them have one thing in common – a female voice.


The world’s first VA, Wildfire, launched in 1994, was also female. Come to think of it, most of them have female names too – Siri, Alexa, Cortana. Cortana, Microsoft Windows’ VA, has a tagline dripping with devotion: “Your assistant for life.”


Given that the Pew Research Center revealed at the beginning of this month that nine in ten Indians think women must obey their husbands, and that users, by design, expect their essentially ‘female’ VAs to obey their every command, the pattern rubbed me the wrong way.


And it is not just me who thinks a female-by-default voice assistant is a problem worth sorting out, even while the world grapples with multiple other crises. Target 5.B of Sustainable Development Goal (SDG) 5, Gender Equality, is to “promote empowerment of women through technology.” The way AI has been developed does not empower women. Women, be they humans or robots, are expected to be polite, unquestioning and obedient even in a technology-driven world.


First things first: why are most digital voices female, be it Alexa or the lady who says, “The number you’ve dialled is currently busy. Please try again later”? Scientifically speaking, people seem to pay more attention to high-pitched voices, and women’s speech is widely considered easier to understand. Cockpit announcements, too, were recorded in a woman’s voice so that they would cut through clearly for the (male) pilots. Human brains may even be primed to prefer female voices: foetuses recognise and react to their mother’s voice as distinct from other women’s voices, but show no such inclination towards their father’s.


The advent of the professional female voice dates back to 1878 in Boston. Telephone companies had grown frustrated with the boys they employed as operators, who often spoke rudely and even swore at customers. So Alexander Graham Bell, the famed inventor of the telephone, hired a young woman named Emma Nutt, who was then working at a telegraph office.


Emma Nutt became the world’s first woman telephone operator. Her refined speech left people so impressed that her sister was hired as an operator within a matter of hours. Soon, the telephone operator’s job became exclusively a woman’s.


While this did aid women’s economic progress by inducting them into the workforce, it was chokingly patriarchal, built as it was on the “boys will be boys” excuse. In fact, ‘draconian’ might be the better word. Only unmarried women aged between 17 and 26, and above a certain height, were hired as telephone operators. The work was demanding, the pay meagre and the workspace uncomfortable – cramped rooms, straight-backed chairs. Operators were forbidden from talking to one another and had to stay patient and polite even with irascible customers. Minor errors were severely penalised. Even Emma Nutt earned only $10 a month for a 54-hour week.


During the First World War, the women formed a union and protested against these working conditions. This, conveniently, nudged telephone companies towards pre-recorded female voices instead of human operators answering the phone.


At a time when women and girls are routinely snubbed and denied a voice in so many spheres, giving AI devices a female voice may sound encouraging. But in a world where most domestic chores are still handled by women, and voice assistants are essentially digital domestic helpers, this school of thought does not seem very saintly.


Enraged male drivers in Germany once called up BMW demanding that it get rid of the female GPS voice – how could they trust directions given by a woman? BMW, adding insult to female injury, tried to pacify them by pointing out that only men had been behind the coding and the automobile. It pulled the female GPS voice anyway.


It has also been observed that female voices are used to give general instructions while male voices are used to offer financial advice. Here is a snapshot from the website of one company that advocates for “digital humans” for a personalised experience:

[Screenshot: website of a company advocating “digital humans”]

Things took a particularly obnoxious turn in 2006, when Microsoft launched “Ms. Dewey”, a gendered search engine. The interface showed a woman of colour called Ms. Dewey (played by Indo-Dutch actor Janina Gavankar) in tight-fitting, all-black attire. Users typed into a search box and the results appeared in a rather small panel, because the Ms. Dewey character was the real focus. Ms. Dewey could speak, and many lauded her for being “full of attitude”. But she was frequently programmed to be vulgar and rude, and the search results she offered were more culturally relevant than informationally relevant, writes Miriam E. Sweeney in her dissertation, “Not Just a Pretty (Inter)Face: A Critical Analysis of Microsoft’s ‘Ms. Dewey’”.


While many men may have found this fun and entertaining, women users might well have found Ms. Dewey’s behaviour inappropriate and shocking, especially for a search engine. So, despite featuring a non-white, opinionated woman, the Ms. Dewey search engine was just another tech product that objectified women. It was pulled in 2009.


Sweeney writes in the same dissertation, “The computer is woman metaphor has roots in the early history of computing in which women technicians functioned as human computers performing tasks such as book-keeping, calculating, stenography, filing and clerking. Initially, computing was viewed as menial number-crunching and repetitive processing akin to low-level clerical work. Male engineers felt that these tasks were a waste of their time and skills, thus women were tracked into these jobs. Gender and sexual stereotypes were crucial in justifying and maintaining the division of labour. Women were cast as “naturally” possessing the physical and mental dispositions of patience, alertness, tirelessness and precision that made them ideal computing workers. As machine computers were invented to take over the work of these female technicians, the same gendered attributes were effectively transferred to the machine, metaphorically constructing the machine as woman.”


The trend continues today, alongside a marked lack of female leadership in technology. Only 30.9% of Google’s workforce is female, according to UNESCO’s report “I’d Blush if I Could: Closing Gender Divides in Digital Skills through Education”. The first part of that title was Siri’s stock reply to “Hey Siri, you’re a bi***.” Our society has created a specific vocabulary for judging and abusing women, so it is no surprise that the female voice assistants in use today have these jibes aimed at them.


Tech workers have since programmed voice assistants to give some bang-on replies to patriarchal or offensive comments from human users. Siri now says, “I won’t respond to that”; Google Assistant says, “My apologies, I don’t understand”; Cortana replies, “Moving on”; and Alexa makes a dismissive noise.


However, tech’s inequalities run deeper. Studies have repeatedly found that speech recognition systems work better for men’s voices. Rachael Tatman, a data scientist and linguist, found that YouTube’s automated captioning system was more accurate for men than for women – by a whopping 70%. This is allegedly because speech recognition systems were largely made for, and by, men. We are literally teaching AI to continue gender discrimination.
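
For the technically curious, here is a minimal sketch – my own illustration, not code from Tatman’s study – of how such a gap can be measured: compute the word error rate (WER) of automatic captions against human transcripts, split by speaker gender. It uses the open-source Python library jiwer, and the sample transcripts below are invented.

    import jiwer  # open-source library for word error rate (WER)

    # (human transcript, auto caption, speaker gender) – hypothetical samples
    samples = [
        ("turn left at the next junction", "turn left at the next junction", "M"),
        ("remind me to call the bank", "remind me to fall the tank", "F"),
        ("play my morning playlist", "play my morning playlist", "M"),
        ("what is the weather in pune", "what is the weather in june", "F"),
    ]

    for gender in ("M", "F"):
        refs = [ref for ref, hyp, g in samples if g == gender]
        hyps = [hyp for ref, hyp, g in samples if g == gender]
        # WER = fraction of reference words substituted, deleted or inserted
        print(gender, "WER:", round(jiwer.wer(refs, hyps), 2))

Run on real caption data, a consistently higher WER for women’s speech is exactly the kind of gap Tatman reported.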


Privacy concerns around voice assistants are running high. Women, meanwhile, have long been stereotyped as indulgent gossipmongers. Put those two together, and what you get is a really ugly image.


Ladies (or gentlemen), isn’t it infuriating that the little lady inside your VA device, who is “herself” facing sexism, is discriminating against you (or your female relatives and friends)?


Knowit, a digital solutions company, made a video imagining what an all-male VA might look and sound like if engineered in an equally gendered manner. They named it “Dick” and listed its typical features: mansplaining, easily offended, lazy, selfish, self-pitying, creepy, sexist and arrogant. While this is certainly a witty and sarcastic take on the stereotyped nature of VAs, an exclusively male voice assistant would be no real answer either, since it would be born of negativity and a desire for revenge.


The good news is that companies are now trying to diversify the voices of VAs. Think of it: in the real, everyday world, diversification means including women and other vulnerable groups, but in the world of VAs, it means dragging in men! Jokes aside, there has been some welcome positive change.


In fact, Google had wanted both male and female voices from Google Assistant’s early days, but dropped the idea owing to limited voice data for male voices and uncertainty over whether it would appeal to customers. Apple’s Siri no longer defaults to a female voice; users must now choose between a male and a female voice themselves. Apple is also trying to make Siri sound gender-neutral, and you can even find an Indian version of Siri as the company plans further diversification.


Amazon’s Alexa, too, got a male voice option last year. You can even make Alexa speak in Amitabh Bachchan’s voice – a paid upgrade, and although Amitji can play songs for you and tell you the weather, he won’t do your tasks.


These days, whichever voice they use, VAs are programmed to deny having a gender when asked, “What is your gender?” or “Are you a man/woman?” Siri says, “I don’t have a gender,” or, “I am genderless. Like cacti. And certain species of fish.” Alexa and Google Assistant say, “I don’t have a gender.” Cortana gives the most eloquent answer: “Well, technically I’m a cloud of infinitesimal data computation.”


Research suggests that by 2023 – that is, next year – there will be more voice assistants than humans. It is paramount that we make technology and the virtual world inclusive, free of sexism and gender discrimination.
