If we ask Siri “Are you male or female?”, the voice assistant replies “I transcend the human concept of gender”.
Technically speaking, Siri is merely a piece of software, yet it "appears" female.
Her name is female and, according to the project's co-founder Dag Kittlaus, means "beautiful woman who leads you to victory" in Norwegian; her voice and the pronoun with which we usually refer to her are female as well.
The same is true for other popular voice assistants: Amazon's Alexa, named after the Library of Alexandria, carries an unmistakably female name; Cortana, Microsoft's voice assistant, is named after an Artificial Intelligence character in the video game series Halo, depicted as a sensual, scantily dressed woman; Google Assistant's name is neutral, but its default voice is female.
After KITT, the intelligent car with a male voice that helped the protagonist of Knight Rider, something went wrong. Even Unesco took notice, publishing a report in 2019 on voice assistants and their predominantly female characterization. Its title, "I'd blush if I could", is telling: it refers to Siri's old default answer when verbally attacked with sexist insults such as "Hey Siri, you are a bi*ch". Indeed, the report underlines how the female characterization of voice assistants risks perpetuating and exacerbating harmful gender stereotypes.
At the risk of sounding alarmist, Unesco researchers argue that the system's answers to many questions (even overtly abusive ones), more "servile" than obliging, consolidate the idea of a submissive woman: "(…) Since most voice assistants talk with a female voice, the system conveys the message that women are docile and accommodating, always available as soon as someone touches a button or activates voice control".
The Guardian addressed the same theme that year. The British newspaper's article investigated verbal abuse directed at Siri and catalogued the system's meek and submissive, if not coquettish, replies. Such dynamics can reinforce sexist behavioural models that are implicitly passed on to future generations.
The real question, however, still remains unanswered.
Why do most Artificial Intelligence-driven virtual assistants have female names, voices, and an obliging attitude?
Daniel Rausch, who leads Alexa at Amazon, has put forward economic reasons. Citing a 2008 study from Indiana University, he claimed that female voices are generally perceived as warmer and therefore more pleasant by both men and women. Since Alexa's voice directly influences Amazon's business, acting as an intermediary during the user's purchases, the company has perfected a timbre that sounds caring and understanding. The drive to make the user's experience as frictionless and enjoyable as possible has done the rest, producing a docile and complacent personality.
In 2011, Professor Clifford Nass of Stanford University told CNN that "It is much easier to find a female voice everyone likes than a male voice. It is a well-known phenomenon that the human brain is structured to appreciate female voices". The human brain, however, should surely be capable of other responses as well.
Delving into the issue, one soon realizes that these technological novelties aim to automate a series of "duties" that people usually associate with the female gender. Helping with organizational tasks, such as planning a meeting or setting up reminders, echoes the job of a secretary. It is therefore hardly surprising that these interfaces are coded as feminine: clerical and service work have traditionally been designated as women's work, and this stereotype is likely one of the reasons for the predominance of feminized digital assistants.
The Unesco report further ascribes the stereotyped representation of female Artificial Intelligence to gender inequality, which is still very much present in the world of technology. Statistics show that women hold only 15% of executive roles at major technology companies, and only 12% of positions in Artificial Intelligence research. According to the report's conclusions, the inadequate representation of women in project teams has involuntarily facilitated the perpetuation of male chauvinist stereotypes in final products: from the implicit association of women with domesticity and service, to the ambiguous and submissive answers that Siri & Co. give to verbal attacks.
Are there any solutions? Yes, absolutely.
First, update the scripts: Siri's answer "I'd blush if I could" was replaced in 2019 with "I don't know how to answer". Other companies reacted to criticism with similar revisions: Alexa, for instance, now answers "I'm not sure what outcome you expected". But there is still room for improvement.
Second, users could choose from the start whether to set up a female or male voice assistant. A simple procedure, right? Not quite.
With Siri's iOS 10 update (after two years of development), the voice assistant also offers a male voice, but it is not the default: one must seek it out, wait for the device to download the necessary data, and change the settings. The option is still unavailable for Alexa and Cortana.
Saniye Gülser Corat, Unesco's Director for Gender Equality, believes that "the world should pay much more attention to how, when, and whether AI technologies work against gender equality. More than that, these technologies should be diversified and not female by default. Technology companies should explore the feasibility of developing a gender-neutral machine that is neither male nor female. This technology could neutralize gender-based insults and the offensive language with which women are addressed".
From these premises was born Q, a digital assistant that proudly defines itself as "the first non-gendered voice assistant in the world". Its voice sits in a range between 145 Hz and 175 Hz, in between what we typically recognize as male and female, so that it cannot be ascribed to either gender. The project was created as an experiment to challenge gender stereotypes in the world of technology and to promote a non-binary perception of gender, both in society and in the tools that are part of our daily lives. Its aim is to become the third option alongside voice assistants like Siri and Alexa, offering a form of representation based not on gender, but only on technology.