Why making Siri, Alexa, and other AI voices “female” perpetuates gender stereotypes

[Image: A young man talking to Siri on his phone. Kleber Cordeiro/Shutterstock]

Why is Siri female? When the revolutionary voice assistant launched on iPhones in 2010, her gender was taken as a given alongside her quick-witted, futuristic abilities. It wasn’t until 2017 that male options were provided, and by then Microsoft’s Cortana (which still doesn’t offer a male-voiced counterpart) was on the scene, as was Amazon’s Alexa. It didn’t take long for people to push the boundaries of what they could ask the new female AIs, prompting Siri to respond to sexual questions or commentary with, “I’d blush if I could.” Not coincidentally, this phrase forms the title of a new United Nations report that examines how female AI voices perpetuate harmful gender stereotypes and what that could mean as they become more integrated into our daily lives.


Culturally, there are already pervasive stereotypes about women being secretaries, assistants, and generally subservient. In addition, market research and even Amazon’s own focus groups have shown that consumers find female voices more pleasing and comforting. So, if you’re a tech company trying to market a universally appealing virtual helper, it’s little surprise that a docile, somewhat flirtatious-sounding woman is (consciously or unconsciously) your go-to voice.

Beyond the obvious technical requirements, building an appealing Siri, Cortana, or Google Home involves teams of software engineers and creative professionals who invest significant effort in crafting whole backstories and personalities that help the assistant appeal to customers.

“The assistant is, in effect, hardly a generic woman,” the U.N. report states, “but rather a young woman from a particular place and shaped by life experiences that carry meaning for the (presumably, mostly American) team that designed ‘her’ personality and voice.”

In Google’s case, for example, the nameless but distinctly (and intentionally) female assistant was “...imagined as: a young woman from Colorado; the youngest daughter of a research librarian and physics professors who has a B.A. in history from Northwestern, an elite research university in the United States; and as a child, won US $100,000 on Jeopardy Kids Edition,” James Giangola, a lead conversation and personality designer for Google Assistant, told The Atlantic.

Many experts, and the U.N. report itself, call out the male-dominated teams that design these seemingly innocent backstories. The underrepresentation of women, people of color, and members of LGBTQ communities in the tech world has been acutely felt in AI and algorithm design, and the lack of diverse perspectives can let conscious and unconscious biases guide the technology’s behavior.

“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report states. “Because the speech of most voice assistants is female, it sends a signal that women are ... docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’.”

This has raised questions about how AI assistants should be treated, particularly as children begin interacting with them and giving them orders from a young age. Already, consumer responses to voice assistants range from formal and polite to intentionally disrespectful, and the general expectation has become that all voice assistants will be female.

"If you talk derogatory to an Alexa, children pick this up," Robert LoCascio, CEO of an integrated messaging company called LivePerson, told NPR in an interview. "They go back to school and they think this is the way you talk to someone and this is maybe the way you talk to women."

Solutions to this challenge are, as one might expect, highly complicated. However, the creative studio Virtue Nordic recently released a genderless AI voice that could go a long way toward solving this problem. The voice is called Q, and its creators tested it with over “4,600 people identifying as non-binary from Denmark, the U.K., and Venezuela, who rated the voice on a scale of 1 to 5, 1 being 'male' and 5 being 'female,'” until they received consistent feedback that the voice was gender-neutral.

While Q wasn’t created for any particular tech company, its creators hope it can influence the conversation around gendered AIs and eventually be picked up as a “third option” for Siri and Alexa.

To their credit, major tech companies have begun making changes to their voice assistants’ responses to make them less submissive. According to the report, “Examples of female digital assistants capable of robust defence are harder to find, although recent updates to Apple, Amazon, Google and Microsoft operating systems have eliminated some of the most excessively apologetic or flirtatious responses to sexual harassment.”

In addition, these updates have given some voice assistants, Google Home in particular, the ability to respond more positively to users who engage with them using "please" and "thank you."

However, to truly solve this problem, advocates say, the practice of making AI assistants female by default must end, and more must be done to bring women and other underrepresented voices onto the teams developing these tools. There is already evidence that the use of voice assistants is on the rise, though many people remain wary of trusting their Alexa with important tasks or information. As companies try to change this by making their voice assistant products more likable and part of their customers' everyday lives, the report cautions developers and users alike to avoid the temptation to revert to harmful stereotypes about women simply to make people more "comfortable."