
By Rebecca Stewart, Trends Editor

October 24, 2018 | 9 min read

The issue of an AI voice’s ‘gender’ goes to the heart of how users interact with a human-like system. Does a female voice pander to stereotypes? Should companies ditch voice personality altogether? Do users even care?

If you’re au fait with Netflix’s Black Mirror, you’ll remember the White Christmas special in which one woman is reduced to a ‘cookie’ – a digital copy of her own consciousness that can act as a perfectly tuned subservient personal digital assistant. Now, hold down the home button on your smartphone and listen carefully to the voice that responds: it’s most likely female, and very human.

AI assistants have a gender problem. It’s one that has now spread far beyond Charlie Brooker’s observant pop-culture series, as well as other depictions such as Ex Machina and Spike Jonze’s Her, and is slowly weaving its way into consumers’ everyday interactions with these machines.

When you utter the phrase “Hey Siri/Cortana/Alexa”, or even “OK Google”, the default is always a female voice, even if the product itself isn’t technically gendered. Most of the time, ‘she’ will be ready to remind you to pick up milk from the store on the way home from work, organize your calendar or crunch the numbers in a complicated equation.

While Google and Apple do allow users to manually change the voices on their assistants to male ones, Canvas8 consumer behavioral analyst Jo Allison argues that the default ultimately reinforces the stereotype that women should perform service-orientated tasks and fill administrative roles.

This, she notes, could turn some consumers off – a significant problem for a technology that’s being increasingly adopted on either side of the pond, with 20% of Americans and 10% of Brits telling YouGov they have a smart speaker in their homes.

So why has the effort from the world’s biggest companies to make their tech more diverse been all mouth and no trousers? And should the Amazons and Apples of the world even be ascribing a human personality to these assistants – or do users prefer robots to be, well, robots?

AI psychology

Allison explains that because consumers assign human qualities to AI, “from gender to personality traits”, they approach it with pre-constructed expectations. The concern, she argues, is not just that the assigned voices can perpetuate current gender stereotypes, but that they could entrench them even further.

She continues: “People inevitably humanize technology, so it provides pause for thought that the majority of AI, which is currently subservient, has been given female personalities.

“The whole point of having a digital assistant is to have it do stuff for you. You’re supposed to boss it around, so giving subservient services feminine voices is obviously concerning to many.”

Psychologists, meanwhile, are divided over the actual value of giving a female voice to public address systems – which are also typically voiced by women – or AI assistants.

A 2017 study from Indiana University, which involved 151 men and 334 women, shows that both men and women express “explicit preference” for female synthesized voices, which they describe as sounding “warmer” than male ones. The same study notes that women prefer female synthesized voices when tested for implicit responses, while men show no gender bias in implicit responses to voices.

A second report, from Stanford University, found that people do prefer male voices – but only when they are used to teach them about computer science, with respondents still agreeing that women’s tones were “warmer”.

In developing Alexa and Cortana, Amazon and Microsoft tested male and female voices and found that customers overwhelmingly preferred the female tenor. Smart home giant Nest, meanwhile, makes an even more compelling case for its Nest Protect being voiced by a woman, saying its research revealed children to be more responsive to voices that sound like their mothers.

However, Jemma Elliot, global head of content at brand consultancy Wolff Olins, which works with the likes of Hive, Alibaba and Spotify, argues that in general, market forces are blocking progress when it comes to equality.

“Female virtual assistants are perpetuating stereotypes of subservience at a time when the shifting power dynamics of gender are being scrutinized and challenged. This gives ostensibly progressive umbrella companies some moral challenges,” she says.

Canvas8’s Allison also questions the effect bot gendering is having on wider society, and the implications for the brands behind them. She points to Robin Labs, which runs a bot platform that helps commercial drivers with routes and logistics, and which recently reported that 5% of interactions with its machines are “sexually explicit”.

Robin Labs found that harassment varied from teenagers commonly trying to elicit outrageous responses for fun, to others being aggressive and degrading.

“Likewise, there was a large number of early inquiries to Microsoft’s Cortana related to its sex life,” Allison goes on, “but bots are purposefully designed to pander to the user, and therefore it’s not surprising that some seemingly not only forgive or ignore such ‘harassment’, but ‘play along’ and flirt.”

Gendered by design

For Ileana Stigliani, assistant professor of design and innovation at Imperial College London’s innovation and entrepreneurship department, there’s also something much bigger at play, and it starts with conception.

AI, as she points out, is a “sea of dudes”, and many of these systems are designed and programmed by men. It’s simple: “If the people who design machines are only men, there is a strong likelihood that the resulting products will be gender biased.”

Indeed, just 20% of Google’s tech-focused workforce is female, while Apple’s sits at 23% and Microsoft’s at just 19%.

The solution is obvious for Stigliani – she doesn’t think tech firms should concentrate on creating gender-neutral solutions to inspire consumer adoption and engagement. Instead, she thinks they should focus on fostering inclusivity internally to create products that work for everyone.

“In this respect it’s crucial for them to foster diversity inside the design team, not only from the gender point of view, but also from the cultural and ethnic standpoints. The more diverse the design team, the more inclusive the design outcome will be.”

Elliot suggests the next step, to some extent, would be to follow the lead of Apple and Google and give users the option of whichever voice resonates most with them.

“Perhaps the answer here is users being able to customize the voice interface they’re using, as with TomTom’s GPS VoiceSkins. There could be scope for having different voices on the same device, or even a plethora of assistants, that morph to fit topical themes like Google’s homepage.”

Personality crisis

Understandably, brands may be nervous to do so in case it dilutes their voice assistant’s carefully curated ‘personality’ – but there remains the bigger question of whether users even want their AI to have a personality at all.

Users’ unease with Amazon ascribing human-like qualities to Alexa was crystallized in headlines at the start of the year, when the smart speaker was caught ‘creepily’ laughing unprompted.

The brand was forced to issue a quick fix for what it said was Alexa mishearing things, but that didn’t stem the flood of negative publicity.

The internet was also resoundingly ‘shaken’ when Google’s voice assistant made a very human-like phone call to a hair salon back in May, complete with eerily convincing umming and ahing.

Elliot muses: “Realism and natural language simulation have become the Holy Grail … but sounding real and pretending to be real are very different things. The former is great, the latter not so much. People need to know when they are talking to a bot.”

Stigliani concurs: “What inspires genuine consumer interaction is empathy, a characteristic that voice assistants, be they male or female, do not and cannot possess.

“People would certainly want to know if they’re speaking with a bot, as opposed to a real human being. This can help them adjust their expectations, and avoid the frustration arising from interacting with someone who shows no empathy.”

A recent in-depth study by the Royal Society found that central to consumers’ concerns about machine learning is a fear of being ‘harmed’ in the process of interacting with a human-like system – so it seems Amazon et al have their work cut out driving trust and engagement in their systems, female-voiced or male.

This article first appeared in July’s voice-themed issue of The Drum magazine, in which we hear all about how AI assistants are being utilised in sectors such as retail, charity, healthcare and education, and why marketers need to be concerning themselves with this technology now. We also speak to a very familiar voice with a not-so-familiar name – Susan Bennett, the voice of Apple’s Siri – and catch up with the CMO of smart speaker brand Sonos, Joy Howard, who tells us how it is taking on the world’s tech giants with its foray into this space.
