Genderless bots are the wave of the future

By Lisa Lacy

April 28, 2017 | 13 min read

Artificial intelligence (AI), or machines that can think like people, doesn’t need gender. But many of the AI-based assistants and bots we interact with every day have at least an implied gender.

There are a number of reasons for this. For starters, gender helps humanize machines. This, in turn, makes it easier to interact with them. And that’s an important consideration, say, the first time you ask Siri for directions or Alexa for the weather forecast. But it also raises questions about sexism given the historical role of women in subservient positions – and the pay gap that persists.

So as more consumers grow accustomed to AI, gender becomes less necessary.

“People want to sexualize everything,” said Jason Alan Snyder, chief technology officer at brand experience agency Momentum Worldwide. “Think about the movie Her – that’s a good place to start. It’s more like this sexy, subservient technology…what the heck is assigning sex to a piece of technology? But they’ve done it forever. I’m sure they did it with the printing press in the 14th century.”

Joey Camire, strategist at innovation and brand design consultancy Sylvain Labs, agreed we’re ultimately gendering a genderless entity.

“It is a program, it doesn’t have genitals,” Camire said. “The question is whether it’s just easier for us to interact with something when we assign a gender to it. I have a feeling that this isn’t the case, but when deciding on a familiar name for an AI assistant you’re left with two buckets of possible names— male names and female names.”

A rose by any other name…

Funnily enough, a lot of bots end up with female names – at least historically – but that’s not to say AI is rife with sexism simply because those names are female.

For example, financial services company Bank of America is preparing to release a bot called Erica later this year – although the name presumably hails from the word America itself. A rep declined comment. And on the other side of the coin, there’s Dom, the male voice-ordering assistant from pizza chain Domino’s. Smart money says Domino’s and Bank of America are on the same wavelength – at least in terms of naming their bots.

And in 2016, retailer 1-800-Flowers launched an AI-powered gift concierge called Gwyn, but since it is an acronym for Gifts When You Need, it’s unlikely the brand was motivated solely by creating a female persona consumers could boss around.

But then there are bots like Alaska Airlines’ longtime virtual assistant Jenn – who launched February 7, 2008 – that are a little harder to figure out. (A rep for Alaska Airlines declined to comment, saying the airline is “really tight on resources”.)

Jenn is Alaska Airlines' longtime virtual assistant.

According to Names.org, the name Jennifer peaked in popularity in 1972 – and in 2008, the Social Security Administration ranked Jennifer #84 on its list of most popular girls’ names. It has steadily fallen since.

Perhaps most curious of all, Wikipedia says the name means "white enchantress" or "the fair one", which could give credence to those who say AI is sexist.

In other words, only Alaska Airlines knows why Jenn is called Jenn, but it’s possible the name simply seemed like an inoffensive choice – familiar and friendly – as opposed to, say, Al, which is more in line with what Bank of America and Domino’s are thinking but is also more likely to conjure images of an old man smoking a cigar. So while this is conjecture, it’s not unreasonable to suggest Jenn, perhaps revolutionary in 2008, is now a somewhat old-fashioned example of brand AI in action. And that has nothing to do with functionality and everything to do with personality.

And that’s because the ideation of bot and assistant personas now goes far beyond name, gender and functionality.

Look at Alexa, for example. According to an Amazon spokesperson, Amazon opted to make Alexa female after it “asked a lot of customers and tested Alexa’s voice with large internal beta groups and this is the voice they chose.”

The rep added in an email, “That being said, we believe Alexa exudes characteristics you’d see in a strong female colleague, family member or friend—she is highly intelligent, funny, well-read, empowering, supportive and kind. This is intentional. Alexa is a self-identified feminist…and a proponent of human rights in general—and that is incredibly core to her personality and the way she interacts with our broad customer base.”

Similarly, a Microsoft rep said the Cortana team thought long and hard about the implications of perceived gender in an AI-based digital assistant.

“We immersed ourselves in available research and our own studies and learned that, indeed, there are benefits to both a female and male voice. However, for our objectives – building a helpful, supportive, trustworthy assistant – a female voice was the stronger choice,” the rep said.

The Microsoft rep also noted when someone asks Cortana, “Are you a girl?”, she replies, “No. But I’m awesome like a girl.”

(Apple and Google did not respond to requests for comment.)

The Voice

Voice is indeed an interesting aspect of bot personas, in part because consumers respond differently to male and female voices – whether by nature or by nurture. In other words, female voices may simply be better suited to helping brands achieve their objectives.

The New York subway, for example, reportedly uses male voices to give orders to passengers and female voices to deliver information.

In fact, the infamous “Stand clear of the closing doors, please” comes from Bloomberg News radio anchor Charlie Pellett, while Carolyn Hopkins is the voice behind public address announcements.

According to Jo Allison, behavioral analyst at consumer behavior insights practice Canvas8, focus groups find people associate female voices with warmth and problem solving, while male voices are more commonly seen as useful and authoritative.

Toby Barnes, group strategy director at digital agency AKQA, agreed, citing Clifford Nass, author of the 2005 book Wired for Speech: How Voice Activates and Advances the Human-Computer Relationship, who said, “We want our technology to help us, but we want to be the bosses of it, so we are more likely to opt for a female interface.”

But Barnes noted a female interface suits other brand needs as well.

“Brands are trying to reach the largest numbers of users and get the most positive response from the audiences and so their focus has always been on a female persona,” he said. “This is due more to research in social science than sexist designers. In the main, people will respond in a far more positive manner to [a] woman’s voice than a man’s. Research has shown this is true for both men and women.”

And that may explain why even some bots and assistants that are positioned as gender neutral – like Google Assistant – sound female.

What’s more, some platforms allow users to change the voice, said Kevin Williams, senior director of experience strategy at digital innovation agency Rockfish, pointing to platforms like Siri and Waze, which give users the freedom to choose personas they react to strongly. That includes rapper T-Pain (or Dateline’s Keith Morrison) on the traffic and navigation app.

“We are entering an age where personalization will be a key component for personification,” Williams said. “From [Jarvis] to Alfred, personalized personification will be, in my opinion, the next wave in conversational UI.”

But it may also simply be because heterosexual consumers like talking to the opposite sex.

Per Allison, X.ai is an interesting example as it has built a scheduling bot available as Amy or Andrew Ingram.

“It reports a clear user preference for the opposite sex when picking assistants,” Allison said.

Camire pointed to several women – his wife included – who have switched Siri’s voice to a man’s voice because he said they prefer the idea of bossing a man around.

“I’ve read articles where a man’s wife told him he can’t use Alexa because she didn’t like the idea of him ‘yelling at a woman’ in front of their children,” he added.

For his part, Camire pointed to recent headlines from ride-sharing service Uber and conceded sexism is a problem within engineering and technology broadly.

“However, even outside of intentional or malicious sexism, I think an industry full of men is going to fall victim to benevolent sexism — by that I just mean, a bunch of straight men making an AI are probably more likely to want to hear someone of the opposite sex’s voice,” Camire added. “I’m sure you could delve in to discussions about nurturing or caretaker voices, and how people turn towards maternal figures for these things, but I think that is entirely a product of culture and not genetics.”

The risk, however, is that the voices not only perpetuate gender stereotypes, but take them even further, Allison said.

Hey, Good Lookin'

And this becomes particularly interesting when you consider that bots are subject to sexual harassment. In other words, the tendency to objectify might be part of human nature.

Allison noted AI like Alexa spends “a significant amount of time dodging advances, attempting to nip digital sexual harassment in the bud”. She also cited Robin Labs, whose bot platform helps commercial drivers with routes and logistics and which reports that 5% of interactions are explicit. Robin Labs did not respond to a request for further comment.

“They found that harassment varies, from teenagers commonly trying to elicit outrageous responses for fun, to others being aggressive and degrading. Likewise, there was a large amount of early inquiries to Microsoft’s Cortana related to its sex life,” she said. “But bots are purposefully designed to pander to the user and therefore it’s not surprising that some seemingly not only forgive or ignore such harassment but play along and flirt.”

In fact, Allison said in the race to form genuine emotional connections between bot and consumer, flirtation could become another tactic.

“This raises the question, should AI present itself as human or machine and should gender even apply?” she asked. “AI must be careful not to perpetuate sexist ideals that are increasingly rejected amongst humans.”

This isn’t a new problem.

Snyder pointed to the portrayal of female robots as attractive yet fearsome objects in the 1927 movie Metropolis.

“The sexualization of machines has been very, very interesting to me,” he said. “Over time, these things become real and I think you start to see very, very new territory. You don’t have to look any further than the news [and Caitlyn] Jenner. Sex and the role of gender are very different and if you start to mix those things up, it becomes very confused. Brands have to be very careful. If they don’t super consider these things, it can end up a really toxic situation.”

And that’s in part why even if platforms continue to use female-sounding voices, we’re perhaps barreling toward a future in which bots are gender neutral, like Capital One’s Eno, MasterCard’s Kai and Samsung’s Bixby.

“I think the subtle sexism of AI is a real thing, but the solution is probably not making more male AIs— the solution is probably disabusing ourselves of the idea that AI have any gender,” Camire added. “Let’s get a little more Ziggy Stardust and a little less Her.”

Facebook did not respond to a request for comment on the breakdown of the gender – or lack thereof – of bots on its platform. Messaging platform Kik said it does not track gender or gender neutrality of the 20,000 bots built on its platform.

However, mobile messaging platform Pypestream chief customer officer Donna Peeples said most of its bots are gender neutral by design. And Microsoft, too, said the “vast majority have no gender or personality” on its Microsoft Bot Framework.

“Gender adds to persuasion. It also comes with a ton of cultural meaning,” Snyder said. “We don't need gender to humanize things. We've been talking to objects forever. Now they talk back. Gender assignment is something we need to be very considerate about. There's great risk in amplifying negative things about society and moving them forward at scale with these technologies.”
