Probably not, but they will change it.
As consumers interact with AI like Alexa, Siri and Cortana – not to mention brand chatbots – more and more, human language will change. That was the topic of conversation at a recent panel during Social Media Week that asked in part whether technology will corrupt language.
AI is also changing our relationship with technology, particularly among children who grow up with voice-enabled devices, and so the key for brands and marketers may very well be figuring out how to give robots more human-like speech, as well as making them more empathetic. But that may also be easier said than done.
History repeats itself
Erin McKean, founder of online English dictionary Wordnik, noted the anxiety about technology changing language is nothing new. Greek philosopher Plato was against people writing things down because he thought it would ruin our memories, for example. Since then, virtually every innovation – the printing press, radio, the telegraph, TV, movies and the internet – has been accused of killing language, according to McKean.
“The telegraph is a great example with modern parallels,” added Ben Zimmer, language columnist for the Wall Street Journal. “[Philosopher and poet Henry David] Thoreau thought it would help us communicate more quickly, but we’d have nothing to say. It stripped down language, causing language to be used in a very functional way.”
Our relationship with technology is changing
And as technology hones its ability to communicate with us, our relationship with it is changing as well.
For one thing, Sara Holoubek, CEO of consulting firm Luminary Labs, noted that voice-enabled devices nag us.
Michelle McSweeney, research fellow at the Center for Spatial Research at Columbia University, agreed, saying she bristles when her phone tells her she didn’t run on a given day or hasn’t slept as much. “I get offended and don’t want to admit how offended I am. A lot of my work is on politeness, but I am offended. Don’t tell me what to do. I want you in my pocket, not controlling my life. A lot of us have that tension [and] anxiety of devices telling us what to do,” she said.
But frustrations also arise when chatbots don’t understand us, like in the phone trees brands use to solve basic customer service issues. Not to mention bots don’t always seem to care all that much about said customer service issues either.
Damned if you do…
“Isn’t the great promise of voice technology supposed to be some level of empathy? Isn’t it supposed to approximate humanity? How do companies do this?” Holoubek asked.
Robots don’t seem empathetic in part because they use politeness markers that signal social distance, such as “please” and “thank you,” and they speak in fully formed sentences, McSweeney said. They also don’t drop syllables – another marker of social distance.
“When we are talking together, we mark social closeness – ‘Hey, how’re you doing?’ – rather than, ‘Good morning. How are you doing today?’” McSweeney added. “In personal relationships, we think [the formal version] is impolite.”
On the other hand, Zimmer said it’s tricky because chatbots trying to be more conversational with responses like, “Cool,” can go too far in the other direction and feel forced as well.
“If there’s an artificiality, which would you prefer? The fake politeness of distance or the fake solidarity of more colloquial language?” he asked. “You’re damned if you do, damned if you don’t.”
Holoubek’s advice to brands developing bots is to do a lot of human work before they do tech work.
“Really invest more money in understanding people than you probably think you should do,” she said. “Everything is human driven.”
In addition, McKean said to think less about what the brand wants and more about what customers need to get out of interacting with a bot.
“People get upset when business goals are not in sync with their goals.”
The human-robot hierarchy
But there is also power demonstrated in language and politeness markers, such as when a cop says, “Please step out of your car, ma’am,” noted Zimmer, adding this is somewhat reminiscent of the power dynamic between humans and robots.
“More polite forms of language are not just social differentiators, but they can express deference,” he said. “Bots are seen as lower status and are talking as if we are higher status.”
McKean also pointed to a social hierarchy of yore in which children were not supposed to speak to adults unsolicited.
“I think it’s interesting robots have the same restrictions – they can’t initiate contact with you,” McKean said. “You’re the one who kicks it off. You theoretically set reminders up… but you can shout, ‘Stop!’ and they will.”
There are also parallels in the business world.
“It’s about high status and low status work,” McKean said. “Secretarial work used to be high status. Young men learned business and government work, so someday they could be titans of industry. When that work became less of a pathway to power, it became more feminine.”
Children in particular like to yell at robots.
“I think on some level they think of it as a device and they’re just having fun,” Zimmer said. “One of the concerns with Alexa is the idea that children might not be appreciating certain linguistic etiquette one should use. The case of Alexa is the name – a prompt – and an imperative. To try to change it by making it more indirect, which we do with humans…it just complicates things. We want an answer or to fulfill a command, so we say the name and imperative very clearly and loudly. That’s the kind of interaction that would be quite rude if you were saying it to a human – especially if it’s a child or an older person.”
Holoubek also noted it’s part of a parent’s job to correct language mistakes – and Alexa won’t do that.
But McSweeney said babies don’t acquire language from devices – they acquire it from the human beings around them.
“They may be commanding Alexa, but they are not acquiring speech from Alexa. They are still acquiring language from their parents,” she said.
In fact, McKean said language has always evolved and shifted – and it can’t be ruined as long as there are people.
“I think language is as much style and fashion as it is communicative,” she said. “If you’ve ever walked down the street and said, ‘Kids today…’ it’s the same reaction to language change that wasn’t present when your language was formed.”
She even suggested putting googly eyes on devices to make them seem more human and to perhaps inspire us to be nicer to them.