Speaking “Jenessa”: what AI needs to learn
“Congrats, you speak ‘Jenessa’!” This is a light-hearted joke I often make after someone has been able to understand a message I’ve tried to communicate. It’s been said that I have a unique way of characterizing things — a creative application of words.
In actuality, we all suffer from “speaking our own language.” Each of us has our own ‘idiolect,’ our own way of speaking, writing, and communicating; or, as Merriam-Webster puts it, the language or speech pattern of one individual at a particular period of life. This includes things such as vocabulary, grammar, and pronunciation. In “Language, Culture, and Society,” anthropologist Zdeněk Salzmann describes how an individual’s idiolect is influenced by social situations and even mode of communication. As the environments and people around us change — like moving from our work environment to a night out with friends — our use of language shifts.
This set of influences makes our idiolect as unique as a fingerprint; in essence, a linguistic biomarker.
Recently, while typing an email in Gmail, Google’s Smart Compose tool auto-completed my sentence incorrectly. My reaction was as if a friend had interrupted me to tell me what I think: “No, that’s not what I was going to say, GOOGLE.”
Google doesn’t write or speak the inner monologue or even public-facing verbalization of Jenessa; my individual idiolect is unknown. But will it someday be able to finish my sentence better than a lifelong best friend?
Right now, the great promise of AI is automation, efficiency, and differentiation for those who invest in it. But that’s just the beginning. Voice search is slated to grow over the next couple of years, and marketers need to start preparing now for when it takes hold. AI tools already exist to help tailor subject lines and Facebook copy; it’s only a matter of time before AI helps improve other marketing messaging.
Marketers who succeed in using AI to personalize communications to individual customers will focus first on understanding the expressive nature of their own idiolects, and then their customers’.
We leave much opportunity on the table if we don’t train our AI tools in the nuances of our communication. Unless AI learns the ways of our idiolects, AI designed to assist in communication will remain laden with moments of inefficiency and mass communication, where users must stop to correct the machine’s output because it wasn’t what they intended.
AI can learn to speak our own unique language. Here are three approaches useful to marketers:
Learn our ‘tells’
For AI to truly be helpful on an individual level, it must speak like an individual, not a mass set of data.
Perhaps the most basic first step to achieving an understanding of how we communicate is to discover our go-to phrases for certain situations. For example, does someone say, “That’s cool” or “That’s lit” to show their excitement about a product or experience? To identify individual communication traits, an AI must gain access to a large enough dataset of an individual’s idiolect to incorporate a variety of situations. In this case, historic documentation, such as my email history or past voicemails, would be a strong start that could be perfected from there to gather both written and spoken use cases of my personal idiolect.
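As a rough illustration of what that profiling step could look like, here is a minimal sketch in Python that counts an individual’s most frequent two-word phrases across a message history. The sample emails are invented, and a real system would also normalize greetings, filter stock phrases, and weight by recency:

```python
from collections import Counter
import re

def top_phrases(messages, n=2, k=5):
    """Return the k most frequent n-word phrases across a set of messages."""
    counts = Counter()
    for text in messages:
        words = re.findall(r"[a-z']+", text.lower())
        # Count every run of n consecutive words as one candidate phrase.
        counts.update(tuple(words[i:i + n]) for i in range(len(words) - n + 1))
    return [" ".join(phrase) for phrase, _ in counts.most_common(k)]

# Hypothetical sample corpus standing in for an exported email history.
emails = [
    "That's lit! Let's circle back tomorrow.",
    "That's lit. I love this concept.",
    "Sounds good, let's circle back after lunch.",
]
print(top_phrases(emails))
```

Even this crude frequency count would surface “that’s lit” as a go-to phrase for this hypothetical writer; the interesting work is in layering on context about when each phrase appears.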
Move from contextual to relational awareness
AI is starting to account for situational awareness: understanding that, for instance, if I’m in a certain location I may need one thing, or if I’ve just been using a certain app, I may require something different. However, context is not the only thing that influences the way we tailor our conversations. Like most people, my voice changes depending on my relationship with the person I’m speaking to, but AI doesn’t yet account for relational awareness. In the case of Smart Compose, Google should be thinking about a way to mimic that voice change.
Remember that AI is only as smart as the data it is fed. So, at some point, we must teach the AI who is a customer and who is a coworker, or even how to identify the occasion on which I am speaking. In the case of Google, a simple prompt asking me, the user, to categorize my contacts would be enough for the AI to start to learn relational context. This could be done on an ongoing basis, so the task is not too cumbersome. Identifying the impact of occasion on an idiolect is more abstract, but it could be done through semantic analysis of phrases, marrying the results to calendar data.
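To make the contact-categorization idea concrete, here is a small hypothetical sketch: once a user has tagged contacts by relationship, an assistant can pick phrasing to match. The contacts, roles, and sign-offs below are all invented for illustration:

```python
# Hypothetical relational-context lookup: a one-time prompt could ask the
# user to tag each contact, and the assistant adjusts phrasing accordingly.
CONTACT_ROLES = {
    "boss@example.com": "coworker",
    "bestie@example.com": "friend",
}

SIGN_OFFS = {
    "coworker": "Best regards,",
    "friend": "Catch you later!",
    "customer": "Thank you for your business,",
}

def suggest_sign_off(recipient):
    # Untagged contacts fall back to the most formal register.
    role = CONTACT_ROLES.get(recipient, "customer")
    return SIGN_OFFS[role]

print(suggest_sign_off("bestie@example.com"))
```

The design choice here is the fallback: when the relationship is unknown, the safest default is the formal register, with the occasional tagging prompt gradually filling in the map.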
Move from linear to interrelated
I recall telling my international friends in graduate school that they had a grasp on the English language when they understood my dry humor and terrible puns. If they laughed, it meant they could not only understand the origin of the joke, but that they also understood the complexities of English.
AI certainly understands interrelated subjects and is built to surface insights that would take a human much longer to unearth. However, it doesn’t understand which events and experiences matter, which are formative, and the resulting “inside jokes.” It has the capacity to map the interconnections behind an inside joke, but not why the connections I made make sense in that moment.
Until we can teach AI the “why” and its relative importance, it will struggle to understand the interrelationships of concepts. Through AI’s strength in analyzing the past, a data set can surface observations of which topics are commonly used together; the user can then reflect on why that may be and feed that reasoning back to the AI.
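That observation step could start as simply as counting which topics co-occur in past messages. A minimal sketch, assuming messages have already been tagged with topics (the tags below are invented):

```python
from collections import Counter
from itertools import combinations

def topic_cooccurrence(tagged_messages, k=3):
    """Count which topic tags appear together in the same message.

    Surfacing frequent pairs gives the user something concrete to
    reflect on ("why do I always link these two?") before feeding
    that 'why' back to the model.
    """
    pairs = Counter()
    for topics in tagged_messages:
        pairs.update(combinations(sorted(set(topics)), 2))
    return pairs.most_common(k)

# Hypothetical topic tags attached to past messages.
history = [
    {"coffee", "deadlines"},
    {"coffee", "deadlines", "puns"},
    {"puns", "weather"},
]
print(topic_cooccurrence(history))
```

The counts themselves carry no “why”; the point is that the machine surfaces the pattern and the human supplies the reason.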
As idiolects are entirely unique to each individual, and there are so many nuances to communication, it is quite possible that AI will never fully be able to “speak Jenessa.” Additionally, since many of us aren’t aware of the influences on our individual idiolects until someone points them out, training our AI may also be near impossible unless we build prompts into it to assist this process. To train AI to speak in an individual’s idiolect, we must first become students of linguistics, self-aware of the influences on our own idiolect and of the cultural biases we may hold.
Marketers have always sought to understand customers’ motivations and align communication accordingly, but the era of algorithms calls on marketers to educate the technology about the human experience. Marketers must rise to the occasion, representing their customers’ behaviors in new ways. Those who succeed will apply these three elements to strengthen the technology in ways that truly differentiate and augment the human experience.
Jenessa Carder is strategy director of Isobar US