
Biometrics Emotion Tracking Neuromarketing

How brands are tapping into consumers’ faces – and brains


By Lisa Lacy

August 1, 2017 | 12 min read

Apple will reportedly enable users to unlock the upcoming iPhone 8 with their faces.

This type of facial recognition technology is also being used at airports to both improve the boarding process and enhance security – or so its proponents claim.

But it can also help marketers determine consumers’ emotions. This burgeoning field includes startups like Affectiva, Eyeris, Kairos and Realeyes, as well as larger players like Microsoft and Apple (which acquired startup Emotient in 2016).

It’s a sizable market – in fact, Kairos, an AI company that says it specializes in face recognition, said its own research shows the face biometrics space alone will be worth $8bn by 2019. And research firm MarketsandMarkets said the broader emotion detection and recognition market, which includes biosensors, will be worth more than $36bn by 2021.

The broader market includes companies like Beyond Verbal, which says it analyzes emotions from vocal intonations, as well as emotion tech company Lightwave, which uses bioanalytics to measure responses like heart rate, movement and temperature.

To date, these companies have used emotional analysis in myriad applications.

Jaguar worked with Lightwave to gauge the mood of the crowd at Wimbledon in 2015.

In 2015, for example, luxury automaker Jaguar used biometric wristbands from Lightwave to capture the heart rate, movement and location of Wimbledon attendees and gauge the mood of the crowd. It also deployed in-ground sensors, which monitored crowd energy by collating data on movement and audio, and pushed out the resulting insights using #FeelWimbledon.

Lightwave also measured the emotional response of people who watched Hillary Clinton's 2016 acceptance speech.

More recently, Lightwave gave devices to US viewers of Hillary Clinton’s 2016 Democratic National Convention speech that measured their heart rate, motion and temperature to determine the most emotional moments – which included thanking rival Bernie Sanders and accepting the nomination itself – and determined her words were most moving to females, Middle Easterners and those over 55 years old.

Avocados from Mexico sponsored an emotion-enabled food experience at SXSW.

And, for its part, emotion measurement technology company Affectiva worked with Avocados From Mexico on what it dubbed an “emotion-enabled food experience” called the Avo-Matic for the 2017 South by Southwest Festival. Participants selected a custom avocado dish, which was prepared behind a “closed avocado door.” As the user’s happiness grew, the emotion recognition tool would open the door.

But this technology can also help gauge the effectiveness of advertising.

To wit: Emotional intelligence firm Realeyes said 65% of marketing ROI is driven by creative, yet more than 90% of creative is never tested. That is in part because creative has historically been difficult to test – focus groups are slow and expensive and yield only weak links to creative and media decisions.

However, Realeyes said it is measuring how consumers watch videos with webcam-based emotion measurement, which chief executive Mihkel Jäätma said works like a next-generation survey.

Clients drop a video into the Realeyes dashboard and choose geography and audience segments to test. Participants then watch the videos and Realeyes measures micro-movements of the face and uses computer vision and machine learning to analyze them, focusing on happiness, surprise, confusion, disgust and engagement and behaviors like how and when consumers move their heads.

Realeyes said it displays results in near real time with final results usually available within 24 hours of upload, including what Realeyes calls an EmotionAll score, or a score of one to 10 relative to its database of more than 8,000 videos. This demonstrates how well a video grabs consumer attention, retains it, builds momentum and leaves an impact. This, in turn, enables clients to assess earned media potential and inform edits to improve potential, Realeyes says.
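Realeyes has not published how the EmotionAll score is computed, but a score of one to 10 derived relative to a database of previously tested videos suggests a percentile-style ranking. The sketch below is a hypothetical illustration of that idea only – the function name, decile mapping and all numbers are invented, not Realeyes’ actual method:

```python
# Hypothetical sketch: mapping a video's raw engagement metric to a
# 1-10 score relative to a database of previously tested videos.
# This is an illustration of percentile-style scoring, not the
# (unpublished) EmotionAll methodology.

def relative_score(new_value, database_values):
    """Rank new_value against database_values and map its percentile
    onto a 1-10 scale (1 = bottom decile, 10 = top decile)."""
    below = sum(1 for v in database_values if v < new_value)
    percentile = below / len(database_values)   # 0.0 .. 1.0
    return min(10, int(percentile * 10) + 1)    # bucket into deciles

# Toy engagement values for eight previously tested videos.
db = [0.12, 0.35, 0.41, 0.52, 0.58, 0.66, 0.71, 0.90]
print(relative_score(0.60, db))  # prints 7
```

A real system would rank against thousands of videos (Realeyes cites more than 8,000) and likely combine several emotion signals rather than a single value.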

Which ads sell and which don’t

Earlier this year, Realeyes partnered with confectionery giant Mars on a study that showed emotion measurement can determine which ads sell and which don’t with 75% accuracy.

The study included 149 ads across 35 brands and 22,334 people in six countries. The company measured how consumers felt while they watched the ads and its emotion data was cross-referenced with Mars’ known sales lift for each ad. The result is what Realeyes called “the largest emotional dataset linked to real business outcomes currently in existence.”
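The 75% figure describes how often an emotion-based prediction agreed with the observed sales lift for an ad. A minimal sketch of that kind of cross-referencing follows; the threshold rule, scores and outcomes are all invented for illustration and bear no relation to the actual Mars data or Realeyes’ model:

```python
# Hypothetical sketch of the validation the study describes: predict
# "high sales lift" from an emotion score per ad, then check the
# predictions against known sales outcomes. All numbers are invented.

def accuracy(emotion_scores, sales_lift, threshold=0.5):
    """Predict high lift when the emotion score clears the threshold,
    then report the fraction of ads where prediction matches outcome."""
    correct = 0
    for score, lifted in zip(emotion_scores, sales_lift):
        predicted = score >= threshold
        correct += (predicted == lifted)
    return correct / len(sales_lift)

scores = [0.9, 0.2, 0.7, 0.4, 0.8, 0.3, 0.6, 0.1]              # per ad
lifted = [True, False, True, True, True, False, False, False]  # observed
print(accuracy(scores, lifted))  # prints 0.75
```

In practice a trained model would replace the fixed threshold, but the evaluation step – comparing predictions against known business outcomes – is the same shape.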

“Being able to identify strong creative with high sales impact enables advertisers to push these ads and avoid putting media spend behind those with low, or worse – no – sales impact. It’s about spending campaign budgets more effectively, optimizing ad creation and media buying at no additional cost,” Jäätma said in a statement. “Just think – an algorithm can detect how people feel about an advert by tracking their facial expressions and that can tell us whether that ad will sell or not – that’s exactly what our scientists have been working to achieve.”

While work from Realeyes includes participants who give consent, Jason Snyder, chief technology officer of brand experience agency Momentum Worldwide, said facial recognition can also be used in public to capture data like gender, age and emotional status – and brands don’t necessarily need consumers to opt in for something like this.

“There’s no federal regulation around using facial recognition and storing that data,” Snyder said. “There’s no personally identifiable information (PII) there. There’s no law that says your image on CCTV is PII and if I’m taking your image and storing it as a random number sequence to determine dwell time…I still don’t know who you are specifically. At that point, I just know you’re there [and] your age, gender and mood.”
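Snyder’s “random number sequence” idea – keeping an opaque identifier and timestamps rather than an image – can be sketched as follows. The class, its names and the one-way hash choice are hypothetical illustrations, not any vendor’s actual system:

```python
# Hypothetical sketch of anonymized dwell-time tracking: a detected
# face is reduced to an opaque identifier (no stored image, no name),
# and only first/last-seen timestamps are kept per identifier.

import hashlib

class DwellTracker:
    def __init__(self):
        self.sightings = {}  # opaque id -> (first_seen, last_seen)

    @staticmethod
    def anonymize(face_signature: bytes) -> str:
        """One-way hash: the stored key cannot be reversed to an image."""
        return hashlib.sha256(face_signature).hexdigest()

    def observe(self, face_signature: bytes, timestamp: float):
        key = self.anonymize(face_signature)
        first, _ = self.sightings.get(key, (timestamp, timestamp))
        self.sightings[key] = (first, timestamp)

    def dwell_time(self, face_signature: bytes) -> float:
        first, last = self.sightings[self.anonymize(face_signature)]
        return last - first

tracker = DwellTracker()
tracker.observe(b"face-embedding-A", timestamp=0.0)
tracker.observe(b"face-embedding-A", timestamp=42.5)
print(tracker.dwell_time(b"face-embedding-A"))  # prints 42.5
```

Whether such hashed identifiers truly fall outside PII definitions is the legal question Snyder is gesturing at; technically, the system never needs to store the face itself.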


But another branch of emotional measurement actually goes inside consumers’ heads to determine how their brains respond to campaigns.

Neuromarketing includes fMRI scans, or functional magnetic resonance images, which measure brain activity by detecting changes tied to blood flow – in other words, it’s sort of like reading minds.

According to Snyder, a number of recent studies suggest neural data from relatively small groups of people – around 30 participants – can not only predict market-level behavior, on the order of millions of people, but predict it better than existing marketing tools.

In addition, Snyder said another fMRI study showed the timing of when consumers see a price may change the way they buy a product: when the price came first, the neural data suggested the decision question shifted from “Do I like this?” to “Is this worth it?”. This, in turn, can shape the story advertising creatives want to tell.

Steady state topography

However, per Heather Andrew, chief executive of neuroscience market research firm Neuro-Insight, fMRIs are fantastic for brain research, but not for capturing rapid changes in real time. It’s why Neuro-Insight measures electrical activity in the brain to determine responses to brand communication – primarily TV advertising – as well as context, such as the impact of using different devices or program environments.

“Brains are very specialized, so we can identify which part of the brain is active,” Andrew said. “We can’t read minds, but we can tell what cognitive process is going on and track processes in the brain.”

Richard Silberstein, founder and chief neuroscientist at Neuro-Insight, said this is called Steady State Topography (SST) and it measures how fast different parts of the brain are working.

“Every part of the brain is always active, but if it becomes more active and is operating faster, we measure speed and can infer the level of activity,” Silberstein added.

Participants in Neuro-Insight studies are fitted with headsets and watch recorded material. For studies into TV advertising, for example, they watch a 30-minute TV show with ad breaks. And, as they watch, small sensors in the headset record electrical activity in the brain on a second-by-second basis.
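A second-by-second recording like the one described here is, in effect, a time series of activity samples that can be lined up against known segments of the viewing session. The sketch below is a hypothetical illustration of that alignment step only – the function, windows and values are invented, not Neuro-Insight’s SST analysis:

```python
# Hypothetical sketch: given one activity sample per second, pull out
# the samples that fall inside known ad-break windows and compare
# their average against the whole viewing session.

def mean(values):
    return sum(values) / len(values)

def ad_break_lift(samples, ad_windows):
    """samples: one activity value per second; ad_windows: (start, end)
    pairs in seconds. Returns the ratio of mean ad-break activity to
    the mean over the full session (>1.0 means elevated activity)."""
    in_ads = [s for t, s in enumerate(samples)
              if any(start <= t < end for start, end in ad_windows)]
    return mean(in_ads) / mean(samples)

# 20 'seconds' of toy data with an ad break from t=5 to t=10.
samples = [1.0] * 5 + [2.0] * 5 + [1.0] * 10
print(ad_break_lift(samples, [(5, 10)]))  # prints 1.6
```

The real analysis is per brain region and per participant, but the underlying operation – slicing a second-by-second signal by programme segment – is the same.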

In order to maintain a normal viewing experience and not cue participants in to the test material, they are not told specifically what the research is about or who Neuro-Insight is working for. Instead, they are simply told the company is interested in their response to television viewing.

‘If it doesn’t get into memory, it can’t impact future behavior’

The most important brain processes for brands are emotion and memory, according to Andrew – in part because what the brain sends to long-term memory storage impacts future consumer behavior.

“The big problem is if you ask [a consumer] why they made a decision of what they’re thinking, you get a rationalized response and that can be inaccurate for various reasons – they don’t want to tell you [or] they don’t know what’s driving the behavior,” Andrew said. “If you look at brains, that’s why memory code is such a key for us. If it doesn’t get into memory, it can’t impact future behavior…but, further than that, brains are selective about what goes into memory. It’s what the brain has subconsciously identified as something useful.”

Per Silberstein, when you ask someone if they can remember their last birthday, certain parts of the brain light up and if you ask what they will do on their next birthday, many of the same regions light up again.

“The view is the role of memory is not a perfect record, but a guide to your future behavior,” he said. “If something is strongly encoded in your long-term memory, it is relevant to future behavior. Your brain is queuing that for future action.”

What’s more, while our brains follow a narrative, they are essentially collecting snapshots. When they see a signal that the story is coming to an end, the brain wants to bundle those snapshots and file them away for conceptual closure – meaning that for a second or two, our brains are relatively unreceptive to new information. That could be bad news if, say, a brand logo appears in that particular moment.

“It’s not about subliminal advertising – it’s working with the way the brain works,” she added.

Per Andrew, Neuro-Insight often works toward the end of the creative process to help identify the words, images and ideas that are resonating most strongly.

“It can’t generate a creative idea…but it can look at where and how branding is delivered or how to leverage TV advertising across other media,” Andrew said. “One important thing about how memory works in the brain – it’s not like a video camera. If you watch a 30-second TV ad, it’s more like a series of JPEGs – snapshots. The brain takes those and retells the story. That’s all the brain’s got – snapshots are really important [because] if you know where the snapshots are, those scenes would work really well in other media.”

In a study with UK TV network ITV, for example, Neuro-Insight tracked viewers’ responses to the Britain’s Got Talent app and found app usage enhanced the TV viewing experience by 16% because those who viewed both TV and app content had higher enjoyment levels. App users also showed stronger brain response to on-screen TV activities and a 15% increase in brain response to TV ad breaks. App users showed a 43% increase in brain response to in-app ads versus TV ads and those who spent more time using the app showed stronger brain responses for on-screen TV ads.

‘You have this unbelievably precise tool that creates custom marketing for every single person’

Andrew said she expects to see brain research used as a complement to other methodologies moving forward.

“Neuroscience research is a versatile tool – it can complement other research methodologies and works across various different media or creative briefs,” she said. “The insights it can produce are instructive because they are based on immediate brain response, rather than secondary responses from written, verbal or physical signals. Those aspects make neuroscience an increasingly sought-after tool for marketers and we’ll only see this grow in the near future.”

It will also yield increasingly custom experiences when combined with technologies like facial recognition and AI.

“Where this gets kind of scary is we have the ability to manipulate perceptions of things because we know how brains will respond to them,” Snyder said.

In other words, let’s say a soda company wants to target females ages 18 to 30 who look sad.

“So if I can identify these people, I don’t need to know who you are and then I know from neuromarketing those most susceptible to the behaviors I want to cause,” Snyder said. “When you use neuromarketing in combination with deep learning and AI and have that hooked into big data and other sorts of technology like facial recognition, you have this unbelievably precise tool that creates custom marketing for every single person.”

Snyder said this means marketing technology is adapting to human emotions.

“That’s what this is about. It’s understanding that whoever is closest to the consumer controls the conversation,” he said. “What this technology does is help brand marketers get as close as possible – and understand mind state. By using systems and technologies like the Internet of Things, persistent connectivity and machine learning, all of a sudden the world begins to adapt to you in ways that are most compelling. Marketing is moving from ‘live’ to ‘living,’ and these technologies form the foundation for this shift.”
