Here’s how marketing research is looking past consumers’ words and connecting with their bodies

Eye-tracking, brainwaves, facial expressions and other non-verbal responses are giving marketers an inside track to your real feelings.


A lab at iMotions

In their never-ending quest to understand which ads work best, marketers are now going beyond asking viewers for their reactions.

Instead, they are asking their bodies.

Ad agencies and researchers are regularly analyzing eye-tracking to measure attention, facial expressions to determine positive or negative emotions, galvanic skin response to gauge the intensity of emotions, brainwaves for determining patterns of mental responses, and other signals to get the kind of unfiltered info that conscious answers can’t supply.

This type of measurement offers “unbiased, truthful customer reactions,” Diana Lucaci, CEO of bio/neural research firm True Impact, told me via email.

Her company employs magnetic resonance imaging (MRI), electroencephalography (EEG) and eye-tracking to measure responses to ads, movies, retail environments, branding and other choices that businesses make. As the Canadian company writes on its website:

“The popular methods of market research — focus groups and surveys — rely on participants verbally expressing their feelings about a product or brand. However, feelings are hidden from the conscious mind, and very hard to express. This presents a problem for advertisers, market researchers and marketers, who try to understand what people want, and deliver a product or service that addresses a real need.”

Every advertiser knows that it’s all about attention, emotion and intensity. Directly tracking attention and emotion can remove one more layer of interpretation.

“People act on how they feel,” Lucaci pointed out, “not always what they say.”

Increasing emotion increases attention, Hans Lee, CEO of eye-tracking firm Sticky, told me, and increased attention leads to more sales. Here’s a screen showing Sticky’s eye-tracking:

Sticky’s eye-tracking display

These connections have fueled a boom in efforts to get customers’ bodies to tell marketers what their words cannot.

New York City-based mobile marketer MediaBrix, for instance, regularly uses eye-tracking, heart-rate and brainwave monitoring to measure engagement and motivation, in order to see which ad formats work best.

Mobile ad provider Kargo recently presented a study, conducted with neuroscience research firm MediaScience, that similarly used eye-tracking and other bodily feedback to determine which of several mobile ad formats generated the best response.

London-based Unruly has assembled a team of vendors — including Nielsen for brainwaves and Affectiva for recognizing emotions through facial expressions — to create a new, combined measurement of ad effectiveness.

But, even as the use of bio- and neural feedback grows, MediaScience founder and CEO Dr. Duane Varan told me that this is still “a very early stage,” and he cautioned against reading too much into the measurements at this point.

There’s “a lot of abuse in the market” about claims and conclusions, he said, adding that the field needs to set up standards about terms, measurements and transparency.

First of all, Varan pointed out that “everyone is psychologically unique,” requiring individual calibrations that can eventually be done at scale.

And then there’s the issue of emotional or expressive subtlety. Labs can reliably determine broad reactions, he said, like whether someone is bored or excited.

EEG headgear in MediaBrix’s lab

Or you can tell with what he called “incredible accuracy” whether someone finds a given scene funny by tracking the zygomaticus muscle at the corners of the mouth, which helps produce a smile. His firm, he said, was able to accurately rank 12 out of 14 new TV situation comedies last fall in terms of their later success, in large part by tracking that muscle.

Similarly, the corrugator muscle draws the eyebrows together, so tracking it can reliably indicate confusion or a negative reaction by how strongly a viewer is furrowing their brow.
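
To make that concrete, here is a minimal, hypothetical Python sketch of how per-frame muscle-activation readings might be mapped to the kind of coarse labels Varan describes. The action-unit fields, thresholds and data are illustrative assumptions, not any vendor’s actual model.

```python
# Hypothetical sketch: turning per-frame facial-coding output into coarse labels.
# AU12 (lip corner puller) is driven largely by the zygomaticus muscle;
# AU4 (brow lowerer) by the corrugator. Thresholds and fields are invented.

from dataclasses import dataclass

@dataclass
class FrameReading:
    timestamp: float   # seconds into the ad
    au12: float        # smile intensity (zygomaticus), 0.0-1.0
    au4: float         # brow-furrow intensity (corrugator), 0.0-1.0

def label_reaction(frame: FrameReading) -> str:
    """Map raw muscle-activation intensities to a coarse reaction label."""
    if frame.au12 > 0.6 and frame.au4 < 0.3:
        return "amused"                 # strong smile, relaxed brow
    if frame.au4 > 0.6:
        return "confused or negative"   # heavily furrowed brow
    return "neutral"

# Example: a few readings from one viewer watching a 30-second spot.
readings = [
    FrameReading(timestamp=1.0, au12=0.1, au4=0.1),
    FrameReading(timestamp=12.5, au12=0.8, au4=0.1),   # punchline lands
    FrameReading(timestamp=24.0, au12=0.2, au4=0.7),   # confusing product shot
]

for r in readings:
    print(f"{r.timestamp:5.1f}s  {label_reaction(r)}")
```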

But beyond those broad-brush detections, he indicated, there are many kinds of responses that are more subtle, complex and interwoven. The criteria for determining whether someone is engaged with an ad or a piece of content, for instance, are not clear.

That’s just the complexity of the response. Varan pointed out that stimuli can also be interconnected and complex, like trying to determine if a man looking at a video ad of a woman in a bikini holding a Coke can is showing interest in the woman, is thirsty for the drink, or both.

And, while stimuli lead to physiological responses, and responses lead to actions, that chain becomes complex when you try to extrapolate from a measured response to real-world behavior. You might get sad watching a video ad, for example, but what are the steps that translate that sadness into a purchase decision?

A key challenge, he said, is that there are no industry-wide definitions of which measurements indicate which characterizations, like engagement versus attention, or disgust versus annoyance. Until there is agreed-upon transparency, Varan said, the industry cannot live up to expectations.

He pointed to an article he co-authored last year for the Journal of Advertising Research, in which eight neuroscience vendors analyzed the same eight TV ads for emotions and engagement — and came up with eight different analyses.

But, even as standards and transparency have yet to mature, the technology is rapidly evolving into smaller and more affordable devices, as might be expected.

This technological evolution will lead all major companies to have internal labs for measuring feedback to their ads and marketing, according to Peter Hartzbech, CEO and founder of biosensor research platform iMotions.

Smaller, cheaper sensors are also leading to much larger panels of users in their own homes, increasing the breadth of data collection, the speed of launching a new study and the ability to test reactions in real-world conditions.

One of the main providers of eye-tracking, Tobii, is now working on what it says is the first large-scale infrared eye-tracking panel, covering a thousand households, company President Tom Englund told me.

Each participating household receives from Tobii an infrared eye tracker, which connects to a computer via USB. In one planned study for the new panel, Tobii is working with European ad tech firm Adssets to measure responses to ads on specific websites. The website’s cookies on the computers of the opt-in participants are matched to their individual identities.
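
As a loose illustration of that matching step, here is a small hypothetical Python sketch that joins consented cookie IDs to eye-tracking sessions. All field names and values are invented for illustration and are not Tobii’s or Adssets’ actual schema.

```python
# Hypothetical sketch: joining opt-in panelists' cookie IDs to gaze sessions.
# Field names and data are invented for illustration.

panelists = {
    # cookie_id -> panelist_id, collected with the participant's consent
    "ck_9f31": "panelist_001",
    "ck_77ab": "panelist_002",
}

gaze_sessions = [
    {"cookie_id": "ck_9f31", "url": "https://example.com/article", "seconds_on_ad": 3.2},
    {"cookie_id": "ck_0000", "url": "https://example.com/home", "seconds_on_ad": 0.0},
]

# Keep only sessions whose cookie belongs to a known panelist, and attach
# the panelist's identity so gaze data can be tied back to that household.
matched = [
    {**session, "panelist_id": panelists[session["cookie_id"]]}
    for session in gaze_sessions
    if session["cookie_id"] in panelists
]

print(matched)
```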

In addition to cheaper infrared eye-trackers, Unruly VP of Insight Ian Forrester expects that cheaper EEG headsets for brainwave measurement will similarly be sent to participating homes at some point, creating large panels that can provide much more data than the small groups Unruly currently runs.

Already, companies like Sticky conduct eye-tracking and monitor facial responses remotely via webcams. Researchers say infrared eye-tracking can generate more granular results, but webcam tracking also produces usable data.

This kind of mass-sensing might eventually even tap into, say, the many exercise heartbeat trackers out there. And, if brainwave navigation for games and other computing activity ever catches on, those devices might also provide the hardware installed base to conduct massive testing with that technology.

In any case, there’s the distinct possibility of a new kind of online ad targeting, where ad creation and massive in-the-home biological feedback are connected into a kind of feedback platform.

Advertisers could generate their ads from component parts (as many do now), get feedback on emotional and other responses within minutes or possibly in real time — and then dynamically and almost immediately serve up another generation of those ads, tweaked for more effective emotional responses.

Two video ads for a new beer, for example, could be tested for their emotional responses, second by second. Then a new ad could be generated and served right away, combining the most emotionally effective or the most attention-getting scenes from each.
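
As a rough sketch of how that recombination might work, here is a hypothetical Python example that keeps, for each slot in the ad’s structure, whichever variant’s scene scored higher on the emotional measure. The scene names and scores are invented for illustration.

```python
# Hypothetical sketch: assembling a new cut from two tested ad variants.
# Each variant is a list of (scene_id, emotion_score) pairs, one per slot
# in the ad's structure; in practice the scores would be aggregated from
# second-by-second biometric responses. All values here are invented.

ad_a = [("a_open", 0.42), ("a_product", 0.55), ("a_tagline", 0.38)]
ad_b = [("b_open", 0.61), ("b_product", 0.47), ("b_tagline", 0.52)]

def assemble_best_cut(variant_a, variant_b):
    """For each slot, keep whichever variant's scene scored higher."""
    new_cut = []
    for (scene_a, score_a), (scene_b, score_b) in zip(variant_a, variant_b):
        new_cut.append(scene_a if score_a >= score_b else scene_b)
    return new_cut

print(assemble_best_cut(ad_a, ad_b))   # ['b_open', 'a_product', 'b_tagline']
```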

Static ads are already dynamically assembled from pictorial and textual elements. Animated ads could be generated on the fly, or video ads could be assembled from a library of shots.

In other words, this feedback machine could make today’s profile- and segment-based targeting seem quaint, as successive waves of ads and other content are sent out in immediate response to massive, immediate and involuntary viewer feedback.

The technological pieces to build such a bio- and neural feedback ad platform exist now, Unruly’s Forrester said.

It could finally achieve the goal marketers have probably always wanted, deep down: ads that change according to how you feel about them.


Opinions expressed in this article are those of the guest author and not necessarily MarTech.


About the author

Barry Levine
Contributor
Barry Levine covers marketing technology for Third Door Media. Previously, he covered this space as a Senior Writer for VentureBeat, and he has written about these and other tech subjects for such publications as CMSWire and NewsFactor. He founded and led the web site/unit at PBS station Thirteen/WNET; worked as an online Senior Producer/writer for Viacom; created a successful interactive game, PLAY IT BY EAR: The First CD Game; founded and led an independent film showcase, CENTER SCREEN, based at Harvard and M.I.T.; and served over five years as a consultant to the M.I.T. Media Lab. You can find him at LinkedIn, and on Twitter at xBarryLevine.
