Creative agency Emotive partners with AI pioneer Realeyes to optimise emotion in film comms
Creative agency Emotive and AI-powered emotion measurement company Realeyes today announced a strategic partnership to help Emotive clients maximise emotional intensity in film communications.
With over $35m raised in external financing and 20,000 videos tested in 78 countries, Realeyes has pioneered emotion technology and continued to refine it over the last decade. Its facial coding technology will be exclusively integrated into Emotive's proprietary Brand Resonance Dashboard, allowing Emotive to test the emotional intensity of long- and short-form creative on both mobile and desktop, as a whole and on a second-by-second basis, in combination with real-time social performance.
Says Simon Joyce, CEO of Emotive: “Since launch, Emotive has concentrated on leveraging data to increase creative impact. Realeyes, with their focus on deep, measurable insights into the role of emotion in creative effectiveness, is the perfect partner. Their combination of AI and facial coding to determine emotional impact brings an objective perspective to the heart of the Emotive creative process – a critical step in driving creative iteration.”
Says Mihkel Jäätma, CEO, Realeyes: “The power of emotion and attention measurement leads to better creative, better consumer experiences, and stronger business performance. Creative is the most important lever in driving marketing success, and now marketers and video creators can operationalise emotion and attention AI to inform and optimise their creative to drive greater ROI at scale.”
Emotive launched the partnership with Realeyes technology driving creative optimisation for two recent campaigns: Optus’ Change the Future They See and Google’s AFL Season In Search, with both spots delivering emotional intensity well above global benchmarks.
Check out Optus’ Change the Future They See campaign HERE
Check out Google’s AFL Season In Search campaign HERE
7 Comments
Maybe just don’t write shit fucking ads in the first place?
Robots make emotions.
This is the end.
Can Emotive, or Realeyes, please provide references to the peer-reviewed papers published in reputable scientific journals that support their claim?
Realeyes might be able to say someone looks happy, or sad – although often a look can be the exact opposite of a feeling.
For instance pain can trigger a smile.
My guesstimate is that the likelihood of Realeyes accurately identifying facial expressions would be 50/50.
But so what?
Of more concern is that Realeyes fails to acknowledge they’re only measuring a facial expression they ‘believe’ is associated with an emotion.
They’re not measuring an emotion.
Realeyes does not and cannot measure how a respondent is responding emotionally.
It’s not measuring how the brain is responding, what’s being committed to memory, or how one soundtrack/VO compares to another.
And as has been proven countless times, a negative response can have the greatest impact on memory and create the most powerful and positive behavioural response.
I fear Realeyes is just another of those pseudo-scientific thingamejigs that sounds amazing, but will soon disappear.
April 1?
Wonder how their own Realeyes sizzle reel tested?
What a load of shit. Unfortunately, an ignorant, weak-minded client may buy into this. Also, why do they have work in their reel that is not theirs?
Merry Christmas!