Over the past year or so I've made it a point to ask every media/advertising expert I've met for their definition of "engagement." It's one of those concepts with a fuzzy, ephemeral feel to me, and it's often hard to talk about because it means something different to every group of people. My personal feeling is that engagement is the thing you try to measure when you can't measure anything else -- you know, real metrics like impressions, sales conversions, whatever -- but I know there are people who make good money focusing on viewer/customer engagement, whether that means improving store layouts, changing the content of TV commercials, or writing books on ... engagement.
Fortunately, the folks at MIT have been struggling with the concept as well (which makes me feel slightly less stupid), and in the grand tradition of stuff coming from MIT, their solution to the puzzle is to objectively quantify the physiological states and changes associated with being engaged. According to this article from New Scientist (which is a year old, but which I only came across recently), a device originally developed to help people with autism better "read" body language and thus interact with others is now being looked at as a marketing tool to help advertisers and developers see when viewers/shoppers are actively engaged with a given target, whether it be a TV commercial, a product on a shelf, or even a store associate:
[The] software, developed with Peter Robinson at the University of Cambridge, could detect whether someone is agreeing, disagreeing, concentrating, thinking, unsure or interested, just from a few seconds of video footage. Previous computer programs have only detected the six more basic emotional states of happiness, sadness, anger, fear, surprise and disgust. El Kaliouby's complex states are more useful because they come up more frequently in conversation, but are also harder to detect, because they are conveyed in a sequence of movements rather than a single expression.
[MIT Media Lab researcher Rana El Kaliouby's] program is based on a machine-learning algorithm that she trained by showing it more than 100 8-second video clips of actors expressing particular emotions. The software picks out movements of the eyebrows, lips and nose, and tracks head movements such as tilting, nodding and shaking, which it then associates with the emotion the actor was showing. When presented with fresh video clips, the software gets people's emotions right 90 per cent of the time when the clips are of actors, and 64 per cent of the time on footage of ordinary people.
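The article doesn't describe El Kaliouby's actual model, but the basic idea -- track facial-feature movements over a short sequence of frames, then match the sequence against states learned from labeled training clips -- can be sketched with a toy nearest-prototype classifier. Everything below is invented for illustration (the feature names, the state labels, and all of the numbers); the real system uses a proper machine-learning algorithm over tracked facial landmarks.

```python
# Toy sketch: classify a short sequence of facial-feature measurements
# into a mental state by comparing it to per-state "prototype" sequences
# averaged from labeled training clips. Features and labels are hypothetical.

def average_sequences(clips):
    """Element-wise average of equal-length sequences of feature tuples."""
    n, length, dims = len(clips), len(clips[0]), len(clips[0][0])
    return [
        tuple(sum(clip[t][d] for clip in clips) / n for d in range(dims))
        for t in range(length)
    ]

def distance(seq_a, seq_b):
    """Sum of squared feature differences across the whole sequence."""
    return sum(
        (a - b) ** 2
        for frame_a, frame_b in zip(seq_a, seq_b)
        for a, b in zip(frame_a, frame_b)
    )

def classify(prototypes, clip):
    """Return the state whose prototype sequence is closest to the clip."""
    return min(prototypes, key=lambda state: distance(prototypes[state], clip))

# Hypothetical training data: 3 frames per clip, each frame a tuple of
# (eyebrow_raise, lip_corner_pull, head_nod_velocity).
training = {
    "agreeing": [[(0.1, 0.5, 0.9), (0.1, 0.6, 0.8), (0.2, 0.5, 0.9)],
                 [(0.0, 0.4, 1.0), (0.1, 0.5, 0.9), (0.1, 0.4, 0.8)]],
    "confused": [[(0.8, 0.1, 0.0), (0.9, 0.1, 0.1), (0.8, 0.0, 0.0)],
                 [(0.7, 0.2, 0.1), (0.8, 0.1, 0.0), (0.9, 0.1, 0.1)]],
}
prototypes = {state: average_sequences(clips) for state, clips in training.items()}

# A fresh clip: strong nodding with a slight smile reads as agreement.
new_clip = [(0.1, 0.5, 0.85), (0.15, 0.55, 0.9), (0.1, 0.45, 0.85)]
print(classify(prototypes, new_clip))  # → agreeing
```

The point the quote makes about complex states being "conveyed in a sequence of movements rather than a single expression" shows up here in the distance function, which compares whole frame sequences rather than individual snapshots.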
I think this kind of technology could gain a lot more traction than the current state of the art in neuromarketing: fMRI and EEG studies, which require subjects to lie in a scanner or wear a cap of sensors in a lab. On the other hand, a more sophisticated version of this device could be used innocuously -- maybe even feeding off of security camera feeds -- to track and train customer behavior without our knowledge, whereas it's pretty easy to tell when you're in somebody's lab with a bunch of sensors stuck on your head. So score one for technology-driven marketing projects, but privacy-conscious folks might want to start getting their tinfoil hats ready, just in case.

Unrelated: I find it odd that I've nabbed two marketing-related articles from science site NewScientist recently. The other one went into a post on interactive paper and wireless power (yes, really) just a few days ago.
Tags: viewer engagement, out-of-home advertising, marketing