The retail technology vendor is moving beyond paper shelf tags
Imagine: You’re walking down the beverage aisle at your local convenience store when you notice carbonated bubbles forming across the digital display at the shelf. The movement and colors attract your attention, so you move closer to the shelf to investigate. As you do, different types of cola cans from the brand featured on the shelf (e.g., diet cola, cherry cola) appear across the screen along with the brand’s marketing message. You reach for a 12-pack on the shelf and see the display content change again, this time alerting you that you can buy two packs for $10.
This is the picture Jan Murley, chief strategy and market advisor to Cloverleaf, painted when describing the retail technology company’s newest solution: shelfPoint.
ShelfPoint is a dynamic shelf solution that marries in-store consumer behavior with emotional intelligence to help marketers push content to shoppers and learn more about them. ShelfPoint uses an on-shelf, digital LCD strip as the platform and user interface, Murley says, and leverages a media player to serve content based on shoppers’ movements, which it can track through optical sensors.
The triggered content is designed to get shoppers to do three things, she notes: Stop, engage, and convert. She explains that Cloverleaf measures conversion by comparing shelfPoint’s traffic count to the retailer’s point-of-sale data.
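The conversion measurement Murley describes can be sketched as a simple ratio of point-of-sale purchases to shelf traffic. This is an illustrative assumption, not Cloverleaf's actual implementation; the function name and the sample numbers are hypothetical.

```python
def conversion_rate(shelf_traffic_count: int, pos_units_sold: int) -> float:
    """Share of shoppers counted at the shelf (via optical sensors)
    who show up as purchases in the retailer's point-of-sale data."""
    if shelf_traffic_count == 0:
        return 0.0  # no traffic counted, so no conversion to report
    return pos_units_sold / shelf_traffic_count

# Hypothetical example: 480 shoppers counted at the shelf, 72 units sold.
rate = conversion_rate(480, 72)
print(f"{rate:.1%}")  # → 15.0%
```

In practice a vendor would also have to align the two data streams by time window and SKU before dividing, which this sketch leaves out.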
In addition to engaging and converting shoppers, the platform is designed to help marketers learn more about them. Leveraging optical sensors, shelfPoint gathers anonymous demographic data, such as gender and ethnicity. The sensors can also track customers’ facial expressions and, through a partnership with emotional artificial intelligence company Affectiva, classify those expressions to analyze shopper sentiment.
However, Murley is quick to point out that shelfPoint does not take pictures of customers or store any personal information. When shelfPoint detects a shopper, it pixelates the image and then sends the metadata to the cloud, where the data is matched to a facial template. That data is then stored for only 24 hours.
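The privacy flow described above, where only anonymized metadata reaches the cloud and is purged after 24 hours, could be modeled along these lines. All class, field, and function names here are hypothetical assumptions for illustration, not Cloverleaf's code.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(hours=24)  # per the article, data lives only 24 hours

@dataclass
class ShopperMetadata:
    """Anonymous record sent to the cloud; no image or identity is stored."""
    gender: str        # anonymous demographic estimate
    ethnicity: str     # anonymous demographic estimate
    expression: str    # e.g. "smile", used for sentiment classification
    captured_at: datetime

def is_expired(record: ShopperMetadata, now: datetime) -> bool:
    """Records older than the 24-hour retention window should be purged."""
    return now - record.captured_at > RETENTION

record = ShopperMetadata(
    gender="unknown", ethnicity="unknown", expression="smile",
    captured_at=datetime(2017, 5, 1, 9, 0, tzinfo=timezone.utc),
)
# 25 hours later, the record is past its retention window.
print(is_expired(record, datetime(2017, 5, 2, 10, 0, tzinfo=timezone.utc)))  # True
```

The design choice worth noting is that pixelation happens on the device before anything is transmitted, so the cloud side only ever handles records like the one above.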
“We’re very concerned about the security and the privacy of all of those individuals,” she says.
Seth Grimes, president of natural language processing and sentiment analysis consultancy Alta Plana, notes that emotional intelligence is something that’s been around for years; however, people are now leveraging machines to decipher human feelings and perceptions.
“We’re trying to computerize everything,” he says, including sentiment analysis for humans’ facial expressions, product reviews, and social content.
From Murley’s perspective, emotional intelligence opens the door to more “honest” feedback; she argues that when consumers are asked for feedback directly, they sometimes say what they think marketers want to hear.
However, Grimes doesn’t agree. He considers unsolicited social media posts and product reviews “genuine expressions,” and says that facial recognition’s effectiveness can vary depending on whether it’s conducted in a lab setting or “in the wild.” He also says that marketers can run into bias issues if they don’t give their facial recognition technology enough training time to recognize people who fall outside the categories it was built to identify.
“Technology can do a lot of things,” he says. “But if you don’t understand the ‘gotchas’…you’re going to fall into traps.”
Photo source: Cloverleaf