A Snapchat lens for food shopping
How could we use augmented reality to help people make better decisions at the grocery store?
Would it have changed my behavior this morning to know how much sugar was in that sticky pecan bun? Maybe not. Sometimes we want what we want, but sometimes we can be pushed.
What if I could see nutrition and allergy information superimposed on foods before I bought and ate them? This article is about building that augmented reality future by using the phone camera in your hand to influence and simplify the hundreds of decisions made every time we buy food.
Prototyping the future of AR glasses with today’s phones.
My book, SuperSight, explains how AR glasses will provide decision support and guidance like a coach sitting on your shoulder all day long. The same way we’ve come to rely on GPS, these glasses will offer the equivalent of turn-by-turn navigation for food, work, and DIY projects, and even conversational guidance.
Most people don’t wear smart glasses yet, but we can use smartphones to prototype that inevitable future. Computer vision apps help us identify plants, place virtual IKEA sofas in our living rooms, and add bunny ears to our selfies. Snapchat announced last week at the Augmented World Expo that 250 million people use AR on their phones every day. Let’s use this technology for something bigger than throwing up rainbows!
Modern phones can also “read” food packaging and then show you information that helps you make a better choice. This kind of micro-decision aid may be one of the best uses of AR. With a little Candy Crush-style gamification, making pro-health choices might actually be a little fun.
Families face a bewildering minefield of decisions every week when deciding what to buy and what to eat. I met Ruchi S. Gupta, MD, a rock star in the food and allergy world who leads a research group at Northwestern. She explained that food choices are multi-layered and complex. They feel unimportant in the moment, but the consequences are huge. The food we buy determines how we snack, how much time we spend preparing meals or deciding to order out. It affects our focus, how we feel, and how we connect. Food is the second-largest household budget item after housing, yet we don’t review it on a daily basis. Nudging consumer behavior around food could help people make changes that improve their mood and waistline.
Ruchi encouraged me to experiment with how augmented reality could be useful. I worked with one of her grad students, an AR designer, and a programmer to prototype a new service called BetterChoice. We tested it at a local Whole Foods Market in Brookline, MA.
The augmented reality filter decorates the packet with data.
Here’s the experiment: you pick up a box of granola. The phone recognizes the package in your hand, looks up its nutritional data, compares it to your profile, and summarizes key information with five colorful icons that represent how it matches your profile: allergens, nutrition, financial value, customer ratings, and a sustainability metric. When you press the “best choice” button, it visually swaps the product you’re holding for the best choice in that food category, based on what you put in your profile (more fiber, no shellfish, nut allergy, etc.).
To keep the interface simple and persuasive, we select the most compelling points of comparison and reasons to follow our advice. When you hold up a package of granola cereal, it displays a gluten-free alternative: locally produced in Vermont, highly rated, and available a few shelves away in the store.
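The recognize-compare-swap flow described above can be sketched in a few lines. This is a minimal illustration, not the production algorithm: the data model, field names, and scoring formula are all assumptions for the sake of the example.

```python
# Hypothetical profile: hard allergen constraints plus soft preferences.
PROFILE = {
    "avoid_allergens": {"gluten", "shellfish"},
    "prefer": {"high_fiber", "low_sugar"},
}

# Hypothetical product records for one food category (granola).
PRODUCTS = [
    {"name": "Sticky Pecan Granola", "allergens": {"gluten", "tree nuts"},
     "tags": {"low_sugar"}, "rating": 4.1},
    {"name": "Vermont Oat Granola", "allergens": set(),
     "tags": {"high_fiber", "low_sugar"}, "rating": 4.7},
]

def score(product, profile):
    """Return None if the product violates a hard constraint (allergen);
    otherwise a simple score mixing preference matches and ratings."""
    if product["allergens"] & profile["avoid_allergens"]:
        return None
    return len(product["tags"] & profile["prefer"]) + product["rating"] / 5

def best_choice(products, profile):
    """Pick the highest-scoring product the shopper can actually eat."""
    ranked = [(score(p, profile), p) for p in products]
    ranked = [(s, p) for s, p in ranked if s is not None]
    return max(ranked, key=lambda sp: sp[0])[1] if ranked else None

print(best_choice(PRODUCTS, PROFILE)["name"])  # → Vermont Oat Granola
```

The key design choice is that allergens are treated as hard filters (a violating product is never recommended), while nutrition tags and ratings only reorder the survivors.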
Glanceability is the killer app
Aggregating the information about millions of consumer packaged products, including their allergen and nutritional data, price, and availability, is key to powering this experience. My friend David Goodtree from FoodMap is working on this big-data fusion problem. For our prototype, though, we focused on the customer experience: how to synthesize and express the data in a glanceable, actionable way.
There are a million websites, blogs, YouTube channels, and dense food labels that provide plenty of information for families. But when you’re standing in the market aisle, blocking the way for other shoppers, with an impatient toddler in tow, you need guidance quickly or not at all. Even reading food labels is inconvenient while shopping. Our goal is to summarize trustworthy information so buyers can make informed decisions quickly.
The need for customization
My mother is gluten-free; my daughter is vegan; my wife is pescatarian; I look for foods low in carbs, low in salt, and high in protein; and the friends we host keep kosher. Other families have even finer filters. This web of requirements is hard for humans to track while shopping, but easy for algorithms.
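That “web of requirements” is essentially a union of per-person rules. Here is a hedged sketch of how a household’s overlapping dietary constraints might be merged into a single check; the household members and rule tags are illustrative, not from a real schema.

```python
# Hypothetical household: each member contributes a set of dietary rules.
HOUSEHOLD = {
    "mom":      {"no_gluten"},
    "daughter": {"vegan"},
    "wife":     {"pescatarian"},
    "me":       {"low_carb", "low_salt", "high_protein"},
    "guests":   {"kosher"},
}

def combined_rules(household):
    """Union every member's rules into one set the algorithm checks."""
    rules = set()
    for member_rules in household.values():
        rules |= member_rules
    return rules

def satisfies(product_tags, household):
    # A product works for everyone only if it carries every required tag.
    return combined_rules(household) <= set(product_tags)
```

For a stricter household you would check each member separately (a vegan item need not be low-carb for everyone), but even this crude union shows why the bookkeeping that overwhelms a shopper is trivial set arithmetic for a phone.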
Trade-offs and multi-channel experiences
No big surprise: buying local is expensive. BetterChoice reveals the most sustainable option, but it’s usually also the most expensive unless you buy in bulk.
Given enough information to feel confident in the choice, would people be interested in group purchases or a subscription? Who wants to haul home dog food or a big bag of flour, rice, or other heavy packages, especially when the bulk value equation is more appealing?
A more sophisticated form of computer vision uses scene understanding to perform the inverse of augmented reality: recognizing and quietly removing objects. In cluttered environments like a store, this diminished-reality technique can be more valuable than augmentation.
For example, we might remove from your field of view any items that do not match your BetterChoice profile, so that whatever remains on the shelf is a good match.
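The diminished-reality pass described above boils down to inverting the usual filter: instead of highlighting matches, you collect the screen regions of the non-matches and hand them to the renderer to blur or desaturate. The detection format and profile check below are assumptions for illustration only.

```python
def regions_to_dim(detections, matches_profile):
    """Return bounding boxes of products that fail the profile check,
    so the renderer can visually mute them (blur, desaturate, etc.)."""
    return [d["bbox"] for d in detections
            if not matches_profile(d["product_id"])]

# Example shelf scan: two detected products with screen-space boxes.
detections = [
    {"product_id": "A", "bbox": (0, 0, 50, 80)},
    {"product_id": "B", "bbox": (60, 0, 110, 80)},
]

# Suppose only product "B" matches the shopper's profile.
print(regions_to_dim(detections, lambda pid: pid == "B"))
# → [(0, 0, 50, 80)]
```

In a real lens the `matches_profile` predicate would call the same scoring logic that drives the “best choice” swap, and the boxes would come from the on-device object detector.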
After building our prototype, we went to Whole Foods and got people’s feedback.
Here’s what we learned.
- People are overwhelmed when shopping and want to be guided.
- Allergens were the most pressing issue.
- People think they would use such a tool, and would choose to have certain items delivered if convinced that buying in bulk made financial sense.
- It was important for people to know that the product information came from a trusted source, not a paid promotion.
- The idea of getting a free sample to persuade you to try something new was intriguing.
Our next step is to expand product categories, improve the BetterChoice algorithm, and then roll out a broader test to more stores and geographies.
Which product brands would benefit the most from AR shopping advice?
Products with the best data, the best nutrition, the best customer reviews, and the best financial value; products that match people’s interests where that alignment is hard to see today; and new brands. Because we promote products based on their inherent attributes rather than brand recognition, big brands may have the most to lose.
I wonder whether such a tool would differentiate the shopping experience enough for people to choose one grocery store over another.