Smart Glasses: Emotion and Food Tracking

Source: tomsguide.com

Published on June 12, 2025

Smart Glasses Track Emotions and Eating Habits

I believe glasses will be the next great fitness wearable. I tried a prototype of new eyewear from Emteq Labs. It features sensors around the rims that detect subtle changes in your facial expressions, even ones you aren't consciously aware of making. This data, paired with AI, can become a personalized life coach for your fitness, diet, and emotional health. I spoke with Emteq CEO Steen Strand to see what the glasses can offer the average user.

How They Work

The glasses have nine sensors that identify facial movements to a near-microscopic degree. The sensors are dotted across the bottom of the lenses and are paired with AI. There are many potential uses for these glasses, such as using your face to interact with a computer or adding emotion to in-game characters. One use that stood out to me is health, both physical and emotional.
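Emteq hasn't published how its signal processing works, so here is a minimal sketch of the general idea: treat the nine rim sensors as normalized activation channels and map simple combinations of channels to named facial movements. The sensor names, rules, and threshold are all illustrative assumptions, not Emteq's actual pipeline.

```python
# Hypothetical sketch: the real sensor layout and processing are not public.
# Nine rim sensors are modeled as activation channels normalized to 0.0-1.0.

SENSOR_NAMES = [
    "brow_l", "brow_r", "eye_l", "eye_r", "cheek_l",
    "cheek_r", "mouth_l", "mouth_r", "jaw",
]

def detect_movements(readings: dict, threshold: float = 0.3) -> list:
    """Return the facial movements whose channels all exceed the threshold."""
    rules = {
        "smile": ["cheek_l", "cheek_r", "mouth_l", "mouth_r"],
        "brow_raise": ["brow_l", "brow_r"],
        "chew": ["jaw"],
    }
    return [name for name, channels in rules.items()
            if all(readings.get(ch, 0.0) > threshold for ch in channels)]
```

A frame where only the cheek and mouth channels are active would register as a smile; raising the detection threshold is the crude knob for trading sensitivity against false positives.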

Currently, health tracking via consumer tech is mostly limited to fitness routines and sleep. Good nutrition matters just as much. While apps can deliver nutritional information, Emteq's prototype setup is easy to use and rich in actionable detail.

The glasses' on-board camera takes a picture of your food, and ChatGPT-4o breaks it down into total calories and detailed macros. They also give you a chewing score: the sensors on the glasses can track biting and chewing speeds.
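The article doesn't say how the chewing score is computed, so here is one plausible sketch: log bite and chew events from the jaw sensors, then score pacing by chews per bite against a target. The `MealReading` structure, the target of 20 chews per bite, and the 0-100 scale are all assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class MealReading:
    """Hypothetical sensor log for one meal: event timestamps in seconds."""
    bite_times: list   # when the jaw sensors register a new bite
    chew_times: list   # individual chew events

def chewing_score(meal: MealReading, target_chews_per_bite: float = 20.0) -> float:
    """Score pacing from 0 to 100: more chews per bite, up to the target, scores higher."""
    if not meal.bite_times:
        return 0.0
    chews_per_bite = len(meal.chew_times) / len(meal.bite_times)
    return min(chews_per_bite / target_chews_per_bite, 1.0) * 100.0
```

A meal of three bites and 45 chews averages 15 chews per bite, 75% of the assumed target, so it would score 75.0.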

Steen added that AI can give you personalized guidance in real time. The glasses know how and what you're eating and use haptic feedback for notifications. They can also track activities like walking, running, and star jumps. With AI, this can give you a better understanding of your fitness levels.

Emotion Sensing

Emotion sensing is another piece of the puzzle. Current methods include journal prompts, heart rate tracking, and deep breathing exercises, but self-reports depend on people being honest with themselves. Understanding emotions can make AI more effective. Beyond assessing eating behaviors, other data points like mood detection and posture analysis can assess emotional context. The upper section of my face gave me away when I tried to fake a smile.
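That upper-face tell has a well-known basis: a genuine (Duchenne) smile engages the muscles around the eyes as well as the mouth, while a posed smile tends to move the mouth alone. A minimal sketch of that check, assuming normalized upper- and lower-face activation values and illustrative thresholds (not Emteq's actual classifier):

```python
def smile_authenticity(upper_face: float, lower_face: float,
                       smile_threshold: float = 0.5,
                       duchenne_threshold: float = 0.3) -> str:
    """Classify a smile as genuine, posed, or absent from two activation values.

    Inputs are hypothetical 0.0-1.0 activations: upper_face for the eye
    region, lower_face for the mouth. Thresholds are illustrative guesses.
    """
    if lower_face < smile_threshold:
        return "no smile"
    # Mouth is smiling; the eye region decides whether it reads as genuine.
    return "genuine" if upper_face >= duchenne_threshold else "posed"
```

A strong mouth signal with a flat eye region, which is roughly what the sensors saw when I forced a smile, would come back as "posed".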

Data can be added to ChatGPT for emotional support and therapy. Steen commented that understanding emotions is a force multiplier for AI. He added that for AI to be effective, it’s critical that it understands how you’re feeling in real-time, in response to different things that are happening around you.

This technology could lead to real emotional honesty that you may not get by rationalizing with yourself in a journal app. There are questions surrounding privacy and whether we want to be judged for our chewing. But the end result is more advanced than a smart ring. At Augmented World Expo (AWE), I saw how sensors and real-time data collection can guide you toward a better life, which is the clearest step towards personalizing XR.