The four pillars of health are sleep, exercise, emotional well-being, and a balanced diet. When it comes to health, we often focus on what we eat. But growing research shows that how we eat matters just as much.

Large-scale studies have found that fast eaters are more likely to be obese and to develop metabolic syndrome than those who eat more slowly [6]. Similarly, eating more slowly has been shown to increase satiety and reduce energy intake: participants who ate slowly reported greater fullness and consumed fewer calories than those who ate quickly [7].

Chewing rate, meal duration, and bite count are now recognized as important factors linked to weight management, digestion, and long-term metabolic health. Despite this, most dietary tracking tools, such as apps, food diaries, and meal logs, rely on outdated manual methods that fail to capture these behaviors accurately. They depend on user recall, self-reporting, and multi-step processes that provide only rough estimates [2, 6].

At Emteq Labs, we are addressing both of these challenges by pioneering smart glasses that track not only what people eat, but also how they eat. We achieve this by accurately capturing chewing rate, meal duration, and bite count – metrics that traditional tools cannot match. By replacing outdated self-report methods with objective, automated measurement from OCOsense™, we transform guesswork into real data and provide clear insights into everyday eating behaviors.

Meet OCOsense™

Think of how financial transactions have changed over the past two decades. From writing checks and counting cash to today’s seamless tap-to-pay systems, payments have gone from slow and deliberate to instant and invisible. Food logging is poised for a similar transformation.

To understand how the technology fits together, think of OCOsense™ as the Intel Inside® equivalent of our smart glasses. Just as Intel processors power laptops made by many different brands, OCOsense™ is the facial intelligence system that powers our smart glasses models. The OCO™ sensors are adapted optical modules that capture subtle skin movements, while OCOsense™ integrates these sensors with firmware and processing. The hardware products themselves – whether research models like V1, V2, or S0.5, or the upcoming consumer-facing Sense® smart glasses – are all powered by OCOsense™, making invisible eating behaviors measurable.

Current manual and smartphone-based eating tracking methods not only interrupt meals but also contradict the principles of mindful eating, transforming a simple meal into a cumbersome, multi-step process that disrupts the natural enjoyment of eating.

How It Works

We developed the world's first facial neuromotor interface device, leveraging years of research in facial electromyography (EMG). As EMG requires electrical contact and is unsuitable for glasses, we pioneered a new sensor category called facial optomyography (OMG). Our patented OCO™ sensors are seamlessly integrated into the frame of smart glasses, discreetly detecting subtle facial muscle movements during chewing. Positioned at the temples, these non-contact sensors capture muscle activation, while an AI model accurately distinguishes chewing from other facial actions, such as smiling or talking [1]. At the core of this capability is MealID, an OCOsense™ algorithm for real-time eating detection. As soon as chewing begins, MealID flags the start of a meal and automatically triggers logging. Food tracking shifts from a manual, burdensome task to an effortless and invisible process that safeguards user privacy [2].

By combining contactless optomyography (OMG) sensing with on-device AI, the OCOsense™ platform provides a precise, natural, and hands-free way to monitor eating behavior. With no buttons to press, no photos to take, and no tedious self-logging required, dietary habits are captured seamlessly in everyday life – allowing meals to be enjoyed without interruption [2, 3].
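To make the detection flow above concrete, here is a deliberately simplified sketch, not Emteq's actual MealID implementation: classify fixed-length windows of a facial-muscle signal, then flag the start of a meal once several consecutive windows look like chewing. The threshold classifier stands in for the trained AI model, and all names and values are illustrative.

```python
# Hypothetical sketch of MealID-style eating detection.
# A toy variance threshold stands in for the trained chewing classifier.
import math
from statistics import pstdev

def is_chewing(window, amplitude_threshold=0.5):
    """Toy classifier: chewing appears as rhythmic, high-variance motion.
    A real system would use a trained model, not a fixed threshold."""
    return pstdev(window) > amplitude_threshold

def detect_meal_start(signal, window_size=50, consecutive_required=3):
    """Return the sample index where a meal is first flagged, or None."""
    streak = 0
    for start in range(0, len(signal) - window_size + 1, window_size):
        window = signal[start:start + window_size]
        if is_chewing(window):
            streak += 1
            if streak >= consecutive_required:
                return start  # a real system would trigger logging here
        else:
            streak = 0  # chewing interrupted; reset the streak
    return None

# Synthetic signal: flat baseline followed by an oscillating "chewing" segment.
baseline = [0.0] * 100
chewing = [math.sin(i / 2.0) * 2.0 for i in range(200)]
signal = baseline + chewing
print(detect_meal_start(signal))
```

Requiring several consecutive chewing windows before flagging a meal is one simple way to avoid false triggers from brief facial movements such as a smile or a single word.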

Why Chewing Behavior Matters

Research shows that fast eaters are more likely to overeat, gain weight, and develop health issues like obesity and type 2 diabetes [6]. In contrast, slower, more mindful chewing supports better digestion and a stronger feeling of fullness [7].

Chewing is more than just breaking down food; it initiates a cascade of physiological responses that shape how the body processes a meal:

  • Satiety hormones: Slower chewing stimulates the release of hormones such as GLP-1 and peptide YY, which signal fullness to the brain and help regulate appetite.

  • Digestion efficiency: Prolonged chewing increases saliva production, which contains enzymes that begin breaking down carbohydrates and support smoother digestion further along the gastrointestinal tract.

  • Calorie intake regulation: When eating quickly, the body has less time to register fullness, often leading to excess calorie consumption before satiety signals can catch up.

  • Metabolic outcomes: Over time, these small differences in eating rate and meal duration accumulate, influencing body weight, metabolic health, and risks of chronic disease.

Yet these subtle eating patterns are nearly impossible to track accurately with conventional tools. While some mobile applications can tell you what you ate after you manually log food images (assuming you remember!), they cannot capture how you eat. What goes unrecorded is the pace of a meal, its duration, contextual factors such as distraction, and the underlying patterns that shape eating behavior [4].

Consistent monitoring of these behaviors makes a difference. Studies show that individuals who regularly track what and how they eat are much more likely to reach and maintain their health goals. A peer-reviewed study demonstrated that individuals who consistently tracked their dietary intake achieved greater weight loss and sustained better long-term adherence compared to those who did not [8]. Overall, those who consistently log their meals are 3.5 times more likely to lose weight [9]. The OCOsense™ platform, including its latest consumer-focused model, Sense Lite, automates this process to deliver precise, objective insights into chewing, meal timing, and eating dynamics [2, 6], promoting healthier choices in a sustainable, discreet, and elegant way.

GIF 1: OCOsense™ chewing detection

Tested in the Lab and Beyond

In one of the largest validation studies of its kind with over 100 participants, smart glasses powered by the OCOsense™ system accurately detected chewing episodes, confirmed by synchronized video and manual annotations—the gold standard in behavioral research. The system achieves accuracy rates exceeding 90%, reliably differentiating chewing from other facial movements, such as speaking.

The OCOsense™ system transforms dietary tracking by surpassing traditional methods (self-reporting, motion sensors, and image capture). Unlike these conventional approaches, which often confuse eating with unrelated actions, OCOsense™ detects chewing directly at its source – subtle facial muscle activity – using OCO™ sensors and on-device AI. This allows our smart glasses to capture eating behavior with a level of accuracy that competing tools cannot achieve [4, 6].

In real-world trials, our MealID algorithm correctly identified 96% of self-reported eating episodes [2], even when participants were walking, talking, or pausing mid-meal. This demonstrates strong ecological validity: the smart glasses remained accurate in both controlled laboratory conditions and everyday life [6, 7].

Beyond detection, OCOsense™ has demonstrated promising potential as an intervention tool. In a recent pilot study, haptic feedback delivered via the smart glasses effectively slowed chewing speed and decreased overall eating rate, illustrating how the platform can actively promote healthier eating habits in addition to monitoring them [5]. By doing so, the OCOsense™ facial intelligence system provides objective measures of eating patterns that influence weight regulation, satiety, and long-term metabolic outcomes.

What Makes OCOsense™ Different

  • Contactless comfort: Sleek, lightweight glasses that monitor chewing without adhesive electrodes or restrictive straps [1].

  • Temple sensors for precision: Positioned to capture the most distinct chewing signals from the jaw muscles, while also serving as reference points to reduce motion artifacts [1].

  • Energy-efficient design: OCOsense™ smart glasses analyze chewing in short windows and are optimized for all-day battery life, ensuring hassle-free, continuous wear [2].

  • Comprehensive chewing analysis: The patented sensors track chew counts, speed, and meal duration. With each meal, the system adapts to personalize insights, revealing eating structures tailored to the user [2].
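As a rough illustration of the kind of per-meal summary described in the last bullet, the sketch below derives chew count, meal duration, and average chewing rate from a list of detected chew timestamps. The field names are hypothetical and do not reflect the OCOsense™ API.

```python
# Illustrative only: summarize one meal from timestamps (in seconds)
# of individually detected chews.
from dataclasses import dataclass

@dataclass
class MealSummary:
    chew_count: int        # total chews detected in the meal
    duration_s: float      # time from first to last chew
    chews_per_minute: float  # average chewing rate

def summarize_meal(chew_times_s):
    """Compute chew count, meal duration, and average chewing rate."""
    if not chew_times_s:
        return MealSummary(0, 0.0, 0.0)
    duration = chew_times_s[-1] - chew_times_s[0]
    count = len(chew_times_s)
    rate = count / (duration / 60.0) if duration > 0 else 0.0
    return MealSummary(count, duration, rate)

# Example: 61 chews, one per second, over a 60-second stretch.
meal = summarize_meal([float(t) for t in range(61)])
print(meal.chew_count, meal.duration_s, meal.chews_per_minute)
```

Metrics like these are what allow eating rate to be compared across meals and days, which is the raw material for the personalized insights mentioned above.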

A Glimpse into Sense Lite

The newest OCOsense™-powered model, Sense Lite, is designed for everyday consumers and will be available this fall. Compact and discreet, it integrates OCO™ sensor–based chewing detection with a built-in camera and companion mobile app. Together, these features capture both how users eat and what they eat, providing richer insights into eating behavior. Importantly, images are handled securely, with no need to store raw photos – protecting user privacy at every step. All processed information is then synced to the cloud, enabling long-term behavioral insights.

This innovation is part of a broader movement toward digital phenotyping, where everyday devices capture behavior in real time. These moment-to-moment patterns matter since health is shaped not by a single meal or choice but by the accumulation of daily actions. By pairing chewing detection with contextual data, Sense Lite can reveal timing, frequency, pace, and meal composition – dimensions that traditional tracking often overlooks but which play a critical role in metabolism, weight regulation, and long-term health [6, 7].

In the context of nutrition, the impact could be transformative. By seamlessly combining chewing detection, meal timing, and contextual data, Sense Lite creates a holistic picture of eating behavior that goes far beyond calorie counting. Consider the health outcomes affected by diet that go beyond weight and gut health:

  • Cardiovascular risk (heart attacks & stroke) 
  • Brain function (Alzheimer’s risk, ADHD) 
  • Mental health (depression, bipolar disorder)
  • Hormone function (type 2 diabetes)
  • Inflammation (acne, eczema, arthritis)

This automated, unobtrusive approach supports healthier choices and long-term lifestyle change, without the burden of manual tracking [2].

Final Thought

Diet is the foundation of health, yet it is notoriously difficult to measure accurately. By combining precision sensing with AI, OCOsense™ transforms invisible behaviors into measurable insights – for researchers, healthcare professionals, and individuals alike. In contrast to traditional approaches that rely on guesswork or memory, it captures eating patterns objectively, one bite at a time [3].

References

[1] Archer, J. A., Mavridou, I., Stankoski, S., Broulidakis, M. J., Cleal, A., Walas, P., Fatoorechi, M., Gjoreski, H., & Nduka, C. (2023). OCOsense™ smart glasses for analyzing facial expressions using optomyographic sensors. IEEE Pervasive Computing, 22(3), 53–61. https://doi.org/10.1109/MPRV.2023.3276471

[2] Stankoski, S., Panchevski, F., Kiprijanovska, I., Gjoreski, M., Archer, J., Broulidakis, J., Mavridou, I., Hayes, B., Guerreiro, T., Nduka, C., & Gjoreski, H. (2024). Controlled and real-life investigation of optical tracking sensors in smart glasses for monitoring eating behaviour using deep learning: Cross-sectional study. JMIR mHealth and uHealth, 12, e59469. https://doi.org/10.2196/59469

[3] Kiprijanovska, I., Stankoski, S., Broulidakis, M. J., Archer, J., Fatoorechi, M., Gjoreski, M., Nduka, C., & Gjoreski, H. (2024). Smartglasses for behaviour monitoring: Recognizing facial expressions, daily activities, and eating habits. AISE Journal. https://doi.org/10.3233/AISE240031

[4] Stankoski, S., Jordan, M., Gjoreski, H., & Luštrek, M. (2021). Smartwatch-based eating detection: Data selection for machine learning from imbalanced data with imperfect labels. Sensors, 21(5), 1902. https://doi.org/10.3390/s21051902

[5] Baert, C., Sazdov, B., Stankoski, S., Gjoreski, H., Nduka, C., & Jordan, C. (2025). Pilot study to reduce chewing and eating rates using haptic feedback from the OCOsense™ glasses. Appetite, 213, 108056. https://doi.org/10.1016/j.appet.2025.108056

[6] Ohkuma, T., Hirakawa, Y., Fujii, H., Kojima, T., Fukuhara, M., Kitazono, T., & Kiyohara, Y. (2015). Association between eating rate and obesity: A systematic review and meta-analysis. International Journal of Obesity, 39(11), 1589–1596. https://doi.org/10.1038/ijo.2015.96

[7] Robinson, E., Almiron-Roig, E., Rutters, F., de Graaf, C., Forde, C. G., Tudur Smith, C., Nolan, S. J., & Jebb, S. A. (2014). A systematic review and meta-analysis examining the effect of eating rate on energy intake and hunger. American Journal of Clinical Nutrition, 100(1), 123–151. https://doi.org/10.3945/ajcn.113.081745

[8] Harvey, J., Krukowski, R., Priest, J., & West, D. (2019). Log Often, Lose More: Electronic Dietary Self-Monitoring for Weight Loss. Obesity, 27(3), 380–384. https://doi.org/10.1002/oby.22382

[9] Robinson, E., Bevelander, K. E., Field, M., & Jones, A. (2021). The influence of eating rate on energy intake and satiety: A systematic review and meta-analysis. Public Health Nutrition, 25(2), 229–241. https://doi.org/10.1017/S136898002100358X