Created with professional emotion scientists

Wireless, contact-free, accurate facial sensing for breakthrough insights

The world's first eyewear that combines wireless non-contact sensors with a machine learning platform, enabling you to effortlessly collect and analyse facial data and activities.

With our revolutionary technology, you can explore the deepest layers of human expression and emotion, helping you make groundbreaking discoveries and achieve unprecedented insights.

Features and Benefits

Our system provides a simple, all-in-one wireless solution to track, timestamp and annotate user responses to content.
Proprietary OCO™ Optomyography (OMG) sensors detect high-resolution facial activations at key muscle locations.
Smart 9-axis inertial measurement unit (IMU) and an altimeter for behavioural understanding
Outward-facing camera to synchronise context to responses
Real-time streaming of data to the mobile app
Eating behaviour
Engagement & Valence
Facial expressivity
Activity (walking, sitting, reading etc.)
Regulatory compliance: CE mark, RoHS, WEEE.
Save data on the go. Build and run studies using our data collection mobile app.
View data in real time via the mobile app and observe patterns in the wearer's behaviour and activity.
Tag important events that occur during the experience either in real-time or offline.
Collect time-synchronised data, both raw and interpreted, as well as video and annotations.
Use data to develop new algorithms and behavioural insights.
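As an illustration of how time-synchronised exports like these might be processed downstream, the sketch below aligns raw sensor samples with annotation tags by timestamp. The column names, channel name, and table layout are hypothetical assumptions for the example, not the actual OCOsense export format.

```python
# Hypothetical sketch: align raw sensor samples with annotation tags by
# timestamp. Column names and layout are assumptions, not Emteq's format.
import pandas as pd

# Raw sensor stream: timestamp (ms) plus one illustrative activation channel.
sensors = pd.DataFrame({
    "t_ms": [0, 20, 40, 60, 80, 100],
    "zygomaticus": [0.01, 0.02, 0.35, 0.40, 0.05, 0.02],
})

# Annotations tagged during the session, either live or offline.
tags = pd.DataFrame({
    "t_ms": [35, 95],
    "event": ["stimulus_on", "stimulus_off"],
})

# merge_asof attaches to each sample the most recent tag at or before it,
# so every raw reading carries the annotation context it occurred under.
merged = pd.merge_asof(sensors, tags, on="t_ms", direction="backward")
print(merged[["t_ms", "zygomaticus", "event"]])
```

The same join generalises to video frame indices or interpreted (per-expression) channels, since everything shares one clock.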

The Power of OCO™ Sensor Technology

At the heart of the device lies our patented OCO™ sensors, which provide precise mapping of facial skin movement in three dimensions. This breakthrough technology allows for a myriad of powerful applications:
Research and Academia
Gather high-resolution insights effortlessly, leading to groundbreaking findings in psychology, neuroscience, and human-computer interaction.
Healthcare and Wellness
Assess patients' emotional states accurately, monitor progress in facial rehabilitation, and design personalized treatment plans.
Content Creation and Marketing
Measure and understand how content makes users feel, their reactions and preferences, and develop novel human-computer interactions.
Gaming and Extended Realities
Control characters and environments naturally using facial expressions and gestures, enhancing immersion and interactivity.
Corporate Training and Soft Skills Development
Measure and improve non-verbal communication, emotional intelligence, and interpersonal skills for better professional performance.
Human-Computer Interaction
Subtle facial gestures can be configured to enable hands-free interaction with interfaces or devices.
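One simple way such a gesture mapping could work, sketched here purely for illustration, is rising-edge threshold detection on a single facial-activation channel. The channel, threshold, and signal values are hypothetical assumptions, not Emteq's API or calibration values.

```python
# Hypothetical sketch: turn a facial-activation signal into discrete
# hands-free gesture events via rising-edge threshold detection.
# Channel semantics and the threshold value are illustrative assumptions.
from typing import Iterable, List

def detect_gestures(samples: Iterable[float], threshold: float = 0.5) -> List[int]:
    """Return sample indices where the signal rises through the threshold."""
    events = []
    prev = 0.0
    for i, level in enumerate(samples):
        if prev < threshold <= level:  # rising edge marks a gesture onset
            events.append(i)
        prev = level
    return events

# e.g. an eyebrow-raise channel sampled over time (made-up values)
signal = [0.1, 0.2, 0.7, 0.8, 0.3, 0.6, 0.1]
onsets = detect_gestures(signal)
print(onsets)  # onsets at indices 2 and 5
```

Each detected onset could then be bound to an interface action (select, dismiss, scroll), keeping the interaction entirely hands-free.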
Dr Lisa Quadt
Research fellow in Psychiatry
I am interested in the interaction between body and mind. For researchers like me, integrated facial biofeedback is a dream come true.
Prof. Hugo Critchley
Consultant Psychiatrist
I’ve had the pleasure of working with Emteq’s R&D team on a number of projects and am impressed with their commitment to research. They are dedicated to developing evidence-based technologies for the benefit of users, clinicians, and researchers.
Prof. Tiago Guerreiro
Professor at FCUL. HCI Researcher and Vice-Director at LASIGE
A great device with so many possibilities for clinical research - usable and powerful.
Mitja Lustrek
Senior Researcher and Head of the Ambient Intelligence Group at the Jozef Stefan Institute in Slovenia
A novel approach to sensing facial expressions and a nice product overall, would be happy to give it a try in my own research.

Join our newsletter

Stay up to date with our tech improvements and get detailed brochures of our products.
Thank you for joining!
Oops! Something went wrong while submitting the form.

Available for researchers

Download our Brochure for more information
OCO™ and OCOsense™ are trademarks of Emteq Limited

This technology is protected by multiple granted patents, including but not limited to:
GB2561537, GB2604076, US11,003,899, US11,538,279