
OCOsense

Unlock the Power of Facial and Movement Analytics with OCOsense
See Options
Groundbreaking eyewear that revolutionizes the way we understand human behaviour. By seamlessly integrating wireless sensors and advanced machine learning, OCOsense empowers you to uncover profound insights in real-world or augmented reality environments.

Unparalleled
Behavioural Analytics

Gain superior data quality and meaningful insights by mapping emotional responses to activities and contexts.

Wearable Expression
and Emotion Sensing

OCO™ sensors track facial muscle activation, providing precise three-dimensional facial movement mapping.

Break Free from Limitations and Discover the Power of Facial Data

Limited access to accurate, real-time data, difficulty capturing subtle facial movements, and challenges in assessing emotional responses have hindered progress in these fields, but no longer.

Introducing OCOsense:
Unlock the secrets of human behaviour effortlessly

The world's first eyewear that combines wireless non-contact sensors with a machine learning platform, enabling you to effortlessly collect and analyse facial data and activities.

With our revolutionary technology, you can explore the deepest layers of human expression and emotion, helping you make groundbreaking discoveries and achieve unprecedented insights.

Features and Benefits

Our system provides a simple, all-in-one wireless solution to track, timestamp and annotate user responses to content.
Proprietary OCO™ Optomyography (OMG) sensors detect high-resolution facial activations at key muscle locations.
Smart 9-axis inertial measurement unit (IMU) and an altimeter for behavioural understanding.
Detachable external camera to synchronise context to responses (coming soon).
Real-time streaming of data to the mobile app:
Attention
Engagement & Valence
Facial expressivity
Activity (walking, sitting, reading, etc.)
Regulatory compliance: CE mark, RoHS, WEEE.
Collect
Save data on the go. Build and run studies using our data collection mobile app.
Monitor
View data in real-time via the mobile app and observe patterns in the wearer's behaviour and activity.
Annotate
Tag important events that occur during the experience either in real-time or offline.
Analyse
Collect time-synchronised data, both raw and interpreted, as well as video and annotations.
Develop
Use data to develop new algorithms and behavioural insights.
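The collect/annotate/analyse workflow above can be sketched as a minimal data model. Note that the field names, channel counts, and the `tag_event` helper below are illustrative assumptions for this sketch, not the actual OCOsense data schema or API.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical record layout: names and signal counts are illustrative
# assumptions, not Emteq's actual data format.
@dataclass
class Sample:
    t_ms: int                  # timestamp in milliseconds since session start
    omg: List[float]           # raw OMG sensor channels (facial skin movement)
    imu: List[float]           # 9-axis IMU reading (accel, gyro, magnetometer)
    expression: str            # interpreted label, e.g. "neutral", "smile"
    annotations: List[str] = field(default_factory=list)

def tag_event(samples: List[Sample], t_ms: int, label: str, window_ms: int = 100) -> None:
    """Attach an annotation to every sample within window_ms of the event time,
    mirroring the real-time/offline tagging step described above."""
    for s in samples:
        if abs(s.t_ms - t_ms) <= window_ms:
            s.annotations.append(label)

# Simulate a 500 ms recording sampled every 50 ms, then tag a stimulus onset.
samples = [Sample(t, [0.0] * 7, [0.0] * 9, "neutral") for t in range(0, 500, 50)]
tag_event(samples, 200, "stimulus_onset")
tagged = [s.t_ms for s in samples if s.annotations]
print(tagged)  # samples from 100 to 300 ms fall inside the ±100 ms window
```

Keeping raw channels, interpreted labels, and annotations on a shared timestamp is what makes the later time-synchronised analysis step straightforward.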

The Power of OCO™ Sensor Technology

At the heart of the device lies our patented OCO™ sensors, which provide precise mapping of facial skin movement in three dimensions. This breakthrough technology allows for a myriad of powerful applications:
Research and Academia
Gather high-resolution insights effortlessly, leading to groundbreaking findings in psychology, neuroscience, and human-computer interaction.
Healthcare and Wellness
Assess patients' emotional states accurately, monitor progress in facial rehabilitation, and design personalized treatment plans.
Content Creation and Marketing
Measure and understand how content makes users feel, their reactions and preferences, and develop novel human-computer interactions.
Gaming and Extended Realities
Control characters and environments naturally using facial expressions and gestures, enhancing immersion and interactivity.
Corporate Training and Soft Skills Development
Measure and improve non-verbal communication, emotional intelligence, and interpersonal skills for better professional performance.

Join our newsletter

Stay up to date with exclusive tech updates, deals, webinars, and get detailed brochures from all our products. You can unsubscribe anytime.

Join the Researcher Program

Apply Now

Get yours now!

Download our Brochure for more information
Download
OCO™ and OCOsense™ are trademarks of Emteq Limited

This technology is protected by multiple granted patents, including but not limited to:
GB2561537, GB2604076, US11,003,899, US11,538,279