Ambition: understanding emotional response
In May 2019, Emteq partnered with the University of Bournemouth to deliver a first-of-its-kind public display project at the Science Museum in London. The project was part of a two-month exhibition called “Who Am I”, aimed at inquisitive self-discovery through intriguing objects, provocative artworks and hands-on exhibits.
Creating a tool for understanding how people respond to emotional stimuli.
emteqPRO masks monitored physiological emotional responses in controlled scenarios in a study with 730 participants.
Researcher Ifigenia Mavridou installed four standing Virtual Reality (VR) experiences and one room-scale experience, with content ranging from negative to positive scenarios.
To provide context for the data we were collecting, we designed the experience in two stages. In the first stage, participants provided anonymised information about their personality traits and their reactions to emotions through a questionnaire. In the second stage, they were put through a VR experience to see how they reacted to different simulations.
We wanted not only to collect the data that characterised their experience in VR – facial expressions, movement, posture, heart rate – but also to correlate it with their personality, emotional and expressivity traits.
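The correlation step described above can be sketched in a few lines. This is purely illustrative: the feature names, trait scores, and toy values below are assumptions for the sake of example, not Emteq's actual data or pipeline.

```python
# Hypothetical sketch: correlating one physiological feature (e.g. mean
# heart rate during a VR scene) with one questionnaire trait score across
# participants, using Pearson correlation. Toy data only.
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# One value per participant: a stage-one trait score and a feature
# extracted from the stage-two VR session (both invented for illustration).
trait_scores = [2.0, 3.5, 4.0, 1.5, 5.0]
heart_rates = [68.0, 75.0, 80.0, 64.0, 88.0]

r = pearson_r(trait_scores, heart_rates)  # close to +1 for this toy data
```

In practice each VR scene would yield many such features per participant, and the same correlation (or a richer model) would be computed feature by feature against each trait.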
One of the largest multi-modal datasets of emotional, non-verbal signals from a demographically rich audience, and an affect algorithm based on arousal and valence.
Algorithms built with this dataset form the basis of bespoke algorithms for new clients. This growing database supports our vision of understanding human behaviour.
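The arousal/valence framing mentioned above comes from the circumplex model of affect, where emotion is placed on a two-dimensional plane. A minimal sketch of that idea, assuming simple sign-based quadrants and illustrative labels (not Emteq's actual algorithm):

```python
# Illustrative mapping from a (valence, arousal) estimate to an affect
# quadrant on the circumplex plane. Thresholds and labels are assumptions.

def affect_quadrant(valence: float, arousal: float) -> str:
    """Classify a point on the valence/arousal plane, both in [-1, 1]."""
    if valence >= 0 and arousal >= 0:
        return "excited/happy"    # positive valence, high arousal
    if valence < 0 and arousal >= 0:
        return "stressed/angry"   # negative valence, high arousal
    if valence < 0:
        return "sad/bored"        # negative valence, low arousal
    return "calm/relaxed"         # positive valence, low arousal
```

A real affect algorithm would estimate continuous valence and arousal values from the sensor signals; the quadrant labels are just one coarse way of reading that plane.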
Development work: Emteq Labs
Integration with technology: Emteq Labs
Study protocol: University of Bournemouth and Emteq Labs
Data Collection: Emteq Labs
Data analysis and report: Emteq Labs
New algorithm development and smart analytics: Emteq Labs
“Conventional systems for measuring how a person responds to an emotional stimulus, such as monitoring heart rate and facial expressions, are difficult to use in real-world, unconstrained environments. One would need a camera to record the subject’s point of view, as well as multiple cameras to ensure an uninterrupted view of the subject’s face. A device to monitor the user’s heart rate would also be needed, with all data channels synchronised for later analysis. This is a complex setup, and rigging up each participant with the equipment would take a long time.
We therefore partnered with Emteq, who have created an all-in-one solution that incorporates physiological measurements via specialised sensors in the VR device. This method avoids trailing cables and the inconvenience of attaching chest straps or cables to the subject, and is much faster. As researchers, we have complete control over what the subject sees and hears, together with automatic tagging of their behaviours and interactions.”
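The synchronisation problem the quote describes, i.e. aligning several data channels recorded at different rates onto one timeline, can be sketched with nearest-timestamp matching. Channel names, sample rates, and values below are illustrative assumptions only.

```python
# Minimal sketch: align samples from two channels (e.g. heart rate and a
# facial-sensor signal) onto event timestamps logged by the VR application,
# by picking the nearest sample in time. All data here is invented.
from bisect import bisect_left

def nearest_sample(timestamps, values, t):
    """Return the value whose (sorted) timestamp is closest to t."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t < t - before else values[i - 1]

# Two channels sampled at different rates:
hr_t, hr_v = [0.0, 1.0, 2.0], [70, 72, 75]                  # 1 Hz heart rate
fs_t, fs_v = [0.0, 0.5, 1.0, 1.5, 2.0], [1, 3, 2, 4, 5]     # 2 Hz facial signal

# Resample both onto the timestamps of tagged VR events:
events = [0.4, 1.6]
aligned = [(nearest_sample(hr_t, hr_v, t), nearest_sample(fs_t, fs_v, t))
           for t in events]
# aligned -> [(70, 3), (75, 4)]
```

With all sensors built into one device sharing a clock, this alignment becomes trivial, which is the practical advantage the researchers describe.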