September 21, 2020

Why VR provides the perfect environment to study human behaviour and emotion.

The fifth episode of the Emotion Lab takes us on an exploration of the two-dimensional model of human emotions, and takes a deep dive into the largest ever study in a VR environment, hosted at the London Science Museum. Graeme and James are joined by Ifigeneia (Ifi) Mavridou, an Affect Scientist with expertise in human-computer interaction and in evoking emotional states in VR environments. It was her background in Fine Arts that led her to study how VR can be used in artistic interactions. She is now looking at how emotional stimulation relates to presence and addiction, factors that can assist in the creation of immersive experiences and the development of optimal VR content design.

What is affect? How do we measure it? The model of human emotions. 

Affect relates to human emotion. Having worked in the field of affect for a number of years, Ifi has identified biometrics and physiological sensors as the best means of measuring emotion. She explained that human emotions are far more complex than the labels we’re familiar with, such as happy, sad or nervous. Instead, they’re conceptualised within the two-dimensional model of emotional valence and arousal.

What is valence and what is arousal?

Valence is like polarity, sitting on a spectrum from negative through to positive. By contrast, arousal, or intensity, is measured on a scale from low to high. Someone with low arousal might present as sleepy, whereas someone with high arousal may present as excited. Remember that last scary movie you watched that made your palms sweaty and your heart beat fast? That’s your high-arousal state!
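To make the two axes concrete, here is a minimal, purely illustrative Python sketch; the function name and quadrant labels are our own shorthand, not terminology from the episode:

```python
# Illustrative only: rough quadrant labels for the two-dimensional affect model.
def describe_affect(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair, each in -1.0..1.0, to a rough everyday label."""
    if arousal >= 0:
        return "excited" if valence >= 0 else "tense / fearful"
    return "calm / content" if valence >= 0 else "sad / sleepy"

# The sweaty-palms, racing-heart scary movie: negative valence, high arousal.
print(describe_affect(valence=-0.7, arousal=0.9))   # tense / fearful
# Drifting off on the sofa: mildly positive valence, low arousal.
print(describe_affect(valence=0.3, arousal=-0.8))   # calm / content
```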

So what metrics and measurables apply to valence versus arousal? Arousal is usually measured with physiological sensors such as PPG, which capture signals like heart rate, skin changes and sweat production, to name a few. Other measurements, like EEG, EMG and movement, are usually correlated with valence. In practice, if a person leans or walks towards an object, the experience is assumed to be positive; when they pull away, it’s assumed to be negative – think about how you would lean towards a puppy but jump away from a spider. That said, there are no clear cut-offs because this area is complex – heart rate, or some aspects of it, can be correlated with valence too – there’s still much to be discovered and understood in this field!
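The approach/avoidance cue described above is simple enough to sketch in code. The snippet below is a hypothetical illustration of that heuristic (the positions, object and function are invented for the example), reading net movement towards an object as positive valence and movement away as negative:

```python
import numpy as np

def approach_score(head_positions: np.ndarray, object_position: np.ndarray) -> float:
    """Positive = ended up closer to the object (approach), negative = moved away."""
    distances = np.linalg.norm(head_positions - object_position, axis=1)
    return float(distances[0] - distances[-1])

object_pos = np.array([0.0, 0.0, 2.0])                 # a virtual puppy... or spider
walk_towards = np.array([[0, 0, 0.0], [0, 0, 0.5], [0, 0, 1.0]])
jump_away    = np.array([[0, 0, 1.0], [0, 0, 0.5], [0, 0, 0.0]])

print(approach_score(walk_towards, object_pos))   #  1.0 -> assumed positive valence
print(approach_score(jump_away, object_pos))      # -1.0 -> assumed negative valence
```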

London Science Museum Study

“Who Am I?” was a study of over 780 volunteers across more than 3,000 recording sessions, hosted as an exhibition at the London Science Museum. The study had two main goals:

  • To find out if it is possible to create VR experiences that can reliably elicit valence and arousal
  • To identify whether changes in valence and arousal can be detected using behavioural and physiological analysis via the VR technology

To answer these questions, the scientific research team created four different stations in the museum: three active ones and one passive station where visitors could learn about VR and emotions. In the three active scenarios, participants wore emteq headsets and could explore and interact with the environment, sounds and objects. Each of the three environments was designed and controlled to deliver a different stimulus: negative, neutral and positive.

With such a huge response, the research far outstripped expectations, creating in excess of 31,000 data streams that are now available for use in future algorithms. As well as providing a unique opportunity for the museum and its visitors to collaborate with researchers, the results showed that ML models could accurately predict valence and arousal from physiological data.
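The episode doesn’t go into the modelling details, so the sketch below is only a schematic of how such a prediction could look: a supervised regressor trained on per-window physiological features against valence and arousal labels. The features, targets and model choice are all placeholder assumptions for illustration, not the study’s actual pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder features: one row per recording window, columns standing in for
# signals such as mean heart rate, heart-rate variability, EMG amplitude, movement.
X = rng.normal(size=(3000, 6))
# Placeholder targets: a valence and an arousal value for each window.
y = rng.uniform(-1, 1, size=(3000, 2))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
# On random placeholder data the score is meaningless; real inputs would come
# from the headset's sensors and participants' reported emotion ratings.
print("R^2 on held-out windows:", model.score(X_test, y_test))
```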

Data collection – progress and limitations

According to Ifi, this study and others like it have now shown that VR can create the perfect environment to study human behaviour and emotion. As an Affect Researcher, Ifi knows first-hand just how much the progress in wearables, sensors and VR technology has the potential to advance the measurement of emotion. James, Graeme and Ifi also explored the concept of wet vs dry sensors, and how the emteq labs kit allows scientists to create a natural and engaging environment for participants while remaining cost-effective and convenient for study designers. Sharing his perspective, Graeme explains how emteq labs used feedback from the London Science Museum study to further improve their product by looking at the challenges of current technology, such as limited field of view. He predicts that as the technology advances, it will only become more convenient to use and will enable more accurate measurement.

How did the public react?

The research exhibition attracted the attention of many people – from those who were already passionate about VR to those who had never tried it before. “It was one of the greatest experiences I’ve ever had in terms of interacting with the general public”, Ifi says. Participants had fun and learned something new, whilst simultaneously contributing to society by advancing emotion research.

Future

As in every podcast so far, we discussed the applications and the future of research into our emotions, and with each new guest it becomes all the more clear that huge possibilities lie ahead. For instance, by developing standardised methodologies for emotion research, more and more industries, such as gaming, entertainment, medicine, education and beyond, will be able to deploy the principles and technology more effectively than ever before. VR will become the ultimate lab environment for research in naturalistic settings. New methods, scenarios, some even remote… we really are only just at the beginning, so who knows where we’ll end up?!

You may also like

Automated VR Therapy – VR and Mental Health

In this episode of Emotion Lab, Graeme is joined by William Hamilton, the Founder and CTO of Mimerse – a Stockholm-based VR company which builds evidence-based treatments for mental health conditions. During the podcast, William and Graeme discuss how VR is already being used to treat people suffering from mental health conditions and explore the potential for automated virtual reality exposure therapies (VRETs), a new clinically validated approach for treating phobias.

November 30, 2020


VR Training for Front Line Journalists

This week Graeme is joined by Aela Callan – a documentary filmmaker, journalist and co-founder of Head Set. Aela and co-founder Kate Parkinson are no strangers to hostile environments, with more than 30 years of experience as foreign correspondents. Now Head Set is using the power of immersive technologies to create VR scenarios that help prepare journalists to work on the frontline. Listen in to hear the story of how and why Head Set was founded, and the role of VR in training journalists and many other professionals for work-related scenarios.


November 16, 2020


White Paper: Improving Emotional And Psychological Well-Being in Distributed Digital Times

This white paper is the latest in a series, providing insight for academic, clinical and market researchers, content and training specialists and all those who have an interest in the potential for Virtual Reality and emotion analytics. Written by the team at emteq labs, the white paper offers valuable insights into the capabilities and potential for the use of biometric feedback gathered within virtual reality, for both healthcare therapies and training.

November 12, 2020

