
AI Approach based on Wireless Signals Could Help Detect Inner Emotions

A novel artificial intelligence (AI) approach based on wireless signals could help to reveal our inner emotions, according to new research from Queen Mary University of London.

The study, published in the journal PLOS ONE, demonstrates the use of radio waves to measure heart rate and breathing signals and to predict how someone is feeling, even in the absence of any other visual cues such as facial expressions.

Participants were initially asked to watch a video selected by the researchers for its ability to evoke one of four basic emotion types: anger, sadness, joy or pleasure. Whilst the individual was watching the video, the researchers emitted harmless radio signals, like those transmitted by any wireless system including radar or WiFi, towards them and measured the signals that bounced back. By analysing changes to these signals caused by slight body movements, the researchers were able to reveal 'hidden' information about an individual's heart and breathing rates.
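A minimal, hypothetical sketch of this kind of recovery step is shown below, assuming the reflected radio signal has already been demodulated into a one-dimensional chest-displacement (phase) time series sampled at a known rate; all names and values are illustrative and do not reflect the study's actual processing pipeline:

# Hypothetical sketch: recovering breathing and heart rates from the phase of a
# reflected radio signal. Assumes the signal has already been demodulated into a
# 1-D displacement/phase time series sampled at `fs` Hz; values are illustrative.
import numpy as np

def dominant_rate_per_min(signal, fs, low_hz, high_hz):
    """Return the strongest periodic component (per minute) within a physiological band."""
    sig = signal - np.mean(signal)                   # remove the DC offset
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    band = (freqs >= low_hz) & (freqs <= high_hz)    # keep only the band of interest
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic example: 0.25 Hz breathing (15 breaths/min) plus 1.2 Hz heartbeat (72 bpm)
fs = 50.0
t = np.arange(0, 60, 1.0 / fs)
phase = 1.0 * np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)
phase += 0.02 * np.random.randn(len(t))              # measurement noise

print("breathing:", dominant_rate_per_min(phase, fs, 0.1, 0.5), "breaths/min")
print("heart rate:", dominant_rate_per_min(phase, fs, 0.8, 2.0), "bpm")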

Previous research has used similar non-invasive or wireless methods of emotion detection; however, in those studies data analysis depended on classical machine learning approaches, in which an algorithm is used to identify and classify emotional states within the data. For this study, the scientists instead employed deep learning techniques, in which an artificial neural network learns its own features from time-dependent raw data, and showed that this approach could detect emotions more accurately than traditional machine learning methods.
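To illustrate the distinction, the sketch below shows a deep learning model that learns its own features from raw, time-dependent signals and maps them to one of four emotion classes. This is not the authors' published network; the architecture, channel count and sequence length are assumptions chosen only for illustration, using PyTorch:

# Minimal sketch (not the authors' network): a 1-D convolutional classifier that
# learns features directly from raw physiological time series and predicts one of
# four emotion classes. Shapes and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class EmotionNet(nn.Module):
    def __init__(self, n_channels=2, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),         # collapse the time axis to one value per filter
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                     # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

# One training step on random stand-in data: a batch of 8 recordings, two channels
# (e.g. heart and breathing waveforms), 3000 time samples each.
model = EmotionNet()
x = torch.randn(8, 2, 3000)
y = torch.randint(0, 4, (8,))
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()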

Achintha Avin Ihalage, a PhD student at Queen Mary, said: "Deep learning allows us to assess data in a similar way to how a human brain would work, looking at different layers of information and making connections between them. Most of the published literature that uses machine learning measures emotions in a subject-dependent way, recording a signal from a specific individual and using this to predict their emotion at a later stage.

"With deep learning we've shown we can accurately measure emotions in a subject-independent way, where we can look at a whole collection of signals from different individuals and learn from this data and use it to predict the emotion of people outside of our training database."

Traditionally, emotion detection has relied on the assessment of visible signals such as facial expressions, speech, body gestures or eye movements. However, these methods can be unreliable as they do not effectively capture an individual's internal emotions, and researchers are increasingly looking towards 'invisible' signals, such as electrocardiogram (ECG) signals, to understand emotions.

ECG signals detect electrical activity in the heart, providing a link between the nervous system and heart rhythm. To date, the measurement of these signals has largely been performed using sensors placed on the body, but recently researchers have been looking towards non-invasive approaches that use radio waves to detect them.

Methods to detect human emotions are often used by researchers involved in psychological or neuroscientific studies, but it is thought that these approaches could also have wider implications for the management of health and wellbeing.

In the future, the research team plan to work with healthcare professionals and social scientists on public acceptance and ethical concerns around the use of this technology.

Ahsan Noor Khan, a PhD student at Queen Mary and first author of the study, said: "Being able to detect emotions using wireless systems is a topic of increasing interest for researchers as it offers an alternative to bulky sensors and could be directly applicable in future 'smart' home and building environments. In this study, we've built on existing work using radio waves to detect emotions and show that the use of deep learning techniques can improve the accuracy of our results."

"We're now looking to investigate how we could use low-cost existing systems, such as WiFi routers, to detect emotions of a large number of people gathered, for instance in an office or work environment. This type of approach would enable us to classify emotions of people on individual basis while performing routine activities. Moreover, we aim to improve the accuracy of emotion detection in a work environment using advanced deep learning techniques."

Professor Yang Hao, the project lead, added: "This research opens up many opportunities for practical applications, especially in areas such as human/robot interaction and healthcare and emotional wellbeing, which has become increasingly important during the current Covid-19 pandemic."

Source: https://www.qmul.ac.uk/
