Artificial intelligence can detect our inner emotions via ‘invisible signals’

LONDON — Can’t ever get your partner to tell you how they really feel? There may be an app for that…one day. Scientists can now predict how someone is feeling by using radio waves to measure heart rate and breathing. These wireless signals can detect a person’s feelings even in the absence of visual cues such as facial expressions. This AI technology could be used to help reveal our inner emotions.

In the past, emotion detection has relied on assessing visible signals such as facial expressions, speech, body gestures or eye movements. However, these methods can be unreliable, as they do not capture an individual’s internal emotions effectively. Researchers are increasingly looking towards “invisible” signals, such as electrocardiogram (ECG) signals, to understand emotions.

ECG signals detect electrical activity in the heart, providing a link between the nervous system and heart rhythm. To date, the measurement of these signals has largely been performed using sensors that are placed on the body. Recently, non-invasive approaches that use radio waves have been used to detect them.

Researchers hope to investigate how they can use existing systems, like WiFi routers, to reveal the emotions of large groups of people, such as office workers. They also plan to work with healthcare professionals and social scientists on public acceptance and ethical concerns around the use of this new technology.

“This type of approach would enable us to classify emotions of people on an individual basis while performing routine activities. Moreover, we aim to improve the accuracy of emotion detection in a work environment using advanced deep learning techniques,” says lead study author Ahsan Khan, a PhD student at Queen Mary University of London, in a statement. “Being able to detect emotions using wireless systems is a topic of increasing interest for researchers. It offers an alternative to bulky sensors and could be directly applicable in future ‘smart’ home and building environments.”

“In this study, we’ve built on existing work using radio waves to detect emotions and show that the use of deep learning techniques can improve the accuracy of our results,” he adds.

How machine learning can sense your emotions

For the experiment, participants were asked to watch a video chosen for its ability to evoke one of four basic emotions: anger, sadness, joy or pleasure. Meanwhile, researchers emitted harmless radio signals, like those transmitted from WiFi, towards participants and measured the signals that bounced back off them. The team was then able to reveal ‘hidden’ information about a person’s heart and breathing rates by analyzing changes to these signals caused by slight body movements.
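
To make that signal-processing step concrete, here is a minimal sketch of how breathing and heart rates can be pulled out of a reflected-signal trace: band-pass filter the trace into the breathing band (roughly 0.1–0.5 Hz) and the heartbeat band (roughly 0.8–2 Hz), then take the strongest spectral peak in each band. This illustrates the general technique only, not the study’s actual pipeline; the sampling rate, the `dominant_rate_bpm` helper, and the synthetic test signal are all assumptions for the sake of the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def dominant_rate_bpm(signal, fs, low_hz, high_hz):
    """Band-pass the reflected-signal trace and return the strongest
    periodicity in the band, in beats/breaths per minute."""
    b, a = butter(4, [low_hz, high_hz], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, signal)
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak_hz

fs = 50.0  # assumed sampling rate of the demodulated trace (Hz)
t = np.arange(0, 60, 1 / fs)
# Synthetic stand-in: 0.25 Hz breathing + 1.2 Hz heartbeat + noise.
trace = (np.sin(2 * np.pi * 0.25 * t)
         + 0.1 * np.sin(2 * np.pi * 1.2 * t)
         + 0.05 * np.random.randn(len(t)))

print("breathing:", dominant_rate_bpm(trace, fs, 0.1, 0.5), "per minute")
print("heart rate:", dominant_rate_bpm(trace, fs, 0.8, 2.0), "per minute")
```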

“This research opens up many opportunities for practical applications, especially in areas such as human-robot interaction, healthcare and emotional wellbeing, which has become increasingly important during the current Covid-19 pandemic,” notes project leader Yang Hao, a professor of antennas and electromagnetics at the university.

Methods for detecting human emotions are most often used by researchers in psychology or neuroscience. However, it is thought that these approaches could also have wider implications for the management of health and wellbeing. Previous research has used similar non-invasive or wireless methods of emotion detection. However, the analysis depended on classical machine learning approaches, where an algorithm working from hand-picked features is used to identify emotional states within the data.

For this study, scientists used deep learning techniques where an artificial neural network learns its own features from time-dependent raw data. Results showed that this approach could detect emotions more accurately than traditional machine learning methods.
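
As a rough illustration of what such a network can look like, the sketch below stacks 1D convolutions and an LSTM so that features are learned directly from the raw time series rather than being hand-crafted. The layer sizes, the input length (60 seconds at an assumed 50 Hz), and the choice of Keras are illustrative assumptions, not the architecture reported in the paper.

```python
import tensorflow as tf

NUM_CLASSES = 4  # anger, sadness, joy, pleasure

# A small 1D convolutional + recurrent network that learns its own
# features from a raw physiological trace. Sizes are illustrative.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3000, 1)),  # assumed: 60 s at 50 Hz
    tf.keras.layers.Conv1D(32, kernel_size=7, activation="relu"),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.LSTM(64),  # captures temporal order in the signal
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# Train with model.fit(x_train, y_train, ...) where x_train has
# shape (num_recordings, 3000, 1) and y_train holds emotion labels.
```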

“Most of the published literature that uses machine learning measures emotions in a subject-dependent way, recording a signal from a specific individual and using this to predict their emotion at a later stage. However, deep learning allows us to assess data in a similar way to how a human brain would work, looking at different layers of information and making connections between them,” says Achintha Ihalage, also a PhD student at Queen Mary.

“With deep learning we’ve shown we can accurately measure emotions in a subject-independent way, where we can look at a whole collection of signals from different individuals and learn from this data and use it to predict the emotion of people outside of our training database,” adds Ihalage.
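
The subject-independent protocol Ihalage describes can be illustrated with a leave-one-subject-out split: the model is trained on every participant except one, then tested on the held-out person, so it never sees the test subject during training. The sketch below uses toy data and a classical classifier purely to show the evaluation split; the study itself used a deep network, and all names and sizes here are made up for illustration.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Toy data: 40 recordings from 8 participants, 20 features each.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 20))          # one feature vector per recording
y = rng.integers(0, 4, size=40)        # emotion labels (4 classes)
subjects = np.repeat(np.arange(8), 5)  # which participant each row came from

scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    # Train on 7 participants, test on the one held out.
    clf = RandomForestClassifier(random_state=0).fit(X[train_idx], y[train_idx])
    scores.append(accuracy_score(y[test_idx], clf.predict(X[test_idx])))

print(f"subject-independent accuracy: {np.mean(scores):.2f}")
```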

The findings are published in the journal PLOS ONE.

SWNS writer Laura Sharman contributed to this report.