App recognizes feelings


Emotions play an important role in how people interact with technology. Yet even intelligent systems are not able to respond appropriately to human emotions.
In the EMOIO project, Fraunhofer IAO and the Institute of Human Factors and Technology Management IAT at the University of Stuttgart are researching how neuroscientific methods can be used to derive and interpret emotions from users' brain waves.
The identified emotions will then be made available to a technical system via a brain-computer interface (BCI), allowing that system to adapt its design and behavior to the individual needs of the user.
Together with its project partners, Fraunhofer IAO presented a first live demo of the BCI it has developed. A test subject was shown images with emotional content, such as baby animals or war scenes.

The algorithm searches the recorded brain signals for patterns that previous studies have identified as characteristic of positive and negative emotions.
In this way, the algorithm classifies the subject's reaction to an image as positive or negative within a few seconds. The classification result can then be forwarded to any technical system.
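The article does not specify the features or classifier used, but the idea of matching a recorded signal against previously identified emotion patterns can be sketched roughly as follows. The band-power feature, the alpha-band choice, and the reference vectors are illustrative assumptions, not the project's actual method:

```python
import numpy as np

def band_power(epoch, fs, low, high):
    """Mean spectral power of each EEG channel in a frequency band.

    epoch: array of shape (channels, samples), fs: sampling rate in Hz.
    """
    freqs = np.fft.rfftfreq(epoch.shape[1], d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(epoch, axis=1)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[:, mask].mean(axis=1)

def classify_emotion(epoch, fs, pos_ref, neg_ref):
    """Label an epoch 'positive' or 'negative' by its nearest reference pattern.

    pos_ref / neg_ref are hypothetical per-channel feature vectors derived
    from earlier studies of emotional responses.
    """
    features = band_power(epoch, fs, 8.0, 13.0)  # alpha band, illustrative choice
    d_pos = np.linalg.norm(features - pos_ref)
    d_neg = np.linalg.norm(features - neg_ref)
    return "positive" if d_pos < d_neg else "negative"
```

Because each epoch covers only a second or two of signal and the comparison is a simple distance computation, a result of this kind can be produced within seconds, as described above.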
Which emotional state the algorithm has determined is relevant not only for technical systems but also for the users themselves. Fraunhofer IAO therefore developed a mobile app that displays the classified emotion to the user live.
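Forwarding each classification result both to technical systems and to a live display app can be organized as a simple publish/subscribe relay. This is a minimal sketch of that architecture; the class and method names are hypothetical, not part of the project:

```python
from typing import Callable

class EmotionFeed:
    """Hypothetical relay: forwards each classification result to any
    number of subscribers, e.g. an adaptive system and a mobile display app."""

    def __init__(self) -> None:
        self._subscribers: list[Callable[[str], None]] = []

    def subscribe(self, callback: Callable[[str], None]) -> None:
        """Register a consumer that receives each emotion label."""
        self._subscribers.append(callback)

    def publish(self, label: str) -> None:
        """Push one classification result ('positive' / 'negative') to all consumers."""
        for callback in self._subscribers:
            callback(label)
```

A display app would subscribe once and then simply render each label as it arrives, which is what makes the live visualization straightforward once the BCI emits results.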