To fulfill its function in social interaction, emotion recognition has to be fast and accurate. One way to ensure speedy recognition appears to be the simultaneous, multisensory perception of emotions. Usually, we perceive others’ emotions through their facial expressions, speech, and body posture. In most cases, the information from these different modalities is congruent. To form a coherent percept of someone’s emotional state, these complementary sources of information must be integrated.
Overall, the multimodal integration of emotional information appears to be an important mechanism for emotion recognition, yet relatively little research has investigated the underlying neural mechanisms, especially with respect to the time course of processing. An investigation of this process can thus add an important piece to our understanding of how the brain processes emotions.
In the course of my dissertation project, I will investigate the integration of emotional body language with emotional information from speech (mainly interjections) and facial expressions. As the overarching aim of the project is to learn more about emotion perception under natural circumstances, I use complex, moving stimuli to approximate ecologically valid situations as closely as possible. I plan to investigate the integration process at the neurophysiological level, using mainly event-related potentials (ERPs) but also behavioral data, in order to gain insight into the time course underlying the multimodal integration of emotional information.
Prof. Dr. Arthur Jacobs
Prof. Dr. Sonja Kotz