There is abundant evidence that perceivers can accurately infer some emotions from prototypical nonverbal expressions. In our work, we try to understand the factors that influence how and when we recognize emotions and produce emotional expressions.
Examples of this research include cultural and social factors, such as group belonging [Pum Kommattam; Xia Fang; see also culture] and covered faces [Agneta Fischer; Mariska Kret], and the role of temperament in infants’ emotion perception [Evin Aktar].
We also study emotional responses in psychopathological populations such as socially anxious individuals [Corine Dijk].
Some of our studies examine specific learning mechanisms, such as the influence of auditory learning on the development of emotional vocalizations [Disa Sauter].
We try to broaden the ways in which nonverbal communication of emotion is studied, by examining spontaneously produced expressions [Disa Sauter] and by studying expressions across different communicative channels (facial, postural [Mariska Kret], vocal [Disa Sauter], and across modalities [Gerben van Kleef]).
Our studies include oft-neglected emotions such as awe [Disa Sauter], contempt [Agneta Fischer], and amusement [Disa Sauter], as well as emotion-related phenomena such as blushing [Corine Dijk] and pupil aperture [Mariska Kret].
As part of our work on the nonverbal communication of emotions, members of AICE have developed stimulus sets that are available to other researchers; find out more [here].
In our research, we also make use of objective measures of nonverbal signals, including the Facial Action Coding System (FACS) [Agneta Fischer], FaceReader (automated facial expression software) [Peter Lewinski], and acoustic analyses [Disa Sauter].
We also investigate the interpersonal effects of emotional expressions on other individuals' emotions, cognitions, and behaviors, in both dyadic and group settings [Gerben van Kleef, Marc Heerdink].
Lewinski, P., den Uyl, T. M., & Butler, C. (2014). Automated facial coding: Validation of basic emotions and FACS AUs in FaceReader. Journal of Neuroscience, Psychology, and Economics, 7(4), 227–236.