Emotional expressivity of the observer mediates recognition of affective states from human body movements
Research on human motion perception shows that people are highly adept at inferring emotional states from body movements. Yet this process is mediated by a range of individual factors and experiences. In this study, we address two questions. First, which part of the body transmits the key information used to infer affective states? Second, how does the observer's own emotional expressivity influence the recognition process? We used two types of impoverished point-light displays depicting the same emotional interactions as either arm or trunk movements. Results showed that participants drew on different sources of information in an emotion-specific manner. Participants with richer self-reported emotional expressivity achieved higher recognition accuracy overall and also benefited more from the information conveyed by arm gestures. We interpret these findings in terms of embodied simulation, suggesting that emotion perception is a joint function of the expressing body and the individual observer.