Interindividual differences in dyadic human interactions

In real-life social interactions, we constantly monitor others and adjust our own actions accordingly. During these close interactions, people look at each other's faces and listen to each other's utterances, quickly picking up on multiple socio-emotional cues. Faces and vocalisations convey rich information, such as the identity of others, their internal mental states, emotions, and intentions, as well as their suitability as social partners, e.g. in terms of trustworthiness and formidability.

We simulate real-life dyadic interaction in the laboratory. Participants work together at a transparent dyadic interaction platform on a task that simulates a dynamic foraging situation with continuous movement. Both partners can collect individual or joint targets, acting competitively or cooperatively. In this experimental setup, socio-emotional cues are dynamically available to both partners and can be used for individual and joint decision-making. We aim to investigate which of these cues attract attention, enhance neural processing, and shape decisions at the individual and dyadic levels.
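To make the task structure concrete, here is a minimal, purely illustrative Python sketch of one way per-frame target scoring in such a competitive/cooperative foraging task could work. The names, capture radius, and payoff values are hypothetical and are not taken from the actual paradigm.

```python
# Purely illustrative sketch of per-frame target scoring in a continuous
# dyadic foraging task. All names, the capture radius, and payoff values
# are hypothetical; they are not taken from the actual paradigm.
from dataclasses import dataclass
import math

CAPTURE_RADIUS = 0.05  # hypothetical reach threshold in normalised units

@dataclass
class Target:
    x: float
    y: float
    joint: bool  # joint targets pay off only if both partners reach them

def within_reach(px: float, py: float, t: Target) -> bool:
    return math.hypot(px - t.x, py - t.y) <= CAPTURE_RADIUS

def score_frame(p1, p2, targets):
    """Score one frame; p1 and p2 are (x, y) partner positions.

    Joint targets reward both partners when reached together (cooperation);
    individual targets reward whoever is within reach (competition; a real
    task would also need a tie-breaking rule).
    """
    s1 = s2 = 0
    remaining = []
    for t in targets:
        r1 = within_reach(*p1, t)
        r2 = within_reach(*p2, t)
        if t.joint and r1 and r2:
            s1 += 1
            s2 += 1
        elif not t.joint and (r1 or r2):
            s1 += int(r1)
            s2 += int(r2)
        else:
            remaining.append(t)  # uncollected targets stay on the platform
    return s1, s2, remaining

# Example: one individual and one joint target, both partners nearby.
scores = score_frame((0.5, 0.5), (0.5, 0.52),
                     [Target(0.5, 0.5, joint=False), Target(0.5, 0.51, joint=True)])
```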

This project is part of the SFB 1528 – Cognition of Interaction (project C02): https://uni-goettingen.de/de/research+area+c/653292.html


Methods

EEG, eye-tracking (gaze behaviour in 2D and 3D space), ECG, video recordings (for facial expression recognition), and audio recordings (of non-verbal utterances)
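The project text does not specify how these recording streams are synchronised. As a hedged illustration only, the sketch below assumes Lab Streaming Layer (via pylsl) is used to put EEG and eye-tracking samples on a common clock; the stream types 'EEG' and 'Gaze' are likewise assumptions.

```python
# Hedged sketch: aligning two recording streams on a shared clock with
# Lab Streaming Layer (pylsl). Using LSL at all, and the stream types
# 'EEG' and 'Gaze', are assumptions; the project text names the
# modalities but not the synchronisation tooling.
from pylsl import StreamInlet, resolve_byprop

def open_inlet(stream_type: str) -> StreamInlet:
    """Resolve the first stream of the given type and open an inlet."""
    streams = resolve_byprop('type', stream_type, timeout=5.0)
    if not streams:
        raise RuntimeError(f'no {stream_type!r} stream found on the network')
    return StreamInlet(streams[0])

eeg = open_inlet('EEG')
gaze = open_inlet('Gaze')

for _ in range(1000):
    # pull_sample returns (sample, device_timestamp); adding the inlet's
    # time_correction() maps each timestamp onto the common LSL clock,
    # so EEG, gaze, ECG, audio, and video markers can be aligned offline.
    eeg_sample, t_eeg = eeg.pull_sample()
    gaze_sample, t_gaze = gaze.pull_sample()
    t_eeg += eeg.time_correction()
    t_gaze += gaze.time_correction()
```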


Contact

anna.fischer@uni-goettingen.de