Social Interaction

Social interaction makes a difference. We think. I mean, it has to. But a difference to what? And what is it that makes interaction make a difference? Can we quantify it? Does it influence social cognitive processes that we use when not in interaction?

 
 

Social cognition in gaze-based interaction

Our overarching objective is to develop a behavioural account of how human understanding of others is both shaped by, and expressed in, our most crucial capacity: ongoing social interaction. Social interaction ostensibly adds a different quality to how we think about others (in fact, we possibly "think about others" very little during ongoing interaction), but what is that quality?

There are two candidates that spring to mind: (1) interaction allows us to observe the effect of our actions on and in others (indeed, the other's actions are that effect); (2) interaction allows for the development of complex dynamics, whereby the individuals become a system, any state of which is irreducible to a linear combination of its constituents. But so what? What can this explain? What effect does it have?

Adapted from Schilbach, Timmermans, et al. (2013), Behav Brain Sci

Oh, she follows my gaze to the blue ball! — But what does it do to me?

We implement interaction methodologically by focusing on eye gaze as one of the most salient and rich components of human communication. We look at how people experience and respond to others' reactions to their own gaze, as well as at dynamic gaze-based interactions, measured via "interactive eye-tracking".

This project is funded by a Marie Curie Career Integration Grant (FP7-PEOPLE-2013-CIG #631224 "DUALGAZE")

 
DiVA (Dual interactive eye tracking with Virtual anthropomorphic Avatars)

Dual gaze-patterns run through MdRQA (Multidimensional Recurrence Quantification Analysis)

DiVA — Uncovering gaze dynamics

When you do Dual interactive eye tracking with Virtual anthropomorphic Avatars, you know you need an acronym. I chose DiVA, as the setup was originally conceived and programmed by Iva Barisic (and she likes pink feathered boas). The setup projects people's eye-gaze patterns in real time onto the eyeballs of the avatar the other person sees, including blinks and eyelid movements. This allows us to isolate natural gaze behaviour that is displayed on a face (instead of a cursor) while still controlling for person-impression effects due to how the other looks, and it allows us to manipulate how the other looks. It also lets us tweak gaze dynamics in real time beyond mere delay, and we can easily present different realities to the two people. The setup is Python-coded in Vizard with face textures from FaceGen (occasionally we paste real faces from existing databases onto the virtual face surfaces), and the whole thing runs on EyeLink 1000 Plus machines.
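For readers curious about what drives the avatar's eyes, here is a minimal, purely illustrative Python sketch of the core mapping step (gaze coordinates on one participant's screen converted into eyeball rotations for the avatar shown to the other participant). Everything in it, from the class to the screen geometry constants, is a hypothetical placeholder and not the actual DiVA, Vizard or EyeLink code.

```python
import math
from dataclasses import dataclass

# Hypothetical stand-ins for the real pipeline (Vizard rendering, EyeLink samples).

@dataclass
class GazeSample:
    x_px: float   # horizontal gaze position on the sender's screen (pixels)
    y_px: float   # vertical gaze position (pixels)
    blink: bool   # eyelid closure detected by the tracker

SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution
VIEW_DIST_PX = 2000               # assumed eye-to-screen distance, expressed in pixels

def gaze_to_eye_angles(sample: GazeSample) -> tuple:
    """Map one gaze sample to avatar eyeball yaw/pitch (degrees) and lid closure."""
    dx = sample.x_px - SCREEN_W / 2
    dy = sample.y_px - SCREEN_H / 2
    yaw = math.degrees(math.atan2(dx, VIEW_DIST_PX))     # looking right = positive yaw
    pitch = math.degrees(math.atan2(-dy, VIEW_DIST_PX))  # looking up = positive pitch
    lid = 1.0 if sample.blink else 0.0                   # blinks are passed through
    return yaw, pitch, lid

if __name__ == "__main__":
    # A fixation in the upper-right quadrant of the partner's screen.
    yaw, pitch, lid = gaze_to_eye_angles(GazeSample(x_px=1500, y_px=300, blink=False))
    print(f"avatar eyes: yaw={yaw:.1f} deg, pitch={pitch:.1f} deg, lid closure={lid}")
```

In the actual setup, a mapping like this would sit inside the real-time update loop, which is also where delay or other manipulations of the gaze dynamics could be slotted in.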

We try to predict how certain characteristics of joint and individual tasks influence the gaze dynamics between two people, and how these gaze dynamics in turn predict their decisions or task success. To this end, we quantify those dynamics with a variety of nonlinear analysis methods (Cross Recurrence Quantification Analysis, (Multidimensional) Recurrence Quantification Analysis, ...), which also require some assumptions about how much you think the dyad is more than the sum of its parts. The setup could also be used to quantify abnormal gaze patterns between clinical groups and healthy controls and facilitate diagnosis.
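To give a flavour of the recurrence logic, here is a bare-bones sketch (not the analysis pipeline used in the project): both participants' gaze coordinates are stacked into one multidimensional signal, time points whose states fall within a chosen radius of each other are marked as recurrent, and the resulting recurrence plot is summarised, here simply by its recurrence rate. The toy dyad, the radius, and the absence of embedding or line-based measures (e.g. determinism) are all simplifying assumptions.

```python
import numpy as np

def recurrence_rate(signal: np.ndarray, radius: float) -> float:
    """Share of recurrent points in a multidimensional recurrence plot.

    `signal` has shape (time, dimensions); for a dyad it could stack both
    participants' horizontal and vertical gaze coordinates (4 columns).
    """
    # Euclidean distance between the states at every pair of time points.
    dists = np.linalg.norm(signal[:, None, :] - signal[None, :, :], axis=-1)
    recurrent = dists <= radius          # the recurrence plot itself
    np.fill_diagonal(recurrent, False)   # ignore trivial self-recurrence
    n = len(signal)
    return recurrent.sum() / (n * (n - 1))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 500)
    # Toy dyad: two noisy gaze traces loosely following a shared oscillation.
    p1 = np.column_stack([np.sin(t), np.cos(t)]) + 0.1 * rng.standard_normal((500, 2))
    p2 = np.column_stack([np.sin(t - 0.3), np.cos(t - 0.3)]) + 0.1 * rng.standard_normal((500, 2))
    dyad = np.hstack([p1, p2])           # shape (time, 4): the dyad as one system
    print(f"recurrence rate: {recurrence_rate(dyad, radius=0.5):.3f}")
```

Treating the four columns as a single signal is exactly where the "dyad as a system" assumption enters; analysing each participant separately and cross-recurring them encodes a different assumption about how much the whole exceeds its parts.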

This project forms part of DUALGAZE

 

Initiated joint attention and reward-driven learning

How do social interaction, the associated sociomotor action contingency, and in particular joint attention, shape the development and learning of our social skills? Social rewards, such as smiles, trigger the brain's reward system similarly to monetary rewards. In particular, having one's gaze followed (self-initiated joint attention, IJA) activates that system. However, the reward associated with joint attention has never been shown to impact behaviour directly. This project aims to show that IJA can facilitate learning new behaviours, which could potentially explain how we learn social skills. Monetary rewards can enable motor learning of faster saccadic eye movements towards peripheral objects, and we show that the same holds for experiencing IJA. We look at whether this is specifically tied to IJA, or whether more domain-general processes are at play and experiencing contingency is enough. Is there anything about experiencing contingency in other persons that is more rewarding than other contingencies? When I experience something else as having agency, does that make having it react to what I do more rewarding?
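Purely to illustrate the reinforcement intuition, and emphatically not our paradigm, data, or model, here is a toy simulation in which a reward delivered after faster-than-usual saccades gradually pulls latencies down. Every parameter value is an arbitrary assumption.

```python
import random

def simulate_latencies(n_trials: int = 200, rewarded: bool = True,
                       learning_rate: float = 0.02, seed: int = 1) -> list:
    """Toy delta-rule sketch: rewarding fast saccades slowly lowers mean latency."""
    rng = random.Random(seed)
    baseline_ms = 220.0   # assumed starting saccade latency
    floor_ms = 160.0      # assumed lower bound on latency
    latencies = []
    for _ in range(n_trials):
        latency = baseline_ms + rng.gauss(0, 15)   # trial-to-trial variability
        latencies.append(latency)
        if rewarded and latency < baseline_ms:
            # Reward (money, or the partner following one's gaze) reinforces the
            # faster-than-usual saccade and nudges the baseline towards it.
            baseline_ms = max(floor_ms,
                              baseline_ms - learning_rate * (baseline_ms - latency))
    return latencies

if __name__ == "__main__":
    for label, reward in (("rewarded", True), ("control", False)):
        lats = simulate_latencies(rewarded=reward)
        early, late = sum(lats[:20]) / 20, sum(lats[-20:]) / 20
        print(f"{label}: first 20 trials {early:.0f} ms -> last 20 trials {late:.0f} ms")
```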

This project forms part of DUALGAZE and is supported by a School of Psychology PhD studentship

Gaze and money: the same effect? To a large extent, so it seems.

 

Initiated joint attention modulates bias, memory...

Do the group to which my interaction partner belongs (race, gender, age, ...) and the impression I have of them influence our gaze-based interaction dynamics? And, more importantly, can better dynamics, or more contingent behaviour, positively influence my impression? Looking at classics like the other-race bias, and using measures like the IAT or reverse correlation, we examine how bias can be overcome and, more generally, how our impression of, and memory for, faces and people is influenced by aspects of our interaction.
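For readers unfamiliar with reverse correlation, here is a minimal sketch of the logic (not our stimuli or code): add random noise to a base face, let the observer choose which of two noisy versions better fits a category, and average the chosen noise into a "classification image" that approximates the observer's internal template, including any biased expectations it contains. The simulated observer, image size and trial count are assumptions for illustration.

```python
import numpy as np

def reverse_correlation(base_face: np.ndarray, choose, n_trials: int = 500,
                        seed: int = 0) -> np.ndarray:
    """Average the noise the observer prefers into a classification image.

    `base_face` is a 2D grayscale image; `choose(img_a, img_b)` returns True
    when the observer picks the first of the two noisy versions.
    """
    rng = np.random.default_rng(seed)
    chosen_noise = np.zeros_like(base_face, dtype=float)
    for _ in range(n_trials):
        noise = rng.standard_normal(base_face.shape)
        img_a, img_b = base_face + noise, base_face - noise
        chosen_noise += noise if choose(img_a, img_b) else -noise
    return chosen_noise / n_trials

if __name__ == "__main__":
    base = np.zeros((32, 32))
    # Simulated observer whose hidden template is a bright patch in the centre.
    template = np.zeros((32, 32))
    template[12:20, 12:20] = 1.0
    observer = lambda a, b: float((a * template).sum()) > float((b * template).sum())
    ci = reverse_correlation(base, observer)
    print("classification image, centre vs corner:",
          round(float(ci[12:20, 12:20].mean()), 3), "vs", round(float(ci[:4, :4].mean()), 3))
```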

 

The effects of experiencing interaction

We like people who look at us longer. In pictures, that is. But what if it isn't a picture but a video, and what if it isn't a video but we tell you it's a live video link and you can see a person in another room, and what if we tell you they see you as well, via the webcam above your screen? It turns out it's a whole different ballgame then, and the liking effect of mutual gaze is linked to knowing that's what the experiment is about. However, live interaction, irrespective of mutual gaze duration, does lead to higher likability ratings!

 

Consciousness

[to be extended soon]

 

Social agency

This project on sociomotor action contingency is carried out with Rama Chakravarthi (threeneurons.com) and straddles the social interaction and consciousness areas. The Sense of Agency (SoA), i.e., the feeling of ownership of an action and its consequences, is central to everyday life. Surprisingly, however, most studies on agency have used simple actions like button presses, whereas in everyday life most of our actions are interactions with others, wherein, just as we decide to act from our own conscious volition, they decide to react from theirs. We propose that experiencing self-agency in social interaction is different from when we engage with inanimate objects, and that our sense of self-agency depends on the degree to which we perceive the other to be an agent as well (other-agency). Specifically, we think that both the time course of the agency experience, and the degree to which it is linked to the predictability of effects, differ in interaction with another agent.

Social reward driving implicit learning

This project, again on sociomotor action contingency, is an extension of the joint attention and reward-driven learning project. However, instead of looking at how the experience of contingent gaze can drive motor learning, we look at whether and how having one's action imitated, or experiencing someone else acting contingently upon it, can improve implicit motor learning.

Subjective measures of awareness

Most of my work on consciousness has been on subjective measures of awareness: how different measures may tap different aspects of awareness, and whether they reflect the same information stream that underlies behavioural responses. I also look at how we can distinguish measures of metacognition from measures of awareness, and how task accuracy impacts either.
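As a concrete example of the kind of quantity such measures work with (a sketch of one common logic, not the specific measures used in this work): from trial-wise accuracy and binarised confidence one can compute type-2 hit and false-alarm rates, i.e. how well confidence discriminates correct from incorrect trials, which is precisely where task accuracy can contaminate the estimate.

```python
import numpy as np

def type2_rates(correct: np.ndarray, high_confidence: np.ndarray) -> tuple:
    """Type-2 hit and false-alarm rates from trial-wise accuracy and confidence.

    Type-2 hit rate         = P(high confidence | correct trial)
    Type-2 false-alarm rate = P(high confidence | incorrect trial)
    Deliberately ignores the corrections (e.g. meta-d') needed to fully
    separate metacognitive sensitivity from first-order task accuracy.
    """
    correct = np.asarray(correct, dtype=bool)
    high_confidence = np.asarray(high_confidence, dtype=bool)
    return float(high_confidence[correct].mean()), float(high_confidence[~correct].mean())

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    correct = rng.random(1000) < 0.75                 # 75% first-order accuracy
    # Confidence loosely tracks accuracy: high more often on correct trials.
    high_conf = rng.random(1000) < np.where(correct, 0.7, 0.4)
    hit, fa = type2_rates(correct, high_conf)
    print(f"type-2 hit rate {hit:.2f}, type-2 false-alarm rate {fa:.2f}")
```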

 
 

Money

  • 2019—2020 Carnegie Trust Research Incentive Grant (RIG008270), “Experiencing myself through you: Self-agency in social interaction”, £10K to B. Timmermans, PI, and R. Chakravarthi, co-PI.

  • 2017—2021 University of Aberdeen School of Psychology Master+PhD 4y stipend to M. Kasprzyk, “Gaze contingencies in joint attention as social reward that drives motor learning”

  • 2015—2019 Marie Curie Career Integration Grant (EU FP7-PEOPLE-CIG 631224), “DUALGAZE: Social cognition in gaze-based interaction”, €100K to B. Timmermans, PI.

  • 2010—2013 Volkswagen Foundation Grant (European Platform for Mind Sciences, Life Sciences, and Humanities), “Being addressed as you: Conceptual and empirical investigations of a Second-Person approach to other minds”, €317K to B. Timmermans co-PI, with L. Schilbach (UH Cologne), T. Schlicht (U Bochum), N. Steinbeis (MPI Leipzig).

  • 2009—2011 Marie Curie Intra-European Fellowship (EU FP7-PEOPLE-IEF 237502), “SOCIAL BRAIN: How does our brain learn to be social?”, €169K to B. Timmermans, PI.

  • 2017 Guarantors of Brain travel grant to C. Luke

  • 2017 EPS Grindley travel grant to C. Luke

  • 2015 School of Psychology Summer Internship stipend to M. Cederblad “Motor learning mediated by social reward of joint attention”

  • 2014 University of Aberdeen Principal’s Interdisciplinary Fund travel grant to B. Timmermans

  • 2014 University of Aberdeen Principal’s Excellence Fund travel grant to B. Timmermans

  • 2014 School of Psychology Summer Internship stipend to K. Bebus “Dual Interactive Eye Tracking”

  • 2019—2023 Member of COST Action CA18106 “The neural architecture of consciousness” (Kristian Sandberg, Aarhus University; with M. Allen, T. Bachmann, M. Brazdil, D. Carmel, A. Cleeremans, H. Critchley, Z. Dienes, I. Farkas, S. Fleming, S. Frässle, I. Griskova, B. Gutkins, G. Hesselmann, D. Nemeth, E. Norman, A. Opre, M. Schabus, C. Tallon-Baudry, C. Teufel, & M. Wilke)

  • 2015—2018 Collaborating researcher on HARMONIA grant from the Polish National Science Centre — "Cognitive and neuronal mechanisms of metacognitive awareness" (PI Michal Wierzchon, Jagiellonian University of Krakow)