When people discuss something that they can both see, their attention becomes increasingly coupled. Previous studies have found that this coupling is temporally asymmetric (i.e., one person leads and the other follows) when dyads are assigned conversational roles (e.g., speaker and listener). While such studies have focused on the coupling of gaze, there is also evidence that people use their hands to coordinate attention. The present study uses a visual task to expand on this past work in two respects. First, rather than assigning conversational roles, we manipulated participants' background knowledge (e.g., expert and novice) to elicit differential roles inherent to the conversation. Second, participants were permitted to gesture freely while interacting. Cross-recurrence quantification analysis (CRQA) of data from mobile eye trackers and manually coded pointing gestures revealed that although more knowledgeable participants dominated the dialogue by talking and pointing more, the symmetry of coupled behaviors (gaze and pointing) between participants was unaffected by the knowledge manipulation. Asymmetric attentional coupling did emerge, but it depended on conversational turn-taking: regardless of background knowledge, the currently speaking participant led attention, both with the eyes and with the hands. These findings suggest stable, turn-dependent interpersonal coupling dynamics, and highlight the role of pointing gestures and conversational turn-taking in multimodal attention coordination.
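The abstract names cross-recurrence quantification analysis (CRQA) as the method used to quantify leading and following between the two participants' gaze (and pointing) streams. The sketch below illustrates the core idea on categorical data: a lag-wise cross-recurrence profile whose peak location indicates who leads and by how much. It is a minimal illustration under assumed inputs (gaze streams already coded as shared area-of-interest labels per sample); the function and variable names are illustrative, not the authors' implementation.

```python
# Minimal sketch of a categorical cross-recurrence profile, assuming two
# gaze streams already coded as shared area-of-interest (AOI) labels per
# sample. Names such as `cross_recurrence_profile`, `gaze_a`, and `gaze_b`
# are illustrative assumptions, not from the published analysis.
import numpy as np

def cross_recurrence_profile(gaze_a, gaze_b, max_lag):
    """Proportion of samples at which the two gaze streams share an AOI,
    computed for each lag from -max_lag to +max_lag (in samples).
    Positive lags mean stream A leads (B revisits A's earlier AOIs)."""
    a = np.asarray(gaze_a)
    b = np.asarray(gaze_b)
    lags = np.arange(-max_lag, max_lag + 1)
    profile = np.empty(len(lags))
    for i, lag in enumerate(lags):
        if lag >= 0:
            # Compare A at time t with B at time t + lag
            matches = a[: len(a) - lag] == b[lag:]
        else:
            # Compare A at time t with B at time t + lag (lag negative)
            matches = a[-lag:] == b[: len(b) + lag]
        profile[i] = matches.mean()
    return lags, profile

# Toy example: B roughly follows A's AOI sequence with a 3-sample delay,
# so the profile should peak at a positive lag (A leads).
rng = np.random.default_rng(0)
a = rng.integers(0, 4, size=500)   # AOI labels 0-3 for participant A
b = np.roll(a, 3)                  # participant B lags 3 samples behind A
lags, profile = cross_recurrence_profile(a, b, max_lag=10)
print("peak lag:", lags[np.argmax(profile)])  # expected: 3 (A leads by 3 samples)
```

In this framing, a profile peak at lag zero indicates symmetric coupling, whereas a peak displaced toward one participant's positive lags indicates that this participant leads, which is the kind of asymmetry the abstract attributes to the currently speaking partner.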
Copyright: © 2024 Haraped et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.