Music, body, and machine: gesture-based synchronization in human-robot musical interaction

Front Robot AI. 2024 Dec 5:11:1461615. doi: 10.3389/frobt.2024.1461615. eCollection 2024.

Abstract

Musical performance relies on nonverbal cues to convey information among musicians. Human musicians use bodily gestures to communicate their interpretation and intentions to their collaborators, from mood and expression to anticipatory cues regarding structure and tempo. Robotic musicians can use their physical bodies in a similar way when interacting with fellow musicians. This paper presents a new theoretical framework for classifying musical gestures and a study evaluating the effect of robotic gestures on synchronization between human musicians and Shimon, a robotic marimba player developed at Georgia Tech. Shimon uses head and arm movements to signify musical information such as expected notes, tempo, and beat. The study, in which piano players were asked to play along with Shimon, assessed the effect of these gestures on human-robot synchronization. Subjects were evaluated on their ability to synchronize with unknown tempo changes as communicated by Shimon's ancillary and social gestures. The results demonstrate the significant contribution of non-instrumental gestures to human-robot synchronization, highlighting the importance of non-music-making gestures for anticipation and coordination in human-robot musical collaboration. Subjects also reported more positive feelings when interacting with the robot's ancillary and social gestures, suggesting that these gestures support engaging and enjoyable musical experiences.

Keywords: human-robot interaction; music; robotic gestures; robotic musicianship; robots; synchronization.

Grants and funding

The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.