Abstract
Music has long been used to strengthen bonds between humans. In our research, we develop musical coplayer robots in the hope that music may improve human-robot symbiosis as well. In this paper, we underline the importance of non-verbal, visual communication for ensemble synchronization at the start of, during, and at the end of a piece. We propose three cues for interplayer communication, and present a theremin-playing, singing robot that can detect them and adapt its play to a human flutist. Experiments with two naive flutists suggest that the system can recognize naturally occurring flutist gestures without requiring specialized user training. In addition, we show how audio-visual aggregation allows the robot to adapt to tempo changes quickly.
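The tempo-adaptation idea can be illustrated with a minimal sketch: an audio beat-tracking estimate is blended with a visual cue-based estimate, each weighted by its confidence, so a confident visual cue can pull the tempo faster than audio tracking alone. This is an illustrative assumption, not the authors' implementation; all names, weights, and parameters below are hypothetical.

```python
"""Illustrative sketch (not the paper's system) of confidence-weighted
audio-visual tempo fusion for a musical coplayer robot."""

from dataclasses import dataclass


@dataclass
class TempoEstimate:
    bpm: float         # estimated tempo in beats per minute
    confidence: float  # 0.0 (no trust) .. 1.0 (full trust)


def fuse(audio: TempoEstimate, visual: TempoEstimate,
         prev_bpm: float, smoothing: float = 0.5) -> float:
    """Blend both sensor estimates by confidence, then smooth over time.

    When the visual cue (e.g. an exaggerated flute swing at a tempo
    change) is confident, it dominates the blend, letting the robot
    react before audio beat tracking has converged.
    """
    total = audio.confidence + visual.confidence
    if total == 0.0:
        return prev_bpm  # no usable observation: hold the current tempo
    observed = (audio.bpm * audio.confidence
                + visual.bpm * visual.confidence) / total
    # Exponential smoothing damps jitter while still tracking changes.
    return smoothing * prev_bpm + (1.0 - smoothing) * observed


if __name__ == "__main__":
    # The flutist speeds up; vision spots the gesture before the beat
    # tracker catches up, so the fused tempo moves quickly toward 132.
    bpm = 120.0
    bpm = fuse(TempoEstimate(122.0, 0.4), TempoEstimate(132.0, 0.9), bpm)
    print(f"fused tempo: {bpm:.1f} bpm")
```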
Field | Value |
---|---|
Original language | English |
Pages (from-to) | 363-381 |
Number of pages | 19 |
Journal | Advanced Robotics |
Volume | 26 |
Issue number | 3-4 |
DOIs | |
Publication status | Published - 2012 |
Externally published | Yes |
Keywords
- Entertainment robot
- Audio-visual integration
- Gesture recognition
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- Human-Computer Interaction
- Hardware and Architecture
- Computer Science Applications