A musical robot that synchronizes with a coplayer using non-verbal cues

Angelica Lim*, Takeshi Mizumoto, Tetsuya Ogata, Hiroshi G. Okuno

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

Music has long been used to strengthen bonds between humans. In our research, we develop musical coplayer robots with the hope that music may improve human-robot symbiosis as well. In this paper, we underline the importance of non-verbal, visual communication for ensemble synchronization at the start, during, and end of a piece. We propose three cues for interplayer communication, and present a theremin-playing, singing robot that can detect them and adapt its play to a human flutist. Experiments with two naive flutists suggest that the system can recognize naturally occurring flutist gestures without requiring specialized user training. In addition, we show how the use of audio-visual aggregation can allow a robot to adapt to tempo changes quickly.
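The abstract's mention of audio-visual aggregation for fast tempo adaptation can be illustrated with a minimal sketch. The snippet below assumes a confidence-weighted fusion of an audio beat-tracking tempo estimate and a visual gesture-based estimate, smoothed against the previous tempo; all names (`TempoEstimate`, `fuse_tempo`) and the fusion rule itself are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch: fuse tempo estimates from audio beat tracking and
# visual cue detection, weighting each by a confidence score. The paper's
# actual aggregation method may differ.

from dataclasses import dataclass


@dataclass
class TempoEstimate:
    bpm: float         # estimated tempo in beats per minute
    confidence: float  # detector confidence in [0, 1]


def fuse_tempo(audio: TempoEstimate, visual: TempoEstimate,
               previous_bpm: float, smoothing: float = 0.5) -> float:
    """Confidence-weighted average of audio and visual tempo estimates,
    smoothed against the previous tempo to avoid abrupt jumps."""
    total = audio.confidence + visual.confidence
    if total == 0.0:
        return previous_bpm  # no reliable observation; hold current tempo
    observed = (audio.bpm * audio.confidence +
                visual.bpm * visual.confidence) / total
    # A confident visual cue (e.g., an exaggerated flutist upbeat) pulls
    # the fused tempo toward the new value faster than audio alone would.
    return smoothing * previous_bpm + (1.0 - smoothing) * observed


if __name__ == "__main__":
    prev = 120.0
    audio = TempoEstimate(bpm=126.0, confidence=0.6)
    visual = TempoEstimate(bpm=130.0, confidence=0.9)
    print(f"fused tempo: {fuse_tempo(audio, visual, prev):.1f} BPM")
```

The design intuition is that visual cues arrive earlier than audible beat evidence, so giving them weight in the fusion lets the robot converge on a new tempo within fewer beats.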

Original language: English
Pages (from-to): 363-381
Number of pages: 19
Journal: Advanced Robotics
Volume: 26
Issue number: 3-4
DOIs
Publication status: Published - 2012
Externally published: Yes

Keywords

  • Entertainment robot
  • Audio-visual integration
  • Gesture recognition

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Science Applications
