A musical robot that synchronizes with a coplayer using non-verbal cues

Angelica Lim, Takeshi Mizumoto, Tetsuya Ogata, Hiroshi G. Okuno

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Music has long been used to strengthen bonds between humans. In our research, we develop musical coplayer robots with the hope that music may improve human-robot symbiosis as well. In this paper, we underline the importance of non-verbal, visual communication for ensemble synchronization at the start of, during, and at the end of a piece. We propose three cues for inter-player communication, and present a theremin-playing, singing robot that can detect them and adapt its play to a human flutist. Experiments with two naive flutists suggest that the system can recognize naturally occurring flutist gestures without requiring specialized user training. In addition, we show how the use of audio-visual aggregation can allow a robot to adapt to tempo changes quickly.
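The paper itself details how the visual cues are detected and combined with audio; purely as an illustrative sketch (not the authors' implementation), the following Python fragment shows one generic way an audio tempo estimate and a visual tempo estimate could be fused by confidence weighting, so that a faster-reacting visual cue can pull the robot's tempo toward a sudden change. All function names, parameters, and numbers here are hypothetical.

# Minimal sketch of confidence-weighted audio-visual tempo fusion.
# Assumes each modality yields a tempo estimate (BPM) plus a confidence
# score; these names and values are illustrative only.

def fuse_tempo(audio_bpm: float, audio_conf: float,
               visual_bpm: float, visual_conf: float) -> float:
    """Blend audio and visual tempo estimates by their confidences."""
    total = audio_conf + visual_conf
    if total == 0.0:
        raise ValueError("at least one modality must report confidence > 0")
    return (audio_bpm * audio_conf + visual_bpm * visual_conf) / total

# Example: during a sudden slow-down, a visual cue (e.g., the flutist's
# motion) may react before the audio beat tracker, so weighting it higher
# lets the robot adapt its playing tempo more quickly.
current_bpm = fuse_tempo(audio_bpm=118.0, audio_conf=0.4,
                         visual_bpm=104.0, visual_conf=0.6)
print(f"fused tempo estimate: {current_bpm:.1f} BPM")  # 109.6 BPM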

Original language: English
Pages (from-to): 363-381
Number of pages: 19
Journal: Advanced Robotics
Volume: 26
Issue number: 3-4
DOIs: 10.1163/156855311X614626
Publication status: Published - 2012
Externally published: Yes

Keywords

  • audio-visual integration
  • Entertainment robot
  • gesture recognition

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Human-Computer Interaction
  • Computer Science Applications
  • Hardware and Architecture
  • Software

Cite this

A musical robot that synchronizes with a coplayer using non-verbal cues. / Lim, Angelica; Mizumoto, Takeshi; Ogata, Tetsuya; Okuno, Hiroshi G.

In: Advanced Robotics, Vol. 26, No. 3-4, 2012, p. 363-381.

Research output: Contribution to journal › Article

@article{bda88d3324794fb8bbfb3f09e013cc7f,
title = "A musical robot that synchronizes with a coplayer using non-verbal cues",
abstract = "Music has long been used to strengthen bonds between humans. In our research, we develop musical coplayer robots with the hope that music may improve human-robot symbiosis as well. In this paper, we underline the importance of non-verbal, visual communication for ensemble synchronization at the start, during and end of a piece. We propose three cues for interplayer communication, and present a thereminplaying, singing robot that can detect them and adapt its play to a human flutist. Experiments with two naive flutists suggest that the system can recognize naturally occurring flutist gestures without requiring specialized user training. In addition, we show how the use of audio-visual aggregation can allow a robot to adapt to tempo changes quickly.",
keywords = "audio-visual integration, Entertainment robot, gesture recognition",
author = "Angelica Lim and Takeshi Mizumoto and Tetsuya Ogata and Okuno, {Hiroshi G.}",
year = "2012",
doi = "10.1163/156855311X614626",
language = "English",
volume = "26",
pages = "363--381",
journal = "Advanced Robotics",
issn = "0169-1864",
publisher = "Taylor and Francis Ltd.",
number = "3-4",

}

TY  - JOUR
T1  - A musical robot that synchronizes with a coplayer using non-verbal cues
AU  - Lim, Angelica
AU  - Mizumoto, Takeshi
AU  - Ogata, Tetsuya
AU  - Okuno, Hiroshi G.
PY  - 2012
Y1  - 2012
N2  - Music has long been used to strengthen bonds between humans. In our research, we develop musical coplayer robots with the hope that music may improve human-robot symbiosis as well. In this paper, we underline the importance of non-verbal, visual communication for ensemble synchronization at the start of, during, and at the end of a piece. We propose three cues for inter-player communication, and present a theremin-playing, singing robot that can detect them and adapt its play to a human flutist. Experiments with two naive flutists suggest that the system can recognize naturally occurring flutist gestures without requiring specialized user training. In addition, we show how the use of audio-visual aggregation can allow a robot to adapt to tempo changes quickly.
AB  - Music has long been used to strengthen bonds between humans. In our research, we develop musical coplayer robots with the hope that music may improve human-robot symbiosis as well. In this paper, we underline the importance of non-verbal, visual communication for ensemble synchronization at the start of, during, and at the end of a piece. We propose three cues for inter-player communication, and present a theremin-playing, singing robot that can detect them and adapt its play to a human flutist. Experiments with two naive flutists suggest that the system can recognize naturally occurring flutist gestures without requiring specialized user training. In addition, we show how the use of audio-visual aggregation can allow a robot to adapt to tempo changes quickly.
KW  - audio-visual integration
KW  - Entertainment robot
KW  - gesture recognition
UR  - http://www.scopus.com/inward/record.url?scp=84857894003&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=84857894003&partnerID=8YFLogxK
U2  - 10.1163/156855311X614626
DO  - 10.1163/156855311X614626
M3  - Article
VL  - 26
SP  - 363
EP  - 381
JO  - Advanced Robotics
JF  - Advanced Robotics
SN  - 0169-1864
IS  - 3-4
ER  -