Robot musical accompaniment

Integrating audio and visual cues for real-time synchronization with a human flutist

Angelica Lim, Takeshi Mizumoto, Louis Kenzo Cahier, Takuma Otsuka, Toru Takahashi, Kazunori Komatani, Tetsuya Ogata, Hiroshi G. Okuno

Research output: Conference contribution

22 Citations (Scopus)

Abstract

Musicians often face the following problem: they have a music score that requires two or more players, but no one with whom to practice. Score-playing music robots exist, but they lack the adaptive ability to synchronize with fellow players' tempo variations: if the human speeds up, the robot should speed up too. Computer accompaniment systems, on the other hand, provide exactly this kind of adaptivity. We present a first step towards giving these accompaniment abilities to a music robot. We introduce a new paradigm of beat tracking that uses two types of sensory input, visual and audio, combining our own visual cue recognition system with state-of-the-art acoustic onset detection techniques. Preliminary experiments suggest that by coupling these two modalities, a robot accompanist can start and stop a performance in synchrony with a flutist and detect tempo changes within half a second.
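The two-modality idea in the abstract can be illustrated with a minimal sketch: a visual cue (e.g. the flutist raising or lowering the instrument) gates when the accompanist plays, while audio note onsets drive a running tempo estimate over a short window so tempo changes are picked up quickly. All names and details below are hypothetical illustrations, not the authors' actual system, which uses its own visual cue recognizer and onset detector.

```python
# Illustrative sketch only: visual cues gate start/stop, audio onsets
# feed a windowed tempo estimate. Not the paper's implementation.

def estimate_tempo_bpm(onset_times):
    """Estimate tempo from inter-onset intervals (median for robustness)."""
    if len(onset_times) < 2:
        return None
    intervals = sorted(b - a for a, b in zip(onset_times, onset_times[1:]))
    median = intervals[len(intervals) // 2]
    return 60.0 / median

class Accompanist:
    def __init__(self):
        self.playing = False
        self.onsets = []

    def on_visual_cue(self, cue):
        # e.g. flutist raises the flute to start, lowers it to stop
        if cue == "start":
            self.playing = True
        elif cue == "stop":
            self.playing = False

    def on_audio_onset(self, t, window=4):
        # Keep only the most recent onsets so a tempo change shows up
        # within a beat or two instead of being averaged away.
        self.onsets.append(t)
        self.onsets = self.onsets[-window:]
        return estimate_tempo_bpm(self.onsets) if self.playing else None
```

With onsets every 0.5 s the estimate is 120 BPM; because only the last few onsets are kept, a shift to 0.4 s intervals would register within a handful of beats, loosely mirroring the sub-second responsiveness the abstract reports.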

Original language: English
Title of host publication: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings
Pages: 1964-1969
Number of pages: 6
DOI: 10.1109/IROS.2010.5650427
ISBN (Print): 9781424466757
Publication status: Published - 2010
Externally published: Yes
Event: 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Taipei
Duration: 18 Oct 2010 - 22 Oct 2010



ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Control and Systems Engineering

Cite this

Lim, A., Mizumoto, T., Cahier, L. K., Otsuka, T., Takahashi, T., Komatani, K., ... Okuno, H. G. (2010). Robot musical accompaniment: Integrating audio and visual cues for real-time synchronization with a human flutist. In IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings (pp. 1964-1969). [5650427] https://doi.org/10.1109/IROS.2010.5650427
