Implementation of a musical performance interaction system for the Waseda flutist robot: Combining visual and acoustic sensor input based on sequential Bayesian filtering

Klaus Petersen, Jorge Solis, Atsuo Takanishi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

The flutist robot WF-4RIV at Waseda University is able to play the flute at the level of an intermediate human player. So far, the robot has been able to play in a statically sequenced duet with another musician, communicating only by keeping eye contact. To extend the interactive capabilities of the flutist robot, in previous publications we described the implementation of a Music-based Interaction System (MbIS). The purpose of this system is to combine information from the robot's visual and aural sensor signal processing systems to enable musical communication with a partner musician. In this paper we focus on the part of the MbIS that is responsible for mapping the information from the sensor processing system to meaningful modulation of the robot's musical output. We propose a two-skill-level approach to enable musicians of different ability levels to interact with the robot. When interacting with the flutist robot, the device's physical capabilities and limitations need to be taken into account. In the beginner-level interaction system, the user's input to the robot is filtered in order to adjust it to the state of the robot's breathing system. The advanced-level stage uses both the aural and visual sensor processing information. In a teaching phase, the musician teaches the robot a tone sequence (by actually performing the sequence) that they associate with a particular instrument movement. In a performance phase, the musician can trigger these taught sequences by performing the corresponding movements. Experiments to validate the functionality of the MbIS approach have been performed, and the results are presented in this paper.
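The beginner-level idea described above (filtering a musician's raw input so it stays compatible with the limits of the robot's breathing system) can be illustrated with a minimal sketch. This is not the paper's actual implementation; it is an assumed example using a 1-D Kalman filter, the simplest form of sequential Bayesian filtering, to smooth a noisy gesture signal and clamp it to a hypothetical actuator range:

```python
# Illustrative sketch only: a 1-D Kalman filter (a simple sequential
# Bayesian filter) smoothing a musician's noisy control signal before
# mapping it to a bounded actuator range, loosely analogous to keeping
# commands within the robot's breathing-system limits.
# All function names and parameter values here are assumptions,
# not taken from the paper.

def kalman_step(mean, var, measurement, process_var=0.01, meas_var=0.25):
    """One predict/update cycle of a 1-D Kalman filter."""
    # Predict: a random-walk motion model inflates the uncertainty.
    var += process_var
    # Update: blend prediction and measurement via the Kalman gain.
    gain = var / (var + meas_var)
    mean += gain * (measurement - mean)
    var *= (1.0 - gain)
    return mean, var

def clamp(x, lo=0.0, hi=1.0):
    """Restrict the command to the (assumed) physical actuator limits."""
    return max(lo, min(hi, x))

def filter_input(raw_inputs):
    """Turn a stream of raw gesture readings into bounded, smoothed commands."""
    mean, var = 0.5, 1.0  # uninformative prior at mid-range
    commands = []
    for z in raw_inputs:
        mean, var = kalman_step(mean, var, z)
        commands.append(clamp(mean))
    return commands

if __name__ == "__main__":
    # Some readings deliberately exceed the valid 0..1 range.
    noisy = [0.9, 1.3, 0.8, 1.1, 0.95]
    print(filter_input(noisy))
```

Each filtered command stays within the assumed [0, 1] actuator range even when the raw readings overshoot it, which is the kind of constraint the abstract attributes to the beginner-level stage.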

Original language: English
Title of host publication: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings
Pages: 2283-2288
Number of pages: 6
DOI: 10.1109/IROS.2010.5652576
Publication status: Published - 2010 Dec 1
Event: 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Taipei, Taiwan, Province of China
Duration: 2010 Oct 18 - 2010 Oct 22

Publication series

Name: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings

Conference

Conference: 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010
Country: Taiwan, Province of China
City: Taipei
Period: 10/10/18 - 10/10/22

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Control and Systems Engineering


  • Cite this

Petersen, K., Solis, J., & Takanishi, A. (2010). Implementation of a musical performance interaction system for the Waseda flutist robot: Combining visual and acoustic sensor input based on sequential Bayesian filtering. In IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings (pp. 2283-2288). [5652576] (IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings). https://doi.org/10.1109/IROS.2010.5652576