Development of a real-time instrument tracking system for enabling the musical interaction with the Waseda flutist robot

Klaus Petersen, Jorge Solis, Atsuo Takanishi

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    9 Citations (Scopus)

    Abstract

    The aim of this paper is to create an interface for human-robot musical interaction; specifically, performance parameters (e.g. vibrato expression) of the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV) are to be manipulated. Our research focuses on enabling the WF-4RIV to interact with human musicians in a natural way. In this paper, as a first approach, we developed a vision-processing algorithm that tracks the 3D orientation and position of a musical instrument. The robot acquires image data through two cameras attached to its head. Using color histogram matching and a particle filter, the positions of the musician's hands on the instrument are tracked; analysis of these data determines the orientation and location of the instrument. These parameters are mapped to the musical expression of the WF-4RIV, specifically sound vibrato and volume values. We present preliminary experiments to determine whether the robot can dynamically change musical parameters (e.g. vibrato) while interacting with a human player. The experimental results confirm the feasibility of the interaction during a performance, although further research must be carried out to account for the physical constraints of the flutist robot.
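    The abstract describes the tracking pipeline only at a high level: color histogram matching scores candidate image regions, a particle filter keeps the hand positions stable over time, and the line between the two tracked hands gives the instrument's pose, which is mapped to vibrato and volume. The sketch below illustrates one predict/weight/resample iteration of such a tracker and a possible pose-to-expression mapping. It is a minimal sketch assuming OpenCV and NumPy; every function name, constant, and mapping range in it is an illustrative assumption, not the authors' WF-4RIV implementation.

```python
# Minimal sketch of one iteration of a color-histogram particle filter,
# in the spirit of the tracker described in the abstract. All names,
# constants, and mapping ranges are illustrative assumptions.
import cv2
import numpy as np

N_PARTICLES = 200   # number of pose hypotheses
PATCH = 20          # half-size (pixels) of the tracked hand patch
NOISE = 8.0         # std. dev. of the random-walk motion model

def hs_histogram(bgr_patch):
    """Hue-saturation histogram of an image patch, L1-normalized."""
    hsv = cv2.cvtColor(bgr_patch, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
    return cv2.normalize(hist, hist, alpha=1.0, norm_type=cv2.NORM_L1).flatten()

def track_step(frame, particles, ref_hist):
    """Predict/weight/resample; returns new particles and the estimate."""
    h, w = frame.shape[:2]
    # Predict: diffuse each particle with Gaussian noise (random walk).
    particles = particles + np.random.normal(0.0, NOISE, particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], PATCH, w - PATCH - 1)
    particles[:, 1] = np.clip(particles[:, 1], PATCH, h - PATCH - 1)
    # Weight: compare each particle's patch histogram to the reference
    # hand histogram (Bhattacharyya distance -> Gaussian-shaped weight).
    weights = np.empty(len(particles))
    for i, (x, y) in enumerate(particles.astype(int)):
        patch = frame[y - PATCH:y + PATCH, x - PATCH:x + PATCH]
        d = cv2.compareHist(hs_histogram(patch), ref_hist,
                            cv2.HISTCMP_BHATTACHARYYA)
        weights[i] = np.exp(-16.0 * d * d)
    weights /= weights.sum()
    estimate = weights @ particles  # weighted-mean hand position
    # Resample particles in proportion to their weights.
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], estimate

def map_to_expression(left_hand, right_hand):
    """Map the instrument axis (the line between the two tracked hands)
    to normalized vibrato depth and volume; ranges are assumptions."""
    dx, dy = right_hand - left_hand
    tilt = np.degrees(np.arctan2(dy, dx))   # instrument tilt angle
    length = np.hypot(dx, dy)               # apparent size ~ proximity
    vibrato = np.clip((tilt + 45.0) / 90.0, 0.0, 1.0)
    volume = np.clip(length / 300.0, 0.0, 1.0)
    return vibrato, volume
```

    In the paper's setup, a tracker of this kind would run for each hand in the images from the robot's two head-mounted cameras, with the reference histogram taken from an initial hand patch and the particles initialized around it; combining the two camera views to recover the full 3D orientation and position is omitted from this sketch.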

    Original language: English
    Title of host publication: 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS
    Pages: 313-318
    Number of pages: 6
    DOI: 10.1109/IROS.2008.4650831
    ISBN (Print): 9781424420582
    Publication status: Published - 2008
    Event: 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS, Nice
    Duration: 2008 Sep 22 - 2008 Sep 26


    Fingerprint

    Robots
    Musical instruments
    Color matching
    Human robot interaction
    Cameras
    Acoustic waves
    Processing
    Experiments

    ASJC Scopus subject areas

    • Artificial Intelligence
    • Computer Vision and Pattern Recognition
    • Control and Systems Engineering
    • Electrical and Electronic Engineering

    Cite this

    Petersen, K., Solis, J., & Takanishi, A. (2008). Development of a real-time instrument tracking system for enabling the musical interaction with the Waseda flutist robot. In 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS (pp. 313-318). [4650831] https://doi.org/10.1109/IROS.2008.4650831
