Musical-based interaction system for the Waseda Flutist Robot

Implementation of the visual tracking interaction module

Klaus Petersen, Jorge Solis, Atsuo Takanishi

    Research output: Contribution to journal › Article

    16 Citations (Scopus)

    Abstract

    Since 1990, the development of the Anthropomorphic Flutist Robot at Waseda University has focused on mechanically reproducing the physiology of the organs involved in flute playing (i.e., lungs, lips, etc.) and on implementing basic cognitive capabilities for interacting with beginner flutists. As a result of the research efforts made so far, the Waseda Flutist Robot is considered to play the flute at a level comparable to that of an intermediate human player. However, further research is required to extend the robot's capabilities for interacting with musical partners. In this paper, we propose as a long-term goal enabling the flutist robot to interact more naturally with musical partners in the context of a jazz band. For this purpose, a Musical-Based Interaction System (MbIS) is proposed to enable the robot to process both visual and aural cues arising during the interaction with musicians. In particular, this paper presents the implementation details of the visual tracking module on the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV). The visual tracking module comprises two levels of interaction: basic (a visual interface in which the musician controls virtual buttons and faders) and advanced (an instrument tracking system that lets the robot process motion gestures performed by the musical partner in real time and map them directly onto musical parameters of its performance). The experiments carried out focused on verifying the effectiveness and usability of the proposed levels of interaction, in particular on determining how well the WF-4RIV dynamically changes musical parameters while interacting with a human player. From the experimental results we observed that the physical constraints of the robot play an important role during the interaction. Although further improvements are needed to overcome these constraints, we expect that the interaction experience will become more natural.
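
    The record's "Particle filter" keyword suggests the advanced-level instrument tracking is built on particle filtering. Purely as a hedged illustration (the paper's actual state space, observation model, and gesture-to-parameter mapping are not given in this record), the Python sketch below shows a minimal particle filter that tracks an instrument tip in image coordinates and maps the estimate onto a normalised musical parameter; the frame size, noise values, and function names here are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        N_PARTICLES = 500        # number of position hypotheses (hypothetical value)
        PROCESS_NOISE = 5.0      # assumed random-walk motion jitter, in pixels
        FRAME_SIZE = (640, 480)  # assumed camera resolution (w, h)

        # Particles are (x, y) image positions, initialised uniformly over the frame.
        particles = rng.uniform((0, 0), FRAME_SIZE, size=(N_PARTICLES, 2))

        def step(measurement, meas_noise=15.0):
            """One predict/update/resample cycle; `measurement` is a hypothetical
            (x, y) detection of the instrument tip, e.g. from colour segmentation."""
            global particles
            # Predict: diffuse particles with a random-walk motion model.
            particles = particles + rng.normal(0.0, PROCESS_NOISE, particles.shape)
            # Update: weight each particle by a Gaussian likelihood of the detection.
            d2 = np.sum((particles - np.asarray(measurement)) ** 2, axis=1)
            weights = np.exp(-d2 / (2.0 * meas_noise ** 2))
            weights /= weights.sum()
            # Resample to concentrate particles on likely positions.
            particles = particles[rng.choice(N_PARTICLES, N_PARTICLES, p=weights)]
            return particles.mean(axis=0)  # state estimate

        def to_music_param(estimate):
            """Map the tracked vertical position to a normalised parameter in [0, 1]
            (e.g. vibrato depth); this mapping is illustrative, not the paper's."""
            y = np.clip(estimate[1] / FRAME_SIZE[1], 0.0, 1.0)
            return 1.0 - y  # raising the instrument increases the parameter

    In such a pipeline, each camera frame would supply a detection to step(), and the returned value could drive a performance parameter such as vibrato depth or volume; the mapping actually used by the WF-4RIV is described only in the full paper.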

    Original language: English
    Pages (from-to): 471-488
    Number of pages: 18
    Journal: Autonomous Robots
    Volume: 28
    Issue number: 4
    DOIs: 10.1007/s10514-010-9180-5
    Publication status: Published - May 2010

    Keywords

    • Human-robot interaction
    • Music
    • Particle filter

    ASJC Scopus subject areas

    • Artificial Intelligence

    Cite this

    Musical-based interaction system for the Waseda Flutist Robot: Implementation of the visual tracking interaction module. / Petersen, Klaus; Solis, Jorge; Takanishi, Atsuo.

    In: Autonomous Robots, Vol. 28, No. 4, 05.2010, pp. 471-488.

    Research output: Contribution to journal › Article

    @article{34810d16305f49698d58885c97e532b1,
    title = "Musical-based interaction system for the Waseda Flutist Robot: Implementation of the visual tracking interaction module",
    abstract = "Since 1990, at Waseda University the development on the Anthropomorphic Flutist Robot has been focused on mechanically reproducing the physiology of the organs involved during the flute playing (i.e. lungs, lips, etc.) and implementing basic cognitive capabilities to interact with flutist beginners. As a results of the research efforts done until now, the Waseda Flutist Robot is considered to play the flute nearly similar to the performance of a intermediate human player. However, we consider that in order to extend the interaction capabilities of the flutist robot with musical partners, further research efforts should be done. In this paper, we propose as a long-term goal to enable the flutist robot to interact more naturally with musical partners on the context of a Jazz band. For this purpose a Musical-Based Interaction System (MbIS) is proposed to enable the robot the process both visual and aural cues coming throughout the interaction with musicians. In particular, in this paper, the details of the implementation of the visual tracking module on the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV) is presented. The visual tracking module is composed by two levels of interaction: basic (visual interface for the musician based on controlling virtual buttons and faders) and advanced (instrument tracking system so that the robot can process motion gestures performed by the musical partner in real-time which are then directly mapped into musical parameters of the performance of the robot). The experiments carried out were focused in verifying the effectiveness and usability of the proposed levels of interaction. In particular, we focused on determining how well our the WF-4RIV dynamically changes musical parameters while interacting with a human player. From the experimental results we observed that the physical constraints of the robot play an important role during the interaction. Although further improvements should be done to overcome such constrains, we expect that the interaction experience may become more natural.",
    keywords = "Human-robot interaction, Music, Particle filter",
    author = "Klaus Petersen and Jorge Solis and Atsuo Takanishi",
    year = "2010",
    month = "5",
    doi = "10.1007/s10514-010-9180-5",
    language = "English",
    volume = "28",
    pages = "471--488",
    journal = "Autonomous Robots",
    issn = "0929-5593",
    publisher = "Springer Netherlands",
    number = "4",

    }
