Development of the Waseda Flutist Robot No. 4 Refined IV: Implementation of a real-time interaction system with human partners

Klaus Petersen, Jorge Solis, Koichi Taniguchi, Takeshi Ninomiya, Tetsuro Yamamoto, Atsuo Takanishi

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    2 Citations (Scopus)

    Abstract

    The aim of our research is to develop an anthropomorphic flutist robot that on the one hand reproduces the human motor skills required for playing the flute, and on the other hand displays cognitive capabilities for interacting with other (human) musicians. In this paper, we detail recent mechanical improvements on the Waseda Flutist Robot (WF-4RIV) that enhance the realistic production of the flute sound. In particular, improved lip, oral-cavity, and tonguing mechanisms are introduced and described: the ability to deform the lip shape with 3-DOF allows us to accurately control the characteristics of the air stream (width, thickness, and angle), and an improved tonguing mechanism (1-DOF) has been designed to reproduce double tonguing. Furthermore, we present the implementation of a real-time interaction system with human partners. As a first approach, we developed a vision processing algorithm to track the 3D orientation and position of a musical instrument: image data is recorded using two cameras attached to the head of the robot and processed in real time. The proposed algorithm is based on color histogram matching and particle filter techniques to follow the position of a musician's hands on an instrument. Data analysis enables us to determine the orientation and location of the instrument, and we map these parameters to musical performance parameters of the WF-4RIV, such as sound vibrato and sound volume. A set of experiments was carried out to verify the effectiveness of the proposed tracking system during interaction with a human player. We conclude that the quality of the musical performance of the WF-4RIV and its ability to interact with musical partners have been significantly improved by the techniques proposed in this paper.
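    The tracking approach the abstract describes — colour-histogram matching combined with a particle filter — can be sketched in a self-contained way. The code below is an illustrative reconstruction, not the authors' implementation: it tracks a bright square (standing in for a hand) across synthetic grayscale frames using intensity histograms, a random-walk motion model, and weighted resampling. All function names, parameters, and the synthetic data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def intensity_histogram(patch, bins=8):
    """Normalized intensity histogram of a patch (stand-in for a colour histogram)."""
    counts, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    counts = counts.astype(float)
    return counts / (counts.sum() + 1e-12)

def bhattacharyya(p, q):
    """Similarity of two normalized histograms: 1.0 means identical."""
    return float(np.sum(np.sqrt(p * q)))

def track(frames, ref_hist, n_particles=200, half=6, sigma=3.0, sharpness=10):
    """Particle-filter tracker: predict (random walk), weight (histogram match), resample."""
    H, W = frames[0].shape
    particles = np.column_stack([
        rng.uniform(half, H - half - 1, n_particles),   # row coordinates
        rng.uniform(half, W - half - 1, n_particles),   # column coordinates
    ])
    estimates = []
    for frame in frames:
        # Predict: diffuse particles with a random-walk motion model.
        particles += rng.normal(0.0, sigma, particles.shape)
        particles[:, 0] = np.clip(particles[:, 0], half, H - half - 1)
        particles[:, 1] = np.clip(particles[:, 1], half, W - half - 1)
        # Update: weight each particle by how well its patch matches the reference.
        weights = np.empty(n_particles)
        for i, (y, x) in enumerate(particles.astype(int)):
            patch = frame[y - half:y + half, x - half:x + half]
            weights[i] = bhattacharyya(intensity_histogram(patch), ref_hist) ** sharpness
        weights /= weights.sum() + 1e-12
        estimates.append(weights @ particles)           # weighted-mean state estimate
        # Resample particles in proportion to their weights.
        particles = particles[rng.choice(n_particles, n_particles, p=weights)]
    return np.array(estimates)

# Synthetic demo: a bright 10x10 square (the "hand") drifts diagonally across dark frames.
def make_frame(cy, cx, size=64, s=5):
    frame = np.zeros((size, size))
    frame[cy - s:cy + s, cx - s:cx + s] = 1.0
    return frame

centers = [(20 + t, 20 + t) for t in range(15)]
frames = [make_frame(cy, cx) for cy, cx in centers]
ref_hist = intensity_histogram(frames[0][14:26, 14:26])  # reference patch around the first centre
est = track(frames, ref_hist)
print(est[-1])  # tracked centre; should lie near the square's final position (34, 34)
```

    In the paper's setting, the tracked hand positions on the instrument are further analysed for instrument orientation and then mapped to performance parameters such as vibrato and volume; the demo above stops at the position estimate.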

    Original language: English
    Title of host publication: Proceedings of the 2nd Biennial IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, BioRob 2008
    Pages: 421-426
    Number of pages: 6
    DOI: 10.1109/BIOROB.2008.4762798
    ISBN: 9781424428830
    Publication status: Published - 2008
    Event: 2nd Biennial IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, BioRob 2008 - Scottsdale, AZ
    Duration: 2008 Oct 19 - 2008 Oct 22



    ASJC Scopus subject areas

    • Artificial Intelligence
    • Computer Vision and Pattern Recognition
    • Biomedical Engineering

    Cite this

    Petersen, K., Solis, J., Taniguchi, K., Ninomiya, T., Yamamoto, T., & Takanishi, A. (2008). Development of the Waseda Flutist Robot No. 4 Refined IV: Implementation of a real-time interaction system with human partners. In Proceedings of the 2nd Biennial IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, BioRob 2008 (pp. 421-426). [4762798] https://doi.org/10.1109/BIOROB.2008.4762798
