Toward enabling a natural interaction between human musicians and musical performance robots

Implementation of a real-time gestural interface

Klaus Petersen, Jorge Solis, Atsuo Takanishi

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    14 Citations (Scopus)

    Abstract

    Our research aims to develop an anthropomorphic flutist robot as a benchmark for the better understanding of interaction between musicians and musical performance robots from a musical point of view. As a long-term goal of our research, we would like to enable such robots to play actively together with a human band, and create novel ways of musical expression. For this purpose, we focus on enhancing the perceptual capabilities of the flutist robot to process musical information coming from the aural and visual perceptual channels. In this paper, we introduce, as a first approach, a hands-free gesture-based control interface designed to modify musical parameters in real-time. In particular, we describe a set of virtual controllers that a composer can manipulate through gestures with a musical instrument. The gestures are identified by 2-D motion-sensitive areas which graphically represent common control interfaces used in music production. The resulting information from the vision processing is then transformed into MIDI messages, which are subsequently played by the flute robot. In order to verify the effectiveness of the proposed gestural interface, we performed musical-interaction experiments with human partners. From the experimental results we concluded that our method satisfies the technical and idiosyncratic requirements for being a suitable tool for musical performance.
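    The abstract's pipeline (a tracked gesture inside a 2-D motion-sensitive area is converted to a MIDI message) can be illustrated with a minimal sketch. This is not the authors' implementation; the virtual-fader geometry, the controller number, and the function name are all hypothetical, and a real system would feed in positions from a vision tracker rather than raw coordinates.

    ```python
    # Hypothetical sketch: map a tracked instrument position inside a
    # rectangular "virtual fader" area to a 3-byte MIDI control-change
    # message, mimicking the gesture-to-MIDI step described above.

    def fader_to_cc(x, y, area, controller=7, channel=0):
        """Convert a point inside a virtual fader to a MIDI CC message.

        area = (left, top, width, height) in image coordinates; the fader
        value follows the vertical position (top = max, bottom = min).
        Returns None when the point lies outside the motion-sensitive area.
        """
        left, top, w, h = area
        if not (left <= x < left + w and top <= y < top + h):
            return None                                # gesture missed this controller
        value = round((1.0 - (y - top) / h) * 127)     # scale to the 7-bit MIDI range
        status = 0xB0 | (channel & 0x0F)               # control-change status byte
        return bytes([status, controller & 0x7F, value & 0x7F])

    # A point halfway down a 100-px-tall fader yields a mid-range CC value.
    msg = fader_to_cc(50, 60, area=(40, 10, 20, 100))
    ```

    In the paper's setting, the resulting bytes would be sent to a MIDI synthesis/sequencing layer driving the flutist robot; here the message is merely constructed.
    
    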

    Original language: English
    Title of host publication: Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
    Pages: 340-345
    Number of pages: 6
    DOI: 10.1109/ROMAN.2008.4600689
    Publication status: Published - 2008
    Event: 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN - Munich
    Duration: 2008 Aug 1 - 2008 Aug 3


    Fingerprint

    Robots
    Anthropomorphic robots
    Musical instruments
    Controllers
    Processing
    Experiments

    ASJC Scopus subject areas

    • Artificial Intelligence
    • Computer Vision and Pattern Recognition
    • Human-Computer Interaction

    Cite this

    Petersen, K., Solis, J., & Takanishi, A. (2008). Toward enabling a natural interaction between human musicians and musical performance robots: Implementation of a real-time gestural interface. In Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN (pp. 340-345). [4600689] https://doi.org/10.1109/ROMAN.2008.4600689

    @inproceedings{8b18b697210641b2845b4a69b8d2f980,
    title = "Toward enabling a natural interaction between human musicians and musical performance robots: Implementation of a real-time gestural interface",
    author = "Klaus Petersen and Jorge Solis and Atsuo Takanishi",
    year = "2008",
    doi = "10.1109/ROMAN.2008.4600689",
    language = "English",
    isbn = "9781424422135",
    pages = "340--345",
    booktitle = "Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN",

    }
