Development of a real-time gestural interface for hands-free musical performance control

Klaus Petersen, Jorge Solis, Atsuo Takanishi

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Our research aims to develop an anthropomorphic flutist robot (WF-4RIV) as a benchmark for better understanding the interaction between musicians and musical performance robots from a musical point of view. As a long-term goal of our research, we would like to enable such robots to play actively together with a human band and create novel ways of musical expression. For this purpose, we focus on enhancing the perceptual capabilities of the flutist robot to process musical information coming from the aural and visual perceptual channels. In this paper, we introduce, as a first approach, a hands-free gesture-based control interface designed to modify musical parameters in real time. In particular, we describe a set of virtual controllers that a composer can manipulate through gestures of the body or a musical instrument. The gestures are identified by 2-D motion-sensitive areas which graphically represent common control interfaces used in music production: trigger pads and faders. The resulting information from the vision processing is transformed into MIDI messages, which are subsequently played by the WF-4RIV. In order to verify the effectiveness of the proposed gestural interface, we performed experiments to control a synthesizer and then to musically interact with the WF-4RIV. From the experimental results, we concluded that our method satisfies the technical and idiosyncratic requirements of a suitable tool for musical performance and composition.
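
    The pipeline outlined in the abstract (motion detected inside 2-D screen regions, mapped to trigger-pad and fader MIDI messages) can be illustrated with a minimal sketch. The Python code below is not the authors' implementation; the use of OpenCV frame differencing for the vision step, the 'mido' library for MIDI output, and all region coordinates, thresholds, and note/controller mappings are illustrative assumptions.

        # Hypothetical sketch: motion inside 2-D screen regions is mapped to
        # MIDI messages (trigger pad -> note, fader -> control change).
        # OpenCV/mido usage and all constants are illustrative assumptions.
        import cv2
        import mido

        PAD = (50, 50, 150, 150)      # trigger pad region: x0, y0, x1, y1
        FADER = (200, 50, 240, 250)   # fader region: x0, y0, x1, y1
        MOTION_THRESHOLD = 500        # assumed: min changed pixels for a "hit"

        out = mido.open_output()      # default MIDI output port
        cap = cv2.VideoCapture(0)
        _, prev = cap.read()
        prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            diff = cv2.absdiff(gray, prev)             # frame differencing
            _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
            prev = gray

            # Trigger pad: enough motion inside the region fires a note
            # (a real pad would also debounce and send a matching note_off).
            x0, y0, x1, y1 = PAD
            if cv2.countNonZero(mask[y0:y1, x0:x1]) > MOTION_THRESHOLD:
                out.send(mido.Message('note_on', note=60, velocity=100))

            # Fader: the vertical centroid of motion inside the region is
            # scaled to a 0-127 control-change value (CC 7, channel volume).
            x0, y0, x1, y1 = FADER
            m = cv2.moments(mask[y0:y1, x0:x1], binaryImage=True)
            if m['m00'] > 0:
                cy = m['m01'] / m['m00']
                out.send(mido.Message('control_change', control=7,
                                      value=int(127 * (1 - cy / (y1 - y0)))))

            cv2.imshow('virtual controllers', frame)
            if cv2.waitKey(1) & 0xFF == ord('q'):
                break

        cap.release()
        cv2.destroyAllWindows()

    Because the regions are drawn on screen, the performer gets immediate visual feedback while both hands remain free to hold an instrument, which is the core idea behind the hands-free interface.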

    Original language: English
    Title of host publication: International Computer Music Conference, ICMC 2008
    Publisher: International Computer Music Association
    Publication status: Published - 2008
    Event: International Computer Music Conference, ICMC 2008 - Belfast
    Duration: 2008 Aug 24 - 2008 Aug 29

    ASJC Scopus subject areas

    • Computer Science Applications
    • Media Technology
    • Music

    Cite this

    Petersen, K., Solis, J., & Takanishi, A. (2008). Development of a real-time gestural interface for hands-free musical performance control. In International Computer Music Conference, ICMC 2008. International Computer Music Association.
