Development of a real-time gestural interface for hands-free musical performance control

Klaus Petersen, Jorge Solis, Atsuo Takanishi

Research output: Paper, peer-reviewed


Our research aims to develop an anthropomorphic flutist robot (WF-4RIV) as a benchmark for better understanding the interaction between musicians and musical performance robots from a musical point of view. As a long-term goal of our research, we would like to enable such robots to play actively together with a human band, and to create novel ways of musical expression. For this purpose, we focus on enhancing the perceptual capabilities of the flutist robot to process musical information coming from the aural and visual perceptual channels. In this paper, we introduce, as a first approach, a hands-free gesture-based control interface designed to modify musical parameters in real-time. In particular, we describe a set of virtual controllers that a composer can manipulate through gestures of the body or a musical instrument. The gestures are identified by 2-D motion-sensitive areas which graphically represent common control interfaces used in music production: trigger pads and faders. The resulting information from the vision processing is transformed into MIDI messages, which are subsequently played by the WF-4RIV. In order to verify the effectiveness of the proposed gestural interface, we performed experiments to control a synthesizer and then to musically interact with the WF-4RIV. From the experimental results we concluded that our method satisfies the technical and idiosyncratic requirements for being a suitable tool for musical performance and composition.
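The mapping the abstract describes — motion detected inside a 2-D region translated into MIDI messages — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the region layout, function names, and the choice of controller number and note are assumptions made for the example. A "fader" region maps a vertical position to a 7-bit control-change value, and a "trigger pad" region fires a note-on when motion is detected inside it.

```python
# Illustrative sketch of the fader/trigger-pad idea from the paper.
# All names, region coordinates, and MIDI assignments here are
# assumptions for demonstration, not the WF-4RIV system's actual code.

def fader_to_cc(y, region_top, region_bottom, cc_number=7, channel=0):
    """Map a detected vertical position inside a fader region to a
    3-byte MIDI control-change message (status, controller, value)."""
    # Clamp the detected position to the region, then normalise to 0..1.
    y = max(region_top, min(region_bottom, y))
    norm = (y - region_top) / (region_bottom - region_top)
    # Top of the fader yields the maximum value; scale to MIDI's 0..127.
    value = round((1.0 - norm) * 127)
    status = 0xB0 | (channel & 0x0F)  # control change on this channel
    return bytes([status, cc_number, value])

def pad_trigger(note=60, velocity=100, channel=0):
    """Emit a MIDI note-on message when motion enters a trigger-pad region."""
    status = 0x90 | (channel & 0x0F)  # note on
    return bytes([status, note, velocity])
```

Raw 3-byte messages like these can then be handed to any MIDI output port (or, as in the paper, routed to a synthesizer or to the robot's performance system).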

Publication status: Published - 2008 Jan 1
Event: International Computer Music Conference, ICMC 2008 - Belfast, Ireland
Duration: 2008 Aug 24 - 2008 Aug 29


Conference: International Computer Music Conference, ICMC 2008

ASJC Scopus subject areas

  • Computer Science Applications
  • Media Technology
  • Music

