Bowing-Net: Motion Generation for String Instruments Based on Bowing Information

Asuka Hirata, Keitaro Tanaka, Ryo Shimamura, Shigeo Morishima

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper presents a deep-learning-based method that generates body motion for string-instrument performance from raw audio. In contrast to prior methods, which aim to predict joint positions directly from audio, we first estimate the information that dictates the bowing dynamics, such as the bow direction and the played string. The final body motion is then determined from this information by a conversion rule. By adopting bowing information as the target domain, not only is the mapping easier to learn, but the produced results also exhibit bowing dynamics consistent with the given audio. Extensive experiments confirm that our results are superior to those of existing methods.
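The abstract describes a two-stage pipeline: a learned model first maps audio to bowing information (bow direction and played string), and a conversion rule then turns that information into body motion. The sketch below illustrates only the second stage with a hand-crafted rule; the function name, string angles, and stroke constants are illustrative assumptions, not the paper's actual conversion rule.

```python
import numpy as np

# Assumed per-string bow angles in radians (hypothetical values).
STRING_ANGLES = {0: 0.35, 1: 0.15, 2: -0.05, 3: -0.25}

def bowing_to_motion(directions, strings, stroke_len=0.6):
    """Convert per-frame bowing information to a 2-D bow-hand trajectory.

    directions: sequence of +1 (down-bow) / -1 (up-bow), one per frame
    strings:    sequence of string indices (0-3), one per frame
    Returns an (N, 2) array of x/y hand positions.
    """
    pos = 0.0  # position along the bow: 0 = frog, stroke_len = tip
    out = []
    for d, s in zip(directions, strings):
        # Advance along the bow in the current direction, clamped to its length.
        pos = min(max(pos + d * stroke_len / 10.0, 0.0), stroke_len)
        # Tilt the stroke according to which string is being played.
        angle = STRING_ANGLES[s]
        out.append((pos * np.cos(angle), pos * np.sin(angle)))
    return np.array(out)
```

A real system would replace the fixed increment and angle table with the paper's learned bowing-information estimates, but the structure (bowing info in, joint trajectory out) matches the pipeline the abstract describes.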

Original language: English
Title of host publication: Special Interest Group on Computer Graphics and Interactive Techniques Conference Posters, SIGGRAPH 2021
Publisher: Association for Computing Machinery, Inc
ISBN (Electronic): 9781450383714
Publication status: Published - 2021 Aug 5
Event: Special Interest Group on Computer Graphics and Interactive Techniques Conference: Posters, SIGGRAPH 2021 - Virtual, Online, United States
Duration: 2021 Aug 9 - 2021 Aug 13

Publication series

Name: Special Interest Group on Computer Graphics and Interactive Techniques Conference Posters, SIGGRAPH 2021

Conference

Conference: Special Interest Group on Computer Graphics and Interactive Techniques Conference: Posters, SIGGRAPH 2021
Country/Territory: United States
City: Virtual, Online
Period: 21/8/9 - 21/8/13

Keywords

  • Motion generation
  • Music information retrieval
  • Neural networks

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Human-Computer Interaction

