A role of multi-modal rhythms in physical interaction and cooperation

Kenta Yonekura, Chyon Hae Kim*, Kazuhiro Nakadai, Hiroshi Tsujino, Shigeki Sugano

*Corresponding author for this work

Research output: Article, peer-reviewed

4 citations (Scopus)

Abstract

As fundamental research for human-robot interaction, this paper addresses the rhythmic reference of a human while turning a rope with another human. We hypothesized that, when interpreting rhythm cues to form a rhythm reference, humans rely more on auditory and force rhythms than on visual ones. The test subjects were 21-23 years old. We masked each subject's perception using three kinds of masks: an eye-mask, headphones, and a force mask. The force mask consists of a robot arm and a remote controller, which allow a subject to turn a rope without feeling force from the rope. In the first experiment, each subject interacted with an operator who turned a rope at a constant rhythm. Eight trials were conducted for each subject, who wore different combinations of masks. We measured the angular velocity of the force between the subject/operator and the rope, calculated the error between the angular velocities of the force directions, and evaluated that error. In the second experiment, two subjects interacted with each other. A 1.6-2.4 Hz auditory rhythm was presented through headphones to indicate the target turning frequency. In addition to the auditory rhythm, the subjects wore eye-masks. The first experiment showed that visual rhythm has little influence on rope-turning cooperation between humans. The second experiment provided firmer evidence for the same hypothesis, because the subjects neglected their visual rhythms.
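The abstract does not specify how the angular-velocity error was computed; a minimal sketch, assuming finite-difference angular velocities from force-direction angles and an RMS error metric (function names and the synthetic 2.0 Hz vs. 2.1 Hz rhythms are illustrative, not from the paper):

```python
import numpy as np

def angular_velocity(angles, dt):
    """Finite-difference angular velocity (rad/s) from a force-direction angle series."""
    return np.diff(np.unwrap(angles)) / dt

def rms_error(w_a, w_b):
    """RMS error between two angular-velocity series of equal length."""
    return float(np.sqrt(np.mean((w_a - w_b) ** 2)))

# Synthetic example: two constant turning rhythms, slightly different in rate.
dt = 0.01
t = np.arange(0.0, 5.0, dt)
theta_a = 2.0 * np.pi * 2.0 * t   # subject turning at 2.0 Hz
theta_b = 2.0 * np.pi * 2.1 * t   # operator turning at 2.1 Hz
err = rms_error(angular_velocity(theta_a, dt), angular_velocity(theta_b, dt))
```

For constant rhythms the error reduces to the constant angular-velocity gap, here 2π x 0.1 ≈ 0.63 rad/s; with real measurements the same metric captures moment-to-moment synchronization error.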

Original language: English
Article number: 12
Journal: Eurasip Journal on Audio, Speech, and Music Processing
Volume: 2012
Issue: 1
DOI
Publication status: Published - 2012

ASJC Scopus subject areas

  • Acoustics and Ultrasonics
  • Electrical and Electronic Engineering

