A role of multi-modal rhythms in physical interaction and cooperation

Kenta Yonekura, Chyon Hae Kim, Kazuhiro Nakadai, Hiroshi Tsujino, Shigeki Sugano

Research output: Article

4 Citations (Scopus)

Abstract

As fundamental research on human-robot interaction, this paper addresses the rhythmic reference that a human uses while turning a rope with another human. We hypothesized that, when interpreting rhythm cues to form a rhythm reference, humans rely more on auditory and force rhythms than on visual ones. The test subjects were 21-23 years old. We masked each subject's perception with three kinds of masks: an eye-mask, headphones, and a force mask. The force mask consists of a robot arm and a remote controller, which together allow a subject to turn a rope without feeling force from it. In the first experiment, each subject interacted with an operator who turned a rope at a constant rhythm. Eight trials were conducted per subject, one for each combination of the three masks. We measured the force between each person (subject or operator) and the rope, computed the angular velocity of its direction, and evaluated the error between the two angular velocities. In the second experiment, two subjects interacted with each other. An auditory rhythm of 1.6-2.4 Hz was presented through headphones to indicate the target turning frequency, and in addition to the auditory rhythm the subjects wore eye-masks. The first experiment showed that visual rhythm has little influence on rope-turning cooperation between humans. The second experiment provided firmer evidence for the same hypothesis, because the subjects neglected their visual rhythms.
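The abstract does not spell out how the angular-velocity error was computed. Below is a minimal sketch of one plausible computation, assuming the force on each handle is logged as 2-D vectors in the rope's rotation plane at a fixed sampling rate and that an RMS measure is used for the error. All names (angular_velocity, rms_error, the synthetic 2.0 Hz data) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def angular_velocity(force_xy, dt):
    """Angular velocity (rad/s) of the force direction.

    force_xy: (N, 2) array of force vectors in the rope's rotation
    plane, sampled at a fixed interval dt (s). The direction angle is
    unwrapped before differentiation so full turns do not appear as
    2*pi jumps.
    """
    angle = np.unwrap(np.arctan2(force_xy[:, 1], force_xy[:, 0]))
    return np.gradient(angle, dt)

def rms_error(omega_a, omega_b):
    """RMS difference between two angular-velocity traces (rad/s)."""
    return np.sqrt(np.mean((omega_a - omega_b) ** 2))

# Synthetic example: the operator turns at a steady 2.0 Hz while the
# subject tracks it with small random phase fluctuations.
dt = 0.01                                    # assumed 100 Hz sampling
t = np.arange(0.0, 10.0, dt)
phase_op = 2 * np.pi * 2.0 * t               # constant 2.0 Hz rotation
phase_sub = phase_op + 0.05 * np.cumsum(np.random.randn(t.size)) * dt

force_op = np.column_stack([np.cos(phase_op), np.sin(phase_op)])
force_sub = np.column_stack([np.cos(phase_sub), np.sin(phase_sub)])

err = rms_error(angular_velocity(force_sub, dt),
                angular_velocity(force_op, dt))
print(f"angular-velocity error: {err:.3f} rad/s")
```

For scale: at the 2.0 Hz used in the synthetic example (within the paper's 1.6-2.4 Hz range), the mean angular velocity is about 4π ≈ 12.6 rad/s, so a well-synchronized pair would keep this error small relative to that value.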

Original language: English
Article number: 12
Journal: EURASIP Journal on Audio, Speech, and Music Processing
Volume: 2012
Issue: 1
DOI
Publication status: Published - 1 Dec 2012


ASJC Scopus subject areas

  • Acoustics and Ultrasonics
  • Electrical and Electronic Engineering
