A role of multi-modal rhythms in physical interaction and cooperation

Kenta Yonekura, Chyon Hae Kim, Kazuhiro Nakadai, Hiroshi Tsujino, Shigeki Sugano

    Research output: Contribution to journal › Article

    3 Citations (Scopus)

    Abstract

    As fundamental research for human-robot interaction, this paper addresses the rhythmic reference of a human while turning a rope with another human. We hypothesized that when interpreting rhythm cues to form a rhythm reference, humans rely on auditory and force rhythms more than on visual ones. We examined test subjects aged 21-23 years. We masked the perception of each test subject using three kinds of masks: an eye-mask, headphones, and a force mask. The force mask consists of a robot arm and a remote controller; these instruments allow a test subject to turn a rope without feeling force from the rope. In the first experiment, each test subject interacted with an operator who turned a rope with a constant rhythm. Eight trials were conducted for each test subject, wearing different combinations of masks. We measured the force between each participant (the test subject or the operator) and the rope, computed the angular velocity of the force direction, and evaluated the error between the two angular velocities. In the second experiment, two test subjects interacted with each other. An auditory rhythm of 1.6-2.4 Hz was presented through headphones to indicate the target turning frequency. In addition to the auditory rhythm, the test subjects wore eye-masks. The first experiment showed that visual rhythm has little influence on rope-turning cooperation between humans. The second experiment provided firmer evidence for the same hypothesis, because humans neglected their visual rhythms.
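
    The error metric described above can be pictured with a short sketch. The Python snippet below is not the authors' code; the function names, the 2-D force signal (fx, fy), and the sampling rate fs are illustrative assumptions. It shows one plausible way to obtain the angular velocity of a force direction and the error between the subject's and the operator's traces.

        import numpy as np

        def angular_velocity(fx, fy, fs):
            """Angular velocity (rad/s) of the force-direction angle atan2(fy, fx)."""
            theta = np.unwrap(np.arctan2(fy, fx))  # continuous direction angle over time
            return np.gradient(theta) * fs         # finite-difference time derivative

        def rhythm_error(omega_subject, omega_operator):
            """Root-mean-square error between two angular-velocity traces."""
            return np.sqrt(np.mean((omega_subject - omega_operator) ** 2))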

    Original language: English
    Article number: 12
    Journal: EURASIP Journal on Audio, Speech, and Music Processing
    Volume: 2012
    Issue number: 1
    DOI: 10.1186/1687-4722-2012-12
    Publication status: Published - 2012

    ASJC Scopus subject areas

    • Electrical and Electronic Engineering
    • Acoustics and Ultrasonics

    Cite this

    Yonekura, K., Kim, C. H., Nakadai, K., Tsujino, H., & Sugano, S. (2012). A role of multi-modal rhythms in physical interaction and cooperation. EURASIP Journal on Audio, Speech, and Music Processing, 2012(1), Article 12. https://doi.org/10.1186/1687-4722-2012-12

    @article{8cb2f10ef3b441baac197f4ccb6ba1be,
    title = "A role of multi-modal rhythms in physical interaction and cooperation",
    abstract = "As fundamental research for human-robot interaction, this paper addresses the rhythmic reference of a human while turning a rope with another human. We hypothesized that when interpreting rhythm cues to form a rhythm reference, humans rely on auditory and force rhythms more than on visual ones. We examined test subjects aged 21-23 years. We masked the perception of each test subject using three kinds of masks: an eye-mask, headphones, and a force mask. The force mask consists of a robot arm and a remote controller; these instruments allow a test subject to turn a rope without feeling force from the rope. In the first experiment, each test subject interacted with an operator who turned a rope with a constant rhythm. Eight trials were conducted for each test subject, wearing different combinations of masks. We measured the force between each participant (the test subject or the operator) and the rope, computed the angular velocity of the force direction, and evaluated the error between the two angular velocities. In the second experiment, two test subjects interacted with each other. An auditory rhythm of 1.6-2.4 Hz was presented through headphones to indicate the target turning frequency. In addition to the auditory rhythm, the test subjects wore eye-masks. The first experiment showed that visual rhythm has little influence on rope-turning cooperation between humans. The second experiment provided firmer evidence for the same hypothesis, because humans neglected their visual rhythms.",
    author = "Kenta Yonekura and Kim, {Chyon Hae} and Kazuhiro Nakadai and Hiroshi Tsujino and Shigeki Sugano",
    year = "2012",
    doi = "10.1186/1687-4722-2012-12",
    language = "English",
    volume = "2012",
    journal = "EURASIP Journal on Audio, Speech, and Music Processing",
    issn = "1687-4714",
    publisher = "Springer Publishing Company",
    number = "1",
    }
