Beat Tracking for Interactive Dancing Robots

Joao Lobato Oliveira, Gokhan Ince, Keisuke Nakamura, Kazuhiro Nakadai, Hiroshi G. Okuno, Fabien Gouyon, Luis Paulo Reis

    Research output: Contribution to journal › Article

    8 Citations (Scopus)

    Abstract

    Dance movement is intrinsically connected to the rhythm of music and is a fundamental form of nonverbal communication present in daily human interactions. In order to enable robots to interact with humans in natural real-world environments through dance, these robots must be able to listen to music while robustly tracking the beat of continuous musical stimuli and simultaneously responding to human speech. In this paper, we propose the integration of a real-time beat tracking system with state recovery with different preprocessing solutions used in robot audition for its application to interactive dancing robots. The proposed system is assessed under different real-world acoustic conditions of increasing complexity, which consider multiple audio sources of different kinds, multiple noise sources of different natures, continuous musical and speech stimuli, and the effects of beat-synchronous ego-motion noise and of jittering in ego noise (EN). The overall results suggest improved beat tracking accuracy with lower reaction times to music transitions, while still enhancing automatic speech recognition (ASR) run in parallel in the most challenging conditions. These results corroborate the application of the proposed system for interactive dancing robots.
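    The abstract describes a causal pipeline in which the robot extracts beat times from a live audio stream while preprocessing suppresses interfering sources and ego-motion noise. The paper's own algorithm is not reproduced here, so the following Python sketch is only a rough illustration of that kind of real-time beat tracking front end: spectral-flux onset detection, autocorrelation-based tempo estimation, and greedy causal beat placement. All function names, frame sizes, and tolerances below are illustrative assumptions, not the authors' implementation.

    # NOTE: illustrative sketch of a causal beat-tracking front end;
    # not the system proposed in the paper.
    import numpy as np

    def spectral_flux(signal, sr, frame_len=1024, hop=512):
        """Onset strength from half-wave rectified spectral flux."""
        window = np.hanning(frame_len)
        n_frames = 1 + (len(signal) - frame_len) // hop
        prev_mag = np.zeros(frame_len // 2 + 1)
        flux = np.zeros(n_frames)
        for i in range(n_frames):
            frame = signal[i * hop:i * hop + frame_len] * window
            mag = np.abs(np.fft.rfft(frame))
            diff = mag - prev_mag
            flux[i] = np.sum(diff[diff > 0])   # keep only energy increases
            prev_mag = mag
        return flux, sr / hop                  # onset envelope and its frame rate

    def estimate_tempo(flux, frame_rate, bpm_range=(60, 180)):
        """Pick the tempo whose beat period maximizes the flux autocorrelation."""
        flux = flux - flux.mean()
        acf = np.correlate(flux, flux, mode="full")[len(flux) - 1:]
        min_lag = int(frame_rate * 60.0 / bpm_range[1])
        max_lag = int(frame_rate * 60.0 / bpm_range[0])
        best = min_lag + np.argmax(acf[min_lag:max_lag])
        return 60.0 * frame_rate / best        # beats per minute

    def track_beats(flux, frame_rate, bpm):
        """Greedy causal beat placement: snap each predicted beat to the
        strongest nearby onset, then predict the next one a period ahead."""
        period = frame_rate * 60.0 / bpm
        beats = [int(np.argmax(flux[:int(period)]))]   # first beat = strongest early onset
        while beats[-1] + period < len(flux):
            center = beats[-1] + period
            lo = int(center - period * 0.15)
            hi = int(center + period * 0.15)
            beats.append(lo + int(np.argmax(flux[lo:hi])))
        return np.array(beats) / frame_rate            # beat times in seconds

    if __name__ == "__main__":
        sr = 16000
        t = np.arange(sr * 10) / sr
        # Synthetic test input: 10 ms tone bursts every 0.5 s (120 BPM) in noise.
        clicks = (np.sin(2 * np.pi * 1000 * t) * (np.mod(t, 0.5) < 0.01)).astype(float)
        audio = clicks + 0.05 * np.random.randn(len(t))
        flux, fr = spectral_flux(audio, sr)
        bpm = estimate_tempo(flux, fr)
        print("estimated tempo: %.1f BPM" % bpm)
        print("first beats (s):", np.round(track_beats(flux, fr, bpm)[:5], 2))

    In a robot-audition setting the synthetic test signal above would be replaced by the preprocessed microphone stream, and a practical tracker would add the state-recovery behaviour the abstract mentions so that tempo and phase are re-estimated quickly after music transitions.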

    Original language: English
    Article number: 1550023
    Journal: International Journal of Humanoid Robotics
    Volume: 12
    Issue number: 4
    DOIs: https://doi.org/10.1142/S0219843615500231
    Publication status: Published - 2015 Dec 1


    Keywords

    • beat tracking
    • human-robot interaction
    • noise suppression
    • robot audition
    • robot dancing

    ASJC Scopus subject areas

    • Artificial Intelligence
    • Mechanical Engineering

    Cite this

    Oliveira, J. L., Ince, G., Nakamura, K., Nakadai, K., Okuno, H. G., Gouyon, F., & Reis, L. P. (2015). Beat Tracking for Interactive Dancing Robots. International Journal of Humanoid Robotics, 12(4), [1550023]. https://doi.org/10.1142/S0219843615500231

    @article{a3301a181c4f4d56bc1727e883b723b7,
    title = "Beat Tracking for Interactive Dancing Robots",
    abstract = "Dance movement is intrinsically connected to the rhythm of music and is a fundamental form of nonverbal communication present in daily human interactions. In order to enable robots to interact with humans in natural real-world environments through dance, these robots must be able to listen to music while robustly tracking the beat of continuous musical stimuli and simultaneously responding to human speech. In this paper, we propose the integration of a real-time beat tracking system with state recovery with different preprocessing solutions used in robot audition for its application to interactive dancing robots. The proposed system is assessed under different real-world acoustic conditions of increasing complexity, which consider multiple audio sources of different kinds, multiple noise sources of different natures, continuous musical and speech stimuli, and the effects of beat-synchronous ego-motion noise and of jittering in ego noise (EN). The overall results suggest improved beat tracking accuracy with lower reaction times to music transitions, while still enhancing automatic speech recognition (ASR) run in parallel in the most challenging conditions. These results corroborate the application of the proposed system for interactive dancing robots.",
    keywords = "beat tracking, human-robot interaction, noise suppression, Robot audition, robot dancing",
    author = "Oliveira, {Joao Lobato} and Gokhan Ince and Keisuke Nakamura and Kazuhiro Nakadai and Okuno, {Hiroshi G.} and Fabien Gouyon and Reis, {Luis Paulo}",
    year = "2015",
    month = "12",
    day = "1",
    doi = "10.1142/S0219843615500231",
    language = "English",
    volume = "12",
    journal = "International Journal of Humanoid Robotics",
    issn = "0219-8436",
    publisher = "World Scientific Publishing Co. Pte Ltd",
    number = "4",

    }
