Extracting the Relationship between the Spatial Distribution and Types of Bird Vocalizations Using Robot Audition System HARK

Shinji Sumitani, Reiji Suzuki, Shiho Matsubayashi, Takaya Arita, Kazuhiro Nakadai, Hiroshi G. Okuno

    Research output: Chapter in Book/Report/Conference proceeding · Conference contribution

    1 Citation (Scopus)

    Abstract

    For a deeper understanding of the ecological functions and semantics of wild bird vocalizations (i.e., songs and calls), it is important to clarify the fine-scale relationships between the characteristics of these vocalizations and their behavioral contexts. However, obtaining such data through conventional recordings or human observation takes considerable time and effort. Our approach to this problem is to bring a robot into the field. We are developing a portable observation system called HARKBird, based on the robot audition software HARK and microphone arrays, to understand temporal patterns of vocalization characteristics and their behavioral contexts. In this paper, we introduce a prototype system that localizes vocalizations of wild birds in 2D in real time and classifies their song types after recording. We show that the system can estimate the positions of songs of a target individual and classify the songs with sufficient quality to discuss their song-behavior relationships.
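The 2D localization described in the abstract rests on combining direction-of-arrival (DOA) estimates from spatially separated microphone arrays; in HARKBird, HARK performs the DOA estimation itself. As a minimal sketch of the triangulation step only — assuming two arrays with known positions and world-frame azimuths, with function and variable names that are illustrative rather than taken from the paper:

```python
import numpy as np

def localize_2d(p1, theta1, p2, theta2):
    """Estimate a 2D source position by intersecting two DOA bearings.

    p1, p2: (x, y) positions of the two microphone arrays.
    theta1, theta2: azimuths (radians) of the source as seen from each
    array, expressed in a common world frame.
    """
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters (t1, t2).
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1

# Arrays at (0,0) and (10,0); a source at (5,5) is seen at 45° and 135°.
est = localize_2d((0.0, 0.0), np.arctan2(5.0, 5.0),
                  (10.0, 0.0), np.arctan2(5.0, -5.0))
```

Near-parallel bearings make the linear system ill-conditioned, so a practical system would also check the angle between `d1` and `d2` before trusting the intersection.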

    Original language: English
    Title of host publication: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2018
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Pages: 2485-2490
    Number of pages: 6
    ISBN (Electronic): 9781538680940
    DOI: 10.1109/IROS.2018.8594130
    Publication status: Published - 2018 Dec 27
    Event: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2018 - Madrid, Spain
    Duration: 2018 Oct 1 to 2018 Oct 5

    Publication series

    Name: IEEE International Conference on Intelligent Robots and Systems
    ISSN (Print): 2153-0858
    ISSN (Electronic): 2153-0866

    Conference

    Conference: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2018
    Country: Spain
    City: Madrid
    Period: 18/10/1 to 18/10/5

    ASJC Scopus subject areas

    • Control and Systems Engineering
    • Software
    • Computer Vision and Pattern Recognition
    • Computer Science Applications

    Cite this

    Sumitani, S., Suzuki, R., Matsubayashi, S., Arita, T., Nakadai, K., & Okuno, H. G. (2018). Extracting the Relationship between the Spatial Distribution and Types of Bird Vocalizations Using Robot Audition System HARK. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2018 (pp. 2485-2490). [8594130] (IEEE International Conference on Intelligent Robots and Systems). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IROS.2018.8594130

