Issues in humanoid audition and sound source localization by active audition

Kazuhiro Nakadai, Hiroshi G. Okuno, Hiroaki Kitano

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

In this paper, we present an active audition system implemented on the humanoid robot "SIG the humanoid". The audition system for highly intelligent humanoids localizes sound sources and recognizes auditory events in the auditory scene. The active audition reported in this paper enables SIG to track sound sources by integrating audition, vision, and motor movements. Given multiple sound sources in the auditory scene, SIG actively moves its head to improve localization by aligning its microphones orthogonal to a sound source and by capturing possible sound sources by vision. However, such active head movement inevitably creates motor noise. The system adaptively cancels motor noise using motor control signals and the cover acoustics. Experimental results demonstrate that active audition, by integrating audition, vision, and motor control, attains sound source tracking under a variety of conditions.
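The record itself contains no implementation details, but the localization idea named in the abstract (turning the head so that the two-microphone baseline is orthogonal to the source direction) can be illustrated with a simple far-field interaural-time-difference model. The sketch below is not the authors' system; the microphone spacing, speed of sound, and function names are assumptions chosen only for illustration.

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)
MIC_BASELINE = 0.18     # m; hypothetical spacing of the robot's two microphones

def azimuth_from_itd(itd_seconds, baseline=MIC_BASELINE, c=SPEED_OF_SOUND):
    """Estimate source azimuth (radians, 0 = straight ahead) from the
    interaural time difference of one microphone pair (far-field model)."""
    s = np.clip(c * itd_seconds / baseline, -1.0, 1.0)
    return np.arcsin(s)

def itd_sensitivity(azimuth, baseline=MIC_BASELINE, c=SPEED_OF_SOUND):
    """d(ITD)/d(azimuth): change in ITD per radian of source motion.
    It peaks at azimuth 0, i.e. when the microphone baseline is orthogonal
    to the source direction, which is why turning toward a source
    sharpens two-microphone localization."""
    return (baseline / c) * np.cos(azimuth)

if __name__ == "__main__":
    for deg in (0, 30, 60, 85):
        theta = np.radians(deg)
        print(f"azimuth {deg:2d} deg: sensitivity {itd_sensitivity(theta) * 1e6:6.1f} us/rad")

Under these assumed values, the sensitivity drops from roughly 525 us/rad when facing the source to about 46 us/rad at 85 degrees off axis, which is the geometric motivation for the active head movement described in the abstract.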

Original language: English
Pages (from-to): 104-113
Number of pages: 10
Journal: Transactions of the Japanese Society for Artificial Intelligence
Volume: 18
Issue number: 2
DOI: 10.1527/tjsai.18.104
Publication status: Published - 2003
Externally published: Yes

Fingerprint

Audition
Acoustic waves
Microphones
Acoustic noise
Acoustics
Robots

Keywords

  • Active audition
  • Noise cancelation
  • Perception
  • Robots
  • Sensor fusion

ASJC Scopus subject areas

  • Artificial Intelligence

Cite this

Issues in humanoid audition and sound source localization by active audition. / Nakadai, Kazuhiro; Okuno, Hiroshi G.; Kitano, Hiroaki.

In: Transactions of the Japanese Society for Artificial Intelligence, Vol. 18, No. 2, 2003, p. 104-113.

Research output: Contribution to journal › Article

@article{1ef0bb11750846bbaf4235773c276b4f,
title = "Issues in humanoid audition and sound source localization by active audition",
abstract = "In this paper, we present an active audition system implemented on the humanoid robot {"}SIG the humanoid{"}. The audition system for highly intelligent humanoids localizes sound sources and recognizes auditory events in the auditory scene. The active audition reported in this paper enables SIG to track sound sources by integrating audition, vision, and motor movements. Given multiple sound sources in the auditory scene, SIG actively moves its head to improve localization by aligning its microphones orthogonal to a sound source and by capturing possible sound sources by vision. However, such active head movement inevitably creates motor noise. The system adaptively cancels motor noise using motor control signals and the cover acoustics. Experimental results demonstrate that active audition, by integrating audition, vision, and motor control, attains sound source tracking under a variety of conditions.",
keywords = "Active audition, Noise cancelation, Perception, Robots, Sensor fusion",
author = "Kazuhiro Nakadai and Okuno, {Hiroshi G.} and Hiroaki Kitano",
year = "2003",
doi = "10.1527/tjsai.18.104",
language = "English",
volume = "18",
pages = "104--113",
journal = "Transactions of the Japanese Society for Artificial Intelligence",
issn = "1346-0714",
publisher = "Japanese Society for Artificial Intelligence",
number = "2",

}

TY - JOUR

T1 - Issues in humanoid audition and sound source localization by active audition

AU - Nakadai, Kazuhiro

AU - Okuno, Hiroshi G.

AU - Kitano, Hiroaki

PY - 2003

Y1 - 2003

N2 - In this paper, we present an active audition system implemented on the humanoid robot "SIG the humanoid". The audition system for highly intelligent humanoids localizes sound sources and recognizes auditory events in the auditory scene. The active audition reported in this paper enables SIG to track sound sources by integrating audition, vision, and motor movements. Given multiple sound sources in the auditory scene, SIG actively moves its head to improve localization by aligning its microphones orthogonal to a sound source and by capturing possible sound sources by vision. However, such active head movement inevitably creates motor noise. The system adaptively cancels motor noise using motor control signals and the cover acoustics. Experimental results demonstrate that active audition, by integrating audition, vision, and motor control, attains sound source tracking under a variety of conditions.

AB - In this paper, we present an active audition system implemented on the humanoid robot "SIG the humanoid". The audition system for highly intelligent humanoids localizes sound sources and recognizes auditory events in the auditory scene. The active audition reported in this paper enables SIG to track sound sources by integrating audition, vision, and motor movements. Given multiple sound sources in the auditory scene, SIG actively moves its head to improve localization by aligning its microphones orthogonal to a sound source and by capturing possible sound sources by vision. However, such active head movement inevitably creates motor noise. The system adaptively cancels motor noise using motor control signals and the cover acoustics. Experimental results demonstrate that active audition, by integrating audition, vision, and motor control, attains sound source tracking under a variety of conditions.

KW - Active audition

KW - Noise cancelation

KW - Perception

KW - Robots

KW - Sensor fusion

UR - http://www.scopus.com/inward/record.url?scp=18444394920&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=18444394920&partnerID=8YFLogxK

U2 - 10.1527/tjsai.18.104

DO - 10.1527/tjsai.18.104

M3 - Article

VL - 18

SP - 104

EP - 113

JO - Transactions of the Japanese Society for Artificial Intelligence

JF - Transactions of the Japanese Society for Artificial Intelligence

SN - 1346-0714

IS - 2

ER -