Issues in humanoid audition and sound source localization by active audition

Kazuhiro Nakadai*, Hiroshi G. Okuno, Hiroaki Kitano

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

In this paper, we present an active audition system implemented on the humanoid robot "SIG the humanoid". The audition system for highly intelligent humanoids localizes sound sources and recognizes auditory events in the auditory scene. The active audition reported in this paper enables SIG to track sound sources by integrating audition, vision, and motor movements. Given multiple sound sources in the auditory scene, SIG actively moves its head to improve localization by aligning its microphones orthogonal to the sound source and by capturing possible sound sources by vision. However, such active head movement inevitably creates motor noise. The system adaptively cancels motor noise using motor control signals and the cover acoustics. The experimental results demonstrate that active audition, by integrating audition, vision, and motor control, attains sound source tracking under a variety of conditions.
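To make the idea of localization-driven head motion concrete, the sketch below shows a generic two-microphone localization step: the interaural time difference is estimated from the peak of the cross-correlation between the channels, converted to an azimuth under a far-field plane-wave model, and used as a head-rotation command so the microphone baseline ends up roughly orthogonal to the source direction. This is a minimal textbook illustration under assumed parameters (microphone spacing, sampling rate), not the paper's own epipolar-geometry or cover-acoustics method; all function names are hypothetical.

```python
import numpy as np


def estimate_azimuth(left, right, fs, mic_distance=0.2, c=343.0):
    """Estimate source azimuth (degrees) from a two-microphone pair.

    Generic cross-correlation / ITD approach under a far-field model;
    mic_distance and speed of sound c are assumed example values, not
    parameters taken from the paper.
    """
    # Lag of the cross-correlation peak gives the time difference of arrival.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    itd = lag / fs  # seconds

    # Plane-wave model: itd = (d / c) * sin(theta).
    sin_theta = np.clip(itd * c / mic_distance, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))


def head_rotation_command(azimuth_deg):
    """Rotation (degrees) that turns the head toward the estimated source,
    putting the microphone baseline roughly orthogonal to its direction."""
    return azimuth_deg


if __name__ == "__main__":
    # Simulated broadband source arriving 5 samples earlier at the left mic.
    fs = 16000
    rng = np.random.default_rng(0)
    signal = rng.standard_normal(fs // 10)
    left = signal
    right = np.roll(signal, 5)

    az = estimate_azimuth(left, right, fs)
    print(f"estimated azimuth: {az:.1f} deg, "
          f"head rotation: {head_rotation_command(az):.1f} deg")
```

After such a rotation, a fresh estimate near zero degrees indicates the head is facing the source, which is the condition the abstract describes as most favorable for localization; the motor noise generated by the rotation is what the paper's adaptive cancellation addresses.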

Original language: English
Pages (from-to): 104-113
Number of pages: 10
Journal: Transactions of the Japanese Society for Artificial Intelligence
Volume: 18
Issue number: 2
DOIs
Publication status: Published - 2003
Externally published: Yes

Keywords

  • Active audition
  • Noise cancelation
  • Perception
  • Robots
  • Sensor fusion

ASJC Scopus subject areas

  • Artificial Intelligence
