Sound localization for a robot or an embedded system is usually solved by using the Interaural Phase Difference (IPD) and the Interaural Intensity Difference (IID). These values are calculated by using a Head-Related Transfer Function (HRTF). However, the HRTF depends on the shape of the head and also changes as the environment changes. Therefore, sound localization without an HRTF is needed for real-world applications. In this paper, we present a new sound localization method based on auditory epipolar geometry with motion control. Auditory epipolar geometry is an extension of epipolar geometry in stereo vision to audition, and it allows auditory and visual processing to share the sound-source direction. The key idea is to exploit the additional inputs obtained by motor control in order to compensate for degradation of the IPD and IID caused by room reverberation and the robot's body. The proposed system can localize and extract two simultaneous sound sources in a real-world room.
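As a minimal illustration of the cues the abstract refers to, the sketch below computes a per-frequency IPD and IID from a synthetic stereo pair using the cross-spectrum. This is not the paper's method (which avoids the HRTF via epipolar geometry and motor control); the sampling rate, tone frequency, delay, and attenuation are arbitrary assumptions chosen so the expected cue values are easy to check analytically.

```python
import numpy as np

fs = 16000          # assumed sampling rate (Hz)
f0 = 1000.0         # test tone frequency (Hz)
delay = 2           # interaural delay in samples (assumed)
atten = 0.8         # interaural attenuation (assumed)

t = np.arange(0, 0.1, 1.0 / fs)          # 1600 samples, so f0 falls on a bin
left = np.sin(2 * np.pi * f0 * t)
right = atten * np.sin(2 * np.pi * f0 * (t - delay / fs))

win = np.hanning(len(t))
L = np.fft.rfft(left * win)
R = np.fft.rfft(right * win)

# IPD: phase of the cross-spectrum; IID: level ratio in dB, per frequency bin
ipd = np.angle(L * np.conj(R))
iid = 20 * np.log10(np.abs(L) / (np.abs(R) + 1e-12))

# At the bin nearest f0, IPD ~ 2*pi*f0*delay/fs and IID ~ 20*log10(1/atten)
k = np.argmin(np.abs(np.fft.rfftfreq(len(t), 1.0 / fs) - f0))
print(ipd[k], iid[k])
```

In an HRTF-based localizer these two cues would be matched against measured head-dependent templates; the paper's point is to interpret them geometrically instead, so that no such measurement is required.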
|Host publication title||IEEE International Conference on Intelligent Robots and Systems|
|Publication status||Published - 2001|
|Event||2001 IEEE/RSJ International Conference on Intelligent Robots and Systems - Maui, HI|
Duration: Oct 29, 2001 → Nov 3, 2001
|Other||2001 IEEE/RSJ International Conference on Intelligent Robots and Systems|
|Period||01/10/29 → 01/11/3|