Integrating auditory and visual perception for robotic soccer players

Hiroshi G. Okuno, Yukiko Nakagawa, Hiroaki Kitano

Research output: Conference contribution

3 Citations (Scopus)

Abstract

In this paper, we present a method of integrating auditory and visual perception for a mobile robot for RoboCup. When humanoid robots are fielded for a soccer game, they need to react quickly to the environment using all possible sensory inputs. While current robots depend heavily on visual inputs, auditory inputs play a significant role in detecting events for which visual inputs are not available, such as those to the side of or behind the direction the face is pointing. The sound of other players kicking the ball and the shouting of teammates are critical cues for sophisticated teamwork, such as the offside trap. This paper presents the integration of auditory and visual perception for identifying sound sources and separating sounds with high accuracy using both auditory and visual inputs.
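The auditory cue described in the abstract — inferring where an off-camera event happened from sound alone — can be illustrated with a small sketch. This is not the method of the paper; it shows one standard way to estimate a sound source's bearing from the arrival-time difference between two microphones via cross-correlation, with the microphone spacing, sampling rate, and signals all assumed values for the example.

```python
import numpy as np

# Illustrative sketch only: localize a sound source from the inter-channel
# time delay between two microphones, the kind of auditory cue a soccer
# robot could use for events outside its visual field. The constants are
# assumptions for this example, not values from the paper.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees Celsius
MIC_DISTANCE = 0.2      # m, assumed spacing between the two microphones
SAMPLE_RATE = 16000     # Hz, assumed sampling rate

def estimate_azimuth(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate source azimuth in radians (positive = toward the left mic).

    The inter-channel delay is taken as the lag that maximizes the
    cross-correlation of the two signals; far-field geometry then gives
    sin(azimuth) = delay * speed_of_sound / mic_distance.
    """
    corr = np.correlate(right, left, mode="full")
    lag = int(np.argmax(corr)) - (len(left) - 1)  # delay of right vs. left, in samples
    delay = lag / SAMPLE_RATE                     # seconds
    # Clamp to the physically possible range before taking the arcsine.
    sin_theta = np.clip(delay * SPEED_OF_SOUND / MIC_DISTANCE, -1.0, 1.0)
    return float(np.arcsin(sin_theta))

# Synthetic check: a noise burst that reaches the left microphone
# 5 samples before the right one should yield a positive azimuth.
rng = np.random.default_rng(0)
burst = rng.standard_normal(1024)
left = np.concatenate([burst, np.zeros(5)])
right = np.concatenate([np.zeros(5), burst])
azimuth = estimate_azimuth(left, right)
print(azimuth)  # positive, i.e. the source is to the left
```

A delay-based bearing like this is ambiguous and noisy on its own; combining it with visual detections of where players actually stand is what allows sound sources to be identified and separated accurately, which is the theme of the paper.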

Original language: English
Host publication title: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Publisher: IEEE
Volume: 6
Publication status: Published - 1999
Externally published: Yes
Event: 1999 IEEE International Conference on Systems, Man, and Cybernetics 'Human Communication and Cybernetics' - Tokyo, Japan
Duration: 1999/10/12 - 1999/10/15

Fingerprint

Robotics
Acoustic waves
Robots
Mobile robots

ASJC Scopus subject areas

  • Hardware and Architecture
  • Control and Systems Engineering

Cite this

Okuno, H. G., Nakagawa, Y., & Kitano, H. (1999). Integrating auditory and visual perception for robotic soccer players. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (Vol. 6). IEEE.
