Abstract
In most previous work, the state space of a sensor-based robot has been determined based on human intuition; however, a state space constructed from the human viewpoint is not always appropriate for the robot. Since the robot has a different body, sensors, and tasks, we consider that the robot should have its own internal state space determined by its actions, sensors, and tasks. This paper proposes an approach to constructing such a robot-oriented state space by statistically analyzing the actions, sensor patterns, and rewards obtained as results of task executions. In the state space construction, the robot creates sensor-pattern classifiers called Empirically Obtained Perceivers (EOPs), the combinations of which represent the internal states of the robot. We have confirmed that the robot can construct original state spaces through its vision sensor and achieve navigation tasks with the obtained state spaces in a complicated simulated world.
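The abstract describes EOPs only at a high level. As an illustrative sketch, not the authors' actual algorithm, the following Python code shows one way internal states could be formed as combinations of simple sensor-pattern classifiers derived from (sensor, action, reward) experience; the prototype-plus-threshold classifier, the `build_eops` selection rule, and all names are assumptions made for illustration.

```python
import numpy as np


class EOP:
    """A sensor-pattern classifier standing in for an Empirically Obtained Perceiver.

    Assumption for illustration: each EOP holds a prototype sensor pattern and
    'fires' when the current sensor vector lies within a threshold distance of it.
    """

    def __init__(self, prototype, threshold):
        self.prototype = np.asarray(prototype, dtype=float)
        self.threshold = float(threshold)

    def perceives(self, sensor_vector):
        # Fire if the observed sensor pattern is close to the prototype.
        distance = np.linalg.norm(np.asarray(sensor_vector, dtype=float) - self.prototype)
        return distance <= self.threshold


def internal_state(eops, sensor_vector):
    """The internal state is the combination (bit pattern) of EOP outputs."""
    return tuple(int(eop.perceives(sensor_vector)) for eop in eops)


def build_eops(experience, threshold=0.5):
    """Rough stand-in for the statistical analysis of task-execution data.

    `experience` is assumed to be a list of (sensor_vector, action, reward)
    triples; sensor patterns associated with positive reward are kept as EOP
    prototypes. The paper's actual construction is more elaborate; this only
    shows the overall data flow from experience to classifiers.
    """
    prototypes = [s for (s, a, r) in experience if r > 0]
    return [EOP(p, threshold) for p in prototypes]


if __name__ == "__main__":
    # Hypothetical experience data: 3-dimensional sensor readings.
    experience = [
        ([0.9, 0.1, 0.0], "forward", +1.0),
        ([0.1, 0.8, 0.1], "turn_left", 0.0),
        ([0.0, 0.2, 0.9], "turn_right", +1.0),
    ]
    eops = build_eops(experience)
    print(internal_state(eops, [0.85, 0.15, 0.05]))  # -> (1, 0)
```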
Original language | English
---|---
Pages (from-to) | 1496-1501
Number of pages | 6
Publication status | Published - 1996 Dec 1
Externally published | Yes
Event | Proceedings of the 1996 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS. Part 3 (of 3) - Osaka, Japan. Duration: 1996 Nov 4 → 1996 Nov 8
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- Computer Vision and Pattern Recognition
- Computer Science Applications