Computational model of depth perception based on fixational eye movements

Norio Tagawa, Todorka Alexandrova

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

The small vibration of the eyeball that occurs when we fix our gaze on an object is called "fixational eye movement." It has been reported that this function also serves as a cue for monocular depth perception. Moreover, research on depth-recovery methods that use camera motions based on an analogy with fixational eye movement is in progress. We suppose that depth perception based on fixational eye movement is carried out first, and that the resulting depth information is subsequently used to supplement binocular stereopsis. In particular, using camera motions corresponding to the smallest type of fixational eye movement, called "tremor," we construct a depth perception algorithm that models camera motion as an irregular perturbation, and we confirm its effectiveness.
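The abstract's core idea, recovering depth from tiny irregular camera motions, can be illustrated with a minimal sketch. This is not the authors' algorithm (which uses Bayesian estimation with the EM algorithm); it only shows, under the standard perspective-projection model, how many small random "tremor"-like translations with known magnitudes constrain inverse depth through the linearized image-displacement relation u ≈ (x·tz − tx)/Z. All names and parameter values below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's method): estimate the depth Z of a
# single scene point from image displacements induced by many tiny random
# camera translations, analogous to the "tremor" component of fixational
# eye movement. Uses the linearization u ≈ (x*tz - tx)/Z, v ≈ (y*tz - ty)/Z.

rng = np.random.default_rng(0)

def simulate_displacements(point, translations):
    """Exact image displacements of a 3-D point under small camera shifts."""
    X, Y, Z = point
    x, y = X / Z, Y / Z                      # reference image coordinates
    disp = []
    for tx, ty, tz in translations:
        Zp = Z - tz                          # depth after the camera shift
        disp.append(((X - tx) / Zp - x, (Y - ty) / Zp - y))
    return np.array(disp)

def estimate_inverse_depth(x, y, translations, disp):
    """Least-squares inverse depth d = 1/Z from the linearized flow model."""
    # Stack the u- and v-equation coefficients; each row gives u ≈ a_u * d.
    A = np.column_stack([x * translations[:, 2] - translations[:, 0],
                         y * translations[:, 2] - translations[:, 1]]).ravel()
    b = disp.ravel()
    return float(A @ b / (A @ A))            # scalar least-squares solution

point = np.array([0.2, -0.1, 5.0])           # true depth Z = 5
trans = 1e-3 * rng.standard_normal((200, 3)) # tiny random "tremor" shifts
disp = simulate_displacements(point, trans)
disp += 1e-6 * rng.standard_normal(disp.shape)  # small observation noise

x, y = point[0] / point[2], point[1] / point[2]
Z_hat = 1.0 / estimate_inverse_depth(x, y, trans, disp)
```

Because each individual perturbation produces a displacement barely above the noise, no single motion determines depth; the estimate only becomes reliable by pooling many perturbations, which is the intuition the paper formalizes probabilistically.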

Original language: English
Title of host publication: VISAPP 2010 - Proceedings of the International Conference on Computer Vision Theory and Applications
Pages: 328-333
Number of pages: 6
Volume: 1
Publication status: Published - 2010
Externally published: Yes
Event: 5th International Conference on Computer Vision Theory and Applications, VISAPP 2010 - Angers
Duration: 2010 May 17 - 2010 May 21



Keywords

  • Bayesian estimation
  • Depth perception
  • EM algorithm
  • Fixational eye movement
  • Structure from motion

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Computer Science Applications
  • Computer Vision and Pattern Recognition

Cite this

Tagawa, N., & Alexandrova, T. (2010). Computational model of depth perception based on fixational eye movements. In VISAPP 2010 - Proceedings of the International Conference on Computer Vision Theory and Applications (Vol. 1, pp. 328-333).