Real-time Both Hands Tracking Using Feature Point Gathering by KLT Tracker for Man-Machine Interface

Ryosuke Araki, Takeshi Ikenaga

Research output: Contribution to journal › Article › peer-review

Abstract

Intuitive man-machine interfaces based on gestures with touchpad devices are becoming common. In the near future, gesture recognition using input from a video camera is expected to grow in importance as information terminals and their applications broaden. Conventional works, however, rely on complex input devices that combine multiple cameras and sensors. Moreover, since most of their algorithms have high computational complexity and work only on images with a simple background, they are difficult to apply to practical systems. This paper proposes a real-time, single-input object-tracking algorithm that can trace both hands precisely against a complex background. It attains both high accuracy and low complexity by combining frame difference, color, and a feature point gathering decision with the KLT (Kanade-Lucas-Tomasi) tracker, a kind of optical flow. Software-based evaluation results using a wide variety of test sequences (e.g., complex backgrounds and object shape changes) show that the proposed algorithm achieves higher tracking accuracy than conventional ones. Furthermore, the processing performance is 13-16 frames per second, which means both hands can be tracked in real time.
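For orientation only, the sketch below shows the general idea the abstract describes: restricting KLT feature point tracking to regions selected by frame difference and skin color, then gathering the surviving points to estimate a hand position. It is not the authors' implementation; the OpenCV functions, the YCrCb skin thresholds, and the centroid-based gathering step are illustrative assumptions.

# Minimal sketch (not the paper's method): KLT tracking of feature points
# detected inside a frame-difference + skin-color mask, gathered by centroid.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                       # any camera or video file
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
points = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Frame difference: keep only moving regions.
    motion = cv2.absdiff(gray, prev_gray)
    _, motion_mask = cv2.threshold(motion, 25, 255, cv2.THRESH_BINARY)

    # Rough skin-color mask in YCrCb (threshold values are assumptions).
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    skin_mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

    mask = cv2.bitwise_and(motion_mask, skin_mask)

    # (Re)detect corners inside the mask, then track them with the
    # pyramidal Lucas-Kanade (KLT) tracker.
    if points is None or len(points) < 20:
        points = cv2.goodFeaturesToTrack(gray, maxCorners=100,
                                         qualityLevel=0.01,
                                         minDistance=7, mask=mask)
    if points is not None:
        next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                       points, None)
        points = next_pts[status.flatten() == 1].reshape(-1, 1, 2)
        # "Gathering" here is simplified to the centroid of surviving points.
        if len(points):
            cx, cy = points.reshape(-1, 2).mean(axis=0)
            cv2.circle(frame, (int(cx), int(cy)), 8, (0, 255, 0), -1)

    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:             # Esc to quit
        break
    prev_gray = gray

cap.release()
cv2.destroyAllWindows()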

Original language: English
Pages (from-to): 833-841
Number of pages: 9
Journal: Journal of the Institute of Image Electronics Engineers of Japan
Volume: 40
Issue number: 5
DOIs
Publication status: Published - 2011

Keywords

  • KLT Tracker
  • man-machine interface
  • object tracking
  • optical flow

ASJC Scopus subject areas

  • Computer Science (miscellaneous)
  • Electrical and Electronic Engineering
