Agent-typed multimodal interface using speech, pointing gestures and CG

Haru Ando, Hideaki Kikuchi, Nobuo Hataoka

Research output: Contribution to journal › Article

5 Citations (Scopus)

Abstract

This paper proposes a sophisticated agent-typed user interface that combines speech, pointing gestures, and CG technologies. An "Agent-typed Interior Design System" has been implemented as a prototype for evaluating the proposed agent-typed interface: it accepts speech and pointing gestures as input modalities, and its agent is realized by three-dimensional CG (3-D CG) and speech guidance. The paper describes the details of the system implementation and presents evaluation results that clarify the effectiveness of the agent-typed interface.
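The record does not include implementation details, but the central idea of the abstract, fusing a spoken command with a pointing gesture so that deictic words like "this" and "there" can be grounded in the 3-D CG scene, can be illustrated with a minimal sketch. Everything below (the `Word` and `PointingEvent` types, `resolve_deictics`, the `max_gap` threshold, and the example targets) is a hypothetical illustration of temporal gesture-speech alignment, not the authors' actual method.

```python
from dataclasses import dataclass

# Deictic words in the speech input that must be grounded by a pointing gesture.
# (Illustrative set; the paper's actual vocabulary is not given in this record.)
DEICTIC_WORDS = {"this", "that", "here", "there"}

@dataclass
class Word:
    text: str
    time: float  # time (s) at which the word was spoken

@dataclass
class PointingEvent:
    target: str  # object or location selected in the 3-D CG scene
    time: float  # time (s) at which the gesture occurred

def resolve_deictics(words: list[Word], gestures: list[PointingEvent],
                     max_gap: float = 1.0) -> dict[int, str]:
    """Bind each deictic word to the pointing event nearest in time.

    Returns a map from word index to gesture target; a deictic stays
    unresolved if no gesture falls within `max_gap` seconds of it.
    """
    bindings: dict[int, str] = {}
    for i, w in enumerate(words):
        if w.text.lower() not in DEICTIC_WORDS:
            continue
        nearest = min(gestures, key=lambda g: abs(g.time - w.time), default=None)
        if nearest is not None and abs(nearest.time - w.time) <= max_gap:
            bindings[i] = nearest.target
    return bindings

# Example: "put this chair there", spoken with two pointing gestures.
words = [Word("put", 0.0), Word("this", 0.4), Word("chair", 0.7), Word("there", 1.5)]
gestures = [PointingEvent("chair_3", 0.5), PointingEvent("corner_NW", 1.6)]
print(resolve_deictics(words, gestures))  # {1: 'chair_3', 3: 'corner_NW'}
```

Nearest-in-time matching is only one plausible fusion strategy; a real system of this kind would also have to handle recognition errors and gestures that arrive slightly before or after the words they disambiguate.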

Original language: English
Pages (from-to): 29-34
Number of pages: 6
Journal: Advances in Human Factors/Ergonomics
Volume: 20
Issue number: C
DOIs
Publication status: Published - 1995 Dec 1
Externally published: Yes

ASJC Scopus subject areas

  • Human Factors and Ergonomics
  • Social Sciences (miscellaneous)
