A location-sensitive visual interface on the palm: interacting with common objects in an augmented space

Seokhwan Kim, Shin Takahashi, Jiro Tanaka

Research output: Contribution to journal › Article

Abstract

We have created a visual interface using the human palm that is location sensitive and always available. To accomplish this, we constructed an augmented space in an actual workspace by installing several depth cameras. To manage and connect the multiple depth cameras, we constructed a distributed system based on scalable client–server architecture. By merging depth images from different cameras, the distributed system can track the locations of users within their area of coverage. The system also has a convenient feature that allows users to collect the locations of objects while visualizing the objects via images from the depth cameras. Consequently, the locations of both users and objects are available to the system, thus providing a location-based context for determining which user is close to which object. As a result, the visual interface on the palm becomes location sensitive, which could lead to various applications in daily life. In this paper, we describe the implementation of the aforementioned system and demonstrate its potential applicability.
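The location-based context the abstract describes boils down to a proximity query: given tracked 3D positions of users and of registered objects, find the object nearest each user. A minimal sketch of that idea, not taken from the paper itself (all names, coordinates, and the distance threshold are hypothetical):

```python
# Illustrative sketch: nearest-object lookup over tracked 3D positions,
# as a stand-in for the paper's "which user is close to which object"
# context. Positions and the 1 m threshold are invented for the example.
import math

def nearest_object(user_pos, objects, max_dist=1.0):
    """Return (name, distance) of the closest registered object within
    max_dist metres of the user, or None if nothing is near."""
    best = None
    for name, obj_pos in objects.items():
        d = math.dist(user_pos, obj_pos)  # Euclidean distance (Python 3.8+)
        if d <= max_dist and (best is None or d < best[1]):
            best = (name, d)
    return best

# Two registered objects in room coordinates (metres).
objects = {"printer": (2.0, 0.0, 1.0), "lamp": (0.5, 0.2, 0.8)}
print(nearest_object((0.4, 0.0, 0.9), objects))  # picks "lamp", ~0.24 m away
```

In the paper's setting the user positions would come from the merged depth images and the object positions from the collection feature described above; here both are simply hard-coded.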

Original language: English
Pages (from-to): 175-187
Number of pages: 13
Journal: Personal and Ubiquitous Computing
Volume: 19
Issue number: 1
DOI: 10.1007/s00779-014-0769-0
Publication status: Published - 2015 Jan 1
Externally published: Yes

Keywords

  • Augmented reality
  • Interaction
  • Interface on body
  • Location awareness
  • System
  • Ubiquitous

ASJC Scopus subject areas

  • Hardware and Architecture
  • Computer Science Applications
  • Management Science and Operations Research

Cite this

A location-sensitive visual interface on the palm: interacting with common objects in an augmented space. / Kim, Seokhwan; Takahashi, Shin; Tanaka, Jiro.

In: Personal and Ubiquitous Computing, Vol. 19, No. 1, 01.01.2015, p. 175-187.

Research output: Contribution to journal › Article

@article{896084fb95f54c5eb36234576dcf2857,
title = "A location-sensitive visual interface on the palm: interacting with common objects in an augmented space",
abstract = "We have created a visual interface using the human palm that is location sensitive and always available. To accomplish this, we constructed an augmented space in an actual workspace by installing several depth cameras. To manage and connect the multiple depth cameras, we constructed a distributed system based on scalable client--server architecture. By merging depth images from different cameras, the distributed system can track the locations of users within their area of coverage. The system also has a convenient feature that allows users to collect the locations of objects while visualizing the objects via images from the depth cameras. Consequently, the locations of both users and objects are available to the system, thus providing a location-based context for determining which user is close to which object. As a result, the visual interface on the palm becomes location sensitive, which could lead to various applications in daily life. In this paper, we describe the implementation of the aforementioned system and demonstrate its potential applicability.",
keywords = "Augmented reality, Interaction, Interface on body, Location awareness, System, Ubiquitous",
author = "Seokhwan Kim and Shin Takahashi and Jiro Tanaka",
year = "2015",
month = "1",
day = "1",
doi = "10.1007/s00779-014-0769-0",
language = "English",
volume = "19",
pages = "175--187",
journal = "Personal and Ubiquitous Computing",
issn = "1617-4909",
publisher = "Springer London",
number = "1",
}

TY - JOUR

T1 - A location-sensitive visual interface on the palm

T2 - interacting with common objects in an augmented space

AU - Kim, Seokhwan

AU - Takahashi, Shin

AU - Tanaka, Jiro

PY - 2015/1/1

Y1 - 2015/1/1

N2 - We have created a visual interface using the human palm that is location sensitive and always available. To accomplish this, we constructed an augmented space in an actual workspace by installing several depth cameras. To manage and connect the multiple depth cameras, we constructed a distributed system based on scalable client–server architecture. By merging depth images from different cameras, the distributed system can track the locations of users within their area of coverage. The system also has a convenient feature that allows users to collect the locations of objects while visualizing the objects via images from the depth cameras. Consequently, the locations of both users and objects are available to the system, thus providing a location-based context for determining which user is close to which object. As a result, the visual interface on the palm becomes location sensitive, which could lead to various applications in daily life. In this paper, we describe the implementation of the aforementioned system and demonstrate its potential applicability.

AB - We have created a visual interface using the human palm that is location sensitive and always available. To accomplish this, we constructed an augmented space in an actual workspace by installing several depth cameras. To manage and connect the multiple depth cameras, we constructed a distributed system based on scalable client–server architecture. By merging depth images from different cameras, the distributed system can track the locations of users within their area of coverage. The system also has a convenient feature that allows users to collect the locations of objects while visualizing the objects via images from the depth cameras. Consequently, the locations of both users and objects are available to the system, thus providing a location-based context for determining which user is close to which object. As a result, the visual interface on the palm becomes location sensitive, which could lead to various applications in daily life. In this paper, we describe the implementation of the aforementioned system and demonstrate its potential applicability.

KW - Augmented reality

KW - Interaction

KW - Interface on body

KW - Location awareness

KW - System

KW - Ubiquitous

UR - http://www.scopus.com/inward/record.url?scp=84957975208&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84957975208&partnerID=8YFLogxK

U2 - 10.1007/s00779-014-0769-0

DO - 10.1007/s00779-014-0769-0

M3 - Article

AN - SCOPUS:84957975208

VL - 19

SP - 175

EP - 187

JO - Personal and Ubiquitous Computing

JF - Personal and Ubiquitous Computing

SN - 1617-4909

IS - 1

ER -