TY - JOUR
T1 - A location-sensitive visual interface on the palm
T2 - interacting with common objects in an augmented space
AU - Kim, Seokhwan
AU - Takahashi, Shin
AU - Tanaka, Jiro
N1 - Publisher Copyright:
© 2014, Springer-Verlag London.
PY - 2015/1/1
Y1 - 2015/1/1
N2 - We have created a visual interface using the human palm that is location sensitive and always available. To accomplish this, we constructed an augmented space in an actual workspace by installing several depth cameras. To manage and connect the multiple depth cameras, we constructed a distributed system based on a scalable client–server architecture. By merging depth images from different cameras, the distributed system can track the locations of users within the cameras' coverage area. The system also has a convenient feature that allows users to collect the locations of objects while visualizing the objects via images from the depth cameras. Consequently, the locations of both users and objects are available to the system, thus providing a location-based context for determining which user is close to which object. As a result, the visual interface on the palm becomes location sensitive, which could lead to various applications in daily life. In this paper, we describe the implementation of the aforementioned system and demonstrate its potential applicability.
AB - We have created a visual interface using the human palm that is location sensitive and always available. To accomplish this, we constructed an augmented space in an actual workspace by installing several depth cameras. To manage and connect the multiple depth cameras, we constructed a distributed system based on a scalable client–server architecture. By merging depth images from different cameras, the distributed system can track the locations of users within the cameras' coverage area. The system also has a convenient feature that allows users to collect the locations of objects while visualizing the objects via images from the depth cameras. Consequently, the locations of both users and objects are available to the system, thus providing a location-based context for determining which user is close to which object. As a result, the visual interface on the palm becomes location sensitive, which could lead to various applications in daily life. In this paper, we describe the implementation of the aforementioned system and demonstrate its potential applicability.
KW - Augmented reality
KW - Interaction
KW - Interface on body
KW - Location awareness
KW - System
KW - Ubiquitous
UR - http://www.scopus.com/inward/record.url?scp=84957975208&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84957975208&partnerID=8YFLogxK
U2 - 10.1007/s00779-014-0769-0
DO - 10.1007/s00779-014-0769-0
M3 - Article
AN - SCOPUS:84957975208
VL - 19
SP - 175
EP - 187
JO - Personal and Ubiquitous Computing
JF - Personal and Ubiquitous Computing
SN - 1617-4909
IS - 1
ER -