Ubiquitous supervisory system based on social contexts using ontology

Satoru Izumi, Kazuhiro Yamanaka, Yoshikazu Tokairin, Hideyuki Takahashi, Takuo Suganuma, Norio Shiratori

    Research output: Contribution to journal › Article › peer-review

    12 Citations (Scopus)

    Abstract

    In this paper, we propose a supervisory system that considers the actual situations and social aspects of users in a ubiquitous computing environment. To realize gentle and safe supervision while providing efficient supervisory services, the system must recognize the situation of the watched person, such as the person's physical condition. To achieve this, we have proposed a ubiquitous supervisory system, "uEyes", which introduces Social Context Awareness as a distinguishing feature for supervision. Using this feature, the system can combine environmental information acquired from sensors in the real world with common-sense knowledge related to human activities in daily life. We specifically examine the design of Social Context Awareness using ontology technologies. Based on this advanced feature, a live video streaming system is configured autonomously at runtime depending on the users' circumstances. We implemented a uEyes prototype for supervising elderly people and performed experiments based on several scenarios. The experimental results confirmed that social contexts are handled effectively to support supervision.
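    The abstract describes a pipeline in which sensor readings are combined with common-sense knowledge to infer a social context, which in turn selects a streaming configuration. The sketch below illustrates that idea only; it is not the actual uEyes implementation, and all names (facts, rules, configurations) are hypothetical. A real system would use an ontology and a reasoner rather than hand-written Python rules.

    ```python
    # Illustrative sketch (NOT the uEyes implementation): combine sensor
    # facts with common-sense rules to infer a social context, then map
    # the context to a live-video streaming configuration.

    # Environmental facts acquired from hypothetical sensors.
    sensor_facts = {
        "location": "bedroom",
        "posture": "lying",
        "time_of_day": "23:30",  # 24-hour clock as a string
    }

    # Common-sense knowledge about daily activities, written here as
    # simple condition -> context rules for illustration.
    RULES = [
        (lambda f: (f["posture"] == "lying" and f["location"] == "bedroom"
                    and (f["time_of_day"] >= "22:00" or f["time_of_day"] <= "06:00")),
         "sleeping"),
        (lambda f: f["posture"] == "lying" and f["location"] != "bedroom",
         "possible_emergency"),
    ]

    def infer_context(facts):
        """Return the first social context whose rule matches the facts."""
        for condition, context in RULES:
            if condition(facts):
                return context
        return "normal_activity"

    # Streaming behavior chosen from the inferred social context.
    STREAM_CONFIG = {
        "sleeping": {"camera": "off", "notify_caregiver": False},
        "possible_emergency": {"camera": "high_quality", "notify_caregiver": True},
        "normal_activity": {"camera": "low_quality", "notify_caregiver": False},
    }

    context = infer_context(sensor_facts)
    print(context, STREAM_CONFIG[context])
    ```

    The point of the sketch is the separation of concerns the paper emphasizes: raw sensor data alone (lying down) is ambiguous, but combined with common-sense knowledge (lying in the bedroom at night usually means sleeping) it yields a social context that can safely drive the supervisory service.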

    Original language: English
    Pages (from-to): 141-163
    Number of pages: 23
    Journal: Mobile Information Systems
    Volume: 5
    Issue number: 2
    DOIs
    Publication status: Published - 2009

    Keywords

    • Ontology
    • Social Context Awareness
    • Supervisory system
    • Temporal concept

    ASJC Scopus subject areas

    • Computer Networks and Communications
    • Computer Science Applications

