Direct-manipulation interface for collaborative 3D drawing in the real world

Akihiro Osaki, Hiroyuki Taniguchi, Yoshiyuki Miwa

    Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

    3 Citations (Scopus)

    Abstract

    This paper describes a collaborative augmented reality (AR) system with which multiple users can simultaneously handwrite 3D lines in the air and manipulate those lines directly in the real world. In addition, we propose a new technique for embodied communication based on 3D drawing. Various 3D user interfaces have been proposed to date, but most of them aim to solve specific problems in virtual environments, and the expressive potential of 3D drawing has not yet been explored. Accordingly, we focused on interaction with real objects in daily life and designed the interface so that real objects and 3D lines can be manipulated by the same actions, without distinction. The developed AR system consists of a head-mounted display, a drawing tool, 6-DOF sensors, and a 3D user interface that allows a 3D line to be pushed, grasped, and pitched directly with the drawing tool. Users can also pick up a desired color from either the surrounding landscape or a virtual line through direct interaction with the tool. To share 3D lines among multiple users in the same place, we developed a distributed AR system in which each system mutually sends and receives drawn data. With the developed system, users can design jointly in real space by arranging each other's 3D drawings through direct manipulation. Moreover, new entertainment applications become possible, such as playing catch or fencing with drawn lines.
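
    The paper does not give implementation details, but a minimal sketch may help make the sharing mechanism concrete: the code below shows one possible way to represent a hand-drawn 3D stroke (sampled pen-tip positions plus a picked color) and to broadcast finished strokes to co-located peer systems. The Stroke3D structure, the JSON wire format, and the UDP port are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the "mutually sends and receives drawn data" idea;
# names, port, and wire format are assumptions for illustration only.
import json
import socket
from dataclasses import dataclass, field, asdict

@dataclass
class Stroke3D:
    user_id: str                                  # which participant drew the stroke
    color: tuple                                  # RGB picked from the landscape or a virtual line
    points: list = field(default_factory=list)    # sampled pen-tip positions (x, y, z)

    def add_point(self, x: float, y: float, z: float) -> None:
        self.points.append((x, y, z))

PEER_PORT = 50007  # assumed UDP port shared by all systems in the room

def broadcast_stroke(stroke: Stroke3D) -> None:
    """Send a finished stroke to every peer system on the local network."""
    payload = json.dumps(asdict(stroke)).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(payload, ("<broadcast>", PEER_PORT))

def receive_stroke() -> Stroke3D:
    """Block until a stroke arrives from another user's system, then rebuild it."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", PEER_PORT))
        data, _addr = sock.recvfrom(65535)
        raw = json.loads(data.decode("utf-8"))
        return Stroke3D(user_id=raw["user_id"],
                        color=tuple(raw["color"]),
                        points=[tuple(p) for p in raw["points"]])
```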

    Original language: English
    Title of host publication: Proceedings - IEEE International Workshop on Robot and Human Interactive Communication
    Pages: 793-798
    Number of pages: 6
    DOI: 10.1109/ROMAN.2006.314359
    Publication status: Published - 2006
    Event: RO-MAN 2006 - The 15th IEEE International Symposium on Robot and Human Interactive Communication - Hatfield
    Duration: 2006 Sep 6 - 2006 Sep 8


    Fingerprint

    Augmented reality
    User interfaces
    Sports
    Virtual reality
    Display devices
    Color
    Communication
    Sensors
    Air

    ASJC Scopus subject areas

    • Engineering (all)

    Cite this

    Osaki, A., Taniguchi, H., & Miwa, Y. (2006). Direct-manipulation interface for collaborative 3D drawing in the real world. In Proceedings - IEEE International Workshop on Robot and Human Interactive Communication (pp. 793-798). [4107906] https://doi.org/10.1109/ROMAN.2006.314359

    @inproceedings{efd7b927b32f40f4b08fce118ec3786c,
    title = "Direct-manipulation interface for collaborative 3D drawing in the real world",
    author = "Akihiro Osaki and Hiroyuki Taniguchi and Yoshiyuki Miwa",
    year = "2006",
    doi = "10.1109/ROMAN.2006.314359",
    language = "English",
    isbn = "1424405653",
    pages = "793--798",
    booktitle = "Proceedings - IEEE International Workshop on Robot and Human Interactive Communication",

    }
