A rapid prototyping toolkit for touch sensitive objects using active acoustic sensing

Makoto Ono, Buntarou Shizuki, Jiro Tanaka

Research output: Conference contribution (Chapter in Book/Report/Conference proceeding)

Abstract

We present a prototyping toolkit for creating touch-sensitive prototypes from everyday objects without the need for special skills such as writing code or designing circuits. The toolkit consists of an acoustic-based touch sensor module that captures the resonant properties of objects, software modules including one that recognizes how an object is touched using machine learning, and plugins for visual programming environments such as Scratch and Max/MSP. As a result, our toolkit enables users to easily configure a wide variety of visual or audio responses to touches. We believe that our toolkit expands the creativity of nonspecialists, such as children and media artists.
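The recognition step the abstract describes — classifying how an object is touched from its acoustic resonant response, with a support vector machine (per the keywords) — can be sketched roughly as below. This is an illustrative assumption, not the authors' implementation: the `simulated_response` function and the three touch labels are synthetic stand-ins for the swept-frequency magnitude response that the toolkit's piezo-electric sensor module would actually capture.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

N_BINS = 64  # resolution of the (hypothetical) swept-frequency response

def simulated_response(touch_class):
    """Stand-in for the magnitude response captured by the piezo sensor.

    Each touch class gets a distinct resonance profile plus noise; in the
    real toolkit this vector would come from the acoustic sensor module.
    """
    base = np.sin(np.linspace(0.0, np.pi, N_BINS) * (touch_class + 1))
    return base + 0.05 * rng.standard_normal(N_BINS)

# Training set: 20 noisy examples for each of three hypothetical
# touch types (e.g. "tap", "grasp", "pinch").
classes = [0, 1, 2]
X = np.array([simulated_response(c) for c in classes for _ in range(20)])
y = np.array([c for c in classes for _ in range(20)])

clf = SVC(kernel="linear").fit(X, y)

# Classify a fresh reading of touch type 1.
pred = clf.predict([simulated_response(1)])[0]
```

In a toolkit like this one, the predicted label would then be forwarded (e.g. over OpenSound Control) to a visual programming environment such as Scratch or Max/MSP, where the user maps it to a visual or audio response.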

Original language: English
Title of host publication: UIST 2014 - Adjunct Publication of the 27th Annual ACM Symposium on User Interface Software and Technology
Publisher: Association for Computing Machinery, Inc
Pages: 35-36
Number of pages: 2
ISBN (Electronic): 9781450330688
DOIs: 10.1145/2658779.2659101
Publication status: Published - 2014 Oct 5
Externally published: Yes
Event: 27th Annual ACM Symposium on User Interface Software and Technology, UIST 2014 - Honolulu, United States
Duration: 2014 Oct 5 - 2014 Oct 8



Keywords

  • Acoustic classification
  • Machine learning
  • OpenSound Control
  • Piezo-electric sensor
  • Prototyping
  • Sensors
  • Support vector machine
  • Tangibles
  • Visual programming

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Software

Cite this

Ono, M., Shizuki, B., & Tanaka, J. (2014). A rapid prototyping toolkit for touch sensitive objects using active acoustic sensing. In UIST 2014 - Adjunct Publication of the 27th Annual ACM Symposium on User Interface Software and Technology (pp. 35-36). Association for Computing Machinery, Inc. https://doi.org/10.1145/2658779.2659101
