Abstract
Every day, we receive large amounts of information through our various senses. Since the majority of this information is perceived through sight, the other senses are often left underutilized. To engage these underused senses, especially hearing, in our daily interactions, we propose a method that extracts real-world visual information and transforms it into auditory information. Our method converts the location and distance attributes of real-world objects into sound attributes, which are then combined to form music. To verify our approach, we developed a prototype called Music Sonar, with which we carried out a preliminary user study followed by a questionnaire. The objective of Music Sonar is to reduce the cognitive overload caused by visual information by converting visual feedback into ambient auditory feedback, allowing visual attention to be directed to more essential tasks. The user study results confirmed the general validity of our approach, despite some shortcomings. Participants also provided a number of insights regarding the usability and interactivity of the music-based feedback. Finally, we outline the future direction of this research project.
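The abstract describes the mapping only at a high level (object location and distance converted to sound attributes that are combined into music). As an illustrative sketch only, under assumed conventions and not the authors' actual implementation, one such mapping could take an object's horizontal angle and distance and produce a note from a musical scale, a gain, and a stereo pan; the scale, ranges, and function below are hypothetical.

```python
# Hypothetical illustration of a visual-to-auditory mapping (not the Music Sonar
# implementation): an object's angle and distance relative to the user are
# converted into simple sound attributes.
# Assumed conventions: angle in degrees (0 = straight ahead, negative = left),
# distance in metres, and a C major pentatonic scale for pitch quantization.

PENTATONIC_MIDI = [60, 62, 64, 67, 69, 72]  # one octave of C major pentatonic

def object_to_sound(angle_deg: float, distance_m: float, max_range_m: float = 5.0):
    """Map an object's angle/distance to (midi_note, gain, stereo_pan)."""
    # Closer objects sound louder: linear fall-off, clamped to [0, 1].
    gain = max(0.0, 1.0 - min(distance_m, max_range_m) / max_range_m)
    # Horizontal angle controls stereo panning in [-1 (left), +1 (right)].
    pan = max(-1.0, min(1.0, angle_deg / 90.0))
    # Distance also selects a pitch: nearer objects get higher notes.
    idx = int((1.0 - distance_m / max_range_m) * (len(PENTATONIC_MIDI) - 1))
    midi_note = PENTATONIC_MIDI[max(0, min(idx, len(PENTATONIC_MIDI) - 1))]
    return midi_note, gain, pan

if __name__ == "__main__":
    # An obstacle 1.2 m away, slightly to the left of the user.
    print(object_to_sound(angle_deg=-30.0, distance_m=1.2))
```

In such a scheme, notes generated for several detected objects would then be layered or sequenced to form the ambient musical feedback the abstract refers to.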
Original language | English |
---|---|
Title of host publication | ACE 2015 - 12th Advances in Computer Entertainment Technology Conference, Proceedings |
Publisher | Association for Computing Machinery |
Volume | 16-19-November-2015 |
ISBN (Electronic) | 9781450338523 |
DOI | |
Publication status | Published - 16 Nov 2015 |
Event | 12th Advances in Computer Entertainment Technology Conference, ACE 2015 - Iskandar, Malaysia; Duration: 16 Nov 2015 → 19 Nov 2015 |
Other
Other | 12th Advances in Computer Entertainment Technology Conference, ACE 2015 |
---|---|
Country/Territory | Malaysia |
City | Iskandar |
Period | 16 Nov 2015 → 19 Nov 2015 |
ASJC Scopus subject areas
- Human-Computer Interaction
- Computer Networks and Communications
- Computer Vision and Pattern Recognition
- Software