Visualization of auditory awareness based on sound source positions estimated by depth sensor and microphone array

Takahiro Iyama, Osamu Sugiyama, Takuma Otsuka, Katsutoshi Itoyama, Hiroshi G. Okuno

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

We have developed a system for visualizing auditory awareness on the basis of sound source locations estimated using a depth sensor and a microphone array. Previous studies on visualizing the acoustic environment overlaid sound pressure levels directly on the captured image, so the visualization often reflected a mixture of several sound sources, and it was not intuitive which targets to focus on. To help users find targets selectively and concentrate on analyzing them, the captured acoustic information should be extracted and presented selectively according to user demand. We have designed a three-layer visualization model for auditory awareness consisting of a sound source distribution layer, a sound location layer, and a sound saliency layer. The model extracts acoustic information from the depth image and the multi-directional sound sources captured with the depth sensor and microphone array. This model underlies the system we developed for visualizing auditory awareness.
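The three-layer structure described in the abstract can be sketched in code. This is a hypothetical illustration only, not the authors' implementation: the layer functions, the energy-share saliency measure, and the fusion of azimuth estimates with depth ranges are all assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch of a three-layer auditory-awareness model.
# Layer 1: sound source distribution - spatial likelihood of sound over directions.
# Layer 2: sound location - positions obtained by fusing direction estimates
#          from a microphone array with ranges read from a depth image.
# Layer 3: sound saliency - a per-source score used to rank what to visualize.

def distribution_layer(powers_by_direction):
    """Normalize per-direction acoustic power into a spatial distribution."""
    p = np.asarray(powers_by_direction, dtype=float)
    return p / p.sum()

def location_layer(directions_deg, depth_m):
    """Fuse azimuth estimates (microphone array) with depth-sensor ranges
    into 2D positions on the horizontal plane (x: forward, y: lateral)."""
    az = np.radians(directions_deg)
    return np.stack([np.asarray(depth_m) * np.cos(az),
                     np.asarray(depth_m) * np.sin(az)], axis=1)

def saliency_layer(distribution, peak_idx):
    """Score each detected source; here simply its share of total power."""
    return np.array([distribution[i] for i in peak_idx])

# Toy example: power over 8 azimuth bins, two sources detected at bins 1 and 5.
powers = [0.1, 3.0, 0.2, 0.1, 0.1, 1.5, 0.1, 0.1]
dist = distribution_layer(powers)
positions = location_layer(directions_deg=[45, 225], depth_m=[1.2, 2.5])
scores = saliency_layer(dist, peak_idx=[1, 5])
print(positions.round(2), scores.round(2))
```

Under these assumptions, the distribution layer gives an overall picture of where sound energy lies, the location layer anchors sources in space, and the saliency scores let the visualization present only the sources the user is likely to care about.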

Original language: English
Title of host publication: IEEE International Conference on Intelligent Robots and Systems
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1908-1913
Number of pages: 6
ISBN (Print): 9781479969340
DOIs: 10.1109/IROS.2014.6942814
Publication status: Published - 2014 Oct 31
Externally published: Yes
Event: 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2014 - Chicago
Duration: 2014 Sep 14 - 2014 Sep 18

Other

Other: 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2014
City: Chicago
Period: 14/9/14 - 14/9/18

Fingerprint

Microphones
Visualization
Acoustic waves
Sensors
Acoustics

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications

Cite this

Iyama, T., Sugiyama, O., Otsuka, T., Itoyama, K., & Okuno, H. G. (2014). Visualization of auditory awareness based on sound source positions estimated by depth sensor and microphone array. In IEEE International Conference on Intelligent Robots and Systems (pp. 1908-1913). [6942814] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IROS.2014.6942814

