A robotic auditory system for imitating human listening behavior

Hideyuki Sawada, Chika Udaka

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In vocal communication, we employ not only speech and auditory abilities but also body actions such as gestures and gaze to react to speech. For example, a human nods to a speaker in agreement and makes eye contact during a conversation. The purpose of this study is to realize a human-like auditory system using a robotic arm, which listens to a human voice as a human does and interacts with human speakers. In this paper, we introduce the imitation of human listening behavior by the robotic auditory system to realize natural communication between a human and a robot.
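
Although this record is bibliographic, the keywords (microphone array, sound tracking) point at the core technical step such a system needs: estimating where a voice comes from so the robotic arm can turn toward the speaker. The sketch below is a minimal, hedged illustration of that step under an assumed two-microphone, far-field model using plain cross-correlation; the function estimate_azimuth, its parameters, and the synthetic test are hypothetical and are not taken from the paper.

```python
# A minimal sketch, assuming a two-microphone far-field model: estimate the
# direction of a speaker from the time difference of arrival (TDOA) between
# the channels. The paper uses a microphone array for sound tracking, but the
# cross-correlation method and all names below are illustrative assumptions,
# not the authors' published algorithm.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)


def estimate_azimuth(left: np.ndarray, right: np.ndarray,
                     fs: float, mic_distance: float) -> float:
    """Return the estimated azimuth (radians) of a sound source.

    left, right  -- time-aligned frames from the two microphones
    fs           -- sampling rate in Hz
    mic_distance -- spacing between the microphones in metres
    """
    # Cross-correlate the channels; the peak gives the relative delay in samples.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    tdoa = lag / fs  # seconds; sign tells which microphone heard the sound first

    # Far-field model: tdoa = mic_distance * sin(azimuth) / c
    ratio = np.clip(tdoa * SPEED_OF_SOUND / mic_distance, -1.0, 1.0)
    return float(np.arcsin(ratio))


if __name__ == "__main__":
    # Synthetic check: the same noise burst reaches the left mic 5 samples earlier.
    fs, n, delay = 16000, 1024, 5
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(n + delay)
    left, right = noise[delay:delay + n], noise[:n]
    # Prints roughly -32 degrees (negative here because the left channel leads).
    print(np.degrees(estimate_azimuth(left, right, fs, mic_distance=0.2)))
```

In a complete system this angle estimate would drive the arm's orientation so the robot appears to attend to the speaker; the actual array geometry and tracking method used by the authors may differ.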

Original language: English
Title of host publication: 2013 IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013
Pages: 773-778
Number of pages: 6
DOIs: 10.1109/ICMA.2013.6618014
Publication status: Published - 2013
Externally published: Yes
Event: 2013 10th IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013 - Takamatsu
Duration: 2013 Aug 4 – 2013 Aug 7

Other

Other: 2013 10th IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013
City: Takamatsu
Period: 13/8/4 → 13/8/7

Fingerprint

  • Robotics
  • Robotic arms
  • Communication
  • Robots

Keywords

  • Acoustic features
  • Arm robot
  • Auditory system
  • Microphone array
  • Sound tracking

ASJC Scopus subject areas

  • Artificial Intelligence
  • Electrical and Electronic Engineering
  • Mechanical Engineering

Cite this

Sawada, H., & Udaka, C. (2013). A robotic auditory system for imitating human listening behavior. In 2013 IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013 (pp. 773-778). [6618014] https://doi.org/10.1109/ICMA.2013.6618014

A robotic auditory system for imitating human listening behavior. / Sawada, Hideyuki; Udaka, Chika.

2013 IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013. 2013. p. 773-778 6618014.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Sawada, H & Udaka, C 2013, A robotic auditory system for imitating human listening behavior. in 2013 IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013., 6618014, pp. 773-778, 2013 10th IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013, Takamatsu, 13/8/4. https://doi.org/10.1109/ICMA.2013.6618014
Sawada H, Udaka C. A robotic auditory system for imitating human listening behavior. In 2013 IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013. 2013. p. 773-778. 6618014 https://doi.org/10.1109/ICMA.2013.6618014
Sawada, Hideyuki ; Udaka, Chika. / A robotic auditory system for imitating human listening behavior. 2013 IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013. 2013. pp. 773-778
@inproceedings{808dabf4ddbf431faf3f01af1385ee48,
title = "A robotic auditory system for imitating human listening behavior",
abstract = "In vocal communication, we employ not only speech and auditory abilities but also body actions such as gestures and gaze to react to speech. For example, a human nods to a speaker in agreement and makes eye contact during a conversation. The purpose of this study is to realize a human-like auditory system using a robotic arm, which listens to a human voice as a human does and interacts with human speakers. In this paper, we introduce the imitation of human listening behavior by the robotic auditory system to realize natural communication between a human and a robot.",
keywords = "Acoustic features, Arm robot, Auditory system, Microphone array, Sound tracking",
author = "Hideyuki Sawada and Chika Udaka",
year = "2013",
doi = "10.1109/ICMA.2013.6618014",
language = "English",
isbn = "9781467355582",
pages = "773--778",
booktitle = "2013 IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013",

}

TY - GEN

T1 - A robotic auditory system for imitating human listening behavior

AU - Sawada, Hideyuki

AU - Udaka, Chika

PY - 2013

Y1 - 2013

N2 - In vocal communication, we employ not only speech and auditory abilities but also body actions such as gestures and gaze to react to speech. For example, a human nods to a speaker in agreement and makes eye contact during a conversation. The purpose of this study is to realize a human-like auditory system using a robotic arm, which listens to a human voice as a human does and interacts with human speakers. In this paper, we introduce the imitation of human listening behavior by the robotic auditory system to realize natural communication between a human and a robot.

AB - In vocal communication, we employ not only speech and auditory abilities but also body actions such as gestures and gaze to react to speech. For example, a human nods to a speaker in agreement and makes eye contact during a conversation. The purpose of this study is to realize a human-like auditory system using a robotic arm, which listens to a human voice as a human does and interacts with human speakers. In this paper, we introduce the imitation of human listening behavior by the robotic auditory system to realize natural communication between a human and a robot.

KW - Acoustic features

KW - Arm robot

KW - Auditory system

KW - Microphone array

KW - Sound tracking

UR - http://www.scopus.com/inward/record.url?scp=84887889367&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84887889367&partnerID=8YFLogxK

U2 - 10.1109/ICMA.2013.6618014

DO - 10.1109/ICMA.2013.6618014

M3 - Conference contribution

AN - SCOPUS:84887889367

SN - 9781467355582

SP - 773

EP - 778

BT - 2013 IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013

ER -