TY - GEN
T1 - Development of a real-time instrument tracking system for enabling the musical interaction with the Waseda flutist robot
AU - Petersen, Klaus
AU - Solis, Jorge
AU - Takanishi, Atsuo
PY - 2008/12/1
Y1 - 2008/12/1
N2 - The aim of this paper is to create an interface for human-robot interaction. Specifically, musical performance parameters (e.g. vibrato expression) of the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV) are to be manipulated. Our research is focused on enabling the WF-4RIV to interact with human players (musicians) in a natural way. In this paper, as a first approach, a vision processing algorithm that is able to track the 3D orientation and position of a musical instrument was developed. In particular, the robot acquires image data through two cameras attached to its head. Using color histogram matching and a particle filter, the positions of the musician's hands on the instrument are tracked. Analysis of these data determines the orientation and location of the instrument. These parameters are mapped to manipulate the musical expression of the WF-4RIV, more specifically sound vibrato and volume values. We present preliminary experiments to determine whether the robot may dynamically change musical parameters (e.g. vibrato) while interacting with a human player. From the experimental results, we confirm the feasibility of the interaction during a performance, although further research must be carried out to consider the physical constraints of the flutist robot.
AB - The aim of this paper is to create an interface for human-robot interaction. Specifically, musical performance parameters (e.g. vibrato expression) of the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV) are to be manipulated. Our research is focused on enabling the WF-4RIV to interact with human players (musicians) in a natural way. In this paper, as a first approach, a vision processing algorithm that is able to track the 3D orientation and position of a musical instrument was developed. In particular, the robot acquires image data through two cameras attached to its head. Using color histogram matching and a particle filter, the positions of the musician's hands on the instrument are tracked. Analysis of these data determines the orientation and location of the instrument. These parameters are mapped to manipulate the musical expression of the WF-4RIV, more specifically sound vibrato and volume values. We present preliminary experiments to determine whether the robot may dynamically change musical parameters (e.g. vibrato) while interacting with a human player. From the experimental results, we confirm the feasibility of the interaction during a performance, although further research must be carried out to consider the physical constraints of the flutist robot.
UR - http://www.scopus.com/inward/record.url?scp=69549116703&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=69549116703&partnerID=8YFLogxK
U2 - 10.1109/IROS.2008.4650831
DO - 10.1109/IROS.2008.4650831
M3 - Conference contribution
AN - SCOPUS:69549116703
SN - 9781424420582
T3 - 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS
SP - 313
EP - 318
BT - 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS
T2 - 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS
Y2 - 22 September 2008 through 26 September 2008
ER -
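
Note: the abstract above describes hand tracking via color histogram matching and a particle filter, with the recovered instrument pose mapped to vibrato and volume. The following is a minimal, hedged sketch of that pipeline (not the authors' code) using OpenCV and NumPy; the particle count, motion noise, camera setup, and the vibrato/volume mapping ranges are illustrative assumptions, not values from the paper.

# Sketch of color-histogram / particle-filter hand tracking and a simple
# pose-to-expression mapping, as outlined in the abstract. All constants are assumed.
import cv2
import numpy as np

N_PARTICLES = 300          # assumed particle count
MOTION_STD = 12.0          # assumed random-walk noise in pixels

def make_hue_hist(frame_bgr, roi):
    """Build a normalized hue histogram from a hand-colored region of interest."""
    x, y, w, h = roi
    hsv = cv2.cvtColor(frame_bgr[y:y+h, x:x+w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [16], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

def init_particles(roi):
    """Scatter particles uniformly over the initial hand region."""
    x, y, w, h = roi
    return np.column_stack([np.random.uniform(x, x + w, N_PARTICLES),
                            np.random.uniform(y, y + h, N_PARTICLES)])

def track_step(frame_bgr, particles, hist):
    """One predict / weight / resample cycle; returns (new particles, position estimate)."""
    h, w = frame_bgr.shape[:2]
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)

    # Predict: random-walk motion model.
    particles = particles + np.random.normal(0, MOTION_STD, particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], 0, w - 1)
    particles[:, 1] = np.clip(particles[:, 1], 0, h - 1)

    # Weight: histogram back-projection likelihood at each particle.
    xs = particles[:, 0].astype(int)
    ys = particles[:, 1].astype(int)
    weights = backproj[ys, xs].astype(float) + 1e-6
    weights /= weights.sum()

    # Estimate (weighted mean) and resample.
    estimate = (weights[:, None] * particles).sum(axis=0)
    idx = np.random.choice(len(particles), len(particles), p=weights)
    return particles[idx], estimate

def map_to_expression(left_hand, right_hand, frame_height):
    """Map instrument pose derived from the two hand positions to (vibrato, volume) in [0, 1]."""
    dx, dy = right_hand - left_hand
    tilt = np.arctan2(dy, dx)                        # instrument tilt from hand positions
    vibrato = np.clip(abs(tilt) / (np.pi / 4), 0, 1)  # assumed mapping range
    centre_y = (left_hand[1] + right_hand[1]) / 2
    volume = np.clip(1.0 - centre_y / frame_height, 0, 1)
    return vibrato, volume

In use, one tracker would be run per hand (and per camera of the stereo head), and the two estimates combined per frame via map_to_expression to drive the robot's vibrato depth and volume; the stereo triangulation and robot-side control described in the paper are not reproduced here.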