This paper presents an inertial measurement unit (IMU)-based human gesture recognition system that enables a robot instrument player to understand the instructions dictated by an orchestra conductor and adapt its musical performance accordingly. It extends our previous publications on natural human-robot musical interaction. With this system, the robot can follow the real-time variations in musical parameters dictated by the conductor's movements, adding expression to its performance while remaining synchronized with all the other human partner musicians. The enhanced interaction ability would not only improve the overall live performance but also allow the partner musicians, as well as the conductor, to better appreciate a joint musical performance, thanks to the complete naturalness of the interaction.