During several years of development, the hardware of the anthropomorphic flutist robot Waseda Flutist WF-4RIV has been continuously improved. The robot is currently able to play the flute at the level of an intermediate human player. Recently, we have focused our research on the interactivity of the robot's performance. Initially, the robot was only able to give a static performance that could not be actively controlled by a partner musician. In a realistic performance setting, such as a band or an orchestra, musicians need to interact in order to create a performance that gives a natural and dynamic impression to the audience. In this publication we present the latest developments on the integration of a Musical-based Interaction System (MbIS) with the WF-4RIV. This human-robot interaction system is intended to allow human musicians to communicate naturally with the flutist robot through audio-visual cues. We summarize our previous results, present the latest extensions to the system, and concentrate in particular on experimental applications of the system. We evaluate our interactive performance system using three different methods: a comparison of a passive (non-interactive) and an interactive performance, an evaluation of the technical functionality of the interaction system as a whole, and an examination of the MbIS from a user perspective through a survey of amateur and professional musicians. Our experimental results show that the Musical-based Interaction System extends the anthropomorphic design of the flutist robot to allow increasingly interactive, natural musical performances with human musicians.