Some people lose upper-extremity function because of accidents or disease. Upper-extremity disorders significantly reduce quality of life, since most activities of daily living require upper-limb function; the need for upper-limb assistive devices has therefore grown. In this research, we propose a motion intention recognition system based on the Kinect® v2 sensor. The sensor directly detects the user's motion and controls the device with the corresponding joint angles, instead of driving the device along a predefined trajectory. Because body dimensions differ between individuals, we designed an unconstrained user-device interface: two pressure-sensor trays on each robot arm support the user's forearm and upper arm, respectively. This unconstrained interface partially compensates for both individual differences and control error. We therefore established an unconstrained user-device model to capture the relationship between the user and the device, and to control the device from the recorded user motion. In addition, the Kinect® sensor captures the coordinates of the human joints, from which the user's arm lengths can be calculated, allowing the system to adapt to different users. To achieve real-time control and assistance, a Kalman filter, which provides prediction, was exploited. The feasibility of the assistance was confirmed by the system response: the proposed motion recognition system and the unconstrained user-device system successfully provided adequate assistance with a smaller time delay than the same system without the Kalman filter.
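To illustrate the prediction step mentioned above, the following is a minimal sketch of a Kalman filter for one joint angle, assuming a constant-velocity state model and a ~30 fps Kinect v2 frame rate. All matrices, noise values, and the `kalman_step` helper are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

# State x = [angle, angular_velocity]; only the angle is measured.
dt = 1.0 / 30.0                      # assumed Kinect v2 frame period (~30 fps)
F = np.array([[1.0, dt],
              [0.0, 1.0]])           # state transition (constant velocity)
H = np.array([[1.0, 0.0]])           # we observe the angle only
Q = np.eye(2) * 1e-3                 # process noise covariance (assumed)
R = np.array([[1e-2]])               # measurement noise covariance (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle; z is the measured joint angle (rad)."""
    # Predict: propagate state and covariance one frame ahead.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the new measurement.
    y = z - H @ x_pred                    # innovation
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new, x_pred           # x_pred is a one-frame look-ahead

# Usage: filter a noisy angle ramp (true angular velocity 0.5 rad/s).
rng = np.random.default_rng(0)
x = np.array([0.0, 0.0])
P = np.eye(2)
for k in range(60):
    z = 0.5 * k * dt + rng.normal(0.0, 0.05)   # noisy measurement
    x, P, x_ahead = kalman_step(x, P, np.array([z]))
```

The one-frame-ahead prediction `x_pred` is what lets the controller act on the next expected joint angle rather than the last (delayed) measurement, which is how the filter reduces the perceived time delay.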