The expanding role of robots in activities of daily living (ADL) has drawn increasing research attention to human-robot interaction, in which emotion recognition is regarded as an important issue from the perspective of the user's mental state. We are developing an assistive walking device that accounts for the correlation between physical assistance and the user's mental condition. To link the assistive device to the user's mental condition, emotion must be evaluated in real time. This study aims to develop a new method for evaluating emotion on the two-dimensional valence-arousal model using multiple physiological signals. We elicit changes in the users' emotions with stimuli drawn from a normative affective database and record multiple physiological signals from the subjects. We then apply several algorithms (k-means, the T method of the MTS (Mahalanobis Taguchi System), and a DNN (deep neural network)) to determine the emotional state from the physiological data. The findings indicate that the deep neural network method can accurately recognize the human emotional state.
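As one of the candidate classifiers, the k-means step can be sketched as follows. This is a minimal illustration only: the two-dimensional features, the cluster count, and the synthetic data are assumptions for the sketch, not the study's actual physiological signals or parameters.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means (Lloyd's algorithm): assign each point to its nearest
    centroid, then recompute each centroid as its cluster mean, repeated for
    a fixed number of iterations."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(v) / len(cl) for v in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids

# Hypothetical 2-D features (e.g., normalized summary statistics of two
# physiological channels); real inputs would come from the recorded signals.
rng = random.Random(1)
group_a = [(rng.gauss(0.2, 0.05), rng.gauss(0.3, 0.05)) for _ in range(30)]
group_b = [(rng.gauss(0.8, 0.05), rng.gauss(0.7, 0.05)) for _ in range(30)]
centroids = kmeans(group_a + group_b, k=2)
```

With well-separated feature groups, the two recovered centroids land near the group means; in practice the cluster assignments would then be compared against the valence-arousal labels of the eliciting stimuli.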