Understanding human emotions is of major importance in the field of Human-Robot Interaction. A person's affective state can be expressed in several ways, one of which is gait: the way we walk can convey emotional cues in social contexts. These cues can be used to improve interactions with our peers or to add meaning to a message we wish to express. However, only a few studies in humanoid robotics have examined the effect of emotions on walking. In this paper, we propose a survey-based assessment of emotional walking patterns generated from motion capture data. These patterns represent different emotions (sadness, happiness) at different intensities (medium, high, and exaggerated). The emotional walking patterns achieved a high emotion recognition rate, and the subjects (N=13) could recognize whole-body emotions on our humanoid robot without facial expressions. Additionally, we found that people may initially perform poorly at recognizing emotions and their intensities, but can improve even without correction or feedback on their performance.