In this paper, we propose a method for quantifying human-robot physical contact states from tactile sensory data, as a first step toward real-time contact-state identification systems. The method constructs an artificial tactile cognition for robots that closely reproduces human tactile cognition performance. First, we have robots learn the relationship between the characteristics of sensed tactile stimuli and the verbal expressions a human (the receiver) uses to describe those stimuli when touched by other people. Learning with a neural network called MCP (modified counterpropagation) forms self-organizing maps that encode this quantitative relationship. Next, in order to quantify the receiver's tactile cognition performance, a confusion matrix describing the probabilities among contact states is proposed; it is computed from the connection weights of the neural network. The confusion matrix enables a robot that comes into contact with a human to recognize and infer contact states almost as the receiver describes them, based on tactile sensing alone. Finally, experiments confirm that the proposed method is useful for quantifying human-robot contact states.
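The learning and confusion-matrix steps above can be sketched as follows. This is a minimal illustration of a standard counterpropagation network (a competitive Kohonen layer followed by a Grossberg output layer), not the paper's MCP variant; the tactile features, dimensions, and the way probabilities are read from connection weights are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: tactile feature vectors (e.g. force, contact area,
# duration) paired with one of three contact-state labels verbalized by the
# receiver. Sizes and separation are illustrative, not from the paper.
n_features, n_states, n_units = 3, 3, 16
X = rng.normal(size=(300, n_features))
y = rng.integers(0, n_states, size=300)
X += np.eye(n_states)[y] * 2.0  # make the classes roughly separable

# Kohonen (self-organizing) layer and Grossberg (output) layer weights.
kohonen = rng.normal(size=(n_units, n_features))
grossberg = np.zeros((n_units, n_states))

def winner(x):
    # Competitive layer: the unit closest to the input wins.
    return np.argmin(np.linalg.norm(kohonen - x, axis=1))

# Counterpropagation training: move the winning Kohonen unit toward the
# input, and its Grossberg weights toward the one-hot target label.
alpha, beta = 0.2, 0.2
for epoch in range(20):
    for x, label in zip(X, y):
        w = winner(x)
        kohonen[w] += alpha * (x - kohonen[w])
        grossberg[w] += beta * (np.eye(n_states)[label] - grossberg[w])

# Each unit's Grossberg weights approximate P(state | unit). Accumulating
# them over the units that win for samples of each true state gives a
# confusion-matrix-like table of probabilities among contact states.
conf = np.zeros((n_states, n_states))
for x, label in zip(X, y):
    conf[label] += grossberg[winner(x)]
conf /= conf.sum(axis=1, keepdims=True)

print(np.round(conf, 2))  # rows: true state, columns: inferred state
```

With reasonably separable stimuli, the diagonal of `conf` dominates, mirroring how well the robot's inferred contact states agree with the receiver's verbal labels.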