Effect of a flexible spine emotional belly dancing robot on human perceptions

Jimmy Or, Atsuo Takanishi

    Research output: Article

    6 Citations (Scopus)

    Abstract

    Recently, there has been a growing interest in human-robot interaction. Researchers in artificial intelligence and robotics have built various types of social robots that can express emotions through speech, facial expressions and hand gestures. Although some of these robots are able to interact with humans in interesting ways, they cannot move as naturally as we do because of the limited number of degrees of freedom in their torsos (some of them do not even have a torso). Since we often express and perceive each other's emotions and motives at a distance using body language alone, it would be good for the next generation of humanoid robots to possess similar capabilities. As a first step towards this goal, we developed a 28-DOF full-body humanoid robot as an experimental platform. Unlike the current generation of humanoid robots, our robot has a flexible spine. This feature is very important because counterbalancing movements of the spine are required to maintain dynamic stability in humans and humanoid robots. Our robot can belly dance and communicate affective motions via full-body movements. Using a Central Pattern Generator (CPG) based controller, we generated rhythmic motions for the arms and the upper and lower body. We then conducted psychological experiments using both the robot and a human actor. Statistical analyses were carried out to test our hypotheses on human perception of affective movements. Experimental results show that human subjects were able to perceive emotions from the robot based only on its body motions, sometimes as accurately as when the same movements were performed by the human actor. Our robot can be used to examine the relationship between movements of the spine, shoulders, arms, neck and head when reproducing affective movements. Psychologists, actors, dancers and animators can benefit from this line of research by learning how emotions can be conveyed through body motions and how body part movements combine to communicate emotional expressions.
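The abstract's CPG-based controller is not reproduced here, but the core idea behind such controllers — coupled oscillators whose mutual coupling locks the joints into fixed phase relationships, producing a travelling wave of motion along a chain such as a spine — can be sketched with simple Kuramoto-style phase oscillators. All parameter values and function names below are illustrative assumptions, not taken from the paper:

```python
import math

def simulate_cpg_chain(n_joints=3, steps=2000, dt=0.005, freq_hz=1.0,
                       coupling=5.0, target_lag=math.pi / 2, amp=0.4):
    """Integrate a chain of Kuramoto-style phase oscillators.

    Each oscillator drives one joint; oscillator i is coupled to its
    neighbor i-1 so that it settles target_lag radians behind it,
    producing a travelling wave of joint angles (e.g. a spine undulation).
    Returns (joint_angles, phases) after the final integration step.
    """
    omega = 2.0 * math.pi * freq_hz              # common intrinsic frequency
    phases = [0.3 * i for i in range(n_joints)]  # arbitrary initial phases
    for _ in range(steps):
        nxt = []
        for i, phi in enumerate(phases):
            dphi = omega
            if i > 0:  # pull joint i toward the desired lag behind joint i-1
                dphi += coupling * math.sin(phases[i - 1] - phi - target_lag)
            nxt.append(phi + dphi * dt)  # explicit Euler step
        phases = nxt
    angles = [amp * math.sin(phi) for phi in phases]  # joint commands (rad)
    return angles, phases
```

A key property of this family of controllers is that the rhythm is generated internally: after a transient, the relative phases lock to `target_lag` regardless of initial conditions, so perturbing one joint only temporarily disturbs the overall pattern.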

    Original language: English
    Pages (from-to): 21-48
    Number of pages: 28
    Journal: International Journal of Humanoid Robotics
    Volume: 4
    Issue number: 1
    DOI: 10.1142/S0219843607000935
    Publication status: Published - Mar 2007

    ASJC Scopus subject areas

    • Artificial Intelligence
    • Mechanical Engineering

    Cite this

    @article{1ec87bc38c744e5780b8a8bbb74b756e,
    title = "Effect of a flexible spine emotional belly dancing robot on human perceptions",
    abstract = "Recently, there has been a growing interest in human-robot interaction. Researchers in artificial intelligence and robotics have built various types of social robots that can express emotions through speech, facial expressions and hand gestures. Although some of these robots are able to interact with humans in interesting ways, they cannot move as naturally as we do because of the limited number of degrees of freedom in their torsos (some of them do not even have a torso). Since we often express and perceive each other's emotions and motives at a distance using body language alone, it would be good for the next generation of humanoid robots to possess similar capabilities. As a first step towards this goal, we developed a 28-DOF full-body humanoid robot as an experimental platform. Unlike the current generation of humanoid robots, our robot has a flexible spine. This feature is very important because counterbalancing movements of the spine are required to maintain dynamic stability in humans and humanoid robots. Our robot can belly dance and communicate affective motions via full-body movements. Using a Central Pattern Generator (CPG) based controller, we generated rhythmic motions for the arms and the upper and lower body. We then conducted psychological experiments using both the robot and a human actor. Statistical analyses were carried out to test our hypotheses on human perception of affective movements. Experimental results show that human subjects were able to perceive emotions from the robot based only on its body motions, sometimes as accurately as when the same movements were performed by the human actor. Our robot can be used to examine the relationship between movements of the spine, shoulders, arms, neck and head when reproducing affective movements. Psychologists, actors, dancers and animators can benefit from this line of research by learning how emotions can be conveyed through body motions and how body part movements combine to communicate emotional expressions.",
    keywords = "Body language, Body postures, Emotional belly dancing, Flexible spine humanoid robot, Full-body, Human-robot interaction",
    author = "Jimmy Or and Atsuo Takanishi",
    year = "2007",
    month = "3",
    doi = "10.1142/S0219843607000935",
    language = "English",
    volume = "4",
    pages = "21--48",
    journal = "International Journal of Humanoid Robotics",
    issn = "0219-8436",
    publisher = "World Scientific Publishing Co. Pte Ltd",
    number = "1",

    }

    TY - JOUR

    T1 - Effect of a flexible spine emotional belly dancing robot on human perceptions

    AU - Or, Jimmy

    AU - Takanishi, Atsuo

    PY - 2007/3

    Y1 - 2007/3

    N2 - Recently, there has been a growing interest in human-robot interaction. Researchers in artificial intelligence and robotics have built various types of social robots that can express emotions through speech, facial expressions and hand gestures. Although some of these robots are able to interact with humans in interesting ways, they cannot move as naturally as we do because of the limited number of degrees of freedom in their torsos (some of them do not even have a torso). Since we often express and perceive each other's emotions and motives at a distance using body language alone, it would be good for the next generation of humanoid robots to possess similar capabilities. As a first step towards this goal, we developed a 28-DOF full-body humanoid robot as an experimental platform. Unlike the current generation of humanoid robots, our robot has a flexible spine. This feature is very important because counterbalancing movements of the spine are required to maintain dynamic stability in humans and humanoid robots. Our robot can belly dance and communicate affective motions via full-body movements. Using a Central Pattern Generator (CPG) based controller, we generated rhythmic motions for the arms and the upper and lower body. We then conducted psychological experiments using both the robot and a human actor. Statistical analyses were carried out to test our hypotheses on human perception of affective movements. Experimental results show that human subjects were able to perceive emotions from the robot based only on its body motions, sometimes as accurately as when the same movements were performed by the human actor. Our robot can be used to examine the relationship between movements of the spine, shoulders, arms, neck and head when reproducing affective movements. Psychologists, actors, dancers and animators can benefit from this line of research by learning how emotions can be conveyed through body motions and how body part movements combine to communicate emotional expressions.

    AB - Recently, there has been a growing interest in human-robot interaction. Researchers in artificial intelligence and robotics have built various types of social robots that can express emotions through speech, facial expressions and hand gestures. Although some of these robots are able to interact with humans in interesting ways, they cannot move as naturally as we do because of the limited number of degrees of freedom in their torsos (some of them do not even have a torso). Since we often express and perceive each other's emotions and motives at a distance using body language alone, it would be good for the next generation of humanoid robots to possess similar capabilities. As a first step towards this goal, we developed a 28-DOF full-body humanoid robot as an experimental platform. Unlike the current generation of humanoid robots, our robot has a flexible spine. This feature is very important because counterbalancing movements of the spine are required to maintain dynamic stability in humans and humanoid robots. Our robot can belly dance and communicate affective motions via full-body movements. Using a Central Pattern Generator (CPG) based controller, we generated rhythmic motions for the arms and the upper and lower body. We then conducted psychological experiments using both the robot and a human actor. Statistical analyses were carried out to test our hypotheses on human perception of affective movements. Experimental results show that human subjects were able to perceive emotions from the robot based only on its body motions, sometimes as accurately as when the same movements were performed by the human actor. Our robot can be used to examine the relationship between movements of the spine, shoulders, arms, neck and head when reproducing affective movements. Psychologists, actors, dancers and animators can benefit from this line of research by learning how emotions can be conveyed through body motions and how body part movements combine to communicate emotional expressions.

    KW - Body language

    KW - Body postures

    KW - Emotional belly dancing

    KW - Flexible spine humanoid robot

    KW - Full-body

    KW - Human-robot interaction

    UR - http://www.scopus.com/inward/record.url?scp=34248549403&partnerID=8YFLogxK

    UR - http://www.scopus.com/inward/citedby.url?scp=34248549403&partnerID=8YFLogxK

    U2 - 10.1142/S0219843607000935

    DO - 10.1142/S0219843607000935

    M3 - Article

    AN - SCOPUS:34248549403

    VL - 4

    SP - 21

    EP - 48

    JO - International Journal of Humanoid Robotics

    JF - International Journal of Humanoid Robotics

    SN - 0219-8436

    IS - 1

    ER -