Interactive virtual reality speech simulation system using autonomous audience with natural non-verbal behavior

Justin Andrew Liao, Nobuyuki Jincho, Hideaki Kikuchi

    Research output: Contribution to journal › Article

    Abstract

    Public speaking anxiety (PSA) is a fear of speaking in front of others. Most people experience a certain amount of anxiety in public speaking situations. This study aims to help people overcome PSA using an interactive VR simulation system with real-life scenarios. We present a multimodal VR speech simulation system that uses an autonomous audience with natural non-verbal behavior to enhance users' sense of presence. Additionally, real-time multimodal feedback is produced by the virtual audience based on users' public speaking behavior, which is automatically analyzed by multimodal sensors (e.g., microphone, motion capture, heart rate monitor). We perform an evaluation based on self-assessment questionnaires and biometry to investigate three study conditions: (I) control condition (baseline), (II) interactive virtual audience, and (III) virtual audience with natural non-verbal behavior. We divided participants into two groups: the interactive virtual audience condition (n = 7) and the virtual audience with natural non-verbal behavior condition (n = 9). The results indicate that the virtual audience with natural non-verbal behavior elicited a higher sense of presence and was more anxiety-provoking.
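    As a rough illustration of the sense-analyze-react loop described above (multimodal sensors analyze the speaker in real time and the virtual audience responds with non-verbal behavior), the following Python sketch shows one hypothetical mapping from sensor readings to audience reactions. The sensor fields, thresholds, and reaction names are assumptions made here for illustration only, not the authors' implementation.

    # Minimal sketch of a real-time multimodal feedback loop (illustrative only).
    from dataclasses import dataclass
    import random
    import time

    @dataclass
    class SensorFrame:
        voice_level_db: float    # microphone: estimated speech level
        head_movement: float     # motion capture: normalized movement, 0-1
        heart_rate_bpm: float    # heart rate monitor

    def read_sensors() -> SensorFrame:
        # Stand-in for real sensor drivers; returns simulated readings.
        return SensorFrame(
            voice_level_db=random.uniform(40.0, 75.0),
            head_movement=random.random(),
            heart_rate_bpm=random.uniform(65.0, 110.0),
        )

    def audience_reactions(frame: SensorFrame) -> list:
        # Map the speaker's behavior to non-verbal audience reactions
        # (thresholds and reaction names are assumptions).
        reactions = []
        if frame.voice_level_db < 50.0:      # speaking too quietly
            reactions.append("lean_forward")
        if frame.head_movement < 0.2:        # very little body/head movement
            reactions.append("look_away")
        if frame.heart_rate_bpm > 100.0:     # visible speaker stress
            reactions.append("whisper_to_neighbor")
        return reactions or ["nod_attentively"]

    if __name__ == "__main__":
        for _ in range(5):                   # a few iterations of the loop
            frame = read_sensors()
            print(frame, "->", audience_reactions(frame))
            time.sleep(0.5)                  # assumed ~2 Hz feedback update

    In the actual system, the reaction labels would drive the virtual audience's animations in the VR scene; the loop structure above is only meant to make the described sensing-to-feedback pipeline concrete.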

    Original language: English
    Pages (from-to): 404-407
    Number of pages: 4
    Journal: International Journal of Machine Learning and Computing
    Volume: 8
    Issue number: 4
    DOIs: 10.18178/ijmlc.2018.8.4.720
    Publication status: Published - 2018 Aug 1

    Fingerprint

    Microphones
    Virtual reality
    Feedback
    Sensors
    System simulation
    Anxiety

    Keywords

    • Autonomous virtual audience
    • Non-verbal behavior
    • Public speaking anxiety
    • Virtual reality

    ASJC Scopus subject areas

    • Computer Science Applications
    • Information Systems and Management
    • Artificial Intelligence

    Cite this

    Interactive virtual reality speech simulation system using autonomous audience with natural non-verbal behavior. / Liao, Justin Andrew; Jincho, Nobuyuki; Kikuchi, Hideaki.

    In: International Journal of Machine Learning and Computing, Vol. 8, No. 4, 01.08.2018, p. 404-407.

    @article{039a80b6a43545e5b6fece7f256400f9,
    title = "Interactive virtual reality speech simulation system using autonomous audience with natural non-verbal behavior",
    abstract = "Public speaking anxiety (PSA) is a fear of speaking in front of others. Most people experience a certain amount of anxiety in public speaking situations. This study aims to help people overcome PSA using an interactive VR simulation system with real-life scenarios. We present a multimodal VR speech simulation system that uses an autonomous audience with natural non-verbal behavior to enhance users' sense of presence. Additionally, real-time multimodal feedback is produced by the virtual audience based on users' public speaking behavior, which is automatically analyzed by multimodal sensors (e.g., microphone, motion capture, heart rate monitor). We perform an evaluation based on self-assessment questionnaires and biometry to investigate three study conditions: (I) control condition (baseline), (II) interactive virtual audience, and (III) virtual audience with natural non-verbal behavior. We divided participants into two groups: the interactive virtual audience condition (n = 7) and the virtual audience with natural non-verbal behavior condition (n = 9). The results indicate that the virtual audience with natural non-verbal behavior elicited a higher sense of presence and was more anxiety-provoking.",
    keywords = "Autonomous virtual audience, Non-verbal behavior, Public speaking anxiety, Virtual reality",
    author = "Liao, {Justin Andrew} and Nobuyuki Jincho and Hideaki Kikuchi",
    year = "2018",
    month = "8",
    day = "1",
    doi = "10.18178/ijmlc.2018.8.4.720",
    language = "English",
    volume = "8",
    pages = "404--407",
    journal = "International Journal of Machine Learning and Computing",
    issn = "2010-3700",
    publisher = "International Association of Computer Science and Information Technology",
    number = "4",

    }
