Independent component analysis minimizing convex divergence

Yasuo Matsuyama, Naoto Katsumata, Ryo Kawamura

    Research output: Contribution to journal › Article

    8 Citations (Scopus)

    Abstract

    A new class of learning algorithms for independent component analysis (ICA) is presented. Starting from theoretical discussions of convex divergence, this information measure is minimized to derive new ICA algorithms. Since convex divergence includes logarithmic information measures as special cases, the presented method yields faster algorithms than existing logarithmic ones. Another important feature of this paper's ICA algorithm is that it accepts supervisory information. This ability is used to reduce the permutation indeterminacy that is inherent in ordinary ICA, so that the most important activation pattern can be found as the top one. The total algorithm is tested through applications to brain-map distillation from functional MRI data. The derived algorithm is faster than logarithmic ones with little additional memory, and can find task-related brain maps successfully on a conventional personal computer.
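    The ideas in the abstract can be sketched numerically. The α-divergence below is one common member of the convex (Csiszár) divergence family and recovers the logarithmic Kullback-Leibler divergence in the limit α → 1; the test distributions and the correlation-based ordering against a supervisory reference signal are illustrative assumptions for this sketch, not the paper's exact formulation or learning rule.

    ```python
    import numpy as np

    def alpha_divergence(p, q, alpha):
        """Convex (alpha-)divergence between discrete distributions p and q.

        As alpha -> 1 this approaches the Kullback-Leibler divergence
        KL(p || q), illustrating how a logarithmic information measure
        arises as a special case of the convex family.  (alpha = 1 itself
        is excluded: the prefactor 4 / (1 - alpha**2) would divide by zero.)
        """
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return 4.0 / (1.0 - alpha**2) * (
            1.0 - np.sum(p ** ((1 + alpha) / 2) * q ** ((1 - alpha) / 2))
        )

    def kl_divergence(p, q):
        """Logarithmic (Kullback-Leibler) divergence KL(p || q)."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return float(np.sum(p * np.log(p / q)))

    def resolve_permutation(components, reference):
        """Order separated components by |correlation| with a supervisory
        reference signal, so the most task-related component is ranked first.
        A hypothetical stand-in for the paper's use of supervisory information
        against permutation indeterminacy."""
        corr = [abs(np.corrcoef(c, reference)[0, 1]) for c in components]
        return np.argsort(corr)[::-1]

    if __name__ == "__main__":
        p = np.array([0.2, 0.5, 0.3])
        q = np.array([0.3, 0.4, 0.3])
        # Near alpha = 1, the convex divergence is numerically close to KL(p || q).
        print(alpha_divergence(p, q, 0.999), kl_divergence(p, q))
    ```

    The sketch shows only the divergence family and the ranking idea; the paper's contribution is the full ICA learning rule derived by minimizing such a divergence, where choosing a non-logarithmic member is what reportedly yields the faster updates.
    
    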

    Original language: English
    Pages (from-to): 27-34
    Number of pages: 8
    Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 2714
    Publication status: Published - 2003


    ASJC Scopus subject areas

    • Computer Science (all)
    • Biochemistry, Genetics and Molecular Biology (all)
    • Theoretical Computer Science

    Cite this

    Independent component analysis minimizing convex divergence. / Matsuyama, Yasuo; Katsumata, Naoto; Kawamura, Ryo.

    In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 2714, 2003, p. 27-34.


    @article{9b3e92ccc3d94a7c94727859e115d79f,
    title = "Independent component analysis minimizing convex divergence",
    abstract = "A new class of learning algorithms for independent component analysis (ICA) is presented. Starting from theoretical discussions of convex divergence, this information measure is minimized to derive new ICA algorithms. Since convex divergence includes logarithmic information measures as special cases, the presented method yields faster algorithms than existing logarithmic ones. Another important feature of this paper's ICA algorithm is that it accepts supervisory information. This ability is used to reduce the permutation indeterminacy that is inherent in ordinary ICA, so that the most important activation pattern can be found as the top one. The total algorithm is tested through applications to brain-map distillation from functional MRI data. The derived algorithm is faster than logarithmic ones with little additional memory, and can find task-related brain maps successfully on a conventional personal computer.",
    author = "Yasuo Matsuyama and Naoto Katsumata and Ryo Kawamura",
    year = "2003",
    language = "English",
    volume = "2714",
    pages = "27--34",
    journal = "Lecture Notes in Computer Science",
    issn = "0302-9743",
    publisher = "Springer Verlag",

    }
