Multiple kernel learning with gaussianity measures

Hideitsu Hino, Nima Reyhani, Noboru Murata

    Research output: Contribution to journal › Article

    5 Citations (Scopus)

    Abstract

    Kernel methods are known to be effective for nonlinear multivariate analysis. One of the main issues in the practical use of kernel methods is the selection of kernel. There have been a lot of studies on kernel selection and kernel learning. Multiple kernel learning (MKL) is one of the promising kernel optimization approaches. Kernel methods are applied to various classifiers including Fisher discriminant analysis (FDA). FDA gives the Bayes optimal classification axis if the data distribution of each class in the feature space is a gaussian with a shared covariance structure. Based on this fact, an MKL framework based on the notion of gaussianity is proposed. As a concrete implementation, an empirical characteristic function is adopted to measure gaussianity in the feature space associated with a convex combination of kernel functions, and two MKL algorithms are derived. From experimental results on some data sets, we show that the proposed kernel learning followed by FDA offers strong classification power.
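    The abstract names two ingredients: a convex combination of kernel matrices, and a gaussianity measure built from the empirical characteristic function. The paper's actual MKL algorithms are not reproduced here, but the two ingredients can be illustrated with a minimal sketch, assuming a one-dimensional projection of the data and a fixed frequency grid `ts` (both illustrative choices, not taken from the paper):

    ```python
    import numpy as np

    def ecf_gaussianity(x, ts=np.linspace(0.1, 2.0, 20)):
        """Deviation of the empirical characteristic function from a Gaussian's.

        After standardizing x, a Gaussian sample should have CF close to
        exp(-t^2 / 2); a smaller returned value means "more Gaussian".
        """
        x = np.asarray(x, dtype=float)
        x = (x - x.mean()) / x.std()
        # empirical CF: (1/n) * sum_j exp(i t x_j), evaluated on the grid ts
        ecf = np.array([np.mean(np.exp(1j * t * x)) for t in ts])
        gcf = np.exp(-ts ** 2 / 2)  # CF of the standard Gaussian
        return float(np.mean(np.abs(ecf - gcf) ** 2))

    def combined_kernel(kernels, weights):
        """Convex combination sum_m w_m K_m with w_m >= 0 and sum_m w_m = 1."""
        w = np.clip(np.asarray(weights, dtype=float), 0.0, None)
        w = w / w.sum()
        return sum(wm * K for wm, K in zip(w, kernels))
    ```

    In a full MKL procedure along the lines the abstract sketches, the weights of `combined_kernel` would be optimized so that the classes projected onto the (kernel) FDA axis score well under a measure like `ecf_gaussianity`; the optimization itself is omitted here.
    
    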

    Original language: English
    Pages (from-to): 1853-1881
    Number of pages: 29
    Journal: Neural Computation
    Volume: 24
    Issue number: 7
    Publication status: Published - 2012

    ASJC Scopus subject areas

    • Cognitive Neuroscience
    • Arts and Humanities (miscellaneous)

    Cite this

    Multiple kernel learning with gaussianity measures. / Hino, Hideitsu; Reyhani, Nima; Murata, Noboru.

    In: Neural Computation, Vol. 24, No. 7, 2012, p. 1853-1881.


    Hino, H, Reyhani, N & Murata, N 2012, 'Multiple kernel learning with gaussianity measures', Neural Computation, vol. 24, no. 7, pp. 1853-1881.
    Hino, Hideitsu; Reyhani, Nima; Murata, Noboru. / Multiple kernel learning with gaussianity measures. In: Neural Computation. 2012; Vol. 24, No. 7. pp. 1853-1881.
    @article{007251c31f5a48679198b32ae9e488ff,
    title = "Multiple kernel learning with gaussianity measures",
    abstract = "Kernel methods are known to be effective for nonlinear multivariate analysis. One of the main issues in the practical use of kernel methods is the selection of kernel. There have been a lot of studies on kernel selection and kernel learning. Multiple kernel learning (MKL) is one of the promising kernel optimization approaches. Kernel methods are applied to various classifiers including Fisher discriminant analysis (FDA). FDA gives the Bayes optimal classification axis if the data distribution of each class in the feature space is a gaussian with a shared covariance structure. Based on this fact, an MKL framework based on the notion of gaussianity is proposed. As a concrete implementation, an empirical characteristic function is adopted to measure gaussianity in the feature space associated with a convex combination of kernel functions, and two MKL algorithms are derived. From experimental results on some data sets, we show that the proposed kernel learning followed by FDA offers strong classification power.",
    author = "Hideitsu Hino and Nima Reyhani and Noboru Murata",
    year = "2012",
    language = "English",
    volume = "24",
    pages = "1853--1881",
    journal = "Neural Computation",
    issn = "0899-7667",
    publisher = "MIT Press Journals",
    number = "7",
    }

    TY - JOUR

    T1 - Multiple kernel learning with gaussianity measures

    AU - Hino, Hideitsu

    AU - Reyhani, Nima

    AU - Murata, Noboru

    PY - 2012

    Y1 - 2012

    N2 - Kernel methods are known to be effective for nonlinear multivariate analysis. One of the main issues in the practical use of kernel methods is the selection of kernel. There have been a lot of studies on kernel selection and kernel learning. Multiple kernel learning (MKL) is one of the promising kernel optimization approaches. Kernel methods are applied to various classifiers including Fisher discriminant analysis (FDA). FDA gives the Bayes optimal classification axis if the data distribution of each class in the feature space is a gaussian with a shared covariance structure. Based on this fact, an MKL framework based on the notion of gaussianity is proposed. As a concrete implementation, an empirical characteristic function is adopted to measure gaussianity in the feature space associated with a convex combination of kernel functions, and two MKL algorithms are derived. From experimental results on some data sets, we show that the proposed kernel learning followed by FDA offers strong classification power.

    AB - Kernel methods are known to be effective for nonlinear multivariate analysis. One of the main issues in the practical use of kernel methods is the selection of kernel. There have been a lot of studies on kernel selection and kernel learning. Multiple kernel learning (MKL) is one of the promising kernel optimization approaches. Kernel methods are applied to various classifiers including Fisher discriminant analysis (FDA). FDA gives the Bayes optimal classification axis if the data distribution of each class in the feature space is a gaussian with a shared covariance structure. Based on this fact, an MKL framework based on the notion of gaussianity is proposed. As a concrete implementation, an empirical characteristic function is adopted to measure gaussianity in the feature space associated with a convex combination of kernel functions, and two MKL algorithms are derived. From experimental results on some data sets, we show that the proposed kernel learning followed by FDA offers strong classification power.

    UR - http://www.scopus.com/inward/record.url?scp=84874025051&partnerID=8YFLogxK

    UR - http://www.scopus.com/inward/citedby.url?scp=84874025051&partnerID=8YFLogxK

    M3 - Article

    AN - SCOPUS:84874025051

    VL - 24

    SP - 1853

    EP - 1881

    JO - Neural Computation

    JF - Neural Computation

    SN - 0899-7667

    IS - 7

    ER -