Tutorial series on brain-inspired computing part 6

Geometrical structure of boosting algorithm

Takafumi Kanamori, Takashi Takenouchi, Noboru Murata

    Research output: Contribution to journal › Article

    1 Citation (Scopus)

    Abstract

    In this article, several boosting methods, which are notable implementations of ensemble learning, are discussed. Starting from the originally proposed "boosting by filter", an embodiment of the proverb "two heads are better than one", the more advanced boosting methods "AdaBoost" and "U-Boost" are introduced. The geometrical structure and statistical properties of boosting algorithms, such as consistency and robustness, are then discussed, and simulation studies are presented to confirm the described behaviors of the algorithms.
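    The abstract centers on AdaBoost. As an illustrative aside (this is not the authors' code; the function names and the decision-stump weak learner are my own choices for a minimal self-contained sketch), discrete AdaBoost for binary labels in {-1, +1} can be written as:

    ```python
    import numpy as np

    def adaboost_train(X, y, n_rounds=20):
        """Discrete AdaBoost with axis-aligned decision stumps.

        X: (n, d) feature matrix; y: labels in {-1, +1}.
        Returns a list of (feature, threshold, polarity, alpha) stumps.
        """
        n, d = X.shape
        w = np.full(n, 1.0 / n)           # start with uniform example weights
        stumps = []
        for _ in range(n_rounds):
            best, best_err = None, np.inf
            # exhaustive search for the stump with lowest weighted error
            for j in range(d):
                for thr in np.unique(X[:, j]):
                    for pol in (1, -1):
                        pred = pol * np.sign(X[:, j] - thr)
                        pred[pred == 0] = pol
                        err = w[pred != y].sum()
                        if err < best_err:
                            best_err, best = err, (j, thr, pol)
            eps = max(best_err, 1e-10)
            if eps >= 0.5:
                break                      # weak learner no better than chance
            alpha = 0.5 * np.log((1 - eps) / eps)
            j, thr, pol = best
            pred = pol * np.sign(X[:, j] - thr)
            pred[pred == 0] = pol
            # reweight: misclassified examples gain weight for the next round
            w *= np.exp(-alpha * y * pred)
            w /= w.sum()
            stumps.append((j, thr, pol, alpha))
        return stumps

    def adaboost_predict(stumps, X):
        """Weighted majority vote of the trained stumps."""
        score = np.zeros(X.shape[0])
        for j, thr, pol, alpha in stumps:
            pred = pol * np.sign(X[:, j] - thr)
            pred[pred == 0] = pol
            score += alpha * pred
        return np.sign(score)
    ```

    The exponential reweighting step is where the geometric (information-projection) view discussed in the article enters: each round performs a projection with respect to the exponential loss, of which U-Boost is a generalization to other loss functions.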

    Original language: English
    Pages (from-to): 117-141
    Number of pages: 25
    Journal: New Generation Computing
    Volume: 25
    Issue number: 1
    DOI: 10.1007/s00354-006-0006-0
    Publication status: Published - 2007


    Keywords

    • Boosting
    • Classification problem
    • Large-scale learning machine
    • Statistical learning theory

    ASJC Scopus subject areas

    • Hardware and Architecture
    • Theoretical Computer Science
    • Computational Theory and Mathematics

    Cite this

    Tutorial series on brain-inspired computing part 6: Geometrical structure of boosting algorithm. / Kanamori, Takafumi; Takenouchi, Takashi; Murata, Noboru.

    In: New Generation Computing, Vol. 25, No. 1, 2007, p. 117-141.


    @article{58c3450079a0478eab59c927186c4b25,
    title = "Tutorial series on brain-inspired computing part 6: Geometrical structure of boosting algorithm",
    abstract = "In this article, several boosting methods are discussed, which are notable implementations of the ensemble learning. Starting from the firstly introduced {"}boosting by filter{"} which is an embodiment of the proverb {"}Two heads are better than one{"}, more advanced versions of boosting methods {"}AdaBoost{"} and {"}U-Boost{"} are introduced. A geometrical structure and some statistical properties such as consistency and robustness of boosting algorithms are discussed, and then simulation studies are presented for confirming discussed behaviors of algorithms.",
    keywords = "Boosting, Classification problem, Large-scale learning machine, Statistical learning theory",
    author = "Takafumi Kanamori and Takashi Takenouchi and Noboru Murata",
    year = "2007",
    doi = "10.1007/s00354-006-0006-0",
    language = "English",
    volume = "25",
    pages = "117--141",
    journal = "New Generation Computing",
    issn = "0288-3635",
    publisher = "Springer Japan",
    number = "1",
    }

    TY  - JOUR
    T1  - Tutorial series on brain-inspired computing part 6
    T2  - Geometrical structure of boosting algorithm
    AU  - Kanamori, Takafumi
    AU  - Takenouchi, Takashi
    AU  - Murata, Noboru
    PY  - 2007
    Y1  - 2007
    N2  - In this article, several boosting methods are discussed, which are notable implementations of the ensemble learning. Starting from the firstly introduced "boosting by filter" which is an embodiment of the proverb "Two heads are better than one", more advanced versions of boosting methods "AdaBoost" and "U-Boost" are introduced. A geometrical structure and some statistical properties such as consistency and robustness of boosting algorithms are discussed, and then simulation studies are presented for confirming discussed behaviors of algorithms.
    AB  - In this article, several boosting methods are discussed, which are notable implementations of the ensemble learning. Starting from the firstly introduced "boosting by filter" which is an embodiment of the proverb "Two heads are better than one", more advanced versions of boosting methods "AdaBoost" and "U-Boost" are introduced. A geometrical structure and some statistical properties such as consistency and robustness of boosting algorithms are discussed, and then simulation studies are presented for confirming discussed behaviors of algorithms.
    KW  - Boosting
    KW  - Classification problem
    KW  - Large-scale learning machine
    KW  - Statistical learning theory
    UR  - http://www.scopus.com/inward/record.url?scp=33846302076&partnerID=8YFLogxK
    UR  - http://www.scopus.com/inward/citedby.url?scp=33846302076&partnerID=8YFLogxK
    U2  - 10.1007/s00354-006-0006-0
    DO  - 10.1007/s00354-006-0006-0
    M3  - Article
    VL  - 25
    SP  - 117
    EP  - 141
    JO  - New Generation Computing
    JF  - New Generation Computing
    SN  - 0288-3635
    IS  - 1
    ER  -