The most robust loss function for boosting

Takafumi Kanamori, Takashi Takenouchi, Shinto Eguchi, Noboru Murata

    Research output: Contribution to journal › Article

    10 Citations (Scopus)

    Abstract

    A boosting algorithm can be understood as gradient descent on a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is seriously affected by outliers. In this paper, loss functions for robust boosting are studied. Based on a concept from robust statistics, we propose a positive-part truncation of the loss function which makes the boosting algorithm robust against extreme outliers. Numerical experiments show that the proposed boosting algorithm is useful for highly noisy data in comparison with other competitors.
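
    The abstract describes the technique only at a high level: boosting is viewed as gradient descent on a loss function, and a truncation of that loss is used to bound the influence of outliers. The sketch below is illustrative only and does not reproduce the loss derived in the paper; it assumes, for concreteness, that the truncation is applied to AdaBoost's exponential loss, and the function names exp_loss and truncated_exp_loss and the threshold parameter are hypothetical. The simple capping shown here is a stand-in for the paper's positive-part truncation, whose exact form is not given in the abstract.

    # Illustrative sketch, not the authors' exact loss function.
    # It caps AdaBoost's exponential loss once the margin y*f(x) falls
    # below a (hypothetical) threshold, so extreme outliers contribute
    # only a bounded penalty to the gradient-descent objective.
    import numpy as np

    def exp_loss(margin):
        """AdaBoost's exponential loss; it grows without bound as the margin decreases."""
        return np.exp(-margin)

    def truncated_exp_loss(margin, threshold=1.0):
        """Assumed truncated variant: the loss is capped at exp(threshold)."""
        return np.minimum(np.exp(-margin), np.exp(threshold))

    if __name__ == "__main__":
        margins = np.array([-5.0, -1.0, 0.0, 1.0, 3.0])  # margins y * f(x)
        for m, e, t in zip(margins, exp_loss(margins), truncated_exp_loss(margins)):
            print(f"margin {m:5.1f}: exp loss {e:8.3f}, truncated {t:8.3f}")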

    Original language: English
    Pages (from-to): 496-501
    Number of pages: 6
    Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 3316
    Publication status: Published - 2004

    Fingerprint

    Boosting
    Loss Function
    Outlier
    Robust Statistics
    Adaptive boosting
    Descent Algorithm
    AdaBoost
    Robust Algorithm
    Gradient Algorithm
    Gradient Descent
    Noisy Data
    Truncation
    Extremes
    Numerical Experiment
    Statistics
    Experiments

    ASJC Scopus subject areas

    • Computer Science (all)
    • Biochemistry, Genetics and Molecular Biology (all)
    • Theoretical Computer Science

    Cite this

    The most robust loss function for boosting. / Kanamori, Takafumi; Takenouchi, Takashi; Eguchi, Shinto; Murata, Noboru.

    In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 3316, 2004, p. 496-501.

    Research output: Contribution to journal › Article

    @article{6811686b27a64cb9acc66f72fd1d7b49,
    title = "The most robust loss function for boosting",
    abstract = "A boosting algorithm can be understood as gradient descent on a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is seriously affected by outliers. In this paper, loss functions for robust boosting are studied. Based on a concept from robust statistics, we propose a positive-part truncation of the loss function which makes the boosting algorithm robust against extreme outliers. Numerical experiments show that the proposed boosting algorithm is useful for highly noisy data in comparison with other competitors.",
    author = "Takafumi Kanamori and Takashi Takenouchi and Shinto Eguchi and Noboru Murata",
    year = "2004",
    language = "English",
    volume = "3316",
    pages = "496--501",
    journal = "Lecture Notes in Computer Science",
    issn = "0302-9743",
    publisher = "Springer Verlag",

    }

    TY - JOUR

    T1 - The most robust loss function for boosting

    AU - Kanamori, Takafumi

    AU - Takenouchi, Takashi

    AU - Eguchi, Shinto

    AU - Murata, Noboru

    PY - 2004

    Y1 - 2004

    N2 - A boosting algorithm can be understood as gradient descent on a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is seriously affected by outliers. In this paper, loss functions for robust boosting are studied. Based on a concept from robust statistics, we propose a positive-part truncation of the loss function which makes the boosting algorithm robust against extreme outliers. Numerical experiments show that the proposed boosting algorithm is useful for highly noisy data in comparison with other competitors.

    AB - A boosting algorithm can be understood as gradient descent on a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is seriously affected by outliers. In this paper, loss functions for robust boosting are studied. Based on a concept from robust statistics, we propose a positive-part truncation of the loss function which makes the boosting algorithm robust against extreme outliers. Numerical experiments show that the proposed boosting algorithm is useful for highly noisy data in comparison with other competitors.

    UR - http://www.scopus.com/inward/record.url?scp=35048894955&partnerID=8YFLogxK

    UR - http://www.scopus.com/inward/citedby.url?scp=35048894955&partnerID=8YFLogxK

    M3 - Article

    AN - SCOPUS:35048894955

    VL - 3316

    SP - 496

    EP - 501

    JO - Lecture Notes in Computer Science

    JF - Lecture Notes in Computer Science

    SN - 0302-9743

    ER -