Robust loss functions for boosting

Takafumi Kanamori, Takashi Takenouchi, Shinto Eguchi, Noboru Murata

    Research output: Contribution to journal › Article

    27 Citations (Scopus)

    Abstract

    Boosting is known as a gradient descent algorithm over loss functions. It is often pointed out that the typical boosting algorithm, AdaBoost, is highly affected by outliers. In this letter, loss functions for robust boosting are studied. Based on concepts from robust statistics, we propose a transformation of loss functions that makes boosting algorithms robust against extreme outliers. Next, truncation of loss functions is applied to contamination models that describe the occurrence of mislabels near decision boundaries. Numerical experiments illustrate that the proposed loss functions derived from the contamination models handle highly noisy data better than other loss functions.
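
    A rough illustration of the mechanism the abstract describes (a minimal sketch under stated assumptions, not the paper's exact transformation): boosting reweights each example by the gradient of the loss with respect to the margin m = y f(x), so under AdaBoost's exponential loss exp(-m) a single badly mislabeled point can dominate the weight update, while a truncated loss caps that influence. The cap value and the hard-truncation rule below are illustrative choices, not taken from the paper.

        import numpy as np

        # Per-example "influence" (gradient magnitude of the loss with
        # respect to the margin m = y * f(x)) under AdaBoost's exponential
        # loss versus a hypothetical truncated variant.

        def exp_influence(margin):
            # |d/dm exp(-m)| = exp(-m): grows without bound as the margin
            # becomes very negative, i.e. for extreme outliers.
            return np.exp(-margin)

        def truncated_influence(margin, cap=5.0):
            # Illustrative truncation (assumed, not the authors' rule):
            # once the loss exceeds the cap, the loss is flattened and the
            # example contributes zero gradient.
            inf = np.exp(-margin)
            return np.where(inf <= cap, inf, 0.0)

        margins = np.array([2.0, 0.5, -0.5, -4.0])  # last entry: extreme outlier
        print(exp_influence(margins))        # outlier dominates: ~54.6
        print(truncated_influence(margins))  # outlier's influence removed

    In the second printout the extreme outlier no longer drives the weight update, which is the robustness effect the contamination-model losses aim for.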

    Original language: English
    Pages (from-to): 2183-2244
    Number of pages: 62
    Journal: Neural Computation
    ISSN: 0899-7667
    Publisher: MIT Press Journals
    Volume: 19
    Issue number: 8
    Publication status: Published - August 2007

    Fingerprint

    Contamination
    Adaptive boosting
    Robust statistics
    Gradient descent
    Outliers
    Experiments

    ASJC Scopus subject areas

    • Control and Systems Engineering
    • Artificial Intelligence
    • Neuroscience (all)

    Cite this

    Kanamori, T., Takenouchi, T., Eguchi, S., & Murata, N. (2007). Robust loss functions for boosting. Neural Computation, 19(8), 2183-2244.
