Abstract
This article discusses several boosting methods, which are notable implementations of ensemble learning. Starting from the earliest method, "boosting by filtering", an embodiment of the proverb "Two heads are better than one", the more advanced boosting methods AdaBoost and U-Boost are introduced. The geometrical structure and statistical properties of boosting algorithms, such as consistency and robustness, are discussed, and simulation studies are presented to confirm the behaviors described.
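As a rough illustration of the discrete AdaBoost algorithm the abstract refers to, the following is a minimal sketch using decision stumps as weak learners; the function names and the toy dataset are illustrative assumptions, not taken from the article itself.

```python
import math

def stump_predict(x, feature, threshold, polarity):
    """Weak learner: a decision stump that thresholds one feature."""
    return polarity if x[feature] > threshold else -polarity

def train_stump(X, y, w):
    """Pick the stump minimizing weighted classification error."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(xi, f, t, pol) != yi)
                if err < best_err:
                    best_err, best = err, (f, t, pol)
    return best, best_err

def adaboost(X, y, n_rounds=10):
    """Discrete AdaBoost: reweight examples, combine stumps linearly."""
    n = len(X)
    w = [1.0 / n] * n            # start with uniform example weights
    ensemble = []
    for _ in range(n_rounds):
        (f, t, pol), err = train_stump(X, y, w)
        if err >= 0.5:           # weak learner no better than chance: stop
            break
        err = max(err, 1e-12)
        alpha = 0.5 * math.log((1 - err) / err)   # this stump's vote weight
        ensemble.append((alpha, f, t, pol))
        # Exponential reweighting: misclassified examples gain weight.
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, f, t, pol))
             for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the weighted vote of all stumps in the ensemble."""
    score = sum(a * stump_predict(x, f, t, pol) for a, f, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

On the one-dimensional label pattern + + − − + +, no single stump separates the classes, but three boosting rounds classify every training point correctly, showing how the weighted combination is stronger than any individual weak learner.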
Original language | English |
---|---|
Pages (from-to) | 117-141 |
Number of pages | 25 |
Journal | New Generation Computing |
Volume | 25 |
Issue number | 1 |
DOIs | |
Publication status | Published - 2007 Jan 24 |
Keywords
- Boosting
- Classification problem
- Large-scale learning machine
- Statistical learning theory
ASJC Scopus subject areas
- Software
- Theoretical Computer Science
- Hardware and Architecture
- Computer Networks and Communications