Abstract
Ensemble learning refers to machine learning methods that integrate multiple component predictors into a single predictor. In this paper, we derive a tight lower bound on the generalization error in ensemble learning through asymptotic analysis of a mathematical model in which the component predictors are combined as an exponential mixture of probability distributions and the error is measured by the Kullback-Leibler divergence. In addition, we derive an explicit expression for the weight parameter of the exponential mixture that minimizes this lower bound and discuss the properties of the derived weight parameter.
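The abstract does not reproduce the model itself, but an exponential mixture (log-linear combination) of K component distributions typically takes the following form; the symbols p_i, w, Z(w), and q below are our own notation, not necessarily the paper's:

```latex
% Exponential mixture of K component distributions p_1, ..., p_K
% (a sketch in assumed notation; the paper's exact formulation may differ).
\[
  p(x \mid w) \;=\; \frac{1}{Z(w)} \prod_{i=1}^{K} p_i(x)^{w_i},
  \qquad w_i \ge 0, \quad \sum_{i=1}^{K} w_i = 1,
\]
\[
  Z(w) \;=\; \int \prod_{i=1}^{K} p_i(x)^{w_i}\, dx
  \quad \text{(normalizing constant).}
\]
% The generalization error of the combined predictor with respect to the
% true distribution q is then measured by the Kullback-Leibler divergence:
\[
  D\bigl(q \,\|\, p(\cdot \mid w)\bigr)
  \;=\; \int q(x)\, \log \frac{q(x)}{p(x \mid w)}\, dx .
\]
```

Under this reading, the weight parameter derived in the paper is the w that minimizes a tight lower bound on this divergence in the asymptotic regime.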
Original language | English |
---|---|
Title of host publication | 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 1130-1133 |
Number of pages | 4 |
ISBN (Electronic) | 9781479959556 |
DOIs | |
Publication status | Published - 2014 Feb 18 |
Externally published | Yes |
Event | 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 - Kitakyushu, Japan |
Duration | 2014 Dec 3 → 2014 Dec 6 |
Other
Other | 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 |
---|---|
Country | Japan |
City | Kitakyushu |
Period | 2014 Dec 3 → 2014 Dec 6 |
Keywords
- asymptotic analysis
- ensemble learning
- exponential mixture model
- generalization error
- parameter estimation
ASJC Scopus subject areas
- Software
- Artificial Intelligence