Abstract
A machine learning method that integrates multiple component predictors into a single predictor is referred to as ensemble learning. In this paper, we derive a tight lower bound on the generalization error in ensemble learning through asymptotic analysis of a mathematical model that combines an exponential mixture of probability distributions with the Kullback-Leibler divergence. In addition, we derive an explicit expression for the weight parameter of the exponential mixture that minimizes this lower bound and discuss the properties of the derived weight parameter.
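For orientation, the exponential mixture model mentioned in the abstract is usually written in the form sketched below, with the generalization error measured by the Kullback-Leibler divergence from the true distribution to the combined predictor. This is only a sketch of the standard formulation; the symbols q, p_i, w_i, Z(w), and G(w) are assumptions introduced here for illustration, not notation taken from the paper.

```latex
% Minimal sketch, assuming the standard exponential (geometric) mixture of
% K component predictive distributions p_1,...,p_K with weight vector w;
% the paper's exact notation and definitions may differ.
\[
  p_{w}(x) \;=\; \frac{1}{Z(w)} \prod_{i=1}^{K} p_i(x)^{w_i},
  \qquad w_i \ge 0,\quad \sum_{i=1}^{K} w_i = 1,
  \qquad Z(w) \;=\; \int \prod_{i=1}^{K} p_i(x)^{w_i}\, dx .
\]
% Generalization error as the Kullback--Leibler divergence from the true
% distribution q(x) to the ensemble predictor p_w(x):
\[
  G(w) \;=\; D_{\mathrm{KL}}\bigl(q \,\big\|\, p_{w}\bigr)
        \;=\; \int q(x)\,\log \frac{q(x)}{p_{w}(x)}\, dx .
\]
```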
Original language | English
---|---
Title of host publication | 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014
Publisher | Institute of Electrical and Electronics Engineers Inc.
Pages | 1130-1133
Number of pages | 4
ISBN (Electronic) | 9781479959556
DOI |
Publication status | Published - Feb 18, 2014
Externally published | Yes
Event | 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 - Kitakyushu, Japan. Duration: Dec 3, 2014 → Dec 6, 2014
Other
Other | 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 |
---|---
Country | Japan |
City | Kitakyushu |
Period | 14/12/3 → 14/12/6 |
ASJC Scopus subject areas
- Software
- Artificial Intelligence