Tight lower bound of generalization error in ensemble learning

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A machine learning method that integrates multiple component predictors into a single predictor is referred to as ensemble learning. In this paper, we derive a tight lower bound on the generalization error in ensemble learning through an asymptotic analysis of a mathematical model built on an exponential mixture of probability distributions and the Kullback-Leibler divergence. In addition, we derive an explicit expression for the weight parameter of the exponential mixture that minimizes this lower bound and discuss the properties of the derived weight parameter.
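For readers skimming this record, the following is a minimal sketch of the quantities the abstract refers to, assuming the standard normalized form of an exponential mixture and the usual KL-divergence notion of generalization error; the symbols p_i, w_i, Z(w), and q are introduced here for illustration and are not taken from the paper itself.

% Exponential (geometric) mixture of m component distributions
% p_1, ..., p_m with weights w_i >= 0, \sum_i w_i = 1
% (standard normalized form; the paper's exact conventions may differ):
\[
  p_w(x) = \frac{1}{Z(w)} \prod_{i=1}^{m} p_i(x)^{w_i},
  \qquad
  Z(w) = \int \prod_{i=1}^{m} p_i(x)^{w_i} \, dx .
\]
% Generalization error of the combined predictor p_w relative to
% the true distribution q, measured by the Kullback-Leibler divergence:
\[
  D(q \,\|\, p_w) = \int q(x) \log \frac{q(x)}{p_w(x)} \, dx .
\]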

Original language: English
Title of host publication: 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1130-1133
Number of pages: 4
ISBN (Electronic): 9781479959556
DOI: 10.1109/SCIS-ISIS.2014.7044723
Publication status: Published - 2014 Feb 18
Externally published: Yes
Event: 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 - Kitakyushu, Japan
Duration: 2014 Dec 3 - 2014 Dec 6


Fingerprint

  • Probability distributions
  • Asymptotic analysis
  • Learning systems
  • Mathematical models

Keywords

  • asymptotic analysis
  • ensemble learning
  • exponential mixture model
  • generalization error
  • parameter estimation

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Uchida, M. (2014). Tight lower bound of generalization error in ensemble learning. In 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 (pp. 1130-1133). [7044723] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/SCIS-ISIS.2014.7044723
