Unsupervised Weight Parameter Estimation Method for Ensemble Learning

Masato Uchida, Yousuke Maehara, Hiroyuki Shioya

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

When there are multiple trained predictors, one may want to integrate them into one predictor. However, this is challenging if the performances of the trained predictors are unknown and labeled data for evaluating their performances are not given. In this paper, a method is described that uses unlabeled data to estimate the weight parameters needed to build an ensemble predictor integrating multiple trained component predictors. It is readily derived from a mathematical model of ensemble learning based on a generalized mixture of probability density functions and corresponding information divergence measures. Numerical experiments demonstrated that the performance of our method is much better than that of simple average-based ensemble learning, even when the assumption placed on the performances of the component predictors does not hold exactly.
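The "generalized mixture" mentioned in the abstract can be illustrated by the exponential (geometric) mixture form p_w(x) ∝ ∏_i p_i(x)^{w_i}, which appears in the paper's keywords. The sketch below is only an illustration of that mixture form under hypothetical Gaussian components and hand-picked weights; it is not the paper's unsupervised weight-estimation procedure. For Gaussian components the exponential mixture is again Gaussian, with precision equal to the weighted sum of component precisions:

```python
import numpy as np

def exponential_mixture_gaussian(means, variances, weights):
    """Combine Gaussian densities p_i = N(mu_i, var_i) via the exponential
    mixture p_w(x) proportional to prod_i p_i(x)**w_i (weights sum to 1).
    For Gaussians the result is Gaussian: its precision is the weighted sum
    of component precisions, and its mean is the precision-weighted mean."""
    weights = np.asarray(weights, dtype=float)
    assert np.isclose(weights.sum(), 1.0), "weights must sum to 1"
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    prec = np.sum(weights * precisions)                 # combined precision
    mean = np.sum(weights * precisions * means) / prec  # combined mean
    return mean, 1.0 / prec

# Two hypothetical component predictors; the first is more reliable
# (smaller variance), so it receives the larger weight.
mu, var = exponential_mixture_gaussian(
    means=[1.0, 3.0], variances=[0.5, 2.0], weights=[0.8, 0.2])
```

Giving all components equal weight recovers a simple (geometric) average of the component densities, which is the baseline the paper's method is compared against.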

Original language: English
Pages (from-to): 307-322
Number of pages: 16
Journal: Journal of Mathematical Modelling and Algorithms
Volume: 10
Issue number: 4
DOI: 10.1007/s10852-011-9157-1
Publication status: Published - Dec 2011
Externally published: Yes

Keywords

  • Ensemble learning
  • Exponential mixture model
  • Kullback-Leibler divergence
  • Unsupervised learning

ASJC Scopus subject areas

  • Modelling and Simulation
  • Applied Mathematics

Cite this

Unsupervised Weight Parameter Estimation Method for Ensemble Learning. / Uchida, Masato; Maehara, Yousuke; Shioya, Hiroyuki.

In: Journal of Mathematical Modelling and Algorithms, Vol. 10, No. 4, 12.2011, p. 307-322.

Research output: Contribution to journal › Article

@article{6a289e86442849d08788069f527c153c,
title = "Unsupervised Weight Parameter Estimation Method for Ensemble Learning",
abstract = "When there are multiple trained predictors, one may want to integrate them into one predictor. However, this is challenging if the performances of the trained predictors are unknown and labeled data for evaluating their performances are not given. In this paper, a method is described that uses unlabeled data to estimate the weight parameters needed to build an ensemble predictor integrating multiple trained component predictors. It is readily derived from a mathematical model of ensemble learning based on a generalized mixture of probability density functions and corresponding information divergence measures. Numerical experiments demonstrated that the performance of our method is much better than that of simple average-based ensemble learning, even when the assumption placed on the performances of the component predictors does not hold exactly.",
keywords = "Ensemble learning, Exponential mixture model, Kullback-Leibler divergence, Unsupervised learning",
author = "Masato Uchida and Yousuke Maehara and Hiroyuki Shioya",
year = "2011",
month = "12",
doi = "10.1007/s10852-011-9157-1",
language = "English",
volume = "10",
pages = "307--322",
journal = "Journal of Mathematical Modelling and Algorithms",
issn = "1570-1166",
publisher = "Springer Netherlands",
number = "4",
}

TY  - JOUR
T1  - Unsupervised Weight Parameter Estimation Method for Ensemble Learning
AU  - Uchida, Masato
AU  - Maehara, Yousuke
AU  - Shioya, Hiroyuki
PY  - 2011/12
Y1  - 2011/12
N2  - When there are multiple trained predictors, one may want to integrate them into one predictor. However, this is challenging if the performances of the trained predictors are unknown and labeled data for evaluating their performances are not given. In this paper, a method is described that uses unlabeled data to estimate the weight parameters needed to build an ensemble predictor integrating multiple trained component predictors. It is readily derived from a mathematical model of ensemble learning based on a generalized mixture of probability density functions and corresponding information divergence measures. Numerical experiments demonstrated that the performance of our method is much better than that of simple average-based ensemble learning, even when the assumption placed on the performances of the component predictors does not hold exactly.
AB  - When there are multiple trained predictors, one may want to integrate them into one predictor. However, this is challenging if the performances of the trained predictors are unknown and labeled data for evaluating their performances are not given. In this paper, a method is described that uses unlabeled data to estimate the weight parameters needed to build an ensemble predictor integrating multiple trained component predictors. It is readily derived from a mathematical model of ensemble learning based on a generalized mixture of probability density functions and corresponding information divergence measures. Numerical experiments demonstrated that the performance of our method is much better than that of simple average-based ensemble learning, even when the assumption placed on the performances of the component predictors does not hold exactly.
KW  - Ensemble learning
KW  - Exponential mixture model
KW  - Kullback-Leibler divergence
KW  - Unsupervised learning
UR  - http://www.scopus.com/inward/record.url?scp=82355169807&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=82355169807&partnerID=8YFLogxK
U2  - 10.1007/s10852-011-9157-1
DO  - 10.1007/s10852-011-9157-1
M3  - Article
AN  - SCOPUS:82355169807
VL  - 10
SP  - 307
EP  - 322
JO  - Journal of Mathematical Modelling and Algorithms
JF  - Journal of Mathematical Modelling and Algorithms
SN  - 1570-1166
IS  - 4
ER  -