Abstract
When there are multiple component predictors, it is promising to integrate them into one predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters used in the exponential mixture model are difficult to estimate when there are no training samples for performance evaluation. As a suboptimal way to solve this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, defined as an equilibrium point with respect to the distances from/to all the component probability distributions. In this paper, we propose a weight parameter estimation method that represents this concept using a symmetric Kullback-Leibler divergence, and we generalize this method.
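To make the idea concrete, here is a minimal numerical sketch, not taken from the paper: it assumes discrete component distributions over a finite alphabet, and the function names (`exp_mixture`, `sym_kl`, `balance_weights`), the multiplicative fixed-point update, and the toy data are all illustrative choices rather than the authors' estimation procedure. It forms the exponential mixture p_w(x) ∝ ∏_i p_i(x)^{w_i} and searches for weights at which the symmetric Kullback-Leibler divergences between the mixture and every component are equalized, i.e. an equilibrium ("balance") point.

```python
import numpy as np

def exp_mixture(components, w):
    """Exponential mixture: p_w(x) proportional to prod_i p_i(x)**w_i."""
    log_p = np.sum(w[:, None] * np.log(components), axis=0)
    p = np.exp(log_p - log_p.max())  # subtract max for numerical stability
    return p / p.sum()

def sym_kl(p, q):
    """Symmetric Kullback-Leibler divergence: KL(p||q) + KL(q||p)."""
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def balance_weights(components, n_iter=500, lr=0.1):
    """Heuristic search for weights at which the symmetric KL divergences
    from the mixture to all components are equal (an assumed illustrative
    update rule, not the paper's estimation method)."""
    m = len(components)
    w = np.full(m, 1.0 / m)
    for _ in range(n_iter):
        p = exp_mixture(components, w)
        d = np.array([sym_kl(p, q) for q in components])
        # Shift weight toward components the mixture is currently farther
        # from; at the equilibrium all distances d_i coincide.
        w = w * np.exp(lr * (d - d.mean()))
        w /= w.sum()
    return w, exp_mixture(components, w)

# Two toy component distributions over a 3-symbol alphabet.
comps = np.array([[0.7, 0.2, 0.1],
                  [0.2, 0.3, 0.5]])
w, p = balance_weights(comps)
print("weights:", w, "mixture:", p)
```

At convergence the two symmetric KL distances agree, which is the equilibrium-point condition the abstract describes; the paper's contribution is an estimation method realizing (and generalizing) this condition, for which the iteration above is only a stand-in.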
| Original language | English |
| --- | --- |
| Pages (from-to) | 2349-2353 |
| Number of pages | 5 |
| Journal | IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences |
| Volume | E98A |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - 2015 Nov |
| Externally published | Yes |
Keywords
- Ensemble learning
- Exponential mixture model
- Parameter estimation
- Symmetric Kullback-Leibler divergence
ASJC Scopus subject areas
- Signal Processing
- Computer Graphics and Computer-Aided Design
- Electrical and Electronic Engineering
- Applied Mathematics