Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence

Research output: Contribution to journal › Article

Abstract

When multiple component predictors are available, it is promising to integrate them into a single predictor for more advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters of the exponential mixture model are difficult to estimate when no training samples are available for performance evaluation. As a suboptimal way to solve this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, that is, an equilibrium point with respect to the divergence from/to all component probability distributions. In this paper, we propose a weight parameter estimation method that realizes this concept using the symmetric Kullback-Leibler divergence, and we generalize this method.
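
To make the balance-point idea concrete, here is a minimal numerical sketch in Python (NumPy/SciPy). It is an illustration of the abstract's criterion, not the estimation method derived in the paper: the function names (exp_mixture, sym_kl, estimate_weights) are hypothetical, and the equilibrium condition is read here as "the symmetric KL divergences between the mixture and every component are equal", enforced by minimizing their variance over the weight simplex.

    import numpy as np
    from scipy.optimize import minimize

    def exp_mixture(components, w):
        # Exponential (geometric) mixture of discrete distributions:
        # p_w(x) is proportional to prod_i p_i(x)**w_i, renormalized over the support.
        log_p = np.sum(w[:, None] * np.log(components), axis=0)
        p = np.exp(log_p - log_p.max())
        return p / p.sum()

    def sym_kl(p, q):
        # Symmetric Kullback-Leibler divergence J(p, q) = KL(p||q) + KL(q||p).
        return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

    def estimate_weights(components):
        # Hypothetical equilibrium criterion: search the weight simplex for the
        # point where J(p_w, p_i) is the same for every component p_i, by
        # minimizing the variance of those divergences.
        k = components.shape[0]

        def objective(v):
            w = np.exp(v) / np.exp(v).sum()   # softmax keeps w on the simplex
            p_w = exp_mixture(components, w)
            d = np.array([sym_kl(p_w, q) for q in components])
            return np.var(d)                  # zero when all divergences coincide

        res = minimize(objective, np.zeros(k), method="Nelder-Mead")
        return np.exp(res.x) / np.exp(res.x).sum()

    # Toy usage: three discrete component predictors over a 4-symbol alphabet.
    rng = np.random.default_rng(0)
    components = rng.dirichlet(np.ones(4), size=3)
    w = estimate_weights(components)
    print("estimated weights:", w)

The softmax reparameterization is only one convenient way to keep the weights non-negative and summing to one; the paper's generalized method may impose the equilibrium condition differently.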

Original language: English
Pages (from-to): 2349-2353
Number of pages: 5
Journal: IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Volume: E98A
Issue number: 11
DOIs
Publication status: Published - 2015 Nov 1
Externally published: Yes

Keywords

  • Ensemble learning
  • Exponential mixture model
  • Parameter estimation
  • Symmetric Kullback-Leibler divergence

ASJC Scopus subject areas

  • Signal Processing
  • Computer Graphics and Computer-Aided Design
  • Applied Mathematics
  • Electrical and Electronic Engineering
