Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

When multiple component predictors are available, it is promising to integrate them into a single predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters of the exponential mixture model are difficult to estimate when no data are available for performance evaluation. As a suboptimal way to solve this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, defined as an equilibrium point with respect to the distance from and to all component probability distributions. In this paper, we propose a weight parameter estimation method that realizes this concept using the symmetric Kullback-Leibler divergence and discuss the features of the method.
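The abstract does not spell out the estimation procedure, so the following is only a minimal sketch of the idea for discrete distributions. It assumes the exponential mixture p_w(x) ∝ ∏_i p_i(x)^{w_i} with weights on the simplex, and reads the "balance point" as weights for which the symmetric Kullback-Leibler divergences between the mixture and each component are equalized; the function names (exp_mixture, sym_kl, estimate_weights) and the variance-minimization criterion are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import minimize

def exp_mixture(ps, w):
    """Exponential (geometric) mixture of discrete distributions ps with weights w."""
    log_mix = w @ np.log(ps)               # sum_i w_i * log p_i(x)
    mix = np.exp(log_mix - log_mix.max())  # subtract max for numerical stability
    return mix / mix.sum()                 # renormalize

def sym_kl(p, q):
    """Symmetric Kullback-Leibler divergence between discrete distributions p and q."""
    return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

def estimate_weights(ps):
    """Hypothetical balancing criterion: pick simplex weights that make the symmetric
    KL divergences between the mixture and each component as equal as possible."""
    k = ps.shape[0]

    def objective(theta):
        w = np.exp(theta) / np.exp(theta).sum()   # softmax keeps w on the simplex
        mix = exp_mixture(ps, w)
        d = np.array([sym_kl(mix, p) for p in ps])
        return np.var(d)                          # zero variance = equal distances

    res = minimize(objective, np.zeros(k), method="Nelder-Mead")
    return np.exp(res.x) / np.exp(res.x).sum()

# Toy example: three component distributions over a four-point support
ps = np.array([[0.7, 0.1, 0.1, 0.1],
               [0.1, 0.6, 0.2, 0.1],
               [0.2, 0.2, 0.3, 0.3]])
w = estimate_weights(ps)
print("estimated weights:", w)
print("exponential mixture:", exp_mixture(ps, w))
```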

Original language: English
Title of host publication: 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1126-1129
Number of pages: 4
ISBN (Electronic): 9781479959556
DOIs: https://doi.org/10.1109/SCIS-ISIS.2014.7044722
Publication status: Published - 2014 Feb 18
Externally published: Yes
Event: 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 - Kitakyushu, Japan
Duration: 2014 Dec 3 - 2014 Dec 6

Other

Other: 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014
Country: Japan
City: Kitakyushu
Period: 14/12/3 - 14/12/6


Keywords

  • ensemble learning
  • exponential mixture model
  • parameter estimation
  • symmetric Kullback-Leibler divergence

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Uchida, M. (2014). Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence. In 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 (pp. 1126-1129). [7044722] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/SCIS-ISIS.2014.7044722