Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence

Research output: Article

Abstract

When there are multiple component predictors, it is promising to integrate them into one predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters used in the exponential mixture model are difficult to estimate if there are no training samples for performance evaluation. As a suboptimal way to solve this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, defined as an equilibrium point with respect to the distance from/to all component probability distributions. In this paper, we propose a weight parameter estimation method that realizes this concept using the symmetric Kullback-Leibler divergence, and we generalize this method.
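The idea in the abstract can be sketched numerically. The snippet below is an illustrative toy, not the paper's estimation algorithm: for two hypothetical discrete component distributions, it forms the exponential mixture q(x) ∝ p1(x)^w · p2(x)^(1-w) and grid-searches for the weight at which q is equidistant, in symmetric Kullback-Leibler divergence, from both components (one plausible reading of the "balance point"). The distributions, support size, and grid resolution are all assumptions made for the example.

```python
import numpy as np

def sym_kl(p, q):
    # Symmetric Kullback-Leibler divergence: D(p||q) + D(q||p),
    # for strictly positive discrete distributions p and q.
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def exp_mixture(comps, w):
    # Exponential mixture: q(x) proportional to prod_i p_i(x)^{w_i},
    # normalized over the finite support.
    logq = sum(wi * np.log(p) for wi, p in zip(w, comps))
    q = np.exp(logq)
    return q / q.sum()

# Two hypothetical component distributions over a 5-point support.
p1 = np.array([0.40, 0.30, 0.15, 0.10, 0.05])
p2 = np.array([0.05, 0.10, 0.20, 0.30, 0.35])

# Grid search for the weight at which the mixture is "balanced",
# i.e. its symmetric KL distances to the two components coincide.
grid = np.linspace(0.01, 0.99, 99)
best_w = min(
    grid,
    key=lambda w: abs(
        sym_kl(p1, exp_mixture([p1, p2], [w, 1 - w]))
        - sym_kl(p2, exp_mixture([p1, p2], [w, 1 - w]))
    ),
)
q = exp_mixture([p1, p2], [best_w, 1 - best_w])
print(best_w, sym_kl(p1, q), sym_kl(p2, q))
```

With more than two components, the same balance criterion becomes a system of equalities over the weight simplex, which is where a proper optimization (rather than a grid search) would be needed.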

Original language: English
Pages (from-to): 2349-2353
Number of pages: 5
Journal: IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Volume: E98A
Issue number: 11
DOI
Publication status: Published - Nov 1 2015
Externally published: Yes

ASJC Scopus subject areas

  • Signal Processing
  • Computer Graphics and Computer-Aided Design
  • Applied Mathematics
  • Electrical and Electronic Engineering

