Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence

Research output: Article

Abstract

When there are multiple component predictors, it is promising to integrate them into one predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters used in the exponential mixture model are difficult to estimate if there are no training samples for performance evaluation. As a suboptimal way to solve this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, that is, an equilibrium point with respect to the distances from/to all component probability distributions. In this paper, we propose a weight parameter estimation method that represents this concept using a symmetric Kullback-Leibler divergence, and we generalize this method.
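
The abstract describes the method only at a conceptual level, so the following is a minimal numerical sketch of one plausible reading of the balance-point idea, not the paper's actual algorithm. For two strictly positive discrete distributions, it forms the normalized exponential (log-linear) mixture p_w(x) ∝ p1(x)^w · p2(x)^(1-w) and bisects on w until the mixture is equidistant, in symmetric Kullback-Leibler divergence, from both components. The equidistance criterion, the restriction to two components, and all function names here are assumptions made for illustration.

import numpy as np

def exp_mixture(ps, w):
    # Normalized exponential (log-linear) mixture over a finite alphabet:
    # p_w(x) is proportional to prod_i p_i(x)**w[i]; assumes every p_i > 0.
    log_p = np.tensordot(w, np.log(ps), axes=1)
    log_p -= log_p.max()              # guard against overflow in exp
    p = np.exp(log_p)
    return p / p.sum()

def sym_kl(p, q):
    # Symmetric Kullback-Leibler divergence J(p, q) = KL(p||q) + KL(q||p).
    return float(np.sum((p - q) * (np.log(p) - np.log(q))))

def balance_weight(p1, p2, tol=1e-12):
    # Bisect on w in [0, 1] so that the mixture with weights (w, 1 - w) is
    # equidistant, in symmetric KL, from p1 and p2 (assumed balance condition).
    ps = np.stack([p1, p2])
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        w = 0.5 * (lo + hi)
        m = exp_mixture(ps, np.array([w, 1.0 - w]))
        if sym_kl(m, p1) > sym_kl(m, p2):
            lo = w                    # mixture too far from p1: increase w
        else:
            hi = w
    return 0.5 * (lo + hi)

# Toy example on a 4-letter alphabet.
p1 = np.array([0.40, 0.30, 0.20, 0.10])
p2 = np.array([0.10, 0.20, 0.30, 0.40])
w = balance_weight(p1, p2)
m = exp_mixture(np.stack([p1, p2]), np.array([w, 1.0 - w]))
print(w, sym_kl(m, p1), sym_kl(m, p2))   # the two divergences coincide

By symmetry of this toy example the bisection returns w = 0.5; for asymmetric components it lands wherever the two divergences coincide. The paper itself generalizes beyond this two-component setting.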

Original language: English
Pages (from-to): 2349-2353
Number of pages: 5
Journal: IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Volume: E98A
Issue number: 11
DOI: 10.1587/transfun.E98.A.2349
Publication status: Published - Nov 1 2015
Externally published: Yes

ASJC Scopus subject areas

  • Signal Processing
  • Computer Graphics and Computer-Aided Design
  • Applied Mathematics
  • Electrical and Electronic Engineering

Cite this

@article{bb40db1ddf444c9eb6898812f045cc79,
title = "Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric kullback-leibler divergence",
abstract = "When there are multiple component predictors, it is promising to integrate them into one predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, weight parameters used in the exponential mixture model are difficult to estimate if there is no training samples for performance evaluation. As a suboptimal way to solve this problem, weight parameters may be estimated so that the exponential mixture model should be a balance point that is defined as an equilibrium point with respect to the distance from/to all component probability distributions. In this paper, we propose a weight parameter estimation method that represents this concept using a symmetric Kullback-Leibler divergence and generalize this method.",
keywords = "Ensemble learning, Exponential mixture model, Parameter estimation, Symmetric Kullback-Leibler divergence",
author = "Masato Uchida",
year = "2015",
month = "11",
day = "1",
doi = "10.1587/transfun.E98.A.2349",
language = "English",
volume = "E98A",
pages = "2349--2353",
journal = "IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences",
issn = "0916-8508",
publisher = "Maruzen Co., Ltd/Maruzen Kabushikikaisha",
number = "11",

}

TY - JOUR

T1 - Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence

AU - Uchida, Masato

PY - 2015/11/1

Y1 - 2015/11/1

N2 - When there are multiple component predictors, it is promising to integrate them into one predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters used in the exponential mixture model are difficult to estimate if there are no training samples for performance evaluation. As a suboptimal way to solve this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, that is, an equilibrium point with respect to the distances from/to all component probability distributions. In this paper, we propose a weight parameter estimation method that represents this concept using a symmetric Kullback-Leibler divergence, and we generalize this method.

AB - When there are multiple component predictors, it is promising to integrate them into one predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters used in the exponential mixture model are difficult to estimate if there are no training samples for performance evaluation. As a suboptimal way to solve this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, that is, an equilibrium point with respect to the distances from/to all component probability distributions. In this paper, we propose a weight parameter estimation method that represents this concept using a symmetric Kullback-Leibler divergence, and we generalize this method.

KW - Ensemble learning

KW - Exponential mixture model

KW - Parameter estimation

KW - Symmetric Kullback-Leibler divergence

UR - http://www.scopus.com/inward/record.url?scp=84947998264&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84947998264&partnerID=8YFLogxK

U2 - 10.1587/transfun.E98.A.2349

DO - 10.1587/transfun.E98.A.2349

M3 - Article

AN - SCOPUS:84947998264

VL - E98A

SP - 2349

EP - 2353

JO - IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences

JF - IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences

SN - 0916-8508

IS - 11

ER -