Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

When there are multiple component predictors, it is promising to integrate them into one predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters used in the exponential mixture model are difficult to estimate if no data are available for performance evaluation. As a suboptimal way to solve this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, defined as an equilibrium point with respect to the distance from/to all component probability distributions. In this paper, we propose a weight parameter estimation method that realizes this concept using a symmetric Kullback-Leibler divergence and discuss the features of the method.
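
The abstract describes the estimation idea only at a high level. The following is a minimal numerical sketch, not the paper's algorithm: it assumes discrete component distributions and interprets the "balance point" condition as equalizing the symmetric Kullback-Leibler divergence between the exponential mixture and every component, enforced here by minimizing the variance of those divergences over the weight simplex. The function names and the variance-based objective are illustrative assumptions.

# A hedged sketch, not the paper's method: estimate exponential-mixture weights
# so that the symmetric KL divergence from the mixture to every component
# distribution is (approximately) equal (a "balance point").
import numpy as np
from scipy.optimize import minimize

def exp_mixture(ps, w):
    """Normalized exponential (geometric) mixture: q(x) proportional to prod_i p_i(x)**w_i."""
    log_q = np.tensordot(w, np.log(ps), axes=1)  # sum_i w_i * log p_i(x)
    q = np.exp(log_q - log_q.max())              # subtract max for numerical stability
    return q / q.sum()

def sym_kl(p, q):
    """Symmetric Kullback-Leibler divergence D(p||q) + D(q||p)."""
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Three illustrative component distributions over five outcomes.
rng = np.random.default_rng(0)
ps = rng.dirichlet(np.ones(5), size=3)

def imbalance(u):
    # Softmax keeps the weights non-negative and summing to one.
    w = np.exp(u - u.max())
    w = w / w.sum()
    q = exp_mixture(ps, w)
    d = np.array([sym_kl(p, q) for p in ps])
    return np.var(d)  # zero variance <=> all divergences equal (equilibrium)

res = minimize(imbalance, np.zeros(len(ps)), method="Nelder-Mead")
w_hat = np.exp(res.x - res.x.max())
w_hat = w_hat / w_hat.sum()
print("estimated weights:", np.round(w_hat, 3))
print("symmetric KL to each component:",
      np.round([sym_kl(p, exp_mixture(ps, w_hat)) for p in ps], 4))

Under these assumptions the recovered weights make the mixture roughly equidistant, in symmetric KL, from all components; the actual estimator and its properties are those developed in the paper.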

Original language: English
Title of host publication: 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1126-1129
Number of pages: 4
ISBN (Electronic): 9781479959556
DOI: 10.1109/SCIS-ISIS.2014.7044722
Publication status: Published - 2014 Feb 18
Externally published: Yes
Event: 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 - Kitakyushu, Japan
Duration: 2014 Dec 3 - 2014 Dec 6

Keywords

  • ensemble learning
  • exponential mixture model
  • parameter estimation
  • symmetric Kullback-Leibler divergence

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Uchida, M. (2014). Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence. In 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 (pp. 1126-1129). [7044722] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/SCIS-ISIS.2014.7044722
