### Abstract

When there are multiple component predictors, it is promising to integrate them into a single predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters used in the exponential mixture model are difficult to estimate if there is no data for performance evaluation. As a suboptimal way to solve this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, defined as an equilibrium point with respect to the divergence from and to all component probability distributions. In this paper, we propose a weight parameter estimation method that represents this concept using a symmetric Kullback-Leibler divergence and discuss the features of this method.
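The idea in the abstract can be sketched numerically: form the exponential (geometric) mixture p(x) ∝ ∏ᵢ pᵢ(x)^wᵢ and look for weights at which the symmetric Kullback-Leibler divergence between the mixture and each component is as balanced as possible. The Python sketch below uses a simple multiplicative reweighting heuristic on discrete distributions; all function names are illustrative, and this is only a toy illustration of the balance-point concept, not the paper's estimator.

```python
import numpy as np

def exp_mixture(components, w):
    """Exponential mixture: p(x) proportional to prod_i p_i(x)**w_i.

    components: array of shape (k, n), each row a strictly positive
    discrete distribution; w: weight vector of shape (k,).
    """
    log_p = np.sum(w[:, None] * np.log(components), axis=0)
    p = np.exp(log_p - log_p.max())  # stabilize before normalizing
    return p / p.sum()

def sym_kl(p, q):
    """Symmetric Kullback-Leibler divergence KL(p||q) + KL(q||p)."""
    return float(np.sum((p - q) * (np.log(p) - np.log(q))))

def balance_weights(components, iters=2000, lr=0.05):
    """Heuristic search for an equilibrium point: shift weight toward
    components the mixture is still far from, narrowing the spread of
    the symmetric divergences (weights stay on the simplex)."""
    k = components.shape[0]
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        p = exp_mixture(components, w)
        d = np.array([sym_kl(p, q) for q in components])
        w = w * np.exp(lr * (d - d.mean()))  # pull toward distant components
        w = w / w.sum()
    return w
```

Note that on the probability simplex an exact equalization of all divergences need not be feasible; the heuristic then drives some weights toward the boundary while still narrowing the divergence spread relative to uniform weights.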

| Original language | English |
|---|---|
| Title of host publication | 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 1126-1129 |
| Number of pages | 4 |
| ISBN (Electronic) | 9781479959556 |
| DOIs | https://doi.org/10.1109/SCIS-ISIS.2014.7044722 |
| Publication status | Published - 2014 Feb 18 |
| Externally published | Yes |
| Event | 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 - Kitakyushu, Japan. Duration: 2014 Dec 3 → 2014 Dec 6 |

### Other

| Other | 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 |
|---|---|
| Country | Japan |
| City | Kitakyushu |
| Period | 14/12/3 → 14/12/6 |

### Keywords

- ensemble learning
- exponential mixture model
- parameter estimation
- symmetric Kullback-Leibler divergence

### ASJC Scopus subject areas

- Software
- Artificial Intelligence

### Cite this

Uchida, Masato. **Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence.** In *2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014* (pp. 1126-1129). [7044722] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/SCIS-ISIS.2014.7044722

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence

AU - Uchida, Masato

PY - 2014/2/18

Y1 - 2014/2/18

N2 - When there are multiple component predictors, it is promising to integrate them into one predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, weight parameters used in the exponential mixture model are difficult to estimate if there is no data for performance evaluation. As a suboptimal way to solve this problem, weight parameters may be estimated so that the exponential mixture model should be a balance point that is defined as an equilibrium point with respect to the distance from/to all component probability distributions. In this paper, we propose a weight parameter estimation method that represents this concept using a symmetric Kullback-Leibler divergence and discuss the features of this method.

KW - ensemble learning

KW - exponential mixture model

KW - parameter estimation

KW - symmetric Kullback-Leibler divergence

UR - http://www.scopus.com/inward/record.url?scp=84988288801&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84988288801&partnerID=8YFLogxK

U2 - 10.1109/SCIS-ISIS.2014.7044722

DO - 10.1109/SCIS-ISIS.2014.7044722

M3 - Conference contribution

AN - SCOPUS:84988288801

SP - 1126

EP - 1129

BT - 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014

PB - Institute of Electrical and Electronics Engineers Inc.

ER -