A modified EM algorithm for mixture models based on Bregman divergence

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

The EM algorithm is a sophisticated method for estimating statistical models with hidden variables, based on the Kullback-Leibler divergence. A natural extension of the Kullback-Leibler divergence is given by the class of Bregman divergences, which in general enjoy robustness against contaminated data in statistical inference. In this paper, a modification of the EM algorithm based on the Bregman divergence is proposed for estimating finite mixture models. The proposed algorithm is geometrically interpreted as a sequence of projections induced by the Bregman divergence. Since a rigorous algorithm includes a nonlinear optimization procedure, two simplification methods for reducing the computational difficulty are also discussed from a geometrical viewpoint. Numerical experiments on a toy problem are carried out to confirm the appropriateness of the simplifications.
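For context on the terms used in the abstract, a Bregman divergence generated by a differentiable, strictly convex function \(\phi\) has the standard form

\[
  d_{\phi}(x, y) \;=\; \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle .
\]

Taking \(\phi(p) = \sum_i p_i \log p_i\) on the probability simplex gives

\[
  d_{\phi}(p, q) \;=\; \sum_i p_i \log \frac{p_i}{q_i},
\]

i.e. the Kullback-Leibler divergence, the special case on which the classical EM algorithm is based. The notation here follows the usual textbook convention and is not necessarily the paper's own.

As a point of reference for the algorithm being modified, the sketch below implements the classical, KL-based EM iteration for a one-dimensional Gaussian mixture. It is only an illustration under standard assumptions: the function name, the toy data, and the contaminating outliers are made up for this record, and the code does not implement the Bregman-divergence modification or the two simplifications studied in the paper.

import numpy as np

def em_gaussian_mixture(x, n_components=2, n_iter=100, seed=0):
    """Classical (KL-based) EM for a 1-D Gaussian mixture; illustrative only."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    # Crude initialisation: random responsibilities summing to one per point.
    resp = rng.dirichlet(np.ones(n_components), size=n)
    for _ in range(n_iter):
        # M-step: re-estimate weights, means and variances from responsibilities.
        nk = resp.sum(axis=0)                    # effective component counts
        weights = nk / n
        means = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - means) ** 2).sum(axis=0) / nk + 1e-12
        # E-step: recompute responsibilities from the current parameters.
        log_pdf = -0.5 * (np.log(2 * np.pi * var) + (x[:, None] - means) ** 2 / var)
        log_joint = np.log(weights) + log_pdf
        log_norm = np.logaddexp.reduce(log_joint, axis=1, keepdims=True)
        resp = np.exp(log_joint - log_norm)
    return weights, means, var

# Toy data in the spirit of the abstract's "toy problem": two clusters
# plus a few contaminating outliers (all numbers are made up).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 0.5, 200),
                    rng.normal(3.0, 0.8, 200),
                    rng.normal(15.0, 0.1, 5)])
print(em_gaussian_mixture(x))

According to the abstract, the proposed modification replaces the Kullback-Leibler divergence underlying this iteration with a Bregman divergence and interprets the resulting updates as a sequence of projections; the sketch above shows only the classical baseline.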

Original language: English
Pages (from-to): 3-25
Number of pages: 23
Journal: Annals of the Institute of Statistical Mathematics
Volume: 59
Issue number: 1
DOIs: 10.1007/s10463-006-0097-x
Publication status: Published - 2007 Mar

Fingerprint

  • Bregman Divergence
  • EM Algorithm
  • Mixture Model
  • Kullback-Leibler Divergence
  • Model-based
  • Simplification
  • Finite Mixture Models
  • Hidden Variables
  • Natural Extension
  • Nonlinear Optimization
  • Statistical Inference
  • Contamination
  • Statistical Model
  • Numerical Experiment
  • Projection
  • Robustness

Keywords

  • Bregman divergence
  • EM algorithm
  • Finite mixture models

ASJC Scopus subject areas

  • Mathematics (all)
  • Statistics and Probability

Cite this

@article{f1df19d25de84f05afb47f2ae5aff18b,
title = "A modified {EM} algorithm for mixture models based on Bregman divergence",
abstract = "The EM algorithm is a sophisticated method for estimating statistical models with hidden variables, based on the Kullback-Leibler divergence. A natural extension of the Kullback-Leibler divergence is given by the class of Bregman divergences, which in general enjoy robustness against contaminated data in statistical inference. In this paper, a modification of the EM algorithm based on the Bregman divergence is proposed for estimating finite mixture models. The proposed algorithm is geometrically interpreted as a sequence of projections induced by the Bregman divergence. Since a rigorous algorithm includes a nonlinear optimization procedure, two simplification methods for reducing the computational difficulty are also discussed from a geometrical viewpoint. Numerical experiments on a toy problem are carried out to confirm the appropriateness of the simplifications.",
keywords = "Bregman divergence, EM algorithm, Finite mixture models",
author = "Yu Fujimoto and Noboru Murata",
year = "2007",
month = "3",
doi = "10.1007/s10463-006-0097-x",
language = "English",
volume = "59",
pages = "3--25",
journal = "Annals of the Institute of Statistical Mathematics",
issn = "0020-3157",
publisher = "Springer Netherlands",
number = "1",

}

TY - JOUR

T1 - A modified EM algorithm for mixture models based on Bregman divergence

AU - Fujimoto, Yu

AU - Murata, Noboru

PY - 2007/3

Y1 - 2007/3

N2 - The EM algorithm is a sophisticated method for estimating statistical models with hidden variables, based on the Kullback-Leibler divergence. A natural extension of the Kullback-Leibler divergence is given by the class of Bregman divergences, which in general enjoy robustness against contaminated data in statistical inference. In this paper, a modification of the EM algorithm based on the Bregman divergence is proposed for estimating finite mixture models. The proposed algorithm is geometrically interpreted as a sequence of projections induced by the Bregman divergence. Since a rigorous algorithm includes a nonlinear optimization procedure, two simplification methods for reducing the computational difficulty are also discussed from a geometrical viewpoint. Numerical experiments on a toy problem are carried out to confirm the appropriateness of the simplifications.

AB - The EM algorithm is a sophisticated method for estimating statistical models with hidden variables, based on the Kullback-Leibler divergence. A natural extension of the Kullback-Leibler divergence is given by the class of Bregman divergences, which in general enjoy robustness against contaminated data in statistical inference. In this paper, a modification of the EM algorithm based on the Bregman divergence is proposed for estimating finite mixture models. The proposed algorithm is geometrically interpreted as a sequence of projections induced by the Bregman divergence. Since a rigorous algorithm includes a nonlinear optimization procedure, two simplification methods for reducing the computational difficulty are also discussed from a geometrical viewpoint. Numerical experiments on a toy problem are carried out to confirm the appropriateness of the simplifications.

KW - Bregman divergence

KW - EM algorithm

KW - Finite mixture models

UR - http://www.scopus.com/inward/record.url?scp=33847231874&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=33847231874&partnerID=8YFLogxK

U2 - 10.1007/s10463-006-0097-x

DO - 10.1007/s10463-006-0097-x

M3 - Article

VL - 59

SP - 3

EP - 25

JO - Annals of the Institute of Statistical Mathematics

JF - Annals of the Institute of Statistical Mathematics

SN - 0020-3157

IS - 1

ER -