## Abstract

The α-EM algorithm is a proper extension of the traditional log-EM algorithm. The new algorithm is based on the α-logarithm, whereas the traditional one uses the plain logarithm; the case of α = -1 corresponds to the log-EM algorithm. Since speedups of the α-EM algorithm have already been reported for learning problems, this paper shows that closed-form E-steps can be obtained for a wide class of problems. A set of common techniques, that is, a cookbook for the α-EM algorithm, is presented. The recipes include unsupervised neural networks, supervised neural networks with various gating, hidden Markov models, and Markov random fields for moving-object segmentation. Reasoning for the speedup is also given.
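The record does not reproduce the α-logarithm itself. A common form used in Matsuyama's α-EM papers (an assumption here, not stated in this abstract) is L_α(x) = (2/(1+α))(x^((1+α)/2) − 1), which reduces to the natural logarithm in the limit α → -1, consistent with the statement above that α = -1 recovers the log-EM algorithm. A minimal Python sketch of this relationship:

```python
import math

def alpha_log(x: float, alpha: float) -> float:
    """alpha-logarithm, assumed form: (2/(1+alpha)) * (x**((1+alpha)/2) - 1).

    As alpha -> -1, the exponent r = (1+alpha)/2 -> 0 and
    (x**r - 1)/r -> log(x), so alpha = -1 recovers the plain logarithm.
    """
    if alpha == -1.0:
        return math.log(x)  # limiting case: the traditional log-EM choice
    r = (1.0 + alpha) / 2.0
    return (x**r - 1.0) / r

# Near alpha = -1, the alpha-logarithm approaches log(x):
for a in (-0.9, -0.99, -0.999):
    print(a, abs(alpha_log(2.0, a) - math.log(2.0)))
```

The printed gap shrinks as α approaches -1, illustrating why the log-EM algorithm sits inside the α-EM family as the α = -1 member rather than as a separate method.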

| Original language | English |
|---|---|
| Title of host publication | Proceedings of the International Joint Conference on Neural Networks |
| Place of Publication | United States |
| Publisher | IEEE |
| Pages | 1368-1373 |
| Number of pages | 6 |
| Volume | 2 |
| Publication status | Published - 1999 |
| Event | International Joint Conference on Neural Networks (IJCNN'99) - Washington, DC, USA. Duration: Jul 10, 1999 → Jul 16, 1999 |

### Other

| Other | International Joint Conference on Neural Networks (IJCNN'99) |
|---|---|
| City | Washington, DC, USA |
| Period | 99/7/10 → 99/7/16 |

## ASJC Scopus subject areas

- Software