### Abstract

Bayesian theory is effective in statistics, lossless source coding, machine learning, and other fields. It is often, however, computationally expensive, since the calculation of posterior probabilities and of mixture distributions is generally intractable. In this paper, we propose a new method for approximately calculating mixture distributions over a discrete hypothesis class.
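To make the abstract's setting concrete, the following is a minimal illustrative sketch (not the paper's actual algorithm) of a Bayesian mixture over a discrete hypothesis class, and of a generic subset approximation that keeps only the hypotheses carrying the most posterior weight. All names and the Bernoulli model are assumptions chosen for illustration.

```python
import numpy as np

# Illustration only, NOT the paper's method: a Bayesian mixture over a
# discrete hypothesis class, approximated by a posterior-weighted subset.

rng = np.random.default_rng(0)

# Discrete hypothesis class: each hypothesis is a Bernoulli parameter theta.
thetas = np.linspace(0.05, 0.95, 19)
prior = np.full(len(thetas), 1.0 / len(thetas))  # uniform prior P(h)

# Observed binary data.
data = rng.binomial(1, 0.7, size=50)
k, n = data.sum(), len(data)

# Posterior over hypotheses: P(h|x) proportional to P(h) * p(x|h).
log_lik = k * np.log(thetas) + (n - k) * np.log(1 - thetas)
post = prior * np.exp(log_lik - log_lik.max())
post /= post.sum()

# Exact mixture prediction: probability that the next bit is 1,
# i.e. the sum over ALL hypotheses of P(h|x) * theta_h.
exact = np.sum(post * thetas)

# Subset approximation: keep only the m highest-posterior hypotheses
# and renormalize their weights -- far cheaper when the class is large.
m = 3
top = np.argsort(post)[-m:]
w = post[top] / post[top].sum()
approx = np.sum(w * thetas[top])

print(f"exact mixture prediction:        {exact:.4f}")
print(f"subset approximation ({m} hyps): {approx:.4f}")
```

Because the posterior concentrates on a few hypotheses once enough data is seen, the renormalized subset mixture stays close to the exact one at a fraction of the cost; this is the general intuition behind approximating mixtures by a subset of hypotheses.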

Original language | English |
---|---|
Title of host publication | Proceedings of the IEEE International Conference on Systems, Man and Cybernetics |
Publisher | IEEE |
Pages | 2533-2538 |
Number of pages | 6 |
Volume | 3 |
Publication status | Published - 1997 |
Externally published | Yes |
Event | Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics. Part 3 (of 5) - Orlando, FL, USA. Duration: 1997 Oct 12 → 1997 Oct 15 |

### Other

Other | Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics. Part 3 (of 5) |
---|---|
City | Orlando, FL, USA |
Period | 97/10/12 → 97/10/15 |


### ASJC Scopus subject areas

- Hardware and Architecture
- Control and Systems Engineering

### Cite this

Mukouchi, T., Matsushima, T., & Hirasawa, S. (1997). Machine learning by a subset of hypotheses. In *Proceedings of the IEEE International Conference on Systems, Man and Cybernetics* (Vol. 3, pp. 2533-2538). IEEE.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Machine learning by a subset of hypotheses

AU - Mukouchi, Takafumi

AU - Matsushima, Toshiyasu

AU - Hirasawa, Shigeichi

PY - 1997

Y1 - 1997

N2 - Bayesian theory is effective in statistics, lossless source coding, machine learning, and other fields. It is often, however, computationally expensive, since the calculation of posterior probabilities and of mixture distributions is generally intractable. In this paper, we propose a new method for approximately calculating mixture distributions over a discrete hypothesis class.

AB - Bayesian theory is effective in statistics, lossless source coding, machine learning, and other fields. It is often, however, computationally expensive, since the calculation of posterior probabilities and of mixture distributions is generally intractable. In this paper, we propose a new method for approximately calculating mixture distributions over a discrete hypothesis class.

UR - http://www.scopus.com/inward/record.url?scp=0031375754&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0031375754&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0031375754

VL - 3

SP - 2533

EP - 2538

BT - Proceedings of the IEEE International Conference on Systems, Man and Cybernetics

PB - IEEE

ER -