### Abstract

Clustering is a representative task of unsupervised learning and one of the important approaches in exploratory data analysis. By its very nature, clustering without strong assumptions on the data distribution is desirable. Information-theoretic clustering is a class of clustering methods that optimize information-theoretic quantities such as entropy and mutual information. These quantities can be estimated nonparametrically, so information-theoretic clustering algorithms are capable of capturing a variety of intrinsic data structures. Information-theoretic quantities can also be estimated from a data set with a sampling weight attached to each datum: by assuming each datum is sampled from a certain cluster and assigning sampling weights that differ across clusters, cluster-conditional information-theoretic quantities can be estimated. In this letter, a simple iterative clustering algorithm is proposed based on a nonparametric estimator of the log-likelihood for weighted data sets. The clustering algorithm is also derived from the principle of conditional entropy minimization with maximum entropy regularization. The proposed algorithm contains no tuning parameter. Experiments show the algorithm to be comparable to or better than conventional nonparametric clustering methods.
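The abstract describes reassigning data to clusters using a nonparametric (kernel-based) log-likelihood estimate under per-cluster sampling weights. As an illustration only, and not the paper's actual estimator or algorithm, a minimal sketch of this idea with hard cluster weights and a leave-one-out Gaussian KDE might look like the following; all function names, the fixed bandwidth `h`, and the hard-assignment update are assumptions of this sketch:

```python
import numpy as np

def weighted_loo_kde_loglik(X, w, h):
    """Leave-one-out weighted Gaussian-KDE log-density at each point of X,
    where w[i] is the sampling weight of datum i (here: a hard 0/1
    cluster-membership weight)."""
    n, d = X.shape
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2.0 * h ** 2))
    np.fill_diagonal(K, 0.0)  # leave-one-out: drop each point's self-contribution
    dens = K @ w
    norm = np.maximum(w.sum() - w, 1e-12) * (np.sqrt(2.0 * np.pi) * h) ** d
    return np.log(dens / norm + 1e-300)

def kde_likelihood_clustering(X, k, h=1.0, n_iter=100, init=None, seed=0):
    """Iteratively reassign each point to the cluster under whose weighted
    KDE it attains the highest leave-one-out log-likelihood."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(k, size=len(X)) if init is None else init.copy()
    for _ in range(n_iter):
        ll = np.column_stack([
            weighted_loo_kde_loglik(X, (labels == c).astype(float), h)
            for c in range(k)
        ])
        new = ll.argmax(axis=1)
        if np.array_equal(new, labels):
            break  # fixed point reached
        labels = new
    return labels
```

Note that, unlike the proposed algorithm, this sketch does retain a tuning parameter (the bandwidth `h`); it is meant only to convey the flavor of likelihood-based nonparametric cluster reassignment.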

Original language | English
---|---
Pages (from-to) | 2074-2101
Number of pages | 28
Journal | Neural Computation
Volume | 26
Issue number | 9
DOIs | 10.1162/NECO_a_00628
Publication status | Published - 2014 Sep 13

### ASJC Scopus subject areas

- Cognitive Neuroscience
- Arts and Humanities (miscellaneous)

### Cite this

*Neural Computation*, *26*(9), 2074-2101. https://doi.org/10.1162/NECO_a_00628