Multiple kernel learning by conditional entropy minimization

Hideitsu Hino*, Nima Reyhani, Noboru Murata

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

Kernel methods have been successfully applied to many practical machine learning problems, but choosing a suitable kernel is left to the practitioner. A common route to automatic kernel selection is to learn a linear combination of element kernels. In this paper, a novel multiple kernel learning framework is proposed based on a conditional entropy minimization criterion. Within this framework, three multiple kernel learning algorithms are derived. Experiments on benchmark data sets show that the algorithms are comparable to, or outperform, kernel Fisher discriminant analysis and other multiple kernel learning algorithms.
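The idea of scoring a linear combination of element kernels by a conditional entropy criterion can be illustrated with a minimal sketch. This is not the paper's actual algorithm: it assumes a k-nearest-neighbor plug-in estimator of the conditional entropy H(Y|X) computed from kernel-induced distances, and a coarse grid search over the combination weight of two hypothetical RBF element kernels; all function names here are illustrative.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Gram matrix of an RBF element kernel k(x, x') = exp(-gamma * ||x - x'||^2)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def conditional_entropy(K, y, k=5):
    """k-NN plug-in estimate of H(Y|X) using the kernel-induced metric
    d(x, x')^2 = k(x, x) + k(x', x') - 2 k(x, x')."""
    d = np.diag(K)[:, None] + np.diag(K)[None, :] - 2 * K
    np.fill_diagonal(d, np.inf)  # exclude each point from its own neighborhood
    H = 0.0
    for i in range(len(y)):
        nn = np.argsort(d[i])[:k]                       # k nearest neighbors
        p = np.bincount(y[nn], minlength=y.max() + 1) / k  # local label posterior
        p = p[p > 0]
        H += -(p * np.log(p)).sum()                     # entropy of the posterior
    return H / len(y)

def learn_weight(kernels, y, grid=11):
    """Grid search over weights (w, 1-w) for two element kernels,
    minimizing the conditional entropy estimate."""
    best_h, best_w = np.inf, None
    for w in np.linspace(0.0, 1.0, grid):
        K = w * kernels[0] + (1.0 - w) * kernels[1]
        h = conditional_entropy(K, y)
        if h < best_h:
            best_h, best_w = h, w
    return best_h, best_w

# Toy demo: two well-separated Gaussian classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)), rng.normal(3.0, 1.0, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
kernels = [rbf_kernel(X, 0.01), rbf_kernel(X, 1.0)]
best_h, best_w = learn_weight(kernels, y)
```

The paper's algorithms optimize the weights directly rather than by grid search, and use different entropy estimators; this sketch only conveys the shape of the criterion: smaller conditional entropy means labels are better determined by position under the combined kernel's geometry.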

Original language: English
Title of host publication: Proceedings - 9th International Conference on Machine Learning and Applications, ICMLA 2010
Pages: 223-228
Number of pages: 6
DOIs
Publication status: Published - 2010 Dec 1
Event: 9th International Conference on Machine Learning and Applications, ICMLA 2010 - Washington, DC, United States
Duration: 2010 Dec 12 - 2010 Dec 14

Publication series

Name: Proceedings - 9th International Conference on Machine Learning and Applications, ICMLA 2010

Conference

Conference: 9th International Conference on Machine Learning and Applications, ICMLA 2010
Country/Territory: United States
City: Washington, DC
Period: 10/12/12 - 10/12/14

Keywords

  • Discriminant analysis
  • Entropy
  • Kernel methods
  • Multiple Kernel Learning

ASJC Scopus subject areas

  • Computer Science Applications
  • Human-Computer Interaction
