Localized multiple kernel learning

Mehmet Gönen, Ethem Alpaydin

Research output: Chapter in Book/Report/Conference proceeding > Conference contribution

253 Scopus citations

Abstract

Recently, instead of selecting a single kernel, multiple kernel learning (MKL) has been proposed, which uses a convex combination of kernels, where the weight of each kernel is optimized during training. However, MKL assigns the same weight to a kernel over the whole input space. In this paper, we develop a localized multiple kernel learning (LMKL) algorithm using a gating model for selecting the appropriate kernel function locally. The localizing gating model and the kernel-based classifier are coupled and their optimization is done in a joint manner. Empirical results on ten benchmark and two bioinformatics data sets validate the applicability of our approach. LMKL achieves statistically similar accuracy results compared with MKL while storing fewer support vectors. LMKL can also combine multiple copies of the same kernel function localized in different parts. For example, LMKL with multiple linear kernels gives better accuracy results than using a single linear kernel on bioinformatics data sets.
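
As an illustrative sketch only (not the authors' implementation), the locally combined kernel induced by a softmax-style gating model could be computed as below. The function names, variable names, and the use of NumPy are assumptions for this example; the joint optimization of the gating parameters and the kernel-based classifier described in the paper is omitted here.

import numpy as np

def gating(X, V, v0):
    """Softmax gating model: eta_m(x) proportional to exp(<v_m, x> + v_m0).

    X  : (n, d) data matrix
    V  : (p, d) gating weights, one row per kernel
    v0 : (p,)   gating biases
    Returns an (n, p) matrix of per-sample kernel weights summing to 1.
    """
    scores = X @ V.T + v0
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

def locally_combined_kernel(kernels, eta_a, eta_b):
    """k_eta(x_i, x_j) = sum_m eta_m(x_i) * k_m(x_i, x_j) * eta_m(x_j).

    kernels : list of p Gram matrices, each of shape (n_a, n_b)
    eta_a   : (n_a, p) gating values for the row samples
    eta_b   : (n_b, p) gating values for the column samples
    """
    return sum(np.outer(eta_a[:, m], eta_b[:, m]) * K_m
               for m, K_m in enumerate(kernels))

# Hypothetical usage: two copies of a linear kernel on random data,
# localized in different regions by the gating model.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
V, v0 = rng.normal(size=(2, 5)), np.zeros(2)
eta = gating(X, V, v0)
K_eta = locally_combined_kernel([X @ X.T, X @ X.T], eta, eta)

In such a sketch, K_eta would replace the fixed Gram matrix in a standard kernel machine, so that different kernels (or different copies of the same kernel) dominate in different regions of the input space.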

Original language: English (US)
Title of host publication: Proceedings of the 25th International Conference on Machine Learning
Publisher: Association for Computing Machinery (ACM)
Pages: 352-359
Number of pages: 8
ISBN (Print): 9781605582054
DOIs
State: Published - 2008
Externally published: Yes
Event: 25th International Conference on Machine Learning - Helsinki, Finland
Duration: Jul 5 2008 - Jul 9 2008

Publication series

Name: Proceedings of the 25th International Conference on Machine Learning

Other

Other: 25th International Conference on Machine Learning
Country/Territory: Finland
City: Helsinki
Period: 7/5/08 - 7/9/08

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Software

