Learning properties of support vector machines with p-norm

Kazushi Ikeda, Noboru Murata

Research output: Contribution to journal › Conference article

2 Citations (Scopus)

Abstract

Support Vector Machines (SVMs) are a classification technique with high generalization ability but a heavy computational load, since margin maximization results in a quadratic programming problem. It is known that this maximization task becomes a pth-order programming problem if we employ the p-norm, ||w||_p = (Σ_i |w_i|^p)^(1/p), instead of the Euclidean norm. When p = 1, for example, it is a linear programming problem with a much lower computational load. In this article, we theoretically show, by considering its geometrical meaning, that p has very little effect on the generalization performance of SVMs in practice.
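The p = 1 case mentioned in the abstract can be written directly as a linear program: minimizing the 1-norm of the weight vector under the margin constraints y_i(w·x_i + b) ≥ 1, with auxiliary variables u_j ≥ |w_j| to linearize the objective. A minimal sketch using scipy.optimize.linprog on hypothetical toy data (the data and variable names are illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

# Toy linearly separable data (illustrative, not from the paper)
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 1.5],
              [-1.0, -1.0], [-2.0, -1.5], [-1.5, -2.5]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])
n, d = X.shape

# Decision variables: [w (d), b (1), u (d)]; minimize sum(u), u_j >= |w_j|
c = np.concatenate([np.zeros(d + 1), np.ones(d)])

# Margin constraints y_i (w.x_i + b) >= 1, written as -y_i (w.x_i + b) <= -1
A_margin = np.hstack([-y[:, None] * X, -y[:, None], np.zeros((n, d))])
b_margin = -np.ones(n)

# |w_j| <= u_j encoded as  w_j - u_j <= 0  and  -w_j - u_j <= 0
I = np.eye(d)
A_abs = np.vstack([
    np.hstack([ I, np.zeros((d, 1)), -I]),
    np.hstack([-I, np.zeros((d, 1)), -I]),
])
b_abs = np.zeros(2 * d)

res = linprog(c,
              A_ub=np.vstack([A_margin, A_abs]),
              b_ub=np.concatenate([b_margin, b_abs]),
              bounds=[(None, None)] * (2 * d + 1))
w, b = res.x[:d], res.x[d]
print("status:", res.status, "w:", w, "b:", b)
```

Replacing the two absolute-value constraints per coordinate with a pth-order cone would give the general p-norm problem; the LP form above is what makes p = 1 so much cheaper to solve.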

Original language: English
Pages (from-to): III69-III72
Journal: Midwest Symposium on Circuits and Systems
Volume: 3
Publication status: Published - 2004 Dec 1
Event: The 2004 47th Midwest Symposium on Circuits and Systems - Conference Proceedings - Hiroshima, Japan
Duration: 2004 Jul 25 - 2004 Jul 28

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Electrical and Electronic Engineering
