### Abstract

Support Vector Machines (SVMs) are a classification technique with high generalization ability but a heavy computational load, since margin maximization results in a quadratic programming problem. It is known that this maximization task results in a pth-order programming problem if the p-norm is employed instead of the Euclidean norm. When p = 1, for example, it becomes a linear programming problem with a much lower computational load. In this article, we theoretically show, by considering its geometrical meaning, that p has very little effect on the generalization performance of SVMs in practice.
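The p = 1 case mentioned in the abstract can be made concrete: with the 1-norm of the weight vector, soft-margin training is a linear program. Below is a minimal sketch (not the authors' implementation) that solves the standard 1-norm formulation — minimize ‖w‖₁ + C·Σξᵢ subject to yᵢ(w·xᵢ + b) ≥ 1 − ξᵢ, ξᵢ ≥ 0 — with `scipy.optimize.linprog`, splitting w into nonnegative parts to linearize the 1-norm. The helper name `svm_1norm` and the toy data are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def svm_1norm(X, y, C=1.0):
    """1-norm soft-margin SVM as a linear program (illustrative sketch).

    Variable layout: [w_pos (d), w_neg (d), b (1 free), xi (n)],
    with w = w_pos - w_neg so that ||w||_1 = sum(w_pos) + sum(w_neg).
    """
    n, d = X.shape
    # objective: ||w||_1 + C * sum(xi); the bias b has zero cost
    c = np.concatenate([np.ones(2 * d), [0.0], C * np.ones(n)])
    # margin constraints y_i (w.x_i + b) + xi_i >= 1, rewritten as A z <= -1
    A = np.hstack([-(y[:, None] * X),   # coefficients of w_pos
                   y[:, None] * X,      # coefficients of w_neg
                   -y[:, None],         # coefficient of b
                   -np.eye(n)])         # coefficients of xi
    b_ub = -np.ones(n)
    bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A, b_ub=b_ub, bounds=bounds, method="highs")
    z = res.x
    return z[:d] - z[d:2 * d], z[2 * d]  # (w, b)

# toy linearly separable data (hypothetical example)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -1.0], [-3.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = svm_1norm(X, y)
pred = np.sign(X @ w + b)
```

Replacing the 2-norm in the objective with a general p-norm would instead yield the pth-order program the abstract refers to; only p = 1 (and p = ∞) reduce to pure linear programming.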

| Original language | English |
| --- | --- |
| Pages (from-to) | III69-III72 |
| Journal | Midwest Symposium on Circuits and Systems |
| Volume | 3 |
| Publication status | Published - 2004 Dec 1 |
| Event | The 2004 47th Midwest Symposium on Circuits and Systems - Conference Proceedings - Hiroshima, Japan. Duration: 2004 Jul 25 → 2004 Jul 28 |

### ASJC Scopus subject areas

- Electronic, Optical and Magnetic Materials
- Electrical and Electronic Engineering

## Cite this

Ikeda, K., & Murata, N. (2004). Learning properties of support vector machines with p-norm.

*Midwest Symposium on Circuits and Systems*, *3*, III69-III72.