### Abstract

Support Vector Machines (SVMs) are a classification technique with high generalization ability but a heavy computational load, since margin maximization results in a quadratic programming problem. It is known that this maximization task results in a pth-order programming problem if we employ the p-norm instead of the Euclidean norm. When p = 1, for example, it becomes a linear programming problem with a much lower computational load. In this article, we theoretically show, by considering its geometrical meaning, that p has very little effect on the generalization performance of SVMs in practice.
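The abstract contrasts the quadratic program of the standard (2-norm) SVM with the linear program obtained for p = 1. As an illustration only — the paper's exact formulation is not reproduced here — the sketch below solves one common linear-programming SVM variant, a hard-margin classifier that minimizes the 1-norm of the weight vector, using `scipy.optimize.linprog`. The toy data and all variable names are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical linearly separable toy data
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=+2.0, scale=0.5, size=(20, 2))
X_neg = rng.normal(loc=-2.0, scale=0.5, size=(20, 2))
X = np.vstack([X_pos, X_neg])
y = np.array([1.0] * 20 + [-1.0] * 20)
n, d = X.shape

# Variables: [u (d), v (d), b_plus, b_minus], all >= 0,
# with w = u - v and b = b_plus - b_minus, so that
# minimizing sum(u) + sum(v) minimizes ||w||_1.
c = np.concatenate([np.ones(2 * d), np.zeros(2)])

# Margin constraints y_i (w . x_i + b) >= 1, rewritten for linprog
# as  -y_i (x_i . (u - v) + b_plus - b_minus) <= -1.
A_ub = np.hstack([-y[:, None] * X, y[:, None] * X, -y[:, None], y[:, None]])
b_ub = -np.ones(n)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * d + 2))
w = res.x[:d] - res.x[d:2 * d]
b = res.x[2 * d] - res.x[2 * d + 1]
print("solved:", res.status == 0, "train accuracy:",
      np.mean(np.sign(X @ w + b) == y))
```

The nonnegative split `w = u - v` is the standard trick for expressing an absolute-value objective as a linear program; the quadratic objective `||w||_2^2` of the ordinary SVM would instead require a QP solver.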

Original language | English
---|---
Title of host publication | Midwest Symposium on Circuits and Systems
Volume | 3
Publication status | Published - 2004
Event | The 2004 47th Midwest Symposium on Circuits and Systems - Conference Proceedings - Hiroshima, Japan. Duration: 2004 Jul 25 → 2004 Jul 28

### Other

Other | The 2004 47th Midwest Symposium on Circuits and Systems - Conference Proceedings
---|---
Country | Japan
City | Hiroshima
Period | 04/7/25 → 04/7/28

### ASJC Scopus subject areas

- Electrical and Electronic Engineering
- Electronic, Optical and Magnetic Materials

### Cite this

*Midwest Symposium on Circuits and Systems* (Vol. 3)

**Learning properties of support vector machines with p-norm.** / Ikeda, Kazushi; Murata, Noboru.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

*Midwest Symposium on Circuits and Systems.* Vol. 3, The 2004 47th Midwest Symposium on Circuits and Systems - Conference Proceedings, Hiroshima, Japan, 04/7/25.

TY - GEN

T1 - Learning properties of support vector machines with p-norm

AU - Ikeda, Kazushi

AU - Murata, Noboru

PY - 2004

Y1 - 2004

N2 - Support Vector Machines (SVMs) are a classification technique with high generalization ability but a heavy computational load, since margin maximization results in a quadratic programming problem. It is known that this maximization task results in a pth-order programming problem if we employ the p-norm instead of the Euclidean norm. When p = 1, for example, it becomes a linear programming problem with a much lower computational load. In this article, we theoretically show, by considering its geometrical meaning, that p has very little effect on the generalization performance of SVMs in practice.

AB - Support Vector Machines (SVMs) are a classification technique with high generalization ability but a heavy computational load, since margin maximization results in a quadratic programming problem. It is known that this maximization task results in a pth-order programming problem if we employ the p-norm instead of the Euclidean norm. When p = 1, for example, it becomes a linear programming problem with a much lower computational load. In this article, we theoretically show, by considering its geometrical meaning, that p has very little effect on the generalization performance of SVMs in practice.

UR - http://www.scopus.com/inward/record.url?scp=11144316920&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=11144316920&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:11144316920

VL - 3

BT - Midwest Symposium on Circuits and Systems

ER -