Multi-class support vector machine simplification

Ducdung Nguyen, Kazunori Matsumoto, Kazuo Hashimoto, Yasuhiro Takishima, Daichi Takatori, Masahiro Terabe

Research output: Conference contribution

2 Citations (Scopus)

Abstract

In support vector learning, the computational complexity of the testing phase scales linearly with the number of support vectors (SVs) included in the solution, the support vector machine (SVM). Among the different approaches, reduced set methods speed up the testing phase by replacing the original SVM with a simplified one that consists of a smaller number of vectors, called reduced vectors (RVs). In this paper we introduce an extension of the bottom-up method for binary-class SVMs to multi-class SVMs. The extension includes: calculations for optimally combining two multi-weighted SVs, a selection heuristic for choosing a good pair of SVs to replace with a newly created vector, and an algorithm for reducing the number of SVs included in an SVM classifier. We show that our method possesses key advantages over others in terms of applicability, efficiency and stability. In constructing RVs, it requires finding only a single maximum point of a one-variable function. Experimental results on public datasets show that simplified SVMs can run up to 100 times faster than the original SVMs with almost no change in predictive accuracy.
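The core step the abstract describes, merging two weighted SVs into one reduced vector by maximizing a one-variable function, can be sketched for the Gaussian (RBF) kernel, where the merged vector is commonly restricted to the segment between the two SVs so the search has a single variable. This is an illustrative sketch of the general reduced-set idea, not the paper's exact multi-weighted formulation; the function name, grid-search resolution, and binary-weight simplification are assumptions.

```python
import math

def combine_two_svs(x1, x2, a1, a2, gamma, steps=1000):
    """Approximate a1*Phi(x1) + a2*Phi(x2) by beta*Phi(z) for an RBF
    kernel k(x, y) = exp(-gamma * ||x - y||^2), searching z on the
    segment z = u*x1 + (1-u)*x2 (one variable, as in the abstract)."""
    d2 = sum((p - q) ** 2 for p, q in zip(x1, x2))  # squared distance

    def f(u):
        # Since k(z, z) = 1, the best beta for a given z is
        # a1*k(x1, z) + a2*k(x2, z); maximizing its magnitude
        # minimizes the residual in feature space.
        k1 = math.exp(-gamma * (1 - u) ** 2 * d2)
        k2 = math.exp(-gamma * u ** 2 * d2)
        return abs(a1 * k1 + a2 * k2)

    # Simple grid search over u in [0, 1] (a 1-D maximization).
    best_u = max((i / steps for i in range(steps + 1)), key=f)
    z = [best_u * p + (1 - best_u) * q for p, q in zip(x1, x2)]
    k1 = math.exp(-gamma * (1 - best_u) ** 2 * d2)
    k2 = math.exp(-gamma * best_u ** 2 * d2)
    beta = a1 * k1 + a2 * k2  # weight of the new reduced vector
    return z, beta
```

Iterating this pairwise merge, each time picking a good pair of SVs and replacing them with the new vector, shrinks the SV set bottom-up, which is the mechanism the abstract's reduction algorithm builds on.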

Original language: English
Host publication title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 799-808
Number of pages: 10
Volume: 5351 LNAI
DOI: 10.1007/978-3-540-89197-0_74
Publication status: Published - 2008
Externally published: Yes
Event: 10th Pacific Rim International Conference on Artificial Intelligence, PRICAI 2008 - Hanoi
Duration: 15 Dec 2008 - 19 Dec 2008

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 5351 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 10th Pacific Rim International Conference on Artificial Intelligence, PRICAI 2008
City: Hanoi
Period: 08/12/15 - 08/12/19

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Nguyen, D., Matsumoto, K., Hashimoto, K., Takishima, Y., Takatori, D., & Terabe, M. (2008). Multi-class support vector machine simplification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5351 LNAI, pp. 799-808). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 5351 LNAI). https://doi.org/10.1007/978-3-540-89197-0_74

@inproceedings{0e72068f350944949af35c9d84855337,
title = "Multi-class support vector machine simplification",
abstract = "In support vector learning, the computational complexity of the testing phase scales linearly with the number of support vectors (SVs) included in the solution, the support vector machine (SVM). Among the different approaches, reduced set methods speed up the testing phase by replacing the original SVM with a simplified one that consists of a smaller number of vectors, called reduced vectors (RVs). In this paper we introduce an extension of the bottom-up method for binary-class SVMs to multi-class SVMs. The extension includes: calculations for optimally combining two multi-weighted SVs, a selection heuristic for choosing a good pair of SVs to replace with a newly created vector, and an algorithm for reducing the number of SVs included in an SVM classifier. We show that our method possesses key advantages over others in terms of applicability, efficiency and stability. In constructing RVs, it requires finding only a single maximum point of a one-variable function. Experimental results on public datasets show that simplified SVMs can run up to 100 times faster than the original SVMs with almost no change in predictive accuracy.",
keywords = "Kernel-based methods, Reduced set method, Support vector machines",
author = "Ducdung Nguyen and Kazunori Matsumoto and Kazuo Hashimoto and Yasuhiro Takishima and Daichi Takatori and Masahiro Terabe",
year = "2008",
doi = "10.1007/978-3-540-89197-0_74",
language = "English",
isbn = "354089196X",
volume = "5351 LNAI",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
pages = "799--808",
booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",

}

TY - GEN

T1 - Multi-class support vector machine simplification

AU - Nguyen, Ducdung

AU - Matsumoto, Kazunori

AU - Hashimoto, Kazuo

AU - Takishima, Yasuhiro

AU - Takatori, Daichi

AU - Terabe, Masahiro

PY - 2008

Y1 - 2008

N2 - In support vector learning, the computational complexity of the testing phase scales linearly with the number of support vectors (SVs) included in the solution, the support vector machine (SVM). Among the different approaches, reduced set methods speed up the testing phase by replacing the original SVM with a simplified one that consists of a smaller number of vectors, called reduced vectors (RVs). In this paper we introduce an extension of the bottom-up method for binary-class SVMs to multi-class SVMs. The extension includes: calculations for optimally combining two multi-weighted SVs, a selection heuristic for choosing a good pair of SVs to replace with a newly created vector, and an algorithm for reducing the number of SVs included in an SVM classifier. We show that our method possesses key advantages over others in terms of applicability, efficiency and stability. In constructing RVs, it requires finding only a single maximum point of a one-variable function. Experimental results on public datasets show that simplified SVMs can run up to 100 times faster than the original SVMs with almost no change in predictive accuracy.

AB - In support vector learning, the computational complexity of the testing phase scales linearly with the number of support vectors (SVs) included in the solution, the support vector machine (SVM). Among the different approaches, reduced set methods speed up the testing phase by replacing the original SVM with a simplified one that consists of a smaller number of vectors, called reduced vectors (RVs). In this paper we introduce an extension of the bottom-up method for binary-class SVMs to multi-class SVMs. The extension includes: calculations for optimally combining two multi-weighted SVs, a selection heuristic for choosing a good pair of SVs to replace with a newly created vector, and an algorithm for reducing the number of SVs included in an SVM classifier. We show that our method possesses key advantages over others in terms of applicability, efficiency and stability. In constructing RVs, it requires finding only a single maximum point of a one-variable function. Experimental results on public datasets show that simplified SVMs can run up to 100 times faster than the original SVMs with almost no change in predictive accuracy.

KW - Kernel-based methods

KW - Reduced set method

KW - Support vector machines

UR - http://www.scopus.com/inward/record.url?scp=58349097796&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=58349097796&partnerID=8YFLogxK

U2 - 10.1007/978-3-540-89197-0_74

DO - 10.1007/978-3-540-89197-0_74

M3 - Conference contribution

AN - SCOPUS:58349097796

SN - 354089196X

SN - 9783540891963

VL - 5351 LNAI

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 799

EP - 808

BT - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

ER -