Multi-class support vector machine simplification

Ducdung Nguyen, Kazunori Matsumoto, Kazuo Hashimoto, Yasuhiro Takishima, Daichi Takatori, Masahiro Terabe

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

In support vector learning, the computational complexity of the testing phase scales linearly with the number of support vectors (SVs) included in the solution, the support vector machine (SVM). Among the different approaches, reduced set methods speed up the testing phase by replacing the original SVM with a simplified one that consists of a smaller number of vectors, called reduced vectors (RVs). In this paper we introduce an extension of the bottom-up method for binary-class SVMs to multi-class SVMs. The extension includes calculations for optimally combining two multi-weighted SVs, a selection heuristic for choosing a good pair of SVs to replace with a newly created vector, and an algorithm for reducing the number of SVs included in an SVM classifier. We show that our method possesses key advantages over others in terms of applicability, efficiency and stability: constructing an RV requires finding only a single maximum point of a one-variable function. Experimental results on public datasets show that the simplified SVMs can run up to 100 times faster than the original ones with almost no change in predictive accuracy.
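
The core operation the abstract describes is merging a pair of SVs into one reduced vector by maximising a one-variable function. The sketch below illustrates that idea for a Gaussian kernel K(x, y) = exp(-gamma * ||x - y||^2), assuming, as in the binary bottom-up method, that the reduced vector z is searched for on the segment between the two SVs and that the objective is the projection of the pair's feature-space image onto phi(z). The helper names (rbf, combine_pair) and this exact objective are illustrative reconstructions, not the paper's formulation.

import numpy as np
from scipy.optimize import minimize_scalar

def rbf(x, y, gamma):
    # Gaussian kernel; note K(z, z) = 1 for any z
    return np.exp(-gamma * np.sum((x - y) ** 2))

def combine_pair(x1, a1, x2, a2, gamma):
    """Merge two multi-weighted SVs (x1, a1) and (x2, a2) into a single
    reduced vector. a1 and a2 hold one weight per binary machine of the
    multi-class SVM (illustrative sketch, not the paper's exact method)."""
    def neg_objective(u):
        z = u * x1 + (1.0 - u) * x2
        k1, k2 = rbf(x1, z, gamma), rbf(x2, z, gamma)
        # squared norm of the projection of a1*phi(x1) + a2*phi(x2)
        # onto phi(z), summed over all binary machines
        return -np.sum((a1 * k1 + a2 * k2) ** 2)
    # the single-maximum, one-variable search the abstract refers to,
    # restricted here to the segment between x1 and x2 for simplicity
    res = minimize_scalar(neg_objective, bounds=(0.0, 1.0), method="bounded")
    z = res.x * x1 + (1.0 - res.x) * x2
    # with z fixed, the optimal per-machine weights follow in closed form
    b = a1 * rbf(x1, z, gamma) + a2 * rbf(x2, z, gamma)
    return z, b

A full simplification loop would repeatedly pick a promising pair of SVs (the paper's selection heuristic), replace it with the output of a routine like combine_pair, and stop once the desired number of reduced vectors, or a tolerated accuracy loss, is reached.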

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 799-808
Number of pages: 10
Volume: 5351 LNAI
ISBN (Print): 354089196X, 9783540891963
DOI: 10.1007/978-3-540-89197-0_74
Publication status: Published - 2008
Externally published: Yes
Event: 10th Pacific Rim International Conference on Artificial Intelligence, PRICAI 2008 - Hanoi
Duration: 2008 Dec 15 - 2008 Dec 19

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 5351 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Keywords

  • Kernel-based methods
  • Reduced set method
  • Support vector machines

ASJC Scopus subject areas

  • Computer Science(all)
  • Theoretical Computer Science

Cite this

Nguyen, D., Matsumoto, K., Hashimoto, K., Takishima, Y., Takatori, D., & Terabe, M. (2008). Multi-class support vector machine simplification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5351 LNAI, pp. 799-808). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 5351 LNAI). https://doi.org/10.1007/978-3-540-89197-0_74
