Non-local information for a mixture of multiple linear classifiers

Weite Li, Peifeng Liang, Xin Yuan, Takayuki Furuzuki

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

7 Citations (Scopus)

Abstract

For many problems in machine learning, the data are nonlinearly distributed. One popular way to tackle such data is to train a local kernel machine or a mixture of several locally linear models. However, both of these approaches rely heavily on local information, such as the neighbor relations of each data sample, to capture the potential data distribution. In this paper, we show that non-local information is more efficient for data representation. Using a winner-take-all autoencoder, several non-local templates are trained to trace the data distribution and to represent each sample in different subspaces with suitable weights. By training a linear model for each subspace in a divide-and-conquer manner, a single support vector machine can be formulated to solve nonlinear classification problems. Experimental results demonstrate that a mixture of multiple linear classifiers built from non-local information performs better than, or is at least competitive with, state-of-the-art mixtures of locally linear models.
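
To make the pipeline above concrete, the sketch below illustrates the general idea of a gated mixture of linear classifiers on a toy nonlinear dataset. It is a minimal sketch under stated assumptions, not the authors' implementation: k-means centroids stand in for the winner-take-all autoencoder templates, the distance-based gate is an assumed weighting scheme, and the per-subspace linear SVMs are fitted independently rather than through the paper's single joint SVM formulation.

```python
# Hypothetical sketch of a gated mixture of linear classifiers.
# Templates, gate, and expert names are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_moons
from sklearn.svm import LinearSVC

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)

# 1) Learn K non-local "templates" that trace the data distribution
#    (k-means centroids as a stand-in for autoencoder templates).
K = 4
templates = KMeans(n_clusters=K, n_init=10, random_state=0).fit(X).cluster_centers_

# 2) Soft gating weights: each sample is represented in K subspaces
#    with a weight derived from its similarity to each template.
def gate(X, templates, tau=1.0):
    d2 = ((X[:, None, :] - templates[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / tau)
    return w / w.sum(axis=1, keepdims=True)

G = gate(X, templates)

# 3) Divide and conquer: one linear classifier per subspace,
#    with samples weighted by their gate values.
experts = [LinearSVC(C=1.0).fit(X, y, sample_weight=G[:, k]) for k in range(K)]

# 4) Mixture decision: gate-weighted sum of the linear scores.
def predict(Xnew):
    Gnew = gate(Xnew, templates)
    scores = np.stack([clf.decision_function(Xnew) for clf in experts], axis=1)
    return ((Gnew * scores).sum(axis=1) > 0).astype(int)

print("training accuracy of the gated mixture:", (predict(X) == y).mean())
```

On this toy problem the gate-weighted combination of purely linear experts yields a nonlinear decision boundary, which is the effect the paper obtains with non-local templates inside a single SVM formulation.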

Original language: English
Title of host publication: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3741-3746
Number of pages: 6
Volume: 2017-May
ISBN (Electronic): 9781509061815
DOI: https://doi.org/10.1109/IJCNN.2017.7966327
Publication status: Published - 2017 Jun 30
Event: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Anchorage, United States
Duration: 2017 May 14 - 2017 May 19

Other

Other: 2017 International Joint Conference on Neural Networks, IJCNN 2017
Country: United States
City: Anchorage
Period: 17/5/14 - 17/5/19

Fingerprint

  • Classifiers
  • Support vector machines
  • Learning systems

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Li, W., Liang, P., Yuan, X., & Furuzuki, T. (2017). Non-local information for a mixture of multiple linear classifiers. In 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings (Vol. 2017-May, pp. 3741-3746). [7966327] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IJCNN.2017.7966327

@inproceedings{a7eb4f622ab1457f9a950dcb22c6a95f,
title = "Non-local information for a mixture of multiple linear classifiers",
abstract = "For many problems in machine learning fields, the data are nonlinearly distributed. One popular way to tackle this kind of data is training a local kernel machine or a mixture of several locally linear models. However, both of these approaches heavily relies on local information, such as neighbor relations of each data sample, to capture potential data distribution. In this paper, we show the non-local information is more efficient for data representation. With an implementation of a winner-take-all autoencoder, several non-local templates are trained to trace the data distribution and to represent each sample in different subspaces with a suitable weight. By training a linear model for each subspace in a divide and conquer manner, one single support vector machine can be formulated to solve nonlinear classification problems. Experimental results demonstrate that a mixture of multiple linear classifiers from non-local information performs better than or is at least competitive with state-of-the-art mixtures of locally linear models.",
author = "Weite Li and Peifeng Liang and Xin Yuan and Takayuki Furuzuki",
year = "2017",
month = "6",
day = "30",
doi = "10.1109/IJCNN.2017.7966327",
language = "English",
volume = "2017-May",
pages = "3741--3746",
booktitle = "2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
address = "United States",

}
