A mixture of multiple linear classifiers with sample weight and manifold regularization

Weite Li, Benhui Chen, Bo Zhou, Takayuki Furuzuki

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A mixture of multiple linear classifiers is known for its efficiency and effectiveness in tackling nonlinear classification problems. Each classifier consists of a linear function multiplied by a gating function, which restricts the classifier to a local region. Previous research has mainly focused on the partitioning of local regions, since its quality directly determines the performance of the mixture model. However, in real-world data sets, imbalanced data and insufficient labeled data are two frequently encountered problems; they also strongly influence the performance of the learned classifiers but are seldom considered or explored in the context of mixture models. In this paper, these missing components are introduced into the original formulation of mixture models: a sample weighting scheme for imbalanced data distributions and a manifold regularization term to leverage unlabeled data. Two closed-form solutions are then provided for parameter optimization. Experimental results at the end of the paper demonstrate the significance of the added components. As a result, a mixture of multiple linear classifiers can be extended to imbalanced and semi-supervised learning problems.
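For readers who want to see how the ingredients named in the abstract fit together, the Python sketch below is an illustrative reconstruction, not the authors' implementation. The mixture predicts f(x) = sum_k g_k(x) * (w_k^T x + b_k), where the gating function g_k confines expert k to a local region. Here the gates are fixed normalized-RBF responsibilities around chosen centers (the paper's gating function may differ), class imbalance is handled with inverse-frequency sample weights, unlabeled data enter through a kNN graph Laplacian penalty, and each expert's weight vector is obtained in closed form from a weighted, Laplacian-regularized least-squares solve. All function names and hyperparameters (sigma, lam_m, lam_r) are assumptions made for illustration.

import numpy as np

def class_balance_weights(y):
    # Inverse-frequency weights: samples from the minority class count more;
    # one simple instance of the abstract's sample weighting scheme.
    classes, counts = np.unique(y, return_counts=True)
    freq = dict(zip(classes, counts))
    return np.array([len(y) / (len(classes) * freq[v]) for v in y])

def knn_laplacian(X, k=5):
    # Unnormalized graph Laplacian L = D - W of a symmetric kNN graph,
    # built over labeled and unlabeled points for the manifold penalty.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d2[i])[1:k + 1]] = 1.0   # skip self at distance 0
    W = np.maximum(W, W.T)                        # symmetrize the graph
    return np.diag(W.sum(1)) - W

def rbf_gates(X, centers, sigma=1.0):
    # Gating functions: normalized RBF responsibilities that confine each
    # linear expert to the local region around its center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    G = np.exp(-d2 / (2.0 * sigma ** 2))
    return G / G.sum(1, keepdims=True)

def fit_mixture(Xl, yl, Xu, centers, sigma=1.0, lam_m=0.1, lam_r=1e-3):
    # Each expert k minimizes a gate- and sample-weighted squared loss plus
    # a Laplacian smoothness term, so its weights have a closed form.
    Xa = np.vstack([Xl, Xu])                      # labeled + unlabeled inputs
    A = np.hstack([Xa, np.ones((len(Xa), 1))])    # append a bias feature
    Al = A[:len(Xl)]
    L = knn_laplacian(Xa)
    s = class_balance_weights(yl)
    Gl = rbf_gates(Xl, centers, sigma)
    d = A.shape[1]
    Ws = []
    for k in range(len(centers)):
        D = np.diag(s * Gl[:, k])                 # imbalance weight x gate
        M = Al.T @ D @ Al + lam_m * (A.T @ L @ A) + lam_r * np.eye(d)
        Ws.append(np.linalg.solve(M, Al.T @ D @ yl))
    return np.column_stack(Ws)                    # one weight vector per expert

def predict(X, Ws, centers, sigma=1.0):
    A = np.hstack([X, np.ones((len(X), 1))])
    G = rbf_gates(X, centers, sigma)
    return np.sign((G * (A @ Ws)).sum(1))         # gated sum of linear experts

A toy usage, with randomly chosen gate centers standing in for whatever partitioning method the paper actually uses:

rng = np.random.default_rng(0)
Xl = rng.normal(size=(40, 2))
yl = np.sign(Xl[:, 0] * Xl[:, 1])                 # XOR-like, not linearly separable
Xu = rng.normal(size=(60, 2))                     # unlabeled points
centers = Xl[rng.choice(len(Xl), 4, replace=False)]
Ws = fit_mixture(Xl, yl, Xu, centers)
print("training accuracy:", (predict(Xl, Ws, centers) == yl).mean())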

Original language: English
Title of host publication: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3747-3752
Number of pages: 6
Volume: 2017-May
ISBN (Electronic): 9781509061815
DOI: 10.1109/IJCNN.2017.7966328
Publication status: Published - 2017 Jun 30
Event: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Anchorage, United States
Duration: 2017 May 14 - 2017 May 19

Other

Other: 2017 International Joint Conference on Neural Networks, IJCNN 2017
Country: United States
City: Anchorage
Period: 17/5/14 - 17/5/19

Fingerprint

  • Classifiers
  • Supervised learning

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Li, W., Chen, B., Zhou, B., & Furuzuki, T. (2017). A mixture of multiple linear classifiers with sample weight and manifold regularization. In 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings (Vol. 2017-May, pp. 3747-3752). [7966328] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IJCNN.2017.7966328
