Linear discriminant analysis with maximum correntropy criterion

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

Linear Discriminant Analysis (LDA) is a well-known supervised feature extraction method for subspace learning in computer vision and pattern recognition. In this paper, a novel LDA method based on a new Maximum Correntropy Criterion optimization technique is proposed. Conventional LDA, which is based on the L2-norm, is sensitive to the presence of outliers. The proposed method has several advantages: first, it is robust to large outliers; second, it is invariant to rotations; third, it can be solved effectively by a half-quadratic optimization algorithm, in which each iteration reduces the complex optimization problem to a quadratic problem that can be solved efficiently by a weighted eigenvalue optimization method. The proposed method can handle non-Gaussian noise and substantially reduce the influence of large outliers, resulting in robust classification. Performance assessment on several datasets shows that the proposed approach addresses the outlier issue more effectively than traditional methods.
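To make the optimization described above concrete, the following Python sketch shows one plausible reading of the half-quadratic scheme: each iteration forms weighted between- and within-class scatter matrices, solves a generalized (weighted) eigenvalue problem for the projection, and then recomputes Gaussian-kernel sample weights so that large outliers are down-weighted. This is an illustrative sketch only; the exact weighting rule, the bandwidth heuristic, and the helper name weighted_lda_mcc_sketch are assumptions, not the authors' algorithm from the paper.

import numpy as np
from scipy.linalg import eigh

def weighted_lda_mcc_sketch(X, y, n_components=2, sigma=None, n_iter=10):
    """Illustrative sketch of LDA with a correntropy-style sample reweighting.

    Not the paper's exact algorithm: the Gaussian-kernel weighting and the
    bandwidth heuristic below are assumptions made for illustration.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    classes = np.unique(y)
    n, d = X.shape
    w = np.ones(n)                      # per-sample weights, all equal at start

    for _ in range(n_iter):
        # Weighted class means and scatter matrices.
        mean_all = np.average(X, axis=0, weights=w)
        Sw = np.zeros((d, d))
        Sb = np.zeros((d, d))
        for c in classes:
            idx = (y == c)
            wc = w[idx]
            mu_c = np.average(X[idx], axis=0, weights=wc)
            Xc = X[idx] - mu_c
            Sw += (Xc * wc[:, None]).T @ Xc
            diff = (mu_c - mean_all)[:, None]
            Sb += wc.sum() * (diff @ diff.T)

        # "Weighted eigenvalue" step: top generalized eigenvectors of (Sb, Sw).
        evals, evecs = eigh(Sb, Sw + 1e-6 * np.eye(d))
        W = evecs[:, np.argsort(evals)[::-1][:n_components]]

        # Correntropy-style reweighting: samples far from their class mean in
        # the projected space get small weights, so large outliers contribute
        # little to the next iteration's scatter matrices.
        resid = np.empty(n)
        for c in classes:
            idx = (y == c)
            mu_c = np.average(X[idx], axis=0, weights=w[idx])
            resid[idx] = np.linalg.norm((X[idx] - mu_c) @ W, axis=1)
        s = sigma if sigma is not None else np.median(resid) + 1e-12
        w = np.exp(-(resid ** 2) / (2 * s ** 2))    # Gaussian kernel weights

    return W

For example, W = weighted_lda_mcc_sketch(X_train, y_train, n_components=2) returns a d-by-2 projection, and X_train @ W gives the reduced features.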

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 500-511
Number of pages: 12
Volume: 7724 LNCS
Edition: PART 1
DOIs: https://doi.org/10.1007/978-3-642-37331-2_38
Publication status: Published - 2013
Event: 11th Asian Conference on Computer Vision, ACCV 2012 - Daejeon
Duration: 2012 Nov 5 → 2012 Nov 9

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 7724 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 11th Asian Conference on Computer Vision, ACCV 2012
City: Daejeon
Period: 12/11/5 → 12/11/9

Fingerprint

  • Discriminant analysis
  • Outlier
  • Eigenvalue optimization
  • Non-Gaussian noise
  • Quadratic optimization
  • Performance assessment
  • Computer vision
  • Optimization techniques
  • Feature extraction
  • Pattern recognition
  • Optimization methods
  • Optimization algorithm
  • Subspace
  • Optimization problem
  • Iteration
  • Norm

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Zhou, W., & Kamata, S. (2013). Linear discriminant analysis with maximum correntropy criterion. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (PART 1 ed., Vol. 7724 LNCS, pp. 500-511). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 7724 LNCS, No. PART 1). https://doi.org/10.1007/978-3-642-37331-2_38

@inproceedings{48cc0d47accb47ae8f96a7dc221e5165,
title = "Linear discriminant analysis with maximum correntropy criterion",
abstract = "Linear Discriminant Analysis (LDA) is a well-known supervised feature extraction method for subspace learning in computer vision and pattern recognition. In this paper, a novel LDA method based on a new Maximum Correntropy Criterion optimization technique is proposed. Conventional LDA, which is based on the L2-norm, is sensitive to the presence of outliers. The proposed method has several advantages: first, it is robust to large outliers; second, it is invariant to rotations; third, it can be solved effectively by a half-quadratic optimization algorithm, in which each iteration reduces the complex optimization problem to a quadratic problem that can be solved efficiently by a weighted eigenvalue optimization method. The proposed method can handle non-Gaussian noise and substantially reduce the influence of large outliers, resulting in robust classification. Performance assessment on several datasets shows that the proposed approach addresses the outlier issue more effectively than traditional methods.",
author = "Wei Zhou and Seiichiro Kamata",
year = "2013",
doi = "10.1007/978-3-642-37331-2_38",
language = "English",
isbn = "9783642373305",
volume = "7724 LNCS",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
number = "PART 1",
pages = "500--511",
booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
edition = "PART 1",

}

TY - GEN

T1 - Linear discriminant analysis with maximum correntropy criterion

AU - Zhou, Wei

AU - Kamata, Seiichiro

PY - 2013

Y1 - 2013

N2 - Linear Discriminant Analysis (LDA) is a well-known supervised feature extraction method for subspace learning in computer vision and pattern recognition. In this paper, a novel LDA method based on a new Maximum Correntropy Criterion optimization technique is proposed. Conventional LDA, which is based on the L2-norm, is sensitive to the presence of outliers. The proposed method has several advantages: first, it is robust to large outliers; second, it is invariant to rotations; third, it can be solved effectively by a half-quadratic optimization algorithm, in which each iteration reduces the complex optimization problem to a quadratic problem that can be solved efficiently by a weighted eigenvalue optimization method. The proposed method can handle non-Gaussian noise and substantially reduce the influence of large outliers, resulting in robust classification. Performance assessment on several datasets shows that the proposed approach addresses the outlier issue more effectively than traditional methods.

AB - Linear Discriminant Analysis (LDA) is a well-known supervised feature extraction method for subspace learning in computer vision and pattern recognition. In this paper, a novel LDA method based on a new Maximum Correntropy Criterion optimization technique is proposed. Conventional LDA, which is based on the L2-norm, is sensitive to the presence of outliers. The proposed method has several advantages: first, it is robust to large outliers; second, it is invariant to rotations; third, it can be solved effectively by a half-quadratic optimization algorithm, in which each iteration reduces the complex optimization problem to a quadratic problem that can be solved efficiently by a weighted eigenvalue optimization method. The proposed method can handle non-Gaussian noise and substantially reduce the influence of large outliers, resulting in robust classification. Performance assessment on several datasets shows that the proposed approach addresses the outlier issue more effectively than traditional methods.

UR - http://www.scopus.com/inward/record.url?scp=84875905563&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84875905563&partnerID=8YFLogxK

U2 - 10.1007/978-3-642-37331-2_38

DO - 10.1007/978-3-642-37331-2_38

M3 - Conference contribution

AN - SCOPUS:84875905563

SN - 9783642373305

VL - 7724 LNCS

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 500

EP - 511

BT - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

ER -