A kernel approach to implementation of local linear discriminant analysis for face recognition

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

The multiple linear model has been used successfully to extend the linear model to nonlinear problems. However, conventional multilinear models fail to learn the global structure of a training data set because the local linear models are independent of each other. Furthermore, the local linear transformations are learned in the original space, so the performance of multilinear methods depends strongly on the partition results. This paper presents a kernel approach to implementing local linear discriminant analysis for face recognition problems. In the original space, we utilize a set of local linear transformations with interpolation to approximate an optimal global nonlinear transformation. Based on the local linear models in the original space, we derive an explicit kernel mapping that maps the training data into a high-dimensional transformed space, where the optimal transformation is learned globally. Experimental results show that the proposed method is more robust to the partition results than conventional multilinear methods. Compared with general nonlinear kernels that rely on a black-box mapping, the proposed method also reduces the negative effects caused by potential overfitting.
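
The abstract describes the construction only at a high level. The minimal Python sketch below illustrates the general idea it refers to: an explicit feature map built from local linear models combined through interpolation weights, followed by a single discriminant transformation learned globally in the mapped space. The k-means partition, the Gaussian interpolation weights, the block-concatenation feature map, and the use of scikit-learn's KMeans and LinearDiscriminantAnalysis are illustrative assumptions, not the paper's actual formulation.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def interpolation_weights(X, centers, sigma):
    # Soft membership of each sample in each local region (assumed Gaussian).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    return W / W.sum(axis=1, keepdims=True)

def explicit_feature_map(X, centers, sigma):
    # Map x to [w_1(x) * x, ..., w_K(x) * x, w(x)]; each block plays the role of
    # one local linear model, and the weights interpolate between the blocks.
    W = interpolation_weights(X, centers, sigma)                   # shape (n, K)
    blocks = [W[:, k:k + 1] * X for k in range(centers.shape[0])]
    return np.hstack(blocks + [W])                                 # shape (n, K*d + K)

# Toy usage with random stand-ins for face features and identity labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = rng.integers(0, 4, size=200)
centers = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X).cluster_centers_
Z = explicit_feature_map(X, centers, sigma=2.0)
lda = LinearDiscriminantAnalysis().fit(Z, y)   # one global transformation in the mapped space
Z_low = lda.transform(Z)                       # discriminant features for recognition

Because the mapping is explicit, the discriminant analysis is performed once, globally, on the mapped data rather than separately within each partition; a kernelized variant would work with inner products of these mapped features instead.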

Original language: English
Pages (from-to): 62-70
Number of pages: 9
Journal: IEEJ Transactions on Electrical and Electronic Engineering
Volume: 12
Issue number: 1
DOI: 10.1002/tee.22336
Publication status: Published - 2017 Jan 1

Keywords

  • generalized discriminant analysis
  • kernel
  • local linear discriminant analysis
  • local linear model with interpolation

ASJC Scopus subject areas

  • Electrical and Electronic Engineering

Cite this

@article{06d8a41a0a544583a67be9f32e4e53c8,
title = "A kernel approach to implementation of local linear discriminant analysis for face recognition",
abstract = "The multiple linear model is used successfully to extend the linear model to nonlinear problems. However, the conventional multilinear models fail to learn the global structure of a training data set because the local linear models are independent of each other. Furthermore, the local linear transformations are learned in the original space. Therefore, the performance of multilinear methods is strongly dependent on the results of partition. This paper presents a kernel approach for the implementation of the local linear discriminant analysis for face recognition problems. In the original space, we utilize a set of local linear transformations with interpolation to approximate an optimal global nonlinear transformation. Based on the local linear models in the original space, we derive an explicit kernel mapping to map the training data into a high-dimensional transformed space. The optimal transformation is learned globally in the transformed space. Experimental results show that the proposed method is more robust to the partition results than the conventional multilinear methods. Compared with the general nonlinear kernels that utilize a black-box mapping, our proposed method can reduce the negative effects caused by the potential overfitting problem.",
keywords = "generalized discriminant analysis, kernel, local linear discriminant analysis, local linear model with interpolation",
author = "Zhan Shi and Takayuki Furuzuki",
year = "2017",
month = "1",
day = "1",
doi = "10.1002/tee.22336",
language = "English",
volume = "12",
pages = "62--70",
journal = "IEEJ Transactions on Electrical and Electronic Engineering",
issn = "1931-4973",
publisher = "John Wiley and Sons Inc.",
number = "1",

}
