A kernel approach to implementation of local linear discriminant analysis for face recognition

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

Multilinear models have been used successfully to extend linear models to nonlinear problems. However, conventional multilinear models fail to learn the global structure of a training data set because the local linear models are learned independently of one another. Furthermore, the local linear transformations are learned in the original space, so the performance of multilinear methods depends strongly on the partition results. This paper presents a kernel approach to implementing local linear discriminant analysis for face recognition. In the original space, we utilize a set of local linear transformations with interpolation to approximate an optimal global nonlinear transformation. Based on the local linear models in the original space, we derive an explicit kernel mapping that maps the training data into a high-dimensional transformed space, where the optimal transformation is learned globally. Experimental results show that the proposed method is more robust to the partition results than conventional multilinear methods. Compared with general nonlinear kernels, which rely on a black-box mapping, the proposed method reduces the negative effects of potential overfitting.
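The core idea, that interpolated local linear transformations in the original space are equivalent to one global linear transformation in an explicitly lifted space, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the local centers, the Gaussian interpolation weights, the `gamma` parameter, and the least-squares discriminant (a simple stand-in for discriminant analysis) are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data separated by a curved (nonlinear) boundary.
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 2))
y = (X[:, 1] > np.sin(2.5 * X[:, 0])).astype(int)

# Hypothetical local-model centers; in practice these would come from
# partitioning the training data (e.g. by clustering).
centers = np.array([[-0.5, 0.0], [0.0, 0.0], [0.5, 0.0]])

def lift(X, centers, gamma=4.0):
    """Explicit kernel map induced by local linear models with interpolation.

    Each sample x is represented by interpolation-weighted copies of [x, 1],
    one copy per local model, so a single global linear transformation in
    the lifted space acts like interpolated local affine transformations
    in the original space.
    """
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-gamma * d2)                     # assumed Gaussian weights
    w /= w.sum(axis=1, keepdims=True)           # interpolation weights sum to 1
    Xa = np.hstack([X, np.ones((len(X), 1))])   # affine local models
    return (w[:, :, None] * Xa[:, None, :]).reshape(len(X), -1)

Z = lift(X, centers)                            # shape (n, K * (d + 1))

# One global linear discriminant learned in the lifted space
# (least-squares classifier as a stand-in for discriminant analysis).
t = 2.0 * y - 1.0
beta, *_ = np.linalg.lstsq(Z, t, rcond=None)
acc = ((Z @ beta > 0).astype(int) == y).mean()
```

Because all local models are coupled through the single global solve for `beta`, the fit reflects the global structure of the data rather than independent per-partition fits, which is the robustness-to-partitioning property the abstract describes.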

Original language: English
Pages (from-to): 62-70
Number of pages: 9
Journal: IEEJ Transactions on Electrical and Electronic Engineering
Volume: 12
Issue number: 1
Publication status: Published - 1 January 2017

Keywords

  • generalized discriminant analysis
  • kernel
  • local linear discriminant analysis
  • local linear model with interpolation

ASJC Scopus subject areas

  • Electrical and Electronic Engineering

