Oversampling the minority class in a multi-linear feature space for imbalanced data classification

Peifeng Liang, Weite Li, Takayuki Furuzuki

Research output: Contribution to journal › Article

1 Citation (Scopus)


This paper proposes a novel oversampling method for imbalanced data classification, in which minority class samples are synthesized in a feature space so that the generated samples do not fall into majority class regions. For this purpose, it introduces a multi-linear feature space (MLFS) based on a quasi-linear kernel composed from a pretrained neural network (NN). By using the quasi-linear kernel, the proposed MLFS oversampling method avoids directly computing Euclidean distances among samples and mapping samples to the high-dimensional feature space when oversampling the minority class, which makes it easy to apply to the classification of high-dimensional datasets. Moreover, by using kernel learning instead of representation learning with the NN, unsupervised learning, or even transfer learning, can easily be employed for pretraining the NN, because a kernel is usually less dependent on a specific problem; this makes it possible to avoid considering the imbalance problem at the pretraining stage. Finally, a method is developed to synthesize minority samples by computing the quasi-linear kernel matrix instead of computing the very high-dimensional MLFS feature vectors directly. The proposed MLFS oversampling method is applied to several real-world datasets, including an image dataset, and simulation results confirm its effectiveness.
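The kernel-matrix trick described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a generic kernel (a linear kernel stands in for the paper's quasi-linear kernel), and the names `feature_space_sq_dist`, `synthetic_kernel_row`, and `kernel_smote` are hypothetical. The key identities are that squared feature-space distances follow from the kernel matrix alone, and that a synthetic point interpolated between two minority samples in feature space has kernel values that are the same interpolation of the originals' kernel rows, so no explicit feature vectors are ever formed.

```python
import numpy as np

def feature_space_sq_dist(K):
    """Pairwise squared distances in the implicit feature space,
    from the kernel matrix alone:
    ||phi(x_i) - phi(x_j)||^2 = K[i,i] - 2*K[i,j] + K[j,j]."""
    d = np.diag(K)
    return d[:, None] - 2.0 * K + d[None, :]

def synthetic_kernel_row(K_min_all, i, j, lam):
    """Kernel values between a synthetic feature-space point
    z = phi(x_i) + lam * (phi(x_j) - phi(x_i)) and every training
    sample, by linearity of the inner product:
    k(z, x) = (1 - lam) * k(x_i, x) + lam * k(x_j, x)."""
    return (1.0 - lam) * K_min_all[i] + lam * K_min_all[j]

def kernel_smote(K_min, K_min_all, n_new, rng=None):
    """SMOTE-style oversampling carried out entirely in kernel space.
    K_min:     kernel matrix among the minority samples.
    K_min_all: kernel between minority samples and the full training set.
    Returns kernel rows for n_new synthetic minority points, ready to be
    appended to the training kernel matrix of a kernel classifier (SVM)."""
    rng = np.random.default_rng(rng)
    n = K_min.shape[0]
    d2 = feature_space_sq_dist(K_min)
    np.fill_diagonal(d2, np.inf)  # exclude self-distances
    rows = []
    for _ in range(n_new):
        i = int(rng.integers(n))
        j = int(np.argmin(d2[i]))      # nearest minority neighbour in feature space
        lam = rng.uniform(0.0, 1.0)    # interpolation position along the segment
        rows.append(synthetic_kernel_row(K_min_all, i, j, lam))
    return np.array(rows)
```

With a linear kernel `K = X @ X.T`, `feature_space_sq_dist` reproduces ordinary Euclidean distances, which makes the identities easy to verify; with a nonlinear kernel such as the paper's quasi-linear kernel, the same code oversamples in the induced feature space without materializing it.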

Original language: English
Journal: IEEJ Transactions on Electrical and Electronic Engineering
Publication status: Accepted/In press - 2018 Jan 1



Keywords

  • Imbalanced data classification
  • Kernel composition
  • Multi-linear feature space
  • Oversampling
  • Support vector machine

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
