A multilayer gated bilinear classifier: From optimizing a deep rectified network to a support vector machine

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

A deep neural network (DNN) is called a deep rectified network (DRN) if it uses Rectified Linear Units (ReLUs) as its activation function. In this paper, we show that the parameters of a DRN play two roles simultaneously: they determine which subnetwork each input activates, and they also serve as the parameters of those subnetworks. This observation leads us to propose a method that combines a DNN and an SVM into a deep classifier. Given a DRN trained by a common tuning algorithm, a multilayer gated bilinear classifier is designed to mimic its functionality. Its parameter set is duplicated into two independent sets that play different roles. One set generates gate signals that determine the subnetwork corresponding to each input, and is kept fixed while the classifier is optimized. The other set serves as the parameters of the subnetworks, which are linear classifiers, so these parameters can be implicitly optimized with SVM optimization. Since the DRN only generates gate signals, our experiments show that it can be trained by supervised or unsupervised learning, and even by transfer learning.
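To make the gate/parameter decoupling described above concrete, here is a minimal one-hidden-layer sketch in Python; it is our own illustrative reconstruction, not the authors' code. For a ReLU layer, the score w2 · relu(W1 x) equals the sum over units of g_j(x) · w2_j · (W1[j] · x), where g_j(x) = 1[W1[j] · x > 0] are the gate signals. If the gates come from a frozen copy of the network, the score is linear in the lifted feature map phi(x) = vec(g(x) x^T), so the remaining subnetwork parameters can be fit with an ordinary linear SVM. All names (W1_frozen, gates, lifted_features) and the toy data are hypothetical.

import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical toy data: two Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Frozen gate network: a random (or separately pre-trained) ReLU layer with H units.
H = 8
W1_frozen = rng.normal(size=(H, 2))

def gates(X):
    # Binary gate signals: which ReLU units are active for each input.
    return (X @ W1_frozen.T > 0).astype(float)               # shape (n, H)

def lifted_features(X):
    # phi(x) = vec(g(x) outer x): with gates fixed, the score
    # sum_j g_j(x) * (w_j . x) is linear in these features.
    G = gates(X)                                              # (n, H)
    return (G[:, :, None] * X[:, None, :]).reshape(len(X), -1)  # (n, H*2)

# With gate signals fixed, optimizing the subnetworks' linear
# classifiers reduces to a standard linear SVM in the lifted space.
svm = LinearSVC(C=1.0).fit(lifted_features(X), y)
print("train accuracy:", svm.score(lifted_features(X), y))

This captures only the single-layer decoupling; the paper's multilayer gated bilinear classifier composes such gated layers, with the frozen DRN (trained by supervised, unsupervised, or transfer learning) supplying the gate signals at every layer.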

Original language: English
Title of host publication: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 140-146
Number of pages: 7
Volume: 2017-May
ISBN (Electronic): 9781509061815
DOIs: 10.1109/IJCNN.2017.7965847
Publication status: Published - 2017 Jun 30
Event: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Anchorage, United States
Duration: 2017 May 14 – 2017 May 19

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Li, W., & Furuzuki, T. (2017). A multilayer gated bilinear classifier: From optimizing a deep rectified network to a support vector machine. In 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings (Vol. 2017-May, pp. 140-146). Article 7965847. Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IJCNN.2017.7965847
