A deep quasi-linear kernel composition method for support vector machines

Weite Li, Takayuki Furuzuki, Benhui Chen

Research output: Conference contribution

4 Citations (Scopus)

Abstract

In this paper, we introduce a data-dependent kernel, called the deep quasi-linear kernel, which can directly benefit from a pre-trained feedforward deep network. First, a multi-layer gated bilinear classifier is formulated to mimic the functionality of a feedforward neural network. The only difference between the two is that the activation values of the hidden units in the multi-layer gated bilinear classifier depend on a pre-trained neural network rather than on a pre-defined activation function. Second, we demonstrate the equivalence between the multi-layer gated bilinear classifier and an SVM with a deep quasi-linear kernel. By deriving a kernel composition function, traditional optimization algorithms for a kernel SVM can be applied directly to implicitly optimize the parameters of the multi-layer gated bilinear classifier. Experimental results on different data sets show that the proposed classifier outperforms both an SVM with an RBF kernel and the pre-trained feedforward deep network.
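The abstract describes composing a data-dependent kernel from the gate activations of a pre-trained network. The sketch below illustrates the general idea only; the exact composition rule, the sigmoid gating, the normalization, and all function names are illustrative assumptions, not the formula derived in the paper. A base linear kernel is modulated, layer by layer, by the similarity of gate vectors taken from (stand-in) pre-trained hidden layers:

```python
import numpy as np

def gate_vector(x, W, b):
    # Soft gate activations for one hidden layer: sigmoid of the
    # pre-activations of a (stand-in) pre-trained layer (W, b).
    return 1.0 / (1.0 + np.exp(-(W @ x + b)))

def deep_quasi_linear_kernel(x1, x2, layers):
    """Illustrative layer-wise kernel composition.

    Starts from a linear kernel on the inputs and multiplies in a
    normalized gate-similarity factor per layer, feeding each layer's
    gates forward as the next layer's inputs."""
    k = x1 @ x2 + 1.0  # base linear kernel
    for W, b in layers:
        g1, g2 = gate_vector(x1, W, b), gate_vector(x2, W, b)
        k *= (g1 @ g2) / len(g1)  # normalized gate similarity in (0, 1)
        x1, x2 = g1, g2           # propagate gates to the next layer
    return k

# Toy Gram matrix on random data with two random "pre-trained" layers.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
layers = [(rng.normal(size=(4, 3)), rng.normal(size=4)),
          (rng.normal(size=(2, 4)), rng.normal(size=2))]
K = np.array([[deep_quasi_linear_kernel(X[i], X[j], layers)
               for j in range(5)] for i in range(5)])
```

A Gram matrix built this way could be passed to any kernel SVM solver that accepts a precomputed kernel, which is what lets standard SVM optimization implicitly tune the gated classifier's parameters.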

Original language: English
Title of host publication: 2016 International Joint Conference on Neural Networks, IJCNN 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1639-1645
Number of pages: 7
Volume: 2016-October
ISBN (Electronic): 9781509006199
DOI
Publication status: Published - 31 Oct 2016
Event: 2016 International Joint Conference on Neural Networks, IJCNN 2016 - Vancouver, Canada
Duration: 24 Jul 2016 → 29 Jul 2016

Other

Other: 2016 International Joint Conference on Neural Networks, IJCNN 2016
Country: Canada
City: Vancouver
Period: 16/7/24 → 16/7/29

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

