Sparse ternary connect

Convolutional neural networks using ternarized weights with enhanced sparsity

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Convolutional Neural Networks (CNNs) are indispensable for achieving state-of-the-art results in a wide range of tasks. In this work, we exploit ternary weights in both the inference and training of CNNs and further propose Sparse Ternary Connect (STC), in which floating-point kernel weights are converted to 1, -1, and 0 by a new conversion rule with a controlled ratio of zeros. STC saves considerable hardware resources with only a small degradation in precision. Experimental evaluation on two popular datasets (CIFAR-10 and SVHN) shows that the proposed method reduces resource utilization (by 28.9% of LUTs, 25.3% of FFs, 97.5% of DSPs, and 88.7% of BRAM on a Xilinx Kintex-7 FPGA) with less than 0.5% accuracy loss.
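The abstract does not reproduce the paper's exact conversion rule, but the idea of mapping float weights to {1, -1, 0} with a controlled ratio of zeros can be sketched as follows. This is an illustrative magnitude-threshold rule only; the function name, the `zero_ratio` parameter, and the thresholding scheme are assumptions, not the authors' published method (which may, e.g., be stochastic).

```python
def ternarize(weights, zero_ratio=0.5):
    """Map float weights to {-1, 0, 1}, zeroing roughly the fraction
    `zero_ratio` of smallest-magnitude entries (illustrative rule only).

    weights: flat list of floats; returns a list of ints in {-1, 0, 1}.
    """
    mags = sorted(abs(w) for w in weights)
    k = int(round(zero_ratio * len(weights)))   # how many weights to zero out
    # Largest magnitude that still falls inside the zeroing budget.
    threshold = mags[k - 1] if k > 0 else -1.0
    out, zeroed = [], 0
    for w in weights:
        if zeroed < k and abs(w) <= threshold:
            out.append(0)                       # small-magnitude weight -> 0
            zeroed += 1
        elif w >= 0:
            out.append(1)                       # remaining weights keep their sign
        else:
            out.append(-1)
    return out

print(ternarize([0.8, -0.05, 0.3, -0.6], zero_ratio=0.5))  # → [1, 0, 0, -1]
```

The explicit `zeroed` counter keeps the realized sparsity at the requested ratio even when several weights tie at the threshold, which matches the abstract's notion of a "controlled ratio of 0".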

Original language: English
Title of host publication: ASP-DAC 2018 - 23rd Asia and South Pacific Design Automation Conference, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 190-195
Number of pages: 6
Volume: 2018-January
ISBN (Electronic): 9781509006021
DOIs: 10.1109/ASPDAC.2018.8297304
Publication status: Published - 2018 Feb 20
Event: 23rd Asia and South Pacific Design Automation Conference, ASP-DAC 2018 - Jeju, Korea, Republic of
Duration: 2018 Jan 22 - 2018 Jan 25

Other

Other: 23rd Asia and South Pacific Design Automation Conference, ASP-DAC 2018
Country: Korea, Republic of
City: Jeju
Period: 18/1/22 - 18/1/25

Fingerprint

  • Neural networks
  • Field programmable gate arrays (FPGA)
  • Hardware
  • Degradation

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design

Cite this

Jin, C., Sun, H., & Kimura, S. (2018). Sparse ternary connect: Convolutional neural networks using ternarized weights with enhanced sparsity. In ASP-DAC 2018 - 23rd Asia and South Pacific Design Automation Conference, Proceedings (Vol. 2018-January, pp. 190-195). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ASPDAC.2018.8297304

Sparse ternary connect : Convolutional neural networks using ternarized weights with enhanced sparsity. / Jin, Canran; Sun, Heming; Kimura, Shinji.

ASP-DAC 2018 - 23rd Asia and South Pacific Design Automation Conference, Proceedings. Vol. 2018-January Institute of Electrical and Electronics Engineers Inc., 2018. p. 190-195.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Jin, C, Sun, H & Kimura, S 2018, Sparse ternary connect: Convolutional neural networks using ternarized weights with enhanced sparsity. in ASP-DAC 2018 - 23rd Asia and South Pacific Design Automation Conference, Proceedings. vol. 2018-January, Institute of Electrical and Electronics Engineers Inc., pp. 190-195, 23rd Asia and South Pacific Design Automation Conference, ASP-DAC 2018, Jeju, Korea, Republic of, 18/1/22. https://doi.org/10.1109/ASPDAC.2018.8297304
Jin C, Sun H, Kimura S. Sparse ternary connect: Convolutional neural networks using ternarized weights with enhanced sparsity. In ASP-DAC 2018 - 23rd Asia and South Pacific Design Automation Conference, Proceedings. Vol. 2018-January. Institute of Electrical and Electronics Engineers Inc. 2018. p. 190-195 https://doi.org/10.1109/ASPDAC.2018.8297304
Jin, Canran ; Sun, Heming ; Kimura, Shinji. / Sparse ternary connect : Convolutional neural networks using ternarized weights with enhanced sparsity. ASP-DAC 2018 - 23rd Asia and South Pacific Design Automation Conference, Proceedings. Vol. 2018-January Institute of Electrical and Electronics Engineers Inc., 2018. pp. 190-195
@inproceedings{18cb26a828374eec96a226e40225ea6a,
title = "Sparse ternary connect: Convolutional neural networks using ternarized weights with enhanced sparsity",
abstract = "Convolutional Neural Networks (CNNs) are indispensable in a wide range of tasks to achieve state-of-the-art results. In this work, we exploit ternary weights in both inference and training of CNNs and further propose Sparse Ternary Connect (STC) where kernel weights in float value are converted to 1, -1 and 0 based on a new conversion rule with the controlled ratio of 0. STC can save hardware resource a lot with small degradation of precision. The experimental evaluation on 2 popular datasets (CIFAR-10 and SVHN) shows that the proposed method can reduce resource utilization (by 28.9{\%} of LUT, 25.3{\%} of FF, 97.5{\%} of DSP and 88.7{\%} of BRAM on Xilinx Kintex-7 FPGA) with less than 0.5{\%} accuracy loss.",
author = "Canran Jin and Heming Sun and Shinji Kimura",
year = "2018",
month = "2",
day = "20",
doi = "10.1109/ASPDAC.2018.8297304",
language = "English",
volume = "2018-January",
pages = "190--195",
booktitle = "ASP-DAC 2018 - 23rd Asia and South Pacific Design Automation Conference, Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

TY - GEN

T1 - Sparse ternary connect

T2 - Convolutional neural networks using ternarized weights with enhanced sparsity

AU - Jin, Canran

AU - Sun, Heming

AU - Kimura, Shinji

PY - 2018/2/20

Y1 - 2018/2/20

N2 - Convolutional Neural Networks (CNNs) are indispensable in a wide range of tasks to achieve state-of-the-art results. In this work, we exploit ternary weights in both inference and training of CNNs and further propose Sparse Ternary Connect (STC) where kernel weights in float value are converted to 1, -1 and 0 based on a new conversion rule with the controlled ratio of 0. STC can save hardware resource a lot with small degradation of precision. The experimental evaluation on 2 popular datasets (CIFAR-10 and SVHN) shows that the proposed method can reduce resource utilization (by 28.9% of LUT, 25.3% of FF, 97.5% of DSP and 88.7% of BRAM on Xilinx Kintex-7 FPGA) with less than 0.5% accuracy loss.

AB - Convolutional Neural Networks (CNNs) are indispensable in a wide range of tasks to achieve state-of-the-art results. In this work, we exploit ternary weights in both inference and training of CNNs and further propose Sparse Ternary Connect (STC) where kernel weights in float value are converted to 1, -1 and 0 based on a new conversion rule with the controlled ratio of 0. STC can save hardware resource a lot with small degradation of precision. The experimental evaluation on 2 popular datasets (CIFAR-10 and SVHN) shows that the proposed method can reduce resource utilization (by 28.9% of LUT, 25.3% of FF, 97.5% of DSP and 88.7% of BRAM on Xilinx Kintex-7 FPGA) with less than 0.5% accuracy loss.

UR - http://www.scopus.com/inward/record.url?scp=85045315293&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85045315293&partnerID=8YFLogxK

U2 - 10.1109/ASPDAC.2018.8297304

DO - 10.1109/ASPDAC.2018.8297304

M3 - Conference contribution

VL - 2018-January

SP - 190

EP - 195

BT - ASP-DAC 2018 - 23rd Asia and South Pacific Design Automation Conference, Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

ER -