Sparse ternary connect: Convolutional neural networks using ternarized weights with enhanced sparsity

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

Convolutional Neural Networks (CNNs) are indispensable for achieving state-of-the-art results across a wide range of tasks. In this work, we exploit ternary weights in both the inference and training of CNNs and propose Sparse Ternary Connect (STC), in which floating-point kernel weights are converted to 1, -1, and 0 according to a new conversion rule with a controlled ratio of zeros. STC substantially reduces hardware resource usage with only a small degradation in precision. Experimental evaluation on two popular datasets (CIFAR-10 and SVHN) shows that the proposed method reduces resource utilization (by 28.9% of LUTs, 25.3% of FFs, 97.5% of DSPs, and 88.7% of BRAM on a Xilinx Kintex-7 FPGA) with less than 0.5% accuracy loss.
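The core idea in the abstract, converting float kernel weights to {-1, 0, +1} with a controlled fraction of zeros, can be illustrated with a minimal sketch. The magnitude-quantile threshold below is a hypothetical conversion rule chosen for illustration; the paper's actual rule is not reproduced here.

```python
import numpy as np

def ternarize(weights, zero_ratio=0.5):
    """Convert float weights to {-1, 0, +1}, zeroing roughly
    `zero_ratio` of the weights (a hypothetical rule; the paper's
    exact conversion rule may differ)."""
    # Pick a magnitude threshold so that about `zero_ratio` of the
    # smallest-magnitude weights fall below it and become zero.
    threshold = np.quantile(np.abs(weights), zero_ratio)
    ternary = np.sign(weights)
    ternary[np.abs(weights) < threshold] = 0.0
    return ternary

w = np.array([0.8, -0.05, 0.3, -0.9, 0.01, -0.4])
t = ternarize(w, zero_ratio=1/3)
# t is [1, 0, 1, -1, 0, -1]; a third of the entries are zero.
```

Controlling the zero ratio directly is what enables the resource savings reported above: zero weights need no multiply-accumulate logic, and the remaining ±1 weights reduce multiplications to sign flips, which is why DSP usage drops the most.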

Original language: English
Title of host publication: ASP-DAC 2018 - 23rd Asia and South Pacific Design Automation Conference, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 190-195
Number of pages: 6
ISBN (Electronic): 9781509006021
DOIs
Publication status: Published - 2018 Feb 20
Event: 23rd Asia and South Pacific Design Automation Conference, ASP-DAC 2018 - Jeju, Korea, Republic of
Duration: 2018 Jan 22 - 2018 Jan 25

Publication series

Name: Proceedings of the Asia and South Pacific Design Automation Conference, ASP-DAC
Volume: 2018-January

Other

Other: 23rd Asia and South Pacific Design Automation Conference, ASP-DAC 2018
Country/Territory: Korea, Republic of
City: Jeju
Period: 18/1/22 - 18/1/25

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design
