A Low-Cost Training Method of ReRAM Inference Accelerator Chips for Binarized Neural Networks to Recover Accuracy Degradation due to Statistical Variabilities

Zian Chen, Takashi Ohsawa

Research output: Contribution to journal › Article › peer-review

Abstract

A new software-based in-situ training (SBIST) method is proposed to achieve high accuracy in binarized neural network inference accelerator chips, in which measured offsets in the sense amplifiers (activation binarizers) are transformed into biases in the training software. To expedite this individual training, the initial weight values are taken from the results of a common forming training process conducted in advance using the offset fluctuation distribution averaged over the fabrication line. SPICE simulation inference results for the accelerator predict that the accuracy recovers to higher than 90%, even when the amplifier offset is as large as 40 mV, after only a few epochs of the individual training.
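The core idea of the abstract, measured sense-amplifier offsets being folded into the training software as per-neuron biases, can be illustrated with a minimal sketch. Everything below (layer sizes, the voltage scale, the offset spread) is a hypothetical stand-in, not the paper's actual network or circuit model; it only shows the mechanism of compensating a shifted binarizer threshold with a matching bias during training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes; the paper's actual network and dataset are not given here.
n_in, n_out = 64, 10

# Per-column sense-amplifier offsets (volts). In the SBIST flow these are
# measured on the individual chip; here they are drawn with a 40 mV-scale
# spread purely for illustration.
offsets = rng.normal(0.0, 0.040, size=n_out)

# Binarized weights in {-1, +1}, as in a BNN crossbar.
weights = np.sign(rng.standard_normal((n_in, n_out)))

def binarize_activation(preact_voltage, threshold):
    """A real sense amplifier thresholds the column voltage at its own
    (offset) threshold; an ideal amplifier would threshold at 0 V."""
    return np.where(preact_voltage > threshold, 1.0, -1.0)

def forward_train(x, w, measured_offsets, volts_per_unit=0.001):
    """Forward pass used by the training software: by binarizing against
    the measured offsets instead of an ideal 0 V threshold, training sees
    the same non-ideal binarizers as the physical chip, so the learned
    weights absorb the offset errors."""
    preact = (x @ w) * volts_per_unit  # crude model of the column voltage
    return binarize_activation(preact, measured_offsets)

x = np.sign(rng.standard_normal(n_in))
y = forward_train(x, weights, offsets)
```

In this toy picture, "individual training" amounts to running a few epochs with `forward_train` using the chip's own `offsets`, starting from weights pre-trained with the fabrication line's average offset distribution.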

Original language: English
Pages (from-to): 375-384
Number of pages: 10
Journal: IEICE Transactions on Electronics
Volume: E105.C
Issue number: 8
DOIs
Publication status: Published - 2022 Aug

Keywords

  • binarized neural networks (BNNs)
  • deep neural networks (DNNs)
  • fabrication fluctuation
  • in-memory computing
  • in-situ training
  • ReRAM

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Electrical and Electronic Engineering
