Mixed-precision architecture based on computational memory for training deep neural networks

S. R. Nandakumar, Manuel Le Gallo, Irem Boybat, Bipin Rajendran, Abu Sebastian, Evangelos Eleftheriou

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

49 Citations (Scopus)

Abstract

Deep neural networks (DNNs) have revolutionized the field of machine learning by providing unprecedented human-like performance in solving many real-world problems such as image or speech recognition. Training of large DNNs, however, is a computationally intensive task, and this necessitates the development of novel computing architectures targeting this application. A computational memory unit in which resistive memory devices are organized in crossbar arrays can be used to store the synaptic weights in their conductance states. The expensive multiply-accumulate operations can then be performed in place using Kirchhoff's circuit laws in a non-von Neumann manner. However, a key challenge remains: the inability to alter the conductance states of the devices reliably during the weight-update process. We propose a mixed-precision architecture that combines a computational memory unit storing the synaptic weights with a digital processing unit and an additional memory unit that stores the accumulated weight updates in high precision. The new architecture delivers classification accuracies comparable to those of floating-point implementations without being constrained by the non-ideal weight-update characteristics of emerging resistive memories. A two-layer neural network whose computational memory unit is realized using nonlinear stochastic models of phase-change memory achieves a test accuracy of 97.40% on the MNIST digit-classification problem.
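The scheme described in the abstract can be illustrated with a short sketch. The Python/NumPy snippet below is a minimal, hypothetical rendering of the idea, not the paper's implementation: the crossbar's in-place multiply-accumulate is modeled as a matrix-vector product over a conductance matrix `G`, and weight updates are accumulated in a high-precision variable `chi`, with the devices programmed only in whole multiples of an assumed update granularity `eps`. The class name, array sizes, and the value of `eps` are illustrative assumptions.

```python
import numpy as np

class MixedPrecisionLayer:
    """Sketch of one crossbar layer under a mixed-precision training scheme."""

    def __init__(self, n_in, n_out, eps=0.01, seed=0):
        rng = np.random.default_rng(seed)
        # Synaptic weights stored as device conductances in the crossbar.
        self.G = rng.uniform(-0.1, 0.1, size=(n_out, n_in))
        # High-precision accumulator for weight updates (digital unit).
        self.chi = np.zeros((n_out, n_in))
        # Assumed smallest conductance change one programming pulse realizes.
        self.eps = eps

    def forward(self, x):
        # The crossbar computes this multiply-accumulate in place:
        # Ohm's law gives per-device currents G[i, j] * x[j], and
        # Kirchhoff's current law sums them along each output line.
        return self.G @ x

    def update(self, grad, lr=0.1):
        # Accumulate the desired weight update in high precision...
        self.chi -= lr * grad
        # ...and commit it to the devices only in whole multiples of eps.
        pulses = np.trunc(self.chi / self.eps)   # signed pulse counts
        self.G += pulses * self.eps              # program the devices
        self.chi -= pulses * self.eps            # carry sub-eps remainder forward
```

In the paper's setting, the device updates would additionally follow nonlinear, stochastic phase-change-memory models rather than the ideal `eps` steps assumed here; the architecture's point is that accuracy remains close to floating point because sub-granularity updates are never lost but carried forward in the high-precision accumulator.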

Original language: English
Title of host publication: 2018 IEEE International Symposium on Circuits and Systems, ISCAS 2018 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781538648810
DOIs
Publication status: Published - 26 Apr 2018
Event: 2018 IEEE International Symposium on Circuits and Systems, ISCAS 2018 - Florence, Italy
Duration: 27 May 2018 → 30 May 2018

Publication series

Name: Proceedings - IEEE International Symposium on Circuits and Systems
Volume: 2018-May
ISSN (Print): 0271-4310

Conference

Conference: 2018 IEEE International Symposium on Circuits and Systems, ISCAS 2018
Country/Territory: Italy
City: Florence
Period: 27/05/2018 → 30/05/2018

Keywords

  • Deep learning
  • In-memory computing
  • Mixed-precision computing
  • Phase-change memory
