
Improving Realistic Worst-Case Performance of NVCiM DNN Accelerators Through Training with Right-Censored Gaussian Noise

Publication Date: 2 November 2023


Abstract

Compute-in-Memory (CiM), built upon non-volatile memory (NVM) devices, is promising for accelerating deep neural networks (DNNs) owing to its in-situ data processing capability and superior energy efficiency. To combat device variations, noise injection training is commonly used: weights are perturbed with Gaussian noise during training to make the model more robust to weight variations. Despite its prevalence, however, existing successes are mostly empirical, with very little theoretical support; even the most fundamental questions, such as why Gaussian rather than other types of noise should be used, remain unanswered. In this work, by formally analyzing the effect of injecting Gaussian noise during training on the k-th percentile performance (KPP), a realistic worst-case performance metric, we provide for the first time a theoretical justification of the approach's effectiveness. We further show that, surprisingly, Gaussian noise is not the best option, contrary to what has been taken for granted in the literature: a right-censored Gaussian noise significantly improves the KPP of DNNs. We also propose an automated method to determine the optimal hyperparameters for injecting this right-censored Gaussian noise during training. Our method achieves up to a 26% improvement in KPP compared to state-of-the-art methods for enhancing DNN robustness under device variations.
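To make the core idea concrete, below is a minimal PyTorch-style sketch of noise injection training with right-censored Gaussian noise, in which Gaussian samples above a cutoff are clipped to that cutoff before being added to the weights. The hyperparameters `sigma` and `cut` are purely illustrative placeholders, not the values the paper derives (the paper determines them automatically), and the training-loop structure is an assumption about a typical implementation rather than the authors' code.

```python
import torch


def right_censored_gaussian_like(w, sigma, cut):
    """Sample zero-mean Gaussian noise with std `sigma`, then
    right-censor it: values above `cut` are clipped to `cut`,
    while the left tail is left intact."""
    noise = torch.randn_like(w) * sigma
    return torch.clamp(noise, max=cut)


def train_step(model, x, y, loss_fn, opt, sigma=0.1, cut=0.05):
    """One noise-injection training step (illustrative sketch):
    perturb weights, compute gradients under the perturbation,
    then restore clean weights before the optimizer update."""
    backups = []
    with torch.no_grad():
        for p in model.parameters():
            backups.append(p.detach().clone())
            p.add_(right_censored_gaussian_like(p, sigma, cut))

    loss = loss_fn(model(x), y)
    opt.zero_grad()
    loss.backward()

    with torch.no_grad():
        for p, b in zip(model.parameters(), backups):
            p.copy_(b)  # restore clean weights; gradients remain
    opt.step()
    return loss.item()
```

Restoring the clean weights before `opt.step()` follows the standard noise-injection recipe: each step sees a freshly perturbed copy of the weights, so the optimizer accumulates robustness to weight variations without the noise itself drifting into the stored parameters.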

Affiliations: University of Notre Dame; North Carolina State University