
ARTS: an Adaptive Regularization Training Schedule for Activation Sparsity Exploration

EasyChair Preprint no. 9317

8 pages
Date: November 14, 2022

Abstract

Brain-inspired event-based processors have attracted considerable attention for edge deployment because of their ability to efficiently process Convolutional Neural Networks (CNNs) by exploiting sparsity. On such processors, a critical property is that the speed and energy consumption of CNN inference are approximately proportional to the number of non-zero values in the activation maps. To achieve top performance, an efficient training algorithm is therefore required that strongly suppresses activations in CNNs. We propose a novel training method, called the Adaptive Regularization Training Schedule (ARTS), which dramatically decreases the number of non-zero activations in a model by adaptively altering the regularization coefficient during training. We evaluate our method across an extensive range of computer vision applications, including image classification, object recognition, depth estimation, and semantic segmentation. The results show that our technique achieves 1.41 to 6.00 times more activation suppression on top of ReLU activation across various networks and applications, and outperforms state-of-the-art methods in terms of training time, activation suppression gains, and accuracy. A case study for a commercially available event-based processor, Neuronflow, shows that the activation suppression achieved by ARTS reduces CNN inference latency by up to 8.4 times and energy consumption by up to 14.1 times.
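The abstract does not spell out how the regularization coefficient is adapted. The sketch below is only a rough illustration of the general idea: an L1 penalty on ReLU activation maps is added to the task loss, and the coefficient is adjusted over training based on how well accuracy is holding up. The model (`SmallCNN`), the helper names (`activation_l1`, `adapt_lambda`), and the thresholds are hypothetical choices for this example, not the authors' ARTS implementation.

```python
# Hedged sketch: activation-sparsity regularization with an adaptively
# scheduled coefficient. Assumes PyTorch; all names and thresholds here
# are illustrative, not the ARTS algorithm from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
        self.fc = nn.Linear(32 * 8 * 8, num_classes)  # assumes 32x32 inputs

    def forward(self, x):
        # ReLU activation maps: the tensors whose non-zeros we want to suppress.
        a1 = F.relu(self.conv1(x))
        a2 = F.relu(self.conv2(F.max_pool2d(a1, 2)))
        a2 = F.max_pool2d(a2, 2)
        logits = self.fc(a2.flatten(1))
        # Return the activations so the training loop can penalize them.
        return logits, [a1, a2]


def activation_l1(acts):
    # Mean absolute activation: a differentiable proxy for the non-zero count.
    return sum(a.abs().mean() for a in acts)


def adapt_lambda(lam, val_acc, ref_acc, tol=0.01, up=1.5, down=0.5):
    # Hypothetical adaptation rule: push harder on sparsity while validation
    # accuracy stays within `tol` of the reference, back off otherwise.
    return lam * (up if val_acc >= ref_acc - tol else lam and down)


def train_step(model, batch, targets, optimizer, lam):
    optimizer.zero_grad()
    logits, acts = model(batch)
    loss = F.cross_entropy(logits, targets) + lam * activation_l1(acts)
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch, `adapt_lambda` would be called once per validation round with the current coefficient, the latest validation accuracy, and a dense-baseline reference accuracy; how often and by how much to adapt is exactly the kind of schedule the paper studies, so treat the multiplicative rule above purely as a placeholder.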

Keyphrases: activation sparsification, computation efficiency, deep learning, efficient training, energy reduction

BibTeX entry
BibTeX does not have a dedicated entry type for preprints; the following entry produces a correct reference:
@Booklet{EasyChair:9317,
  author       = {Zeqi Zhu and Arash Pourtaherian and Luc Waeijen and Egor Bondarev and Orlando Moreira},
  title        = {{ARTS}: an Adaptive Regularization Training Schedule for Activation Sparsity Exploration},
  howpublished = {EasyChair Preprint no. 9317},
  publisher    = {EasyChair},
  year         = {2022}}