
Elastic Deep Learning using Knowledge Distillation with Heterogeneous Computing Resources

EasyChair Preprint no. 6252

12 pages · Date: August 7, 2021

Abstract

In deep neural networks, adding more layers and parameters generally improves model accuracy, at the cost of ever larger models. Such big models have high computational complexity and large memory requirements, which exceed the capacity of small devices for inference. Knowledge distillation is an efficient approach to compress a large deep model (a teacher model) into a compact model (a student model). Existing online knowledge distillation methods typically either exploit an extra data storage layer to store the knowledge or deploy the teacher and student models on the same computing resource, which hurts elasticity and fault tolerance. In this paper, we propose an elastic deep learning framework, EDL-Dist, for large-scale knowledge distillation that efficiently trains the student model while exploiting elastic computing resources. The advantages of EDL-Dist are three-fold. First, it decouples inference from training to use heterogeneous computing resources. Second, it can exploit dynamically available computing resources. Third, it supports fault tolerance during the training and inference processes of knowledge distillation. Our experimental validation, based on an industrial-strength implementation and real datasets, shows that the throughput of EDL-Dist is up to 181% higher than that of the baseline method (online knowledge distillation).

Keyphrases: deep neural network, distributed computing, knowledge distillation
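The abstract describes knowledge distillation, where a compact student model is trained against a large teacher model's predictions. The sketch below illustrates the classic distillation loss (soft teacher targets blended with hard-label cross-entropy); it is a minimal NumPy illustration of the general technique, not EDL-Dist's actual implementation, and the temperature `T` and weight `alpha` are illustrative hyperparameters, not values from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend of soft-label KL divergence (teacher vs. student) and
    hard-label cross-entropy. T and alpha are illustrative values."""
    p_t = softmax(teacher_logits, T)   # teacher's softened predictions
    p_s = softmax(student_logits, T)   # student's softened predictions
    # KL(teacher || student), scaled by T^2 as in classic distillation
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1) * T * T
    # standard cross-entropy against the ground-truth labels
    p_hard = softmax(student_logits, 1.0)
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12)
    return np.mean(alpha * kl + (1 - alpha) * ce)
```

In an elastic setting like the one the paper targets, the teacher's logits would be produced by a separate inference service rather than computed in the same training process.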

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:6252,
  author = {Daxiang Dong and Ji Liu and Xi Wang and Weibao Gong and An Qin and Xingjian Li and Dianhai Yu and Patrick Valduriez and Dejing Dou},
  title = {Elastic Deep Learning using Knowledge Distillation with Heterogeneous Computing Resources},
  howpublished = {EasyChair Preprint no. 6252},
  year = {EasyChair, 2021}}