Keyword: Knowledge Distillation
Publications
Distillation-Based Model Compression Framework for Swin Transformer
Mazen Amria, Aziz M. Qaroush, Mohammad Jubran, Alaa Zuhd and Ahmad Khatib
EasyChair Preprint 15433
Optimizing Convolutional Neural Network Models for Resource-Constrained Devices in Telemedicine: A Lightweight Approach
Dylan Stilinki
EasyChair Preprint 14660
Improved Knowledge Distillation for Crowd Counting on IoT Devices
Zuo Huang and Richard Sinnott
EasyChair Preprint 10722
Rapid and High-Purity Seed Grading Based on Pruned Deep Convolutional Neural Network
Huanyu Li, Cuicao Zhang, Chunlei Li, Zhoufeng Liu, Yan Dong and Shuili Tang
EasyChair Preprint 7221
Elastic Deep Learning Using Knowledge Distillation with Heterogeneous Computing Resources
Daxiang Dong, Ji Liu, Xi Wang, Weibao Gong, An Qin, Xingjian Li, Dianhai Yu, Patrick Valduriez and Dejing Dou
EasyChair Preprint 6252