Open Access
Knowledge Distillation: A Survey
International Journal of Computer Vision, 2021-06, Vol. 129 (6), p. 1789-1819
2021

Details

Author(s) / Contributors
Title
Knowledge Distillation: A Survey
Is part of
  • International Journal of Computer Vision, 2021-06, Vol. 129 (6), p. 1789-1819
Place / Publisher
New York: Springer US
Year of publication
2021
Source
SpringerLink (Online service)
Descriptions / Notes
  • In recent years, deep neural networks have been successful in both industry and academia, especially for computer vision tasks. The great success of deep learning is mainly due to its scalability to encode large-scale data and to maneuver billions of model parameters. However, it is a challenge to deploy these cumbersome deep models on devices with limited resources, e.g., mobile phones and embedded devices, not only because of their high computational complexity but also because of their large storage requirements. To this end, a variety of model compression and acceleration techniques have been developed. As a representative type of model compression and acceleration, knowledge distillation effectively learns a small student model from a large teacher model, and it has received rapidly increasing attention from the community. This paper provides a comprehensive survey of knowledge distillation from the perspectives of knowledge categories, training schemes, teacher–student architectures, distillation algorithms, performance comparisons, and applications. Furthermore, challenges in knowledge distillation are briefly reviewed, and comments on future research are discussed and put forward.
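  • The response-based distillation the abstract describes, learning a small student from a large teacher, is usually implemented as a weighted sum of a soft-target term and a hard-label term. Below is a minimal sketch of that classic Hinton-style loss, not anything specific to this survey; it assumes PyTorch, and the function name distillation_loss and the hyperparameters T=4.0 and alpha=0.5 are illustrative choices:

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        # Soften both output distributions with temperature T so the student
        # learns from the teacher's relative class probabilities ("dark
        # knowledge"), not just its top prediction.
        soft_student = F.log_softmax(student_logits / T, dim=-1)
        soft_teacher = F.softmax(teacher_logits / T, dim=-1)
        # Scale by T^2 to keep the soft-target gradients comparable in
        # magnitude to the hard-label term.
        kd_term = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
        # Standard cross-entropy against the ground-truth labels.
        ce_term = F.cross_entropy(student_logits, labels)
        return alpha * kd_term + (1.0 - alpha) * ce_term

    # Usage sketch: `teacher`, `student`, `x`, and `labels` are placeholders
    # for your own models and data batch.
    with torch.no_grad():
        teacher_logits = teacher(x)  # frozen teacher provides soft targets
    loss = distillation_loss(student(x), teacher_logits, labels)
    loss.backward()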
