
Details

Author(s) / Contributors
Title
Training Compact Models for Low Resource Entity Tagging using Pre-trained Language Models
Is Part of
  • 2019 Fifth Workshop on Energy Efficient Machine Learning and Cognitive Computing - NeurIPS Edition (EMC2-NIPS), 2019, p.44-47
Place / Publisher
IEEE
Year of Publication
2019
Source
IEEE/IET Electronic Library (IEL)
Descriptions/Notes
  • Training models on low-resource named entity recognition tasks has been shown to be a challenge [1], especially in industrial applications, where deploying updated models is a continuous effort and crucial for business operations. In such cases there is often an abundance of unlabeled data, while labeled data is scarce or unavailable. Pre-trained language models, trained to extract contextual features from text, have been shown to improve many natural language processing (NLP) tasks, including scarcely labeled tasks, by leveraging transfer learning. However, such models impose a heavy memory and computational burden, making them challenging to train and deploy for inference. In this work-in-progress we combined the effectiveness of transfer learning provided by pre-trained masked language models with a semi-supervised approach to train a fast and compact model using both labeled and unlabeled examples. Preliminary evaluations show that the compact models can achieve competitive accuracy at a 36× compression rate compared with a state-of-the-art pre-trained language model, and run significantly faster at inference, allowing deployment of such models in production environments or on edge devices.
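The semi-supervised recipe the abstract describes (a large teacher produces pseudo-labels for abundant unlabeled data, then a compact student is trained on the small labeled set plus those pseudo-labels) can be sketched with a toy stand-in. This is not the paper's implementation: the "teacher" below is a fixed linear rule standing in for a pre-trained language model, the "student" is a tiny logistic model, and all data and names are hypothetical.

```python
import math
import random

random.seed(0)

def teacher_predict(x):
    # Hypothetical teacher: a fixed decision rule standing in for a
    # large pre-trained LM tagger (e.g. a BERT-style model).
    return 1 if x[0] - x[1] > 0 else 0

# Small labeled set, large unlabeled set (2-d toy features).
X_lab = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(20)]
y_lab = [teacher_predict(x) for x in X_lab]  # toy gold labels
X_unlab = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(500)]

# Step 1: pseudo-label the unlabeled data with the teacher.
y_pseudo = [teacher_predict(x) for x in X_unlab]

# Step 2: train a compact student (logistic regression, batch gradient
# descent) on labeled + pseudo-labeled examples combined.
X = X_lab + X_unlab
y = y_lab + y_pseudo
w = [0.0, 0.0]
for _ in range(300):
    g = [0.0, 0.0]
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + math.exp(-(w[0] * xi[0] + w[1] * xi[1])))
        g[0] += (p - yi) * xi[0]
        g[1] += (p - yi) * xi[1]
    w[0] -= 0.1 * g[0] / len(y)
    w[1] -= 0.1 * g[1] / len(y)

# The compact student should now largely agree with the teacher on
# held-out points, despite having far fewer parameters.
X_test = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(100)]
agree = sum(
    (w[0] * a + w[1] * b > 0) == (teacher_predict((a, b)) == 1)
    for a, b in X_test
)
print(agree)
```

The key design point mirrored here is that the expensive model is needed only once, at pseudo-labeling time; the deployed student is cheap at inference, which is what enables the compression and speed-up the abstract reports.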
Language
English
Identifiers
DOI: 10.1109/EMC2-NIPS53020.2019.00018
Titel-ID: cdi_ieee_primary_9463575
