
Details

Author(s) / Contributors
Title
MTL-JER: Meta-Transfer Learning for Low-Resource Joint Entity and Relation Extraction
Is Part of
  • 2023 3rd International Conference on Neural Networks, Information and Communication Engineering (NNICE), 2023, p.78-83
Place / Publisher
IEEE
Year of Publication
2023
Source
IEEE/IET Electronic Library
Descriptions / Notes
  • Joint entity and relation extraction has achieved impressive advances in NLP tasks such as document understanding and knowledge graph construction. Typical methods break the joint task down into several smaller components or stages for ease of implementation, but this loses the interconnected knowledge within the triple. Hence, we propose to model the triple jointly in a single module. Furthermore, labeling data for joint entity and relation extraction is costly and domain-specific, so it is important to improve performance on low-resource data and domain adaptation. To address this, we draw on two information-rich sources: models pre-trained on large data and multi-domain text corpora. Pre-training gives the model the fundamental ability to perform joint entity and relation extraction, and meta-learning on multi-domain text improves its generalization, enabling it to perform well even with limited data. In this paper we present MTL-JER, a Meta-Transfer Learning method for Joint Entity and Relation Extraction in low-resource settings. Extensive experiments on five datasets show that our model achieves the best results.
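  • For illustration only (not part of the record and not the authors' code): the abstract's core idea, adapting a pre-trained joint tagger across multi-domain episodes via meta-learning, can be sketched with a first-order (Reptile-style) meta-learning loop. Everything below is a hypothetical toy: ToyJointTagger, sample_episode, and all sizes and hyperparameters are assumptions, and the actual MTL-JER pre-training and meta-learning procedure may differ.

```python
# Minimal sketch of the meta-transfer idea from the abstract, NOT the authors'
# implementation: a (nominally pre-trained) joint tagger is adapted across
# multi-domain episodes with a first-order, Reptile-style meta-update.
import copy
import torch
import torch.nn as nn

NUM_DOMAINS, VOCAB, HIDDEN, NUM_LABELS = 4, 1000, 64, 9  # assumed toy sizes

class ToyJointTagger(nn.Module):
    """Stand-in for a pre-trained encoder plus a joint entity/relation head."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)   # would be a pre-trained LM in practice
        self.head = nn.Linear(HIDDEN, NUM_LABELS)  # joint tagging head over triple labels

    def forward(self, tokens):
        return self.head(self.embed(tokens))       # (batch, seq, NUM_LABELS)

def sample_episode(domain_id, batch=8, seq=16):
    """Hypothetical episode sampler: a small labelled batch for one domain."""
    x = torch.randint(0, VOCAB, (batch, seq))
    y = torch.randint(0, NUM_LABELS, (batch, seq))
    return x, y

meta_model = ToyJointTagger()                      # initialised from pre-training
meta_lr, inner_lr, inner_steps = 0.1, 1e-2, 5
loss_fn = nn.CrossEntropyLoss()

for meta_step in range(100):
    domain = meta_step % NUM_DOMAINS               # cycle over multi-domain corpora
    learner = copy.deepcopy(meta_model)            # task-specific copy of the model
    opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
    for _ in range(inner_steps):                   # inner loop: adapt to this domain
        x, y = sample_episode(domain)
        loss = loss_fn(learner(x).reshape(-1, NUM_LABELS), y.reshape(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Outer (Reptile) update: move the meta-parameters toward the adapted
    # parameters, so the shared initialisation generalises across domains.
    with torch.no_grad():
        for p_meta, p_task in zip(meta_model.parameters(), learner.parameters()):
            p_meta.add_(meta_lr * (p_task - p_meta))
```

    The first-order update is chosen here only because it avoids second-order gradients; the embedding layer stands in for the pre-trained encoder that the abstract assumes, and the random episode sampler stands in for the multi-domain corpora.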
Language
English
Identifiers
DOI: 10.1109/NNICE58320.2023.10105766
Title ID: cdi_ieee_primary_10105766
