An Effective Coverage Approach for Attention-based Neural Machine Translation
Is part of
2019 6th NAFOSTED Conference on Information and Computer Science (NICS), 2019, p.240-245
Place / Publisher
IEEE
Year of publication
2019
Source
IEEE Electronic Library (IEL)
Descriptions/Notes
Neural Machine Translation has recently become the state-of-the-art approach in Machine Translation. One of the more advanced techniques within this approach, the attention model, tends not to use alignments from past translation steps and selects the context word purely from the devised attention score. Unfortunately, this sometimes leads to repetition and to the omission of important words in translations. To solve this problem, we propose a simple approach using coverage techniques that can be used in conjunction with a diverse range of attention models. Our experiments show that our improved technique increases translation quality on both the English-Vietnamese and Japanese-Vietnamese language pairs.
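The general idea described in the abstract, tracking how much attention each source position has already received and using that to discourage repetition, can be sketched as follows. This is a minimal illustration, not the authors' exact method: the function names, the additive penalty form, and the `lam` weight are all assumptions introduced here for clarity.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def coverage_attention(scores, coverage, lam=1.0):
    """One decoding step of a coverage-aware attention sketch.

    scores:   raw attention scores over source positions (from any
              attention model; hypothetical input here)
    coverage: cumulative attention mass from past decoding steps
    lam:      penalty weight (hypothetical hyperparameter)
    """
    # Penalize positions that have already received attention,
    # discouraging the decoder from translating them again
    adjusted = scores - lam * coverage
    weights = softmax(adjusted)
    # Accumulate this step's attention for future steps
    new_coverage = coverage + weights
    return weights, new_coverage
```

Positions with little accumulated coverage keep a higher effective score, which also nudges the model toward source words it would otherwise omit.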