Self-attentive residual decoder for neural machine translation

Miculicich Werlen, Lesly (Idiap Research Institute, Switzerland ; École polytechnique fédérale de Lausanne (EPFL), Switzerland) ; Pappas, Nikolaos (Idiap Research Institute, Switzerland) ; Ram, Dhananjay (Idiap Research Institute, Switzerland ; École polytechnique fédérale de Lausanne (EPFL), Switzerland) ; Popescu-Belis, Andrei (School of Management and Engineering Vaud, HES-SO // University of Applied Sciences Western Switzerland)

Neural sequence-to-sequence networks with attention have achieved remarkable performance for machine translation. One of the reasons for their effectiveness is their ability to capture relevant source-side contextual information at each time step through an attention mechanism. However, the target-side context relies solely on the sequence model, which in practice is prone to a recency bias and lacks the ability to effectively capture non-sequential dependencies among words. To address this limitation, we propose a target-side attentive residual recurrent network for decoding, where attention over previous words contributes directly to the prediction of the next word. The residual learning facilitates the flow of information from the distant past and is able to emphasize any of the previously translated words, hence it gains access to a wider context. The proposed model outperforms a neural MT baseline as well as a memory and self-attention network on three language pairs. The analysis of the attention learned by the decoder confirms that it emphasizes a wider context and that it captures syntactic-like structures.
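The abstract describes the decoder only at a high level; the following is a minimal, hedged sketch (not the authors' released code) of one decoding step in which attention over previously translated target-word representations is combined with the RNN state through a residual (additive) connection before predicting the next word. The dot-product scoring function, names, and dimensions are illustrative assumptions.

# Illustrative sketch of a target-side attentive residual decoding step.
# Assumptions: dot-product attention scores and an additive residual
# combination; the actual model may differ in scoring and projection details.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def decode_step(h_t, prev_targets, W_out):
    # h_t          : (d,)   current decoder RNN state
    # prev_targets : (t, d) representations of previously translated words
    # W_out        : (V, d) output projection to the target vocabulary
    if len(prev_targets) > 0:
        scores = prev_targets @ h_t          # attention scores over the history, (t,)
        alphas = softmax(scores)             # attention weights
        context = alphas @ prev_targets      # weighted sum of past target words, (d,)
    else:
        context = np.zeros_like(h_t)         # no history at the first step
    # Residual combination: the history context is added to the RNN state,
    # so distant words can influence the prediction directly.
    combined = h_t + context
    logits = W_out @ combined                # (V,)
    return softmax(logits)                   # distribution over the next word

# Toy usage with random values
rng = np.random.default_rng(0)
d, V, t = 8, 20, 3
probs = decode_step(rng.normal(size=d),
                    rng.normal(size=(t, d)),
                    rng.normal(size=(V, d)))
print(probs.shape, probs.sum())              # (20,) and approximately 1.0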


Keywords:
Conference type:
full paper
Faculty:
Ingénierie et Architecture
School:
HEIG-VD Haute Ecole d’Ingénierie et de Gestion du Canton de Vaud
Institute:
IICT - Institut des Technologies de l'Information et de la Communication
Classification:
Engineering
Bibliographic address:
New Orleans, USA, 1-6 June 2018
Date:
2018
Pagination:
14 pages
Published in:
Proceedings of the 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Numbering (vol. no.):
2018
The document appears in:

Note: the status of this file is restricted.


Record created 2018-05-22, last modified 2018-08-19
