Abstract
A significant challenge for machine translation (MT) is the phenomenon of dropped pronouns (DPs), where certain classes of pronouns are frequently dropped in the source language but should be retained in the target language. In response to this common problem, we propose a semi-supervised approach with a universal framework to recall missing pronouns in translation. First, we build training data for DP generation in which the DPs are automatically labelled according to the alignment information from a parallel corpus. Second, we build a deep learning-based DP generator for input sentences in decoding when no corresponding references exist. More specifically, generation has two phases: (1) DP position detection, which is modelled as a sequence labelling task with recurrent neural networks; and (2) DP prediction, which employs a multilayer perceptron with rich features. Finally, we integrate the above outputs into our statistical MT (SMT) system to recall missing pronouns, both by extracting rules from the DP-labelled training data and by translating the DP-generated input sentences. To validate the robustness of our approach, we evaluate it on both Chinese–English and Japanese–English corpora extracted from movie subtitles. Compared with an SMT baseline system, experimental results show that our approach achieves a significant improvement of +1.58 BLEU points in translation performance, with a 66% F-score for DP generation accuracy for Chinese–English, and nearly +1 BLEU point with a 58% F-score for Japanese–English. We believe that this work could help both MT researchers and industry to boost the performance of MT systems between pro-drop and non-pro-drop languages.
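The two-phase generation pipeline described above can be sketched as follows. This is a minimal illustrative mock-up, not the authors' implementation: the vocabulary, label inventory, pronoun set, feature choices, and all weight dimensions are hypothetical, and the weights are random rather than trained on DP-labelled parallel data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy inventories (the paper's are induced from parallel data).
VOCAB = ["我", "喜欢", "你", "<s>", "</s>"]
POS_LABELS = ["O", "DP"]            # phase 1: is a pronoun dropped at this position?
PRONOUNS = ["我", "你", "他", "它"]  # phase 2: which pronoun fills a detected slot?

EMB, HID = 8, 16
E = rng.normal(size=(len(VOCAB), EMB))          # token embeddings
Wx = rng.normal(size=(EMB, HID)) * 0.1          # RNN input weights
Wh = rng.normal(size=(HID, HID)) * 0.1          # RNN recurrent weights
Wo = rng.normal(size=(HID, len(POS_LABELS)))    # label projection

def detect_dp_positions(tokens):
    """Phase 1: sequence labelling with a simple recurrent network.

    Emits one O/DP label per token position (untrained, so labels
    are arbitrary -- this only shows the pipeline's shape)."""
    h = np.zeros(HID)
    labels = []
    for tok in tokens:
        x = E[VOCAB.index(tok)]
        h = np.tanh(x @ Wx + h @ Wh)            # recurrent state update
        labels.append(POS_LABELS[int(np.argmax(h @ Wo))])
    return labels

W1 = rng.normal(size=(HID + EMB, 32)) * 0.1     # MLP hidden layer
W2 = rng.normal(size=(32, len(PRONOUNS))) * 0.1

def predict_pronoun(context_state, next_tok):
    """Phase 2: a multilayer perceptron over features of the DP slot
    (here, a context vector plus the following token's embedding --
    a stand-in for the paper's rich feature set) picks the pronoun."""
    feats = np.concatenate([context_state, E[VOCAB.index(next_tok)]])
    hidden = np.maximum(0.0, feats @ W1)        # ReLU hidden layer
    return PRONOUNS[int(np.argmax(hidden @ W2))]
```

The generated pronouns would then be spliced into the source sentence before SMT decoding, so the translation system sees an input with the dropped pronouns restored.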
| Original language | English |
|---|---|
| Pages (from-to) | 65–87 |
| Number of pages | 23 |
| Journal | Machine Translation |
| Volume | 31 |
| Issue number | 1–2 |
| DOIs | |
| Publication status | Published - 1 Jun 2017 |
| Externally published | Yes |
Keywords
- Dropped pronoun annotation
- Dropped pronoun generation
- Machine translation
- Multilayer perceptron
- Pro-drop language
- Recurrent neural networks
- Semi-supervised approach