3) Translate sentences
th translate.lua -model model_final.t7 -src data/src-test.txt -output pred.txt
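If the run succeeds, the predicted translations are written to pred.txt, one sentence per line, aligned with the source lines in data/src-test.txt. As a quick sanity check (a hypothetical inspection step, not part of the official quickstart), you can peek at the first few predictions:
head -n 3 pred.txt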
See the guide for more details: http://opennmt.github.io/Guide
Research
The main model is based on the papers Neural Machine Translation by Jointly Learning to Align and Translate (Bahdanau et al., ICLR 2015) and Effective Approaches to Attention-based Neural Machine Translation (Luong et al., EMNLP 2015).
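For orientation, here is a minimal sketch of the global attention step described in Luong et al., in the paper's own notation (a restatement of the published formulas, not a claim about OpenNMT's internals). Given the decoder state $h_t$ and encoder states $\bar{h}_s$:

$$
a_t(s) = \frac{\exp\left(\mathrm{score}(h_t, \bar{h}_s)\right)}{\sum_{s'} \exp\left(\mathrm{score}(h_t, \bar{h}_{s'})\right)},
\qquad
c_t = \sum_s a_t(s)\,\bar{h}_s,
\qquad
\tilde{h}_t = \tanh\left(W_c\,[c_t; h_t]\right)
$$

The score function can be the dot, general, or concat variant from the paper; the attentional vector $\tilde{h}_t$ then feeds the output softmax over target words.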
On top of the base model there are a large number of optional features, thanks to the excellent work of SYSTRAN (http://www.systransoft.com/). In particular, the following features are implemented:
Effective Approaches to Attention-based Neural Machine Translation. Luong et al., EMNLP 2015.
Character-based Neural Machine Translation. Costa-Jussa and Fonollosa, ACL 2016.
Compression of Neural Machine Translation Models via Pruning. See et al., CoNLL 2016.
Sequence-Level Knowledge Distillation. Kim and Rush, EMNLP 2016.
Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation. Zhou et al., TACL 2016.
Guided Alignment Training for Topic-Aware Neural Machine Translation. Chen et al., arXiv:1607.01628.
Linguistic Input Features Improve Neural Machine Translation. Sennrich et al., arXiv:1606.02892.
Acknowledgments
The OpenNMT implementation uses code from the following projects:
Andrej Karpathy's char-rnn: https://github.com/karpathy/char-rnn
Wojciech Zaremba's LSTM: https://github.com/wojzaremba/lstm
The Element-Research rnn library: https://github.com/Element-Research/rnn
License
MIT