Yunzhong, reporting from Aofeisi | QbitAI (WeChat official account: QbitAI)
Have you drawn up your list of 2019 New Year's resolutions yet?
Consider adding a systematic study of machine translation to it.
The good news: the Tsinghua University Natural Language Processing Group (THUNLP) has just finished compiling a machine translation reading list.
This resource is not to be underestimated.
First, it is comprehensive. It revisits the landmark papers of the statistical machine translation (SMT) era and surveys the subfields of recent neural machine translation (NMT) research.
These include:
model architectures, attention mechanisms, open-vocabulary and character-level NMT, training objectives and frameworks, decoding, low-resource language translation, multilingual translation, prior knowledge integration, document-level translation, robustness in machine translation, visualization and interpretability, fairness and diversity, efficiency, speech translation and simultaneous interpretation, multimodal translation, pre-training methods, domain adaptation, quality estimation, automatic post-editing, bilingual lexicon induction, and poetry translation.
Second, it is systematic: there are papers, tutorials, and models.
Even if you just quietly help yourself to the materials, this collection can carry you up level by level.
Below we list the 10 must-read papers; the full list is available via the link at the end.
The 10 must-read machine translation papers
Peter F. Brown, Stephen A. Della Pietra, Vincent J. Della Pietra, and Robert L. Mercer. 1993. The Mathematics of Statistical Machine Translation: Parameter Estimation. Computational Linguistics.
http://aclweb.org/anthology/J93-2003
Kishore Papineni, Salim Roukos, Todd Ward, and Wei-Jing Zhu. 2002. BLEU: a Method for Automatic Evaluation of Machine Translation. In Proceedings of ACL 2002.
http://aclweb.org/anthology/P02-1040
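To give a flavor of what the list covers: BLEU scores a candidate translation by modified n-gram precision combined with a brevity penalty. A minimal sentence-level sketch in Python (smoothing omitted for clarity; real evaluations use an established implementation):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Illustrative BLEU: geometric mean of modified n-gram
    precisions, scaled by a brevity penalty. No smoothing."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        # "Modified" precision: clip each n-gram's count by the reference count
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        precisions.append(overlap / max(sum(cand.values()), 1))
    if min(precisions) == 0:
        return 0.0
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty: punish candidates shorter than the reference
    bp = min(1.0, math.exp(1 - len(reference) / len(candidate)))
    return bp * math.exp(log_avg)
```

A perfect match scores 1.0; a candidate sharing no unigrams with the reference scores 0.0.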
Philipp Koehn, Franz J. Och, and Daniel Marcu. 2003. Statistical Phrase-Based Translation. In Proceedings of NAACL 2003.
http://aclweb.org/anthology/N03-1017
Franz Josef Och. 2003. Minimum Error Rate Training in Statistical Machine Translation. In Proceedings of ACL 2003.
http://aclweb.org/anthology/P03-1021
David Chiang. 2007. Hierarchical Phrase-Based Translation. Computational Linguistics.
http://aclweb.org/anthology/J07-2003
Ilya Sutskever, Oriol Vinyals, and Quoc V. Le. 2014. Sequence to Sequence Learning with Neural Networks. In Proceedings of NIPS 2014.
https://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf
Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio. 2015. Neural Machine Translation by Jointly Learning to Align and Translate. In Proceedings of ICLR 2015.
https://arxiv.org/pdf/1409.0473.pdf
Diederik P. Kingma and Jimmy Ba. 2015. Adam: A Method for Stochastic Optimization. In Proceedings of ICLR 2015.
https://arxiv.org/pdf/1412.6980.pdf
Rico Sennrich, Barry Haddow, and Alexandra Birch. 2016. Neural Machine Translation of Rare Words with Subword Units. In Proceedings of ACL 2016.
https://arxiv.org/pdf/1508.07909.pdf
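The Sennrich et al. paper handles rare words with byte-pair encoding (BPE): starting from characters, it repeatedly merges the most frequent adjacent symbol pair into a new subword unit. A toy sketch of the merge loop (the plain string replace is a simplification; the paper's reference code uses boundary-aware regexes):

```python
from collections import Counter

def get_pair_counts(vocab):
    """Count adjacent symbol pairs across a word-frequency vocab."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    """Replace every occurrence of the pair with one merged symbol.
    Note: str.replace is a simplification and can over-match in
    corner cases; it suffices for this toy vocabulary."""
    merged, new_symbol = ' '.join(pair), ''.join(pair)
    return {w.replace(merged, new_symbol): f for w, f in vocab.items()}

# Toy corpus: words split into characters, with frequencies
vocab = {'l o w': 5, 'l o w e r': 2, 'n e w e s t': 6, 'w i d e s t': 3}
for _ in range(3):
    pairs = get_pair_counts(vocab)
    vocab = merge_pair(max(pairs, key=pairs.get), vocab)
```

After a few merges, frequent character sequences such as "est" become single symbols, so rare words decompose into known subwords instead of falling out of the vocabulary.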
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin. 2017. Attention is All You Need. In Proceedings of NIPS 2017.
https://papers.nips.cc/paper/7181-attention-is-all-you-need.pdf
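The Transformer's core operation is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch of that single computation (no multi-head projections or masking):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weights each value by the softmax of query-key similarity,
    scaled by sqrt(d_k) to keep the dot products well-conditioned."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights
```

Each row of the returned weights is a probability distribution over the keys, so the output is a convex combination of the value vectors.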
Link
Beyond the ten papers above, the complete reading list from the Tsinghua NLP machine translation group:
https://github.com/THUNLP-MT/MT-Reading-List
— End —