I would like to introduce a paper from the University of Cambridge that gives a well-organized overview of how machine translation had evolved as of 2019.
We have developed our own translation chatbot plugin for Slack, Kiara. Lead engineer Harada is working hard as Japan's first Slack Developer Chapter Leader. https://kiara-app.com/ (free trial available) With a passion for Slack and the work-style revolution, we will continue to energize the developer community.
Neural Machine Translation: A Review
(Submitted on 4 Dec 2019)

Abstract

The field of machine translation (MT), the automatic translation of written text from one natural language into another, has experienced a major paradigm shift in recent years. Statistical MT, which mainly relies on various count-based models and which used to dominate MT research for decades, has largely been superseded by neural machine translation (NMT), which tackles translation with a single neural network. In this work we will trace back the origins of modern NMT architectures to word and sentence embeddings and earlier examples of the encoder-decoder network family. We will conclude with a survey of recent trends in the field.
https://arxiv.org/abs/1912.02047
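For readers new to the term, the "word embeddings" the abstract starts from are simply a lookup table mapping each vocabulary token to a dense, learned vector. Here is a minimal sketch of the idea (my own illustration, not code from the paper; the toy vocabulary and dimensionality are made up):

```python
import numpy as np

# A word embedding is a lookup table: one dense vector per vocabulary entry,
# normally learned jointly with the rest of the network. Here the vectors are
# random, purely for illustration.
vocab = {"the": 0, "cat": 1, "dog": 2, "translation": 3}
embedding_dim = 8
rng = np.random.default_rng(42)
embeddings = rng.standard_normal((len(vocab), embedding_dim))

def embed(word):
    """Look up the dense vector for a word."""
    return embeddings[vocab[word]]

def cosine_similarity(a, b):
    """Trained embeddings place related words close under this measure."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine_similarity(embed("cat"), embed("dog")))
```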
Conclusion
Neural machine translation (NMT) has become the de facto standard for large-scale machine translation in a very short period of time. This article traced back the origin of NMT to word and sentence embeddings and neural language models. We reviewed the most commonly used building blocks of NMT architectures – recurrence, convolution, and attention – and discussed popular concrete architectures such as RNNsearch, GNMT, ConvS2S, and the Transformer. We discussed the advantages and disadvantages of several important design choices that have to be made to design a good NMT system with respect to decoding, training, and segmentation. We then explored advanced topics in NMT research such as explainability and data sparsity.
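To make the "attention" building block mentioned in the conclusion more concrete, here is a minimal NumPy sketch of scaled dot-product attention, the core operation of the Transformer. This is an illustrative implementation under my own naming, not code from the paper:

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Minimal sketch of scaled dot-product attention.

    queries: (num_queries, d_k), keys: (num_keys, d_k), values: (num_keys, d_v).
    Returns a (num_queries, d_v) matrix of attention-weighted values.
    """
    d_k = queries.shape[-1]
    # Similarity of every query to every key, scaled to stabilize the softmax.
    scores = queries @ keys.T / np.sqrt(d_k)
    # Normalize scores into attention weights that sum to 1 per query.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ values

# Toy usage: 2 decoder states attending over 3 encoder states.
rng = np.random.default_rng(0)
q = rng.standard_normal((2, 4))
k = rng.standard_normal((3, 4))
v = rng.standard_normal((3, 4))
print(scaled_dot_product_attention(q, k, v).shape)  # (2, 4)
```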
NMT = Neural Machine Translation. The review covers the following topics:
Word Embeddings
Phrase Embeddings
Sentence Embeddings
Encoder-Decoder Networks
Attentional Encoder-Decoder Networks
Recurrent NMT
Convolutional NMT
Self-attention-based NMT
Search problem in NMT
Greedy and beam search (see the sketch after this list)
Decoding direction
Generating diverse translation
Simultaneous translation
Open vocabulary NMT
NMT model errors
Reinforcement learning
Adversarial training
Explainable NMT
Multilingual NMT
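As a taste of the decoding topics listed above, here is a minimal beam search sketch over a toy next-token scorer. This is my own illustration, not the paper's code; the `toy_step` scoring function is a hypothetical stand-in for a real NMT decoder:

```python
import math

def beam_search(step_log_probs, beam_size=3, max_len=5, eos=0):
    """Minimal beam search sketch.

    step_log_probs(prefix) -> list of (token_id, log_prob) for the next token.
    Returns the highest-scoring finished hypothesis as (token list, score).
    """
    beams = [([], 0.0)]  # (prefix, cumulative log-probability)
    finished = []
    for _ in range(max_len):
        # Expand every surviving hypothesis by every candidate token.
        candidates = []
        for prefix, score in beams:
            for tok, lp in step_log_probs(prefix):
                candidates.append((prefix + [tok], score + lp))
        # Keep only the beam_size best partial hypotheses.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for prefix, score in candidates[:beam_size]:
            (finished if prefix[-1] == eos else beams).append((prefix, score))
        if not beams:
            break
    finished.extend(beams)  # fall back to unfinished hypotheses if needed
    return max(finished, key=lambda c: c[1])

# Hypothetical "model": a fixed next-token distribution regardless of prefix.
def toy_step(prefix):
    return [(0, math.log(0.1)), (1, math.log(0.6)), (2, math.log(0.3))]

print(beam_search(toy_step))
```

Real NMT decoders add refinements such as length normalization and batched scoring; this sketch keeps only the core expand-and-prune loop that distinguishes beam search from greedy decoding.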