
Kyunghyun Cho on Neural Machine Translation

The 2016 conference of the Globalization and Localization Association (GALA) offered a wide variety of presentations for attendees to choose from. One extremely popular session was “Neural Machine Translation” led by Kyunghyun Cho, a preeminent researcher and assistant professor of computer science at New York University.

Cho’s session was easily one of the most technically complex of the entire conference, but both he and his audience were up for the challenge. Cho kept his presentation upbeat and joked about being at GALA 2016, saying that as a specialist in machine translation, he was trained to run from human translators.


Cho began by introducing the concept of neural machine translation, which began as an offshoot of statistical machine translation in the late 1990s but was mostly forgotten by the time Cho started working on it in 2013. He credited an overall increase in computing power with opening up previously unthinkable possibilities in the field.

Cho went on to explain the concept of neural networks, also referred to as “neural nets,” which he said are already being used more than most people realize, from the speech recognition feature on mobile phones to the function of tagging photos on Facebook. Neural machine translation involves the construction of a giant neural network with a continually updating memory. 
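For readers who want a more concrete picture of that "continually updating memory," a minimal sketch of a generic recurrent-network update in Python follows; the matrix names and sizes are purely illustrative and are not taken from Cho's slides:

import numpy as np

# A toy recurrent update: the network's "memory" h is revised after every input.
rng = np.random.default_rng(0)
input_size, hidden_size = 8, 16
W = rng.standard_normal((hidden_size, input_size)) * 0.1   # input weights (illustrative)
U = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # recurrent weights (illustrative)

def update_memory(h_prev, x_t):
    """Combine the previous memory with the new input to form the new memory."""
    return np.tanh(W @ x_t + U @ h_prev)

h = np.zeros(hidden_size)                          # the memory starts empty
for x_t in rng.standard_normal((5, input_size)):   # five inputs, e.g. word vectors
    h = update_memory(h, x_t)                      # the memory is updated step by step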

In discussing various types of neural machine translation, Cho described one model in which a complete source sentence is contained in a single memory vector, or “context vector,” for translation. He also talked about a predictive, “attention-based” model which continually builds on its current knowledge while also calculating the probabilities of which words will be used in various syntactical constructions.
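As a rough illustration of the attention idea, the sketch below computes a probability for each source position and builds a weighted context vector at a single decoding step; it is written in the spirit of the attention-based model Cho described, but the shapes and the simple dot-product scoring are assumptions for the example, not details from his talk:

import numpy as np

def softmax(z):
    """Turn raw scores into probabilities that sum to one."""
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(1)
source_length, hidden_size = 6, 16
encoder_states = rng.standard_normal((source_length, hidden_size))  # one vector per source word
decoder_state = rng.standard_normal(hidden_size)                    # the decoder's current memory

# At each step the decoder asks: how relevant is each source word right now?
scores = encoder_states @ decoder_state    # one score per source position
weights = softmax(scores)                  # probabilities over the source positions
context = weights @ encoder_states         # a freshly weighted context vector for this step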

Following a lengthy discussion of this densely technical matter, Cho said with a smile, “So, this is super simple, right?” eliciting relieved laughter from his audience.

Character-level decoding

Cho then turned his attention to the future of machine translation and the astounding concept of “character-level decoding without segmentation.” Character-level decoding involves sequencing individual characters (in this case, individual letters) rather than entire words.

He explained that systems such as Google Translate break sentences down into words, which are then machine translated, reordered, and reassembled into natural-language sentences. Recent studies, however, have found that character-level decoding actually outperforms these word-based approaches.
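To make the difference in units concrete, here is a minimal Python illustration of word-level versus character-level segmentation; the sentence is arbitrary, and real translation systems of course involve far more than this choice of unit:

# The same sentence seen as word tokens versus individual characters.
sentence = "neural machine translation"

word_tokens = sentence.split()       # word-level: 3 units
char_tokens = list(sentence)         # character-level: 26 units, spaces included

print(word_tokens)                   # ['neural', 'machine', 'translation']
print(char_tokens[:8])               # ['n', 'e', 'u', 'r', 'a', 'l', ' ', 'm']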

Put another way, like a Star Trek transporter, character-level decoding breaks down language into units even smaller than words — into the characters themselves — and then manages to reassemble these units into a comprehensible, translated text. Cho’s enthusiasm about these developments was apparent, but he remained composed, saying with a touch of wonder: “It just worked.”

 

Going beyond language

Finally, in another fascinating segment, Cho showed that the model for neural machine translation is not in fact specific to language and can therefore draw on sources beyond language. In one study, the model was successfully trained to describe an image, a feat that fires the imagination about the future of artificial intelligence. He also touched briefly on the intriguing idea that speech itself is actually a form of translation.

In closing, Cho predicted that developments in character-level, larger-context, multilingual translation will soon bring about a leap in machine translation quality — a prediction whose larger meaning for the translation industry remains to be seen.

Kyunghyun Cho’s complete GALA 2016 session “Neural Machine Translation,” including a download of the many technical slides used in his presentation, is available on the GALA website. Access to the conference video is free for registered attendees and $60 for non-attendees.
