

Google's Neural Machine can translate nearly as well as a human

It all happened because they let the program learn without interference.

Alexandru Micu
October 5, 2016 @ 7:19 pm


A new translation system unveiled by Google, the Google Neural Machine Translation (GNMT) framework, comes close to human translators in its proficiency.

Public domain image.

Not knowing the local language can be hell — but Google’s new translation software might prove to be the bilingual travel partner you’ve always wanted. A recently released paper notes that Google’s Neural Machine Translation system (GNMT) reduces translation errors by an average of 60% compared to the familiar phrase-based approach. The framework is based on unsupervised deep learning technology.

Deep learning simulates the way our brains form connections and process information inside a computer. Virtual neurons are mapped out by a program, and the connections between them receive a numerical value, a “weight”. The weight determines how each of these virtual neurons treats data input to it — low-weight neurons recognize the basic features of data, which they feed to the heavier neurons for further processing, and so on. The end goal is to create software that can learn to recognize patterns in data and respond to each one accordingly.
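The idea of a weighted virtual neuron can be sketched in a few lines of Python. This is a minimal, hypothetical illustration — not Google’s implementation — showing how different weights make two neurons respond differently to the same input:

```python
import math

def neuron(inputs, weights, bias):
    """A single virtual neuron: a weighted sum of its inputs passed
    through a sigmoid activation. The weights determine how strongly
    each input influences the neuron's output."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))  # squashes the sum into (0, 1)

# The same data produces different responses depending on the weights.
data = [0.5, 0.8]
low = neuron(data, [0.1, 0.2], 0.0)   # low-weight neuron: weak response
high = neuron(data, [2.0, 3.0], 0.0)  # high-weight neuron: strong response
print(low, high)
```

Training a real network amounts to adjusting these weights automatically until the outputs match the patterns in the data.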

Programmers train these frameworks by feeding them data, such as digitized images or sound waves. They rely on big sets of training data and powerful computers to work effectively, which are becoming increasingly available. Deep learning has proven its worth in image and speech recognition in the past, and adapting it to translation seems like the logical next step.

And it works like a charm

GNMT draws on 16 processors to transform each word into a value called a “vector,” which represents how closely it relates to other words in its training database — 2.5 billion sentence pairs for English and French, and 500 million for English and Chinese. “Leaf” is more closely related to “tree” than to “car”, for example, and the name “George Washington” is more closely related to “Roosevelt” than to “Himalaya”. Using the vectors of the input words, the system chooses a list of possible translations, ranked by their probability of occurrence. Cross-checking helps improve overall accuracy.
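The relatedness comparison described above is commonly measured as the cosine similarity between word vectors. The toy 3-dimensional vectors below are invented for illustration (real systems learn vectors with hundreds of dimensions from training data), but they show how “leaf” can come out closer to “tree” than to “car”:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: near 1 for closely
    related words, near 0 for unrelated ones."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical word vectors, hand-picked for the example.
vectors = {
    "leaf": [0.9, 0.1, 0.0],
    "tree": [0.8, 0.2, 0.1],
    "car":  [0.0, 0.9, 0.4],
}

leaf_tree = cosine_similarity(vectors["leaf"], vectors["tree"])
leaf_car = cosine_similarity(vectors["leaf"], vectors["car"])
print(leaf_tree, leaf_car)  # "leaf" scores higher against "tree"
```

Ranking candidate translations by probability builds on exactly this kind of similarity arithmetic, applied across whole sentences rather than single words.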

The increased accuracy came about because Google let its neural network work without much of the supervision programmers previously provided. They fed in the initial data, then let the computer take over and train itself. This approach, called unsupervised learning, has proven more efficient than earlier supervised learning techniques, in which humans kept a large measure of control over the learning process.

In a series of tests pitting the system against human translators, it came close to matching their fluency for some languages. Bilingually fluent people rated the system between 64 and 87 percent better than the previous one. While some things still slip through GNMT’s fingers, such as slang or colloquialisms, those are some solid results.

Google is already using the new system for Chinese to English translation, and plans to completely replace its current translation software with GNMT.

 
