Facebook’s New Language Translation Approach is Nine Times Faster

The Facebook AI Research team announced today that it has developed a new approach to language translation that is nine times faster than rival techniques. The team achieved this by using convolutional neural networks (CNNs) instead of recurrent neural networks (RNNs) for language translation. The speedup comes from how the two architectures process data: RNNs analyze a sentence sequentially, working from left to right and translating word by word, while CNNs let the Facebook team tackle the problem more holistically, since they examine different parts of the data simultaneously.
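To make the sequential-versus-parallel distinction concrete, here is a minimal NumPy sketch (not Facebook's actual model; the dimensions and weights are illustrative). The RNN-style pass must loop over positions because each hidden state depends on the previous one, whereas the CNN-style pass computes every window of the sentence independently in a single operation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sentence": 6 word positions, each a 4-dimensional embedding.
seq_len, d = 6, 4
x = rng.standard_normal((seq_len, d))

# --- RNN-style processing: strictly sequential, left to right. ---
# Each hidden state depends on the previous one, so this loop
# cannot be parallelized across positions.
W_h = rng.standard_normal((d, d)) * 0.1
W_x = rng.standard_normal((d, d)) * 0.1
h = np.zeros(d)
rnn_states = []
for t in range(seq_len):
    h = np.tanh(h @ W_h + x[t] @ W_x)  # depends on h from step t-1
    rnn_states.append(h)
rnn_states = np.array(rnn_states)

# --- CNN-style processing: every window computed at once. ---
# A width-3 convolution looks at each position and its neighbors;
# all windows are independent, so one matrix multiply covers them all.
k = 3
W_c = rng.standard_normal((k * d, d)) * 0.1
padded = np.vstack([np.zeros((1, d)), x, np.zeros((1, d))])  # pad both ends
windows = np.array([padded[t:t + k].ravel() for t in range(seq_len)])
conv_states = np.tanh(windows @ W_c)  # one parallel operation

print(rnn_states.shape, conv_states.shape)  # both (6, 4)
```

Both passes produce one vector per word position, but only the convolutional one can exploit parallel hardware, which is the source of the efficiency gain described above.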

Breaking the Language Barrier

Although RNNs have typically outperformed CNNs at language translation tasks, the Facebook AI Research team felt that the RNN design had limitations, which is why they took another look at CNNs for these types of tasks. “The greater computational efficiency of CNNs has the potential to scale translation and cover more of the world’s 6,500 languages,” the company said in a blog post. The language barrier is an important obstacle to overcome in today’s global economy, and one that other tech companies have struggled with. While the new approach is still in the research phase, Facebook is now one step closer to achieving its mission to connect the world’s people.