
At Meta’s latest event, founder and CEO Mark Zuckerberg showcased new artificial intelligence projects the company is developing in hopes of integrating them into its planned metaverse. The tools include a new voice assistant, a voice-operated digital object generator, and a universal speech translation system that will use AI to translate between languages.
“The ability to communicate with anyone in any language—that’s a superpower people have dreamed of forever, and AI is going to deliver that in our lifetimes,” said Zuckerberg, as per Protocol. The translation effort comprises two initiatives. The first, ‘No Language Left Behind’, focuses on “low-resource languages” that have relatively little training data for AI systems to learn from. According to Gizmodo, Meta estimated that around 20% of the world’s population speak languages in this category, which has left them largely excluded from the online world thus far.
In essence, this approach aims to produce high-quality translations for these under-served languages.
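For a concrete sense of what translating into a low-resource language looks like in practice, here is a minimal sketch using the NLLB-200 checkpoint that Meta later released openly on Hugging Face. The checkpoint name and the FLORES-200 language codes are details of that public release, chosen for illustration; they are not part of the announcement described above.

```python
from transformers import pipeline

# Minimal sketch: text translation with the openly released NLLB-200 checkpoint.
# The model name and FLORES-200 codes (eng_Latn, asm_Beng) come from the public
# Hugging Face release, not from Meta's announcement itself.
translator = pipeline(
    "translation",
    model="facebook/nllb-200-distilled-600M",
    src_lang="eng_Latn",   # source: English (Latin script)
    tgt_lang="asm_Beng",   # target: Assamese (Bengali script), a lower-resource language
)

result = translator(
    "The metaverse should be open to speakers of every language.",
    max_length=64,
)
print(result[0]["translation_text"])
```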
The second initiative, a ‘Universal Speech Translator’, aims to translate speech from one language to another in real time. A demonstration video showed the translator paired with AR glasses and other wearables, so users can communicate directly with people speaking another language. Meta said that “in the not too distant future” these translation tools could be integrated into virtual worlds—like its metaverse—to allow users to interact with anyone from around the globe, “just as they would with someone next door.”
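Meta has not published the Universal Speech Translator itself, but a conventional cascaded baseline (speech recognition followed by text translation) illustrates the kind of two-step pipeline it aims to improve on with direct speech-to-speech translation. The checkpoints, language pair, and audio file path below are assumptions made purely for illustration, not components of Meta’s system.

```python
from transformers import pipeline

# Illustrative cascaded baseline: transcribe speech, then translate the transcript.
# Meta's stated goal is direct speech-to-speech translation; this sketch only shows
# the conventional two-step approach, using openly available checkpoints.
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-base-960h")
translator = pipeline(
    "translation",
    model="facebook/nllb-200-distilled-600M",
    src_lang="eng_Latn",
    tgt_lang="spa_Latn",   # Spanish target, chosen purely for illustration
)

# "english_sample.wav" is a placeholder path to a short English audio clip.
transcript = asr("english_sample.wav")["text"]
translation = translator(transcript, max_length=128)[0]["translation_text"]
print(transcript)
print(translation)
```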
https://www.facebook.com/watch/?v=2063827667110161
https://gizmodo.com/meta-wants-to-bring-ai-and-universal-translation-to-the-1848584856
https://www.protocol.com/bulletins/meta-ai-translation-metaverse-babelfish?