Meta describes advances in language translation, a crucial consideration in furthering metaverse connection.
Meta looks to advance its language translation tools with the launch of a new ‘No Language Left Behind’ AI model, which can translate 200 different languages, while also opening up its translation data to further improve its systems and democratize access to the technology.
The announcement video is about three minutes long, but the gist of it is that Meta is looking to advance its translation models to facilitate greater access, not just on today’s social media platforms, but more importantly within the space of the looming metaverse.
As Meta explains:
“Language is our lifeline to the world. But because high-quality translation tools don’t exist for hundreds of languages, billions of people today can’t access digital content or fully participate in online conversations and communities in their preferred or native languages. This is a particular problem for hundreds of millions of people who speak the many languages of Africa and Asia.”
To improve this, Meta has been developing systems that can learn language translations from smaller data sets, while also working with native speakers, where possible, to refine its systems.
That work has led to the development of its new translation model.
“We have created a single AI model called NLLB-200, which translates 200 different languages with much more accurate results than previous technology could achieve. When comparing translation quality to previous AI research, the NLLB-200 averaged 44% higher. For some African and Indian languages, the NLLB-200 translations were 70% more accurate.”
That will expand accessibility to more regions, while also ensuring that lesser-used languages live on in the future, another important consideration.
But Meta’s systems alone will not be able to facilitate full detection and translation of some languages. That’s why Meta is also opening up its data, to invite more native speakers and experts into the development process.
“We are also awarding up to $200,000 in NLLB-200 Impactful Uses Grants to researchers and nonprofits with initiatives focused on sustainability, food security, gender-based violence, education, or other areas in support of the UN Sustainable Development Goals. Nonprofit organizations interested in using NLLB-200 to translate two or more African languages, as well as researchers working in linguistics, machine translation, and language technology, are invited to apply.”
Combined, these initiatives will help Meta develop its translation tools, eventually allowing users in the metaverse space to more easily converse and engage, in real time, through language translation tools.
Google is also advancing its efforts on this front, with its translation tools now capable of transcribing foreign-language speech as it happens, providing more ways to interact in the moment.
The hope is that this will eventually facilitate greater global connection and opportunity by removing barriers to connection. But again, that was the great hope of the internet and social media too: that by providing a means to connect, we would facilitate greater understanding and community, by allowing more people to join the global conversation and add more perspectives to enrich our understanding.
That’s not exactly how things have developed, but there is unique value in enhanced language translation, especially in regions where many languages are spoken. It could eventually also facilitate Star Trek-like universal translator-type tools within the metaverse, which could open up new realms of connection and opportunity in entirely new ways.
That’s why this is an important project, and while it may be hard to fully imagine just yet, it’s good to see Meta looking to establish translation tools as a foundational element of the metaverse shift.
It could end up being a critical development. You can read more about Meta’s evolving translation process here.
Meta describes the advanced process of creating realistic digital avatars for the next stage of connection
While Meta has big dreams of a future metaverse, where we all interact in fully digital environments, and where we can be and do anything in fully immersive worlds, there is a significant impediment to this process as it currently stands.
Can you guess what it is?
Apparently the metaverse is making legs obsolete. And while current avatars are functional, in a basic sense, if Meta really wants to get people to engage with digital goods, like clothing and other customization items, and to feel more aligned with their virtual identity, it will need a better creation process, so you can build whatever representation you choose – right down to the laces of your virtual sneakers.
This is where this new development comes in.
As you can see in this new video, posted by Meta CEO Mark Zuckerberg, Meta is currently developing new technology that would enable the creation of more personalized digital recreations that simulate real human movement.
This could go a long way toward creating truly personalized and representative avatars, which Meta is also developing through its Avatar Codec technology.
As you can see from this example, which Meta shared last year, the ultimate goal is to allow users to create fully realistic versions of themselves for use in virtual worlds, which would include legs and a complete mapping of gestures.
If, of course, they choose to. By the same process, you will also be able to alter your appearance and shift your digital identity according to your own expression. But the basic concept is that you’ll have nearly endless customization options, allowing you to conduct virtual activities with a fully formed representation of yourself, customized to your liking, and optionally outfitted with digital clothing, just as you would buy clothes in a physical store.
This will open up a range of new opportunities for e-commerce, both in selling real items (because you can see what they look like before you buy) and digital items, which will be an expansion of the initial NFT push.
Although there is a level of excitement about NFT profile pictures online, the real future of digital items is not in these cartoonish images, but rather in virtual items, like digital clothing, that you can buy and sell, and take with you to other environments in the metaverse as you see fit.
Customizations like this have already proven popular in existing metaverse examples, with platforms like Roblox and Fortnite generating significant revenue from in-game customization options.

Users purchase the digital outfits, or “skins”, that they wish to use to represent themselves in these game worlds – but currently their use is limited to each individual property. The ultimate goal of the metaverse is to create a network of interoperable environments, where you can take these customizations with you – so if you choose to dress up as a banana character in Fortnite, you could then join a business meeting as that same character.
These advanced creation tools from Meta are another step in that direction, and it’s interesting to consider the scope of possibilities they could facilitate in this regard, and where such developments will go over the next ten years.
That fits the timeframe Meta has described for its metaverse shift. While many are looking to jump in early and claim pole position for the next technological development, the reality is that these systems will take time to formulate and to become more accessible to everyday users.
As a basic example – for an accurate avatar system to work and create a fully personalized 3D representation of you, you’ll need to scan yourself in a digital camera room, like the one shown in the Meta video above.
This could possibly be another element of Meta’s retail stores, the first of which opened in California earlier this month.
Meta Store, our first physical retail and experience space, opening May 9 at our Burlingame campus (322 Airport Blvd, Burlingame, CA). Check it out! pic.twitter.com/6UpKVOWiN5
— Boz (@boztank) April 25, 2022
It’s not there yet, but as Meta looks to expand its network of physical stores, it could also eventually add VR scanning booths where users can capture their virtual selves for these advanced avatars.
It’s still a long way off, but you can see where these developments are heading, and they could have big implications in a variety of ways.