
What does the Meta AI Showcase Mean for XR?


Meta CEO Mark Zuckerberg once again took the virtual stage on Wednesday to tease the public with new information about the firm’s commitment to artificial intelligence (AI) and how it will help to develop extended reality (XR) technologies and, most notably, Metaverse experiences.

During the online event, the CEO and other Meta executives announced a trove of new AI technologies, some ready to try today and others still in development.

Many of the announcements feed into the firm’s ongoing vision of providing a leading Metaverse experience through its Horizon Worlds platform.

Additionally, Meta AI developments will continue to shape how individuals and groups use augmented, virtual, and mixed reality (AR/VR/MR) hardware and software by introducing AI-driven contextual digital assistants.

The developments come after Meta announced its AI Research SuperCluster (RSC) supercomputer in January, which aims to become the world’s fastest supercomputer, particularly for natural language processing, computer vision, and deep learning research. Meta expanded on much of this research during the online event by outlining developer opportunities and potential public use cases.

What is Builder Bot?

Builder Bot is currently the best visual representation of how Meta AI will create the future of the Metaverse. The tool employs Meta’s self-learning AI platform to generate instant virtual environments.

Individuals use voice commands to ask Builder Bot to create or import environments on the spot, such as parks and beaches. Builder Bot can also add smaller objects, such as clouds or trees, at a user’s voice command.

In addition to objects, the voice-activated bot adds sound effects and music to immerse users in digital environments.
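Meta has not published a Builder Bot API, so the interaction flow described above can only be illustrated hypothetically. The Python sketch below uses made-up asset names and naive keyword matching in place of real speech understanding; it simply shows the general idea of a transcribed voice command being mapped to scene-edit operations that a rendering layer would then apply.

```python
# Purely hypothetical sketch - Builder Bot's real pipeline is not public.
# A transcribed voice command is mapped to simple scene-edit operations.
from dataclasses import dataclass, field
from typing import List, Optional

# Assumed asset catalogue; names are illustrative, not Meta's.
ENVIRONMENTS = {"park", "beach", "island"}
OBJECTS = {"tree", "cloud", "picnic table"}


@dataclass
class Scene:
    environment: Optional[str] = None
    objects: List[str] = field(default_factory=list)
    ambience: List[str] = field(default_factory=list)


def apply_command(scene: Scene, transcript: str) -> Scene:
    """Naive keyword matching standing in for real speech understanding."""
    text = transcript.lower()
    for env in ENVIRONMENTS:
        if env in text:
            scene.environment = env
    for obj in OBJECTS:
        if obj in text:
            scene.objects.append(obj)
    if "music" in text or "sound" in text:
        scene.ambience.append("background-music")  # placeholder audio asset
    return scene


scene = Scene()
apply_command(scene, "Take us to the beach")
apply_command(scene, "Add some clouds and a picnic table")
apply_command(scene, "Play some background music")
print(scene)
```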

The Builder Bot promotional trailer contains graphics similar to Horizon Worlds, the Menlo Park-based firm’s in-house Metaverse platform, which allows individuals to develop custom worlds such as immersive games or virtual meeting spaces.

Last year, Meta released Horizon Worlds as a beta for users in the United States (US) and Canada aged 18 and up, offering access to a VR content-creation ecosystem to develop, distribute, and experience custom creations.

Project CAIRaoke

There is a lot to unpack with Project CAIRaoke, from exciting breakthroughs in self-learning and conversational AI to its practical use cases in everyday life. All this works together to build the future of digital assistants, Meta’s vision for Horizon Worlds, and future Builder Bot functions.

Project CAIRaoke employs an end-to-end neural model that generates a fluid, conversational AI system, marking the early stages of Meta’s goals to create digital assistants.

CAIRaoke combines four key AI components: natural language understanding (NLU), dialogue state tracking (DST), dialogue policy (DP) management, and natural language generation (NLG).

The building blocks of Project CAIRaoke (Photo: Meta)

By combining the four tools into a single model, Project CAIRaoke does not rely on scripted dialogue; instead, it generates real-time, contextualised, and personalised conversation flows to help individuals in their day-to-day routines.
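For context, conventional assistants wire those four stages together as separate modules. The hypothetical Python sketch below uses placeholder logic, not Meta code, to show what such a modular NLU → DST → DP → NLG pipeline looks like; Project CAIRaoke’s contribution is to fold these hand-wired stages into a single end-to-end neural model.

```python
# Hypothetical sketch of a conventional, modular assistant pipeline.
# Project CAIRaoke replaces these separate stages with one end-to-end
# neural model; the function bodies here are placeholders, not Meta code.

def nlu(utterance: str) -> dict:
    """Natural language understanding: extract an intent and its slots."""
    return {"intent": "set_reminder", "slots": {"item": "keys"}}


def dst(state: dict, parse: dict) -> dict:
    """Dialogue state tracking: fold the new parse into the running state."""
    state.setdefault("slots", {}).update(parse["slots"])
    state["intent"] = parse["intent"]
    return state


def dialogue_policy(state: dict) -> str:
    """Dialogue policy: choose the next system action from the state."""
    return "confirm_reminder" if state["intent"] == "set_reminder" else "clarify"


def nlg(action: str, state: dict) -> str:
    """Natural language generation: render the chosen action as text."""
    if action == "confirm_reminder":
        return f"Okay, I'll remind you about your {state['slots']['item']}."
    return "Sorry, could you rephrase that?"


state: dict = {}
parse = nlu("Remind me where I left my keys")
state = dst(state, parse)
action = dialogue_policy(state)
print(nlg(action, state))
```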

AI Assistants in AR Smart Glasses

In terms of XR, Meta plans to integrate Project CAIRaoke digital assistants into its upcoming Project Nazare AR smart glasses – the firm also suggests integration into future VR headsets.

Meta originally teased AR digital assistants at its Connect Event late last year, revealing how users with the experimental smart glasses could talk to digital assistants and receive AR visuals.

Additionally, the AI platform scans a user’s environment to improve the identification of real-world items and the role each plays in their routines.

In July last year, Meta, then Facebook, partnered with AI and spatial data company Matterport to “help change the way we work and live” with a Habitat-Matterport 3D Research Dataset (HM3D) database of digital twins of indoor spaces.

The revolutionary partnership trains AI on HM3D to recognise objects and follow instructions faster and more accurately.

 

At the time, Dhruv Batra, Research Scientist for Meta AI, foreshadowed this week’s announcements, stating,

“Now imagine a world in which an AR assistant can tell you, ‘Hey, you forgot your keys on the coffee table.’”

In a recent press release, Meta explained,

“On devices like VR headsets and AR glasses, we expect this type of communication to eventually be the ubiquitous, seamless method for navigation and interaction, much as how touchscreens replaced keypads on smartphones.”

Already, Meta is testing Project CAIRaoke on Portal – the firm’s video-calling and digital assistant device.

Meta is Going Open-Source

To close the event, Meta unveiled TorchRec, a free, open-source PyTorch domain library for building and deploying AI-based recommendation systems.

Developers using TorchRec can observe and learn from the latest Meta AI research, bug reports, and fixes, as well as access shared resources. With AI data available under a single system, Meta and its developer community can solve AI problems together, transparently.
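TorchRec itself is distributed as a Python package built on PyTorch. As a rough illustration (class names and signatures follow the library’s public examples but may differ between releases), the sketch below builds a small collection of embedding tables of the kind used in recommendation models and looks up pooled embeddings for a toy batch:

```python
# Minimal TorchRec sketch (API details may differ between releases).
import torch
import torchrec

# Two sparse-feature embedding tables, pooled by sum.
ebc = torchrec.EmbeddingBagCollection(
    device="cpu",
    tables=[
        torchrec.EmbeddingBagConfig(
            name="product_table",
            embedding_dim=16,
            num_embeddings=4096,
            feature_names=["product"],
            pooling=torchrec.PoolingType.SUM,
        ),
        torchrec.EmbeddingBagConfig(
            name="user_table",
            embedding_dim=16,
            num_embeddings=4096,
            feature_names=["user"],
            pooling=torchrec.PoolingType.SUM,
        ),
    ],
)

# A jagged batch of two samples: sample 0 has two "product" IDs,
# sample 1 has one; each sample has one "user" ID.
features = torchrec.KeyedJaggedTensor(
    keys=["product", "user"],
    values=torch.tensor([101, 202, 303, 11, 22]),
    lengths=torch.tensor([2, 1, 1, 1]),
)

pooled = ebc(features).to_dict()
print(pooled["product"].shape)  # pooled embeddings per sample
```

In a full recommendation model, pooled embeddings like these would feed dense layers that produce the final ranking scores.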

TorchRec also enables Meta to run large-scale benchmark tests on public datasets, demonstrating AI performance at far greater scale, speeding up research and development (R&D) times, and improving the contextual power of Project CAIRaoke.

TorchRec is debuting as an open-source platform, creating a transparent space that invites developers to keep building on the project indefinitely. Developers can already draw on a selection of open XR platforms and standards, such as the Khronos Group’s OpenXR and the Institute of Electrical and Electronics Engineers’ (IEEE) open standards.
