Amid all the hoopla around augmented and virtual reality (AR/VR), the potential of mixed reality (MR) is often ignored. Yet mixed reality has the power to engage users in more meaningful, organic, and arguably more natural ways than AR or VR alone.
Research suggests that MR coupled with AR can increase customer confidence when shopping, with 71% of consumers saying they would react favourably to these tools. By 2026, the combined MR and AR market is projected to be worth around $104 billion globally.
This is probably why Facebook (now Meta) has paid close attention to MR opportunities during its recent pivot towards the metaverse.
Its popular Quest VR experience will soon feature MR and persistent digital objects superimposed onto the real world, alongside pure-play virtual reality apps.
MR for Quest is still in the works, and development efforts will be aided by Meta’s recently launched Insight SDK.
At Facebook Connect 2021, the company announced a number of significant moves, including its new branding as Meta, new VR products, and the Presence Platform for developers.
The Presence Platform will equip Facebook’s developer community with the tools they need to build VR worlds and their many components.
Much like the Apple Developer program or the Google Play Console, the Presence Platform will be a one-stop solution for anyone building VR experiences for Meta’s platform.
The Insight SDK (along with other software development kits) is part of the Presence Platform.
Meta’s Insight SDK is a software development kit that allows Quest developers to build mixed reality experiences with an authentic sense of presence. It handles persistent objects, colour recreation, scene awareness, and scale measurement, while also ensuring those experiences are indexable and queryable.
In other words, Insight SDK will help bring MR scenarios to life with a verisimilitude not possible before and at a fraction of the development effort. Underlying all of this is Meta’s proprietary machine perception and artificial intelligence IP.
Using Insight SDK, developers will be able to gain from the following capabilities:
1. General availability of Passthrough API
Meta previously launched an experimental version of the Passthrough API, which allowed developers to insert black-and-white images from the VR headset’s sensors into an MR experience.
In addition to this foundational capability, developers can also add effects, manipulate the image’s appearance, and even create scenarios where virtual assets are superimposed on top of real-world surroundings.
With the new Insight SDK, Passthrough exits the experimental mode to become generally available.
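Conceptually, a passthrough experience composites the headset’s camera feed with rendered virtual content. The sketch below illustrates that idea with simple per-pixel alpha blending; the function name and pixel format are hypothetical and are not part of Meta’s actual Passthrough API.

```python
# Illustrative sketch (not Meta's API): compositing a virtual asset
# over a passthrough camera frame with per-pixel alpha blending.

def composite(passthrough_px, virtual_px, alpha):
    """Blend one virtual RGB pixel over one passthrough RGB pixel.

    alpha = 1.0 shows only the virtual asset; alpha = 0.0 shows
    only the real-world camera feed.
    """
    return tuple(
        round(alpha * v + (1 - alpha) * p)
        for v, p in zip(virtual_px, passthrough_px)
    )

# A grey passthrough pixel under a half-transparent red virtual pixel:
blended = composite((128, 128, 128), (255, 0, 0), alpha=0.5)
print(blended)  # (192, 64, 64)
```

In a real engine this blend runs on the GPU per frame, but the principle is the same: the passthrough image is just another layer beneath the virtual scene.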
2. Spatial Anchors Experimental
This is a brand new capability introduced at Connect 2021. Spatial anchors refer to “world-locked frames of reference” that let you position virtual objects in physical surroundings in a persistent manner.
Simply put, when a Quest user places a virtual object on a real-world surface and logs off, the object will be waiting in the same position the next time they log in. Spatial Anchors are created with six degrees of freedom (6DOF) poses, with tracking and history retrieval capabilities.
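The core idea behind persistence is storing a world-locked 6DOF pose under a stable identifier and resolving it in a later session. The sketch below models that with JSON serialisation; all class and method names are hypothetical and do not reflect Meta’s actual Spatial Anchors API.

```python
# Illustrative sketch (hypothetical names, not Meta's API): persisting
# world-locked 6DOF anchor poses across sessions.
import json
import uuid

class AnchorStore:
    """Keeps anchor poses so they survive a logout/login cycle."""

    def __init__(self, serialized=None):
        # Reload anchors persisted by a previous session, if any.
        self._anchors = json.loads(serialized) if serialized else {}

    def create(self, position, rotation):
        """position: [x, y, z] in metres; rotation: quaternion [x, y, z, w]."""
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = {"position": position, "rotation": rotation}
        return anchor_id

    def resolve(self, anchor_id):
        """Retrieve the pose of a previously placed anchor."""
        return self._anchors[anchor_id]

    def serialize(self):
        """Snapshot all anchors before the user logs off."""
        return json.dumps(self._anchors)

# Session 1: place a virtual lamp on a real table, then log off.
session1 = AnchorStore()
lamp = session1.create([0.4, 0.9, -1.2], [0.0, 0.0, 0.0, 1.0])
snapshot = session1.serialize()

# Session 2: the lamp is waiting in the same position.
session2 = AnchorStore(snapshot)
print(session2.resolve(lamp)["position"])  # [0.4, 0.9, -1.2]
```

The real system also has to re-localise the pose against the headset’s tracking map each session; the sketch only captures the persistence half of the problem.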
3. A new Scene Understanding feature
Scene understanding is what makes AR and MR feel real to the user. To take a simple example, a Snapchat filter knows exactly how your facial features are arranged so that it can position a hat, a wig, or animal ears accordingly.
Scene understanding allows the digital environment to assess the physical surroundings, build links to them, and provide scene-aware experiences. Meta’s Insight SDK has a new scene understanding feature, which includes several advanced tools within it.
4. Scene Model for room design to scale
Scene Model is among the sub-capabilities under Meta’s Scene Understanding. It automatically assesses a user’s surrounding space to provide a geometrically and semantically accurate representation.
As a result, you can scale MR experiences and virtual rooms to a T, and attach objects with realistic occlusion, i.e., the phenomenon where parts of a virtual object are hidden or overlapped by physical objects to create a realistic sense of depth.
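Occlusion ultimately comes down to a per-pixel depth test: a virtual fragment is drawn only if it is closer to the viewer than the physical surface at the same pixel. The sketch below illustrates that logic; the function and data layout are hypothetical, not Meta’s API.

```python
# Illustrative sketch of occlusion: draw a virtual fragment only when it
# is closer to the viewer than the real-world surface at that pixel.

def occlusion_mask(virtual_depth, scene_depth):
    """Per-pixel visibility for a row of pixels.

    Depths are distances from the headset in metres; smaller is closer.
    Returns True where the virtual object is visible, False where a
    physical object hides it.
    """
    return [v < s for v, s in zip(virtual_depth, scene_depth)]

# A virtual ball at 2 m, partially behind a real couch at 1.5 m:
virtual = [2.0, 2.0, 2.0, 2.0]
scene = [3.0, 3.0, 1.5, 1.5]  # depth values from the scene representation
print(occlusion_mask(virtual, scene))  # [True, True, False, False]
```

This is why a geometrically accurate scene representation matters: without reliable real-world depth, virtual objects float in front of everything and the illusion of depth collapses.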
5. System-guided Scene Capture
This feature is designed to save developers a massive amount of effort and time. The prebuilt Scene Capture capability will allow users to walk around wearing their VR headset and capture the physical environment around them.
Again, Scene Capture is part of scene understanding in Insight SDK – together with Scene Models, it will help recreate MR representations of the physical world.
6. Data privacy considerations
At the very outset, Meta had announced that it would respect user privacy and build a responsible metaverse. To that end, the Insight SDK is careful to not rely on personally identifiable data or confidential developer/user information.
Developers can use Insight SDK’s components to create MR experiences without having to use any raw videos or image data collected via Quest sensors.
Insight SDK is only one part of the Presence Platform for Quest developers. Along with this, Meta will provide:
- Interaction SDK – for hands and controller-centric interactions
- Tracked Keyboard SDK – VR keyboard for typing in the metaverse
- Voice SDK – an experimental SDK for natural language, hands-free navigation
Some of these features and SDKs are available in experimental versions, others are already generally available, and some, like Scene Understanding, will launch in experimental mode next year.
One of the key traits of the metaverse as outlined at Connect is decentralisation. This means that no one company or individual will control or enable it.
Offerings like Insight SDK democratise development for the metaverse, providing VR designers and software engineers with an easy ramp-up to commercialisation. On the other hand, Insight SDK and other components of the Presence Platform are still tied to the Quest platform.
It remains to be seen whether Meta’s Insight SDK could power wider, cross-technology development with Facebook’s impressive machine perception and AI tools at their core.
Insight SDK will be launched with a complete set of capabilities in 2022.