
Meta Quest Debuts Mixed Reality Developer Tools


On Tuesday, Meta introduced several updates to its Presence Platform, originally unveiled at last October’s Connect event, to facilitate accurate mixed reality (MR) interaction.

The Menlo Park-based firm has added keyboard tracking and experimental interaction software development kits (SDKs) to the platform, following the rollout of its Spatial Anchors Experimental, Voice SDK, and Passthrough API solutions aimed at expanding the Quest’s capabilities.

Last month, Meta also released Quest v37, which introduced a trove of new features such as Android link sharing, tablet mode, desktop mode, improved hand tracking, and a quick action menu, and teased the tracked keyboard support that arrived yesterday.

Supported on both Quest and Quest 2 devices, the v37 updates improve the headset’s MR features for developers and consumers, and Meta has now followed them with its Tracked Keyboard and Interaction SDKs for developers.

Introducing Meta’s Tracked Keyboard SDK

The new Tracked Keyboard SDK builds on Quest v37, which added the ability to pair keyboards to a Quest headset via Bluetooth and view 3D representations of the keyboard and the user’s hands.

The latest SDK update uses computer vision to locate and render real-world keyboards, using pre-existing hand tracking and passthrough technology to render a user’s hands and inputs.

To refine the feature, Meta worked with workspace productivity app vSpatial, which has already integrated virtual keyboards into its platform; the partnership will further enhance the feature.

The Tracked Keyboard SDK greatly improves hybrid working environments by cutting down on equipment needs, said Daniel Platt, vSpatial Co-Founder and Director of Product, adding:

“When it comes to being truly productive, the keyboard is king. To finally bring the natural feeling of typing on a keyboard to a VR headset that doesn’t require calibrating and shows the detail of all the keys and your hands is exactly what users need to be productive”

He added his company was “extremely excited” about progress on the feature as third-party developers would not “need to create their own keyboard models and deploy their own solutions.”

Currently, Meta supports the Logitech K830 and the Apple Magic Keyboard, and the tech giant plans to add more keyboards soon.

Interaction Experimental SDK

The Interaction SDK gives developers a flexible framework that is easy to integrate into new or existing immersive applications.

Some of its features include:

  • Improved object interactions that let users resize, throw, and pass their digital items.
  • Hand Grab allows developers to create accurate, custom hand poses for when a user picks up a virtual object.
  • Pose Detection enables developers to create unique gestures that are customisable down to each digital finger.
  • Direct Touch enables VR developers to add virtual surfaces that react to a user’s hands with immersive haptic feedback.
  • Targeting and Selection allows users to point and click with their hands on menu panels.

Meta worked with various third-party developers and the Quest community to create an accurate MR interaction platform. Developers Odders Lab, ForeVR Games, and Miru Studio created a small selection of tech demos to show off the Interaction Experimental SDK.

For example, the trio of developers showed how to use the SDK to improve world interaction and add new ways to communicate with virtual friends.

Odders Lab used the SDK to create a multiplayer chess game that shows the MR abilities of the Quest 2. The third-party development team commented on integrating the SDK, stating,

“The improvement to the grabbing experience is remarkable. Although the gesture detection looked a bit difficult to implement at the beginning, the speed and accuracy of the detection are outstanding”

What is the Presence Platform?

Formally announced in October 2021 during the Meta Connect event, the Presence Platform gives VR developers the tools required to blend virtual objects with real-world surroundings.

By introducing the Presence Platform, Meta is turning the Quest headset into an MR device. Originally a VR-oriented piece of kit, the headset is being expanded without extra hardware or peripherals.

Upcoming Presence Platform features could also serve as the foundation for some of Meta’s future projects, including the Nazare and Cambria headsets.

Additionally, the Presence Platform contains the multiplayer building blocks needed to fuel the Menlo Park-based firm’s Metaverse vision.


