How to enable VR Hand Tracking 2.0 in your VR app

By XR Bootcamp
April 20, 2022

Meta's VR Hand-Tracking 2.0 Update


As you know from our Advanced VR Interactions Master Class, we're big fans of intuitive and natural VR hand-tracking. It's just so important for the feeling of presence in VR, and making games and apps feel natural and immersive.

In a recent blog post, Meta announced that Hand Tracking 2.0, the latest version of the Presence Platform's hand-tracking capability in the Meta Quest Hand Tracking API, is now available. The improved tracking system, powered by computer vision and machine learning, lets the device keep track of hands when they move fast, when they overlap, and when they perform more complex gestures.

The most important improvements include:

  • a better understanding of hand poses through deep learning (especially when the cameras can't see the full hand or when hands move fast)
    • hand-over-hand interactions (e.g. clapping or giving high-fives)
    • recognition of gestures like pinches, grabs, and pokes (helpful, for example, for playing instruments in VR)

If you need a full tutorial and sample project files to dive deep into the Oculus Interaction SDK, here's the tutorial and the public GitHub SDK download link.


In Meta's words: “We envision a future where your hands are able to move as naturally and intuitively in the metaverse as they do in real life. Over the past few years, we’ve made major strides in making that a (virtual) reality, beginning with the introduction of hand tracking on Quest in 2019. Since then, we’ve seen many developers build hand mechanics into their immersive experiences.”


Developer Feedback on Meta's VR Hand-Tracking Update 2.0

Among the early adopters and testers of the Hand Tracking 2.0 system are Dennys and Roger, the creators of Hand Physics Lab (also featured by Mark Zuckerberg in a recent Facebook post) and Master Trainers of the Advanced Interactions Master Class. They comment:

"This update to hand tracking is a big step forward for natural and intuitive interactions with hands. One of the biggest challenges when building a hands-first application is the reliability and accuracy of hand tracking provided by the system. Hand Physics Lab was designed to highlight what was possible at the time and to challenge the user to play with the limitations of the technology. With this big improvement, we hope more people will discover what hand tracking has to offer to immersive experiences."


How to Enable VR Hand-Tracking 2.0 in your VR Project

So, how do you enable Hand Tracking 2.0 in your VR project?

If you've already built hand tracking into your VR apps, you don't need to change the existing API calls.

The best part is that you only need to add one line to your project's custom Android manifest. In Unity, you first need to create a custom manifest file: go to Oculus > Tools > Create store-compatible AndroidManifest.xml.


This will create an AndroidManifest.xml file (if you didn't already have one) in the Plugins/Android folder. Go to that folder and open the manifest.

Now, add the following line to the manifest:

<meta-data android:name="com.oculus.handtracking.version" android:value="V2.0"/>

The manifest should then look something like this:

[Screenshot: the AndroidManifest.xml with the hand-tracking meta-data line added]

(The meta-data entry is the only line we added by hand; the rest of the manifest is generated automatically by Oculus Integration when we create it.)
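For reference, here is a rough sketch of how the finished manifest might look. The exact contents generated by Oculus Integration will differ from project to project (package name, activity, permissions); the one thing that matters is that the meta-data element sits inside the <application> tag:

```xml
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.unity3d.player">
  <!-- Generated by Oculus Integration when hand tracking is enabled -->
  <uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
  <uses-feature android:name="oculus.software.handtracking" android:required="false" />
  <application>
    <activity android:name="com.unity3d.player.UnityPlayerActivity">
      <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
      </intent-filter>
    </activity>
    <!-- The only line we added by hand: opt in to Hand Tracking 2.0 -->
    <meta-data android:name="com.oculus.handtracking.version" android:value="V2.0" />
  </application>
</manifest>
```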

And that’s it. Go clap in VR (because now you can) for the Meta engineers behind the improved hand-tracking technology :)!

If you're just getting started with hand tracking, we recommend checking out our article on hand tracking here: https://xrbootcamp.com/meta-oculus-interaction-sdk/

Reach out to us on our Discord channel if you need more support.
© 2021 XR BOOTCAMP. All Rights Reserved