The Oculus Interaction SDK has finally been released! Want to try it? Public GitHub SDK Download

By XR Bootcamp
February 2, 2022

A new kid on the block: the Meta/Oculus Interaction SDK.

If you have been developing VR experiences lately, you know that a proper Oculus Interaction SDK has been missing. If you have been using the Oculus/Meta Integration to create rich hand interactions and intuitive movement for Virtual Reality applications, you know how limited and difficult it can be to build an interaction-rich experience without coding most of it yourself.

So, how do you integrate Oculus with Unity? What is the Oculus SDK? How do you use the Oculus XR Plugin? How do you download the Oculus SDK? Let's get started with Oculus, the Oculus Quest Hand Tracking SDK, Meta Quest development, and hand tracking implementation in Unity.

You have probably needed to import and use complementary SDKs such as Unity's XR Interaction Toolkit, Microsoft's MRTK, the VR Interaction Framework, and so on, many times. Well, it looks like those days are (hopefully) over.

The Oculus Interaction SDK, just released by Meta/Oculus (yes, let's keep using the word "Oculus" as long as we can), is a very complete library of tools, components, prefabs and examples that tackles all your basic needs when you start developing better and richer experiences (optionally with Passthrough features), including features such as:

  • Hand pose grabbing: we can now pre-set how a hand will grab a specific interactable
  • New ray interactors: to interact with UI the same way as the home menus
  • Curved UI and canvases (like the Oculus/Meta menus). Yay!
  • Poke interaction: using your index finger to interact with UI, buttons and scroll views
  • Pose detection, such as detecting a "thumbs up" hand pose
  • Complex physics grabbing, such as two-hand scaling, rotation constraints, etc.

Previously, each of these features would have required an external (and, most of the time, paid) third-party asset, or many sleepless nights trying to figure out how to make the tip of your finger touch a world-space Unity Canvas.

It is important to point out that this is still a preview package, so there may still be some issues, as noted in the official SDK documentation.

Let's now dive deep into how to use the new Interaction SDK!

Clone, Download, Play! Public GitHub Interaction SDK Setup

Don't want to waste time? Test our Oculus Interaction SDK setup!

The Oculus Interaction SDK Experimental is a library of modular, composable components that allow developers to implement a range of robust, standardized interactions (including grab, poke, raycast and more) for controllers and hands. We created this repository so that everyone, from beginners to seasoned developers, can test out this new SDK by Oculus without the hassle of setting up the development environment themselves.

Just clone/download and hit PLAY!

Download the Public GitHub Interaction SDK Setup.

How To Install the Oculus Interaction SDK

Documentation & download link: https://developer.oculus.com/documentation/unity/unity-isdk-interaction-sdk-overview/

This time the new SDK comes bundled with the latest Oculus Integration for Unity (unlike the Meta Avatars SDK, which was released as a separate package). So to install it, just make sure you import Oculus Integration version 37.0 through the Package Manager in Unity 2020.3 LTS (or 2019.4 if you are using the legacy XR setup; more info here).

Besides that, you of course need to configure the development environment for your Quest device for Meta Quest development.

Let's dig in! Example Prefabs and scenes

To start, we can open the example scenes of the Oculus Interaction SDK, which you can find in the Interaction SDK samples folder after importing Oculus Integration.

The first thing we notice after opening one of the scenes is the environment, which follows some of the new art guidelines we saw at the latest Facebook/Meta Connect: soft, light colours. It also has a really nice stencil shader on the windows. Very elegant and minimalistic; well done, Oculus/Meta.

We'll be testing the scenes with hand tracking, so first, let's go to the OVRCameraRig and enable both controllers and hands. (Optionally, you can set the Hand Tracking Frequency to MAX; keep in mind this reserves some performance headroom from your app, but in this almost empty room it's fine.)
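
If you want to verify at runtime that hand tracking is actually active (for example while switching between controllers and hands), a small helper script like the sketch below can log the tracking state. It only relies on the OVRHand component that ships with the Oculus Integration; the object names in the comments are the defaults under OVRCameraRig and may differ in your setup.

```csharp
using UnityEngine;

// Minimal sketch (not part of the SDK): attach anywhere in the scene and drag in
// the OVRHand components that live under the hand anchors of the OVRCameraRig.
public class HandTrackingStatusLogger : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;   // e.g. LeftHandAnchor > OVRHandPrefab
    [SerializeField] private OVRHand rightHand;  // e.g. RightHandAnchor > OVRHandPrefab

    private bool leftWasTracked;
    private bool rightWasTracked;

    private void Update()
    {
        leftWasTracked = LogIfChanged("Left", leftHand, leftWasTracked);
        rightWasTracked = LogIfChanged("Right", rightHand, rightWasTracked);
    }

    // Logs only when the tracking state flips, so the console is not spammed.
    private bool LogIfChanged(string label, OVRHand hand, bool wasTracked)
    {
        if (hand == null) return wasTracked;

        bool isTracked = hand.IsTracked;
        if (isTracked != wasTracked)
        {
            Debug.Log($"{label} hand tracking {(isTracked ? "started" : "lost")}; " +
                      $"index pinch: {hand.GetFingerIsPinching(OVRHand.HandFinger.Index)}");
        }
        return isTracked;
    }
}
```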

Example scene 1: BasicGrab

In this scene of the Oculus Interaction SDK, we can see different examples of the grabbing capabilities we can explore. Pinch Grab, the first example, is basically the grab we all know from Oculus (although it is set up differently here, using the new interfaces provided with the SDK). Next to it we can see Palm Grab and Combined Grab, which are set up similarly: poses are pre-defined for both right and left hands, with the freedom to choose which fingers are constrained to the pose and which fingers are free.

In the last example, with the cup, we can look into the HandGrabPoint class and check that in one of the poses every finger is Locked but the pinky is set to Free, which allows us to drink tea like royalty.
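
The new interactables expose grab and release through events and interfaces rather than the old OVRGrabbable callbacks. As a rough sketch (assuming you wire the methods below to the interactable's select/unselect UnityEvents in the Inspector, for example through the SDK's Unity-event wrapper component), a plain MonoBehaviour is enough to react to a grab:

```csharp
using UnityEngine;

// Sketch of a grab reaction script. Place it next to a grab interactable and
// hook OnGrabbed/OnReleased up to its select/unselect UnityEvents in the Inspector.
public class GrabFeedback : MonoBehaviour
{
    [SerializeField] private Renderer targetRenderer;        // renderer of the grabbed object
    [SerializeField] private Color grabbedColor = Color.cyan;

    private Color originalColor;

    private void Awake()
    {
        if (targetRenderer != null)
            originalColor = targetRenderer.material.color;
    }

    // Wire this to the "select" (grab) event.
    public void OnGrabbed()
    {
        if (targetRenderer != null)
            targetRenderer.material.color = grabbedColor;
    }

    // Wire this to the "unselect" (release) event.
    public void OnReleased()
    {
        if (targetRenderer != null)
            targetRenderer.material.color = originalColor;
    }
}
```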



Example scene 2: BasicPoke

In this scene of the Oculus Interaction SDK, we find another four examples, this time about using your index finger to interact with different types of objects, introducing a new class called PokeInteractable.

It seems that this class allows us to use our fingers to interact with 3D and 2D objects alike, which is neat. The examples here are Button, Pressy Touchpad, Unity Canvas & Scrolling, and Hover Above Touch.

What is really satisfying is that the interaction includes some resistance, so your hand will not pass through the button, which contributes to the haptic sensation.

This is achieved by setting the ProximityField (another new class needed for your poke interaction) with an offset, as we can see in the "Pressy Touchpad" example.

In that example, the green squares are the ProximityFields.
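
Once the poke input reaches a world-space canvas, regular Unity UI callbacks appear to fire as usual, so your button code can stay completely standard. A minimal sketch, assuming the sample's canvas setup keeps forwarding the poke events to Unity's UI system:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: ordinary Unity UI code keeps working once the canvas receives poke input.
// Attach to the world-space canvas and assign a Button plus a Text label.
public class PokeCounter : MonoBehaviour
{
    [SerializeField] private Button pokeButton;   // a button on the world-space canvas
    [SerializeField] private Text counterLabel;   // label showing how often it was poked

    private int pokeCount;

    private void OnEnable()
    {
        pokeButton.onClick.AddListener(HandlePoke);
    }

    private void OnDisable()
    {
        pokeButton.onClick.RemoveListener(HandlePoke);
    }

    private void HandlePoke()
    {
        pokeCount++;
        counterLabel.text = $"Poked {pokeCount} times";
    }
}
```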



Example scene 3: BasicPoseDetection

In this scene of the Oculus Interaction SDK, we can test the hand-tracking-exclusive new feature of pose detection.

We can create new poses to be recognized by right-clicking in the Project window and selecting Create > Oculus > Interaction > SDK > Pose Detection > Shape.

A Shape Recognizer is a scriptable object that stores the states of the different fingers. For example, the thumbs-up pose consists of the thumb's Curl set to Open and the remaining fingers' Curl and Flexion set to Not Open. Curl and flexion are anatomical terms describing finger and muscle movement; I found a video that explains them a bit.
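
In the sample scene, a recognized pose ultimately fires UnityEvents, so reacting to a "thumbs up" can again be done with a plain script whose methods you wire to the pose detected/lost events in the Inspector. A minimal sketch, with the wiring (via the selector and event-wrapper components used in the sample) being an assumption about your setup:

```csharp
using UnityEngine;

// Sketch: hook ShowFeedback/HideFeedback to the "pose detected" / "pose lost"
// UnityEvents of the thumbs-up recognizer in the sample scene.
public class ThumbsUpFeedback : MonoBehaviour
{
    [SerializeField] private GameObject feedbackVisual;  // e.g. a floating icon above the hand

    // Wire this to the event raised when the pose is recognized.
    public void ShowFeedback()
    {
        feedbackVisual.SetActive(true);
        Debug.Log("Thumbs up detected");
    }

    // Wire this to the event raised when the pose is no longer held.
    public void HideFeedback()
    {
        feedbackVisual.SetActive(false);
    }
}
```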

Example scene 4: ComplexGrab

Translate on Plane: This is a great example of how the new interactables can be configured to create interactions with constraints, such as movement restricted to a plane.

Rotate with min/max angle: Finally, we can configure our own doors; let's hope it doesn't turn into the nightmare it has always been in the history of game engines (https://www.ign.com/articles/putting-doors-in-video-games-is-a-nightmare-say-developers).

2H Transform: We can also, finally, have two-hand interactions such as scaling an object, something very useful in design and creative apps. We can also test throwing and physics behaviors in this example. Great!
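
Under the hood, two-hand scaling boils down to comparing the distance between the two grab points over time; the SDK's transformer components handle this for you. Purely as a conceptual sketch (not the SDK's actual implementation), the math looks roughly like this:

```csharp
using UnityEngine;

// Conceptual sketch of two-hand uniform scaling, NOT the SDK's implementation.
// leftGrab/rightGrab stand in for the two grab points (e.g. the hand anchors).
public class TwoHandScaleSketch : MonoBehaviour
{
    [SerializeField] private Transform leftGrab;
    [SerializeField] private Transform rightGrab;

    private float initialHandDistance;
    private Vector3 initialScale;
    private bool bothHandsGrabbing;

    // Call this when the second hand starts grabbing the object.
    public void BeginTwoHandGrab()
    {
        initialHandDistance = Vector3.Distance(leftGrab.position, rightGrab.position);
        initialScale = transform.localScale;
        bothHandsGrabbing = true;
    }

    // Call this when either hand releases the object.
    public void EndTwoHandGrab() => bothHandsGrabbing = false;

    private void Update()
    {
        if (!bothHandsGrabbing || initialHandDistance <= Mathf.Epsilon) return;

        // Scale the object by the ratio of the current to the initial hand distance.
        float currentDistance = Vector3.Distance(leftGrab.position, rightGrab.position);
        transform.localScale = initialScale * (currentDistance / initialHandDistance);
    }
}
```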



Example scene 5: BasicRay

Here we can see three different types of curved (yes, curved!) canvases with different rendering modes: Alpha Blended, Underlay and Alpha Cutout. We can interact with them with a ray coming from the hands, which feels exactly the same as interacting with Oculus' menus in Home.
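
Because the canvases in this scene receive their input through Unity's event system, standard pointer handlers seem to work with the hand ray as well. A minimal sketch of a hover highlight, assuming the sample's canvas module keeps forwarding the ray events:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Sketch: a hover highlight for a UI element on one of the curved canvases.
// Assumes ray input is forwarded to Unity's EventSystem, as in the sample scene.
public class RayHoverHighlight : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
{
    [SerializeField] private Image background;               // the element's background image
    [SerializeField] private Color hoverColor = Color.yellow;

    private Color normalColor;

    private void Awake() => normalColor = background.color;

    // Called when the ray starts hovering this element.
    public void OnPointerEnter(PointerEventData eventData) => background.color = hoverColor;

    // Called when the ray stops hovering this element.
    public void OnPointerExit(PointerEventData eventData) => background.color = normalColor;
}
```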

Summary: Realistic hand & controller interactions have now become much easier!

With the Oculus Interaction SDK, the Oculus Integration is starting to fill a gap that has existed since the release of more advanced interaction SDKs such as MRTK. The Interaction SDK is still not as complete as Microsoft's counterpart, but it is definitely a very solid start.

I also think that with this SDK, Oculus/Meta is trying to educate and level up developers in general, teaching best practices and a more generalized approach to input and interactables. This is good for the VR ecosystem, but it is still a bold move, because there's always the chance that developers feel they need to completely re-learn how the integration works.

On the other hand, it is very exciting to see newer SDK releases (such as Meta Avatars and this one) fully support and embrace hand tracking.

Personally, I think hand tracking will be the standard of this new era of spatial computing. It would not surprise me if the Quest 3 comes with optional controllers, so the ever-growing number of people interested only in hand tracking can buy the device for less, while game-oriented users pay extra for controllers. Hand tracking is just a great way to get deeply immersed in any Virtual Reality experience, making VR more and more fun moving forward.

Maybe this way Meta can pull it off again and release a newer, better device priced lower than its predecessor? Let's see!

Discuss further about the Interaction SDK in our Discord Channel!

