The best 5 VR SDKs for Interactions for Unity & Unreal

By XR Bootcamp
March 18, 2022

How to select the right AR/VR Interaction SDK for your project

Have you ever considered changing your preferred VR SDK? Are you curious about what you have been missing out on by not selecting a different VR Interaction SDK for your project (especially after the release of Meta's new Presence Platform and Interaction SDK)? This article gives you the overview you need to decide on the best choice for your next VR development project.

Have you recently started working with Unity VR SDKs? Are you a beginner learning about virtual reality tools, software, and software development kits? Maybe you have followed our Unity VR tutorial on how to get started with Unity VR development? Now you're probably ready to take the next step. You have started your VR or AR project with Unity, maybe even a VR game development project, and you've already downloaded the Unity Hub, created your project, opened it, connected your VR device, and imported your 3D models.

Now you're asking yourself how to get started with VR Interactions and are looking for VR tools. How will the user of your VR app or game interact with the digital world you have created? This is the “magic” of your future VR or AR application.


The Magic of XR Interactions - Courtesy of Holonautic’s Hand Physics Lab

This Unity VR SDK tutorial and overview may also be helpful if you've only worked with one specific Unity virtual reality SDK in the past and now want to build VR apps or XR games with another one. Based on our comparison table, you can easily compare the platform you've already been working with against one you may be working with in the future. In addition, you may be curious which features Meta has added to their recently launched Interaction SDK, and how it compares to the previously existing ones.

 

There are many new releases, updates, and new Interaction SDKs popping up every day. That's why we created an Interaction SDK Knowledgebase on Notion, where we regularly track and add new frameworks and features. If you would like to add your comments or input, or just want access to the VR Interactions SDK Overview Table, feel free to submit your e-mail - we will send you the link.

 

Get access to the VR Interactions SDK Overview Table

What is an Interaction SDK?

Before starting, let's briefly define what an Interaction SDK is. "SDK, you say?" Yes, an SDK, or Software Development Kit. It contains not only script libraries but also documentation, guides, tutorials, examples, and more, to make developers' work easier.


VR SDKs are great virtual reality tools and an easy way to implement tons of pre-built, pre-configured interactions in your XR project, such as grabbing objects, interacting with UI, recognizing hand gestures, etc.

When using the appropriate virtual reality SDK for Unity VR projects, we can focus on building and designing our experience rather than creating everything from scratch. This way you can get your XR project up and running quickly.

 

The Importance of VR Interactions

Interactions are simply the core of the XR experience. We can have an incredible world that we created, perfectly tracked hands that match our real hands, beautiful visuals and sound… but what makes XR really outstanding is how we can interact with the world around us.

This is what makes designing interactions difficult and so important in VR. More than any other medium, VR interactions are centered around your physical surroundings and senses. "Immersive" and "real" are the keywords we think about when designing VR interactions. No matter how fantastic your VR app is, grounded interactions that make the user feel comfortable in the environment are essential. They can be the difference between "fun" and "boring."

Some of the main interactions you should be looking for in a good Interaction SDK are:

Grabbing

Grabbing is one of the first things you try in VR development. Grabbing may sound like an easy thing to implement, but a good SDK will provide a more polished feel to the grabbing experience, and give you plenty of settings to customize how the grabbing works.
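To give a sense of how little code a good SDK requires, here is a minimal sketch of making an object grabbable with Unity's XR Interaction Toolkit. This is an illustrative sketch, not the only way to do it: component and property names assume a recent version of the toolkit, and an XR rig with hand interactors is assumed to already exist in the scene.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Attach this to any object to make it grabbable at runtime.
// (Sketch: assumes the XR Interaction Toolkit package is installed
// and the scene has an XR rig with direct/ray interactors.)
public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // A grabbable object needs a Rigidbody plus an XRGrabInteractable.
        if (GetComponent<Rigidbody>() == null)
            gameObject.AddComponent<Rigidbody>();

        var grab = gameObject.AddComponent<XRGrabInteractable>();
        // Velocity tracking gives a more physical feel than instant snapping.
        grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;
        grab.throwOnDetach = true; // release with realistic throw velocity
    }
}
```

With the component in place, the toolkit handles hover, select, and release events; the two settings above are just a taste of the many options (attach points, throw smoothing, etc.) that a polished SDK exposes.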

Physics Interactions: 

When we are moving around in virtual worlds, our body and mind expect things to behave just like in the real world. That’s why good physics interactions are crucial to create immersive AR and VR experiences. Examples of physics interactions are opening a drawer, pushing a button, moving a lever, etc.


PC VR’s Biggest Innovation from SadlyItsBradley

UI Interactions: 

We will definitely need menus and settings screens in our experience or game, and good, reactive UI components are a must.

Button builder example from Ultraleap.

There’s still much design exploration to do here, as we are transitioning from 2D to 3D, so every SDK presents different solutions to this problem. Some examples of UI interactions are interacting with UI using pointers, interacting with UI using the tip of the finger, etc.

Locomotion: 

Locomotion refers to the ability we have to move inside the virtual world, for example teleporting to a different place, or using the controllers (or hand tracking gestures) to continuously move through the environment.

This is, in general, a controversial topic where you will find divided opinions, because some people are more prone to VR motion sickness with one type of locomotion than with another.

That's why, depending on your use case, you might want an SDK that offers some type of locomotion system.
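As a sketch of what an SDK-provided locomotion system looks like in practice, here is how teleportation can be wired up with Unity's XR Interaction Toolkit. This is a minimal, hypothetical setup: it assumes the scene already contains an XR rig with a TeleportationProvider and a ray interactor on one hand.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Marks a floor object as a valid teleport target for the XR rig.
// (Sketch: assumes an XR rig with a TeleportationProvider and a
// ray interactor is already configured in the scene.)
public class TeleportFloor : MonoBehaviour
{
    void Awake()
    {
        // TeleportationArea lets the user aim anywhere on this object's
        // collider and teleport there; TeleportationAnchor would instead
        // snap the user to a fixed position and orientation.
        gameObject.AddComponent<TeleportationArea>();
    }
}
```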

Hand tracking smooth teleport, part of the curriculum in our Advanced VR Interactions bootcamp.

How to choose the best VR SDK for your project?

Here are some of the most important criteria you should consider before choosing your VR SDK for Unity, but remember, these criteria are also conditioned by your use case. For example, if you are building a game, you might want to focus on controller support, physics interactions, locomotion, etc. On the other hand, if you’re looking into building an educational experience, you will prioritize interactions such as hand tracking, UI, multiple target devices, etc.

  • Multiple XR target devices: whether the SDK lets us build for different devices/platforms easily.
  • Target XR environment: whether we can build VR, AR, MR, etc.
  • Development engine/IDE: Unity, Unreal, etc.
  • Implementation difficulty level: How difficult is it to implement the interactions and features in our project?
  • Development activity and source code access: When you choose an SDK, check how active its development is, whether its source code is open, and who is behind it, to make sure the framework will still exist in the following years.
  • OpenXR support: OpenXR is a royalty-free open standard that manages the communication between the game engine and the devices. If you want your game/app to be future-proof, it is important to consider it.
  • Physics interactions: whether the SDK includes physics interaction support or examples.
  • Hand tracking support: whether the SDK supports tracking our hands using, for example, the external cameras of the device.
  • Device/Action based: Does the SDK detect inputs based on the device (Oculus Quest, HTC Vive, HoloLens) or based on the action (Select - grip button pressed, Activate - trigger button pressed, etc.)?
  • Interaction controller samples: e.g. left/right hand controllers with hover, select, and activate inputs pre-configured.
  • UI interactions: Does the SDK provide UI interaction support and samples?
  • Locomotion: whether the SDK includes support for locomotion systems such as teleporting or continuous movement with controllers.
  • Two-hand-based interactions: Does the SDK support interactions using both hands at the same time? For example, grabbing an object with both hands and scaling it.
  • Official toolkit website & community: webpages for manuals, documentation, and community.
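To illustrate the device-based vs. action-based distinction from the criteria above, here is a small action-based sketch using Unity's Input System: the code listens for an abstract "Select" action rather than polling a specific headset's button. The binding path is an assumption and depends on your controller layout.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Action-based input sketch: reacts to a "Select" action instead of a
// device-specific button. (Assumes the Input System package is installed;
// the binding path below is illustrative and layout-dependent.)
public class SelectListener : MonoBehaviour
{
    InputAction select;

    void OnEnable()
    {
        // Bind to the grip button of whichever right-hand XR controller
        // is present; any device matching the layout triggers the action.
        select = new InputAction("Select",
            binding: "<XRController>{RightHand}/gripPressed");
        select.performed += _ => Debug.Log("Select performed");
        select.Enable();
    }

    void OnDisable() => select.Disable();
}
```

A device-based SDK would instead ask "is the Quest controller's grip pressed?", which ties your interaction code to specific hardware.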

Now that we know the criteria, we’ll have a look at some of the most used SDKs out there:

  • The XR Interaction Toolkit: Unity's official XR interaction toolkit, recently out of its preview phase, meaning that Unity considers it good enough for production.
  • The MRTK: Microsoft’s official Mixed Reality toolkit, originally released for HoloLens development but since extended to other platforms.
  • The VRTK: the Virtual Reality Toolkit was one of the first widely adopted toolkits/SDKs for VR development with Unity. Low-code and beginner-friendly.
  • The Oculus Interaction SDK: the official SDK from Oculus, focused on developing for Meta/Oculus devices.
  • The XRTK: the XR Toolkit is a community-supported fork of Microsoft’s MRTK. It differs from the MRTK in its heavy focus on an efficient architecture and a lightweight SDK that promotes expandability, but it may be harder for beginners due to the lack of sample scenes.

To help you choose, and to give you a better overview over the VR SDKs for Unity, we prepared the following comparison table looking at all previously mentioned criteria.

The VR SDK Comparison Table

| Criterion | Virtual Reality Toolkit (VRTK) | Windows Mixed Reality Toolkit (MRTK) | Mixed Reality Toolkit (XRTK) | XR Interaction Toolkit | Oculus Interaction SDK |
|---|---|---|---|---|---|
| Multiple XR target devices/platforms | Oculus, SteamVR, HP Reverb, Windows MR, Pico, Varjo, Vive | Oculus, SteamVR, ARCore, ARKit, Magic Leap, HoloLens, Windows MR | Oculus, SteamVR, ARCore, ARKit, Magic Leap, HoloLens, Windows MR | Oculus, SteamVR, HP Reverb, Pico, ARCore, ARKit, Magic Leap, Varjo, Vive | Oculus |
| Target XR environment | VR | VR, AR | VR, AR | VR, AR | VR |
| Input detection | Device | Action | Action | Device & Action | Device |
| Interaction controller samples | Multiple | Multiple | Limited | Limited | Limited |
| UI interaction samples | Multiple | Multiple | Limited | Limited | Limited |

The full table also compares game engine, update frequency, open-source status, OpenXR support, physics interactions, hand-tracking support, locomotion/teleportation, two-hand-based interactions, included (default) hands, accessibility options, and overall difficulty level; you can find it in the Interaction SDK Knowledgebase on Notion mentioned above.

 



Advantages and Disadvantages of VR SDKs for Interactions

The XR Interaction Toolkit

The XR Interaction Toolkit is a Unity VR tool. It is part of Unity's XR Plugin Architecture (XR Tech Stack), and it contains a set of components that let you create XR interactions quickly and with minimum effort. It uses Unity's Input System to identify input events. More info about the XR Interaction Toolkit can be found in Unity's official documentation.

 

We have also published a tutorial on how to use interactions and locomotion with this VR SDK for Unity.

Pros:

  • XR development (AR, VR, MR)
  • Easy to implement with pre-configured files (samples) for basic XR interactions such as Select, Grab and Hover
  • Scalable
  • OpenXR support
  • Room scale and stationary experiences available
  • Cross platform (see image above)
  • Allows physics interactions and locomotion 
  • Very good documentation
  • Big community (Unity answers)
  • Pretty good amount of Youtube content
  • Free (for Unity personal plan)

Cons:

  • Some developers find it difficult to add custom features, because the XR Interaction Toolkit uses Unity's Input System, which has a steeper learning curve.
  • There are no UI, one-hand, or two-hand interaction samples.

The MRTK

The Mixed Reality Toolkit (MRTK) is a Microsoft-driven project that provides a set of components and features, used to accelerate cross-platform MR app development in Unity. 

If you’re interested in exploring the MRTK further, check our HoloLens & Mixed Reality Master Class here.

Here are some of its functions:

  • Provides the cross-platform input system and building blocks for spatial interactions and UI.
  • Enables rapid prototyping via in-editor simulation that allows you to see changes immediately.
  • Operates as an extensible framework that provides developers the ability to swap out core components.
  • Supports a wide range of platforms.
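As a rough illustration of those building blocks, this is how an object is typically made manipulable in MRTK 2.x. Treat it as a sketch: the component and namespace names are from MRTK 2, and an MRTK-configured scene (profiles, input system) is assumed.

```csharp
using UnityEngine;
using Microsoft.MixedReality.Toolkit.UI;
using Microsoft.MixedReality.Toolkit.Input;

// Makes an object movable/rotatable with one or two hands in MRTK 2.x.
// (Sketch: assumes a scene already configured with MRTK profiles.)
public class MakeManipulable : MonoBehaviour
{
    void Awake()
    {
        // ObjectManipulator handles far (ray/pointer) and near manipulation;
        // NearInteractionGrabbable enables direct articulated-hand grabs.
        gameObject.AddComponent<ObjectManipulator>();
        gameObject.AddComponent<NearInteractionGrabbable>();
    }
}
```

These two components are representative of MRTK's "building block" approach: most interactions are composed by attaching and configuring components rather than writing interaction logic by hand.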

Pros:

  • Supports a wide range of devices
  • Very robust support from Microsoft
  • Open source virtual reality software and community supported GitHub repository
  • Excellent sample scenes provided
  • Modular SDK allows users to only download components needed to reduce size and complexity
  • A large number of pre-built UI Elements and Interaction Concepts
  • You can migrate from MRTK to XRTK with relatively few steps

Cons:

  • Usage of this SDK differs heavily from standard Unity conventions, which may confuse both new and experienced Unity developers.
  • Many settings need to be configured properly, which may cause frustration during development if some settings are missed, resulting in errors.
  • Constantly evolving XR standards (e.g., OpenXR) and MRTK’s attempt to support many platforms often mean that MRTK’s support for each platform is only partial, which may lead to frustration for developers looking to use MRTK for multi-platform app creation.

The VRTK

VRTK example scenes link

This VR SDK provides a huge variety of reusable interaction and UI classes/components that can be used to achieve all kinds of VR interactions, including physics interactions, two-hand-based interactions, reactive UI panels, locomotion, etc.

In both of its most used versions (v3 and v4), you can find a big scene full of examples, which is very useful when building your first VR experiences, as you can “drag and drop” the examples into your own projects and adapt them to your own use.

Pros:

  • Balanced UI and Physical interactions.
  • Lots of examples within the VR SDK.
  • The Discord community is active, and the developer (The_StoneFox) responds to questions from time to time.
  • Relatively good amount of community content (YouTube tutorials/blog posts).
  • MIT License, free for personal or commercial use.

Cons:

  • It might be difficult to create new interactions that are not explained in examples.
  • It’s getting outdated regarding new technologies and standards such as hand tracking, OpenXR, etc.
  • Not as widely used as 2 or 3 years ago.

The Oculus Integration SDK

The Oculus Integration SDK provides a good amount of tools and features out of the box. Oculus updates the kit quite often, and exciting new features are added very frequently (which can also leave parts of the toolkit unorganized and outdated). The latest SDK update, v37, introduced the Interaction SDK, which provides new tools to build more complex and richer interactions.

Since v31 (August 2021), OpenXR is the main API behind the SDK, meaning that new features will be delivered through this standard. For example, the passthrough feature is exclusive to OpenXR and is not supported by the old mobile or PC APIs.
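For illustration, here is a minimal sketch of the classic (pre-Interaction-SDK) grabbing setup from the Oculus Integration package. It assumes a scene containing an OVRCameraRig whose hand anchors carry OVRGrabber components.

```csharp
using UnityEngine;

// Makes an object grabbable with the classic Oculus Integration components.
// (Sketch: OVRGrabbable works together with OVRGrabber components on the
// hand anchors of an OVRCameraRig, and needs a Rigidbody and a Collider.)
public class MakeOculusGrabbable : MonoBehaviour
{
    void Awake()
    {
        if (GetComponent<Rigidbody>() == null)
            gameObject.AddComponent<Rigidbody>();
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        gameObject.AddComponent<OVRGrabbable>();
    }
}
```

The newer Interaction SDK replaces this pattern with its own interactor/interactable components, which is part of why the examples are now split into pre- and post-Interaction-SDK material.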

Pros:

  • You get access to all the newest features of the most widely used VR headset to date, including some of the most advanced and polished capabilities, such as hand tracking, passthrough, etc.
  • OpenXR support.

Cons:

  • The biggest con is that building for Meta/Oculus headsets using their SDK confines your app/game to their platform, as most of the more advanced features are specific to Meta/Oculus hardware and platform, which currently also requires a Facebook account.
  • Examples are now divided into pre- and post-Interaction SDK releases, which might be confusing for new developers going through all the examples.
  • The community is relatively small, as most developers migrate to a different SDK after a while and use the Oculus Integration SDK as an additional layer when they specifically want to build for Meta/Oculus headsets and use headset-specific features.

The XRTK.io

The XRTK is a fork of Microsoft’s MRTK. The XRTK is also pronounced “Mixed Reality Toolkit.” The XRTK fork was a result of a schism that occurred between the MRTK developers from Microsoft and non-Microsoft MRTK contributors regarding the Mixed Reality Toolkit’s ideology, architecture, and contribution process. 

XRTK's vision is simple: to provide a complete cross-platform solution for AR/XR/VR development that supports three different developer skill levels:

  • Beginner - No Coding Required: perfect for artists, hackathons, and quick prototyping.
  • Intermediate - Customizable: the framework is flexible enough that coders can customize what they need to cover edge cases with ease.
  • Advanced - Extensible: the framework is easy to extend and modify, so you can add custom services to meet specific criteria and needs.

Pros:

  • Lightweight
  • Efficient architecture
  • Familiarity for those that already use the MRTK
  • Strong multi-platform support
  • Very flexible and expandable framework

Cons:

  • Relatively few sample scenes
  • Architecture, while efficient, may be difficult for beginners to grasp
  • Slower development and support from core contributors of the XRTK

The best VR SDK out there is…

All VR SDKs for Unity have their pros and cons. It comes down to what works best for you, depending heavily on your use case and target platforms you wish to build for.

If you are looking into building a VR game in multiple XR platforms, we recommend going for Unity’s XR Interaction Toolkit as this VR SDK for Unity focuses on controllers, locomotion, and physics interactions. It provides a lot of compatibility with other Unity assets, the ability to build to multiple devices and platforms, the possibility to build AR experiences, and it includes OpenXR support.

The Oculus Integration SDK is great if you plan to create products only for the Meta platform. You get access to the cutting-edge features that Meta develops, such as hand tracking and Meta avatars, as well as many of the new updates Meta plans to release down the line. Oculus is currently the most widely used VR platform, so choosing Oculus Integration is a good pick if you know you will primarily be on the Oculus platform. That said, if you are looking at a more multi-platform use case, we still suggest the XR Interaction Toolkit for now.

For non-gaming purposes, such as educational or enterprise experiences, you should look into using either MRTK or XRTK, as they are heavily focused on hand tracking and UI components. Microsoft has done an amazing job creating UI design tools for this next generation of spatial computing.

 

This article has been written by: 

  • Jonathan Lourie, Hernando Nieto, and Lucas Martinic, VR Developers and Content Creators at XR Bootcamp
  • Sean Ong, Hololens & Mixed Reality Master Class Trainer at XR Bootcamp

In case you have any questions about this article or VR SDKs in general, feel free to reach out at mentors@xrbootcamp.com.

 

© 2021 XR BOOTCAMP. All Rights Reserved