Using the appropriate Virtual Reality SDK for your Unity VR project lets you focus on building and designing your experience rather than creating everything from scratch, so you can get your XR project up and running quickly.
Have you ever considered changing your preferred VR SDK? Are you curious about what you have been missing out on by not selecting another VR Interaction SDK for your project (especially after the release of Meta's new Presence Platform and VR Interaction SDK)? This article will give you the overview you need to decide on the best choice for your next VR development project.
Have you recently started working with Unity VR SDKs? Are you a beginner learning about Virtual Reality tools, software, and software development kits? Have you perhaps followed our Unity VR tutorial on how to get started with Unity VR development? Then you're probably ready to take the next step. You have started your VR or AR project with Unity, maybe even a VR game development project: you've already downloaded the Unity Hub, created your project, opened it, connected your VR device, and imported your 3D models.
Now you're asking yourself how to get started with VR Interactions and are looking for VR tools. How will the user of your VR app or game interact with the digital world you have created? This is the “magic” of your future VR or AR application.
The Magic of XR Interactions - Courtesy of Holonautic’s Hand Physics Lab
This Unity VR SDK tutorial and overview may also be helpful if you've only worked with one specific Unity Virtual Reality SDK in the past and now want to build VR apps or XR games with another one. Based on our comparison table, you can easily compare the platform you've already been working with against another one you may be working with in the future. In addition, you may be curious which features Meta has added to their recently launched VR Interaction SDK, and how it compares to the previously existing ones.
There are many new releases, updates, and new Interaction SDKs popping up every day. That's why we created an Interaction SDK Knowledgebase on Notion, where we regularly track new frameworks and features and add them to the overview. If you would like to add your comments or input, or simply want access to the VR Interaction SDK Overview Table, feel free to submit your e-mail and we will send you the link.
Before starting, we briefly want to define what an Interaction SDK is. "SDK, you say?" Yes, an SDK, or Software Development Kit. It contains not only script libraries, but also documentation, guides, tutorials, examples, and more, to make a developer's work easier.
Source: clevertap.com
VR SDKs are great virtual reality tools: an easy way to implement tons of pre-built, pre-configured interactions in your XR project, such as grabbing objects, interacting with UI, or recognizing hand gestures.
Interactions are simply the core of the XR experience. We can have an incredible world that we created, perfectly tracked hands that match our real hands, beautiful visuals and sound… but what makes XR really outstanding is how we can interact with the world around us.
This is what makes designing interactions difficult and so important in VR. More than in any other medium, VR interactions are centered around your physical surroundings and senses. "Immersive" and "real" are the keywords we think of when designing interactions in virtual reality. No matter how fantastic your VR app is, grounded interactions that make the user feel comfortable in the environment are essential. They can be the difference between "fun" and "boring."
Some of the main interactions you should be looking for in a good Interaction SDK are:
Lucas Martinic - XR Bootcamp - The Oculus Interaction SDK
Grabbing:
Grabbing is one of the first things you try in VR development. It may sound easy to implement, but a good SDK will give the grabbing experience a more polished feel and offer plenty of settings to customize how grabbing works.
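To make this concrete, here is a minimal sketch of a grabbable object using Unity's XR Interaction Toolkit (just one of the SDKs compared below). It assumes the XR Interaction Toolkit package and an XR rig with interactors are already set up in the scene; the "GrabbableSetup" script name is purely illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Turns the object this script sits on into a grabbable, throwable item.
// The object needs a Collider; XRGrabInteractable adds a Rigidbody
// automatically if one is missing.
public class GrabbableSetup : MonoBehaviour
{
    void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();
        if (grab == null)
        {
            grab = gameObject.AddComponent<XRGrabInteractable>();
        }

        // Velocity tracking keeps the object driven by physics while held,
        // which usually feels more natural than instant kinematic snapping.
        grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;
        grab.throwOnDetach = true;

        // React to the moment an interactor (controller or hand) grabs it.
        grab.selectEntered.AddListener(_ => Debug.Log(name + " was grabbed"));
    }
}
```

The same few settings (movement type, throw behavior, attach points) are what separate a floaty, jittery grab from one that feels solid.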
Physics Interactions:
When we move around in virtual worlds, our body and mind expect things to behave just like in the real world. That's why good physics interactions are crucial for creating immersive AR and VR experiences. Examples of physics interactions include opening a drawer, pushing a button, or moving a lever.
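As an illustration, here is a rough sketch of a physics-based drawer built with plain Unity physics (a ConfigurableJoint), independent of any particular interaction SDK. The "PhysicsDrawer" script and its "slideDistance" field are made-up names for this example.

```csharp
using UnityEngine;

// A physics-based drawer: the drawer body is constrained to slide along its
// local X axis, so any grab interaction that moves the Rigidbody (from
// whichever SDK you use) behaves believably.
[RequireComponent(typeof(Rigidbody))]
public class PhysicsDrawer : MonoBehaviour
{
    [Tooltip("How far the drawer can travel in total, in meters.")]
    public float slideDistance = 0.4f;

    void Awake()
    {
        var joint = gameObject.AddComponent<ConfigurableJoint>();

        // Lock every axis except linear motion along the joint's X axis,
        // and limit that travel.
        joint.xMotion = ConfigurableJointMotion.Limited;
        joint.yMotion = ConfigurableJointMotion.Locked;
        joint.zMotion = ConfigurableJointMotion.Locked;
        joint.angularXMotion = ConfigurableJointMotion.Locked;
        joint.angularYMotion = ConfigurableJointMotion.Locked;
        joint.angularZMotion = ConfigurableJointMotion.Locked;

        var limit = joint.linearLimit;
        limit.limit = slideDistance * 0.5f; // the limit is measured from the anchor
        joint.linearLimit = limit;

        // With no connected body the joint anchors to world space, which is
        // fine for a drawer sitting inside a static cabinet.
        joint.connectedBody = null;
    }
}
```

Combined with a grabbable component from your SDK of choice, the drawer can then be pulled open and pushed shut naturally.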
PC VR’s Biggest Innovation from SadlyItsBradley
UI Interactions:
We will definitely need menus and settings screens in our experience or game, so good, reactive UI components are a must.
Button builder example from Ultraleap.
There's still a lot of design exploration to do on this topic as we transition from 2D to 3D, so every SDK presents different solutions. Some examples of UI interactions are pointing at UI with a ray (pointer) or pressing UI elements directly with the tip of a finger.
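For instance, with Unity's XR Interaction Toolkit a world-space menu mainly needs a special raycaster so XR pointers can hit it. The sketch below assumes that toolkit is installed and that the scene has an EventSystem with the toolkit's XR UI input module; "XRWorldSpaceMenu" is an illustrative name.

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Makes a world-space canvas usable with XR ray interactors.
[RequireComponent(typeof(Canvas))]
public class XRWorldSpaceMenu : MonoBehaviour
{
    public Button startButton;   // assign a UI Button in the Inspector

    void Awake()
    {
        var canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        // Lets tracked-device (XR) pointers raycast against this canvas,
        // instead of only the mouse-oriented GraphicRaycaster.
        if (GetComponent<TrackedDeviceGraphicRaycaster>() == null)
        {
            gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();
        }

        if (startButton != null)
        {
            startButton.onClick.AddListener(() => Debug.Log("Start pressed"));
        }
    }
}
```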
Locomotion:
Locomotion refers to our ability to move inside the virtual world, for example by teleporting to a different place or by using the controllers (or hand-tracking gestures) to move continuously through the environment.
This is, in general, a controversial topic where you will find divided opinions, because some people are more prone to VR motion sickness with one type of locomotion than with another.
That's why, depending on your use case, you might want an SDK that offers some type of locomotion system.
Hand tracking smooth teleport, part of the curriculum in our Advanced VR Interactions bootcamp.
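To show what a teleport looks like in code, here is a minimal sketch using the XR Interaction Toolkit's teleportation system. Normally you would combine a ray interactor with a TeleportationArea or TeleportationAnchor component, but the underlying request is just a few lines; "TeleportToPoint" and "recenterPoint" are illustrative names.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Queues a teleport to a fixed destination through the toolkit's
// TeleportationProvider (the component that actually moves the rig).
public class TeleportToPoint : MonoBehaviour
{
    public TeleportationProvider provider;   // lives on the XR rig / XR Origin
    public Transform recenterPoint;          // where the player should end up

    public void Teleport()
    {
        var request = new TeleportRequest
        {
            destinationPosition = recenterPoint.position,
            destinationRotation = recenterPoint.rotation,
            matchOrientation = MatchOrientation.TargetUpAndForward
        };
        provider.QueueTeleportRequest(request);
    }
}
```

In a real project you would typically call Teleport() from a teleport ray's select event or from a UI button.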
Here are some of the most important criteria you should consider before choosing your VR SDK for Unity, but remember, these criteria are also conditioned by your use case. For example, if you are building a game, you might want to focus on controller support, physics interactions, locomotion, etc. On the other hand, if you’re looking into building an educational experience, you will prioritize interactions such as hand tracking, UI, multiple target devices, etc.
Now that we know the criteria, we'll have a look at some of the most used SDKs out there:
The XR Interaction Toolkit: Unity's official XR interaction toolkit, recently out of its preview phase, meaning that Unity considers it good enough for production.
The MRTK: Microsoft's official Mixed Reality toolkit, originally released for HoloLens development and later extended to other platforms.
The VRTK: the Virtual Reality Toolkit was one of the first widely adopted toolkits/SDKs for VR development with Unity. Low code and beginner-friendly.
The Oculus Interaction SDK: the official SDK from Oculus, focused on developing for Meta/Oculus devices.
The XRTK: the XR Toolkit is a community-supported fork of Microsoft's MRTK. It differs from the MRTK through its heavy focus on an efficient architecture and a lightweight SDK that promotes expandability, but it may be more difficult for beginners due to the lack of sample scenes.
To help you choose, and to give you a better overview of the VR SDKs for Unity, we prepared the following comparison table looking at all the previously mentioned criteria.
| Criteria | Virtual Reality Toolkit (VRTK) | Mixed Reality Toolkit (MRTK) | XRTK (Mixed Reality Toolkit fork) | XR Interaction Toolkit | Oculus Interaction SDK |
| --- | --- | --- | --- | --- | --- |
| Multiple XR target devices/platforms | Oculus, SteamVR, HP Reverb, Windows MR, Pico, Varjo, Vive | Oculus, SteamVR, ARCore, ARKit, Magic Leap, HoloLens, Windows MR | Oculus, SteamVR, ARCore, ARKit, Magic Leap, HoloLens, Windows MR | Oculus, SteamVR, HP Reverb, Pico, ARCore, ARKit, Magic Leap, Varjo, Vive | Oculus |
| Target XR environment | VR | VR, AR | VR, AR | VR, AR | VR |
| Game engine | Unity | Unity | Unity | Unity | Unity |
| Update frequency | | | | | |
| Open source | Yes | Yes | Yes | | |
| OpenXR support | | | | Yes | Yes (since v31) |
| Physics interactions | Multiple samples | Multiple samples | | Using the Unity editor | Some samples |
| Hand-tracking support | | Yes | Yes | | Yes, including the Hand Tracking API |
| Controller support | Yes | Yes | Yes | Yes | Yes |
| Input detection | Device | Action | Action | Device & Action | Device |
| Interaction controller samples | Multiple | Multiple | Limited | Limited | Limited |
| UI interaction samples | Multiple | Multiple | Limited | Limited | Limited |
| Locomotion/teleportation | | | | Yes | Outdated; devs need to modify some code |
| Two-hands-based interactions | Multiple samples | Multiple samples | | Can be implemented | Only a small scaling implementation |
| Included hands (default hands) | | | | | Yes |
| Add custom hands | | | | Modeling/rigging knowledge required | Modeling/rigging knowledge required |
| Accessibility options | | | | | |
| Difficulty level | Beginner-friendly | | More difficult for beginners | | |
| Official toolkit website & community | vrtk.io | github.com/microsoft/MixedRealityToolkit-Unity | XRTK.io | Unity documentation | developer.oculus.com |
The XR Interaction Toolkit (XRIT) is Unity's own VR tool. It is part of Unity's XR Plugin Architecture (the XR tech stack) and contains a set of components that let you create XR interactions quickly and with minimum effort. It also uses Unity's Input System to identify input events. More info about XRIT here
Here’s a tutorial about how to use Interactions and Locomotion with this VR SDK for Unity: here
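As a small taste of the action-based input the toolkit builds on, here is a minimal sketch that reads a controller trigger through Unity's Input System; the "TriggerLogger" script and its suggested binding are illustrative, not part of the toolkit itself.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Reads a controller trigger through Unity's Input System, the same
// action-based input the XR Interaction Toolkit uses under the hood.
// Bind "triggerAction" in the Inspector, e.g. to an XR controller trigger.
public class TriggerLogger : MonoBehaviour
{
    public InputActionProperty triggerAction;

    void OnEnable()  => triggerAction.action.Enable();
    void OnDisable() => triggerAction.action.Disable();

    void Update()
    {
        // The trigger is an analog axis between 0 and 1.
        float value = triggerAction.action.ReadValue<float>();
        if (value > 0.75f)
        {
            Debug.Log($"Trigger pressed: {value:0.00}");
        }
    }
}
```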
Pros:
Cons:
The Mixed Reality Toolkit (MRTK) is a Microsoft-driven project that provides a set of components and features used to accelerate cross-platform MR app development in Unity.
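As a rough sketch (assuming MRTK v2 is imported and an MRTK profile is active in the scene), making an object grabbable usually comes down to two of its components; the "MrtkGrabbableSetup" wrapper below is only for illustration, since you would normally add these in the Inspector.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Makes an object grabbable and movable with MRTK v2's built-in components:
// ObjectManipulator handles far and near manipulation, while
// NearInteractionGrabbable enables direct grabs with articulated hands.
[RequireComponent(typeof(Collider))]
public class MrtkGrabbableSetup : MonoBehaviour
{
    void Awake()
    {
        if (GetComponent<ObjectManipulator>() == null)
        {
            gameObject.AddComponent<ObjectManipulator>();
        }
        if (GetComponent<NearInteractionGrabbable>() == null)
        {
            gameObject.AddComponent<NearInteractionGrabbable>();
        }
    }
}
```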
If you’re interested in exploring the MRTK further, check our HoloLens & Mixed Reality Master Class here.
Here are some of its functions:
Pros:
Cons:
VRTK example scenes link
This VR SDK provides a huge variety of reusable interaction and UI classes/components that can be used to achieve all kinds of VR interactions, including physics interactions, two-hand-based interactions, reactive UI panels, locomotion, and more.
In both of its most used versions (v3 and v4), you can find a big example scene, which is very useful for building your first VR experiences: you can "drag and drop" the examples into your own projects and adapt them to your own use.
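As a sketch of how low-code this feels, assuming VRTK v3.3, making a spawned object grabbable is mostly a matter of adding its interactable component and a grab-attach mechanic; the "VrtkGrabbableSetup" wrapper is illustrative, and in practice you would normally configure all of this in the Inspector.

```csharp
using UnityEngine;
using VRTK;
using VRTK.GrabAttachMechanics;

// Configures a runtime-spawned object as grabbable in VRTK v3 style.
[RequireComponent(typeof(Rigidbody))]
public class VrtkGrabbableSetup : MonoBehaviour
{
    void Awake()
    {
        var interactable = gameObject.AddComponent<VRTK_InteractableObject>();
        interactable.isGrabbable = true;

        // VRTK v3 expects a grab-attach mechanic that defines how the object
        // follows the controller while held.
        interactable.grabAttachMechanicScript =
            gameObject.AddComponent<VRTK_FixedJointGrabAttach>();

        interactable.InteractableObjectGrabbed +=
            (sender, e) => Debug.Log(name + " grabbed");
    }
}
```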
Pros:
Cons:
The Oculus Integration SDK, Oculus's own dev kit, provides a good number of tools and features out of the box. The kit is updated quite often, and exciting new features are added frequently (which can also leave parts of the toolkit unorganized or outdated). The latest SDK update, v37, introduced the Interaction SDK, which brings new tools to build more complex and richer interactions.
Since v31 (August 2021), OpenXR has been the main API behind the SDK, meaning that new features are delivered through this standard. For example, the Passthrough feature is exclusive to OpenXR and is not supported by the old mobile or PC APIs.
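For example, detecting a hand-tracking pinch takes only a few lines with the integration's OVRHand component. The sketch below assumes hand tracking is enabled on the OVRCameraRig and an OVRHand is assigned in the Inspector; "PinchLogger" is an illustrative name.

```csharp
using UnityEngine;

// Detects an index-finger pinch with the Oculus Integration's OVRHand
// component (hand tracking must be enabled on the OVRCameraRig).
public class PinchLogger : MonoBehaviour
{
    public OVRHand hand;   // e.g. the OVRHandPrefab under a hand anchor

    void Update()
    {
        if (hand != null && hand.IsTracked &&
            hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
            Debug.Log($"Index pinch, strength {strength:0.00}");
        }
    }
}
```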
Pros:
Cons:
The XRTK is a fork of Microsoft's MRTK; its name is also pronounced "Mixed Reality Toolkit." The fork was the result of a schism between the MRTK developers at Microsoft and non-Microsoft MRTK contributors over the toolkit's ideology, architecture, and contribution process.
XRTK's vision is simple: to provide a complete cross-platform solution for AR/XR/VR development that supports three different developer skill levels:
Pros:
Cons:
All VR SDKs for Unity have their pros and cons. It comes down to what works best for you, and that depends heavily on your use case and the target platforms you wish to build for.
If you are looking into building a VR game for multiple XR platforms, we recommend going with Unity's XR Interaction Toolkit, as this VR SDK for Unity focuses on controllers, locomotion, and physics interactions. It offers broad compatibility with other Unity assets, the ability to build for multiple devices and platforms, the possibility to build AR experiences, and OpenXR support.
The Oculus Integration SDK is great if you plan to create products only for the Meta platform. You get access to the cutting-edge features that Meta develops, such as hand tracking and Meta Avatars, as well as many of the new updates Meta plans to release down the line. Oculus is currently the most widely used VR platform, so choosing Oculus Integration is a good pick if you know that you will primarily be on the Oculus platform. That said, if you are looking at a more multi-platform use case, we still suggest the XR Interaction Toolkit for now.
For non-gaming purposes, such as educational or enterprise experiences, you should look into using either the MRTK or the XRTK, as they are heavily focused on hand tracking and UI components. Microsoft has done an amazing job creating UI design tools for this next generation of spatial computing.
This article has been written by:
In case you have any questions about this article or VR SDKs in general, feel free to reach out at mentors@xrbootcamp.com.