Are you new to VR? Are you wondering how to create a VR app or a VR game? Welcome to this Unity VR development tutorial for the Oculus Quest 2.
After reading this guide, you'll be able to create your first VR scene in Unity!
How-tos answered in this article
Skills Required: None. Beginner level.
Hardware Required: PC or laptop with Windows, Linux, or macOS (Windows recommended). Oculus Quest 2 with headset and hand controllers.
In this blog post we will create a VR demo scene in which the player can move around (teleport) and interact with game objects in a VR environment.
If you already have Unity installed and know how to create a new 3D project, you can jump ahead to the Add modules or XR settings sections.
Unity Hub is an application used for managing Unity editor installations and projects: choosing an editor version, creating a new project, opening an existing one, and so on.
To install Unity Hub, click the Download Unity Hub button here: Download - Unity (unity3d.com) (see image below).
The page also displays the computer requirements. A PC or laptop with Windows OS is the recommended option.
Unity Hub is a standalone application that streamlines the way you find, download, and manage your Unity Projects and installations. More info here
After the file has been downloaded, double-click it to install. A new window like the one below should appear.
Let me introduce you to Unity Hub!
In Unity Hub, click Installs (left) and then Install Editor (blue button, right).
For this tutorial we'll use version 2021.2.8f1, so we'll need to click the Archive tab and then the download archive link (blue text, see image below). It will redirect you to the Unity archive webpage, where you'll find the editor versions that can be installed.
Click the Unity Hub button next to Unity version 2021.2.8.
Oculus Quest apps are built for Android, so we will need to install some additional modules in Unity.
After clicking on the button described above, a new window should appear (see image below). Select the following options:
Although we won't write scripts in this tutorial, installing Visual Studio is recommended.
Visual Studio is a C# editor, the script programming environment used in Unity. The Community version is free. More info here
Do you remember the main window in Unity Hub?
This is used for several purposes. One of them is creating new projects.
In the next window:
Why not the VR template? Because it doesn't include the XR Interaction Toolkit (a Unity software package) that we will use in this tutorial.
3D will be our project’s template.
After some time, the Unity editor will load, along with a window inviting you to click through the different options available and familiarize yourself with the editor.
What is XR?
XR is an umbrella term covering several types of applications: VR (Virtual Reality), MR (Mixed Reality), and AR (Augmented Reality). More info here
To configure our Unity project for XR, we will need to use a software package called XR Plugin Management. It manages and helps the user with loading, initialization, settings, and build support for XR.
Now the XR Plugin Management option will look like this:
In the Unity manual, the description of this package is: “provides simple management of XR plug-ins. Manages and offers help with loading, initialization, settings, and build support for XR plug-ins.”
It's a standardized interface that allows you to configure several XR devices (e.g. the Oculus Quest 2) in a single project and target multiple headsets and platforms. More info here
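You normally don't need any code for this (the Initialize XR on Startup checkbox handles it), but if you're curious, here is a minimal sketch of how XR Plugin Management can be driven manually from a script. It assumes the package is installed and a loader (e.g. the Oculus loader) is enabled; the class name is just an example:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

// Hypothetical helper showing how XR Plugin Management can start and stop
// the XR loader manually instead of using "Initialize XR on Startup".
public class ManualXRStarter : MonoBehaviour
{
    IEnumerator Start()
    {
        // Tries each enabled XR loader in order (e.g. the Oculus loader).
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

        if (XRGeneralSettings.Instance.Manager.activeLoader != null)
            XRGeneralSettings.Instance.Manager.StartSubsystems(); // begin rendering to the headset
        else
            Debug.LogError("No XR loader could be initialized.");
    }

    void OnDestroy()
    {
        var manager = XRGeneralSettings.Instance.Manager;
        if (manager != null && manager.isInitializationComplete)
        {
            manager.StopSubsystems();
            manager.DeinitializeLoader();
        }
    }
}
```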
A new window may appear asking you to enable Unity's Input System.
After the step above, a warning message may appear:
Interaction profile: type of configuration required for the headset you’ll be interacting with.
In the same window, be sure that Render Mode is set to Multi Pass. Multi-Pass renders the Scene into two images shown separately to each eye, and is the recommended option.
Single-Pass combines both images into a single double-width Texture, where only the relevant half is displayed to each eye.
Go to the Android tab and select Oculus.

Inputs are actions taken by the player through a device (for example a mobile phone, a PC, or a VR headset like the Quest) that are identified by the game engine (Unity) and "trigger" or execute other actions in the program, application, or game.
One example is firing. When the player presses (input action) the "Fire" button in a mobile game, the game spawns bullets (action triggered in the game) in a certain direction.
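To make that concrete, here is a minimal sketch of the firing example using Unity's Input System. The bulletPrefab and muzzle fields are hypothetical placeholders you would assign in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// A minimal sketch of the "Fire" example using Unity's Input System.
public class FireExample : MonoBehaviour
{
    public InputActionProperty fireAction; // bind this to a button, e.g. a trigger
    public GameObject bulletPrefab;        // hypothetical bullet prefab
    public Transform muzzle;               // hypothetical spawn point

    void OnEnable()
    {
        fireAction.action.performed += OnFire; // input action -> action triggered in the game
        fireAction.action.Enable();
    }

    void OnDisable()
    {
        fireAction.action.performed -= OnFire;
        fireAction.action.Disable();
    }

    void OnFire(InputAction.CallbackContext context)
    {
        // Action triggered in the game: spawn a bullet in the muzzle's direction.
        Instantiate(bulletPrefab, muzzle.position, muzzle.rotation);
    }
}
```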
Now we will configure the XR Interaction Toolkit, a framework that works well with OpenXR.
It will show a warning window informing you that if you have an older XR Interaction Toolkit version you should make a backup before updating to this newer one.
These Default Input Actions are pre-configured actions that are already set up and ready to go.
The XR Interaction Toolkit will identify the inputs in your Unity VR project and hand them to OpenXR, which is in charge of translating them for the target VR platform (in this case, Oculus).
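As an illustration of what "identifying inputs" means, Unity also exposes a lower-level XR input API that queries devices directly. The toolkit abstracts this away for you, but a small sketch like the following (assuming a controller is connected) shows the idea:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: reading the right controller's primary button through
// Unity's low-level XR input API (the toolkit normally does this for you).
public class PrimaryButtonReader : MonoBehaviour
{
    void Update()
    {
        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        if (rightHand.isValid &&
            rightHand.TryGetFeatureValue(CommonUsages.primaryButton, out bool pressed) &&
            pressed)
        {
            Debug.Log("Primary button pressed on the right controller");
        }
    }
}
```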
Your assets folder in the project tab should look similar to this:
If you go into the XR Interaction Toolkit folder, you should see these sample XR interaction files (except for the file with the blue ray):
Select each file and add it to the project by clicking "Add to …" in the Inspector tab.
Go to Edit/Project Settings/Preset Manager and write “Right” and “Left” for the corresponding Action Controllers (see image below, bottom middle)
Normally, when you play a game, all the interactions (including camera movement) happen inside the device (a mobile phone, PC, or VR headset), and the player watches them from outside through the device's display.
VR apps are usually called VR experiences, because they generate, well, an immersive experience.
Immersive means that the player feels like they are "part of" the game, and in this case, of the scene.
In Unity, the camera and sound are "attached" to a GameObject called XR Origin.
To create an XR Origin in Unity:
The difference between XR Origin and XR Origin (action-based) is that the first one only has the head camera, while the second one also includes the controllers.
The XR Interaction Manager game object, which contains the script of the same name, is imported automatically.
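If you later need the XR Origin from a script, a hedged sketch like this shows how you could reach its head camera. It assumes the XR Core Utilities package (a dependency of the toolkit); the class name is just an example:

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

// Sketch: finding the XR Origin and its head camera from code.
public class XROriginInfo : MonoBehaviour
{
    void Start()
    {
        XROrigin origin = FindObjectOfType<XROrigin>();
        if (origin != null)
            Debug.Log($"XR Origin found, head camera: {origin.Camera.name}");
    }
}
```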
Now it’s time to add a component to the XR Interaction Manager in order to configure input actions in our VR scene:
See the steps below
The hierarchy should look like this:
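To demystify the component we just added: the Input Action Manager essentially enables the input action assets you reference on it, so the controller bindings actually produce values at runtime. This is not the toolkit's real source code, just a conceptual sketch:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Conceptual sketch of what the Input Action Manager component does:
// enable the referenced input action assets while the object is active.
public class InputActionEnabler : MonoBehaviour
{
    public InputActionAsset[] actionAssets; // e.g. the XRI Default Input Actions asset

    void OnEnable()
    {
        foreach (var asset in actionAssets)
            asset.Enable();
    }

    void OnDisable()
    {
        foreach (var asset in actionAssets)
            asset.Disable();
    }
}
```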
On a VR device, the player usually has a headset and two hand controllers.
Both the Left and Right Hand Controllers must have the following actions configured in the Inspector:
And XR Origin must look something like this:
After following the steps above, we're ready to check whether our headset and hand controllers are identified by the Unity editor.
At the top middle of the Unity editor, you'll find a button with a play icon.
When this button is pressed, Unity enters Play mode.
Play mode is used for testing a project quickly in Unity, and this is how we will check whether our VR demo is working correctly on the headset.
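If you'd rather verify from code that the devices are identified, a small sketch like this logs every XR device Unity has detected (the class name is just an example):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: logs every XR device Unity has detected, which is a quick way
// to verify the headset and both controllers are identified.
public class DeviceLogger : MonoBehaviour
{
    void Start()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevices(devices);

        foreach (var device in devices)
            Debug.Log($"Detected XR device: {device.name} ({device.characteristics})");
    }
}
```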
If you are using Mac or Linux, you will need to build the APK and run it on the headset. To do this, you'll need to follow the steps described here
If you are using Windows, the next step is to configure Oculus Link on your Oculus device. The steps required are described in the Oculus Link official guide webpage or here
Once Oculus Link is set up, we can test our progress so far by entering Play mode in Unity.
To do this, enable Oculus Link in your headset, go back to the Unity editor, and press the Play button at the top middle. Now put the headset back on and move your head and hands.
Expected result:
It's time to move around our VR scene. For that, we will configure something called a Locomotion System.
With the Locomotion System selected, the Inspector should look like the image below:
This way you have added a Locomotion System into the scene. Now we need to configure it.
A Teleportation Area is used for teleporting around the scene. To create one:
It will generate a new plane with a Teleportation Area script attached. This plane is the area we will be able to teleport around.
You may have noticed that there is also a Teleportation Anchor option.
The main difference is that a Teleportation Area allows you to teleport to any point on the plane, while a Teleportation Anchor teleports you only to a specific, predefined position.
For this tutorial, we will use the previous plane game object for moving around the scene.
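For reference, the same setup can also be sketched in code, though the editor menu described above is the usual route. This is an illustrative sketch, not the tutorial's required approach:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: create a plane and make its whole surface teleportable.
public class TeleportAreaSetup : MonoBehaviour
{
    void Start()
    {
        // The plane primitive comes with a MeshCollider, which the
        // Teleportation Area uses to detect where the ray hits.
        GameObject floor = GameObject.CreatePrimitive(PrimitiveType.Plane);
        floor.AddComponent<TeleportationArea>();
    }
}
```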
When the XR ray turns white, it means the Teleportation Area has been hit.
To teleport, press the grip button while the line is white.
To rotate, use the joystick.
There is another locomotion option: Continuous Move. To make it work on the current plane:
If you enter Play mode now, teleportation will be disabled and you can move and turn using the joysticks. However, this style of locomotion can cause motion sickness in many people, so it's suggested to use the Continuous Move Provider only in very specific cases and to rely on teleportation instead.
VR motion sickness happens when you're moving around in a VR environment, but your body feels like it's sitting or standing in one place.
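If you do keep both providers in the scene, a sketch like the following shows how you could switch between them at runtime. The field names are placeholders to assign in the Inspector from your Locomotion System object:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: toggle between teleportation and continuous movement at runtime
// by enabling/disabling the locomotion providers.
public class LocomotionToggle : MonoBehaviour
{
    public TeleportationProvider teleportationProvider;
    public ActionBasedContinuousMoveProvider continuousMoveProvider;

    public void UseTeleportation(bool useTeleport)
    {
        teleportationProvider.enabled = useTeleport;
        continuousMoveProvider.enabled = !useTeleport; // continuous move risks motion sickness
    }
}
```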
The XR Grab Interactable component allows you to interact with game objects in VR.
When you add the XR Grab Interactable component, it automatically adds a Rigidbody, so we can pick the cube up and interact with it in a physics-based way.
For better performance:
If you enter Play mode, you'll be able to grab the cube with the grip button when the ray is white.
So far, you've created your first VR demo with a Locomotion System and object interactions.
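For completeness, here is a hedged sketch of the same grabbable cube built from code; the editor workflow described above is all you actually need:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: a small cube the player can grab with the grip button.
public class GrabbableCubeSetup : MonoBehaviour
{
    void Start()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 1f, 1f); // within arm's reach
        cube.transform.localScale = Vector3.one * 0.2f;    // small enough to grab

        // Makes the cube respond to the ray + grip button;
        // a Rigidbody is added automatically, as noted above.
        cube.AddComponent<XRGrabInteractable>();
    }
}
```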
Awesome!
Feel free to now play around and enjoy yourself using Unity’s endless features!
I hope you enjoyed this Unity VR tutorial. If you're a beginner, you may also want to try the free C# Coding Course to get started with VR programming.