Oregon State University


Quickstart: Oculus Rift + Touch

Posted February 6th, 2018 by Warren Blyth

Here’s a quick summary of how to jump into a simple Oculus Rift + Touch VR project with Unity3D (condensing many pages of official documentation to save others time).

A) The headset is supported natively in Unity now: go to Project Settings and enable VR support (I think it’s hidden away under “XR” at the moment; it has been changing location a lot over the past few months). Press Play, and you can look around with your Rift goggles. You can use mouse and keyboard, or a gamepad, for input just as you would in any other Unity3D project. You can also develop for SteamVR, and controls will work with both Vive and Rift input (you can’t publish that to the Oculus store, though, and some mappings are weird: Vive’s Grip maps to the Y or B button, while nothing maps to A or X).
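To sanity-check at runtime that the VR setting actually took, you can log the XR state. A minimal sketch; note that `XRSettings` moved from the `UnityEngine.VR` namespace to `UnityEngine.XR` around Unity 2017.2, so your version may differ:

```csharp
using UnityEngine;
using UnityEngine.XR;

public class VRCheck : MonoBehaviour {
    void Start() {
        // False if VR support isn't enabled in the project settings.
        Debug.Log("VR enabled: " + XRSettings.enabled);
        // Reports something like "Oculus" when the Rift runtime is active.
        Debug.Log("Loaded device: " + XRSettings.loadedDeviceName);
    }
}
```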

If you want to release your app officially through Oculus, you’ll need to create a developer account on their Developer website and set up an App ID through their Dashboard website.
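If you do go the official-release route, store builds are expected to run an entitlement check at startup. A minimal sketch, assuming the Platform package (covered below) is imported and your App ID is entered in its settings asset; exact class names may shift between SDK versions:

```csharp
using UnityEngine;
using Oculus.Platform;

public class EntitlementCheck : MonoBehaviour {
    void Start() {
        Core.AsyncInitialize();
        Entitlements.IsUserEntitledToApplication().OnComplete(msg => {
            if (msg.IsError) {
                // Store builds are supposed to quit here.
                Debug.LogError("User is not entitled to this app.");
            } else {
                Debug.Log("Entitlement check passed.");
            }
        });
    }
}
```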

B) To go deeper and enable Touch using the same ghost hands* you see in the Oculus setup sequence, you’ll need to download and install three packages, and then tweak three components.
(* Note: these are called your “Avatar,” along with a ghostly floating head that others can see in networked experiences. You can customize the appearance and color of these elements over in the Oculus Home app.)

Download three packages from the Oculus Unity Downloads page:
– Import the “Utilities for Unity” package (has everything for the interface)
and maybe:
– Import Platform (has Oculus community stuff, security stuff, etc.; all the core non-gameplay stuff, basically)
– Import AvatarSDK (this is how you get the standard-looking hand presence; also has a social scene sample with VoIP)

… or you can search for “Oculus Integration” on the Unity Asset Store (it has all of these and more).

Things to tweak:
1- Disable or delete the existing Main Camera so it isn’t competing for control. CenterEyeAnchor will be the camera used for your goggles.
2- Drag the simple OVRCameraRig prefab into the scene (I prefer this, because I’m writing my own controls for movement). But if you want to start moving with the controllers immediately, drag the OVRPlayerController prefab into the scene instead (this includes control scripts and many things as children, including the OVRCameraRig).
3- Drag LocalAvatar (from the Project window, under OvrAvatar/Content/Prefabs/) onto TrackingSpace (in the Hierarchy window, childed under OVRPlayerController/OVRCameraRig/). Note: the Oculus Avatar is only allowed to do the basic button reactions (point, thumbs up, fist). For example, you can’t make it change shape/finger placement to appear to be grabbing a special shape. To do ANYTHING other than the basic functions, you will need to create your own hands with custom animations.
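If you do end up building custom hands, the raw controller state you’d feed into your own hand rig reads like this. A minimal sketch; the Animator parameter names (“Grip,” “Pinch,” “Point”) are hypothetical and would be defined on your own animation controller:

```csharp
using UnityEngine;

public class CustomHandPose : MonoBehaviour {
    public OVRInput.Controller c = OVRInput.Controller.RTouch;
    public Animator handAnimator; // your own hand rig's animator

    void Update() {
        // Analog squeeze amounts, 0.0 to 1.0.
        float grip  = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, c);
        float pinch = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, c);
        // Capacitive sense: false when the index finger lifts off the trigger.
        bool fingerOnTrigger = OVRInput.Get(OVRInput.NearTouch.PrimaryIndexTrigger, c);

        handAnimator.SetFloat("Grip", grip);
        handAnimator.SetFloat("Pinch", pinch);
        handAnimator.SetBool("Point", !fingerOnTrigger);
    }
}
```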

Resource for programming: basic code:

using UnityEngine;

public class TriggerBall : MonoBehaviour {
    public Transform ball;
    public OVRInput.Controller c;

    void Update () {
        // How far the hand trigger on controller "c" is squeezed (0.0 to 1.0). *
        float f = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, c);
        ball.localPosition = new Vector3(ball.localPosition.x, ball.localPosition.y, f);
    }
}
// * Or you could replace "c" with "OVRInput.Controller.LTouch" to hard-code it.

~ Note: if using OVRPlayerController, “ForwardDirection should house the body geometry which will be seen by the player (contains the matrix which motor control bases its direction on).” … I just wrote this down because it seemed like it’d be important later; I’m not actually using it yet.
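For writing your own movement controls (the reason I grab the plain OVRCameraRig), the thumbstick reads like this. A minimal sketch that slides the rig around the floor plane relative to where you’re looking; attach it to the rig root and drag CenterEyeAnchor into the head slot:

```csharp
using UnityEngine;

public class SimpleMove : MonoBehaviour {
    public Transform head; // drag CenterEyeAnchor here
    public float speed = 2f;

    void Update() {
        // Left stick on Touch; x is strafe, y is forward/back.
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

        // Flatten the head's facing onto the floor plane so looking down
        // doesn't drive you into the ground.
        Vector3 fwd   = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.ProjectOnPlane(head.right, Vector3.up).normalized;

        transform.position += (fwd * stick.y + right * stick.x) * speed * Time.deltaTime;
    }
}
```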

And I’ll post separately about how to set up a basic grabbing function (basically you drag a script onto the grabber, and a script onto everything you want grabbable, then tweak some parts of the components). I also have a short list of other basic functions (pressing buttons, painting on a texture, teleporting, magnifying, twirling with the thumbstick, etc.) that I hope to post short entries on. Someday. When we have another project.
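Until that post exists, the short version: the Oculus samples ship an OVRGrabber script (for the hand anchors) and an OVRGrabbable script (for objects). A minimal sketch of the object side, assuming those component names from the Oculus grab sample; a grabbable just needs those plus physics:

```csharp
using UnityEngine;

public class MakeGrabbable : MonoBehaviour {
    void Start() {
        // OVRGrabbable expects a Collider and a Rigidbody on the object.
        if (GetComponent<Collider>() == null) gameObject.AddComponent<BoxCollider>();
        if (GetComponent<Rigidbody>() == null) gameObject.AddComponent<Rigidbody>();
        gameObject.AddComponent<OVRGrabbable>();
    }
}
```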

