Welcome to 2017! I wish you all an astonishing year!

In 2016 I left you with this video showcasing my first experiments with Oculus Touch and the Avatar SDK, in cooperation with our ImmotionRoom full body system. As you can see in the video, there are female full body avatars made with ImmotionRoom (so, basically, made using a Microsoft Kinect) and epic-bearded blue avatars made using the Oculus Avatar SDK. Then there is some firing, because firing in VR is always cool!

This demo took me more time than expected, due to some problems I had in using the Avatar SDK. That's why I'm writing this article: to help you develop with this SDK while wasting less time than I did. It will not be a step-by-step tutorial, but I'll give you some detailed hints. So, ready… go!

Oculus Touch integration

Oculus Touch integration took me no time. Really, it already works out of the box and this is amazing. Just open your Unity project, import the Oculus Utilities for Unity 5 (you can find them here), remove the Main Camera gameobject, drag and drop an OVRCameraRig or OVRPlayerController into your scene and that's it! When you run the game, you will just see standard virtual reality without Oculus Touch and so you will think that I'm completely insane, but trust me… you already have Touch integration!

If you look more closely at the OVRCameraRig prefab, you will see that its grandchildren LeftHandAnchor and RightHandAnchor already move according to your Oculus Touch poses. So, for example, if you attach a sphere to your LeftHandAnchor transform, that sphere will exactly follow your left hand, as you can see here:
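By the way, if you prefer doing this attachment from code rather than dragging things around in the editor, here is a minimal sketch (the HandSphere name and the lookup of the anchor by name are my own choices, not something the SDK requires):

using UnityEngine;

// Spawns a small sphere and parents it under LeftHandAnchor,
// so it inherits the Touch pose at every frame.
public class HandSphere : MonoBehaviour
{
    void Start()
    {
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.localScale = Vector3.one * 0.1f; // a 10 cm sphere

        // LeftHandAnchor is a grandchild of the OVRCameraRig prefab
        Transform leftAnchor = GameObject.Find("LeftHandAnchor").transform;
        sphere.transform.SetParent(leftAnchor, false);
    }
}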

If your app doesn't run properly in VR when you press Play, it is because the latest version of the Oculus Utilities often doesn't check the Virtual Reality Supported flag. So go to Edit -> Project Settings -> Player and make sure that the "Virtual Reality Supported" flag is checked and that the Oculus SDK has been selected.

Ok, now you have the Oculus Touch reference frames, but… how about input? Super easy: you use the OVRInput class. OVRInput is very simple to use and it is also very well documented at this link: I strongly advise you to read the docs! (RTFM is always the right advice, you know.) Basically, all you have to do is query OVRInput about the status of some Touch triggers or thumbsticks and then perform some action. In the above video, at 0:07 I move the avatars using the Touch thumbsticks. The underlying code is super easy: I attached a script to the avatar with this:

void Update()
{
    // read the left thumbstick and scale it by the frame time
    Vector2 touchAxis = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick) * Time.deltaTime;
    transform.position += new Vector3(touchAxis.x, 0, touchAxis.y);
}

Basically, I ask OVRInput about the status of the primary thumbstick (that is, the left Touch thumbstick) and move the avatar object accordingly.

How about the Kalashnikov? Easy as well: first of all, I've attached it to the RightHandAnchor transform, so that it appears in the right hand. Then, in its Update function, I've written:

if (OVRInput.Get(OVRInput.Button.SecondaryIndexTrigger) && m_deadTime <= 0)
{
    // spawn a bullet at the muzzle ("FireStart") and shoot it forward
    GameObject bullet = Instantiate<GameObject>(Bullet);
    GameObject fireStart = transform.FindChild("FireStart").gameObject;
    bullet.transform.position = fireStart.transform.position;
    bullet.transform.rotation *= fireStart.transform.rotation;
    bullet.GetComponent<Rigidbody>().AddForce(fireStart.transform.right * 2.85f, ForceMode.Impulse);

    // play the firing sound and start the cooldown
    GetComponent<AudioSource>().Play();
    m_deadTime = DeadTime;
}

Here the Secondary Index Trigger is the trigger that I push with my right index finger… as if I were firing. If I push it, I just fire a bullet and play a firing sound. It sounds amazing. Nothing difficult, nothing worth giving particular advice about.
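One detail the snippet does not show is how m_deadTime and DeadTime work: they are just a fire-rate cooldown. This is a minimal sketch of the missing pieces (only the names come from the snippet above; how they are declared and decremented in my full script is an assumption for illustration):

public GameObject Bullet;      // bullet prefab, assigned in the inspector
public float DeadTime = 0.1f;  // seconds to wait between two shots
private float m_deadTime = 0;  // remaining cooldown time

void Update()
{
    // count the cooldown down every frame, so the trigger check above
    // can fire again only after DeadTime seconds have passed
    if (m_deadTime > 0)
        m_deadTime -= Time.deltaTime;

    // ... firing code shown above goes here ...
}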

The problem if you use Touch this way is that you have no haptic feedback and no avateering… too bad!

Adding haptics

Oculus Touch can give you little vibrations to simulate the haptics of the objects you're interacting with. Again, this is a simple-to-use feature and it is well documented here. Only two things got me a bit confused:

There’s no haptic prefab, nor MonoBehaviour script;

Haptics uses audio clips. Yes, you’ve read right: you provide haptic engine with audio clips that it “plays” through the vibrations of your Touch controllers.

You can't see it in the video, but I added vibrations to the right Touch controller when the user fires the Kalashnikov, to give my little game a more badass effect. So, I modified my update block to add this feature:

if (OVRInput.Get(OVRInput.Button.SecondaryIndexTrigger) && m_deadTime <= 0)
{
    GameObject bullet = Instantiate<GameObject>(Bullet);
    GameObject fireStart = transform.FindChild("FireStart").gameObject;
    bullet.transform.position = fireStart.transform.position;
    bullet.transform.rotation *= fireStart.transform.rotation;
    bullet.GetComponent<Rigidbody>().AddForce(fireStart.transform.right * 2.85f, ForceMode.Impulse);
    GetComponent<AudioSource>().Play();
    m_deadTime = DeadTime;

    // make the right Touch controller vibrate using the firing clip
    OVRHaptics.Channels[1].Mix(new OVRHapticsClip(VibeClip));
}

The last line is what does the magic. I ask OVRHaptics to play something on Channel 1 (which is the one of the right hand… Channel 0 represents the left hand instead). The clip I ask it to play is VibeClip (an AudioClip object that I pass as a parameter to the script), which I have to box into an OVRHapticsClip object so that it can be played through the haptics engine (and yes, I know, instantiating an OVRHapticsClip at every call of the method is not very smart, but this is a simple toy program, so that's ok). About the chosen method, I preferred Mix over Queue… why? If you ask OVRHaptics to Queue a new clip, it will finish playing the last one and only then start the new one: in the case of a machine gun this is not a smart choice, since every new bullet gets fired while the gun is still vibrating from the last shot. You want the engine to blend the new vibration with the one currently playing, and this is where the Mix method comes in.
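If you want to avoid that allocation at every shot, a minimal sketch is to box the clip only once at startup (the field names are mine):

public AudioClip VibeClip;             // vibration waveform, assigned in the inspector
private OVRHapticsClip m_hapticsClip;  // cached haptics clip, built once

void Start()
{
    // convert the audio clip into a haptics clip a single time
    m_hapticsClip = new OVRHapticsClip(VibeClip);
}

// then, in the firing code, replace the last line with:
// OVRHaptics.Channels[1].Mix(m_hapticsClip);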

How did I create the VibeClip object? Well, I took the audio file of the Kalashnikov bullets, amplified its volume and extended its duration a bit using Audacity and that's it: it's simply a WAV file. My advice is to start from an existing WAV file and then go trial and error in Audacity until you obtain the vibration you find optimal. The file gets interpreted so that the device vibrates to mimic the waveform it contains.
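If you don't want to fiddle with Audacity, you could also generate a rough vibration waveform directly from code. This is just a sketch of mine (the class name, the constant-amplitude buzz and all the numbers are arbitrary assumptions), exploiting the same idea that the clip's waveform is what drives the vibration:

using UnityEngine;

// Minimal sketch: build a short constant-amplitude vibration clip from code
// instead of importing a WAV file crafted in Audacity.
public class ProceduralVibe : MonoBehaviour
{
    private OVRHapticsClip m_buzzClip;

    void Start()
    {
        int frequency = 44100;   // standard audio sample rate
        float duration = 0.15f;  // a 150 ms buzz
        int sampleCount = (int)(frequency * duration);

        // constant amplitude: the haptics engine turns the waveform into vibration strength
        float[] samples = new float[sampleCount];
        for (int i = 0; i < sampleCount; i++)
            samples[i] = 0.8f;

        AudioClip clip = AudioClip.Create("Buzz", sampleCount, 1, frequency, false);
        clip.SetData(samples, 0);

        // same boxing as with the WAV file; ready to be used with Mix or Queue,
        // e.g. OVRHaptics.Channels[1].Mix(m_buzzClip)
        m_buzzClip = new OVRHapticsClip(clip);
    }
}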

Adding the avatar

Here is where things start having some problems. The Avatar SDK is not built into the Oculus Utilities and you have to download it separately. So, go back to the Oculus download page and look for the Avatar SDK, download it and unzip it. This project is not Unity-only, so you have to look for the Unity package inside the \Unity folder of the unzipped SDK. Import the OvrAvatar.unityPackage package into your project (this may require some time due to the avatar shaders).

Importing the package does not suffice. To see your avatar, you have to drag and drop the Assets\OvrAvatar\Content\Prefabs\LocalAvatar prefab into your scene. Hit Play and move your Touch controllers… you should start seeing some hands moving around. Great! You will also see an error about the Oculus App Id, but we will deal with this later.

The LocalAvatar's OvrAvatar behaviour will show you lots of options: the most important ones to me are:

“StartWithControllers”: if you check this, the avatar will not show you bare hands, but hands with Touch controllers in them;

“ShowFirstPerson”: select this if the avatar is the one of the player;

“ShowThirdPerson”: check this if this is not the player’s avatar. With this option, a simple body and face for the avatar will be shown.

Ok, now we have a great problem: our OVRCameraRig and our Avatar are completely unrelated!!! This is the biggest WTF of the Oculus SDK. Your avatar and your camera+Touch reference frames are not related in any way.

To make it work, you can do as in the samples: set the same pose (position + rotation + scale) for the OVRCameraRig and LocalAvatar game objects. Then, in the OVRManager script of the OVRCameraRig, select TrackingOriginType: Floor Level.
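If you prefer doing this setup from code instead of the inspector, here is a minimal sketch of what those two steps amount to (the RigAvatarSetup name and the public references are mine, and I'm assuming OVRManager sits on the OVRCameraRig gameobject, as in the samples):

using UnityEngine;

public class RigAvatarSetup : MonoBehaviour
{
    public OVRCameraRig CameraRig;
    public GameObject LocalAvatar;

    void Start()
    {
        // give the avatar the same pose as the camera rig...
        LocalAvatar.transform.position = CameraRig.transform.position;
        LocalAvatar.transform.rotation = CameraRig.transform.rotation;
        LocalAvatar.transform.localScale = CameraRig.transform.localScale;

        // ...and make tracking use the floor as its origin
        CameraRig.GetComponent<OVRManager>().trackingOriginType = OVRManager.TrackingOrigin.FloorLevel;
    }
}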

Re-run the sample… now you should see that your avatar actually makes sense and that the avatar hands are exactly where they should be!!

But this is a Pyrrhic victory: if you try to move your OVRCameraRig gameobject (to emulate the fact that the player is moving inside the game), the avatar remains fixed in place! This makes no sense to me (IMHO, if I say that an avatar is a first-person avatar, it should follow the CameraRig automatically) and I think it is something that Oculus has to fix. And then, again, if you select Eye Level as the origin type, nothing works anyway.

You can solve the issue by putting a common parent above the Rig and the Avatar, but I prefer a more complete approach (also because this is needed in our ImmotionRoom prefab!). The solution I found to keep them related in any case (independently of the parents of both objects and of the origin type) is writing a small script that makes sure that the avatar follows your position.

using UnityEngine;

public class AvatarCalibrator : MonoBehaviour
{
    public GameObject LeftHand;        // LeftHandAnchor of the OVRCameraRig
    public GameObject LeftHandAvatar;  // left hand of the Oculus avatar

    // Use this for initialization
    void Start()
    {
        if (LeftHand == null)
            LeftHand = GameObject.Find("LeftHandAnchor");

        if (LeftHandAvatar == null)
            LeftHandAvatar = transform.FindChild("hand_left").gameObject;
    }

    // Update is called once per frame
    void Update()
    {
        // move the avatar so that its left hand coincides with the rig's left hand anchor
        transform.position += LeftHand.transform.position - LeftHandAvatar.transform.position;
    }
}

Take this script, save it as AvatarCalibrator.cs and attach it to your LocalAvatar gameobject. It is a super-simple script that I wrote in a few minutes and it is far from optimal, but it conveys the idea: the avatar will always make sure that its left hand coincides with the left hand reference frame of the OVRCameraRig; this way it will follow you and will work like a charm even if you use an OVRPlayerController.

Try substituting the OVRCameraRig with an OVRPlayerController (and adding a floor plane, or the player controller will fall to its death forever): you will see that even if you move with your Touch thumbsticks, your avatar will always follow you… awesome!

Ok, time for a build… save the scene, build the executable, launch the project… and… e-ehm, WHERE ARE MY HANDS???

Well, there is a problem. If you look at the program log, you will see a bazillion exceptions, all like this one:

DllNotFoundException: libovravatar
  at (wrapper managed-to-native) Oculus.Avatar.CAPI:ovrAvatarMessage_Pop ()
  at OvrAvatarSDKManager.Update () [0x00000] in <filename unknown>:0
(Filename:  Line: -1)

The problem is in the chosen architecture. Go to your Build Settings (File -> Build Settings…) and pick the x86_64 architecture. I'm running a 64-bit Windows 10 PC, but building for x86 with Oculus had never given me any problem until I started using Oculus Avatars… weird!

Ok, so re-build and re-launch the program! Time to see… no, THERE ARE STILL NO HANDS! I know, I know, don't worry. I can handle this. The good news is that the previous exception has vanished… but it has been replaced by a brand new one!

Surprise, surprise… there is a bug! If you go to the current download page of the Avatar SDK, you can see that, in that text that no one actually reads, there is a reported known issue!

Missing avatar graphics when you build and run a Unity .exe. As a temporary workaround, click Edit > Project Settings > Graphics. Under Always Included Shaders, add +3 to Size, and then add the following shading elements: AvatarSurfaceShader, AvatarSurfaceShaderPBS, and AvatarSurfaceShaderSelfOccluding.

So, do what it suggests and rebuild everything.

Build time should be much longer now (I have an Intel i7, a GTX 1080 and all the super fancy stuff, but it takes me around two minutes), but when you launch the build, you should finally see your hands!!

Anyway, if you want more info, the full docs of the Avatar SDK are here (with "full" I was joking… lots of stuff is still incomplete).

Adding YOUR avatar

So, everything seems fine, but you can still see only that bald aqua-coloured guy avatar and, unless you are the Brazzers bald man, this can't satisfy you: you've spent hours crafting an epic avatar of yourself and you want to see your personal avatar. So, how can you do this? Again, this will not be super easy.

The Oculus documentation gives us precise instructions at this link (where there is a more precise explanation of the Unity Avatar plugin). I'll copy-paste them here just to spare you the time of following the link:

To demonstrate loading an Avatar Specification:

Import the Oculus Platform Unity Package into your Unity project.

Click Oculus Platform > Edit Settings and paste your Oculus Rift App Id into the field.

Open the LocalAvatar sample scene.

Open the OvrAvatar script.

At the top of the script, add using Oculus.Avatar;

Find and replace all occurrences of CAPI.ovrAvatar with Oculus.Avatar.CAPI.ovrAvatar

Find the lines:

OvrAvatarSDKManager.Instance.RequestAvatarSpecification(oculusUserID, this.AvatarSpecificationCallback);

Replace oculusUserID with a valid Oculus User ID such as 295109307540267. When you play the scene, the default blue avatar is replaced with the avatar of the specified user.

First point is… where the heck do I find the Platform SDK? It is not on the Oculus main download page! Long story short, you have to search for it a little bit. At the time of writing, its download link is here. In any case, you can select "Oculus Platform" from the dropdown menu of the Oculus download page to obtain its link.

UPDATE: while the above lines and link still prove to be true, the Oculus Platform SDK is now on the Oculus main download page! Is Oculus reading my blog? 🙂

Download it, unzip it, then import into your project the Unity Platform plugin that you can find at \Unity\OculusPlatform.unityPackage.

Then, select your LocalAvatar gameobject and double-click the OvrAvatar script to modify it. Some of the provided instructions prove to be useless, since the new version of the OvrAvatar.cs file already uses the Oculus.Avatar namespace. The only thing you have to do is:

Find the lines:

OvrAvatarSDKManager.Instance.RequestAvatarSpecification(oculusUserID, this.AvatarSpecificationCallback);

Replace oculusUserID with a valid Oculus User ID such as 295109307540267.

Done it? (If you are in doubt about which number to use, use the provided one; we'll return to this later on.)

Ok, now, do you remember that little exception I told you to ignore? Well, it's time to deal with it. You have to provide the Avatar SDK and the Platform SDK with your Oculus App Id. Go to your Oculus Developer Dashboard and create a new fake app, giving it the name you prefer (I chose TouchTest, since I have a lot of imagination).

Go to details of your newly created app and choose the API tab: you should see an App ID. Copy it.

Go back to Unity and select Oculus Avatars -> Edit Configuration and then paste your ID into the Oculus Rift App Id textbox that you see in the inspector.

Now select Oculus Platform -> Edit Settings and paste the same Id into the Oculus Rift App Id and Gear VR App Id fields. Now, when you hit Play, you should not see the exception anymore. But, even better: you can see a new avatar, a pink one! This is the avatar of user 295109307540267, and I have no idea who that is… maybe Palmer Luckey? (Don't know… there are no flip flops in the avatar… I can't tell if it's really Palmer.)

We're almost there: we have shown the avatar of Palmer Luckey… but what about showing yours? It's so simple! You just have to put your Oculus User ID inside that method instead of Palmer's. But… WHAT THE HECK IS YOUR OCULUS USER ID??

Long story short: I don't know… searching on Google I didn't find anything… I tried looking at my Oculus user profile and at my Oculus dev profile… and no clue about my ID! So, how do you find your Oculus Id? Honestly, I don't know.

BUT I can tell you a workaround to obtain it. I found this amazing video by Unity where they explain properly how to use the Avatar SDK.

It's awesome and at 21:50 they show you something really useful, that is, the proper way to use OvrAvatar to show the user's avatar. I've taken a screenshot for you.

Basically, you have to create a script that initializes the Oculus Platform with your App Id, waits for the platform initialization and only at that point creates and initializes your avatar, providing it the ID of the logged-in user (that is, you). I used this to trick the system and obtain your User Id.

So, create a new script called PlatformManager and add it to an empty gameobject.

using Oculus.Platform;
using Oculus.Platform.Models;
using UnityEngine;

public class PlatformManager : MonoBehaviour
{
    // Use this for initialization
    void Start()
    {
        // initialize the Oculus Platform and ask for the currently logged-in user
        Core.Initialize("<your_oculus_app_id>");
        Users.GetLoggedInUser().OnComplete(OnGetUser);
    }

    private void OnGetUser(Message<User> message)
    {
        // here we have the user data: print its ID to the console
        var id = message.Data.ID;
        Debug.Log("My id is " + id);
    }
}

Of course, you have to replace <your_oculus_app_id> with your Oculus App Id. Launch the program by hitting Play and see your Oculus User ID being written in the Unity console! Copy it and replace the standard Palmer Luckey number with this number inside the OvrAvatar.cs script… re-launch everything (you can even remove the PlatformManager script now!) and you should see your avatar being used! You can even add a new Local Avatar with the "Show Third Person" flag checked to see your own face!

Two more things about this last point:

You will get a warning about clobbering of the ID, because in the PlatformManager script you should not call Core.Initialize specifying the App ID again (you've already specified it in the Unity menus), but if you don't do it, you get a null reference exception (I don't know why, another strange issue). To remove the warning, you can go back to Oculus Platform -> Edit Settings and clear the Oculus Rift App Id and Gear VR App Id you wrote some time ago. Anyway, the warning does no harm, so you can even leave everything as it is.

As I've said, this is the quick-and-dirty way to obtain your avatar. Theoretically you should do as the video suggests, so that the avatar of the current user is shown in a clean way, whoever the user is (if you distribute your game with the above scripts, everyone will see YOUR avatar!). To do that, there is the issue that you have to wait for the Platform to initialize itself before showing the Avatar/Avatars. If you try to specify a user Id before the Platform has been initialized, nothing will work and you will see the aqua bald guy avatar again. One possibility is deactivating the Avatar gameobjects and then modifying the PlatformManager script as follows:

using Oculus.Platform;
using Oculus.Platform.Models;
using UnityEngine;

public class PlatformManager : MonoBehaviour
{
    public OvrAvatar[] Avatars;  // local avatars in the scene, deactivated at startup

    // Use this for initialization
    void Start()
    {
        // initialize the Oculus Platform and ask for the currently logged-in user
        Core.Initialize("<your_oculus_app_id>");
        Users.GetLoggedInUser().OnComplete(OnGetUser);
    }

    private void OnGetUser(Message<User> message)
    {
        // the platform is initialized: assign the logged-in user's ID
        // to every avatar and re-activate it
        foreach (OvrAvatar avatar in Avatars)
        {
            avatar.oculusUserID = message.Data.ID;
            avatar.gameObject.SetActive(true);
        }
    }
}

Then set your Local Avatars as parameters of this script. This script will make everything wait until the platform has been initialized with the current user; when the user is recognized, the script will re-activate all the avatars, assigning them the ID of the currently logged-in user (the player). Of course, you also have to go back to the line you changed in OvrAvatar.cs and restore it to OvrAvatarSDKManager.Instance.RequestAvatarSpecification(oculusUserID, this.AvatarSpecificationCallback);

Launch everything again: you should see your avatar again… but the code is far better! So, why did I show you the dirty approach? Because this way I could show you the hack to obtain your Oculus User ID, and knowing that can be useful in some situations.

And with this, my super-long tutorial (woah, it's the longest post ever on my blog!) ends. My final words on the Avatar SDK are that it is cool, but it still needs bug fixing and a lot of refactoring.

Hope you liked this tutorial, since I spent a lot of time writing it: please like it, ask questions about it, comment on it saying that I'm amazing and share it with all your fellow Virtual Reality devs! And if you want, come say hello to all of us at Immotionar! Happy 2017!