Welcome to the Avatar Magna Carta:

How to Puppeteer a 3D Humanoid with 6DoF Head and Hand Tracking

In this post, we present the workflow required to enable a player to live-puppeteer a rigged, first-person 3D avatar in-game: driving the avatar's in-game hands in a 1:1 relationship with the player's actual physical hands, and animating the avatar's in-game head orientation to match the precise orientation of the player's physical, real-world head. This is done via 3D trackers on the player's head and hands, plus the application of a simple inverse kinematic (IK) physics model.

We spent a long time figuring out this path, so I thought we'd share it with the community. Note: this is an entirely technical workflow-and-pipeline post for readers who are currently developing VR applications; it's not for the general consumer. This tutorial is specifically crafted for a Unity3D pipeline. It is also specific to the Oculus Rift DK2 HMD and the Razer Hydra hand trackers, powered by Sixense. It should work with other tracking solutions, with modification.

Project Avatar: The basic premise

People want to relate to their own physical avatars in VR. They want to be able to look down at their feet, and see their body. They want to be able to wave their hands in front of their face, and see some representation of their appendages in front of them, superimposed on the virtual scene. In short, they want to feel like they are present in the experience, not just an ethereal viewer.

This problem proved a bit more difficult to solve in practice than one might imagine. So in the interest of fostering community, we are sharing our technical solution with everyone. It isn't perfect yet; at the end of the tutorial we've posted a number of tips and follow-on research topics, places where this needs to go before it's fully "ready for prime time." However, the solution presented here IS functional, leaps beyond the standard "avatar-less" VR being produced today, and should serve as a baseline from which improvements can and will be made.

The Avatar Puppeteering tutorial:

1. Build and skin your Avatar Model
   - We create humanoids in Mixamo Fuse; you may use the modeling software of your choice: Maya, Blender, etc.
   - Make sure that your final model is in T-pose.
2. Rig the character with bones
   - This can be done by uploading a T-pose character to mixamo.com, or manually in your software.
   - Use the maximum resolution possible: we use a 65-bone skeleton, which includes articulated fingers.
3. Give the character at least one animation
   - We will use the "idle" state; you assign this online in Mixamo.
4. Get the Avatar into Unity
   - Export from Mixamo to Unity FBX format.
   - Import the resulting FBX (approx. 5-20 MB) into Unity Assets : gChar folder.
   - This will generate a prefab in Unity along with four components in a subfolder: a mesh, a bone structure, an animation loop file, and an avatar object.
   - The prefab will have a name such as "YourModelName@yourAnimName".
5. Configure the Avatar
   - Click on the prefab in the Assets folder. In the Inspector, select the "Rig" tab and make sure that "Humanoid" is selected in the "Animation Type" pull-down. If you selected that manually, hit "Apply".
   - Drag the prefab from the Assets into the Hierarchy, then select the prefab avatar in the Hierarchy.
   - In the Inspector: add an "Animator" component (we will fill in the details later), and add the g-IKcontrol.cs C# script (again, we will fill in details later; you can copy the source of the script from here).
6. Add the latest Oculus SDK (OVR) to the project
   - Download the latest Oculus SDK for Unity.
   - This is usually done by double-clicking "OculusUnityIntegration.unitypackage" in the OS, then accepting the import into your project by clicking "Import All".
   - You should now have a folder within Assets called "OVR".
7. Add the latest Sixense Hydra / STEM SDK to the project
   - Download the Hydra plug-in from the Unity Asset Store and import it into your project.
   - You should now have a folder within Assets called "SixenseInput".
8. Create a high-level Empty in your Hierarchy and name it "PLAYER-ONE"
   - Make your Avatar prefab a child of this parent.
   - Drag the OVRCameraRig from the OVR folder and also make it a child of PLAYER-ONE.
9. Properly position the Oculus camera in-scene
   - The Oculus camera array should be placed just forward of the avatar's eyes.
   - We typically reduce the forward clipping plane to around 0.15 m.
   - If you're using the OVRPlayerController, these Character Controller settings work well: Center Y = -0.84 m (standing), Center Z = -0.1 m (prevents the camera from being "inside the head"), Radius = 0.23 m, Height = 2 m.
   - This will require some trial and error. Make sure that you use the Oculus camera, and not the Oculus Player controller. Experimentation will be required to bridge the spatial relationship between a seated player and a standing avatar; calibration software needs to be written.
   - Trial and error is generally defined as a series of very fast cycles: build, test, make notes; modify, re-build, re-test, make notes; repeat until perfect. There are many gyrations, and you will become an expert at rapidly donning and removing the HMD, headphones, and hand controllers.
10. Create the IK target for head orientation
   - Right-click on the CenterEyeAnchor in the Hierarchy, and select "Create 3D Object > Cube".
   - Name the cube "dristi-target".
   - Move the cube approx. 18" (0.5 m) directly outward from the Avatar's third eye.
   - This will serve as the IK point toward which the avatar's head is "aimed," i.e. where they are looking. In yoga, the direction of the gaze is called dristi.
11. Get the Sixense code into your scene
   - Open the SixenseDemoScene and copy the HandsController and SixenseInput assets from the Hierarchy.
   - Re-open your scene and paste the HandsController and SixenseInput assets into your Hierarchy.
   - Drag both to make them children of OVRCameraRig.
12. Make sure the Sixense hands are correctly wired
   - Each hand should have the "SixenseHandAnimator" controller assigned to it.
   - Root Motion should be UNchecked.
   - Each hand should have the SixenseHand script attached to it.
   - On the pull-down menu for the SixenseHand script, the proper hand should be selected (L/R).
13. Properly position the Sixense hands in-scene
   - They should be at about the Y-pos height of the belly button.
   - The wrists should be about 12" (30 cm) in Z-pos forward of the abdomen.
   - In other words, they should be positioned as if you are sitting with your elbows glued to your sides, forearms extended parallel to the ground.
   - You will want to adjust, tweak, and perfect this positioning. There is an intrinsic relationship between where you position the hands in the scene and the Sixense trackers' position in the real world relative to the Oculus camera. Trial and error and clever calibration software solve this; that's another tutorial.
14. Make the Sixense hands invisible
   - We do this because they will merely serve as IK targets for the avatar's hands.
   - Do this by drilling down into HandsController : Hand_Right : Hand_MDL and unchecking the "Skinned Mesh Renderer" in the Inspector panel. Do the same with the left hand.
   - This leaves the assets available as IK targets, but removes their rendered polys from the game.
15. Create the Animator Controller
   - Create transitions from Entry to New State, and from New State to Idle (or whatever you named your created animation).
   - On the Base Layer, click the gear, and make sure that "IK Pass" is checked. This will pass the IK data from the animation controller on down the script chain.
16. Assign the new Animator Controller to the Avatar
   - Select the avatar in the Hierarchy and assign the controller in the Inspector.
17. Map the Avatar with Puppet IK targets for Hands and Head
   - Drag the "Hand - Left" from the Sixense HandsController parent to the "Left Hand Obj" variable.
   - Drag the "Hand - Right" from the Sixense HandsController parent to the "Right Hand Obj" variable.
   - Drag the Look-at-Target (the "dristi-target" cube created earlier) from the OVRCameraRig to the "Look Obj" variable. It is nested deep: PLAYER-ONE : OVRCameraRig : TrackingSpace : CenterEyeAnchor : Look-at-Target.
18. THAT'S IT! Build a compiled runtime
   - Connect your Rift and Hydra, then launch the game.
   - Activate the Hydras: grasp the left controller, aim it at the base, and squeeze the trigger; grasp the right controller, aim it at the base, and squeeze the trigger; then hit the "start" button, just south of the joystick on the right controller.
   - When you tilt and rotate your head, the avatar's head should also tilt and roll. When you move your hands, the avatar's hands should move in a 1:1 ratio in-scene.
   - Congratulations, you're an Avatar Master.
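The post links to the full source of g-IKcontrol.cs; in case that link is unavailable, here is a minimal sketch of what such a script looks like using Unity's built-in Humanoid IK pass. The public field names mirror the Inspector labels from the mapping step ("Left Hand Obj", "Right Hand Obj", "Look Obj"); the class name and the fixed weight of 1.0 are our assumptions, not the canonical script.

```csharp
using UnityEngine;

// Sketch of an IK control script in the spirit of g-IKcontrol.cs.
// Attach to the avatar alongside its Animator. The Animator Controller's
// Base Layer must have "IK Pass" checked, or OnAnimatorIK is never called.
[RequireComponent(typeof(Animator))]
public class GIKControl : MonoBehaviour
{
    public bool ikActive = true;
    public Transform leftHandObj;   // "Hand - Left" from the Sixense HandsController
    public Transform rightHandObj;  // "Hand - Right" from the Sixense HandsController
    public Transform lookObj;       // the Look-at-Target cube under CenterEyeAnchor

    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    // Called by Mecanim once per frame during the IK pass.
    void OnAnimatorIK(int layerIndex)
    {
        if (ikActive)
        {
            if (lookObj != null)
            {
                // Aim the head at the tracked gaze target.
                animator.SetLookAtWeight(1f);
                animator.SetLookAtPosition(lookObj.position);
            }
            if (leftHandObj != null)
            {
                animator.SetIKPositionWeight(AvatarIKGoal.LeftHand, 1f);
                animator.SetIKRotationWeight(AvatarIKGoal.LeftHand, 1f);
                animator.SetIKPosition(AvatarIKGoal.LeftHand, leftHandObj.position);
                animator.SetIKRotation(AvatarIKGoal.LeftHand, leftHandObj.rotation);
            }
            if (rightHandObj != null)
            {
                animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1f);
                animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 1f);
                animator.SetIKPosition(AvatarIKGoal.RightHand, rightHandObj.position);
                animator.SetIKRotation(AvatarIKGoal.RightHand, rightHandObj.rotation);
            }
        }
        else
        {
            // Zero the weights so the underlying "idle" animation plays untouched.
            animator.SetIKPositionWeight(AvatarIKGoal.LeftHand, 0f);
            animator.SetIKRotationWeight(AvatarIKGoal.LeftHand, 0f);
            animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 0f);
            animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 0f);
            animator.SetLookAtWeight(0f);
        }
    }
}
```

Because the goals are set every frame with full weight, the tracked hand and gaze targets override the idle animation's hand and head placement, which is exactly the 1:1 puppeteering behavior described above.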

TIPS and areas for further R&D

- Ideally, the avatar's head should not be rendered for the player, yet it should still cast shadows and reflections. The avatar's head should, of course, still be rendered for other players in multi-player scenarios, as well as for third-person camera observer positions.
- An in-game shadow is a great way to ground the player to the avatar in a convincing manner. Even when the hands are outside the field of view, seeing the shadows of the head and hands triggers a very powerful sense of presence.
- While head rotation and orientation on the skull axis is fairly straightforward, head translation (i.e. significant leaning in or out or to the side) is a bit more abstract in terms of puppeteering choices. You may wish to explore either locomotion animations or "from the hip" IK solutions to move the torso along with the head.
- RL : VR 1:1 spatial calibration is KEY to great experiences. See 9.3 above ("Properly position the Oculus camera") and 13.4 ("Properly position the Sixense hands").
- The built-in Unity IK leaves a lot to be desired when it comes to realistic approximations of elbow positions. We are investigating the FinalIK package and other professional-class solutions.
- This solution in its current form disables the ultra-cool "grasping" and "pointing" animations that are built into the Sixense template models. Investigate how to re-enable those animations on the rigged avatar's Mecanim structure.
- You will also want to configure the remainder of the Hydra joysticks and buttons to control all game functions, because it sucks to be reaching and fumbling for a keyboard when you are fully immersed in a VR world. The majority of this configuration starts in the Unity Input Manager (Edit | Project Settings | Input…). There should also be keyboard fallback controls for users who do not own Hydras.
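The first tip (hide the avatar's head from the local player while keeping its shadow) can be approximated in Unity 5 and later via Renderer.shadowCastingMode. This is a sketch under the assumption that the head mesh has its own renderer; the HideLocalHead class and headRenderer field are hypothetical names you would wire up yourself:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: stop drawing the avatar's head mesh while still letting it
// cast shadows into the scene. Note that this hides the head for ALL
// cameras; multi-player and third-person observer views need a
// per-camera solution instead (e.g. layers plus camera culling masks).
public class HideLocalHead : MonoBehaviour
{
    public Renderer headRenderer; // hypothetical: the head's own renderer

    void Start()
    {
        if (headRenderer != null)
        {
            // Mesh is not rendered, but its shadow still grounds the player.
            headRenderer.shadowCastingMode = ShadowCastingMode.ShadowsOnly;
        }
    }
}
```

Reflections are harder: ShadowsOnly removes the head from reflection probes and mirrors as well, so a fully correct "shadows and reflections but no direct view" requires the per-camera layer approach.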

Have you tackled this same challenge? Have you created any solutions to the further investigations we’ve proposed above?

Share in the comments below, because we're all in this together!