The Goals of the Tutorial

We will create a Unity3D application. The goals of this tutorial are to learn how to:

Set up the MRTK v2.0.0 RC1 Release package in your Unity application.

Create an environment that allows user movement.

Add custom free Unity Assets to our project.

Allow the user to spawn a wall of cubes via the Windows Mixed Reality (WMR) controller menu button. Multiple spawns are allowed.

Allow the user to interact with the spawned cubes.

We will accomplish this by creating a floor with a material that gives the user better visual feedback of subtle movement, and by adding ambient audio for better immersion. Then we will add the ability to spawn a wall of cubes that the user can interact with.

NOTE: This project was not tested on HoloLens.

Preparing for the Tutorial

MRTK v2.0.0 RC1 Release: go to GitHub to download the Unity packages (*.unitypackage), located here: https://github.com/Microsoft/MixedRealityToolkit-Unity/releases

Create a Unity Project named MRTK-RC1-Demo-01. I have found that issues in loading the MRTK assets are minimized if you change the Build Settings to output a Universal Windows Platform (UWP) application, and set Player Settings/Other Settings/Api Compatibility Level to .NET Standard 2.0, before importing the MRTK assets. The images below indicate the settings I have modified.
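If you prefer to apply those two settings from an editor script rather than through the menus (entirely optional; the tutorial itself uses the UI, and the class and menu names below are just examples), a sketch might look like this:

#if UNITY_EDITOR
using UnityEditor;

// Optional, hypothetical helper: switches the active build target to UWP and sets the
// Api Compatibility Level to .NET Standard 2.0 before importing the MRTK packages.
public static class MrtkImportSetup
{
    [MenuItem("Tools/Prepare Project For MRTK Import")]
    public static void Prepare()
    {
        // Equivalent of switching the Build Settings platform to Universal Windows Platform.
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.WSA, BuildTarget.WSAPlayer);

        // Equivalent of Player Settings/Other Settings/Api Compatibility Level = .NET Standard 2.0.
        PlayerSettings.SetApiCompatibilityLevel(BuildTargetGroup.WSA, ApiCompatibilityLevel.NET_Standard_2_0);
    }
}
#endif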

Select menu item: Assets/Import Package/Custom Package…

to import the MRTK packages you already downloaded:

Microsoft.MixedReality.Toolkit.Unity.Foundation-v2.0.0-RC1.unitypackage

Microsoft.MixedReality.Toolkit.Unity.Examples-v2.0.0-RC1.unitypackage

Follow the MRTK Getting Started Guide (here: https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/GettingStartedWithTheMRTK.html ) to get the MRTK installed and set up. Then load a scene from the MRTK Examples and run the Unity application by clicking the Play button in the Unity editor. If the scene loads and runs without error, and your Windows Mixed Reality (WMR) headset or HoloLens displays the example scene you loaded, then we can begin writing our custom code.

Review of tasks so far:

Created the Project

Modified the Build settings to output a UWP application.

Modified The Player Settings API Compatibility Level to .NET Standard 2.0.

Imported the MRTK Unity packages.

Verified that the MRTK packages are installed correctly by running an MRTK example scene.

Custom Application Groundwork

Now import these free assets from the Unity Asset Store (These assets will be used in the application we are building in this – and future – tutorial(s)):

“Western Audio Music” – see Figure 03 for the specific assets imported.

“Lowpoly Wasteland Props” – all assets imported.

Now create a folder layout in the Project Assets to support our custom application. In the Project window, add the folder structure shown in Figure 04.

Select the /Scenes folder you created above and add a scene named Demo_01 to that folder.

With this new scene loaded into the Unity Editor, select the Mixed Reality Toolkit menu and select Add to Scene and Configure.

Figure 05 – The Hierarchy window after MRTK configuration.

The purpose of the hierarchy of folders we created (as displayed in Figure 04) is to keep our custom game objects and custom scripts separate from Unity assets imported via the Unity Asset Store and other sources such as GitHub. This creates a clear demarcation between your code and code from a foreign source.

Adding the Tutorial Custom Code and Assets

Step One – Environment Setup

Now we will create game objects in the Demo_01 scene that will use the MRTK. Create an empty game object named SceneContent in the root of the Hierarchy window.

Add a child empty game object named Environment to the SceneContent game object.
Position: 0, -1, 0
Add an Audio Source component:
AudioClip: “Western Outside Loop” (located in the “Western Demo Audio Assets” asset folder)
Play On Awake: checked
Loop: checked

Add a child empty game object named GlobalListener to the Environment game object.
Add a new MonoBehaviour named SpawnAction to the Assets/App/Content/Scripts folder (implementation shown in Code Listing 01).
Add Component: Interactable
Enabled: checked
Input Actions: Menu
Is Global: checked
OnClick() Event (shown in Figure 08): Runtime Only – GlobalListener – SpawnAction.SpawnPrefab

Add a child empty game object named Terrain to the Environment game object.

Add a child 3D Cube named FloorPanel to the Terrain game object.
Position: 0, -0.25, 0
Scale: 100, 0.5, 100
Add the Sand material (located under the “Lowpoly Wasteland Props” assets)

What we just did is set up an area for the VR user to move around in. We added an audio clip that begins playing as soon as the VR scene is initialized and loops continuously until we exit the application. This provides audio feedback to the user that the scene is active and ready. We also added a simple, flat, 100 meter by 100 meter floor so the user can teleport around in the environment. We added the Sand material to the FloorPanel game object to give the user better visual feedback on movement; if we had just assigned a solid color material to the floor, it would be difficult for the user’s senses (vision) to detect movement within the environment. We also dropped the Environment game object to 1 meter below the normal level (because my system, for some reason, routinely spawns me into the floor, at about chest height, when I don’t do this).

Additionally, we created an empty game object named SceneContent that will act as the root for all scene game objects other than those added by the MRTK configuration (items shown in Figure 05). We do this when using the MRTK (and I believe this applies to all VR applications) because, unlike most Unity 3D scenes, you don’t move the player and/or their attached camera to move around in the scene. In a VR application context, instead of moving the player/camera, you move the scene around the player/camera. This root object simplifies moving all the game objects in the scene at one go. To make this even more explicit, place all interactive visuals, spawn points, etc. in the scene as children of the SceneContent root game object.
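To make that pattern concrete, here is a minimal, hypothetical sketch (it is not one of the tutorial’s scripts, and the names are placeholders) of how player movement can be expressed by moving the SceneContent root rather than the camera:

using UnityEngine;

// Hypothetical sketch: instead of moving the VR camera rig, translate the
// SceneContent root in the opposite direction of the desired player movement.
public class SceneContentMover : MonoBehaviour
{
    [SerializeField] private Transform sceneContent; // assign the SceneContent root here

    // playerDelta is the movement we want the player to experience;
    // shifting the scene the opposite way produces that effect.
    public void MovePlayer(Vector3 playerDelta)
    {
        sceneContent.position -= playerDelta;
    }
}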

Step Two – Wiring Up the VR Controllers to Spawn Game Objects

Ground work:

Create a Material named DefaultGrabbableMaterial in the Assets/App/Content/Materials folder.
Set the Shader to Mixed Reality Toolkit/Standard.
Set the Albedo color to a pastel reddish ochre.

Add a 3D Cube named WallCube to the scene.
Position: 0, 0, 0
Add Component: Rigidbody (because we want gravity to affect it)
Add the DefaultGrabbableMaterial to the WallCube.
Add an empty MonoBehaviour named DragAndDropHandler to the Assets/App/Content/Scripts folder (implementation shown in Code Listing 02).

Drag game object WallCube into the Assets/App/Content/Prefabs folder and delete it from the scene.
Add an empty game object named WallOfCubes to the scene.
Add the WallCube prefab we created earlier as a child game object.
Duplicate the prefab 16 times (so you have a total of 17 WallCube prefab children).
Add Component: Grid Object Collection (from the MRTK SDK)
Sort Type: Child Order
Surface Type: Cylinder
Orient Type: Face Center Axis
Layout: Column Then Row
Radius: 8
Rows: 3
Cell Width: 1.32
Cell Height: 1
Click the Update Collection button on the component in the Inspector window. (A plain-Unity sketch that approximates this cylindrical layout appears after the video links below.)

Drag game object WallOfCubes into the Assets/App/Content/Prefabs folder and delete it from the scene.

Drag the WallOfCubes prefab into the SpawnAction Prefab To Spawn slot as shown in Figure 07.

At this point, you should be able to accomplish the following in the scene:

Teleport around the scene using the WMR controllers.

Click the WMR controller menu button (the small button below the thumb stick). When the menu button is clicked, a wall of 18 cubes is spawned 8 meters in front of the user’s original position and orientation.

Video Links of the Application in Action

MRTK RC1 vNext 2.0.0 Tutorial Result – Part 01 – Create Wall of Cubes

MRTK RC1 vNext 2.0.0 Tutorial Result – Part 02 – Drag and Drop MonoBehavior On Cube(s)

MRTK RC1 vNext 2.0.0 Tutorial Result – Part 03 – Throwing and Bashing Cubes
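For reference, the cylindrical layout that the Grid Object Collection computes for WallOfCubes (Step Two above) can be approximated with plain Unity transforms. This is only an illustrative sketch of the math, not the MRTK component’s actual implementation, and the class and field names are invented for this example:

using UnityEngine;

// Hypothetical illustration of a cylindrical "wall" layout similar to what the
// MRTK Grid Object Collection produces. Not the MRTK component; names are made up.
public class CylinderWallLayoutSketch : MonoBehaviour
{
    public GameObject cubePrefab;   // e.g. the WallCube prefab
    public int count = 17;          // total cubes, as configured in the tutorial
    public int rows = 3;
    public float radius = 8f;       // Radius: 8
    public float cellWidth = 1.32f; // Cell Width: 1.32
    public float cellHeight = 1f;   // Cell Height: 1

    private void Start()
    {
        int columns = Mathf.CeilToInt((float)count / rows);
        float angularStep = cellWidth / radius; // arc length of one cell -> angle in radians

        for (int i = 0; i < count; i++)
        {
            int row = i % rows;    // fill order roughly matching "Column Then Row"
            int column = i / rows;

            // Spread the columns symmetrically around this object's forward (+Z) axis.
            float angle = (column - (columns - 1) * 0.5f) * angularStep;
            Vector3 onCylinder = new Vector3(Mathf.Sin(angle), 0f, Mathf.Cos(angle)) * radius;
            Vector3 position = transform.position + onCylinder + Vector3.up * (row * cellHeight);

            // Turn each cube to face the central axis, roughly "Orient Type: Face Center Axis".
            Quaternion rotation = Quaternion.LookRotation(-onCylinder.normalized, Vector3.up);

            Instantiate(cubePrefab, position, rotation, transform);
        }
    }
}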

Code Listings

Code Listing 01 – Start – SpawnAction code

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Spawns a copy of the assigned prefab, offset from the prefab's original
// position by SpawnPosition. Wired to the Interactable's OnClick event.
public class SpawnAction : MonoBehaviour
{
    public GameObject PrefabToSpawn;
    public Vector3 SpawnPosition;

    public void SpawnPrefab()
    {
        // Instantiate the prefab and push it out by the configured offset.
        var spawnedPrefab = Instantiate(PrefabToSpawn);
        spawnedPrefab.transform.position += SpawnPosition;
    }
}

Code Listing 01 – End- SpawnAction code
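In the scene above, SpawnPrefab is invoked by the Interactable’s OnClick event on the GlobalListener object, presumably with SpawnPosition set to something like (0, 0, 8) in the Inspector to produce the 8-meter offset described earlier. For quick testing in the editor without a WMR controller, a throwaway helper along these lines (hypothetical, not part of the tutorial) could call the same method from the keyboard:

using UnityEngine;

// Hypothetical test helper: triggers SpawnAction.SpawnPrefab from the keyboard so the
// spawn behavior can be exercised in the editor without a controller.
public class SpawnActionKeyboardTester : MonoBehaviour
{
    [SerializeField] private SpawnAction spawnAction; // the SpawnAction on the GlobalListener object

    private void Update()
    {
        // Space bar stands in for the WMR controller menu button while testing.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            spawnAction.SpawnPrefab();
        }
    }
}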

Code Listing 02 – Start – DragAndDropHandler code

using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Physics;
using Microsoft.MixedReality.Toolkit.Utilities;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

/// <summary>
/// Component that allows dragging a <see cref="GameObject"/>.
/// Dragging is done by calculating the angular delta and z-delta between the current and previous hand positions,
/// and then repositioning the object based on that.
/// </summary>
public class DragAndDropHandler : BaseFocusHandler, IMixedRealityInputHandler<MixedRealityPose>, IMixedRealityPointerHandler, IMixedRealitySourceStateHandler
{
    private enum RotationModeEnum
    {
        Default,
        LockObjectRotation,
        OrientTowardUser,
        OrientTowardUserAndKeepUpright
    }

    [SerializeField]
    [Tooltip("The action that will start/stop the dragging.")]
    private MixedRealityInputAction dragAction = MixedRealityInputAction.None;

    [SerializeField]
    [Tooltip("The action that will provide the drag position.")]
    private MixedRealityInputAction dragPositionAction = MixedRealityInputAction.None;

    [SerializeField]
    [Tooltip("Transform that will be dragged. Defaults to the object of the component.")]
    private Transform hostTransform;

    [SerializeField]
    [Tooltip("Scale by which hand movement in Z is multiplied to move the dragged object.")]
    private float distanceScale = 2f;

    [SerializeField]
    [Tooltip("How should the GameObject be rotated while being dragged?")]
    private RotationModeEnum rotationMode = RotationModeEnum.Default;

    [SerializeField]
    [Range(0.01f, 1.0f)]
    [Tooltip("Controls the speed at which the object will interpolate toward the desired position")]
    private float positionLerpSpeed = 0.2f;

    [SerializeField]
    [Range(0.01f, 1.0f)]
    [Tooltip("Controls the speed at which the object will interpolate toward the desired rotation")]
    private float rotationLerpSpeed = 0.2f;

    /// <summary>
    /// Gets the pivot position for the hand, which is approximated to the base of the neck.
    /// </summary>
    /// <returns>Pivot position for the hand.</returns>
    private Vector3 HandPivotPosition => CameraCache.Main.transform.position + new Vector3(0, -0.2f, 0) - CameraCache.Main.transform.forward * 0.2f; // a bit lower and behind

    private bool isDragging;
    private bool isDraggingEnabled = true;
    private bool isDraggingWithSourcePose;

    // Used for moving with a pointer ray
    private float stickLength;
    private Vector3 previousPointerPositionHeadSpace;

    // Used for moving with a source position
    private float handRefDistance = -1;
    private float objectReferenceDistance;
    private Vector3 objectReferenceDirection;
    private Quaternion gazeAngularOffset;

    private Vector3 objectReferenceUp;
    private Vector3 objectReferenceForward;
    private Vector3 objectReferenceGrabPoint;

    private Vector3 draggingPosition;
    private Quaternion draggingRotation;

    private Rigidbody hostRigidbody;
    private bool hostRigidbodyWasKinematic;

    private IMixedRealityPointer currentPointer;
    private IMixedRealityInputSource currentInputSource;

    // If the dot product between hand movement and head forward is less than this amount,
    // don't exponentially increase the length of the stick
    private readonly float zPushTolerance = 0.1f;

    #region MonoBehaviour Implementation

    private void Start()
    {
        if (hostTransform == null)
        {
            hostTransform = transform;
        }

        hostRigidbody = hostTransform.GetComponent<Rigidbody>();
    }

    private void OnDestroy()
    {
        if (isDragging)
        {
            StopDragging();
        }
    }

    #endregion MonoBehaviour Implementation

    #region IMixedRealityPointerHandler Implementation

    void IMixedRealityPointerHandler.OnPointerUp(MixedRealityPointerEventData eventData)
    {
        if (!isDraggingEnabled || !isDragging || eventData.MixedRealityInputAction != dragAction || eventData.SourceId != currentInputSource?.SourceId)
        {
            // If we're not handling drag input or we're not releasing the right action, don't try to end a drag operation.
            return;
        }

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.

        StopDragging();
    }

    void IMixedRealityPointerHandler.OnPointerDown(MixedRealityPointerEventData eventData)
    {
        if (!isDraggingEnabled || isDragging || eventData.MixedRealityInputAction != dragAction)
        {
            // If we're already handling drag input or we're not grabbing, don't start a new drag operation.
            return;
        }

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.

        currentInputSource = eventData.InputSource;
        currentPointer = eventData.Pointer;

        FocusDetails focusDetails;
        Vector3 initialDraggingPosition = MixedRealityToolkit.InputSystem.FocusProvider.TryGetFocusDetails(currentPointer, out focusDetails)
            ? focusDetails.Point
            : hostTransform.position;

        isDraggingWithSourcePose = currentPointer == MixedRealityToolkit.InputSystem.GazeProvider.GazePointer;

        StartDragging(initialDraggingPosition);
    }

    void IMixedRealityPointerHandler.OnPointerClicked(MixedRealityPointerEventData eventData) { }

    #endregion IMixedRealityPointerHandler Implementation

    #region IMixedRealitySourceStateHandler Implementation

    void IMixedRealitySourceStateHandler.OnSourceDetected(SourceStateEventData eventData) { }

    void IMixedRealitySourceStateHandler.OnSourceLost(SourceStateEventData eventData)
    {
        if (eventData.SourceId == currentInputSource?.SourceId)
        {
            StopDragging();
        }
    }

    #endregion IMixedRealitySourceStateHandler Implementation

    #region BaseFocusHandler Overrides

    /// <inheritdoc />
    public override void OnFocusExit(FocusEventData eventData)
    {
        if (isDragging)
        {
            StopDragging();
        }
    }

    #endregion BaseFocusHandler Overrides

    /// <summary>
    /// Enables or disables dragging.
    /// </summary>
    /// <param name="isEnabled">Indicates whether dragging should be enabled or disabled.</param>
    public void SetDragging(bool isEnabled)
    {
        if (isDraggingEnabled == isEnabled)
        {
            return;
        }

        isDraggingEnabled = isEnabled;

        if (isDragging)
        {
            StopDragging();
        }
    }

    /// <summary>
    /// Starts dragging the object.
    /// </summary>
    private void StartDragging(Vector3 initialDraggingPosition)
    {
        if (!isDraggingEnabled || isDragging)
        {
            return;
        }

        Transform cameraTransform = CameraCache.Main.transform;

        currentPointer.IsFocusLocked = true;
        isDragging = true;

        if (hostRigidbody != null)
        {
            hostRigidbodyWasKinematic = hostRigidbody.isKinematic;
            hostRigidbody.isKinematic = true;
        }

        if (isDraggingWithSourcePose)
        {
            Vector3 pivotPosition = HandPivotPosition;
            objectReferenceDistance = Vector3.Magnitude(initialDraggingPosition - pivotPosition);
            objectReferenceDirection = cameraTransform.InverseTransformDirection(Vector3.Normalize(initialDraggingPosition - pivotPosition));
        }
        else
        {
            Vector3 inputPosition = currentPointer.Position; //currentPointer.TryGetPointerPosition(out inputPosition);
            previousPointerPositionHeadSpace = cameraTransform.InverseTransformPoint(inputPosition);
            stickLength = Vector3.Distance(initialDraggingPosition, inputPosition);
        }

        // Store where the object was grabbed from
        objectReferenceGrabPoint = cameraTransform.transform.InverseTransformDirection(hostTransform.position - initialDraggingPosition);

        // in camera space
        objectReferenceForward = cameraTransform.InverseTransformDirection(hostTransform.forward);
        objectReferenceUp = cameraTransform.InverseTransformDirection(hostTransform.up);

        draggingPosition = initialDraggingPosition;
    }

    /// <summary>
    /// Stops dragging the object.
    /// </summary>
    private void StopDragging()
    {
        if (!isDragging)
        {
            return;
        }

        currentPointer.IsFocusLocked = false;
        isDragging = false;
        handRefDistance = -1;

        if (hostRigidbody != null)
        {
            hostRigidbody.isKinematic = hostRigidbodyWasKinematic;
        }
    }

    #region IMixedRealityInputHandler<MixedRealityPose> Implementation

    void IMixedRealityInputHandler<MixedRealityPose>.OnInputChanged(InputEventData<MixedRealityPose> eventData)
    {
        if (eventData.MixedRealityInputAction != dragPositionAction || !isDraggingEnabled || !isDragging || eventData.SourceId != currentInputSource?.SourceId)
        {
            return;
        }

        Transform cameraTransform = CameraCache.Main.transform;
        Vector3 pivotPosition = Vector3.zero;

        if (isDraggingWithSourcePose)
        {
            Vector3 inputPosition = eventData.InputData.Position;
            pivotPosition = HandPivotPosition;
            Vector3 newHandDirection = Vector3.Normalize(inputPosition - pivotPosition);

            if (handRefDistance < 0)
            {
                handRefDistance = Vector3.Magnitude(inputPosition - pivotPosition);

                Vector3 handDirection = cameraTransform.InverseTransformDirection(Vector3.Normalize(inputPosition - pivotPosition));

                // Store the initial offset between the hand and the object, so that we can consider it when dragging
                gazeAngularOffset = Quaternion.FromToRotation(handDirection, objectReferenceDirection);
            }

            // in camera space
            newHandDirection = cameraTransform.InverseTransformDirection(newHandDirection);
            Vector3 targetDirection = Vector3.Normalize(gazeAngularOffset * newHandDirection);
            // back to world space
            targetDirection = cameraTransform.TransformDirection(targetDirection);

            float currentHandDistance = Vector3.Magnitude(inputPosition - pivotPosition);
            float distanceRatio = currentHandDistance / handRefDistance;
            float distanceOffset = distanceRatio > 0 ? (distanceRatio - 1f) * distanceScale : 0;
            float targetDistance = objectReferenceDistance + distanceOffset;

            draggingPosition = pivotPosition + (targetDirection * targetDistance);
        }
        else
        {
            pivotPosition = cameraTransform.position;

            Vector3 pointerPosition = currentPointer.Position; //currentPointer.TryGetPointerPosition(out pointerPosition);
            Ray pointingRay = currentPointer.Rays[0]; //currentPointer.TryGetPointingRay(out pointingRay);

            Vector3 currentPosition = pointerPosition;
            Vector3 currentPositionHeadSpace = cameraTransform.InverseTransformPoint(currentPosition);
            Vector3 positionDeltaHeadSpace = currentPositionHeadSpace - previousPointerPositionHeadSpace;

            float pushDistance = Vector3.Dot(positionDeltaHeadSpace, cameraTransform.InverseTransformDirection(pointingRay.direction.normalized));

            if (Mathf.Abs(Vector3.Dot(positionDeltaHeadSpace.normalized, Vector3.forward)) > zPushTolerance)
            {
                stickLength = DistanceRamp(stickLength, pushDistance);
            }

            draggingPosition = pointingRay.GetPoint(stickLength);
            previousPointerPositionHeadSpace = currentPositionHeadSpace;
        }

        switch (rotationMode)
        {
            case RotationModeEnum.OrientTowardUser:
            case RotationModeEnum.OrientTowardUserAndKeepUpright:
                draggingRotation = Quaternion.LookRotation(hostTransform.position - pivotPosition);
                break;
            case RotationModeEnum.LockObjectRotation:
                draggingRotation = hostTransform.rotation;
                break;
            default:
                // in world space
                Vector3 objForward = cameraTransform.TransformDirection(objectReferenceForward);
                // in world space
                Vector3 objUp = cameraTransform.TransformDirection(objectReferenceUp);
                draggingRotation = Quaternion.LookRotation(objForward, objUp);
                break;
        }

        Vector3 newPosition = Vector3.Lerp(hostTransform.position, draggingPosition + cameraTransform.TransformDirection(objectReferenceGrabPoint), positionLerpSpeed);

        // Apply Final Position
        if (hostRigidbody == null)
        {
            hostTransform.position = newPosition;
        }
        else
        {
            hostRigidbody.MovePosition(newPosition);
        }

        // Apply Final Rotation
        Quaternion newRotation = Quaternion.Lerp(hostTransform.rotation, draggingRotation, rotationLerpSpeed);
        if (hostRigidbody == null)
        {
            hostTransform.rotation = newRotation;
        }
        else
        {
            hostRigidbody.MoveRotation(newRotation);
        }

        if (rotationMode == RotationModeEnum.OrientTowardUserAndKeepUpright)
        {
            Quaternion upRotation = Quaternion.FromToRotation(hostTransform.up, Vector3.up);
            hostTransform.rotation = upRotation * hostTransform.rotation;
        }
    }

    #endregion IMixedRealityInputHandler<MixedRealityPose> Implementation

    #region Private Helpers

    /// <summary>
    /// Gets the pivot position for the hand, which is approximated to the base of the neck.
    /// </summary>
    /// <remarks>
    /// An exponential distance ramping where distance is determined by:
    /// f(t) = (e^At - 1)/B
    /// where:
    /// A is a scaling factor: how fast the function ramps to infinity
    /// B is a second scaling factor: a denominator that shallows out the ramp near the origin
    /// t is a linear input
    /// f(t) is the distance exponentially ramped along variable t
    ///
    /// Here's a quick derivation for the expression below.
    /// A = constant
    /// B = constant
    /// d = ramp(t) = (e^At - 1)/B
    /// t = ramp_inverse(d) = ln(B*d+1)/A
    /// In general, if y=f(x), then f(currentY, deltaX) = f( f_inverse(currentY) + deltaX )
    /// So,
    /// ramp(currentD, deltaT) = (e^(A*(ln(B*currentD + 1)/A + deltaT)) - 1)/B
    /// simplified:
    /// ramp(currentD, deltaT) = (e^(A*deltaT) * (B*currentD + 1) - 1) / B
    /// </remarks>
    private static float DistanceRamp(float currentDistance, float deltaT, float A = 4.0f, float B = 75.0f)
    {
        return (Mathf.Exp(A * deltaT) * (B * currentDistance + 1) - 1) / B;
    }

    #endregion Private Helpers
}

Code Listing 02 – End – DragAndDropHandler code