
A3.5 - Building Behaviors

Building on the previous activities, this activity covers the creation of behaviors in Unity 3D for VR and AR, targeting two devices: the Valve Index and the HoloLens 2, respectively.

  • VR

In Unity 3D, two well-known packages are the XR Interaction Toolkit and OpenXR, which provide pre-defined components for creating VR behaviors.

  1. Create a player:

    The player is the virtual representation of the user, named “XRRig”. To create it, you can follow the hierarchy below, where the camera represents the position and orientation of the user’s head, and the hands represent the position and rotation of the user’s controllers.

    • XRRig
      • Camera
      • LeftHand
      • RightHand
  2. Teleportation Area:

    Teleport

    • Define the user input that performs the teleportation; in this case, push the thumbstick up to activate teleportation and release the thumbstick to execute it.
    • Create a teleportation area where the user can move freely. You can also create specific teleportation points. These are the areas or points the user can reach in the virtual environment; note also the zones the user cannot reach.
    • Deliver some feedback to the user: how will they know that teleportation has been activated, and how will they know where they will be teleported to? In this case, a curve and a reticle are used.
  3. Object selection:

    • Add colliders and rigidbodies so that raycasting and the hand can detect the presence of an object. For collisions, it is suggested to use a primitive shape (cube, sphere, cylinder) that approximates the object. It is possible to create behavior while the user is selecting an object, using different states:
      • On Hover Entered / On Hover Exited
      • Select Entered / Select Exited
      • The above states can be used to inform the user that an object has been hit and can be selected. In this case, the ray changes color when it hits the object, and the object increases in size.
  4. Object manipulation:

    • Create a socket or define a place where the selected object will be manipulated. In this case, the selected object goes directly to the user’s hand. It is also possible to create a gripping pose to give the user a more realistic feel. It is possible to create behavior while the user is manipulating an object, using different states:

    • On Manipulation Started / On Manipulation Ended
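
The hover and select states above can be wired up from a script. The following is a minimal sketch, assuming XR Interaction Toolkit 2.x (where `XRGrabInteractable` exposes the `hoverEntered`, `hoverExited`, `selectEntered`, and `selectExited` events) and a `Renderer` on the object; the class name `HighlightOnHover` is illustrative, and event names may differ in other toolkit versions.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative sketch: highlights an interactable while a ray or hand hovers
// over it, and scales it up while it is selected. Assumes the GameObject also
// has a Collider, a Rigidbody, and an XRGrabInteractable component.
[RequireComponent(typeof(XRGrabInteractable))]
public class HighlightOnHover : MonoBehaviour
{
    [SerializeField] private Color hoverColor = Color.yellow;
    [SerializeField] private float selectedScale = 1.2f;

    private XRGrabInteractable interactable;
    private Renderer objectRenderer;
    private Color originalColor;
    private Vector3 originalScale;

    private void Awake()
    {
        interactable = GetComponent<XRGrabInteractable>();
        objectRenderer = GetComponent<Renderer>();
        originalColor = objectRenderer.material.color;
        originalScale = transform.localScale;

        // On Hover Entered / On Hover Exited
        interactable.hoverEntered.AddListener(_ => objectRenderer.material.color = hoverColor);
        interactable.hoverExited.AddListener(_ => objectRenderer.material.color = originalColor);

        // Select Entered / Select Exited
        interactable.selectEntered.AddListener(_ => transform.localScale = originalScale * selectedScale);
        interactable.selectExited.AddListener(_ => transform.localScale = originalScale);
    }
}
```

The same listeners can also be assigned in the Inspector on the XR Grab Interactable component, without writing code.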

  • AR

As before in Unity 3D, MRTK is used: a high-level SDK developed by Microsoft that provides pre-defined components for creating XR experiences.

It requires a 3D representation of the object to be manipulated; to keep things simple, let’s use the default cube that comes with Unity.

Before setting up the interactions, add a collider to the parts of the object that will be manipulated. Once this is done, click Add Component and add the ObjectManipulator script.

The ObjectManipulator script makes an object movable, scalable, and rotatable using one or two hands. The object manipulator can be configured to control how the object will respond to various inputs.

To make the object respond to near articulated hand input, add the NearInteractionGrabbable script as well.

Just by doing this, the object is now manipulable in Augmented Reality. By also adding the Rigidbody component, it can have physical properties, such as gravity.
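
The same setup can also be performed from code. The following is a minimal sketch, assuming MRTK 2.x is imported (the `Microsoft.MixedReality.Toolkit.UI` and `Microsoft.MixedReality.Toolkit.Input` namespaces); the class name `MakeManipulable` is illustrative.

```csharp
using UnityEngine;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;

// Illustrative sketch: makes the GameObject manipulable in AR, mirroring the
// manual steps above (collider -> ObjectManipulator ->
// NearInteractionGrabbable -> optional Rigidbody for physics).
public class MakeManipulable : MonoBehaviour
{
    private void Start()
    {
        // A collider is required so that rays and hands can hit the object.
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        // Enables moving, rotating, and scaling with one or two hands.
        gameObject.AddComponent<ObjectManipulator>();

        // Enables direct grabbing with near articulated hand input.
        gameObject.AddComponent<NearInteractionGrabbable>();

        // Optional: physical properties, such as gravity.
        var body = gameObject.AddComponent<Rigidbody>();
        body.useGravity = true;
    }
}
```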


  • Manipulation events

The term “manipulation events” refers to events that are triggered when a user manipulates an object.

The ObjectManipulator script provides the following events:

OnManipulationStarted: Fires when manipulation starts.
OnManipulationEnded: Fires when manipulation ends.
OnHoverStarted: Fires when a hand / controller hovers over the manipulable object, near or far.
OnHoverEnded: Fires when a hand / controller stops hovering over the manipulable object, near or far.

The event fire order for manipulation is:

OnHoverStarted → OnManipulationStarted → OnManipulationEnded → OnHoverEnded

If there is no manipulation, it will still get hover events with the following fire order:

OnHoverStarted → OnHoverEnded
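
These events can be subscribed to from a script. The following is a minimal sketch, assuming an ObjectManipulator component on the same GameObject; note that in the MRTK 2.x API the hover events are exposed as `OnHoverEntered` / `OnHoverExited`, and the class name `LogManipulationEvents` is illustrative.

```csharp
using UnityEngine;
using Microsoft.MixedReality.Toolkit.UI;

// Illustrative sketch: logs the manipulation events in the fire order
// described above. Assumes an ObjectManipulator on the same GameObject.
[RequireComponent(typeof(ObjectManipulator))]
public class LogManipulationEvents : MonoBehaviour
{
    private void Awake()
    {
        var manipulator = GetComponent<ObjectManipulator>();

        manipulator.OnHoverEntered.AddListener(_ => Debug.Log("Hover started"));
        manipulator.OnManipulationStarted.AddListener(_ => Debug.Log("Manipulation started"));
        manipulator.OnManipulationEnded.AddListener(_ => Debug.Log("Manipulation ended"));
        manipulator.OnHoverExited.AddListener(_ => Debug.Log("Hover ended"));
    }
}
```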

Input

Guidelines and procedures for constructing behaviors in Unity 3D for immersive VR (using the Valve Index) and AR (using the HoloLens 2), focusing on the XR Interaction Toolkit for VR and MRTK for AR.

Output

A fully functional VR and AR application in Unity 3D where users can experience pre-defined behaviors, such as teleportation in VR and object manipulation in AR.

Control

  • SDKs integration in Unity 3D
  • Quality and accuracy of the developed behaviors
  • Performance running on the XR device
  • User feedback

Resources

  • Unity 3D Development Environment
  • XR Interaction Toolkit
  • MRTK (Microsoft Mixed Reality Toolkit)
  • 3D Asset Libraries