Unity MR Part 3: Camera Component
Posted on January 10, 2024
Stumbled here by accident? Start with the introduction!
This article will guide you through understanding cameras in the context of MR and their implementation in Unity. It covers adding the camera to a scene and examines the XR Origin (AR) Prefab, providing essential insights for integrating cameras effectively in MR environments using Unity.
ℹ️ If you find yourself facing any difficulties, remember that you can always refer to or download the code from our accompanying GitHub repository.
In Unity, the camera is a crucial component for creating visual experiences in both traditional game development and MR. It acts as the viewer's eye, determining what is seen on the screen. In MR, the camera not only renders the virtual environment but is often integrated with real-world inputs, allowing for a seamless blend of physical and digital elements.
The camera in AR Foundation for Unity is crucial for AR applications, offering features like camera components, image capture, and access to EXIF data from device cameras. The Unity OpenXR Meta plugin supplements this by focusing on Meta Quest devices.
On Meta Quest devices, AR Foundation's Camera subsystem controls Meta Passthrough. Enable the AR Camera Manager component to enable Passthrough, and disable it to disable Passthrough.
Adding a Camera to the Scene
- Remove the default Main Camera from the hierarchy.
- Create an empty GameObject named MR.
- Move the Session GameObject into the newly created MR GameObject.
- In the Hierarchy view, right-click the MR GameObject and select XR → XR Origin (AR).

After adding the XR Origin, your scene should look as follows:
The XR Origin (XR Rig) Prefab is composed of a Camera Offset, a Main Camera, and two GameObjects for the left and right controllers. We will delve into the XR Origin (XR Rig) Prefab in this article; the subsequent article in this series will cover the setup of both controllers in more detail.
You'll observe that an XR Interaction Manager has been automatically generated in your scene.
ℹ️ The XR Interaction Manager in Unity is a central component that manages interactive elements in XR (Extended Reality) environments. It handles the interactions between the user and virtual objects, including input from controllers and headsets. This manager coordinates actions like grabbing, touching, or selecting objects, ensuring smooth and intuitive interaction within the virtual world, which is essential for creating immersive and responsive XR experiences.
We'll get back to the XR Interaction Manager in the next article.
Exploring the XR Origin (AR) Prefab
XR Origin (XR Rig)
The XR Origin represents the session-space origin (0, 0, 0) in an XR scene.
ℹ️ The session-space origin refers to the reference point or coordinate system that is established when an XR session starts. This origin is used to define the positions and orientations of objects and users within the virtual environment. It's crucial for accurately placing and moving virtual elements in relation to the real world or the user's environment, ensuring a coherent and immersive experience in XR applications.
The XR Origin component is typically attached to the base object of the XR Origin and stores the GameObject that will be manipulated via locomotion. It is also used for offsetting the camera.
ℹ️ Locomotion refers to the methods by which a user moves within a virtual environment. This can include physical movements, like walking or turning, which are tracked and translated into movement within the virtual space. Alternatively, it can involve virtual movement methods such as teleportation, sliding, or using a controller to navigate. Effective locomotion techniques are key to creating a comfortable and immersive XR experience.
XROrigin Script fields
| Field | Description |
| --- | --- |
| Origin Base GameObject | The GameObject used as the base of the XR Origin; by default it is this GameObject. This is the GameObject that will be manipulated via locomotion. |
| Camera Floor Offset Object | The GameObject to move to the desired height off the floor (defaults to this object if none is provided). It is used to transform the XR device from camera space to XR Origin space. |
| Camera | The Camera used to render the scene from the point of view of the XR device. Must be a child of the GameObject containing this XROrigin component. |
| Tracking Origin Mode | Sets which Tracking Origin Mode to use when initializing the input device. NotSpecified: uses the default Tracking Origin Mode of the input device. Device: input devices are tracked relative to the first known location. Floor: input devices are tracked relative to a location on the floor. |
| Camera Y Offset | The camera height used in the Device Tracking Origin Mode to define the height of the user from the floor. This is the amount the camera is offset from the floor when moving the CameraFloorOffsetObject. |
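If you want to confirm these values at runtime, the following minimal sketch logs the fields described above via the XROrigin API from the Unity.XR.CoreUtils package. The class name XROriginInspector is just an example of ours; attach it to any GameObject in the scene.

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

// A minimal sketch: logs the XROrigin fields described in the table above.
// Assumes the XR Origin (XR Rig) prefab is present in the scene.
public class XROriginInspector : MonoBehaviour
{
    void Start()
    {
        // Find the XROrigin component that the prefab added to the scene.
        XROrigin xrOrigin = FindObjectOfType<XROrigin>();

        Debug.Log($"Origin Base GameObject: {xrOrigin.Origin.name}");
        Debug.Log($"Camera Floor Offset Object: {xrOrigin.CameraFloorOffsetObject.name}");
        Debug.Log($"Camera: {xrOrigin.Camera.name}");
        Debug.Log($"Tracking Origin Mode: {xrOrigin.RequestedTrackingOriginMode}");
        Debug.Log($"Camera Y Offset: {xrOrigin.CameraYOffset}");
    }
}
```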
Main Camera
Let's examine the Main Camera component, focusing specifically on the additional scripts and systems essential for creating an MR experience. This article will not cover the basic aspects of Unity's Camera component. Instead, it will delve into the AR Camera Manager, AR Camera Background, and Tracked Pose Driver, which are crucial for implementing an MR experience.
AR Camera Manager
This component controls Passthrough on Meta Quest devices. Enable this component to enable Passthrough, and disable it to disable Passthrough. Properties such as Auto Focus and Facing Direction have no effect on Meta Quest devices.
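Because toggling the component is all it takes, you can also switch Passthrough from a script. Here is a minimal sketch; the PassthroughToggle class name and the Inspector-assigned reference are our own assumptions, not part of the project so far.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// A minimal sketch: toggles Meta Passthrough at runtime by enabling or
// disabling the AR Camera Manager component, as described above.
public class PassthroughToggle : MonoBehaviour
{
    // Assumed to be assigned in the Inspector to the Main Camera's ARCameraManager.
    [SerializeField] private ARCameraManager cameraManager;

    public void SetPassthrough(bool passthroughEnabled)
    {
        // On Meta Quest, enabling the ARCameraManager enables Passthrough,
        // and disabling it disables Passthrough.
        cameraManager.enabled = passthroughEnabled;
    }
}
```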
ℹ️ Passthrough refers to a technology that allows users to see the real world through their headset's cameras while still being in a virtual or augmented environment. This blending of real-world visuals with virtual elements helps create a mixed reality experience, enabling users to interact with both physical and digital objects, enhancing safety and immersion.
AR Camera Background
This component has no effect on Meta Quest devices, so we disable it by unchecking the checkbox next to AR Camera Background, as seen in the screenshot above.
Tracked Pose Driver (Input System)
The Tracked Pose Driver is a component used in XR development. It tracks the position and rotation of the headset or controller and applies these transformations to a GameObject in Unity. This allows the GameObject to mirror the real-world movements of the device.
In our scenario, the current pose value of the tracked device is applied to the Transform of the Main Camera. This ensures that the camera's position and orientation in the virtual environment accurately reflect the real-world movements of the tracked device, enhancing the realism and interactivity of the experience.
Testing the app
Run your Unity project as explained in the last article, Session Component.
You should now see the Unity project running in your headset as seen in the next screenshot.
Since we're utilizing Meta Quest Link, passthrough functionality is not available when running the app from within the Unity Editor. The red lines represent the left and right controllers, automatically integrated via the XR Origin Prefab.
Additionally, the black background is a consequence of the solid color setting configured in the Environment section of our camera component.
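For reference, the same background setting can be expressed through the standard Camera API; this is purely illustrative and not something the project requires, since we configure it in the Inspector.

```csharp
using UnityEngine;

// Illustrative sketch: the code equivalent of setting the camera background
// to a solid black color, as configured in the Environment section.
public class SolidColorBackground : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black;
    }
}
```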
Next article
In our next article, we're excited to introduce the concept of Passthrough technology, a pivotal feature for entering the realm of MR. Passthrough acts as a bridge between the virtual and real worlds, allowing users to see their physical environment through a digital lens.