When you first enter the scene, you see several 3D objects, which display behind the passthrough scene.
Adjust the opacity of the Passthrough layer using the right controller thumbstick. Move the thumbstick left to make it more transparent and right to make it more opaque. Leaving it in the middle leaves it at 50% opacity, revealing the Unity scene beneath it.
Overlay sample
Details
There are two ways to adjust the opacity of the Passthrough layer.
You can set the OVRPassthroughLayer to Overlay. This technique is demonstrated in this sample. To view the property, select the OVRCameraRig object in the Hierarchy window. Then, in the Inspector, expand the OVR PassthroughLayer (Script).
Alternatively, you can set the opacity to 0 and turn on edge rendering. This renders a Sobel edge-filtered version of Passthrough above the Unity scene. This technique is not covered in this sample. Both Opacity and Edge Rendering are also found in OVR PassthroughLayer (Script).
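The thumbstick-driven opacity control described above can be sketched with a small script. This is an illustrative sketch, not the sample's actual OverlayPassthrough.cs, though it uses the real OVRPassthroughLayer properties (textureOpacity, edgeRenderingEnabled) and the OVRInput thumbstick axis:

```csharp
using UnityEngine;

// Illustrative sketch: maps the right thumbstick's X axis to passthrough
// opacity, as described above. Not the sample's OverlayPassthrough.cs.
public class PassthroughOpacityControl : MonoBehaviour
{
    public OVRPassthroughLayer passthroughLayer; // assign the layer on OVRCameraRig

    void Update()
    {
        // Thumbstick X ranges from -1 (left) to +1 (right); remap to 0..1,
        // so a centered thumbstick yields 50% opacity.
        float x = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick).x;
        passthroughLayer.textureOpacity = Mathf.Clamp01(0.5f + 0.5f * x);

        // Alternative technique: opacity 0 with Sobel edge rendering enabled.
        // passthroughLayer.textureOpacity = 0f;
        // passthroughLayer.edgeRenderingEnabled = true;
    }
}
```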
Key Assets
The assets used in this scene are:
InfoPanel.prefab
Provides the scene runtime information
Location: .\Assets\StarterSamples\Usage\Passthrough\Prefabs
OverlayPassthrough.cs
Provides the overlay opacity control code
Location: .\Assets\StarterSamples\Usage\Passthrough\Scripts
Depth API for Unity
Health and Safety Recommendation: While building mixed reality experiences, we highly recommend evaluating your content to offer your users a comfortable and safe experience. Please refer to the Health and Safety and Design guidelines before designing and developing your app using Depth.
Overview
The Depth API exposes real-time sensed environment depth to apps in the form of depth maps. It may be used for a variety of depth-based visual effects, but mainly it is used in mixed reality (MR) to render virtual objects so that they are occluded by objects and surfaces in the real world and appear to be embedded in the actual environment. Without occlusions, it is apparent that the virtual content is just a layer drawn on top of the real world, which breaks MR immersion.
Scene with and without occlusions
Supported use cases
A comparison of the use cases supported by the Depth API and the Scene API:
Static Occlusion: the occluding real-world environment remains immobile (static) throughout the lifetime of the app, i.e. no moving objects such as hands, people, or chairs.
Dynamic Occlusion: the occluding real-world environment contains mobile (dynamic) elements, e.g. the user's hands, other people, or pets.
Raycasting: computing the intersection of a ray and real-world surfaces. Supports use cases like content placement.
Physics/Collisions: interactions between virtual content and the real world, such as a virtual ball bouncing off a physical couch.
Limitations
Limitation: Occlusions flicker near surfaces.
Reason: This is caused by an issue often referred to as "Z-fighting". In 3D graphics, it usually happens when two virtual objects are rendered at the same depth. Environment depth values are produced within an error margin, so z-fighting is apparent even when the depth is not precisely the same from frame to frame.
Suggested workaround: The sample soft-occlusion implementation has mechanics in place to partially mitigate this issue. However, it is recommended to offset objects that you place on Scene Model surfaces along the surface normal.

Limitation: Occlusions don't exactly match the real world and lag behind during fast movements.
Reason: Limitations in real-time depth sensing don't allow for pixel-perfect occlusion at the same framerate at which apps render.
Suggested workaround: Soft-occlusion shaders help mitigate the visibility of this, but apps need to be designed with this limitation in mind.
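The suggested workaround of offsetting placed objects along the surface normal can be sketched as follows. This is an illustrative helper, not part of the SDK; the offset distance is an assumption to tune per app:

```csharp
using UnityEngine;

// Illustrative sketch: when placing content on a Scene Model surface via a
// raycast hit, nudge it along the surface normal so sensed-depth noise does
// not cause z-fighting flicker against the surface.
public static class SurfacePlacement
{
    // A couple of centimeters is a reasonable starting point; tune per app.
    const float OffsetMeters = 0.02f;

    public static void PlaceOnSurface(Transform content, RaycastHit hit)
    {
        content.position = hit.point + hit.normal * OffsetMeters;
        content.rotation = Quaternion.LookRotation(hit.normal); // face away from the surface
    }
}
```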
FAQs
Should I use both Scene and Depth API together?
To build realistic MR apps, both capabilities should be used together to cover a broader set of use cases than either of these capabilities can address in isolation. The recommended flow is:
Prompt users to initiate the space setup flow to build a 3D scene model of their environment.
Use depth maps from the Depth API to render occlusions based on the per-frame sensed depth, which takes advantage of the scene model to improve the depth estimates.
If needed, use Scene API to implement other features such as game physics and object placement, which requires a watertight static 3D model of the environment.
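Step 1 of the flow above might be wired up along these lines. This is a hedged sketch assuming an OVRSceneManager component from the Meta XR SDK; event and method names may differ between SDK versions:

```csharp
using UnityEngine;

// Hedged sketch of the first step of the recommended flow: load the scene
// model if one exists, otherwise prompt the system space-setup flow so the
// Depth API can later use the scene model to improve its depth estimates.
public class SceneSetupPrompt : MonoBehaviour
{
    public OVRSceneManager sceneManager; // assign in the Inspector

    void Start()
    {
        // If no scene model exists yet, launch the system space-setup flow.
        sceneManager.NoSceneModelToLoad += () => sceneManager.RequestSceneCapture();
        sceneManager.LoadSceneModel();
    }
}
```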
What if scene capture is not initiated?
If space setup is not initiated, the Depth API cannot take advantage of the scene model when computing the depth maps. In this case, the depth maps returned from the Depth API will be computed only from the sensed depth. This makes the depth estimates worse and less stable, especially on planar surfaces such as walls, floors, and tables. When using the Depth API for occlusion, this will increase flickering when virtual objects are close to these real-world surfaces.
Are there scenarios where I should only be using the Depth API?
It is technically possible to use the Depth API independently of Scene. Note that when used on its own, it only exposes depth maps based on sensed depth alone. Depending on the use case for your app, you may choose to use only the Depth API.
Best practices
Today, the Scene Model allows you to build room-scale, mixed reality experiences with realistic occlusion. However, this method does not support occlusion with dynamically moving objects in the user’s view (for example, hands, limbs, other people, pets, etc.). In order to realistically occlude with dynamic elements of the Scene, you must also use the Depth API.
Spatial data permission
An app that wants to use Depth API needs to request spatial data permission during the app’s runtime. See the Unity section of Spatial Data Permission and Requesting runtime permissions for more information.
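The runtime request can be sketched as below. The permission string com.oculus.permission.USE_SCENE is the spatial data permission; the sketch assumes Unity's standard Android permission API, and the permission must also be declared in the Android manifest:

```csharp
using UnityEngine;
using UnityEngine.Android;

// Sketch: request the spatial data permission at runtime before using the
// Depth API. Also declare the permission in AndroidManifest.xml.
public class SpatialPermissionRequest : MonoBehaviour
{
    const string SpatialPermission = "com.oculus.permission.USE_SCENE";

    void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(SpatialPermission))
        {
            Permission.RequestUserPermission(SpatialPermission);
        }
    }
}
```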
Sample
https://github.com/oculus-samples/Unity-DepthAPI
Requirements
Unity 2022.3.1f or Unity 2023.2
Meta XR All-in-One SDK v60 or later
A version of the com.unity.xr.oculus package that supports the Depth API. In the Unity Package Manager, choose Add package by name and enter the name com.unity.xr.oculus and version 4.2.0.
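Equivalently, the package can be pinned in the project's Packages/manifest.json, using the standard Unity manifest format (other entries in your dependencies block stay as they are):

```json
{
  "dependencies": {
    "com.unity.xr.oculus": "4.2.0"
  }
}
```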