Illustrating how the highlights and shadows shader renders an image.
The PTRLHighlightsAndShadows shader computes highlight intensity from point light sources and marks shadow areas by adjusting transparency. The result is processed by the OpenXR compositor and blended with the Passthrough layer.
When a virtual object rendered with this shader overlaps its real physical counterpart, it creates a relighting effect: the real object appears lit by virtual lights and shadowed by virtual objects.
The BiRP subshader in PTRLHighlightsAndShadows contains multiple passes:
The first ForwardAdd pass renders highlights from point lights and from additional directional lights. It runs once for each light that is not the main light and accumulates the diffuse contribution, but without multiplying in the albedo term that a regular diffuse shader would include.
Next, the ForwardBase pass handles the shadows cast by the main light. It outputs black multiplied by the shadow intensity set on the material.
Finally, a second ForwardAdd pass adds shadow contributions from additional directional lights.
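The per-pass math described above can be sketched as follows. This is an illustrative approximation, not the actual shader source: the function names, the simple Lambert term, and the blend-weight convention are assumptions.

```python
# Illustrative sketch of the PTRLHighlightsAndShadows pass math (not the
# actual shader source). Highlights accumulate Lambertian diffuse light
# WITHOUT an albedo term; shadows output black scaled by a material-level
# shadow intensity.

def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def highlight_pass(normal, lights):
    """ForwardAdd-style pass: sum the diffuse contribution of each extra
    light. Note: no albedo multiply, unlike a regular diffuse shader."""
    color = [0.0, 0.0, 0.0]
    for light_dir, light_color in lights:
        ndotl = max(0.0, dot3(normal, light_dir))
        color = [c + ndotl * lc for c, lc in zip(color, light_color)]
    return color

def shadow_pass(shadow_attenuation, shadow_intensity):
    """ForwardBase-style pass: black, weighted by how shadowed the fragment
    is (attenuation 0 = fully shadowed, 1 = fully lit)."""
    strength = (1.0 - shadow_attenuation) * shadow_intensity
    return [0.0, 0.0, 0.0], strength  # RGB black plus a blend weight

# Example: surface facing up, one overhead white light -> full highlight.
print(highlight_pass((0.0, 1.0, 0.0), [((0.0, 1.0, 0.0), (1.0, 1.0, 1.0))]))
```

The key point the sketch captures is that the highlight passes add light on top of the passthrough image rather than shading an albedo texture, which is what makes the real-world surface appear relit.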
URP version
There is also a URP subshader, which implements the PTRL effect in a single pass: the UniversalForward pass computes the light contributions and shadows from all light sources.
Blob shadow
A common alternative to real-time shadows is the blob shadow: a simple blot of color that ignores the geometry of the object and is therefore more performant. The Oppy entity contains a BlobShadow game object that enables the technique. BlobShadow objects are built with the Projector component, together with a dedicated material and shader found in the corresponding folder.
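Conceptually, a blob shadow just projects a fixed dark shape straight down onto the ground, regardless of the caster's silhouette. A minimal sketch of that idea (hypothetical names and falloff; in Unity the effect is produced by the Projector component and its shader, not per-point CPU code):

```python
def blob_shadow_alpha(ground_point, caster_pos, radius=0.5, max_height=3.0):
    """Opacity of a blob shadow at a ground point: a soft disc directly
    below the caster, fading out as the caster rises higher.

    ground_point: (x, z) on the ground plane; caster_pos: (x, y, z)."""
    # Horizontal distance from the point directly under the caster (XZ plane).
    dx = ground_point[0] - caster_pos[0]
    dz = ground_point[1] - caster_pos[2]
    dist = (dx * dx + dz * dz) ** 0.5
    if dist >= radius:
        return 0.0
    falloff = 1.0 - dist / radius                      # soft edge at the rim
    height_fade = max(0.0, 1.0 - caster_pos[1] / max_height)
    return falloff * height_fade
```

Because the shape and falloff are fixed, the cost is independent of the caster's mesh complexity, which is why blob shadows are the cheaper option.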
Interaction SDK Overview
The Meta XR Interaction SDK for Unity makes it easy for VR users to immersively interact with their virtual environment. With Interaction SDK, you can grab and scale objects, push buttons, teleport, navigate user interfaces, and more while using controllers or just your physical hands.
To try out Interaction SDK on a supported device without any required setup, see the Try Interaction SDK section.
Features
Interaction SDK offers many features to create an immersive XR experience.
Multiple ways to grab an object, including grabbing it up close or from far away, or grabbing it dynamically based on colliders.
Pressing buttons using raycasting.
Pressing buttons using poking, and scrolling on a UI canvas.
Teleporting to spots or surfaces in the environment, and turning so you can rotate in place.
The ability to customize how your hands grab an object on both Mac and PC.
Manipulating objects, including scaling and moving objects freely or along fixed axes.
Snapping objects to a location (e.g., items to an in-game inventory).
Creating and detecting custom gestures, including stationary gestures, like a thumbs-up, or moving gestures, like a wave.
Support for custom hand models.
Customizable User Interface (UI) prefabs.
Throwing objects.
Here’s a video showing the features in action.
Try Interaction SDK
To try Interaction SDK interactions without any setup, you can download one of the following apps.
Interaction SDK Samples, the official reference for Interaction SDK features.
First Hand, an official hand tracking demo built by Meta.
Move Fast, a short showcase of Interaction SDK being used in fast, fitness-type apps.
Links to the source code for First Hand and Move Fast are in the Related Topics section.
Supported devices
Quest 1
Quest 2
Quest 3
Quest Pro
Supported Unity versions
Unity 2021 LTS (2021.3)
Unity 2022 LTS (2022.3)
OpenXR compatibility
Interaction SDK supports OpenXR via the Oculus OpenXR backend. Unity’s OpenXR plugin is not supported at this time. For more information, see XR Plugin.
Directory layout
Interaction SDK follows the standard Unity UPM layout and contains three root folders, each with its own Assembly Definition (.asmdef):
Editor (Oculus.Interaction.Editor): Contains all Editor code for Interaction SDK.
Runtime (Oculus.Interaction): Contains the core runtime components of Interaction SDK.
Samples (Oculus.Interaction.Samples): Contains sample scenes, prefabs, and scripts. This directory is optional to import.
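For reference, an Assembly Definition is a small JSON file that names the assembly Unity compiles from that folder. A minimal sketch of what the runtime one might contain (every field value besides the assembly name is an illustrative assumption, not the shipped file):

```json
{
    "name": "Oculus.Interaction",
    "references": [],
    "includePlatforms": [],
    "excludePlatforms": [],
    "autoReferenced": true
}
```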
Dependencies
Interaction SDK depends on the following package:
[SDK] Oculus Core (com.oculus.integration.vr)
The Samples directory additionally depends on the following package:
TextMeshPro (com.unity.textmeshpro)
Related topics
For a video overview of inputs, hand tracking, and Interaction SDK, see Connect 2022.
For a video tutorial of how to get started with Interaction SDK, see Building Intuitive Interactions in VR.