View the Controller Animations
The animation clips can be used to animate the buttons, triggers, and thumbsticks on the controller models.

To view the controller animations:

1. In the Project view, expand the Meta > VR > Meshes folder.
2. Expand the controller folder.
3. Click the controller animation.
4. In the Inspector view, select the Animation tab.
5. Watch the animation in the Preview view.
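The clips can also be enumerated from an editor script instead of clicking through the Project view. A minimal sketch using Unity's AssetDatabase; the search path below is an assumption (the Meta > VR > Meshes folder commonly lives under Assets/Oculus/VR/Meshes, but the location can differ by SDK version):

```csharp
// Editor-only utility: logs every AnimationClip under the controller mesh folder.
// Place this file in an Editor folder. The search path is an assumption; adjust
// it to wherever the Meta > VR > Meshes folder lives in your project.
using UnityEditor;
using UnityEngine;

public static class ListControllerAnimations
{
    [MenuItem("Tools/List Controller Animations")]
    private static void ListClips()
    {
        string[] guids = AssetDatabase.FindAssets(
            "t:AnimationClip", new[] { "Assets/Oculus/VR/Meshes" });

        foreach (string guid in guids)
        {
            string path = AssetDatabase.GUIDToAssetPath(guid);
            var clip = AssetDatabase.LoadAssetAtPath<AnimationClip>(path);
            Debug.Log($"{clip.name} ({clip.length:0.00}s) at {path}");
        }
    }
}
```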
List of available controller animations

For each of the Meta headset devices, the following buttons have controller animations. For more about mapping, see Touch Input Mapping in Map Controllers.

For each of the controller types, the following animations are available:

- button01
- button02
- button03
- button04
- trigger
- grip
- sticks
- stickN
- stickNE
- stickSE
- stickS
- stickSW
- stickW
- stickNW
- button01_neutral
- button02_neutral
- trigger_neutral
- grip_neutral
Use the Animation Controllers

The animation controllers layer and blend the button, trigger, and thumbstick animations based on controller input.

To view the animation controllers:

1. In the Project view, expand the Meta > VR > Meshes folder.
2. Expand the controller folder.
3. Click the Animation folder.
Example Implementation

OVRControllerPrefab uses these animation controllers, and it is a good example of how they can be combined with custom scripts.

To view the OVRControllerPrefab, go to the Project view and expand the Meta > VR > Prefabs folder.

Below is a simple example implementation that binds the animations to the controller buttons.
|
if (m_animator != null)
|
{
|
m_animator.SetFloat("Button 1", OVRInput.Get(OVRInput.Button.One, m_controller) ? 1.0f : 0.0f);
|
m_animator.SetFloat("Button 2", OVRInput.Get(OVRInput.Button.Two, m_controller) ? 1.0f : 0.0f);
|
m_animator.SetFloat("Button 3", OVRInput.Get(OVRInput.Button.Start, m_controller) ? 1.0f : 0.0f);
|
m_animator.SetFloat("Joy X", OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, m_controller).x);
|
m_animator.SetFloat("Joy Y", OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, m_controller).y);
|
m_animator.SetFloat("Trigger", OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, m_controller));
|
m_animator.SetFloat("Grip", OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, m_controller));
|
}
|
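For context, code like the snippet above typically runs every frame from a MonoBehaviour. Below is a minimal sketch of the surrounding class; the class name and serialized fields are assumptions inferred from the snippet, not a verbatim copy of OVRControllerHelper.cs, and the animator parameter names must match those defined in the animation controller:

```csharp
using UnityEngine;

// Hypothetical wrapper class; field names mirror the snippet above.
public class ControllerAnimationDriver : MonoBehaviour
{
    // Assigned in the Inspector: the Animator on the controller model,
    // and which controller this model represents (LTouch or RTouch).
    [SerializeField] private Animator m_animator;
    [SerializeField] private OVRInput.Controller m_controller = OVRInput.Controller.RTouch;

    private void Update()
    {
        if (m_animator == null) return;

        // Buttons are binary, so map pressed/released to 1.0/0.0.
        m_animator.SetFloat("Button 1", OVRInput.Get(OVRInput.Button.One, m_controller) ? 1.0f : 0.0f);
        m_animator.SetFloat("Button 2", OVRInput.Get(OVRInput.Button.Two, m_controller) ? 1.0f : 0.0f);

        // Thumbstick and triggers are already analog values.
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, m_controller);
        m_animator.SetFloat("Joy X", stick.x);
        m_animator.SetFloat("Joy Y", stick.y);
        m_animator.SetFloat("Trigger", OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, m_controller));
        m_animator.SetFloat("Grip", OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, m_controller));
    }
}
```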
See OVRControllerHelper.cs for a more detailed implementation of controller animations.

Runtime Controllers
Runtime controller models load controller models dynamically from the Meta Quest software to match the controllers in use. This feature is available only for Meta Quest apps built with the OpenXR backend. To switch from the legacy VRAPI backend to the OpenXR backend, see the Switch Between OpenXR and Legacy VRAPI documentation.
Set Up Runtime Controller Prefab

The OVRRuntimeControllerPrefab renders the controller models dynamically and applies a default shader. The prefab uses the OVRRuntimeController.cs script, which queries and renders the controller models. To set up the runtime controller prefab:

1. Create a new scene or open an existing one from your project.
2. From the Project tab, search for OVRCameraRig and drag it into the scene. Skip this step if OVRCameraRig already exists in the scene.
3. From the Hierarchy tab, expand OVRCameraRig > TrackingSpace to find LeftControllerAnchor and RightControllerAnchor, where you will add the runtime controller prefabs.
4. From the Project tab, in the search box, type OVRRuntimeControllerPrefab, and drag the prefab under both LeftControllerAnchor and RightControllerAnchor.
5. Under LeftControllerAnchor, select OVRRuntimeControllerPrefab, and then in the Inspector tab, from the Controller list, select L Touch to map the controller. Repeat this step to map the right-hand runtime controller to R Touch.
6. If you prefer a different shader than the default, select it from the Shader list.
7. From the Hierarchy tab, select OVRCameraRig to open the OVR Manager settings in the Inspector tab.
8. Under OVR Manager > Quest Features > General tab, from the Render Model Support list, select Enabled.
9. If your project contains OVRControllerPrefab, from the Hierarchy tab, expand OVRCameraRig > TrackingSpace > LeftControllerAnchor, select OVRControllerPrefab, and then in the Inspector tab, clear the checkbox to disable the prefab. Repeat this step to disable OVRControllerPrefab under RightControllerAnchor.
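If you prefer to wire this up in code rather than dragging prefabs in the Editor, the parenting described above can be done at startup. A minimal sketch, assuming the prefab and anchor references are assigned in the Inspector (the class name is hypothetical; the Controller field on each instance still needs to be set to L Touch / R Touch as described above):

```csharp
using UnityEngine;

// Hypothetical setup helper; references are assigned in the Inspector.
public class RuntimeControllerSetup : MonoBehaviour
{
    [SerializeField] private GameObject m_runtimeControllerPrefab; // OVRRuntimeControllerPrefab
    [SerializeField] private Transform m_leftControllerAnchor;     // under OVRCameraRig > TrackingSpace
    [SerializeField] private Transform m_rightControllerAnchor;

    private void Start()
    {
        // Parent one prefab instance under each hand anchor, keeping the
        // prefab's local transform so it stays aligned with the anchor.
        Instantiate(m_runtimeControllerPrefab, m_leftControllerAnchor, false);
        Instantiate(m_runtimeControllerPrefab, m_rightControllerAnchor, false);
    }
}
```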
Use APIs in Customized Scripts

To attach a customized script instead of the default OVRRuntimeController.cs script, use the following APIs to query and render the controller models. For more information about implementing the controller model APIs, refer to the OVRRuntimeController.cs script located in the Oculus/VR/Scripts/Util folder.

- OVRPlugin.GetRenderModelPaths() - Returns the list of model paths supported by the runtime. These paths are defined in the OpenXR spec as /model_fb/controller/left, /model_fb/controller/right, /model_fb/keyboard/remote, and /model_fb/keyboard/local. If a model is not supported by the runtime, its path is excluded from the returned list.
- OVRPlugin.GetRenderModelProperties(modelPath, ref modelProperties) - Given a model path, returns the properties for that model. The properties include the model key, which is used to load the model, as well as the model name, vendor ID, and version.
- OVRPlugin.LoadRenderModel(modelKey) - Loads the model identified by the model key. The model returned is a glTF binary (GLB) that includes a KTX2 texture, which can be loaded using the provided OVRGLTFLoader.
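In a custom script, the three calls above are typically chained: enumerate the supported paths, look up the model key for a path, then load the GLB bytes. A minimal sketch modeled on OVRRuntimeController.cs; the exact type and member names (such as OVRPlugin.RenderModelProperties and the OVRGLTFLoader result) are assumptions and may vary by SDK version:

```csharp
using UnityEngine;

// Hypothetical loader; see OVRRuntimeController.cs for the reference implementation.
public class CustomRenderModelLoader : MonoBehaviour
{
    private void Start()
    {
        // One of the model paths defined by the OpenXR spec (listed above).
        TryLoadModel("/model_fb/controller/left");
    }

    private void TryLoadModel(string modelPath)
    {
        // 1. Check that the runtime supports this model path.
        string[] paths = OVRPlugin.GetRenderModelPaths();
        if (paths == null || System.Array.IndexOf(paths, modelPath) < 0)
        {
            Debug.LogWarning($"Runtime does not support {modelPath}");
            return;
        }

        // 2. Query the properties to obtain the model key.
        var props = new OVRPlugin.RenderModelProperties();
        if (!OVRPlugin.GetRenderModelProperties(modelPath, ref props))
            return;

        // 3. Load the GLB bytes and hand them to the glTF loader.
        byte[] data = OVRPlugin.LoadRenderModel(props.ModelKey);
        if (data == null)
            return;

        var loader = new OVRGLTFLoader(data);
        GameObject model = loader.LoadGLB().root; // result shape may differ by SDK version
        model.transform.SetParent(transform, false);
    }
}
```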