Under Inspector, in the Hand Grab Interactor component, set the Velocity Calculator property to the Hand Velocity Calculator prefab that’s located at Left Hand > Hand Velocity Calculator.
Velocity calculator property
Repeat these steps for RightHand.
(Optional) Put the cube on top of a platform, like a table, so it doesn’t fall to the ground when you start the scene.
Prepare to launch your scene by going to File > Build Settings and clicking the Add Open Scenes button.
Your scene is now ready to build.
Select File > Build And Run, or if you have an Oculus Link connected, click Play.
You can now grab and throw the cube.
Map Controllers
Note
To get started with controller tracking, see the Interaction SDK.
OVRInput exposes a unified input API for multiple controller types.
It is used to query virtual or raw controller state, such as buttons, thumbsticks, triggers, and capacitive touch data. It supports the Meta Quest Touch controllers.
For keyboard and mouse control, we recommend using the UnityEngine.Input scripting API (see Unity’s Input scripting reference for more information).
Mobile input bindings are automatically added to InputManager.asset if they do not already exist.
For more information, see OVRInput in the Unity Scripting Reference guide. For more information on Unity’s input system and Input Manager, see here: http://docs.unity3d.com/Manual/Input.html and http://docs.unity3d.com/ScriptReference/Input.html.
Requirements
Include an instance of OVRManager anywhere in your scene.
Call OVRInput.Update() and OVRInput.FixedUpdate() once per frame at the beginning of any component’s Update and FixedUpdate methods, respectively.
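The two requirements above can be satisfied by a small driver component; the following is a minimal sketch (the class name OVRInputPump is illustrative, and note that an OVRManager instance in the scene typically performs these calls for you):

```csharp
using UnityEngine;

// Illustrative sketch: a component that pumps OVRInput once per frame.
// The class name is hypothetical; when an OVRManager instance is present
// in the scene it normally handles these calls itself.
public class OVRInputPump : MonoBehaviour
{
    void Update()
    {
        // Refresh controller state at the start of the frame,
        // before any other component reads OVRInput.
        OVRInput.Update();
    }

    void FixedUpdate()
    {
        // Keep physics-rate consumers in sync as well.
        OVRInput.FixedUpdate();
    }
}
```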
Touch Tracking
OVRInput provides Touch controller position and orientation data through GetLocalControllerPosition() and GetLocalControllerRotation(), which return a Vector3 and Quaternion, respectively.
Controller poses are returned by the tracking system and are predicted simultaneously with the headset. These poses are reported in the same coordinate frame as the headset, relative to the initial center eye pose, and can be used for rendering hands or objects in the 3D world. They are also reset by OVRManager.display.RecenterPose(), similar to the head and eye poses.
Note: Meta Quest Touch controllers are differentiated with Primary and Secondary in OVRInput: Primary always refers to the left controller and Secondary always refers to the right controller.
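As a sketch, the pose queries above can drive a hand or controller model each frame. This assumes the object is parented under the rig's tracking space so the local pose values line up; the class name is illustrative:

```csharp
using UnityEngine;

// Illustrative sketch: mirror the left Touch controller's tracked pose
// onto this object. Poses are relative to the initial center eye pose.
public class LeftHandFollower : MonoBehaviour
{
    void Update()
    {
        transform.localPosition =
            OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch);
        transform.localRotation =
            OVRInput.GetLocalControllerRotation(OVRInput.Controller.LTouch);
    }
}
```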
OVRInput Usage
The primary usage of OVRInput is to access controller input state through Get(), GetDown(), and GetUp().
Get() queries the current state of a controller.
GetDown() queries if a controller was pressed this frame.
GetUp() queries if a controller was released this frame.
Control Input Enumerations
There are multiple variations of Get() that provide access to different sets of controls. These sets of controls are exposed through enumerations defined by OVRInput as follows:
OVRInput.Button: Traditional buttons found on gamepads, controllers, and the back button.
OVRInput.Touch: Capacitive-sensitive control surfaces found on the controller.
OVRInput.NearTouch: Proximity-sensitive control surfaces found on the controller.
OVRInput.Axis1D: One-dimensional controls, such as triggers, that report a floating-point state.
OVRInput.Axis2D: Two-dimensional controls, such as thumbsticks, that report a Vector2 state.
A secondary set of enumerations mirrors the first, defined as follows:
OVRInput.RawButton
OVRInput.RawTouch
OVRInput.RawNearTouch
OVRInput.RawAxis1D
OVRInput.RawAxis2D
The first set of enumerations provides a virtualized input mapping that is intended to assist developers with creating control schemes that work across different types of controllers. The second set of enumerations provides raw unmodified access to the underlying state of the controllers. We recommend using the first set of enumerations, since the virtual mapping provides useful functionality, as demonstrated below.
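The difference between the two mappings can be seen in a pair of equivalent queries; the virtual mapping also accepts an optional controller mask to narrow the query to one controller:

```csharp
// Virtual mapping: Button.One resolves to the primary button
// (typically "A") regardless of which controller type is active.
bool pressedVirtual = OVRInput.Get(OVRInput.Button.One);

// Raw mapping: query the physical "A" button directly.
bool pressedRaw = OVRInput.Get(OVRInput.RawButton.A);

// Restrict the virtual query to the right Touch controller.
bool pressedOnRight =
    OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.RTouch);
```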
Button, Touch, and NearTouch
In addition to traditional gamepad buttons, the controllers feature capacitive-sensitive control surfaces which detect when the user’s fingers or thumbs make physical contact (Touch), as well as when they are in close proximity (NearTouch). This allows for detecting several distinct states of a user’s interaction with a specific control surface. For example, if a user’s index finger is fully removed from a control surface, the NearTouch for that control will report false. As the user’s finger approaches the control and gets within close proximity to it, the NearTouch will report true prior to the user making physical contact. When the user makes physical contact, the Touch for that control will report true. When the user pushes the index trigger down, the Button for that control will report true. These distinct states can be used to accurately detect the user’s interaction with the controller and enable a variety of control schemes.
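The progression described above can be sketched for the right index trigger as three successive queries, from firmest interaction to lightest:

```csharp
// Sketch: classify the user's finger state on the right index trigger.
if (OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
{
    // Trigger pressed past the button threshold.
}
else if (OVRInput.Get(OVRInput.Touch.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
{
    // Finger resting on the trigger without pressing it.
}
else if (OVRInput.Get(OVRInput.NearTouch.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
{
    // Finger hovering in close proximity to the trigger.
}
else
{
    // Finger fully lifted away from the trigger.
}
```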
Example Usage
// returns true if the primary button (typically "A") is currently pressed.
OVRInput.Get(OVRInput.Button.One);
// returns true if the primary button (typically "A") was pressed this frame.
OVRInput.GetDown(OVRInput.Button.One);
// returns true if the "X" button was released this frame.
OVRInput.GetUp(OVRInput.RawButton.X);
// returns a Vector2 of the primary (typically the left) thumbstick's current state.
// (X/Y range of -1.0f to 1.0f)
OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
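Putting these queries together, a component might poll input each frame. The following is a sketch under the requirements stated earlier (OVRManager present, OVRInput pumped once per frame); the class name is illustrative:

```csharp
using UnityEngine;

// Illustrative sketch: poll OVRInput from a component's Update loop.
public class InputExample : MonoBehaviour
{
    void Update()
    {
        // Analog value of the left index trigger, 0.0f to 1.0f.
        float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger);

        // Left thumbstick deflection; each axis ranges -1.0f to 1.0f.
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

        // Edge-triggered: true only on the frame the button goes down.
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            Debug.Log($"A pressed; trigger={trigger:F2}, stick={stick}");
        }
    }
}
```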