Hand_Thumb0 = Hand_Start + 2 // thumb trapezium bone
Hand_Thumb1 = Hand_Start + 3 // thumb metacarpal bone
Hand_Thumb2 = Hand_Start + 4 // thumb proximal phalange bone
Hand_Thumb3 = Hand_Start + 5 // thumb distal phalange bone
Hand_Index1 = Hand_Start + 6 // index proximal phalange bone
Hand_Index2 = Hand_Start + 7 // index intermediate phalange bone
Hand_Index3 = Hand_Start + 8 // index distal phalange bone
Hand_Middle1 = Hand_Start + 9 // middle proximal phalange bone
Hand_Middle2 = Hand_Start + 10 // middle intermediate phalange bone
Hand_Middle3 = Hand_Start + 11 // middle distal phalange bone
Hand_Ring1 = Hand_Start + 12 // ring proximal phalange bone
Hand_Ring2 = Hand_Start + 13 // ring intermediate phalange bone
Hand_Ring3 = Hand_Start + 14 // ring distal phalange bone
Hand_Pinky0 = Hand_Start + 15 // pinky metacarpal bone
Hand_Pinky1 = Hand_Start + 16 // pinky proximal phalange bone
Hand_Pinky2 = Hand_Start + 17 // pinky intermediate phalange bone
Hand_Pinky3 = Hand_Start + 18 // pinky distal phalange bone
Hand_MaxSkinnable = Hand_Start + 19
// Bone tips are position only. They are not used for skinning but are useful for hit-testing.
// NOTE: Hand_ThumbTip == Hand_MaxSkinnable since the extended tips need to be contiguous
Hand_ThumbTip = Hand_MaxSkinnable + 0 // tip of the thumb
Hand_IndexTip = Hand_MaxSkinnable + 1 // tip of the index finger
Hand_MiddleTip = Hand_MaxSkinnable + 2 // tip of the middle finger
Hand_RingTip = Hand_MaxSkinnable + 3 // tip of the ring finger
Hand_PinkyTip = Hand_MaxSkinnable + 4 // tip of the pinky
Hand_End = Hand_MaxSkinnable + 5
Max = Hand_End + 0
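The bone IDs above can be used to look up individual bone transforms at runtime. The following is a minimal sketch, assuming a script placed on the same GameObject as the OVRSkeleton component from the OVRHandPrefab; the class name FingertipReader is illustrative.

```csharp
using UnityEngine;

// Sketch: reading a fingertip transform from OVRSkeleton for hit-testing.
public class FingertipReader : MonoBehaviour
{
    private OVRSkeleton _skeleton;

    void Start()
    {
        _skeleton = GetComponent<OVRSkeleton>();
    }

    void Update()
    {
        if (_skeleton == null || !_skeleton.IsDataValid)
            return;

        // Bone tips (Hand_ThumbTip..Hand_PinkyTip) are position-only and
        // are intended for hit-testing rather than skinning.
        foreach (var bone in _skeleton.Bones)
        {
            if (bone.Id == OVRSkeleton.BoneId.Hand_IndexTip)
            {
                Vector3 tipPosition = bone.Transform.position;
                // Use tipPosition for proximity checks or hit-testing.
            }
        }
    }
}
```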
Add Interactions
To standardize interactions across apps, OVRHand provides access to the filtered pointer pose and pinch gesture detection, so that your app conforms to the same interaction models as Oculus system apps. Simple apps that only require point-and-click interactions can use the pointer pose to treat hands as a simple pointing device, with the pinch gesture acting as the click action.
Selection
Pinch
Pinch is the basic interaction primitive for UI interactions using hands. A successful pinch of the index finger can be considered the same as a normal select or trigger action for a controller, i.e., the action that activates a button or other control on a UI.
OVRHand provides methods to detect whether a finger is currently pinching, the pinch's strength, and the confidence level of the finger pose. Based on the values returned, you can provide feedback to the user, for example by changing the color of the fingertip, playing an audible pop when the fingers have fully pinched, or integrating physics interactions based on the pinch status.
Call the GetFingerIsPinching() method and pass the finger constant to check whether that finger is currently pinching. The method returns a boolean indicating the current pinch state. The finger constants are: Thumb, Index, Middle, Ring, and Pinky.
Call the GetFingerPinchStrength() method and pass the finger constant to check the progression and strength of a pinch gesture. The returned value ranges from 0 to 1, where 0 indicates no pinch and 1 is a full pinch with the finger touching the thumb.
Call the GetFingerConfidence() method and pass the finger constant to measure the confidence level of the finger pose. It returns the value as Low or High, indicating how much confidence the tracking system has in the finger pose.
var hand = GetComponent<OVRHand>(); // OVRHand on the same GameObject
bool isIndexFingerPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
float ringFingerPinchStrength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Ring);
OVRHand.TrackingConfidence confidence = hand.GetFingerConfidence(OVRHand.HandFinger.Index);
Pointer Pose
Deriving a stable pointing direction from a tracked hand is a non-trivial task involving filtering, gesture detection, and other factors. OVRHand provides a pointer pose so that pointing interactions can be consistent across Meta Quest apps. It indicates the origin and direction of the pointing ray in tracking space. We highly recommend that you use PointerPose to determine the direction the user is pointing for UI interactions.
The pointer pose may or may not be valid, depending on the user's hand position, tracking status, and other factors. Check the IsPointerPoseValid property to determine whether the pointer pose is valid. If it is, use the ray for UI hit testing; otherwise, avoid using it and do not render the ray.
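The validity check and ray construction can be sketched as follows, assuming an OVRHand component on the same GameObject and an optional LineRenderer (rayVisual, an illustrative name) for visualization.

```csharp
using UnityEngine;

// Sketch: using PointerPose for UI ray casting.
public class HandPointer : MonoBehaviour
{
    public LineRenderer rayVisual; // optional ray visualization (assumed field)
    private OVRHand _hand;

    void Start()
    {
        _hand = GetComponent<OVRHand>();
    }

    void Update()
    {
        if (_hand.IsPointerPoseValid)
        {
            Transform pointer = _hand.PointerPose;
            Ray ray = new Ray(pointer.position, pointer.forward);
            // Use ray for UI hit testing, e.g. Physics.Raycast(ray, out var hit).
            if (rayVisual != null)
            {
                rayVisual.enabled = true;
                rayVisual.SetPosition(0, ray.origin);
                rayVisual.SetPosition(1, ray.origin + ray.direction * 2f);
            }
        }
        else if (rayVisual != null)
        {
            rayVisual.enabled = false; // avoid rendering an invalid ray
        }
    }
}
```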
Track Hand Confidence
At any point, you may want to check whether your app detects hands. Check the IsTracked property to verify that hands are currently visible and not occluded from the headset's tracking. To check the tracking system's confidence in the overall hand pose, read the HandConfidence property, which returns either Low or High. We recommend using hand pose data for rendering and interactions only when hands are visible and the confidence level is High.
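These two checks are commonly combined into a single gate, as in this minimal sketch assuming an OVRHand component on the same GameObject:

```csharp
// Sketch: gating rendering and interaction on tracking state.
var hand = GetComponent<OVRHand>();

bool isReliable = hand.IsTracked &&
                  hand.HandConfidence == OVRHand.TrackingConfidence.High;

// Only drive rendering and interactions from pose data when isReliable is true.
```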
Get Hand Scale
Read the HandScale property to get the scale of the user's hand relative to the default hand model scale of 1.0. For example, a value of 1.05 indicates the user's hand is 5% larger than the default hand model. The value may change at any time, so use it to scale the hand for rendering and interaction simulation at runtime.
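Applying the scale to a hand visual can be sketched as follows; handVisual is an assumed Transform reference to a custom hand model authored at the default scale of 1.0.

```csharp
// Sketch: applying the user's hand scale to a custom hand visual each frame.
var hand = GetComponent<OVRHand>();
float scale = hand.HandScale;              // e.g. 1.05 = 5% larger than default
handVisual.localScale = Vector3.one * scale; // handVisual is an assumed Transform
```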
Check System Gestures
The system gesture is a reserved gesture that lets users transition to the Meta Quest universal menu. It occurs when a user holds up their dominant hand with the palm facing them and pinches with their index finger. The pinching fingers turn light blue as the pinch begins. When the user performs the gesture with the non-dominant hand, it triggers the Button.Start event. You can poll Button.Start to handle the button press event in your app logic.
To detect the dominant hand, check the IsDominantHand property. If true, check whether the user is performing a system gesture by reading the IsSystemGestureInProgress property. If a system gesture is in progress, we recommend providing visual feedback, such as rendering the hand material with a different color or a highlight, and suspending any custom gesture processing. This avoids triggering gesture-based events while the user intends to transition to the Meta Quest universal menu.
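The recommended handling above can be sketched as follows, assuming an OVRHand component on the same GameObject:

```csharp
// Sketch: suspending custom gesture processing during the system gesture.
var hand = GetComponent<OVRHand>();

if (hand.IsSystemGestureInProgress)
{
    // Optionally highlight the hand material here, and skip custom gestures.
    return;
}

if (hand.IsDominantHand)
{
    // Safe to run custom gesture detection for the dominant hand.
}

// The non-dominant hand's system gesture maps to Button.Start:
if (OVRInput.GetDown(OVRInput.Button.Start))
{
    // Handle the menu-style button press.
}
```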
Troubleshooting
The following questions help you troubleshoot issues you may encounter while rendering and integrating hands in your app:
Why don’t I see hands in my app?
There can be many reasons why hands are not rendering in your app. To begin with, verify that hand tracking is enabled on the device and that hands work correctly in the system menus. Then ensure that you have used the OVRHandPrefab to add hands to the scene.
Why do I see blurry/faded hands?
Your hands may not be properly tracked since the cameras on the Meta Quest headset have a limited field of view. Make sure the hands are closer to the front of the Meta Quest headset for better tracking.
Can I use another finger besides the index finger for the pinch gesture?
Yes. Use the OVRHand.GetFingerIsPinching() method from OVRHand.cs with the finger that you want to track instead. For more information about tracking fingers, go to the Add Interactions section.
Understanding Hand Tracking Limitations
Hand tracking for Meta Quest is currently an experimental feature with some limitations. While these limitations may be reduced or even eliminated over time, they are currently part of the expected behavior. For more specific issues, go to the Troubleshooting section.
Occlusion
Tracking may be lost, or hand confidence may become low when one hand occludes another. In general, an app should respond to this by fading the hands away.
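Fading the hands away can be sketched as a simple alpha ramp driven by tracking state; handRenderer is an assumed Renderer reference whose material exposes a color with an alpha channel.

```csharp
// Sketch: fading the hand out when tracking is lost or confidence is low.
var hand = GetComponent<OVRHand>();
bool visible = hand.IsTracked &&
               hand.HandConfidence == OVRHand.TrackingConfidence.High;

Color c = handRenderer.material.color;          // handRenderer is assumed
float targetAlpha = visible ? 1f : 0f;
c.a = Mathf.MoveTowards(c.a, targetAlpha, Time.deltaTime * 4f); // smooth fade
handRenderer.material.color = c;
```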
Noise
Hand tracking can exhibit some noise. It may be affected by lighting and environmental conditions. You should take these conditions into consideration when developing algorithms for gesture detection.
Controllers + Hands
Controllers and hands are not currently tracked at the same time. Apps should support either hands or controllers, but not at the same time.
Lighting
Hand tracking has different lighting requirements than inside-out (head) tracking. In some situations, this could result in functional differences between head tracking and hand tracking, where one may work while the other has stopped functioning.