https://developer.oculus.com/documentation/unity/unity-handtracking
https://skarredghost.com/2020/01/03/how-to-oculus-quest-hands-sdk-unity/
Integrating Pinch
var hand = GetComponent<OVRHand>();
// HandFinger is nested inside OVRHand, so qualify it when calling from your own scripts.
bool isIndexFingerPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
float ringFingerPinchStrength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Ring);
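To see those calls in context, here is a minimal polling sketch. It assumes the script is attached to the same GameObject as the OVRHand component (e.g. the OVRHandPrefab under the camera rig's hand anchors); the class name PinchLogger is mine, not from the SDK.

using UnityEngine;

public class PinchLogger : MonoBehaviour
{
    OVRHand hand;

    void Start()
    {
        hand = GetComponent<OVRHand>();
    }

    void Update()
    {
        // GetFingerIsPinching is true while the finger tip and thumb are touching.
        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            // Strength ramps from 0 (fingers apart) to 1 (fully pinched).
            float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
            Debug.Log($"Index pinch, strength {strength:F2}");
        }
    }
}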
// You can also query per-finger tracking confidence (Low or High).
// TrackingConfidence is nested inside OVRHand as well.
var hand = GetComponent<OVRHand>();
OVRHand.TrackingConfidence confidence = hand.GetFingerConfidence(OVRHand.HandFinger.Index);
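In practice you would gate pinch input on that confidence value so low-quality tracking does not trigger actions. A small sketch; the helper name IsReliableIndexPinch is mine:

// Only treat a pinch as input when tracking confidence for the finger is High.
bool IsReliableIndexPinch(OVRHand hand)
{
    return hand.GetFingerConfidence(OVRHand.HandFinger.Index) == OVRHand.TrackingConfidence.High
        && hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
}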
Integrating Pointer Pose
Deriving a stable pointing direction from a tracked hand is non-trivial: it involves filtering, gesture detection, and other factors. OVRHand.cs provides a pointer pose so that pointing interactions can be consistent across apps. It gives the origin and direction of the pointing ray in tracking space. We recommend using PointerPose to determine the direction the user is pointing for UI interactions.
- Read the PointerPose property from OVRHand.cs (it is exposed as a Transform).
The pointer pose may or may not be valid, depending on the user's hand position, tracking status, and other factors. Check the IsPointerPoseValid
property, which returns a boolean indicating whether the pointer pose is valid. If it is valid, you can use the ray for UI hit testing. If it is not (for example, the hand has left the sensors' view or tracking confidence has dropped), the reported pose is stale or meaningless, so skip hit testing and hide the ray instead of rendering it at a bogus position.
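Putting PointerPose and IsPointerPoseValid together, a minimal ray sketch. It assumes the script sits next to the OVRHand component on a hand prefab parented under the OVRCameraRig hand anchor (so the PointerPose transform's world pose is meaningful) and that a LineRenderer is assigned in the Inspector; the class and field names are my own.

using UnityEngine;

public class HandPointer : MonoBehaviour
{
    public LineRenderer ray;          // assigned in the Inspector
    public float maxRayLength = 5f;   // arbitrary length of my choosing

    OVRHand hand;

    void Start()
    {
        hand = GetComponent<OVRHand>();
    }

    void Update()
    {
        // Hide the ray whenever the pointer pose is not valid.
        if (!hand.IsPointerPoseValid)
        {
            ray.enabled = false;
            return;
        }

        Transform pose = hand.PointerPose;
        Vector3 end = pose.position + pose.forward * maxRayLength;

        // Use the pose for hit testing; stop the ray at whatever it hits.
        if (Physics.Raycast(pose.position, pose.forward, out RaycastHit hit, maxRayLength))
        {
            end = hit.point;
        }

        ray.enabled = true;
        ray.SetPosition(0, pose.position);
        ray.SetPosition(1, end);
    }
}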
I found all the answers in this sample framework.
https://developer.oculus.com/documentation/unity/unity-sample-framework/
Make sure the Oculus Integration package is v16 or later.