[1487] Hand tracking shown extremely differently to remote users
tracked
uzugu
When using VRChat's native hand tracking on PCVR, the hand movements that are displayed locally do not correspond accurately to what remote users see. The core issue appears to be that finger curl data is transmitted on a per-finger basis rather than per-joint, similar to the legacy Index-controller finger curl implementation. This leads to incorrect representation of specific hand signs, particularly those that involve fingers being straight but angled. Users employing custom OSC hand tracking do not face this problem, highlighting a discrepancy in how hand tracking data is processed and transmitted in VRChat.
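A toy sketch of why a single per-finger curl channel loses the "straight but angled" pose (the joint count, 0..1 curl range, and averaging scheme here are assumptions for illustration, not VRChat's actual netcode):

```python
# Assumes 3 knuckles per finger and curl values normalized to 0..1.
# The averaging/decoding scheme is illustrative, not VRChat's real code.

def encode_per_finger(joint_curls):
    """Collapse per-joint curls into one value, like a single finger-curl channel."""
    return sum(joint_curls) / len(joint_curls)

def decode_per_finger(curl, n_joints=3):
    """The remote side can only spread the single value evenly across joints."""
    return [curl] * n_joints

# Two visually distinct poses:
angled_but_straight = [0.75, 0.0, 0.0]    # bent at the base knuckle, straight beyond it
slightly_curled     = [0.25, 0.25, 0.25]  # gentle uniform curl along the finger

# Both collapse to the same single curl value...
assert encode_per_finger(angled_but_straight) == encode_per_finger(slightly_curled)

# ...so the remote player reconstructs the same pose for both.
print(decode_per_finger(encode_per_finger(angled_but_straight)))  # [0.25, 0.25, 0.25]
```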
Steps to Reproduce:
Use VRChat on PCVR with native hand tracking enabled.
Perform hand signs that require specific joint-based finger curls (e.g., fingers that are straight but angled).
Have a remote user observe the hand signs.
Compare the hand movements observed by the remote user with what is displayed locally.
Observed Result:
Remote users observe incorrect or misaligned hand movements due to finger curl data being transmitted per-finger instead of per-joint. This issue is consistent across different users and instances, regardless of whether VRChat is restarted or even reinstalled.
Expected Result:
Hand tracking data should be transmitted per-joint rather than per-finger, ensuring that remote users see an accurate representation of the local user’s hand movements. This is essential for the proper execution of Sign Language and other precise hand gestures in VRChat.
Frequency: Always
Impact:
This bug severely impacts users who rely on Sign Language for communication, as it prevents accurate representation of certain hand signs. The lack of precise hand tracking translation compromises the ability to communicate effectively using Sign Language in VRChat.
Additional Information:
The issue occurs across multiple users, instances, and devices.
Users with custom OSC hand tracking do not encounter this issue, suggesting a difference in how tracking data is handled.
Accurate transmission of hand movements, down to the individual joint level, is critical for Sign Language and other detailed gestures.
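For contrast, a custom OSC pipeline can simply send one value per knuckle, so nothing is collapsed before it reaches the remote side. A minimal sketch of building such messages (the address layout below is a hypothetical placeholder, not VRChat's or any avatar's real OSC schema):

```python
def per_joint_osc_messages(hand, finger, joint_curls):
    """Build one (address, value) pair per knuckle.
    NOTE: the /tracking/... paths are made up for illustration only."""
    return [
        (f"/tracking/{hand}/{finger}/joint{i}/curl", curl)
        for i, curl in enumerate(joint_curls)
    ]

# A "straight but angled" index finger: bent base knuckle, straight beyond it.
msgs = per_joint_osc_messages("left", "index", [0.75, 0.0, 0.0])
for address, value in msgs:
    print(address, value)
# /tracking/left/index/joint0/curl 0.75
# /tracking/left/index/joint1/curl 0.0
# /tracking/left/index/joint2/curl 0.0
```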
_tau_
Merged in a post:
[1485] Hand IK extremely offset between local and remote players
Kazy
These two pictures were taken at the same time. For me, our fingers are touching, but a remote player sees my hand a good couple of inches off.
_tau_
Merged in a post:
[1487] Only networking finger curl is not enough fidelity for finger/hand tracking
Kazy
VRChat only syncs Finger Curl between clients, which might be enough for people with Index controllers, but is not nearly precise enough for hand tracking.
I know you want to optimize network data, but I think more data should be synced if you are using hand tracking (would also help the sign language people out there)
Attached an example, first image is what I see, second image is what another user sees. It's night and day. (both images were taken using the latest VRChat beta)
_tau_
marked this post as
tracked
Kazy
I think I've figured part of it out: The network IK seems to be following the translucent hands rather than what I see my local avatar's hands at.
artGhostt
Tested in-game and there seem to be multiple levels of offset over the network:
- Hand position is offset;
- Finger curl isn't separated per knuckle; it seems to be normalized into a single curl applied to the whole finger (this looks like a networked IK optimization technique, but if it isn't, it should definitely be looked at).
(Quest Pro VD, 1486)
Kazy
artGhostt Examples: first image is what I see locally, second image is what a remote player sees. I can see why it'd be good to sync like this for Index, but it's not enough fidelity for hand tracking.
°sky
artGhostt Even if it's a way of reducing network usage, this defeats the entire point of implementing skeletal input. Would definitely like to see this solved by the VRC team.