Bug Reports

  • No off-topic posts
  • Don't report more than one issue at a time
  • For isolated issues or customer support visit help.vrchat.com
Thanks for your bug report!
[1487] Hand tracking displayed very differently to remote users
When using VRChat's native hand tracking on PCVR, the hand movements displayed locally do not correspond accurately to what remote users see. The core issue appears to be that finger curl data is transmitted per finger rather than per joint, similar to the legacy Index-controller finger-curl implementation. This leads to incorrect representation of specific hand signs, particularly those that involve fingers that are straight but angled. Users employing custom OSC hand tracking do not face this problem, highlighting a discrepancy in how hand-tracking data is processed and transmitted in VRChat.

Steps to Reproduce:
1. Use VRChat on PCVR with native hand tracking enabled.
2. Perform hand signs that require specific joint-based finger curls (e.g., fingers that are straight but angled).
3. Have a remote user observe the hand signs.
4. Compare the hand movements the remote user sees with what is displayed locally.

Observed Result: Remote users see incorrect or misaligned hand movements because finger curl data is transmitted per finger instead of per joint. The issue is consistent across different users and instances, regardless of whether VRChat is restarted or even reinstalled.

Expected Result: Hand-tracking data should be transmitted per joint rather than per finger, so that remote users see an accurate representation of the local user's hand movements. This is essential for the proper execution of Sign Language and other precise hand gestures in VRChat.

Frequency: Always

Impact: This bug severely affects users who rely on Sign Language to communicate, as it prevents accurate representation of certain hand signs. The lack of precise hand-tracking transmission compromises the ability to communicate effectively using Sign Language in VRChat.

Additional Information: The issue occurs across multiple users, instances, and devices. Users with custom OSC hand tracking do not encounter it, suggesting a difference in how tracking data is handled. Accurate transmission of hand movements, down to the individual joint level, is critical for Sign Language and other detailed gestures.
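To make the per-finger versus per-joint distinction concrete, here is a minimal Python sketch. The data layout, the three-joint finger, and the averaging step are assumptions for illustration only, not VRChat's actual network format; the point is simply that one curl value per finger cannot represent a finger that is bent sharply at the knuckle but otherwise straight.

```python
# Hypothetical illustration of per-joint vs. per-finger curl data.
# All names and the averaging scheme are assumptions, not VRChat's real format.
from dataclasses import dataclass
from typing import List


@dataclass
class PerJointFinger:
    # One curl value per joint (proximal, intermediate, distal),
    # as full hand tracking provides locally. 0.0 = straight, 1.0 = fully curled.
    joint_curls: List[float]


def to_per_finger_curl(finger: PerJointFinger) -> float:
    """Collapse a finger into a single curl value, as a legacy per-finger
    transmission scheme might do (assumed here: a simple average)."""
    return sum(finger.joint_curls) / len(finger.joint_curls)


def from_per_finger_curl(curl: float, joint_count: int = 3) -> PerJointFinger:
    """Reconstruct the finger on the remote side from one curl value:
    every joint necessarily gets the same amount of bend."""
    return PerJointFinger(joint_curls=[curl] * joint_count)


# A finger bent sharply at the knuckle but otherwise straight
# ("straight but angled"), as used in many hand signs.
local_finger = PerJointFinger(joint_curls=[0.9, 0.0, 0.0])

transmitted = to_per_finger_curl(local_finger)       # 0.3
remote_finger = from_per_finger_curl(transmitted)    # [0.3, 0.3, 0.3]

print("local :", local_finger.joint_curls)   # [0.9, 0.0, 0.0]
print("remote:", remote_finger.joint_curls)  # [0.3, 0.3, 0.3]: the angled pose is lost
```

The remote reconstruction shows every joint slightly curled instead of one joint sharply bent, which matches the misread hand signs described in the report.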
8 · tracked
Locomotion Animation Discrepancy Between Local and Remote Users in VRChat
An artist named TuxedoPato (Martin, https://x.com/TuxedoPato ) is currently developing a custom locomotion animation set for VRChat. During testing, we encountered a significant issue: the animation behaves differently on the local machine compared to how it appears to remote users. Specifically, the remote version appears unnaturally smoothed, losing much of its intended nuance.

We conducted several tests to isolate the problem. The key finding is that animation complexity directly affects the severity of the issue: simpler animations show minimal discrepancy, while more intricate ones suffer noticeable degradation. We tested three locomotion sets:

- Studio Moca's Standard Motion for Women ( https://moca-studio.booth.pm/items/5064825 ): Minimal quality loss, likely due to its relatively simple movement.
- VRSuya's Wriggling Locomotion ( https://vrsuya.booth.pm/items/4995578 ): Moderate degradation, especially in its exaggerated, erratic motions. I also discussed this with the creator, Levin, who confirmed similar concerns.
- TuxedoPato's Custom Animation: The most severe quality drop. It features numerous subtle movements and a complete overhaul of the locomotion system, making the smoothing effect particularly disruptive.

Additionally, we observed that animation fidelity is significantly better in VR mode than in Desktop mode, where the degradation is much more pronounced. This leads us to suspect that the issue is tied to VRChat's polling rate, server tick rate, or network-layer behaviour. It appears that locomotion animations are transmitted over the network rather than executed locally on each client. Despite extensive testing, we were unable to bypass this network behaviour. Several community members echoed our findings, noting that animation data seems to be filtered or compressed during transmission, even though such data arguably shouldn't require network mediation at all.

At this point, we've reached our limit. We're hoping for insight into whether this behaviour is intentional, a side effect of network optimization, or a bug in how locomotion animations are handled across clients.

For testing purposes, please check the following avatars, which have the locomotion created and set up for your convenience:

- Full locomotion test (walk, run, crouch, crouch run, prone, idle jump, crouch jump, prone jump): https://vrchat.com/home/avatar/avtr_d794808e-6321-400e-b010-8ac9261b9c9f
- Stripped-down version (walk, run, idle jump): https://vrchat.com/home/avatar/avtr_ade7fbe1-707e-4a22-82d5-05e6de7baeac
- The stripped-down locomotion system itself is downloadable from the link below; it has been heavily simplified to narrow the scope and reduce errors from other factors: https://drive.google.com/file/d/1xRSV3HTQQEBWUAy2_jLnRODdlcQnzM2t/view?usp=sharing
- Video comparison between local and remote users: https://youtu.be/86a_RUFllxY
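To make the suspected mechanism concrete, here is a minimal Python sketch. The 10 Hz send rate and the pose curve are assumptions chosen purely for illustration, not VRChat's actual tick rate or animation data; the sketch only shows that sampling a pose at a fixed network rate and interpolating between samples erases detail shorter than one network interval, which would produce exactly the "unnaturally smoothed" remote look described above.

```python
# Hypothetical sketch: if remote clients receive poses at a limited network
# rate and interpolate between them, motion detail shorter than one interval
# is lost or smeared. The 10 Hz rate and the curve are assumptions only.
import math

SEND_RATE_HZ = 10.0              # assumed network pose rate, for illustration
SEND_INTERVAL = 1.0 / SEND_RATE_HZ


def local_pose(t: float) -> float:
    """Joint angle played locally: a slow swing plus a 30 ms 'snap' of extra
    motion, the kind of subtle detail a hand-keyed locomotion set relies on."""
    swing = math.sin(2.0 * math.pi * t)
    snap = 0.5 if 0.55 <= t < 0.58 else 0.0
    return swing + snap


def remote_pose(t: float) -> float:
    """What a remote client would display if it only received local_pose()
    every SEND_INTERVAL seconds and linearly interpolated between samples."""
    i = math.floor(t / SEND_INTERVAL)
    t0, t1 = i * SEND_INTERVAL, (i + 1) * SEND_INTERVAL
    p0, p1 = local_pose(t0), local_pose(t1)
    alpha = (t - t0) / SEND_INTERVAL
    return p0 + (p1 - p0) * alpha


# The snap is clearly visible locally but never reaches the remote view,
# because it falls entirely between two network samples.
for step in range(50, 63):
    t = step / 100.0
    print(f"t={t:.2f}s  local={local_pose(t):+.3f}  remote={remote_pose(t):+.3f}")
```

In this toy model the smoothing gets worse the more fine detail the animation contains, which would be consistent with the complexity-dependent degradation observed across the three locomotion sets listed above.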
6 · tracked