Bug Reports

  • No off-topic posts
  • Don't report more than 1 issue at once
  • For isolated issues or customer support visit help.vrchat.com
Thanks for your bug report!
[1487] Hand tracking shown extremely differently to remote users
When using VRChat's native hand tracking on PCVR, the hand movements displayed locally do not correspond accurately to what remote users see. The core issue appears to be that finger curl data is transmitted per finger rather than per joint, similar to the legacy Index-controller finger curl implementation. This leads to incorrect representation of specific hand signs, particularly those that involve fingers being straight but angled. Users employing custom OSC hand tracking do not face this problem, highlighting a discrepancy in how hand tracking data is processed and transmitted in VRChat.

Steps to Reproduce:
1. Use VRChat on PCVR with native hand tracking enabled.
2. Perform hand signs that require specific joint-based finger curls (e.g., fingers that are straight but angled).
3. Have a remote user observe the hand signs.
4. Compare the hand movements observed by the remote user with what is displayed locally.

Observed Result: Remote users see incorrect or misaligned hand movements because finger curl data is transmitted per finger instead of per joint. The issue is consistent across different users and instances, regardless of whether VRChat is restarted or even reinstalled.

Expected Result: Hand tracking data should be transmitted per joint rather than per finger, so that remote users see an accurate representation of the local user's hand movements. This is essential for the proper execution of Sign Language and other precise hand gestures in VRChat.

Frequency: Always

Impact: This bug severely impacts users who rely on Sign Language for communication, as it prevents accurate representation of certain hand signs. The lack of precise hand-tracking translation compromises the ability to communicate effectively using Sign Language in VRChat.

Additional Information: The issue occurs across multiple users, instances, and devices. Users with custom OSC hand tracking do not encounter this issue, suggesting a difference in how tracking data is handled. Accurate transmission of hand movements, down to the individual joint level, is critical for Sign Language and other detailed gestures.
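The per-finger vs. per-joint distinction can be sketched as follows. This is a minimal illustration, not VRChat's actual data format: the three-joint finger model and the averaging step are assumptions chosen to show why a single curl value per finger cannot represent a "straight but angled" pose.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Finger:
    # Curl of each of the finger's three joints: 0.0 = straight, 1.0 = fully curled.
    joint_curls: List[float]

def send_per_finger(finger: Finger) -> float:
    # Legacy-style transmission: collapse three joint curls into one value.
    return sum(finger.joint_curls) / len(finger.joint_curls)

def reconstruct_per_finger(avg_curl: float) -> Finger:
    # A remote client receiving one value can only apply it to every joint.
    return Finger(joint_curls=[avg_curl] * 3)

# A "straight but angled" pose: knuckle bent, outer joints straight.
local = Finger(joint_curls=[0.75, 0.0, 0.0])
remote = reconstruct_per_finger(send_per_finger(local))
print(local.joint_curls)   # [0.75, 0.0, 0.0]
print(remote.joint_curls)  # [0.25, 0.25, 0.25] -- the angled-but-straight shape is lost
```

Transmitting all three joint curls (as per-joint OSC tracking effectively does) keeps the two poses distinguishable; collapsing them to one number per finger makes them identical on the wire.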
8 · tracked
VRChat Unity Animator Lag
The number of animation layers heavily affects performance due to unnecessary inverse kinematics calculations. Even blank animation layers with zero weight in the FX layer can trigger this behavior. The offender shows up in the profiler as Animators.IKAndTwistBoneJob: it runs as many times as there are layers in the currently loaded animation controller with the largest layer count. Although each individual run is short, the runs add up very quickly, and slower CPUs seem to be more adversely affected. The time taken by these extraneous calculations quickly grows larger than all normal animator activity.

This behavior only happens on animators that have a Unity avatar/armature. When that is removed, the behavior goes away and the number of layers ceases to be a significant cause of lag. Most FX layers do not interact with the armature at all, so it is likely that this behavior can be fixed for almost all FX layers. However, from debugging the Unity editor in Visual Studio, it appears that this happens in a native Unity function named Animator::UpdateAvatars. In most if not all cases, these calculations are likely wasted on FX layers: they provide no benefit while consuming significant amounts of main-thread time.

Somehow these calculations need to run only for layers that require them. Additionally, providing a way to read parameters in sub-animators would allow logic to be moved out of the main animator, sidestepping this problem entirely: sub-animators would not have a Unity avatar associated with them, so the issue would not occur. It would also allow completely disabling animators when not in use.

Please attempt to find a way to prevent this behavior, or consider convincing your Unity contacts to implement a fix. This seems to be one of the largest contributors to animator lag. I have a full write-up with more profiler images and further explanations: https://docs.google.com/document/d/1SpG7O30O0Cb5tQCEgRro8BixO0lRkrlV2o9Cbq-rzJU
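A back-of-envelope model shows why the per-layer job dominates. The formula below is an illustrative assumption (the per-job time and the exact scaling rule are not confirmed Unity internals), reflecting the report that the job count tracks the largest layer count among loaded controllers, multiplied across every animator with an avatar:

```python
def ik_job_time_ms(avatar_animators: int, max_layers: int, per_job_us: float = 15.0) -> float:
    """Estimated main-thread ms/frame spent in Animators.IKAndTwistBoneJob,
    assuming the job runs (max_layers) times for each animator with a Unity
    avatar, at per_job_us microseconds per run (hypothetical figure)."""
    return avatar_animators * max_layers * per_job_us / 1000.0

# 40 avatars in an instance, one of them carrying a 60-layer FX controller:
print(ik_job_time_ms(40, 60))  # 36.0 ms/frame
# Same instance if blank zero-weight layers no longer triggered the job:
print(ik_job_time_ms(40, 5))   # 3.0 ms/frame
```

Even with a tiny per-run cost, the product of avatar count and layer count quickly exceeds a whole frame budget, which matches the observation that the extraneous work outgrows all normal animator activity.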
1 · tracked
Locomotion Animation Discrepancy Between Local and Remote Users in VRChat
An artist named TuxedoPato (Martin, https://x.com/TuxedoPato ) is currently developing a custom locomotion animation set for VRChat. During testing, we encountered a significant issue: the animation behaves differently on the local machine compared to how it appears to remote users. Specifically, the remote version appears unnaturally smoothed, losing much of its intended nuance.

We conducted several tests to isolate the problem. Key findings and comparisons:

Animation complexity directly affects the severity of the issue. Simpler animations show minimal discrepancy, while more intricate ones suffer noticeable degradation. We tested three locomotion sets:
- Studio Moca's Standard Motion for Women ( https://moca-studio.booth.pm/items/5064825 ): Minimal quality loss, likely due to its relatively simple movement.
- VRSuya's Wriggling Locomotion ( https://vrsuya.booth.pm/items/4995578 ): Moderate degradation, especially in its exaggerated, erratic motions. I also discussed this with the creator, Levin, who confirmed similar concerns.
- TuxedoPato's Custom Animation: The most severe quality drop. It features numerous subtle movements and a complete overhaul of the locomotion system, making the smoothing effect particularly disruptive.

Additionally, we observed that animation fidelity is significantly better in VR mode than in Desktop mode, where the degradation is much more pronounced. This leads us to suspect that the issue may be tied to VRChat's polling rate, server tick rate, or network-layer behaviour. It appears that locomotion animations are being transmitted over the network rather than executed locally on each client. Despite extensive testing, we were unable to bypass this network behaviour. Several community members echoed our findings, noting that animation data seems to be filtered or compressed during transmission, even though such data arguably shouldn't require network mediation at all.
At this point, we’ve reached our limit. We’re hoping for insight into whether this behaviour is intentional, a side effect of network optimization, or a bug in how locomotion animations are handled across clients.

For testing purposes, please check the following avatars, which have the locomotion created and set up for your convenience:
- Full locomotion test (walk, run, crouch, crouch run, prone, idle jump, crouch jump, prone jump): https://vrchat.com/home/avatar/avtr_d794808e-6321-400e-b010-8ac9261b9c9f
- Stripped-down version (walk, run, idle jump): https://vrchat.com/home/avatar/avtr_ade7fbe1-707e-4a22-82d5-05e6de7baeac
- Stripped-down version of the locomotion system, downloadable below. This file is heavily simplified to narrow the scope and reduce errors from other factors: https://drive.google.com/file/d/1xRSV3HTQQEBWUAy2_jLnRODdlcQnzM2t/view?usp=sharing
- Video comparison between local and remote users: https://youtu.be/86a_RUFllxY
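The suspected mechanism can be sketched numerically. The model below is a hypothesis, not VRChat's actual netcode: it assumes remote clients receive pose snapshots at a fixed tick rate and linearly interpolate between them, which smooths away any motion detail faster than the tick rate — and the loss grows with animation complexity, matching the pattern observed across the three locomotion sets.

```python
import math

def remote_error(freq_hz: float, tick_hz: float = 10.0,
                 render_hz: float = 90.0, duration: float = 1.0) -> float:
    """Peak difference between a local motion curve sin(2*pi*f*t) and its
    remote reconstruction from tick-rate snapshots with linear interpolation.
    tick_hz = 10 is an assumed network snapshot rate, not a known VRChat value."""
    ticks = [i / tick_hz for i in range(int(duration * tick_hz) + 1)]
    worst = 0.0
    for i in range(int(duration * render_hz)):
        t = i / render_hz
        k = min(int(t * tick_hz), len(ticks) - 2)
        t0, t1 = ticks[k], ticks[k + 1]
        a = (t - t0) / (t1 - t0)
        local = math.sin(2 * math.pi * freq_hz * t)
        remote = (1 - a) * math.sin(2 * math.pi * freq_hz * t0) \
                 + a * math.sin(2 * math.pi * freq_hz * t1)
        worst = max(worst, abs(local - remote))
    return worst

print(remote_error(1.0))  # slow, simple motion: small error survives interpolation
print(remote_error(8.0))  # fast, nuanced motion: detail above the tick rate is flattened
```

Under these assumptions a 1 Hz sway survives nearly intact while an 8 Hz flourish is heavily distorted, which would also explain why simple locomotion sets degrade less than intricate ones.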
6 · tracked