VRChat incorrectly reporting amount of data being sent across network, causing improper rate limiting
available in future release
AltCentauri
Right now, both the ~4 and ~6 network debug panels are broken and are reporting incorrect data. As part of this, the Bps values shown for individual Udon behaviours are more than double the network bandwidth reported in OnPostSerialization.
As this data is internally used by VRChat for rate limiting, Udon serialization gets rate limited at as little as 4-6 KB per second of total data. In testing, this is more likely to happen when a large number of behaviours each sync a small amount of data than when a small number of behaviours sync a large amount of data.
Bandwidth usage can be tested with this tool in any world with synced data: https://github.com/Centauri2442/VRCNetworkDebugger
That tool directly reads from OnPostSerialization of all synced behaviours in the scene. You'll notice that the network starts getting rate limited way before it should. You can also directly compare the ~6 Bps values to the values reported through OnPostSerialization, and you'll see that the ~6 values are more than double what is actually being serialized.
Tupper - VRChat Head of Community
A few details, as I've been doing some digging on your behalf over the past few days:
The networking panels are indeed broken and will be fixed in an upcoming patch. In addition, our documented 11 kbps rate limit is out of date. As of today, it's about 8 kbps, with spikes to 12. This limit changes every so often, usually due to infra changes or other internal changes that are hard to document.
However, I can absolutely see how this limit changing without creators being informed (especially ones pushing the limits of our networking) can be really frustrating. I can also say that, from my perspective, 8 kbps seems really restrictive. Part of the issue is that this extra data (headers and so on) is necessary data. It has to be counted as part of the rate limit too, because it's still data and still traverses the network. In other words, it's a bit of semantics at play here. Either way, we're having some conversations internally to figure out how to best document these changes and inform the relevant people when they're going to happen, and also to figure out how to lift this limit safely without causing problems or disruptions.
AltCentauri
Tupper - VRChat Head of Community: Glad to see a proper response about the problems! Out of curiosity, is the 8 kbps supposed to include the doubling-up of the data before it's sent out? During my own testing I found that once the data left Udon (including the Udon headers), it then doubled beyond that when actually being sent out. I personally found rate limiting to begin with serializations as low as 4-6 kbps (before doubling), which would line up with your 8-12 kbps of total data. From the creator's side, would that mean the actual limit where we can expect unthrottled data is 4-6 kbps?
Tupper - VRChat Head of Community
AltCentauri: From what I've been told and understand, there are other rate limits besides total data size, like message rates, and you're likely hitting those.
It's best to encode one big object rather than many small objects serialized more often, as there's per-serialization overhead and a data size limit in addition to the message-rate limits.
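Tupper's batching advice can be illustrated with a quick back-of-the-envelope sketch. The fixed per-serialization header size here is an assumption (AltCentauri cites roughly 20 bytes later in the thread); real Udon overhead may differ, but the shape of the saving is the same:

```python
# Rough model of per-serialization overhead. HEADER_BYTES is an assumed
# fixed cost per serialization (~20 bytes per the thread); not an official figure.
HEADER_BYTES = 20

def on_wire_bytes(payload_sizes):
    """Total bytes on the wire for a list of separate serializations."""
    return sum(HEADER_BYTES + size for size in payload_sizes)

# Eight behaviours each syncing one 4-byte int per second:
separate = on_wire_bytes([4] * 8)   # 8 * (20 + 4) = 192 bytes/s
# One behaviour batching all eight ints into a single serialization:
batched = on_wire_bytes([4 * 8])    # 20 + 32 = 52 bytes/s

print(separate, batched)  # 192 52
```

Under this assumption, batching pays the header once instead of eight times, cutting on-wire usage by well over half while sending the exact same payload.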
These rates aren't documented for security reasons.
I will add: it seems that we don't have any documentation on best practices for networking that would include this advice. That's definitely a mistake on our end, and I plan on trying to rectify that as soon as possible.
This post was marked as available in future release
StormRel
tracked
FairlySadPanda
The networking panels being useless is not a surprise; the bad serialization is also working as intended. It's header data plus metadata related to what's been synced plus the data itself.
AltCentauri
After more testing and some discussion with other users, it seems I was partially wrong about which data is used for rate limiting. However, the reported data is still completely busted.
Currently, there is an undocumented overhead of 20 bytes on every packet of serialized data, which means an integer value that would normally be 4 bytes is instead 24 bytes. Beyond this 20-byte header, there also seem to be additional per-behaviour headers that more than double the actual network usage of every serialization. This means that for every serialization of that 4-byte int, there seems to be 30-50 bytes of total data layered on top of it, sent with every single serialization.
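Taking the figures above at face value (a 20-byte fixed header, plus an estimated 30-50 bytes of total overhead per serialization), the inflation of a single synced int works out roughly as follows. This is just the arithmetic from the observations above, not VRChat's actual wire format:

```python
PAYLOAD = 4                       # one synced int, in bytes
HEADER = 20                       # undocumented per-packet header (observed, not official)
EXTRA_LOW, EXTRA_HIGH = 30, 50    # total estimated overhead layered on each serialization

with_header = PAYLOAD + HEADER    # 24 bytes: int plus the fixed header alone
total_low = PAYLOAD + EXTRA_LOW   # 34 bytes: low end of the observed total
total_high = PAYLOAD + EXTRA_HIGH # 54 bytes: high end of the observed total

print(with_header, total_low, total_high)  # 24 34 54
```

In other words, if these observations hold, a 4-byte payload can cost roughly 8x to 13x its own size on the wire.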
The data shown in the ~6 menu where we see ridiculously high amounts of bytes per second for each behaviour seems to include all of these overheads, and might be the source of the numbers displayed.
Data also seems to have a level of rate limiting that varies on the number of behaviours attempting to serialize per second.
For example, let's say you test with two sets of data: 1 behaviour syncing 4 KB per second, and 8 behaviours syncing 500 bytes per second each (including VRChat overhead).
For some reason, having 8 behaviours sending the same amount of data as a single behaviour causes the network to enter a suffering state much more often, rate limiting the entirety of the scene.
Even if the overall amount of data is the exact same as one large behaviour, attempting to spread out serializations into smaller sizes causes the network to intentionally slow down that data, resulting in the values I had seen during my testing, where serializing only 4-6KB of data per second was making the network enter its suffering/clogged state.
Dinky_
Would definitely like this issue looked into, as it harms all worlds that try to make a multiplayer experience, which is almost all worlds!
I'm not sure how many issues exist in VRC's net-code & how much of it is intended server budget saving, but all I can say is it feels laggier than any game I've played in the last decade, by almost a factor of 10. At the point where we are adding V-Bucks, it's time to be competitive with the rest of the industry, surely! Now that we are on a new version of Unity, maybe it's time for NU24 and 30ms being the standard instead of 300ms.
°sky
alarming rise in major issues like this one not being discovered during internal testing.
what's going on?!
Dinky_
°sky: What's funny is that it was discovered during testing. I remember reporting this issue during the NU22 closed beta, along with a distance-based network optimization error & several errors that caused VRC's Obj Sync component to increase its network usage by a large amount. I even provided several videos and detailed code reproducing the issues, and they released in that state anyway =(
°sky
Dinky_: seems typical at the moment. wish they'd start paying attention to people :(