While local strings can hold any character from \u0000 to \uffff, synced strings are forcefully run through UTF-16 encoding, so any character in the surrogate range \ud800 to \udfff gets replaced with \ufffd. It seems the internal serializer/deserializer treats every string as UTF-16 text, regardless of what the string is actually used for.

I understand that strings have text-specific requirements, like \0 terminators for memory safety or surrogate pairs for features such as emoji support, but at the raw-memory level a string can be treated just like any other container, and that is how I'm using it for maximum productivity and memory/network efficiency (for example, treating each character as two raw bytes, i.e. four hexadecimal digits). This matters because of VRChat's network bandwidth limitations and Udon's performance overhead compared to normal C# code. Just like Game Boy-era games did, I need to squeeze out every possible byte to keep the code both performant and network-efficient.

Maybe strings could be serialized just like byte[]? Or maybe a special attribute like NonEncoded could be added to a string variable so that it is treated as just another data container?
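To illustrate what I mean by treating a string as a raw container, here is a rough sketch in plain C# (the PackBytes/UnpackBytes names are just for illustration, and I have not checked whether every call here is exposed to UdonSharp):

```csharp
// Rough sketch: pack two raw bytes into each char of a string, and unpack them back.
// Illustrative only; the string(char[]) constructor and bit operators may need
// adjustment for what UdonSharp actually exposes.
public static string PackBytes(byte[] data)
{
    char[] chars = new char[(data.Length + 1) / 2];
    for (int i = 0; i < data.Length; i++)
    {
        int shift = (i % 2) * 8;                   // low byte first, then high byte
        chars[i / 2] |= (char)(data[i] << shift);
    }
    // Note: with the current sync behavior, any char here that lands in
    // \ud800..\udfff comes back as \ufffd after syncing, corrupting the data.
    return new string(chars);
}

public static byte[] UnpackBytes(string packed, int byteCount)
{
    byte[] data = new byte[byteCount];
    for (int i = 0; i < byteCount; i++)
    {
        int shift = (i % 2) * 8;
        data[i] = (byte)(packed[i / 2] >> shift);  // recover each byte from its half of the char
    }
    return data;
}
```

Locally this round-trips fine; it is only after the string goes through the synced-variable serializer that the surrogate-range values get destroyed.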