VRChat has recently made changes to their Trust and Safety reporting flow. These allow pornographic, profane, graphic, or racist content to be moderated quickly, efficiently, and at scale.
What these changes do not allow for, however, is reporting "Crasher" avatars: avatars designed to close the VRChat client either the moment they are loaded (corrupted asset bundle) or the moment a toggle is activated (crasher shader, mesh, or particle system).
Something as simple as re-opening tickets specifically for avatars designed to crash VRChat clients, which cannot be reported in-game precisely because they close the client, would go a long way toward reducing the volume of crashers on the platform.
Please, at the very least, re-open moderation tickets for avatars that close the VRChat client (AKA "Crashers"). Otherwise, Public instances, where New Users and Visitors form their first impressions of the platform, will remain riddled with bad actors who close VRChat clients via crasher avatars. If there continues to be no way to report crasher avatars, the cost to VRChat's user-base growth will be significant. I have already noticed an uptick in "Instant Crashers" in Public instances, and I have no way to report them under these recent moderation changes.
As a practical example, I encountered a user who crashes the "Furry Hideout" world across all of its instances, one by one, using a Trusted-rank account. My ticket was closed because of the moderation changes, and there was no other way to report that type of behavior or malicious avatar. This is a problem, as it allows the malicious behavior to persist, and it continues to this day.