I believe the ability to drive avatar bones directly via an OSC app (similar to how native eye tracking is handled) would be a massive benefit to users with more niche or unusual hardware, as well as opening up new methods of expression not yet seen in other social VR apps.
This could take the form of a sensor on the toes driving the toe bone rotation for dancers, or an app listening to SteamVR Skeletal Input and driving those bones directly.
Not only would this reduce unnecessary parameters on avatars, but it would also let the bones receive proper IK syncing, making the experience more fluid.
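For illustration, a minimal sketch of what a sender app could look like, using the python-osc library and VRChat's default OSC input port (9000). The /tracking/bone/ address space and its pitch/yaw/roll message format are hypothetical, modeled loosely on the existing /tracking/eye/ endpoints; they are the proposal here, not an existing API:

```python
# Hypothetical sketch: an external app driving a toe bone over OSC.
# NOTE: the /tracking/bone/ address space does not exist today; it is
# the proposed feature. Port 9000 is VRChat's default OSC input port.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)

def send_toe_rotation(pitch: float, yaw: float, roll: float) -> None:
    """Send a toe-bone rotation (degrees) read from an external sensor."""
    # Proposed address/format, mirroring how the native eye-tracking
    # OSC endpoints accept rotation values.
    client.send_message("/tracking/bone/LeftToes", [pitch, yaw, roll])

# Example: a sensor reports the toes curled 25 degrees downward.
send_toe_rotation(25.0, 0.0, 0.0)
```

An app like this could poll its sensor (or SteamVR Skeletal Input) in a loop and stream updated rotations each frame, with no avatar parameters involved at all.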