Ability for OSC applications to drive avatar bones without the need of special parameters
°sky
I believe the ability to drive avatar bones directly via an OSC app (like how native eye tracking is handled) could be a massive benefit to users with more niche or unusual hardware, as well as opening up new methods of expression not yet seen in other social VR apps.
This could take the form of a sensor on the toes driving the toe bone rotation for dancers, or of apps listening to SteamVR skeletal input and driving the bones directly.
Not only would this reduce unnecessary parameters on avatars, but it would also let the bones receive proper IK syncing, making the experience more fluid.
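To make the idea concrete, here is a minimal sketch of what such an OSC sender might look like. VRChat does listen for OSC on UDP port 9000 by default, but the bone address used below is hypothetical: no such endpoint exists today, which is exactly what this request is asking for. The OSC message encoding itself follows the OSC 1.0 spec.

```python
import socket
import struct

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC 1.0 message carrying float32 arguments."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    tags = "," + "f" * len(floats)
    return (pad(address.encode())
            + pad(tags.encode())
            + b"".join(struct.pack(">f", f) for f in floats))

# VRChat's OSC input listens on UDP port 9000 by default.
# The address below is HYPOTHETICAL -- VRChat does not currently
# expose bone endpoints; this illustrates the proposed feature.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
msg = osc_message("/tracking/bone/LeftToes/rotation", 0.0, 15.0, 0.0)
sock.sendto(msg, ("127.0.0.1", 9000))
```

An app reading a toe sensor would simply re-send this message each frame with updated Euler angles, the same way existing OSC apps stream avatar parameters today.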
|KitKat|
This would also allow people to make their own IK solvers!
el ff1493
|KitKat|: it wouldn't work because of networking. ik is done client side. and we already have finalik
|KitKat|
el ff1493: Afaik VRChat doesn't locally calculate the IK for remote players; the bone transforms are synced directly.
VRChat doesn't sync the tracking data required to drive the IK. In Udon, if you try to fetch a remote player's tracking data, it returns a bone position instead.
°sky
el ff1493: All bones (apart from PhysBones) are networked as IK. As far as I'm aware, there's no real reason why we couldn't directly drive a bone's rotation while bypassing parameters - it'd also use less bandwidth by simply networking fewer things!