Increase the update rate for OSC parameters
tracked
Statek
Currently, float parameters update over the network at around 10 Hz with no smoothing, which is far too low for acceptable face tracking. To other users, face tracking looks like a laggy slideshow, or like a bug with your computer that you'd need to restart to fix. However, if you tie 8 of the most important face-tracked parameters (eyes, eyelids, and the larger mouth movements) to two 4-axis puppet menus that you keep open at all times, you can force IK sync on those parameters and have a face that updates at the same rate as your body. Since the systems for this are already in place, I'd propose pushing the top 8 floats in each parameter menu through IK sync. More would be better, and I'm aware something is planned on the roadmap; even this IK sync fix is only a placeholder compared to the face tracking fidelity you can get in, say, Neos, but it's a very simple placeholder to add.
The only other way to get smooth face tracking right now is by using transitions on binary parameters, but those update much more slowly and with much less precision.
Here is a demo showing the difference IK sync makes:
DrSakuuCuddles
What is the current status of this? The update rate doesn't necessarily need to be increased; local interpolation just needs to be added for OSC parameters.
Scout - VRChat Head of Quality Assurance
tracked
WHSPRS
Any updates on this?
Stonebot
It's a pretty big bummer that this doesn't already work. I'm working on a way to move a virtual cat around based on a Vive tracker I put on my cat, feeding location parameters in over OSC, but everything is jittery for everyone else.
Heal Howels
I was trying to get my avatar set up with mouth tracking, and was asking friends what they thought. I thought it didn't look bad, but they were not impressed by how jittery things were. For me everything is smooth, so yeah.
Any chance we can get some vrc feedback on this request?
xantoz・ザントス
Can we get someone to look at this again now, considering the only way to get IK sync for face tracking previously was using a mod?
I, a vanilla player, have been patiently waiting for this feature, and while it's cool to see all the new features suddenly happening now, this is on the top of my personal wishlist.
Fox P McCloud
Since we now have access to 256 bits as opposed to 128, perhaps we could have the option to give floats IK sync rates in exchange for them taking up double the space?
This effectively keeps network bandwidth the same while allowing certain OSC variables to have increased priority, at least until Face Tracking is officially supported.
Interpolation/smoothing helps for some facial expressions, but we really need increased update rates for stuff like eyes and some mouth movements.
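The bit-budget trade-off suggested above can be sketched roughly. This is only an illustration of the commenter's proposal, assuming a 256-bit parameter budget and 8-bit synced floats; the 2x cost for an IK-synced float is the suggestion here, not an actual VRChat rule.

```python
# Sketch of the proposed trade-off: flag a float for IK sync at double
# the parameter-memory cost. All numbers are assumptions from the thread.

BUDGET_BITS = 256
FLOAT_BITS = 8
IK_FLOAT_BITS = FLOAT_BITS * 2  # proposed: an IK-synced float costs double

def remaining_bits(normal_floats, ik_floats):
    """Bits left in the parameter budget for a given mix of floats."""
    used = normal_floats * FLOAT_BITS + ik_floats * IK_FLOAT_BITS
    return BUDGET_BITS - used

# e.g. 8 IK-synced face floats (128 bits) still leave 128 bits for
# 16 ordinary floats:
left = remaining_bits(normal_floats=0, ik_floats=8)
```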
Furriest
To reduce overall network traffic, there could instead be client-side interpolation of OSC parameter values (e.g. a value going from 1 to 6 at 10 Hz would be rendered as 1 → 2 → 3 … 6 at 60 Hz).
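The interpolation idea above can be sketched in a few lines. This is a minimal illustration, not VRChat internals: network updates arrive at ~10 Hz, and the client fills in the frames between two received values at its render rate.

```python
# Hypothetical client-side interpolation sketch: ease each received
# parameter value toward the next one over the render frames between
# network updates (6 frames per update at 60 Hz render / 10 Hz network).

def interpolate(prev_value, next_value, frames_between):
    """Yield per-frame values linearly blending prev -> next."""
    for frame in range(1, frames_between + 1):
        t = frame / frames_between
        yield prev_value + (next_value - prev_value) * t

# A parameter jumping 1 -> 6 at 10 Hz, rendered at 60 Hz:
frames = list(interpolate(1.0, 6.0, 6))
# frames steps smoothly up to 6.0 instead of snapping in one frame.
```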
Statek
Furriest: With each float parameter being 8 bits, you could use several thousand parameters before reaching the traffic that your voice takes up (assuming around 32 kb/s for audio).
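A quick back-of-the-envelope check of that comparison, using the numbers stated in the thread (8-bit floats, ~10 Hz updates, "32kb/s" voice). Note the headline figure depends on whether "kb/s" means kilobits or kilobytes; both readings are shown, and none of these constants are confirmed VRChat values.

```python
# Rough arithmetic for the float-traffic vs. voice-traffic comparison.
# Assumptions from the thread, not confirmed: 8 bits per synced float,
# ~10 Hz update rate, voice at "32kb/s".

BITS_PER_FLOAT = 8
UPDATE_RATE_HZ = 10
float_bits_per_second = BITS_PER_FLOAT * UPDATE_RATE_HZ  # 80 bit/s per param

voice_bits_kilobit_reading = 32_000       # if "32kb/s" = 32 kilobits/s
voice_bits_kilobyte_reading = 32_000 * 8  # if "32kb/s" = 32 kilobytes/s

params_if_kilobits = voice_bits_kilobit_reading // float_bits_per_second
params_if_kilobytes = voice_bits_kilobyte_reading // float_bits_per_second
# Several hundred parameters under the kilobit reading,
# several thousand under the kilobyte reading.
```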
Furriest
Statek: oh then this should be implemented without hesitation if what you're saying is correct ^^
Zarniwoop
This tweet does clarify that the current OSC face-tracking implementation is just a very basic first step for functionality, and that it'll be improved on in time.
Statek
Zarniwoop: Yeah, but the point of this post is to show the support for an easy placeholder that could be pushed tomorrow (as opposed to physical bones which still aren't live after being announced last year) and to potentially bump up face tracking on VRChat's to-do list.
azmidi
I think a really good approach would be to allow an OSC message to send an extra bool with a float-based parameter indicating whether it should be IK-synced, grant that status to the first 8 float parameters sent through OSC (since that is all the puppet menu allows), and ignore the request on the rest. That way it can take advantage of the existing IK sync functionality of the puppet menu without changing the existing networking. A lot of avatars set up for VRCFaceTracking already use this layout ((up to) 8 float parameters, with the rest smoothed with animations using VRCFT Binary Parameters or other tricks), so this would let many existing face-tracked avatars take advantage of IK sync without having to use puppet-menu tricks.
Just my two cents. I know this solution isn't ideal, since it can cause in-fighting between OSC programs, but I think it would be a good compromise until there is a better approach to getting IK sync on OSC parameters (or possibly a dedicated way to mark a parameter as IK-synced in the avatar setup).
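The first-8-slots rule described above can be sketched as a tiny receiver-side data structure. Everything here is hypothetical illustration of the proposal: the class, the parameter names, and the 8-slot limit mirror the puppet-menu behaviour discussed in the thread, and none of it is actual VRChat API.

```python
# Hypothetical sketch: an incoming OSC parameter carries a float value
# plus a bool requesting IK sync. Only the first 8 distinct parameters
# that request it get IK-sync status; later requests are ignored.

MAX_IK_SLOTS = 8  # the puppet-menu limit cited in the proposal

class ParameterSync:
    def __init__(self):
        self.ik_synced = []  # ordered names granted an IK-sync slot
        self.values = {}

    def receive(self, name, value, request_ik_sync=False):
        """Store the value; return True if this parameter is IK-synced."""
        self.values[name] = value
        if (request_ik_sync
                and name not in self.ik_synced
                and len(self.ik_synced) < MAX_IK_SLOTS):
            self.ik_synced.append(name)  # grant one of the 8 slots
        return name in self.ik_synced

sync = ParameterSync()
# The first 8 requesting parameters get IK sync; a 9th request is ignored.
for i in range(9):
    sync.receive(f"Face/Param{i}", 0.5, request_ik_sync=True)
```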