Currently, float parameters update over the network at around 10Hz with no smoothing, which is far too low for acceptable face tracking. To other users, face tracking looks like a laggy slideshow, or like a bug on your computer that you'd need to restart to fix. However, if you tie the 8 most important face-tracked parameters (eyes, eyelids, and the larger mouth movements) to two 4-axis puppet menus that you keep open at all times, you can force IK sync on those parameters and get a face that updates at the same rate as your body. Since the systems for this are already in place, I'd propose pushing the top 8 floats in each parameter menu through IK sync. More would be better, and I'm aware something is planned on the roadmap; even this IK sync fix is only a placeholder compared to the face tracking fidelity you can get in, say, Neos, but it's a very simple placeholder to add.
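To make the "laggy slideshow" point concrete, here's a rough standalone C# sketch (not VRChat's actual netcode; the rates, names, and the one-pole smoothing are just placeholder assumptions) comparing what a receiving client sees if it snaps to each ~10Hz float update versus simply interpolating toward the latest received value every frame, which is roughly the kind of smoothing you otherwise have to fake with transitions on binary parameters:

```csharp
// Rough illustration only (not VRChat code): a float that arrives ~10 times per
// second looks like a slideshow if the receiver snaps to each value, but reads
// as smooth motion if it interpolates toward the latest value each frame.
using System;

class FaceFloatDemo
{
    const float NetworkRate = 10f;   // ~10Hz float parameter sync (per the post)
    const float FrameRate   = 60f;   // typical render rate on the receiving client

    static void Main()
    {
        float lastReceived = 0f;     // most recent value off the "network"
        float smoothed     = 0f;     // locally interpolated copy
        float netTimer     = 0f;

        for (int frame = 0; frame < 120; frame++)
        {
            float t  = frame / FrameRate;
            float dt = 1f / FrameRate;

            // The sender's "true" value: e.g. an eyelid sweeping open and closed.
            float actual = 0.5f + 0.5f * (float)Math.Sin(2 * Math.PI * t);

            // Simulate the 10Hz update: a new value only lands when a packet "arrives".
            netTimer += dt;
            if (netTimer >= 1f / NetworkRate)
            {
                netTimer -= 1f / NetworkRate;
                lastReceived = actual;
            }

            // Naive display: hold the last received value (the current choppy behaviour).
            float stepped = lastReceived;

            // Smoothed display: move toward the latest value over about one network interval.
            smoothed += (lastReceived - smoothed) * Math.Min(1f, dt * NetworkRate);

            Console.WriteLine($"t={t:F2}s  actual={actual:F2}  stepped={stepped:F2}  smoothed={smoothed:F2}");
        }
    }
}
```

Even with smoothing, though, a 10Hz stream can only ever lag behind the real expression, which is why routing these floats through the faster IK sync path is the better fix.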
The only other way to get smooth face tracking right now is by using transitions on binary parameters, but those update much more slowly and with far less precision.
Here is a demo showing the difference IK sync makes: