SlimeVR Toe Support Integration
SebrinaRena
Hello!
I’m currently developing toe tracking support for the SlimeVR software.
Many SlimeVR-compatible solutions are emerging and available today, and the hardware I am currently testing is pushing just how small full-body trackers can get, which is what makes this development possible.
So far, I’ve been testing what I can push through the VRChat expression parameter system, but it has many limitations when combined with avatar systems such as full face tracking or other third-party avatar systems. The 256-bit limit fills quickly when more than a few things need accurate float parameters, and memory optimizers and other systems cause tracking driven over expression parameters to simply function poorly.
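For a rough sense of how fast that budget fills (VRChat documents synced floats and ints at 8 bits each and bools at 1 bit, out of 256 bits total), here is a quick back-of-the-envelope using the per-toe layouts discussed below:

```python
# Rough VRChat expression parameter budget math:
# synced float/int = 8 bits, bool = 1 bit, 256 bits total.
BUDGET_BITS = 256
FLOAT_BITS = 8
TOES = 10  # 5 toes per foot, 2 feet

quat_scheme_bits = TOES * 4 * FLOAT_BITS       # 4 floats per quaternion -> 320 bits (over budget)
pitch_yaw_scheme_bits = TOES * 2 * FLOAT_BITS  # 2 floats per toe        -> 160 bits

print(f"Full quaternions: {quat_scheme_bits} bits (budget is {BUDGET_BITS})")
print(f"Pitch/yaw only:   {pitch_yaw_scheme_bits} bits, "
      f"leaving {BUDGET_BITS - pitch_yaw_scheme_bits} bits for face tracking and everything else")
```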
In an ideal scenario, we would want to drive the toes with raw quaternions via OSC or an equivalent tracking system: five toe quaternions per foot, for a total of ten quaternions. This would allow pitch control of individual toes, plus splay on the yaw axis.
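To make the idea concrete, here is a minimal sketch of what streaming those quaternions could look like from the sender's side, using the python-osc library. The address scheme and per-toe naming here are placeholders of my own, not an existing SlimeVR or VRChat endpoint:

```python
# Hypothetical sketch: stream 10 toe quaternions over OSC.
# Addresses are placeholders, not an existing SlimeVR/VRChat API.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # 9000 is VRChat's default OSC input port

def send_toe_quaternion(foot: str, toe: int, x: float, y: float, z: float, w: float) -> None:
    """Send one toe's local rotation as a quaternion (x, y, z, w)."""
    client.send_message(f"/tracking/toes/{foot}/{toe}/rotation", [x, y, z, w])

# Example: neutral rotation for every toe on both feet.
for foot in ("left", "right"):
    for toe in range(5):
        send_toe_quaternion(foot, toe, 0.0, 0.0, 0.0, 1.0)
```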
Another option would be two floats per toe via OSC, controlling pitch and yaw of the local rotation. This would be half the data of syncing quaternions, and since toes don’t roll independently of the foot they are part of, the foot tracker would provide roll.
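A sketch of that reduced layout, under the same assumptions as above (hypothetical addresses, python-osc as the sender):

```python
# Hypothetical sketch: two floats per toe (pitch/curl and yaw/splay) over OSC.
# Roll is inherited from the foot tracker, so it is not sent per toe.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)

def send_toe_pitch_yaw(foot: str, toe: int, pitch: float, yaw: float) -> None:
    """Send one toe's local pitch and yaw, e.g. normalized to the -1..1 range."""
    client.send_message(f"/tracking/toes/{foot}/{toe}/pitch_yaw", [pitch, yaw])

# 10 toes x 2 floats = 20 values per update, versus 40 for full quaternions.
send_toe_pitch_yaw("left", 0, 0.3, -0.1)
```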
I also think native toe control could allow for more passive avatar compatibility: if the toes have already been set up accordingly in an avatar’s Unity armature settings, it should ideally be compatible with toe tracking as-is.
So far, I have been able to make a tech demo that is limited to pitch rotation on two divided halves of a foot’s toes, and it requires making custom blend tree animations for curled, neutral, and tip-toed states. It took considerable effort to optimize the parameters to get access to four floats of syncable data alongside other avatar options such as face tracking, and there is more nuance I would want to capture without having to compromise on other avatar features.
Toe support currently requires building an avatar that is compatible with the expression parameters we’re testing with, which may not be very accessible to most people.
Another thing that might be fun with the addition of native toe support, though potentially out of scope, is allowing toes to grab objects the way hands do (fully curled toes grabbing). The thought of people handing somebody an object or shooting a weapon with their feet is the kind of silly thing people would probably find amusing.
Current tracking demonstration, working within the present limitations we would like to move beyond.