With a lot of work being done recently on control schemes for VRChat to aid accessibility, I feel a long-forgotten and misunderstood feature available to the Vive Wands specifically needs to make a return on the Index and other platforms that support it: namely, the touchpad gestures and dynamic rebinding functionality of "Vive Advanced Control".
For those unaware, "Vive Advanced Control" was a setting available to Vive Wands that enabled additional functionality for avatar gesturing and movement. It unlocked the full range of movement on the touchpad, allowing players to move at run speed, and it expanded the quadrants on the touchpad to include additional segments (with 'Fist' in the middle, and 'Open Hand' on the grip buttons). Most usefully, however, it added a dynamic rebinding feature for gestures. This allowed you to rebind any of the primary hand gestures on either hand to the trigger dynamically, as you needed them: you simply had to perform the gesture, then click the trigger once to make use of it. The intended purpose of this feature was to let you use gestures whilst moving, but it was also instrumental for very animated avatar puppeteering and sign language, as it ensured that the gesture was triggered with 100% reliability. Whilst this took a little getting used to (and not much of it was documented very clearly), once practiced it became a very handy tool for performing very deliberate actions with avatars.
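For clarity, the touchpad quadrant behaviour described above can be sketched roughly as follows. This is a hypothetical reconstruction for illustration, not VRChat's actual code: the gesture names, the quadrant order, and the centre dead-zone radius are all assumptions.

```python
import math

# Hypothetical sketch of 'Vive Advanced Control'-style touchpad mapping.
# Gesture names, quadrant order, and the centre radius are assumptions.
QUADRANT_GESTURES = ["Point", "Victory", "RockNRoll", "HandGun"]

def touchpad_gesture(x: float, y: float, center_radius: float = 0.35) -> str:
    """Map a touchpad position (x, y in [-1, 1]) to a gesture name.

    The centre of the pad maps to 'Fist', as described above; the rest
    of the pad is split into four 90-degree segments by angle.
    """
    if math.hypot(x, y) < center_radius:
        return "Fist"
    angle = math.degrees(math.atan2(y, x)) % 360.0
    return QUADRANT_GESTURES[int(angle // 90.0)]
```

The point is that everything here is driven by the thumb's position alone, which is what made the scheme so reliable compared to finger curl estimation.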
I feel this is necessary for several crucial reasons:
1) As somebody who has spent a lot of time with Vive Wands and done a lot of manual puppeteering with avatars, the ability for gestures to be performed consistently and reliably is incredibly important to me. Finger tracking on Index controllers is, for lack of a better explanation, wildly inconsistent in how well it performs from one day to the next. Especially with complicated gesture setups, anything designed to work well with Vive Wands or even Oculus controllers is simply a nightmare to use on Index. I am primarily mute on the platform, so essentially losing this feature after transitioning to Index controllers severely hampers my ability to communicate well with others.
2) No two avatars are alike, and no avatar creator can agree on a universal standard. Especially with older legacy avatars, the ability to trigger a gesture reliably and rebind it on the fly made for a much easier time performing unique gestures across different setups. Avatar creators who added a gun prop triggered by 'Peace Sign', for instance, didn't have to worry about it being an uncomfortable gesture to pull off, since you could simply rebind it to the trigger dynamically.
3) The addition of the 3.0 radial menu has greatly expanded the functionality of avatars, and has started to relegate the gesture-based system of triggering animations to a "legacy" feature, but this comes with many downsides. Namely, for people looking to puppeteer well, the radial menus are very dependent on pausing and looking away from the action, and they hinder your ability to move or turn around whilst they are open. Puppeteering avatars is a delicate balance of using both in conjunction, but this is simply harder with the Index's finger tracking.
4) A particularly important note - accessibility for those with severe repetitive strain injuries in their fingers. This could mean a world of difference for people who were previously either unable to perform gestures or unable to play the game, period. Speaking from personal experience with a friend, the Index controllers are a preferred device for people suffering from injuries like this, because the controller grips to the hand with a strap instead of having to be held constantly. Finger-based gesture tracking, however, is nigh impossible for them due to the complexity of the gestures and the frequency with which they'd be using them. Assigning gestures to the touchpad, so that everything can be performed with the thumb alone, reduces that strain considerably.
5) One argument I have already seen suggests that offering this "devalues finger tracking and is less intuitive/immersive; you should just use the radial menu". I beg to differ, and I am not asking for finger-tracking-based gestures to be removed - simply for the addition of an alternative option to use the trackpad instead.
6) People in the sign language communities may attest: Index controllers either work well for a given signer or they don't. Many sign language players stuck with Vive Wands because reliable gesture input was far too important a feature for them to lose.
7) There are many alternative ideas to this as well, such as allowing gestures to be performed by clicking the thumbsticks down, for those who prefer it. Really, what I am asking for in essence is the ability to have gestures be rebindable, with the functionality of 'Vive Advanced Control' available to other input methods. My personal favorite alternative I have seen would be simply exposing the touchpad/joystick float values so we can do it ourselves.
8) Yes, in saying all this, there are more than likely workarounds that can emulate gestures on the touchpad using SteamVR chords and OSC apps. I stress, however: all of this functionality already exists inside the game right now, and simply has to be ported to the other input methods. There isn't even a mismatch of inputs between the controllers that would make this impossible; everything required is already there. Needing to set this up on your own should not be necessary.
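To illustrate what such a DIY workaround involves: VRChat listens for OSC messages on UDP port 9000 by default, and an external app can write avatar parameters over it. The sketch below hand-builds a minimal OSC message (following the OSC 1.0 encoding rules) so it needs no extra libraries; the parameter path `Gesture{hand}` and whether the built-in gesture parameters actually accept OSC writes are assumptions here, so treat this strictly as a sketch of the approach, not a supported solution.

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    return b + b"\x00" * (4 - len(b) % 4)

def osc_int_message(address: str, value: int) -> bytes:
    """Encode a minimal OSC message carrying a single int32 argument."""
    return _pad(address.encode("ascii")) + _pad(b",i") + struct.pack(">i", value)

def send_gesture(gesture_id: int, hand: str = "Right",
                 host: str = "127.0.0.1", port: int = 9000) -> None:
    # '/avatar/parameters/Gesture{hand}' is an assumed path for illustration
    msg = osc_int_message(f"/avatar/parameters/Gesture{hand}", gesture_id)
    socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, (host, port))
```

An external tray app or overlay could call `send_gesture()` whenever the touchpad quadrant changes, emulating the rebind behaviour from outside the game - which is exactly the kind of setup that should not fall on players to build themselves.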
9) The recent additions of the face camera and gesture icons on the HUD have been instrumentally helpful in the interim, but they are only a half measure. They have made it easier to know what gesture you are performing without glancing at your hands, but gestures are still unreliable.
There are also other existing Canny posts that propose similar ideas:
By Xiexe:
By Nepsy Neptune: