Graphics.Blit Scripts for avatars
Toocanzs
(Edit: Added "for avatars" to the post title. VRCGraphics.Blit is available for worlds, but not avatars.)
The Shader Community's Request for Graphics.Blit
- What is Graphics.Blit?
Graphics.Blit allows you to copy a source texture into a destination render texture using a shader. Being able to do this lets shaders write to render textures in a way that does not require cameras.
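For reference, here is a minimal sketch of how Graphics.Blit is used in a normal Unity project (the class and field names are illustrative, not part of any SDK):

```csharp
using UnityEngine;

public class BlitExample : MonoBehaviour
{
    public Texture sourceTexture;     // any texture to read from
    public RenderTexture destination; // render texture to write into
    public Material blitMaterial;     // material whose shader does the work

    void Update()
    {
        // Runs blitMaterial's shader over every pixel of destination,
        // with sourceTexture bound as _MainTex. No camera involved.
        Graphics.Blit(sourceTexture, destination, blitMaterial);
    }
}
```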
----
- Why do we want Graphics.Blit?
Currently, shader creators use cameras and render textures to emulate what Graphics.Blit does. Cameras were not made for this purpose and carry a relatively high amount of overhead. On top of that, effectively mimicking Graphics.Blit requires two cameras to achieve the same effect. This doesn't mean Graphics.Blit replaces cameras entirely; it just lets most of these use cases run much faster.
----
- What uses this camera workaround currently?
GPU particles. Anyone who's experimented with GPU particles knows how much faster they run compared to Unity's built-in particle system. GPU particles in their current form can run a million particles in 1.43ms on the GPU. That's amazing already, but because we have to use cameras, ~0.7ms of CPU time (camera0 and camera1) is spent on updating the cameras. Graphics.Blit can bring the CPU time from ~0.7ms down to ~0.07ms.
GPU particles are not the only shaders that will perform better using Graphics.Blit. Any shader that uses a two-camera and render texture setup will be faster using Graphics.Blit.
Other potential use cases for Graphics.Blit:
* GPU dynamic bones
* Cheap Gaussian blur
* Fluid Simulation
* Grass/Hair Simulation
* Procedurally generating Data
* Virtual Keyboard for mutes
* AI for Shader pets following you around
* Simple games and puzzles. Pong, Flappy bird, and the beginnings of Doom have been done with cameras already
* Worlds with dynamic snow, rain puddles, and water ripples
----
- Why should time be spent on Graphics.Blit when only shader creators will be using it?
Even though this will mostly only affect shader creators, everyone benefits from this change. GPU particles have been released to the public for a while and have been the only way for VRChat users to get millions of particles without completely destroying FPS. Graphics.Blit would increase the room for innovation.
----
- What would a Graphics.Blit component from the VRC SDK look like?
We would like to see two scripts added to the SDK, a Blit Controller, and a Blit Component.
The Blit Controller should be a component which can be added to game objects on avatars and worlds. The Blit Controller should also have one input, an array of Blit Components. The purpose of this controller is to handle the order in which each blit is executed. In order for a shader to store stateful information, there need to be two render textures and two blits (reading from one texture while writing to the other, then swapping). The order in which these two blits run is important, and this controller handles that: it should run Graphics.Blit for every Blit Component in the array, in order.
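A hedged sketch of what such a controller could look like, using the names from our mock implementation rather than any confirmed SDK API:

```csharp
using UnityEngine;

public class BlitController : MonoBehaviour
{
    // Blits run in array order each frame, so a stateful effect can
    // place its "write to back buffer" blit before the "copy back" blit.
    public BlitComponent[] blitComponents;

    void Update()
    {
        foreach (var blit in blitComponents)
        {
            if (blit != null)
                blit.RunBlit();
        }
    }
}
```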
The Blit Component should hold two inputs, a source texture and a destination texture. Where is the material? In order to allow shaders to be animated in VRC, we think using the shared material from the mesh renderer on the same game object would be ideal. The Blit Component should have a RunBlit function which takes the mesh renderer's shared material and runs Graphics.Blit with the source, destination, and shared material.
The RunBlit function should also pass a few extra bits of information to the shader, namely the game object's localToWorldMatrix, worldToLocalMatrix, and position. We think this is important because Graphics.Blit on its own gives the shader no outside information, and some of the existing camera effects require position information.
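Putting those pieces together, a sketch of the component might look like the following (the shader property names here are illustrative assumptions, not a defined interface):

```csharp
using UnityEngine;

[RequireComponent(typeof(MeshRenderer))]
public class BlitComponent : MonoBehaviour
{
    public Texture sourceTexture;
    public RenderTexture destination;

    public void RunBlit()
    {
        // Use the shared material so animated shader properties carry over.
        var material = GetComponent<MeshRenderer>().sharedMaterial;

        // Extra uniforms, since Graphics.Blit provides no outside
        // information on its own. Property names are hypothetical.
        material.SetMatrix("_LocalToWorld", transform.localToWorldMatrix);
        material.SetMatrix("_WorldToLocal", transform.worldToLocalMatrix);
        material.SetVector("_ObjectPosition", transform.position);

        Graphics.Blit(sourceTexture, destination, material);
    }
}
```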
We have included a GitHub link which contains a mock implementation of what we think would be the best way to implement this for the use cases we have. It is under the MIT license, so feel free to use anything from our implementation. We want to make this as light as possible on development time.
Our mock implementation: https://github.com/Merlin-san/Blit-Component
We aren't sure if components update on both PlayerLocal and Player for the local player, but updating more than once per frame should be avoided if that's the case.
----
- Safety System
Currently, the render texture/camera workaround that we use requires that you be friends with another user in order for that user to see the render texture. This was implemented before the safety system. If Graphics.Blit is implemented, render textures should either be part of the safety system under "Shaders" or get their own category, "Render Textures".
- The shader community,
Toocanzs, Merlin, Elixir, Xiexe, Blake447, Okano, snail, Tykesha, .Captain., Claw, Quantum, Nulliel, Naelstrof, Wakamu, 1001, Nave, Neitri, SCRN, Silent
KuuraVR
This needs to be tracked again, this would be pretty dope to have on avatars.
°sky
i think we can all say we'd like to see this on avatars
Fax
open
Changed the post title and added a note.
ᴋᴀᴡᴀ
Hackebein it was tracked and now it's not 😭
Faxmashine
The latest open beta has this! https://docs.vrchat.com/v2022.4.1/docs/latest-release
naqtn
For the VRC dev. same post: https://feedback.vrchat.com/vrchat-udon-closed-alpha-feedback/p/graphicsblit
M E R C
Sometimes I return here and lament what could have been over the last 4 years if this was actually pursued. Hopefully someday.
llealloo
People frequently ask how they can drive the movement / intensity of lights, lasers, players, gameobjects, shader properties etc in Udon with AudioLink. I hear this all the time. Currently, they have to use the cursed "experimental" mode which leads to heavy performance restrictions (a big stinky bubble in the render pipeline due to the workaround). An async readback (blit) function would be a drop-in performance enhancer for AudioLink. With the release of the next version of AudioLink, worlds using the "experimental" mode would become drastically more performant without having to do so much as update to the latest AudioLink. Basically, the foundation has already been laid in AL as it stands today. It's a simple implementation on our end as well as the end users. This would be amazing not just for making existing worlds more performant, but also for the countless ideas that have been spared in waiting for the perf to not be an issue with AudioLink's "experimental" mode. Thanks for reading, hopefully this helps illuminate why I have been pinging this canny thread so much!
llealloo
it's gonna be lit fam when this goes up!
pema99
please VRChat devs add this. please please the shader dev community is starting to lose sanity. we've begun to pass the time by creating undecipherable memes. do you understand this? does anyone? should they? expose graphics.blit and save us from this madness, we need the distraction of working on new, previously impossible creations. all my shaders are starting to flow together, I don't even know what I am doing anymore
llealloo
I can't wait to see this one marked complete w/ async readback exposed