Creating custom blend/distortion modes by writing AND reading from capture texture.
Hi all,
I'm writing a shader that reads from and writes to the capture texture that's used for rendering. It reads the pixel values currently in the texture and decides what to add/subtract, or even displace/blur etc., in response.
This opens some cool doors to awesome custom blend modes, transparency fanciness, and psychedelic weirdness.
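Roughly, the fragment shader does something like this (a stripped-down sketch of the idea; the sampler and uniform names are just placeholders, not the actual names from my patch):

#extension GL_ARB_texture_rectangle : enable

uniform sampler2DRect captureTex;   // the capture texture the scene is being rendered into
uniform float amount;               // how strongly to react to what's already there

void main()
{
    // read whatever has already been drawn at this fragment's screen position
    vec4 dst = texture2DRect(captureTex, gl_FragCoord.xy);

    // the incoming fragment (a flat color here, just for illustration)
    vec4 src = gl_Color;

    // a custom "difference-ish" blend as an example response
    gl_FragColor = mix(src, abs(src - dst), amount);
}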
And... it works!
...but as far as I can tell it ONLY works properly when the capture texture is copied to a matrix directly before drawing each object. The matrix doesn't even have to be used: it's the act of copying the capture texture that somehow changes its nature...! I want to know what that change is, and whether it can be triggered without copying.
Here's an example with comments...
I'm really curious what's going on here...!
-t
And, just in case it's impossible to read from and write to the same texture in a proper way, here's my solution to the blending issue using a copy of the capture texture. I don't particularly like copying the whole screen buffer after each drawn object, though:
yo tarik, i couldn't tell you exactly what's going on here, but i imagine it has something to do with reading from the capture texture you're currently writing to, and some synchronization issues between the cpu and gpu.
FWIW, i wondered if, instead of copying the texture, triggering a glflush message to a jit.gl.sketch would do the trick. it turns out that simply banging a jit.gl.sketch @automatic 0 in place of the texture copy is all that's needed to get this working on my windows laptop.
Your mac issues are likely due to the shader not compiling on mac (the mac glsl compiler is much less forgiving than windows about syntax issues): the second arg to texture2DRect must be a vec2. That said, i do see different behavior on the mac. to get it working you don't need any texture copy, but you do need depth_write, depth_enable, and depth_clear all enabled, and to get rid of a final little glitch i had to add a [ jit.gl.camera vMain @far_clip 0 ]. there are clearly some different driver behaviors in play here, and not much i can say beyond that.
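i don't know exactly what your shader passes there, but the classic version of that gotcha looks something like this (made-up snippet, just to show the signature):

// some windows drivers let this slide, the mac compiler won't:
// vec4 dst = texture2DRect(captureTex, gl_FragCoord);    // gl_FragCoord is a vec4
vec4 dst = texture2DRect(captureTex, gl_FragCoord.xy);     // explicit vec2 compiles everywhere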
You may be able to suss out some more details by running in gl3 and launching from RenderDoc on windows (you can do similar profiling on Mac in both gl2 and gl3 using openglprofiler, but RenderDoc is a bit more user friendly). running your patch on gl3 requires a different solution: this time banging a jit.gl.slab in place of the texture copy does the trick. I took a look at the output and it looks like a call to glClear triggered by the gl.slab is cleaning things up.
totally strange and confounding behavior, but it's also a strange little trick you're performing.
oh boy, you gave me a lot of material to dig into - this'll be fun!
I'll let you know my findings soon, thanks!
Brilliant, yes: of course glflush did the trick in gl2, and indeed, very strangely, banging a random slab (i love how that sounds) did it in gl3 too.
...a strange little trick indeed, but with your help it's become a pretty elegant way to finally drag blend modes out of the fixed graphics pipeline and have some fun with them. Think, for instance, of using the normals of shapes in the foreground to distort the shapes behind them, like looking through strangely curved glass, that kind of stuff. And then THOSE images can in turn be distorted by other objects even closer to us. Just an example... it's generally just an awesome way to create things that sit right in between the worlds of post-processing and regular 3d rendering.
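The fragment-shader end of that curved-glass idea is basically just offsetting the capture-texture lookup by the surface normal. Very rough sketch (names are placeholders again, and the normal would come from the vertex program):

#extension GL_ARB_texture_rectangle : enable

uniform sampler2DRect captureTex;   // whatever has been rendered behind this object so far
uniform float strength;             // how strongly the "glass" bends the background

varying vec3 worldNormal;           // interpolated surface normal from the vertex program

void main()
{
    // shift the lookup position based on the normal's screen-plane components
    vec2 offset = worldNormal.xy * strength;
    vec4 behind = texture2DRect(captureTex, gl_FragCoord.xy + offset);

    // slight tint so the glass reads as a surface and not just a hole
    gl_FragColor = vec4(behind.rgb * vec3(0.95, 0.98, 1.0), 1.0);
}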
Also: I had no idea that something like RenderDoc existed!! That's really cool. I'll be checking it out over the next few days.
thanks!