Porting Shaders from Shadertoy
I have been inspired by the site Shadertoy, and am trying to port some of the simpler shaders from Shadertoy into .jxs files for use within Max for Live.
Shadertoy makes use of some standard uniforms:
uniform vec3 iResolution; // viewport resolution (in pixels)
uniform float iGlobalTime; // shader playback time (in seconds)
uniform float iChannelTime[4]; // channel playback time (in seconds)
uniform vec3 iChannelResolution[4]; // channel resolution (in pixels)
uniform vec4 iMouse; // mouse pixel coords. xy: current (if MLB down), zw: click
uniform samplerXX iChannel0..3; // input channel. XX = 2D/Cube
uniform vec4 iDate; // (year, month, day, time in seconds)
Are there similar uniforms within jitter?
I guess sampler2D and sampler2DRect are analogous to the iChannel uniforms? What about the resolution and time uniforms?
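From poking at existing .jxs files, my guess at a rough mapping - texcoord0/texdim0 seem to come from the standard vertex programs, and time looks like it has to be sent in as a param message, since I can't find a built-in clock (all names here are just my guesses from examples I've seen):

```xml
<jittershader name="shadertoy-port">
	<!-- no built-in clock: drive this with "param time $1" from a metro + counter -->
	<param name="time" type="float" default="0.0" />
	<!-- texture units stand in for iChannel0/iChannel1 -->
	<param name="tex0" type="int" default="0" />
	<param name="tex1" type="int" default="1" />
	<language name="glsl" version="1.0">
		<bind param="time" program="fp" />
		<bind param="tex0" program="fp" />
		<bind param="tex1" program="fp" />
		<program name="vp" type="vertex" source="sh.passthrudim.vp" />
		<program name="fp" type="fragment">
<![CDATA[
uniform float time;            // ~ iGlobalTime
uniform sampler2DRect tex0;    // ~ iChannel0
varying vec2 texcoord0;        // ~ gl_FragCoord.xy (pixel coords)
varying vec2 texdim0;          // ~ iResolution.xy

void main() {
	gl_FragColor = texture2DRect(tex0, texcoord0);
}
]]>
		</program>
	</language>
</jittershader>
```

Does that look roughly right?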
Also, is it possible to pass in/reference an audio signal? I would really like to try to create shaders that react to an audio signal playing within Live.
A simple example of a shader -> Gen port with audio input.
Thanks so much for this...
I have found a number of examples of shaders that have been recreated as max patches but none that have an audio input.
Do you know if it is possible to reference audio from within the fragment program (fp) part of a .jxs file that is loaded into a jit.gl.slab object?
Not sure what you mean.. Can you explain that a bit better?
I am such a n00b - apologies if I struggle to explain myself well.
I have become inspired by the rather excellent site www.shadertoy.com, and have been thinking about the best way to port some of the shaders into m4l, particularly the ones that are responsive to audio. I guess that Shadertoy acts as an environment that can host shaders, and passes in certain variables (the uniforms listed in my first post).
I have been using the excellent V-Module suite created by Fabrizio Poce. He has created a few devices that you can use to load up .jxs files. What I would really like to do is to modify one of these devices to perform a similar job to the Shadertoy website, i.e. I can load in a .jxs file containing vp/fp programs ported from Shadertoy, and the device will feed in certain uniforms, textures, and audio signals that the shaders can process.
I guess I would be creating an audio effect device in m4l?
Looking at this example
the audio seems to be passed in as a texture into the fp code - I am not really sure how that would work in the world of jitter/m4l.
I hope that clarifies what I am trying to achieve - I really hope you can help point me in the right direction.
If you want to go the "old school way" and work with jit.gl.slab, you will have to adapt your GLSL shader. That means you need to write the corresponding XML header and add a second texture for your audio input (replacing the iChannel0 from the Shadertoy example).
Use the "@inputs 2" attribute to get another input for your jit.gl.slab object.
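Untested sketch of the relevant additions - tex1 here stands in for the audio texture, and texdim1 assumes you're using one of the standard passthrudim vertex programs:

```glsl
// add to the .jxs header:
//   <param name="tex1" type="int" default="1" />
//   <bind param="tex1" program="fp" />
// then in the fragment program:
uniform sampler2DRect tex1;   // audio texture, replacing iChannel0
varying vec2 texdim1;         // its dimensions

float audioSample(float x) {
	// read one row of the audio texture, x normalized 0..1 along the buffer
	return texture2DRect(tex1, vec2(x * texdim1.x, 0.5)).r;
}
```

Then instantiate the slab as jit.gl.slab @inputs 2 and send the audio matrix to the second inlet.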
Hope that helps.
you can turn audio into a texture by first sampling it into a jit.matrix, and then sending that matrix to the input of either jit.gl.pix or jit.gl.slab.
lots of info on turning audio into a Jitter matrix by searching the forum and tutorial articles, and checking out the following examples folder:
Max 6.1/examples/jitter-examples/audio/
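one possible chain, from memory (check the examples folder above for the exact attribute names):

```
[plugin~]  (or adc~ / whatever your audio source is)
     |
[jit.catch~ @framesize 512]      <- grabs signal vectors into a 1-plane float32 matrix
     |
[jit.matrix 1 float32 512]       <- optional: pin down dims/planecount
     |
[jit.gl.slab ctx @file foo.jxs @inputs 2]   <- into the slab's 2nd inlet
```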
So I have managed to create a version of the following shader, which adds a nice pixellation effect to an incoming texture.
The shader certainly displays the nice grid of pixels, and these change colour as the incoming texture (tex0) changes.
Unfortunately, each of these 'pixels' is identical. It is as if the shader is producing an average colour value and applying it to every coordinate, rather than producing different values for each tex0 coordinate.
Any ideas on how to get this shader working with the texture as it does on the Shadertoy site?
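One thing worth checking (just a guess, since I can't see your port): Shadertoy's texture2D takes normalized 0..1 coordinates, while jit.gl.slab's rect textures take pixel coordinates. Sampling a sampler2DRect with a 0..1 uv reads only the bottom-left corner of the texture, which would give exactly that one-colour-everywhere look. The port usually needs a multiply by texdim0:

```glsl
// shadertoy original:
//   vec3 c = texture2D(iChannel0, uv).rgb;       // uv normalized to 0..1
// jitter rect-texture equivalent (sketch):
vec3 c = texture2DRect(tex0, uv * texdim0).rgb;   // scale uv up to pixel coords
```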
hi
the shader :
Thanks for this LLT
This version just gives me a black screen
I can see you replaced the texdim0 variable with an iResolution uniform. What is the logic behind that?
Hi ARTOO,
"I can see you replaced the texdim0 variable with an iResolution uniform. What is the logic behind that?"
There is no logic, it was just to keep the functions Shadertoy-style ;)
I have a problem with that version of the shader - it does not work as I would like. I made a version in a jit.gl.pix codebox, and that one works well.
OK, after a few months of being tied up on other projects, I really want to get to grips with shaders in Jitter.
In Shadertoy, most shaders start with a line a bit like this:
vec2 uv = gl_FragCoord.xy / iResolution.xy - 0.5;
In Jitter, textures are sampled using sampler2DRect/texture2DRect.
It seems to me that the Shadertoy fragment programs operate on a single rectangle defined in the vertex program. Other textures are then sampled and integrated into the shader later in the code.
I am struggling to replicate this in my own shaders - I can sample textures OK, and I can get some of the procedural graphics to display - just not at the same time.
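For what it's worth, the pattern I would expect to work is: build the Shadertoy-style uv from texcoord0/texdim0, do the procedural part with that, sample the webcam with the unnormalized rect coordinate, and combine at the end. A minimal sketch (assuming the varyings provided by sh.passthrudim.vp; the ring pattern is just a stand-in for the procedural code):

```glsl
uniform sampler2DRect tex0;   // the webcam texture
varying vec2 texcoord0;       // pixel coords from the vertex program
varying vec2 texdim0;         // texture dimensions

void main() {
	// normalized, centred coordinate - the shadertoy-style "uv"
	vec2 uv = texcoord0 / texdim0 - 0.5;
	// placeholder procedural graphics: concentric rings
	vec3 proc = vec3(0.5 + 0.5 * sin(40.0 * length(uv)));
	// rect textures are sampled with pixel coords, not normalized ones
	vec3 cam = texture2DRect(tex0, texcoord0).rgb;
	gl_FragColor = vec4(mix(cam, proc, 0.5), 1.0);
}
```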
I attach a couple of shaders that show this. I have been plugging in my webcam as a source for the sampled texture.
Both of these shaders display some graphics - if I wave my hand in front of my webcam, things change on the screen - I just don't know what is missing, or what I have done wrong.
Original Matrix Rain Shader can be found here
So I have made some progress porting over some of these shaders, but really only the fully procedural shaders. I am loading them in to Fabrizio Poce's Effect device from his wonderful V-Module suite of M4L devices - this makes it easy for me to pass in up to two different textures.
I still have a couple of questions. The original shaders on Shadertoy are all essentially fragment shaders that are applied to a rectangular piece of geometry. I have been applying these shaders to one of the textures connected to Fab's device (texcoord0), but this feels a bit hacky. I would love to know how to tidy things up and refer to the built-in geometry fragment rather than a texture.
I would also love to be able to implement some filters for incoming textures, but I don't seem to be having any success. Two nice ones are Crosshatch and Trixels.
I attach my .jxs files that don't work - I think it is something to do with the values being passed through to the fragment program from the vertex program.
Any assistance would be greatly appreciated - standing on the shoulders of giants etc etc
Artoo - could you share some of the shaders you ported? Trying to learn the ins and outs of shaders into codebox as well, it is tricky.
Thanks! :)
Hi there. Check the wiki entry on porting Shadertoy to jit.gl.pix.
I only got to tut 15 before my nooby skills abandoned me but they all start from the ground up.
Andro - thanks for this, I will try to have a look at it soon. To be honest I am more interested in loading the GLSL shaders directly into Max for Live/Jitter.
Madie - do you have v-module installed? the shaders I have work well with Fabrizio's framework - if you have it installed I will gladly share what I have got
Andro - Ah thank you, this is super helpful. Especially the dim/cell/norm knowledge.
Artoo - Yes, I do! :)
You need to save the folder Jitter Shaders Artoo somewhere in your Max path - I save it in the V-Module Max Externals folder.
Then add the amended M4L device to your project - it needs a texture input (I use Fab's texture device loaded with a flat image).
Then connect it to your mixer and enjoy!!
Bump!!! Any of the megabrains able to help me?
what is your question?
I am using Fabrizio Poce's excellent V-Module suite - and have been having fun using one of his devices to host shaders from shadertoy.
These are mainly procedural animations etc.
The shadertoy shaders all use the following vertex shader:-
attribute vec2 pos;
void main() { gl_Position = vec4(pos.x,pos.y,0.0,1.0); }
They usually have a line near the beginning, something like this:-
vec2 xy = -1.0 + 2.0*gl_FragCoord.xy/iResolution.xy;
I discovered that if I had an 'input' texture in Fab's V-FX-GL-MultiEffect device then I could replace these values with texcoord0 and texdim0 to get:-
vec2 xy = -1.0 + 2.0*texcoord0.xy/texdim0.xy;
and the shader will compile, using the sh.passthrudm.vp vertex program. And what fun I have had playing around with some of these shaders. I attach a couple of examples (string theory and rippled darkness)
My first question is this
Is there a way for me to dispense with the texcoord0 & texdim0 and apply the Fragment Program directly to a rectangular piece of geometry, rather than rather hackily applying it to an input texture? At the moment I am having to use an input texture to make things work and that seems very inelegant.
My second question relates to Shadertoy filters such as Crosshatch and Trixels. I have been using an input texture but cannot seem to get these working. Any ideas on where my shaders are going wrong? I attach my non-working .jxs files.
you need to input a texture or matrix to gl.slab one time, in order to set the dimensions and initialize the internal resources. after that you simply need to bang the object every time you want it to update:
to get your shaders working, you're just going to have to debug them. start by removing all the code except the texture sampler and the gl_FragColor, and then put it back in piece by piece until you find what's not working.
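something as small as this is a good starting point (assuming the standard texcoord0 varying from the passthru vertex programs):

```glsl
uniform sampler2DRect tex0;
varying vec2 texcoord0;

void main() {
	// if this passes the texture through untouched, the plumbing is fine
	gl_FragColor = texture2DRect(tex0, texcoord0);
}
```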
Has anybody had any success in porting the rest of the shaders from the tutorial? I only managed to port tut 15 haha, the rest of them eluded me..