Slit-scan time warp map
Max comes with a few examples of the slit scan technique, and I have also developed my own.
But I want to expand the slit scan technique with a "time warp map".
In "traditional" slit scan, only one line of the input video is drawn each frame.
With a "time warp map", every pixel is drawn, but each pixel has its own delay, as determined by a black-and-white mask.
An example of its implementation: http://jamesgeorge.org/ofxslitscan/
An example of its use: https://vimeo.com/10825604
Can this be easily built in Max 6?
The only solution I can imagine is an input three-dimensional matrix where z is time.
The output is a two-dimensional matrix where each pixel is selected from the third dimension as set by a second char matrix.
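Outside Max, the per-pixel lookup described here can be sketched with plain arrays standing in for Jitter matrices (a minimal sketch; `timeWarp`, `frames`, and `delayMap` are illustrative names, not Jitter API):

```javascript
// Sketch of the time-warp lookup: a 3D buffer of frames (z = time) and a
// char delay map choose, per pixel, which past frame to sample.
// Plain nested arrays stand in for Jitter matrices.

// frames[t][y][x] : grayscale frame history, frames[0] = most recent
// delayMap[y][x]  : 0..255 char map, scaled to the number of stored frames
function timeWarp(frames, delayMap) {
  const depth = frames.length;
  const h = delayMap.length, w = delayMap[0].length;
  const out = [];
  for (let y = 0; y < h; y++) {
    out.push([]);
    for (let x = 0; x < w; x++) {
      // Map the 0..255 mask value onto 0..depth-1 frames of delay
      const t = Math.min(depth - 1, Math.floor(delayMap[y][x] * depth / 256));
      out[y].push(frames[t][y][x]);
    }
  }
  return out;
}
```

A mask value of 0 samples the newest frame and 255 samples the oldest, so a grayscale gradient mask produces a smooth sweep through time across the image.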
Check out the following example patch, which does exactly this:
Enable usedstdim on the jit.gl.texture to start the time-warp effect.
Ok, wow, thanks :)
If I understand correctly, "subtex" lets you change a portion of a texture.
The td.plane3d.jxs shader lets you "reduce" a 3D texture to a 2D texture according to a "map".
I made something like this a few years ago using JS and a custom shader. I’m pretty sure I posted it somewhere in the forums, but not sure where. Anyways, the way I did it was to create a texture/videoplane for each delay frame and then draw each of those planes with an alpha mask determined by comparing the master delay mask to a threshold. I did it this way specifically because I was working on a 3D effect so I wanted some depth. Here it is if you want to have a look.
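Reduced to per-pixel arithmetic, that layered approach can be sketched like this (a sketch only, with plain arrays in place of textures and videoplanes; `compositeLayers` is an illustrative name): each delay step's plane is drawn over the previous ones wherever the master mask passes that step's threshold.

```javascript
// Sketch of the layered technique: one plane per delay step, each drawn
// over the earlier ones wherever the master delay mask passes that
// step's threshold. Arrays stand in for textures; illustrative only.

// frames[d][y][x] : frame stored at delay step d
// delayMap[y][x]  : 0..255 master delay mask
function compositeLayers(frames, delayMap) {
  const depth = frames.length;
  const h = delayMap.length, w = delayMap[0].length;
  const out = Array.from({ length: h }, () => new Array(w).fill(0));
  for (let d = 0; d < depth; d++) {
    const threshold = d * 256 / depth; // plane d is opaque where mask >= threshold
    for (let y = 0; y < h; y++) {
      for (let x = 0; x < w; x++) {
        if (delayMap[y][x] >= threshold) out[y][x] = frames[d][y][x];
      }
    }
  }
  return out; // the deepest passing layer wins, like stacked alpha-masked planes
}
```

Because later (deeper-delay) planes overwrite earlier ones only where their threshold passes, the final image matches a direct per-pixel lookup, while still giving each delay step its own drawable plane, which is what makes the 3D/depth version possible.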
This is stunning. Can anyone with some js chops hack the ubiquitous jit.gl.handle and esc-to-fullscreen commands onto this? It seems to already have some jit.gl.handle in it – it generates the colored lines when you click, but it doesn’t move. Rotating this slightly would be insane.
I’ve got a question about the subtex.3d.maxpat example. In the patch, a subtex_matrix message (with dstdimstart and dstdimend) is used to fill one plane of the 3D texture with a matrix.
Is there a way to fill one plane of the 3D texture with a 2D texture?
In my work I need to record 10 seconds (600 frames) of a render to play back later, and I do not want to go through the matrix domain (the framerate drops too much).
Otherwise, is it possible to get the result of the td.plane3d.jxs shader into a jit.gl.texture? It doesn’t seem to work within a jit.gl.slab (no 3D texture input).
In fact I want to send a texture, not a matrix, to the 3D buffer (via the subtex_matrix message).
Is it possible to do that?
My source is a live capture from a jit.gl.camera, and I want to store the 600 last frames, without converting the jit.gl.camera texture to a matrix before sending it to the 3D buffer.
this is not currently possible.
OK, thanks for the advice.
Here is a patch that creates an OpenGL texture delay from a texture source using js (adapted from Andrew’s patch above).
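The core of such a texture delay is just a ring buffer of the last N frames. A minimal sketch in plain JavaScript (`FrameDelay` is an illustrative name; in the actual patch the slots would hold jit.gl.texture objects rather than array entries):

```javascript
// Sketch of a fixed-size frame delay line (ring buffer): store the last
// N frames and read any of them back by delay offset. Illustrative only;
// a Max js implementation would store textures in the slots instead.
class FrameDelay {
  constructor(capacity) {
    this.capacity = capacity;   // e.g. 600 frames for 10 s at 60 fps
    this.frames = new Array(capacity).fill(null);
    this.head = 0;              // index of the most recent frame
    this.count = 0;             // how many frames have been written so far
  }
  write(frame) {
    this.head = (this.head + 1) % this.capacity;
    this.frames[this.head] = frame;
    this.count = Math.min(this.count + 1, this.capacity);
  }
  // delay 0 = newest frame, delay capacity-1 = oldest; clamps to what exists
  read(delay) {
    const d = Math.min(delay, this.count - 1);
    return this.frames[(this.head - d + this.capacity) % this.capacity];
  }
}
```

Writing advances the head and overwrites the oldest slot, so memory use is fixed at N frames no matter how long the capture runs.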
I’ve got a feature request: a new jit.gl.textureset external with the same behavior as jit.matrixset, but in OpenGL. I think such an object could perform better than js or poly~ workarounds.
this is great. thanks for sharing.
I’ve been working on hacking Andrew’s patch, JS and shader into a Jamoma module. I’m attaching what I have so far. Any feedback is appreciated, and please feel free to use the module if you like!
I also tried drawing the videoplanes to their own jit.world @shared 1 @output_texture 1 which worked so long as my input (here, jit.grab) was set to draw to the same context, but this doesn’t really seem like a solution, because I’d like to be able to feed in video from my main render context, process it through the mapped delay (which would have to be in its own render context), and then get it back to the main render context.
It’s got to the point that I’ve been trying to use jit.gl.syphonserver and jit.gl.syphonclient locally within the patch to get the video between render contexts, which, suffice to say, hasn’t been successful. After bashing my head against this for quite a while, I’m ready to admit that I’m not very proficient with JS. If anyone out there would be willing to take a look and see if it’s possible to get the video back out to the main render context as textures, I’d be very grateful. Ideally it would be possible to switch between 2d and 3d output, along the lines of what’s going on with the torus and sphere in the jit.gl.node help patch – that’d be huge!