Slit scan time warp map

mrtof:

Max comes with a few examples of the slit scan technique, and I have also developed my own.
But I want to expand the slit scan technique with a "time warp map".

In "traditional" slit scan , only one line of the input video is drawn each frame.

With a "time warp map", every pixel is drawn, but each pixel has it own delay as determined by a black and white mask.
An example of its implementation : http://jamesgeorge.org/ofxslitscan/
An examples of its use : https://vimeo.com/10825604
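To make the contrast concrete, here is a minimal sketch of both techniques in plain JavaScript, with small 2-D number arrays standing in for video frames (all names and dimensions are illustrative, not Max API):

```javascript
// Minimal sketch: grayscale frames as 2-D arrays; dimensions are illustrative.
const W = 4, H = 4, DEPTH = 4; // width, height, frames of history kept

function makeFrame(value) {
  return Array.from({ length: H }, () => new Array(W).fill(value));
}

// history[t] holds the frame from t steps ago (t = 0 is the newest frame)
const history = Array.from({ length: DEPTH }, (_, t) => makeFrame(t));

// Traditional slit scan: row y of the output comes from the frame
// that was current y steps ago, so only one "new" line appears per frame.
function slitScan(history) {
  const out = makeFrame(0);
  for (let y = 0; y < H; y++)
    for (let x = 0; x < W; x++)
      out[y][x] = history[Math.min(y, DEPTH - 1)][y][x];
  return out;
}

// Time-warp map: every pixel is drawn, each with its own delay
// looked up from a per-pixel delay map.
function timeWarp(history, delayMap) {
  const out = makeFrame(0);
  for (let y = 0; y < H; y++)
    for (let x = 0; x < W; x++)
      out[y][x] = history[delayMap[y][x]][y][x];
  return out;
}
```

With this setup, `slitScan` reproduces the classic "one line per frame" look, while `timeWarp` generalizes it: a gradient delay map recovers the slit-scan effect, and an arbitrary mask warps time per pixel.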

Can this be easily built in Max 6?

mrtof:

The only solution I can imagine is an input three-dimensional matrix where z is time.
The output is a two-dimensional matrix where each pixel is selected from the third dimension, as set by a second char matrix.
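That idea can be sketched directly: a char mask value (0..255) is scaled to a frame index into the 3-D buffer's z dimension. Plain arrays stand in for Jitter matrices here; the function name and dimensions are illustrative:

```javascript
// Sketch of the 3-D buffer idea: z = time, and a second char matrix
// (values 0..255) selects which z-slice each output pixel reads from.
function warpFromCharMap(buffer3d, charMap) {
  const depth = buffer3d.length;               // number of stored frames
  const height = charMap.length;
  const width = charMap[0].length;
  const out = [];
  for (let y = 0; y < height; y++) {
    out.push([]);
    for (let x = 0; x < width; x++) {
      // scale the char value 0..255 onto the frame range 0..depth-1
      const z = Math.floor((charMap[y][x] / 255) * (depth - 1));
      out[y].push(buffer3d[z][y][x]);
    }
  }
  return out;
}
```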

Rob Ramirez:

check out the following example patch, which does exactly this:
/Applications/Max 6.1/examples/jitter-examples/render/Textures/subtex.3d.maxpat

enable usedstdim on the gl.texture to start the time-warp effect.

mrtof:

Ok, wow, thanks :)
If I understand correctly, "subtex" allows you to change a portion of a texture.
The td.plane3d.jxs shader allows you to "reduce" a 3D texture to a 2D texture according to a "map".

Andrew Benson:

Hiya,
I made something like this a few years ago using JS and a custom shader. I'm pretty sure I posted it somewhere in the forums, but I'm not sure where. Anyway, the way I did it was to create a texture/videoplane for each delay frame and then draw each of those planes with an alpha mask determined by comparing the master delay mask to a threshold. I did it this way specifically because I was working on a 3D effect, so I wanted some depth. Here it is if you want to have a look.
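The layered approach can be summarized in a small sketch: one "plane" per delay frame, each visible only where the delay mask falls in that layer's threshold band. Grayscale arrays stand in for textures, and the compositing is reduced to a per-pixel select (the names and band scheme are illustrative, not Andrew's exact code):

```javascript
// Sketch of the layered technique: layer i (the frame from i steps ago)
// is drawn only where the delay mask value lands in band [i/depth, (i+1)/depth).
function compositeLayers(history, mask) {
  const depth = history.length;
  const h = mask.length, w = mask[0].length;
  const out = Array.from({ length: h }, () => new Array(w).fill(0));
  for (let i = 0; i < depth; i++) {
    const lo = i / depth, hi = (i + 1) / depth;  // this layer's threshold band
    for (let y = 0; y < h; y++)
      for (let x = 0; x < w; x++) {
        const m = mask[y][x];                    // mask value in 0..1
        // last layer also claims m === 1 so the top of the range is covered
        if (m >= lo && (m < hi || (i === depth - 1 && m <= 1)))
          out[y][x] = history[i][y][x];          // layer i is visible here
      }
  }
  return out;
}
```

In the GL version each band test happens in a shader per plane, and because the planes are real 3D objects the stack can be rotated for a depth effect.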

5399.jsgldelay.zip
Wetterberg:

This is stunning. Can anyone with some js chops hack the ubiquitous jit.gl.handle and esc-to-fullscreen commands onto this? It seems to already have some jit.gl.handle code in it; it generates the colored lines when you click, but nothing moves. Rotating this slightly would be insane.

JMC:

I've got a question about the subtex.3d.maxpat example. In the patch, a subtex_matrix message (with dstdimstart and dstdimend) is used to fill one plane of the 3D texture with a matrix.
Is there a way to fill one plane of the 3D texture with a 2D texture?
In my work I need to record 10 seconds (600 frames) of a render to play back later, and I would rather not drop down to the matrix side (the framerate decreases too much).
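The recording scheme itself is just a ring buffer over the z dimension: each new frame overwrites one slice, the way subtex_matrix with dstdimstart/dstdimend targets a single slice. A minimal sketch, with plain values standing in for frames and all names illustrative:

```javascript
// Ring-buffer sketch of recording N frames into a fixed-depth 3-D buffer:
// `head` is the next z-slice to overwrite.
function makeRecorder(depth) {
  return { slices: new Array(depth).fill(null), head: 0, depth };
}

function record(rec, frame) {
  rec.slices[rec.head] = frame;            // overwrite the oldest slice
  rec.head = (rec.head + 1) % rec.depth;   // advance the write position
}

// A delay of d frames reads d slices behind the most recent write.
function readDelayed(rec, d) {
  const idx = (rec.head - 1 - d + 2 * rec.depth) % rec.depth;
  return rec.slices[idx];
}
```

For the 600-frame case above, `makeRecorder(600)` with one `record` per rendered frame keeps a rolling 10 seconds at 60 fps.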

Otherwise, is it possible to get the result of the td.plane3d.jxs shader into a jit.gl.texture? It doesn't seem to work for me within a jit.gl.slab (there is no 3D texture input).

Thanks,

Jean-Michel

Rob Ramirez:

you can adapt the plane3d shader to a simpler version, allowing you to sample a single 2d texture from a 3d buffer.
i've attached an example.

subtex3d-simple.zip
JMC:

Thanks!
In fact I want to send a texture, not a matrix, to the 3D buffer (the subtex_matrix message takes a matrix).
Is it possible to do that?
My source is a live capture from a jit.gl.camera, and I want to store the last 600 frames without converting the jit.gl.camera texture to a matrix before sending it to the 3D buffer.

JMC

Rob Ramirez:

this is not currently possible.
instead you can use poly~ or javascript to manage your texture array.

JMC:

OK, thanks for the advice.
Here is a patch that creates an OpenGL texture delay from a texture source using js (adapted from Andrew's patch above).

I've got a feature request: a new jit.gl.textureset external with the same behavior as jit.matrixset, but in OpenGL. I think such an object could perform better than the js or poly~ approaches.

JM

gldelay.maxpat
gldelay.js
Rob Ramirez:

this is great. thanks for sharing.

syrinx:

I've been working on hacking Andrew's patch, JS and shader into a Jamoma module. I'm attaching what I have so far. Any feedback is appreciated, and please feel free to use the module if you like!

I'm having a hell of a time with one thing. Currently, this module has to be an end-of-chain effect, at least in terms of 2d video processing, in that it's creating a bunch of 3d objects (a stack of videoplanes). I'd like to be able to capture the videoplanes to a single texture so that the output could be processed further as 2d video via shaders. No matter how I try to do this, though, I'm getting no joy. I've tried using a jit.gl.node outside of the Javascript, as demonstrated in this thread. I've also tried building the jit.gl.node into the JS, as discussed here. No joy with either of these methods. The issue seems to be specifically with the jit.gl.shader loaded via the JS - if I comment it out, I can capture the videoplanes, albeit without the mapped delay effect.

I also tried drawing the videoplanes to their own jit.world @shared 1 @output_texture 1 which worked so long as my input (here, jit.grab) was set to draw to the same context, but this doesn't really seem like a solution, because I'd like to be able to feed in video from my main render context, process it through the mapped delay (which would have to be in its own render context), and then get it back to the main render context.

It's got to the point that I've been trying to use jit.gl.syphonserver and jit.gl.syphonclient locally within the patch to get the video between render contexts, which, suffice to say, hasn't been successful. After bashing my head against this for quite a while, I'm ready to admit that I'm not very proficient with JS. If anyone out there would be willing to take a look and see if it's possible to get the video back out to the main render context as textures, I'd be very grateful. Ideally it would be possible to switch between 2d and 3d output, along the lines of what's going on with the torus and sphere in the jit.gl.node help patch — that'd be huge!

syrinx:

Sorry, here's the attachment.

gl_timespace.zip