Forums > MaxMSP

Slit scan time warp MAP

Apr 09 2013 | 3:32 pm

Max comes with a few examples of the slit scan technique, and I have also developed my own.
But I want to expand the slit scan technique with a "time warp map".

In "traditional" slit scan, only one line of the input video is drawn each frame.

With a "time warp map", every pixel is drawn, but each pixel has its own delay, as determined by a black-and-white mask.
An example of its implementation :
An example of its use :
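The idea can be sketched on the CPU with numpy (a hedged illustration only; the buffer depth, frame size, and variable names here are mine, not from any patch). The last N frames live in a ring buffer, and a grayscale mask decides, per pixel, how far back in time to read:

```python
import numpy as np

N, H, W = 8, 4, 4                      # history depth and frame size (arbitrary)
buffer = np.zeros((N, H, W), dtype=np.uint8)
head = 0                               # ring-buffer index of the newest frame

def push(frame):
    global head
    head = (head + 1) % N
    buffer[head] = frame

def warp(delay_map):
    # delay_map: HxW uint8 mask; 0 = newest frame, 255 = oldest
    delay = (delay_map.astype(np.int32) * (N - 1)) // 255
    idx = (head - delay) % N           # per-pixel frame index into the ring
    y, x = np.indices((H, W))
    return buffer[idx, y, x]

# Feed in frames whose pixels equal the frame number, so the output
# reveals which frame each pixel was read from.
for t in range(N):
    push(np.full((H, W), t, dtype=np.uint8))

mask = np.zeros((H, W), dtype=np.uint8)
mask[:, 2:] = 255                      # right half delayed by N-1 frames
out = warp(mask)                       # left half shows 7, right half shows 0
```

A GPU version does the same lookup in a shader, with the history stored as a 3D texture.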

Can this be easily built in Max 6?

Apr 09 2013 | 3:38 pm

The only solution I can imagine is a three-dimensional input matrix where z is time.
The output is a two-dimensional matrix where each pixel is selected from the third dimension as set by a second char matrix.
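That indexing can be written directly in numpy (a minimal sketch; the dimensions and names are illustrative, not from Jitter):

```python
import numpy as np

Z, H, W = 4, 2, 3
volume = np.arange(Z * H * W).reshape(Z, H, W)   # (z, y, x): z is time
zmap = np.array([[0, 1, 2],
                 [3, 2, 0]], dtype=np.uint8)     # the "second char matrix"

# out[y, x] = volume[zmap[y, x], y, x]
out = np.take_along_axis(volume, zmap[None, :, :].astype(np.intp), axis=0)[0]
```

Each output cell pulls its value from the time slice named by the char matrix at that position.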

Apr 09 2013 | 5:47 pm

check out the following example patch, which does exactly this:
/Applications/Max 6.1/examples/jitter-examples/render/Textures/subtex.3d.maxpat

enable usedstdim on the gl.texture to start the time-warp effect.

Apr 10 2013 | 5:15 pm

Ok, wow, thanks :)
If I understand correctly, "subtex" allows you to change a portion of a texture.
The td.plane3d.jxs shader allows you to "reduce" a 3D texture to a 2D texture by following a "map".

Apr 10 2013 | 6:48 pm

I made something like this a few years ago using JS and a custom shader. I'm pretty sure I posted it somewhere in the forums, but I'm not sure where. Anyway, the way I did it was to create a texture/videoplane for each delay frame and then draw each of those planes with an alpha mask determined by comparing the master delay mask to a threshold. I did it this way specifically because I was working on a 3D effect, so I wanted some depth. Here it is if you want to have a look.
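On the CPU, the stacked-plane approach described above reduces to drawing each delayed frame through an equality (threshold-band) mask. A hedged numpy sketch, with made-up frame contents and sizes:

```python
import numpy as np

N, H, W = 4, 2, 2
# One "videoplane" per delay step; pixel values mark the step (d * 10).
frames = [np.full((H, W), d * 10, dtype=np.uint8) for d in range(N)]
mask = np.array([[0, 1],
                 [2, 3]], dtype=np.uint8)        # per-pixel delay step

out = np.zeros((H, W), dtype=np.uint8)
for d in range(N):
    alpha = (mask == d)                # this plane's alpha: its threshold band
    out[alpha] = frames[d][alpha]      # composite the plane over the output
```

In GL, the same thing happens by stacking the planes in depth, which is what gives the 3D effect mentioned above.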

Apr 11 2013 | 7:49 am

This is stunning. Can anyone with some js chops hack the ubiquitous and esc-to-fullscreen commands onto this? It seems to already have some in it: it generates the colored lines when you click, but it doesn't move. Rotating this slightly would be insane.

Sep 09 2013 | 3:17 pm

I’ve got a question about the subtex.3d.maxpat example. In the patch, a subtex_matrix message (with dstdimstart and dstdimend) is used to fill one plane of the 3D texture with a matrix.
Is there a way to fill one plane of the 3D texture with a 2D texture?
In my work I need to record 10 seconds (600 frames) of a render to play back later, and I do not want to go over to the matrix side (the framerate drops too much).

Otherwise, is it possible to get the result of the td.plane3d.jxs shader into a ? It seems to me it doesn’t work within a (no 3D texture input).



Sep 10 2013 | 10:39 am

you can adapt the plane3d shader to a simpler version, allowing you to sample a single 2d texture from a 3d buffer.
i’ve attached an example.

Sep 10 2013 | 2:54 pm

In fact I want to send a texture to the 3D buffer, not a matrix (with the subtex_matrix message).
Is it possible to do that?
My source is a live capture from a , and I want to store the last 600 frames without converting the texture to a matrix before sending it to the 3D buffer.


Sep 10 2013 | 3:18 pm

this is not currently possible.
instead you can use poly~ or javascript to manage your texture array.
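What such a js/poly~ texture manager boils down to is a fixed-size history you can index by age. A sketch in Python (the class and method names are mine, not any actual Max API; in the patch each stored item would be a jit.gl.texture):

```python
from collections import deque

class FrameHistory:
    def __init__(self, n):
        self.buf = deque(maxlen=n)     # oldest items fall off automatically

    def push(self, frame):
        self.buf.append(frame)

    def get(self, age):
        # age 0 = newest frame, age n-1 = oldest available
        return self.buf[-1 - age]

hist = FrameHistory(600)               # 10 s at 60 fps, as in the post above
for t in range(700):                   # push 700 frames; only 600 are kept
    hist.push(t)
```

After 700 pushes, `hist.get(0)` returns frame 699 and `hist.get(599)` returns frame 100.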

Oct 04 2013 | 7:11 am

OK, thanks for the advice.
Here is a patch that creates an OpenGL texture delay from a texture source using js (adapted from Andrew’s patch above).

I’ve got a feature request: a new external with the same behavior as jit.matrixset, but in OpenGL. I think such an object could perform better than the js or poly~ approach.


Oct 04 2013 | 11:29 am

this is great. thanks for sharing.

May 09 2016 | 11:23 pm

I’ve been working on hacking Andrew’s patch, JS and shader into a Jamoma module. I’m attaching what I have so far. Any feedback is appreciated, and please feel free to use the module if you like!

I’m having a hell of a time with one thing. Currently, this module has to be an end-of-chain effect, at least in terms of 2d video processing, in that it creates a bunch of 3d objects (a stack of videoplanes). I’d like to be able to capture the videoplanes to a single texture so that the output could be processed further as 2d video via shaders. No matter how I try to do this, though, I’m getting no joy. I’ve tried using a outside of the JavaScript, as demonstrated in this thread. I’ve also tried building the into the JS, as discussed here. No joy with either of these methods. The issue seems to be specifically with the loaded via the JS: if I comment it out, I can capture the videoplanes, albeit without the mapped delay effect.

I also tried drawing the videoplanes to their own @shared 1 @output_texture 1, which worked as long as my input (here, jit.grab) was set to draw to the same context. But this doesn’t really seem like a solution, because I’d like to be able to feed in video from my main render context, process it through the mapped delay (which would have to be in its own render context), and then get it back to the main render context.

It’s gotten to the point that I’ve been trying to use and locally within the patch to get the video between render contexts, which, suffice it to say, hasn’t been successful. After bashing my head against this for quite a while, I’m ready to admit that I’m not very proficient with JS. If anyone out there would be willing to take a look and see if it’s possible to get the video back out to the main render context as textures, I’d be very grateful. Ideally it would be possible to switch between 2d and 3d output, along the lines of what’s going on with the torus and sphere in the help patch; that’d be huge!

May 09 2016 | 11:49 pm

Sorry, here’s the attachment.

