Tutorials

Max and OpenGL: GL Texture Delay

Whenever a frame-delay effect is called for, most Jitter programmers know to reach for the jit.matrixset object, which handles the task with ease. When working with OpenGL textures, however, no single object will do the job. The technique described below was first documented in Andrew Benson's excellent Video Processing System tutorial, which is required reading for Jitter programmers.

The delay is accomplished with two poly~ objects: one serves as a bank of jit.gl.texture objects, and the other sends out either capture messages for writing or jit_gl_texture messages for reading. The magic happens in a jit.gl.videoplane object with its automatic attribute disabled: input frames are sent to it, and its capture attribute is set to the name of one of the textures in the texture bank.
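In other words, the poly~ bank just gives each instance a numbered texture name, and the controller patch picks a read target and a write target by index. A minimal Python sketch of that selection logic (the `tex_N` naming scheme and one-texture-per-voice layout are assumptions for illustration, not details of the original patch):

```python
def bank_names(size, prefix="tex"):
    # One jit.gl.texture per poly~ voice, named by voice number.
    return [f"{prefix}_{i}" for i in range(1, size + 1)]

def delay_targets(count, delay, size):
    """For the current counter value, pick which texture to read and which to write.

    Mirrors the two message roles described above: the read name would be
    sent on as a jit_gl_texture message, the write name as a capture message.
    """
    write_i = count % size
    read_i = (write_i - delay) % size  # wrap around the bank
    names = bank_names(size)
    return names[read_i], names[write_i]
```

For example, with a 60-texture bank and a 3-frame delay, `delay_targets(10, 3, 60)` reads `tex_8` while writing `tex_11`.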

The basic steps of the delay are:

  • Send a bang message to a counter, causing the delay bank to output the frame captured delay-length frames before the current one

  • Set the current frame's slot in the delay bank for writing, by setting the jit.gl.videoplane object's capture attribute to that texture's name

  • Send the current frame to the jit.gl.videoplane object

  • Send a bang message to the jit.gl.videoplane object, causing it to capture to the aforementioned texture
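The read-then-write order of the steps above is a classic ring-buffer delay line. A small Python model of the same logic (the class, slot layout, and `tick` method are illustrative stand-ins for the Max objects, not part of the patch):

```python
class TextureDelay:
    """Ring-buffer model of the texture-bank delay.

    Each slot stands in for one jit.gl.texture in the poly~ bank;
    tick() stands in for one render bang.
    """

    def __init__(self, bank_size, delay):
        assert 0 < delay <= bank_size, "delay cannot exceed the bank size"
        self.bank = [None] * bank_size   # slot i ~ one texture in the bank
        self.delay = delay
        self.write_index = 0             # driven by the counter object

    def tick(self, frame):
        """One frame: read the delayed slot, then capture into the current slot."""
        # Read first: the frame written `delay` ticks ago
        # (None while the bank is still filling up).
        read_index = (self.write_index - self.delay) % len(self.bank)
        delayed = self.bank[read_index]
        # Then write: capture the incoming frame into the current slot.
        self.bank[self.write_index] = frame
        self.write_index = (self.write_index + 1) % len(self.bank)
        return delayed
```

With a 60-slot bank and a 60-frame delay, the first 60 ticks return nothing, after which each tick returns the frame from exactly one second earlier (at 60 FPS).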

In the example patch, the delay time is hard-coded to 60 frames (1 second at 60 FPS), but this could easily be changed. We also perform a simple multiplication of the delayed frame with the original frame.

by Rob Ramirez on September 30, 2015