The second installment of Jitter Recipes is a collection of simple examples that began as weekly posts to the MaxMSP mailing list. Here you will find some clever solutions, advanced transcoding techniques, groovy audio/visual toys, and basic building blocks for more complex processing. The majority of these recipes are specific implementations of a more general patching concept. As with any collection of recipes, you will want to take these basic techniques and personalize them for your own uses. I encourage you to take them apart, add your own touches, and make them your own.
- Using jit.buffer~ to create simple audio solutions.
- Using the jit.matrix "exprfill" message
On the right half, we use a multislider and jit.fill to provide an easy drawing interface for waveshaping. Inside the "use exprfill" subpatch, we've provided several common expressions that can be used to fill a jit.matrix with a windowing curve. This is particularly useful for granular synthesis.
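The idea behind exprfill can be sketched outside of Jitter: the expression is evaluated once per cell, with norm[0] running from 0 to 1 across the matrix. Here is the common Hann window expression 0.5 - 0.5*cos(norm[0]*TWOPI) evaluated in Python (the function name is ours):

```python
import math

def exprfill_hann(n):
    """Fill an n-cell, 1-plane float list with a Hann window,
    analogous to sending jit.matrix the message:
    exprfill 0.5-0.5*cos(norm[0]*TWOPI)
    where norm[0] = i / (n - 1) for cell i."""
    return [0.5 - 0.5 * math.cos((i / (n - 1)) * 2 * math.pi)
            for i in range(n)]
```

The result rises from 0 to 1 and back to 0, which is exactly the grain envelope shape you want for windowing in granular synthesis.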
- Recording gestural parameter data over time
The right half uses a matrix to store our parameter values, and then records frames into a jit.qt.record object.
Note that we're using jit.coerce to force our float32 matrix into an acceptable format for QuickTime (char).
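What jit.coerce does can be illustrated in plain Python: it reinterprets the raw bytes of the matrix rather than converting cell values, so each float32 cell becomes four char cells. A rough sketch using the struct module (the function name is ours):

```python
import struct

def coerce_float32_to_char(floats):
    """Reinterpret a list of float32 values as raw bytes (char cells),
    the way jit.coerce reinterprets a matrix's data without copying
    or converting: one float32 cell yields four char cells."""
    raw = struct.pack('<%df' % len(floats), *floats)
    return list(raw)
```

For example, a single cell containing 1.0 becomes the four byte values of its IEEE 754 representation, not the value 1 scaled to char range; that is the key difference between jit.coerce and a type-converting copy through jit.matrix.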
- Using Jitter to create generative video
- Using Matrix feedback and distortion to create movement
This matrix is then distorted using jit.repos and then fed back to be drawn again using our named matrix.
To generate a constantly shifting repos-matrix, we use a tactic similar to the one used for the drawing (see the bot2 subpatch). This matrix is then mixed with a matrix of normalized coordinates generated by the jit.expr object. This has the effect of gently repositioning the matrix at each frame, giving us smooth movement.
The output of jit.repos is then mirrored by sending it to jit.glue and inverting the x-dimension of the right input using jit.dimmap.
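A simplified 1-D picture of the repos-and-mirror chain, assuming clamped indexing (the Python names are ours, not Jitter messages):

```python
def repos(cells, positions):
    """Resample a list through a position map, as jit.repos does:
    each output cell pulls from the input index given by the map.
    Out-of-range positions are clamped to the valid range here."""
    n = len(cells)
    return [cells[max(0, min(n - 1, p))] for p in positions]

def mirror_x(row):
    """Glue a row to its x-inverted copy, like sending the same
    matrix to both inlets of jit.glue with jit.dimmap inverting
    the right input's x dimension."""
    return row + row[::-1]
```

An identity position map (0, 1, 2, ...) leaves the data untouched; mixing a small amount of another matrix into the map is what produces the gentle per-frame repositioning described above.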
- Using the stroke messages to jit.gl.sketch to render curves in OpenGL
- Using a matrix to generate sketch commands for procedural drawing
This matrix of point-locations is then sent to jit.iter to be broken down into 3-part lists.
For every matrix that is sent, we reset our sketch object and then begin our stroke using the "beginstroke line" message, specifying the order (or smoothness) of our curve as well as its color and width.
Each point of the matrix is then sent using the "strokepoint" message.
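The per-frame message sequence might look like the sketch below; the exact strokeparam names and argument counts should be checked against the jit.gl.sketch reference, and the defaults here are illustrative:

```python
def stroke_commands(points, order=3, color=(1.0, 1.0, 1.0)):
    """Build the message sequence sent to jit.gl.sketch for one
    frame: reset, begin the stroke, set its parameters, then one
    strokepoint per 3-part list coming out of jit.iter."""
    cmds = ["reset",
            "beginstroke line",
            "strokeparam order %d" % order,
            "strokeparam color %f %f %f" % color]
    for x, y, z in points:
        cmds.append("strokepoint %f %f %f" % (x, y, z))
    cmds.append("endstroke")
    return cmds
```

In the patch these messages are of course sent as Max messages, not strings; the list form just makes the ordering of the drawing procedure explicit.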
- Using expressions to generate values across a matrix
- Texture Mapping
- Reshaping a NURBS by setting control-point coordinates
The z-plane is left empty for us to fill in as desired to create spatial distortion.
All of these matrices are then packed into one 3-plane matrix and sent on as the "ctlmatrix" for jit.gl.nurbs.
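Here is a hedged sketch of how such a control matrix could be assembled: planes 0 and 1 carry the grid coordinates (spread over -1 to 1 here, an assumption), and the z plane comes from whatever distortion function you supply (all names are illustrative):

```python
def nurbs_ctlmatrix(cols, rows, zfunc=lambda u, v: 0.0):
    """Build a 3-plane control matrix for jit.gl.nurbs as a nested
    list of (x, y, z) tuples: planes 0 and 1 hold normalized grid
    coordinates, plane 2 holds a z displacement from zfunc(u, v)
    with u, v in 0..1."""
    ctl = []
    for j in range(rows):
        row = []
        for i in range(cols):
            u = i / (cols - 1)
            v = j / (rows - 1)
            row.append((u * 2 - 1, v * 2 - 1, zfunc(u, v)))
        ctl.append(row)
    return ctl
```

Passing, say, zfunc=lambda u, v: 0.2 * math.sin(u * 6.28) would ripple the surface along one axis while leaving the x/y grid intact.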
We use jit.gl.slab to map a movie onto our jit.gl.nurbs shape. We could also load a shader program into this object to do processing on our texture.
- Creating one's own particle system using matrix operations
- Rendering multiple copies of the same geometry
- Manipulating text using matrix operations
- Holiday visuals
The text sprites are generated by using jit.str.fromsymbol to convert the text to a matrix, which is then fed into a jit.op object that randomizes it every three seconds. This matrix is then sent on to the jit.gl.text2d object to be rendered as bitmapped text on an OpenGL plane. This is repeated for every cell of the "cheer" matrix (see gLFO).
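The char-matrix randomization can be approximated like this; the wrap-to-printable step stands in for whatever bounds the patch actually uses, and the function name is ours:

```python
import random

def randomize_text(text, amount=4, seed=None):
    """Mimic the jit.str.fromsymbol -> jit.op chain: convert text to
    a char matrix (ASCII codes), add a small random offset to each
    cell, then wrap values back into the printable range 32..126."""
    rng = random.Random(seed)
    cells = [ord(c) for c in text]                     # jit.str.fromsymbol
    jittered = [c + rng.randint(-amount, amount) for c in cells]
    jittered = [32 + (c - 32) % 95 for c in jittered]  # keep printable
    return ''.join(chr(c) for c in jittered)
```

With amount set to 0 the text passes through unchanged, which makes it easy to see that only the random offset is doing the scrambling.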
The white strip is generated by simply passing "cheer" to our jit.gl.mesh object with @draw_mode set to tri_strip.
- Using video to create synthesis parameters
- Creating audio-rate sequencing of parameters
- sync~,rate~, and all sorts of MSP objects
Using jit.altern to create periodic breaks and jit.brcosa to manipulate the brightness and contrast, we process our video data into more usable material. This matrix is then converted to a 1-plane luminance matrix, broken up into 4 rows using jit.scissors, and then packed into a 4-plane matrix. This is then used to create a 4-channel jit.buffer~.
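As a rough model of the conversion-and-slicing stage (the luminance weights below are the common Rec. 601 values; the actual patch may weight the channels differently):

```python
def to_luminance(rgb_rows):
    """Convert rows of (r, g, b) tuples (0-255) to a 1-plane
    luminance matrix, roughly what a luma conversion in Jitter does."""
    return [[int(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in row]
            for row in rgb_rows]

def scissor_rows(luma, parts=4):
    """Cut the matrix into equal horizontal strips, like jit.scissors
    splitting the frame into 4 rows; packing one strip per plane then
    gives the 4-plane matrix that fills the 4-channel jit.buffer~."""
    step = len(luma) // parts
    return [luma[i * step:(i + 1) * step] for i in range(parts)]
```

Each strip ends up as one channel of the buffer, so different regions of the image drive different control signals.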
A second jit.buffer~ is filled using the familiar multislider-jit.fill combo (see JitBufferEx).
Rather than playing these buffers back as waveforms, we instead use them as CV-like controls for our synthesizer. These controls can be assigned using the matrixctrl UI.
Inside the "synth" subpatch we find all of our MSP processing, which I leave to your own experimentation.
- Capturing OpenGL geometry to a texture
- Using GLSL shaders to perform efficient video processing
- Adding video effects to a live OpenGL patch
By setting the attribute @capture super, we tell jit.gl.nurbs to render into the jit.gl.texture named "super" instead of drawing to our render context.
Once we have the rasterized geometry in texture form, we can then use jit.gl.slab to process our scene.
- Using audio to control OpenGL geometry
From there, we use jit.slide to smooth the movement, and jit.op to scale the values to the desired range. This is then given to the jit.gl.nurbs object as a "ctlmatrix".
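jit.slide's smoothing follows the familiar one-pole recipe y[n] = y[n-1] + (x[n] - y[n-1]) / slide, with separate rise and fall times. A small sketch of that behavior (function name ours):

```python
def slide(samples, slide_up=10.0, slide_down=10.0, start=0.0):
    """Smooth a stream of values the way jit.slide does per cell:
    each step moves 1/slide of the way toward the new input, using
    slide_up when rising and slide_down when falling."""
    y = start
    out = []
    for x in samples:
        s = slide_up if x > y else slide_down
        y = y + (x - y) / s
        out.append(y)
    return out
```

Larger slide values give slower, smoother movement; the jit.op scaling afterwards is just a multiply/add to bring the smoothed values into the range the ctlmatrix expects.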
- Creating visual display of Audio data
- Procedural drawing using OpenGL commands
For each matrix that we send out, we step through our drawing procedure. The trigger object is very helpful for this.
Our drawing procedure works as follows:
- reset the jit.gl.sketch
- draw the background rectangle
- draw the blue-line
- draw the start-point at (-1.,-0.5,0.)
- draw a line segment and point for each cell of the matrix
- draw the end-point at (1.,-0.5,0.)
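Assembled into code, the procedure above could generate a message list like the following; the coordinates for the background and line match the start/end points given above, but the colors and spacing are illustrative rather than copied from the patch:

```python
def waveform_commands(cells):
    """Generate a jit.gl.sketch message sequence for one frame of
    the display, following the steps listed above."""
    cmds = ["reset",
            "glcolor 0.2 0.2 0.2 1.",          # background rectangle
            "quad -1. -0.5 0. 1. -0.5 0. 1. 0.5 0. -1. 0.5 0.",
            "glcolor 0. 0. 1. 1.",             # the blue line
            "linesegment -1. 0. 0. 1. 0. 0.",
            "point -1. -0.5 0."]               # start point
    n = len(cells)
    prev = (-1.0, -0.5, 0.0)
    for i, v in enumerate(cells):
        x = -1.0 + 2.0 * (i + 1) / (n + 1)     # spread cells across -1..1
        cmds.append("linesegment %f %f %f %f %f %f" % (prev + (x, v, 0.0)))
        cmds.append("point %f %f %f" % (x, v, 0.0))
        prev = (x, v, 0.0)
    cmds.append("linesegment %f %f %f %f %f %f" % (prev + (1.0, -0.5, 0.0)))
    cmds.append("point 1. -0.5 0.")            # end point
    return cmds
```

As with the earlier sketch recipe, a trigger object in the patch guarantees these messages fire in this exact order for each incoming matrix.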
Note that we are using jit.gl.render with orthogonal rendering turned on. This allows us to do flat 2D drawing without perspective distortions.
- Using the GPU to perform typical video effects
- Capturing OpenGL scenes to textures
We are using two different jit.gl.videoplane objects. One of these simply displays the output of our jit.qt.movie, with additive blending turned on. The other one will display the texture generated by capturing our scene.
You will notice that we must always render to a texture that has the same dimensions as our rendering context. To work around this, we have two different textures to switch between for fullscreen mode.
- Performing manipulations of a 3-dimensional matrix
- Using Audio signals to generate video
- Matrix feedback
- Visualizing 3-D data as a geometrical figure
The jit.poke~ drawing subpatch should be familiar from previous recipes (see AsteroidGrowths, VideoSynth1).
Because jit.rota only performs 2D rotations, we need to use jit.dimmap to reassign our dimensions and then perform a second rotation.
We then need to replace our dimensions using jit.dimmap again. This matrix is then fed back to be drawn over in the next iteration.
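The dimension reassignment can be pictured as swapping axes of a nested list (a simplified stand-in for jit.dimmap's @map attribute): the same 2D rotation then acts on a different pair of axes, and applying the swap twice restores the original layout.

```python
def swap_dims(vol):
    """Exchange the outer two dimensions of a 3-D nested list,
    akin to the axis reassignment jit.dimmap performs, so that a
    2D operation afterwards works on a different plane of the data."""
    return [[vol[z][y] for z in range(len(vol))]
            for y in range(len(vol[0]))]
```

This is why the patch calls jit.dimmap twice: once to expose the other axes to jit.rota, and once to put them back before the feedback write.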
After we perform our rotations, we do quad mirroring by running jit.rota in boundmode 4 (fold) and setting it to double the x and y dimensions. We could double the z dimension as well if we wanted octal mirroring.
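Fold boundmode can be modeled as index reflection: reading twice the width of the data through a folding index yields the original data followed by its mirror image, without the hard seam that wrap mode would produce. A 1-D sketch (function names ours):

```python
def fold_index(i, n):
    """Reflect an out-of-range index back into 0..n-1, the way
    Jitter's 'fold' boundmode bounces reads off the boundaries."""
    period = 2 * (n - 1)
    i = i % period
    return i if i < n else period - i

def quad_mirror(cells):
    """Read 2n cells through fold addressing, producing the data
    and its reflection, i.e. a 1-D analog of the quad mirror."""
    n = len(cells)
    return [cells[fold_index(i, n)] for i in range(2 * n)]
```

Doing the same doubling along two dimensions gives the four-way (quad) mirror; adding the third dimension would give the eight-way version mentioned above.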
Once our matrix has been prodded and twisted sufficiently, we send it off to be rendered using jit.gl.isosurf.