In this final article in the DMX series, I’m going to walk through the process of creating the DMX system we used to make the video below.
We needed to write a very specific hardware-focused Max patch to get this to work.
I've been thinking about The Material World recently.
Actually, I've been thinking about it by working my way through the recent series of Physics Patch o' the Day postings — turning gravity off, flinging weightless objects around the room, and watching what happens.
Welcome to Physics Patch-a-day, part of our spotlight on the Max physics engine and some of its capabilities.
DMX, or more accurately, DMX 512, is a network protocol most commonly used for the control of stage lighting and effects.
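As an aside, the data portion of a DMX512 packet has a simple shape: a single start code byte (0x00 for ordinary channel data) followed by up to 512 channel values, each in the range 0–255. Here's a minimal sketch in Python of building that byte sequence; the function name is my own, and note that real DMX transmission also involves a break and mark-after-break on the serial line, which your USB-DMX interface (or a Max external) handles for you.

```python
def build_dmx_packet(channels):
    """Build the data portion of a DMX512 packet: a 0x00 start code
    followed by up to 512 channel values, each clamped to 0-255."""
    if len(channels) > 512:
        raise ValueError("DMX512 carries at most 512 channels per packet")
    start_code = 0x00  # 0x00 marks ordinary dimmer/channel data
    levels = [max(0, min(255, int(v))) for v in channels]
    return bytes([start_code] + levels)

# Set channel 1 to full and channel 2 to half brightness:
packet = build_dmx_packet([255, 128])
```

In practice you'd hand bytes like these to whatever driver object talks to your lighting interface, refreshing the full frame many times per second.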
Audacity is a free and pretty full-featured multitrack sound production tool, but there aren't many ways to create and experiment with new sounds, beyond some basic tone generators.
In the first and second tutorials in our Gen series, we've taken a look at the gen~ object for processing audio data and the jit.pix object (and its cousin jit.gl.pix) for processing 2D, 4-plane image data.
The Livid Code is designed to be useable out of the box without programming.
The MGraphics patch-a-day series was very successful, so it's time for a Gen patch-a-day! This time, though, a few things change.
While the jit.pix-based patches we created in our last tutorial do cool things and use patching techniques that will probably be accessible to the average Max user, they're not all they could or should be as Jitter Gen patches.
Don't get me wrong - they make sense and introduce the idea of swizzling data from a vector in the Jitter Gen world.
Welcome to the second in a series of “drive-by” tutorials – quick introductions to the world of Max Gen objects.
In part 1, Darwin showed us all the fundamentals behind step sequencing in Max, and extended that from the computer to the controller.
Welcome to the first installment of a new series: “Working With Hardware”.
For the month of November 2011, I'm going to try to produce and post one new JSUI patch that uses the new MGraphics system.
After doing a quick tutorial at the Cycling ’74 Expo, it became clear that lots of people out there were really surprised and happy to discover that they didn’t need to be a supergenius to have fun with the gen~ object.
Ever thought about making your own Vizzie modules? With Max 5.1.8, you get the Vizzie Kit, and below you will find detailed instructions on how to use it, including how to turn an existing patch into a Vizzie effects module.
The response to the appearance of Vizzie has been really exciting and rewarding – people all over the place are finding it to be a quick way to start working with Max, and more seasoned users have found it a useful source of "quick-start" modules they can plug into their regular workflow.
In the last several tutorials I’ve written, I’ve been talking about a subject that interests me a great deal: how to add variety to a Max patch so that it produces surprising and interesting combinations, while keeping the transition between your input and what your patch is doing more subtle than hitting a button object and having everything suddenly behave in ways that are obviously not you.
To be more specific, I’ve been talking about ways to use the humble LFO as a generator of that variety — summing, sampling, and otherwise manipulating it to produce control curves less ordinary than your audience can easily intuit by the time the LFO's second sweep comes around.
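The summing idea is simple enough to sketch outside of Max. Here's a rough Python illustration (the function and rate values are my own, not from the tutorials): adding several sine LFOs running at unrelated rates yields a combined curve whose repeat period is far longer, and far less obvious, than any single component's.

```python
import math

def summed_lfo(t, components):
    """Sum several sine LFOs, each given as a (frequency_hz, amplitude)
    pair, and normalize by the total amplitude so the result stays
    within -1..1."""
    total_amp = sum(amp for _, amp in components) or 1.0
    value = sum(amp * math.sin(2 * math.pi * freq * t)
                for freq, amp in components)
    return value / total_amp

# Three LFOs at unrelated rates: the combined control curve repeats
# much less predictably than a single sweep would.
curve = [summed_lfo(t / 100.0, [(0.5, 1.0), (0.73, 0.5), (1.31, 0.25)])
         for t in range(400)]
```

In a Max patch the same idea would be a handful of scaled cycle~ or phasor-driven LFOs feeding a summing stage, with the result mapped onto whatever parameter you want to animate.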
There’s another obvious source of variety generation that Max users often gravitate toward: random number generators.
A simple truth emerges from the practice of writing Max patches like the Max for Live device we've been working on: The trajectory of “finishing” your Max patch is something you approach on an asymptotic curve - you approach being “done,” but never quite reach it.
One of the most feared and respected objects in the Jitter collection, jit.expr arrived on the scene as part of Jitter 1.5.
Since a lot of people are interested in what the process of porting a Max patch for use in Max for Live looks like, I thought I’d take this tutorial as an opportunity to go over the steps I used to take my waveplayah patch and convert it to a Max for Live device, waveplayah.amxd.
In my last LFO tutorial, I took the basic LFO module I’ve been working with in the previous tutorials, added some new extensions, and created a nice little patch called the waveplayah that used a summed set of the LFO modules to drive the playback of the contents of a buffer~.
A while back, I wrote a series of four tutorials based around the idea of how you could generate and organize variety in Max patches.
While many people are looking at Max for Live as a great way to integrate their favorite hardware controllers, build really unique effects, and add variety to their productions, I was eager to explore what could be done with video inside of Max for Live.
I have collaborated with musicians before who work exclusively inside of Ableton Live, so it struck me as a huge advantage to be able to build a triggered video playback and live processing system that worked natively inside of Live.
Coming up with ways to get information about the physical world into Max is one of the most fun aspects of working with the software.
In this installment of the Video Processing System, we're going to tackle two big hurdles that Jitter users often find themselves coming up against.