Articles

Working with Hardware: DMX, Part 3

In this final article in the DMX series, I’m going to walk through the process of creating the DMX system we used to make the video below.

We needed to write a very specific, hardware-focused Max patch to get this to work. My hope is that some of the ideas and techniques used to create this system will be useful to you as you explore your own DMX hardware-based projects.

All the tutorials in this series: Part 1, Part 2, Part 3.

Part 1 – the hardware.

For this system I used the following gear:

  1. LanBox LCX: The LanBox is my go-to DMX interface; it’s bomb-proof and loaded with functionality.

  2. American DJ Revo 4: This is a DMX-controlled, 256-channel LED projector. It can output 4 grids of 16x16 LEDs in RGB and white.

  3. Microsoft Kinect: The Kinect is great for detecting bodies in a 3D space.

Part 2 – the software.

Max 6 – of course!

An install of the helper utilities and externals for the LanBox. You can get these from the LanBox website.

If you use different DMX hardware, you’ll have your own system for this.

For access to the Kinect data, I used the jit.freenect external developed by Jean-Marc Pelletier (http://jmpelletier.com/freenect/). This is a fantastic, easy-to-use way to interact with your Kinect from Max.

I also decided to use some more of Jean-Marc’s awesome software to manage the video data from the Kinect. In particular, I just wanted to light up one “blob” of activity in the camera’s view, in order to keep the show from getting all crowded out with lights. For this I used objects from the cv.jit set.

I downloaded all of these objects and placed them in a folder inside my Cycling ’74 folder in my Max application folder.

Part 3 – the programming.

The basic idea in this system is that we select a body in part of a three-dimensional space, create a video of the detected body, and then convert the two-dimensional spatial data of that video into the one-dimensional data of the DMX packet controlling the projector.

Let’s start at the top of the patch and work our way down.

The master metro in the patch drives the whole show. It is set to 20 fps (a 50 ms interval), which matches the frame rate of my LanBox. When working with DMX, there is never any reason to have an update rate faster than the frame rate of the system.

The Kinect-specific code is addressed first. We want to be able to open our Kinect as a video input device in Max, and also to adjust the Kinect’s viewing angle. Most importantly, we want to set up a range slider so that we can “focus” the Kinect on objects at a certain distance from the device. Inside the kinect_code subpatcher, you can see how this all comes together.

Without going into extensive detail, this code adjusts the view from the Kinect to “see” only the range we wish to capture, then analyses the video for the presence of bodies, or “blobs”. We then label the blobs, select only the first one for viewing, and create a video stream containing just that blob.
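If you’re curious what that logic looks like outside of Max, here’s a minimal Python sketch of the same idea. It assumes a depth frame is already available as a NumPy array (for example, via a freenect binding); the near/far values are hypothetical, and it uses simple connected-component labelling (picking the largest blob) as a stand-in for the cv.jit objects.

```python
import numpy as np
from scipy import ndimage

def isolate_one_blob(depth_frame, near_mm=800, far_mm=1800):
    """Keep only pixels whose depth falls inside the range slider,
    then return a mask containing just one connected blob.
    (Hypothetical parameters; a rough stand-in for kinect_code.)"""
    # "Focus" the Kinect: keep pixels between the near and far limits.
    in_range = (depth_frame > near_mm) & (depth_frame < far_mm)

    # Label connected regions ("blobs") in the thresholded image.
    labels, count = ndimage.label(in_range)
    if count == 0:
        return np.zeros_like(in_range)

    # Select the largest blob so only one body lights the projector.
    sizes = ndimage.sum(in_range, labels, range(1, count + 1))
    biggest = int(np.argmax(sizes)) + 1
    return labels == biggest
```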

Next up, we need to convert the two-dimensional video data into the one-dimensional packet that will control the display of our LED projector. This is where the jit.iter object comes in handy. The code after it “spreads out” the coordinates of the 16x16 video into a one-dimensional array with 256 elements.
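To make that “spreading out” concrete, here is a rough Python equivalent of what jit.iter does in this context: walk the 16x16 frame cell by cell and build a flat list of 256 values. The input array and the scaling to 0–255 are assumptions for the sketch.

```python
import numpy as np

# blob_frame: a 16x16 array downsampled from the isolated-blob video,
# with values between 0.0 and 1.0 (an assumed input for this sketch).
blob_frame = np.random.rand(16, 16)

# Walk the grid cell by cell, much as jit.iter emits each cell,
# and build one flat 256-element list of DMX-ready 0-255 values.
flat_values = []
for row in range(16):
    for col in range(16):
        flat_values.append(int(blob_frame[row, col] * 255))

assert len(flat_values) == 256
```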

The Revo 4 projector has a particular channel mapping, which I figured out by reading the manual and goofing around with sending it various DMX messages (welcome to the world of DMX programming!).

The channel mapping looks like this.

In the format_chans subpatch, you can see how I used the coll object to provide this mapping for 32 channels; the mapping then repeats right up to channel 256.
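The Revo 4’s actual channel layout isn’t reproduced here, but the coll trick itself is easy to illustrate. The sketch below uses a purely hypothetical 32-entry lookup table (an identity mapping as a placeholder) and repeats it across all 256 channels, which is the shape of what format_chans does.

```python
# Hypothetical 32-entry lookup table standing in for the coll in
# format_chans; the real Revo 4 mapping comes from its manual.
BLOCK_MAP = {i: i for i in range(32)}  # placeholder: identity mapping

def remap_channels(flat_values):
    """Apply the 32-channel mapping repeatedly across all 256 channels."""
    remapped = [0] * len(flat_values)
    for index, value in enumerate(flat_values):
        block = (index // 32) * 32        # which 32-channel block we're in
        offset = BLOCK_MAP[index % 32]    # remapped position inside the block
        remapped[block + offset] = value
    return remapped
```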

Finally, I use another coll object to set up the DMX packet that is going to be sent out on my DMX network. I’m using the out-of-sequence incoming number pairs to update the value for each channel with the nsub message to coll.

The project’s master metro also drives the dump message to the coll DMX_packet object, which sends the current packet out and on to the udpsend object via our LanBox specific packet-formatting object.
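To round out the picture, here is a hedged Python sketch of those last two stages: a dictionary plays the role of the coll DMX_packet object (updated with out-of-sequence channel/value pairs, like nsub), and the “dump” step serializes it in channel order and sends it over UDP. The host, port, and framing byte are placeholders, not the real LanBox protocol, which the packet-formatting object in the patch takes care of.

```python
import socket

# The dictionary stands in for the coll DMX_packet object:
# keys are channel numbers, values are the current 0-255 levels.
dmx_packet = {channel: 0 for channel in range(1, 257)}

def nsub(channel, value):
    """Update one channel's value, like sending 'nsub <index> <value>' to coll."""
    dmx_packet[channel] = value

def dump_and_send(sock, host="192.168.1.77", port=4777):
    """Serialize the packet in channel order and send it over UDP.
    The address, port, and header byte are placeholders, NOT the real
    LanBox framing, which the patch's formatting object handles."""
    payload = bytes([0x00]) + bytes([dmx_packet[ch] for ch in range(1, 257)])
    sock.sendto(payload, (host, port))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
nsub(1, 255)          # out-of-sequence updates arrive as channel/value pairs
nsub(34, 128)
dump_and_send(sock)   # driven by the master metro at 20 fps in the patch
```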

And there you have it! My thanks to Tom Hall and Janeva Zentz for coming over and making the video happen.

Thanks for following along in this Working with Hardware Series. If you’ve got ideas for other types of hardware you’d like to see covered, please let us know.

by Andrew Pask on November 5, 2012

Rodrigo:

These have been great.

For other hardware stuff, some straight Arduino+sensor and/or Arduino+actuator type things would be amazing.

Paolo Siri:

It's really great!!!!
I'm an audio engineer from Alchemea College in London, and of course I'll want to show this to the other guys who are studying at the college now!
Paolo.

Ricardo:

Hey. Great tutorial. I posted this in the forum. Could you please have a look and let me know if it is possible?
https://cycling74.com/forums/kinect-and-jit-freenect-grab

Ginger:

Awesome!!