Take Care

Video for the track “Take Care” by LSKA.
The video was made using a particle system interacting with a Kinect’s depth map.

3D footage captured with an MS Kinect using the dp.kinect Max external by Dale Phurrough (hidale.com), processed in real time with Jitter.
Music and video synced using LiveGrabber (http://showsync.info/index.php/tools/livegrabber/) and a custom Max for Live device.
Video recorded with jit.gl.syphonserver (https://github.com/Syphon/Jitter/releases/) and Syphon Recorder (http://syphon.v002.info/recorder/).

Mar 24 2014 | 4:10 am

@lska: wow, this is really impressive work, well done.

I’m still working on a similar project for university. Could you give me any advice on how to set up my project? Which map is the correct one to get the particles into the silhouette? It would be great if you could point me in a direction.


Mar 24 2014 | 5:18 am

Hi gantzgraf,
I use the @depthonlyplayers 1 attribute of the [dp.kinect] external and record matrices with [jit.matrixset]; then I sift out all "black" matrix cells using Wesley Smith’s [xray.jit.sift] (http://www.mat.ucsb.edu/~whsmith/xray/xray.jit.sift.html), so there are no stray points along the Z axis in my silhouette matrix.
Here’s the "Zfilter" I used:

(Max patch pasted in the original post.)
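The patch itself isn’t reproduced here, but the idea behind the "Zfilter" can be sketched outside Max. A minimal NumPy illustration of the same sift (the matrix shape matches the 320×240 depth map mentioned later in the thread; the fake "body" region is my own assumption, not taken from the actual patch):

```python
import numpy as np

# Stand-in for dp.kinect's output: a 320x240, 3-plane float matrix of
# XYZ coordinates, mostly zero except where a body was detected.
depth = np.zeros((240, 320, 3), dtype=np.float32)
depth[100:140, 150:170] = 1.0  # hypothetical "body" region

# Flatten to a list of points, then keep only the cells that are not
# all-zero -- roughly what [xray.jit.sift] does to the Jitter matrix.
points = depth.reshape(-1, 3)
mask = np.any(points != 0.0, axis=1)
silhouette = points[mask]

print(silhouette.shape)  # (800, 3): only the non-black cells remain
```

The compacted `silhouette` array is what you would then hand to the renderer, so no points pile up at the origin.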


To control the particle system’s attraction and rotation, I adapted a Gen patch I found on the forum (http://cycling74.com/forums/topic/sharing-is-hairy-brains-gen-particles/)
(again, thanks to Wesley Smith! I just realized I have to edit the credits in the video… ;) ), using the recorded matrices to attract the particles.
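The linked Gen patch isn’t reproduced in this thread; as a rough, language-neutral sketch of the "attract particles toward recorded points" idea (the constants, names, and target choice below are mine, not taken from the patch):

```python
import numpy as np

rng = np.random.default_rng(0)

# 1000 particles scattered in a unit cube, initially at rest.
particles = rng.uniform(-1.0, 1.0, size=(1000, 3)).astype(np.float32)
velocities = np.zeros_like(particles)

# Hypothetical attractors: one target point per particle (in the video
# these would come from the recorded Kinect silhouette matrices).
targets = np.zeros((1000, 3), dtype=np.float32)

STRENGTH = 0.05  # pull toward the target each frame (arbitrary value)
DAMPING = 0.9    # velocity decay, keeps the motion from exploding

def step(particles, velocities, targets):
    """One simulation frame: accelerate toward targets, then integrate."""
    velocities = DAMPING * velocities + STRENGTH * (targets - particles)
    particles = particles + velocities
    return particles, velocities

for _ in range(200):
    particles, velocities = step(particles, velocities, targets)

# After enough frames the swarm settles onto the target points.
print(float(np.abs(particles).max()))
```

Swapping `targets` each frame for the next recorded depth matrix is what makes the swarm trace the moving silhouette.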

Mar 24 2014 | 11:31 am

Hello LSKA,

thank you for answering quickly and for your help. I think the worst is yet to come, but I see a light at the end of the tunnel… ;)

thanks a lot

Aug 17 2014 | 2:33 pm

Hello LSKA, @lska
I am a new Max learner, and I need to do a similar project now. I don’t understand the details of this sentence; could you help me?

"use the @depthonlyplayers 1 attribute in the [dp.kinect] external, and record matrices with [jit.matrixset], then I sift out all "black" matrix cells using Wesley Smith’s [xray.jit.sift] http://www.mat.ucsb.edu/~whsmith/xray/xray.jit.sift.html"

Could you tell me in detail how to do it? Thank you very much!

Aug 20 2014 | 8:39 am

Hello @alicexin,
Before answering your question, I want to warn you that this project could be quite complex for a new Max/Jitter user, as it requires a good knowledge of Jitter, Gen, OpenGL, and MSP, since audio analysis is also involved.
I don’t want to share the complete patch at the moment, because it’s really messy and crafted for this very project, so it’s hardly usable for other projects. But sometime in the future I may put together a more "general-purpose" patch to share with the community.

Answering your question:
dp.kinect is a Max external by Dale Phurrough, which I use to capture and record Kinect data: https://hidale.com/shop/dp-kinect/
What I use is the so-called "depth map", a collection of 3D coordinates of the scene captured by the Kinect, output as a matrix.
That data is then recorded and played back as a stream of matrices using [jit.matrixset].
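For readers without Max: [jit.matrixset] stores a pool of matrices that can be written once and then played back frame by frame. A loose Python analogy of that record-then-replay idea (the class and its API are hypothetical, purely for illustration):

```python
from collections import deque

class FrameRecorder:
    """Minimal record/playback buffer, loosely analogous to [jit.matrixset].

    The class name and API are hypothetical, for illustration only.
    """

    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)  # oldest frames drop when full
        self.play_index = 0

    def record(self, frame):
        self.frames.append(frame)

    def play(self):
        # Loop over the stored frames, returning one per call.
        frame = self.frames[self.play_index % len(self.frames)]
        self.play_index += 1
        return frame

rec = FrameRecorder(capacity=300)  # e.g. ~10 seconds at 30 fps
for i in range(5):
    rec.record(f"depth-frame-{i}")

print(rec.play())  # prints "depth-frame-0"
```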
I apply the @depthonlyplayers 1 attribute to dp.kinect to isolate the bodies from the rest of the scene.
Doing so, I get a 320×240 matrix in which most of the cells are empty (i.e. their value is 0). This leads to a lot of artifacts when feeding the data to [jit.gl.mesh], as the object will draw points whose coordinates are (0, 0, 0).
The [xray.jit.sift] object therefore lets me filter out all cells with value 0 and send only the relevant data to [jit.gl.mesh].

Hope this helps,


