Kinect and sample triggering

Nadroj:

Hi all,

I'm very new to Max/MSP and not very fluent with it. I have worked through the tutorials and have, to an extent, grasped the bare basics. I've searched and searched, both on these forums and on Google, but can't find any explanation or tutorial that makes sense for what I'm trying to do.

Basically, I want to use Microsoft's Kinect sensor to control various samples - for example, individual notes. I've seen videos and patches where the video feed has been split into several different sections, with certain sections controlling certain things (for example, moving your right hand controls volume, moving the left controls pitch, etc.), and I have tried emulating things like that, but to no avail.

I've looked at Synapse and jit.freenect.grab, but to be perfectly honest I've no idea where to start with these things, or how to fit them into my work. I'm looking for tips on creating a "basic" starter patch where the Kinect feed is split into several segments that control different things - even something where I can control just two separate samples would be fine.

My project is a very basic one (compared to other things I've seen online): I plan to use movement to perform improvised solos over a basic blues backing track. The backing track I have already created, but I'd like to use Max and the Kinect to play the improvised solos (for example, setting every sample as a note from a basic blues scale, and moving my hands/body to different places to trigger different notes). I plan to make it simple enough that members of the audience can come up on stage and 'perform' a solo simply by moving around.

There's no patch for me to link because I haven't created anything solid yet. I've played around with various patches I've found on these forums, and tried adding things from them to my own past patches, but I'm unable to get things to work.
I'm aware this looks like I'm asking people to do my work for me, so I apologise, but that's not my intention. I simply want to know, in the most basic of terms, how or where to get started! Once I have a basic idea or patch in place, I will study it and develop it myself.

Apologies if threads like this are tedious, but I'm in a bit of a rut here.
Thanks all!

Stephane Morisse:

Kinect, via Synapse, sends the information for the various joints as OSC messages. To get them, you can use udpsend and udpreceive, then separate them using OSC-route. Search the forum for Synapse and you will find useful information about the syntax of these messages.
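Roughly like this, as a plain-text sketch rather than a finished patch - the ports are Synapse's usual defaults as I remember them (joint data out on 12345, poll messages in on 12346), and the exact OSC address scheme is in the Synapse docs:

[toggle]
|
[metro 2000]                      <- re-send the request every 2 seconds; Synapse stops sending if you don't
|
[/righthand_trackjointpos 1(     <- message box asking for body-relative right-hand positions
|
[udpsend 127.0.0.1 12346]        <- requests go TO Synapse

[udpreceive 12345]               <- joint data arrives FROM Synapse
|
[OSC-route /righthand_pos_body]  <- CNMAT external; a plain [route /righthand_pos_body] also works
|
[unpack 0. 0. 0.]                <- x, y, z floats for the right hand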

Tobias Rosenberger:

The Kinect Max beatwheel patch might help you to get started:

Nadroj:

Hey guys, thanks for your replies, and sorry for taking so long to get back!

I've been playing around with Synapse, the beatwheel patch and many, many other things. After weeks of headaches and countless patches deleted through trial and error, I've come back to the beatwheel patch. It seems the closest to what I'm after, but the problem I have is that it only lets me control a sample that is continuously playing.

I've played around with the patch itself and looked through the inspector, but I can't find a solution; maybe one of you can help me. Right now the beatwheel plays a continuous sample loop, and where I place my hand in the Kinect feed controls where the sample is played from (using the segments of the circle). Can anyone think of a way to edit it so that the loaded sample doesn't play continuously? I want it to play only when I move my hand into one of the segments, so that I can play short bursts of the sample and then stop it quickly by moving my hand back.

diablodale:

One approach to simplify this is to separate this project into two parts.

First, make a patch which can play the samples that you want. Make them start/stop with simple clickable message boxes in the patch. Get it working as you like. Maybe use the tutorials that come with Max, like MSP #14 or MSP #16. If you have Ableton Live, you could use Max for Live.
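For example, one bare-bones way to do part one, using sfplay~ (the file name here is just a placeholder - use your own samples):

[open blues_note1.aif(    [1(    [0(    <- click to load / start / stop
          \                |      /
           [sfplay~ 2]
            |        |
           [dac~]

Each sample you want to trigger can get its own sfplay~; the [1( and [0( message boxes stand in for the bangs the Kinect patch will eventually supply.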

Second, create a patch that can detect your hand movements/locations/etc. in a manner that works for you. This is a non-trivial challenge, but there are many solutions that have been created. Remember to avoid gorilla arm: http://en.wikipedia.org/wiki/Gesture_recognition#.22Gorilla_arm.22
Have this patch output a note/number/message/MIDI note.
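As one sketch of that detection stage - assuming you already have a hand coordinate (say, an x position from Synapse, scaled to 0..1; the receive and send names here are made up) and you want a bang only when the hand enters a segment, not a constant stream:

[r handX]                                   <- hypothetical receive of the hand's x position
|
[if $f1 > 0.3 && $f1 < 0.6 then 1 else 0]   <- 1 while the hand is inside this segment, 0 outside
|
[change]                                    <- only pass the value when it actually flips
|
[sel 1 0]                                   <- first outlet bangs when the hand enters, second when it leaves
|              |
[s playNote1]  [s stopNote1]                <- hypothetical sends wired to the sample player's start/stop

The [change] object is what stops a held hand from re-triggering the sample on every incoming frame.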

Finally, when you have the two working, just connect them together with a (route) object or something like it.
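For instance, if the detection patch tags its output with a joint name (again, these message names are made up for illustration):

[righthand 1(    [lefthand 1(     <- the kind of tagged messages part two might emit
      \             |
      [route righthand lefthand]
       |                 |
 (to sample player 1)  (to sample player 2)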

Dg:

Does this work go in the direction you're looking for?
http://tinyurl.com/3eoj5zs

Nadroj:

Hi again, thanks for your input. Another update! I've done what you said, Diablodale, and that simplified it a lot. My patch is unbelievably simple, and Kinect-Via-Synapse recognised it as soon as I opened the programme. My triggers are ready and set to be fired by a bang. Using Kinect-Via-Synapse, I've been able to assign my various short samples to right elbow, knee, left elbow, etc., but I'm now faced with a new (and probably easily overcome) problem that no one seems able to help with.

As soon as I adopt the "Psi" pose in Synapse, my patch goes crazy and triggers every sample continuously until I either mute the sound or turn Synapse off. I've set the parameters in K-V-S to what I want, but nothing fixes it. As soon as KVS recognises my Psi pose, every sample is triggered and won't stop, and I can't find a way to control it.

As I said, it's probably a simple fix, but if anyone is able to help me control when my samples are triggered (and turned off) using KVS, I'd be very grateful.

LSka:

If you post your patch, people can see what's wrong.

maxdance:

Hey, I'm facing similar problems. Mind sharing your patch?

Koko Loco:

Could you please share your patch?