What is the best way of using the Kinect with Max?
Hey,
I'm designing a virtual drum kit using the Kinect in Max. At the moment I'm using Synapse, but it doesn't seem to track the feet very well. It also sometimes gets confused when I make the movements involved in playing the drums; for example, it starts to track the torso instead of the hands.
Any help would be greatly appreciated.
Ed
Is Synapse made for feet as well? I use Synapse to experiment in Max at the moment...
Hey Ed,
I'm the developer of a Kinect-to-MIDI/OSC toolkit, "Kinectar", and I'm happy to answer any questions.
You can download my software for free; however, it doesn't yet include anything except hand data (I'm slowly rebuilding the framework to include all the joints).
It's for both Windows and OS X (the current OS X version is fairly broken, but I'm working on the update now and should have a new working version out in the next day or two).
On to your query:
Most of the Kinect OSC "proxies" expose all of the joints, but in general I've found foot tracking less reliable than hand tracking.
I've done extensive experiments on many styles of using the Kinect as an instrument, and the unfortunate reality is that the sensor cannot track joints correctly if they're obstructed.
For this reason, playing drums in the regular way (hands crossed) will cause certain joints to be poorly tracked, and you'll often end up with jumpy data which will cause sounds to trigger late, not at all, or even when you don't want them to.
You'll find that when parts of the body are obstructed, the tracking algorithm attempts to reconstruct your skeleton from an incomplete picture of your body. This is most likely what causes the Kinect to confuse your torso with your hands.
Additionally, because the skeleton is calculated and tracked live, there is occasionally some latency with very fast movements. I've spent quite a bit of time trying to optimize my "speed" calculations, but I believe a lot of my issues come from the positional data arriving from the Kinect, i.e. very sharp/quick movements sometimes don't register.
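To give you an idea of why jumpy data matters: a naive "hit" detector is basically a velocity threshold on the hand height, usually with a bit of smoothing in front of it. Here's a rough illustration in Processing (this is not Kinectar's actual code, just a toy example; the mouse stands in for a hand so you can test the logic without a Kinect, and the threshold and smoothing values are placeholders you'd have to tune):

// Toy hit detector: smooth an incoming "hand height" value and fire a hit
// only when the downward velocity crosses a threshold. With jumpy joint
// data, a single bad frame can still cross the threshold, which is where
// the late, missing, or unwanted triggers come from.

float smoothedY = 0;        // low-pass filtered hand height
float prevSmoothedY = 0;
float alpha = 0.5;          // smoothing: lower = smoother but laggier
float velThreshold = -15;   // units per frame, tune to taste
boolean armed = true;       // re-arm only after the hand comes back up

void setup() {
  size(200, 400);
}

void draw() {
  // Stand-in for a Kinect hand-height value: move the mouse up and down.
  handleHandY(height - mouseY);
}

void handleHandY(float rawY) {
  prevSmoothedY = smoothedY;
  smoothedY = alpha * rawY + (1 - alpha) * smoothedY;  // exponential smoothing
  float vel = smoothedY - prevSmoothedY;               // per-frame velocity

  if (armed && vel < velThreshold) {
    triggerDrum();
    armed = false;
  }
  if (vel > 0) {            // hand moving up again: ready for the next hit
    armed = true;
  }
}

void triggerDrum() {
  println("hit!");          // placeholder: send a MIDI note or OSC message here
}

The more you smooth to hide the jumps, the more latency you add, which is exactly the trade-off I keep running into.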
If you'd like to see videos of the different musical applications I've experimented with using the Kinect, head over to my YouTube channel:
Chris.
The "best" way is really a personal matter. That being said, I'm using the SimpleOpenNI library for Processing with great satisfaction. The Processing app handles all the Kinect tracking stuff and sends full-body joint coordinates to Max using OSC.
The big advantage for me is that it's an open system (library & sample code) that lets you adapt it to your specific needs, as opposed to closed, compiled apps like Synapse etc. (granted, some of these are open source, but I'm not much of a programming ninja, so Processing is easier for me). This might or might not be of interest to you.
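To give you an idea of the amount of code involved, a minimal bridge sketch looks roughly like this (untested as posted; it assumes the oscP5 library and SimpleOpenNI's older pose-calibration API, and the port number, OSC address names, and single-user simplification are just my own choices, so adapt it to whatever version you have installed):

import SimpleOpenNI.*;
import oscP5.*;
import netP5.*;

SimpleOpenNI context;
OscP5 oscP5;
NetAddress maxAddress;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();
  context.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);   // full skeleton tracking

  oscP5 = new OscP5(this, 12001);                      // local port, unused here
  maxAddress = new NetAddress("127.0.0.1", 7400);      // Max listens on 7400
}

void draw() {
  context.update();
  image(context.depthImage(), 0, 0);

  int userId = 1;  // simplification: only the first tracked user
  if (context.isTrackingSkeleton(userId)) {
    sendJoint(userId, SimpleOpenNI.SKEL_LEFT_HAND,  "/joint/lefthand");
    sendJoint(userId, SimpleOpenNI.SKEL_RIGHT_HAND, "/joint/righthand");
    sendJoint(userId, SimpleOpenNI.SKEL_LEFT_FOOT,  "/joint/leftfoot");
    sendJoint(userId, SimpleOpenNI.SKEL_RIGHT_FOOT, "/joint/rightfoot");
  }
}

void sendJoint(int userId, int jointId, String address) {
  PVector pos = new PVector();
  context.getJointPositionSkeleton(userId, jointId, pos);  // real-world mm
  OscMessage msg = new OscMessage(address);
  msg.add(pos.x);
  msg.add(pos.y);
  msg.add(pos.z);
  oscP5.send(msg, maxAddress);
}

// SimpleOpenNI user/calibration callbacks (names vary between versions)
void onNewUser(int userId) {
  context.startPoseDetection("Psi", userId);
}

void onStartPose(String pose, int userId) {
  context.stopPoseDetection(userId);
  context.requestCalibrationSkeleton(userId, true);
}

void onEndCalibration(int userId, boolean successful) {
  if (successful) {
    context.startTrackingSkeleton(userId);
  } else {
    context.startPoseDetection("Psi", userId);
  }
}

On the Max side, an [udpreceive 7400] followed by [route /joint/lefthand /joint/righthand ...] (or the CNMAT OSC-route external) gets the xyz lists straight into your patch.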
@dtr: Have you got any decent links for getting started with that? So far I've been a Synapse man but I've not had too much time to use my Kinect yet so I'm prone to change.
I've also started getting into Processing recently and I'm having great fun with it; it's so quick and simple to learn.
The library contains lots of sample sketches to get you started. There's a forum here: http://groups.google.com/group/simple-openni-discuss
Oh, and the Google Code page also has an automated OpenNI/NITE installer which does all the crazy install work for you... you just need to install the Avin2 Kinect driver yourself.
There is also an excellent book dealing with Kinect and Processing: