I’m planning to do some video tracking in order to sonify a performance: a dancer and a kung fu form.
Basically I want to connect some movements to some sounds.
I’m evaluating different technologies at the moment.
Kinect looks like the best way to track body movement.
What’s the best way to track body movements? Is Kinect really the best solution?
What would you use ?
It’s a big question; here are a few options to help your research.
1. Have a look at the cv.jit objects, particularly the tracking ones. The downside is that someone needs to click on a point to track, and the tracker can lose it.
2. Do some colour tracking à la Jitter Tutorial 25
3. Use a setup with an infrared camera and put some infrared LEDs on your dancer to track
4. Use a setup with an infrared camera and shine an infrared light on your dancer (search the forums for the details)
5. Maybe forget the camera and use some accelerometers, flex sensors, pressure pads via a microcontroller like Arduino
6. Have a look at this too: http://cycling74.com/2009/10/26/making-connections-camera-data/
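For option 2, the basic colour-tracking logic is simple enough to sketch outside of Jitter. Here’s a rough Python/NumPy version, just to show the centroid-to-parameter idea; the threshold, the synthetic blob, and the MIDI note range are all made-up numbers, not anything from the tutorial:

```python
import numpy as np

def track_colour(frame, channel=0, threshold=200):
    """Return the (x, y) centroid of pixels whose chosen colour channel
    exceeds the threshold, or None if no pixel matches."""
    mask = frame[:, :, channel] > threshold
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 120x160 RGB frame with a bright red 11x11 blob centred at (40, 30)
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[25:36, 35:46, 0] = 255

x, y = track_colour(frame)                 # (40.0, 30.0)
note = 48 + int(x / frame.shape[1] * 36)   # horizontal position -> MIDI note 48..84
```

In a real patch you’d do the thresholding and centroid in Jitter and just map the coordinates to your synth parameters; the weakness (as with any colour tracking) is lighting and costume dependence.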
I haven’t got my kinect yet. I think people have been using the device’s depth sensing rather than skeleton tracking, but I think you could still use it as a replacement for option 4 (I believe it projects IR dots onto anything in front of it and picks them up with an IR camera). Someone else may be better able to inform you.
You can find some nice tutorials for cheaply converting a PS3eye camera to an infrared one on the web.
I think one way to go is to work with 3D sensing through the Kinect.
A few days ago I found out that the Kinect is based on a product called the PrimeSensor, by PrimeSense. They will also release a product with Asus (see links below). As far as I understand, PrimeSense also has an API for reading the data from the PrimeSensor (the NITE API), which would provide ways of working with bones.
Well, I haven’t tried it yet, but I’m very curious about it.
Thanks for the answers.
@scatalogic: thanks for the info.
I’ve already looked into options 1 and 2, but they’re not really precise.
The other options sound good, but I hope that with the Kinect it’s somewhat easier.
@sandroid: that’s great! Unfortunately it’s not possible to buy the controller from them, or at least I’ve found no price on the website.
Still interesting news; maybe it’s better to wait for Asus :)
I’m interested in this, and can briefly share my kinect experiences.
You can get kinect data into max through Jean-Marc Pelletier’s jit.freenect.grab object –
This will give you the RGB camera image, and also the depth map. This is very cool, but only the first step of what you (and I) want to do. It seems that the body/skeleton tracking part of the Kinect interface is done in software, not by the Kinect hardware itself. It seems to be very sophisticated (reliant on an enormous database of body data), and I suspect it would be very hard to reverse engineer.
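To give an idea of what’s already possible with just the depth map, before any skeleton tracking, here’s a rough Python/NumPy sketch that finds the nearest point in a frame and maps its distance to an amplitude. The zero-means-no-reading convention and the 500 to 4000 mm range are my assumptions for the sketch, not anything guaranteed by jit.freenect.grab:

```python
import numpy as np

def nearest_point(depth):
    """Return (x, y, mm) of the closest valid pixel in a depth map,
    treating 0 as 'no reading' (an assumption about the raw format)."""
    valid = np.where(depth > 0, depth, np.iinfo(depth.dtype).max)
    row, col = np.unravel_index(np.argmin(valid), valid.shape)
    return int(col), int(row), int(depth[row, col])

# Fake 480x640 depth frame: background at 2000 mm, a "hand" patch at 800 mm
depth = np.full((480, 640), 2000, dtype=np.uint16)
depth[200:220, 300:320] = 800

x, y, d = nearest_point(depth)     # (300, 200, 800)
amp = 1.0 - (d - 500) / 3500.0     # 500..4000 mm -> amplitude 1.0..0.0
```

That “track the closest thing to the camera” trick is crude but surprisingly usable for a solo performer, since it needs no clicking on a point and no skeleton at all.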
The good news is that PrimeSense (mentioned by sandroid, and the originators of the technologies used by the Kinect) have made this software available. It’s called NITE –
As I understand it, you can use this with a kinect and you don’t need any PrimeSense hardware. I haven’t tried this yet, as it’s currently Windows/Linux only, and I’m on a Mac (there are rumours of an OSX version coming). I also suspect you’d need to do some coding, though I don’t know that for sure.
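Even without knowing the real NITE API, the sonification step once you have joint positions is easy to sketch. In this Python snippet the joint dictionary, the joint names, and the coordinate ranges are all invented for illustration; it just shows a hand’s vertical position being clamped and mapped onto a MIDI note range:

```python
def hand_height_to_note(joints, lo=-500.0, hi=900.0, note_lo=36, note_hi=96):
    """Clamp the right hand's vertical position (mm) into [lo, hi] and map
    it linearly onto a MIDI note range. Joint names and ranges are
    hypothetical, not the real NITE API."""
    y = joints["right_hand"][1]
    t = min(max((y - lo) / (hi - lo), 0.0), 1.0)
    return int(note_lo + t * (note_hi - note_lo))

# Hypothetical joint positions as a NITE-style tracker might report them
joints = {"head": (0.0, 600.0, 2000.0),
          "right_hand": (350.0, 400.0, 1800.0),
          "torso": (0.0, 0.0, 2000.0)}

note = hand_height_to_note(joints)   # 74
```

Whatever tracker you end up with, the mapping layer looks like this: pick a joint coordinate, clamp it into an expected range, and scale it onto a synthesis parameter.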
I got my kinect for Christmas, so I’ve only been looking into this for quite a short time. Anyone – please tell me if I’m talking rubbish on any of this, or if you have any further pointers!
Thanks for sharing your knowledge.
I already knew about Jean-Marc Pelletier’s jit.freenect.grab; what I didn’t know is that the mapping work is done in software, not hardware.
Good to know that PrimeSense has made the software available.
It could probably be a good idea to buy the PrimeSense hardware instead of the Kinect… I’ll think about it.
The open-source community behind the PrimeSense SDK is also interesting:
PrimeSense’s software development kit is tempting too:
I’m on a mac, but I can use linux too.
I can also code a little.
If you look closely at the OpenNI site you’ll see that MacOSX support is being actively developed — currently OSX binaries are included in the latest "unstable" release.
The PrimeSense people made sure not to provide the SDK for use with the Kinect (licensing matters), only with the PrimeSensor.
At least not openly; maybe with some hack it could be possible.
I tried to contact them about the PrimeSensor, but no answer yet.
Anyway, I’ll wait for the Asus/PrimeSense stuff. It looks promising…