Jan 6, 2011 at 12:35pm
I'm planning to do some video tracking in order to sonify a performance: a dancer and a kung fu form.
I’m evaluating different technologies at the moment.
What's the best way to track body movements? Is the Kinect really the best solution?
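By "sonify" I mean mapping tracked movement data onto synthesis parameters. Whichever tracking technology I end up with, the mapping stage would be something like this linear-scaling sketch (the Python and the function name are just for illustration, not any particular toolkit's API):

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map a tracked coordinate onto a synthesis parameter
    (e.g. a vertical position in the frame onto a MIDI note number)."""
    value = max(in_min, min(in_max, value))  # clamp to the input range
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# e.g. a hand at 75% of a 480-pixel frame height, mapped onto two octaves:
print(scale(360, 0, 480, 48, 72))  # → 66.0
```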
Jan 6, 2011 at 1:57pm
It's a big question; here are a few options to help your research.
1. Have a look at the cv.jit objects, particularly the tracking ones. The downside is that someone has to click on a point to track, and the tracker can lose it.
Haven't got my Kinect yet. I think people have been using the device's depth perception rather than skeleton tracking, but I think you could still use it as a replacement for option 4 (I believe it projects a pattern of IR dots onto anything in front of it and picks them up with an IR camera). Someone else may be better able to inform you.
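Whatever camera you use, tracking of the cv.jit kind ultimately comes down to finding where pixels change between frames. Purely to illustrate that idea (this is not cv.jit code; the function name and threshold are my own), a minimal frame-differencing sketch in Python/numpy:

```python
import numpy as np

def motion_centroid(prev_frame, curr_frame, threshold=30):
    """Return the (row, col) centroid of pixels that changed by more than
    `threshold` between two grayscale frames, or None if nothing moved."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    moving = diff > threshold
    if not moving.any():
        return None
    rows, cols = np.nonzero(moving)
    return float(rows.mean()), float(cols.mean())

# Tiny demo: a bright blob appears between two 8x8 frames.
prev = np.zeros((8, 8), dtype=np.uint8)
curr = np.zeros((8, 8), dtype=np.uint8)
curr[2:4, 5:7] = 255
print(motion_centroid(prev, curr))  # → (2.5, 5.5)
```

The same weakness applies as with cv.jit's trackers: if the motion stops or another object crosses the path, the centroid jumps, so you'd still need smoothing or re-acquisition on top.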
Jan 6, 2011 at 3:04pm
I think one way to go is to work with 3D sensing through the Kinect.
Jan 6, 2011 at 3:20pm
Thanks for the answers.
@scatalogic: thanks for the info.
@sandroid: that's great! Unfortunately it's not possible to buy the controller from them – or at least I've found no price on the website.
Jan 6, 2011 at 4:34pm
I'm interested in this, and can briefly share my Kinect experiences.
You can get Kinect data into Max through Jean-Marc Pelletier's jit.freenect.grab object –
This will give you the RGB camera image, and also the depth map. This is very cool, but it's only the first step of what you (and I) want to do. It seems that the body/skeleton tracking part of the Kinect interface is done in software, not by the Kinect hardware itself. It seems to be very sophisticated (reliant on an enormous database of body data), and I suspect it would be very hard to reverse engineer.
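One practical detail about that depth map: as I understand it, the Kinect's depth stream is an 11-bit disparity value per pixel, not a distance in meters. An empirical conversion that circulated in the libfreenect community looks like this in Python (it's an approximation, not an official formula, and the invalid-value cutoff here is my assumption):

```python
import math

def raw_depth_to_meters(raw):
    """Convert an 11-bit Kinect raw depth reading to an approximate
    distance in meters, using an empirical tangent fit from the
    libfreenect community. Values near the top of the 11-bit range
    are treated as 'no valid depth'."""
    if raw >= 1090:
        return None  # sensor couldn't measure depth at this pixel
    return 0.1236 * math.tan(raw / 2842.5 + 1.1863)

# A mid-range raw value comes out at roughly a meter or so:
print(raw_depth_to_meters(800))
```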
The good news is that PrimeSense – mentioned by sandroid, and the originators of the technologies used by the Kinect – have made this software available. It's called NITE –
As I understand it, you can use this with a Kinect and you don't need any PrimeSense hardware. I haven't tried this yet, as it's currently Windows/Linux only, and I'm on a Mac (there are rumours of an OS X version coming). I also suspect you'd need to do some coding, though I don't know that for sure.
I got my Kinect for Christmas, so I've only been looking into this for quite a short time. Anyone – please tell me if I'm talking rubbish on any of this, or if you have any further pointers!
Jan 6, 2011 at 5:33pm
Thanks for sharing your knowledge.
I already knew Jean-Marc Pelletier's jit.freenect.grab; what I didn't know is that the skeleton mapping is done in software, not in hardware.
Good to know that PrimeSense has made the software available.
The open-source community behind the PrimeSense SDK is also interesting:
PrimeSense's own software development kit is tempting too:
I'm on a Mac, but I can use Linux too.
Jan 6, 2011 at 7:23pm
If you look closely at the OpenNI site you’ll see that MacOSX support is being actively developed — currently OSX binaries are included in the latest “unstable” release.
Jan 6, 2011 at 7:28pm
The PrimeSense people made sure not to license the SDK for use with the Kinect (licensing matters); it only covers the PrimeSensor.
Anyway, I'll wait for the Asus/PrimeSense stuff. It looks promising…