Motion Tracking with Kinect (Mac OS)

Peter Redecopp:

Hi everyone,

I am not new to Max, but I do not use Jitter much and I am new to the Kinect (late to the party, I know, but I have access to one and I want to experiment).

I have downloaded Jean-Marc Pelletier's jit.freenect.grab object, as well as Blair Neal's example patch. That said, I am still having a tough time wrapping my head around the data I am getting. If possible, I would like to use motion tracking to control lighting and sound. Any help would be appreciated.

kcoul:

I had always wondered why so many people insist on using the Kinect with macOS rather than Windows, considering that just about any Mac can run Windows via Boot Camp. The Kinect was never developed for the Mac, so the third-party tools have nowhere near as rich a feature set as the native ones Microsoft developed for Windows.

I strongly recommend setting up Windows and trying something like dp.kinect, which is basically a wrapper around Microsoft's Kinect SDK with some nice data remapping for Max.

For controlling lighting and sound you will probably want the relative precision of the skeleton data; most likely you just want to grab a hand joint's 3D position.
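For example, once dp.kinect is streaming skeleton messages, a tiny [js] can filter out just the joint you care about. To be clear, the message layout below ("skeleton <userid> <jointname> <x> <y> <z>") is a placeholder, not dp.kinect's actual format — check its reference page for the real joint names and output attributes:

```js
// pick-joint.js — for a [js pick-joint.js] object fed by dp.kinect's skeleton output.
// HYPOTHETICAL message format: skeleton <userid> <jointname> <x> <y> <z>
// Check dp.kinect's reference for the real joint names and message layout.

outlets = 1;

var target = "hand_right"; // hypothetical joint name

function anything() {
    var args = arrayfromargs(messagename, arguments);
    if (args[0] === "skeleton" && args[2] === target) {
        outlet(0, [args[3], args[4], args[5]]); // x y z as a list, ready for [scale] etc.
    }
}
```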

My impression of the Mac tools for the Kinect v1 is that they are mostly based around image processing, which is a ton of extra work (rough sketch below). There was Synapse, which had skeleton tracking, but my understanding is that it's not compatible with newer versions of macOS:

http://synapsekinect.tumblr.com/post/6305020721/download
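To give a sense of the image-processing route, here is a minimal frame-differencing sketch for a [js] object with the depth outlet of jit.freenect.grab connected to its inlet. The 640x480 float32 matrices are assumptions — match them to whatever the object actually outputs in your patch:

```js
// depth-motion.js — crude "amount of movement" number from the depth map.
// ASSUMES a 1-plane 640x480 depth matrix; adjust type/dims to your patch.

outlets = 1;

var prev = new JitterMatrix(1, "float32", 640, 480); // previous frame
var diff = new JitterMatrix(1, "float32", 640, 480); // |current - previous|
var absdiff = new JitterObject("jit.op");
absdiff.op = "absdiff";
var scan = new JitterObject("jit.3m"); // reports min/mean/max of a matrix

function jit_matrix(name) {
    var cur = new JitterMatrix(name);
    absdiff.matrixcalc([cur, prev], diff); // per-pixel difference
    scan.matrixcalc(diff, diff);           // jit.3m computes stats as the matrix passes through
    outlet(0, scan.mean);                  // overall motion amount — map this to lighting/sound
    prev.frommatrix(cur);                  // keep the current frame for the next comparison
}
```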

Peter Redecopp:

Thanks for the insight @kcoul. This makes sense; unfortunately the Macs we are working with belong to an educational institution, so just from a logistics/bureaucracy standpoint getting Windows installed would prove difficult.

This leads me to a second question: is there a depth sensor with motion tracking/skeleton tracking that works better with macOS?

Anton Kuznetsov:

After some research, I came to the conclusion that there are only two options for getting OSC skeleton data from a 3D sensor on macOS: 1) NI-mate with a Kinect v2 (their recommendation for Mac) or an Xtion Pro Live; 2) an Orbbec Astra with openFrameworks (as of January 2019, Orbbec is about to release a skeleton tracking SDK for macOS!).
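Whichever tool you pick, once it is streaming OSC the Max side looks the same. A minimal Node for Max bridge might look like this — the port number and the catch-all forwarding are generic placeholders, since the actual addresses depend on the tool:

```js
// osc-bridge.js — run inside [node.script osc-bridge.js] in the patch.
// Requires the "osc" npm package (npm install osc in the project folder).
// Port 7000 is an assumption — set it to whatever your tool sends on.

const Max = require("max-api");
const osc = require("osc");

const port = new osc.UDPPort({
    localAddress: "0.0.0.0",
    localPort: 7000
});

port.on("message", (msg) => {
    // Forward every OSC message as "<address> <args...>" so it can be [route]d in the patch.
    Max.outlet(msg.address, ...msg.args);
});

port.open();
```

(A plain [udpreceive 7000] into [route] does much the same job without Node; the script is only worth it if you want to reshape the data in JavaScript first.)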

I don't know if Kinect-Via-OSCeleton or Kinect-Via-Synapse still work, and I would very much appreciate the input of the community.

Also - does anyone know of any other option?

Does OpenNI's resurrection thanks to https://structure.io/ lead to a corresponding resurrection of jit.openni or something else directly Jitter-related, or is that not applicable at all?

P.S. Of course, the OpenPose development is an exciting prospect. As OpenPose is now compatible with macOS and AMD GPUs (I hope that's correct), maybe you could try to convince Hollyhook to port her creation to macOS :P (or maybe we could crowdfund that endeavour? Is that naive to say?)

P.P.S. There's also a pretty cool utilisation of PoseNet by our good buddy Sam: https://cycling74.com/forums/skeleton-tracking-with-posenet-and-node-for-max

Also, the Nuitrack SDK seems promising.

Dante:

Has anyone tried the methods Anton Kuznetsov describes? I'm trying to up my tracking game and am looking for the most practical macOS option possible. I'm going to try the PoseNet approach Sam posted and also OpenPose, but looking at the requirements, I don't know if my 2017 laptop is powerful enough.