Motion Tracking with Kinect (Mac OS)

    Mar 20 2018 | 8:42 pm
    Hi everyone,
    I am not new to Max, but I do not use Jitter much and I am new to the Kinect (late to the party, I know, but I have access to one and I want to experiment).
    I have downloaded Jean-Marc Pelletier's jit.freenect.grab object, as well as Blair Neal's example patch. That said, I am still having a tough time wrapping my head around the data I am getting. If possible, I would like to use motion tracking to control lighting and sound. Any help would be appreciated.
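    For anyone else puzzling over the same data: jit.freenect.grab's depth output is essentially a 2D grid where each cell is a distance reading, and a common first experiment is finding the nearest point in the frame (e.g. an outstretched hand). Here is a rough Python sketch of that idea, not Max code; the frame values and function are made up for illustration:

```python
def nearest_point(depth_frame, min_valid=1):
    """Return (row, col, depth) of the closest valid reading.
    Zeros are treated as 'no data' (shadows / out of sensor range)."""
    best = None
    for r, row in enumerate(depth_frame):
        for c, d in enumerate(row):
            if d >= min_valid and (best is None or d < best[2]):
                best = (r, c, d)
    return best

# Toy 4x4 "depth frame": smaller number = closer, 0 = no reading
frame = [
    [0,   900, 880, 870],
    [910, 450, 860, 850],
    [905, 870, 840, 830],
    [900, 890, 820, 810],
]
print(nearest_point(frame))  # closest valid pixel: (1, 1, 450)
```

    In Jitter you would do the equivalent with matrix operations (e.g. thresholding the depth matrix and locating the minimum) rather than a pixel loop, but the logic is the same.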

    • Mar 21 2018 | 6:59 pm
      I have always wondered why so many people insist on using the Kinect with macOS rather than Windows, considering that just about any Mac can run Windows via Boot Camp. The Kinect was never developed for the Mac, so the third-party tools have nowhere near as rich a feature set as the native ones Microsoft developed for Windows.
      I strongly recommend setting up Windows and trying something like dp.kinect, which is essentially a wrapper for Microsoft's Kinect API with some convenient remapping of its data for Max.
      For controlling lighting and sound you will probably want the relative precision of skeleton tracking; most likely you just need the 3D position of each hand. My impression of the Mac tools for the Kinect v1 is that they are mostly built around raw image processing, which is a ton of extra work. There was Synapse, which offered skeleton tracking, but my understanding is that it is not compatible with newer versions of macOS.
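      Once you have a hand position from skeleton tracking, mapping it onto lighting and sound parameters is just linear scaling, the same thing Max's [scale] object does. A hypothetical Python sketch (the coordinate ranges and parameter names are invented for illustration):

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linear remap of value from one range to another,
    like Max's [scale] object (no clamping here)."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# Hypothetical hand position in millimeters relative to the torso joint
hand_x, hand_y = 150.0, -200.0

# x drives a light dimmer (MIDI-style 0-127), y drives a filter cutoff in Hz
dimmer = scale(hand_x, -500, 500, 0, 127)
cutoff = scale(hand_y, -500, 500, 20, 2000)
print(dimmer, cutoff)
```

      In a patch this would just be [scale -500 500 0 127] fed by the hand's x coordinate; the point is that once the skeleton data arrives as plain numbers, the rest is ordinary Max range-mapping.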
    • Mar 22 2018 | 5:36 pm
      Thanks for the insight, @kcoul. This makes sense; unfortunately, the Macs we are working with belong to an educational institution, so from a logistical/bureaucratic standpoint alone, getting Windows installed would prove difficult.
      This leads me to a second question: is there a depth sensor with motion tracking/skeleton maps that works better with macOS?