Using Kinect with jit.freenect.grab


    Feb 02 2017 | 5:20 pm
    I know you can use jit.freenect.grab to get the visual and depth data from the Kinect, but how would you get a single numerical set of X, Y, and Z coordinates from it? For example, the depth coordinates of a person walking around a room? I'm guessing you'd perhaps use blob tracking to get an average reading of the person, with the background depth cancelled out.
    I've searched around for about a day, and even tried to collect the Kinect data through the Processing language and send it to Max, but have so far got nowhere.
    Any help here would be much appreciated.
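    A minimal sketch of that blob-average idea, assuming you read the depth map in Processing and forward a single centroid to Max over OSC. The oscP5 library and Max's [udpreceive] object are real, but the port numbers, depth thresholds, OSC address, and the assumption that your Kinect library hands you the depth frame as an int[] of millimetre values are all illustrative, not something from this thread.

        import oscP5.*;   // OSC library for Processing
        import netP5.*;

        OscP5 osc;                 // sends UDP/OSC packets
        NetAddress maxPatch;       // where the Max patch listens, e.g. [udpreceive 7400]

        void setup() {
          osc = new OscP5(this, 12000);
          maxPatch = new NetAddress("127.0.0.1", 7400);
        }

        // Call once per frame with the current depth map (width*height values in mm).
        void sendCentroid(int[] depth, int w, int h, int minDepth, int maxDepth) {
          long sumX = 0, sumY = 0, sumZ = 0;
          int count = 0;
          for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
              int z = depth[y * w + x];
              if (z > minDepth && z < maxDepth) {   // crude background rejection by depth band
                sumX += x;
                sumY += y;
                sumZ += z;
                count++;
              }
            }
          }
          if (count == 0) return;                   // nothing in range this frame
          OscMessage m = new OscMessage("/kinect/centroid");
          m.add((float) sumX / count);              // average pixel x of the in-range pixels
          m.add((float) sumY / count);              // average pixel y
          m.add((float) sumZ / count);              // average depth in mm
          osc.send(m, maxPatch);
        }

    On the Max side, [udpreceive 7400] into [route /kinect/centroid] and [unpack f f f] would give you the three values as floats.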

    • Feb 13 2017 | 12:04 am
      jit.freenect.grab does not have that capability.
      Another option is dp.kinect or dp.kinect2, which provide skeleton joint xyz coordinates and much more. Downloads, examples, docs, setup, and all are at: https://hidale.com/shop/dp-kinect/ https://hidale.com/shop/dp-kinect2/
    • Feb 16 2017 | 4:50 pm
      Yeah, that's for Windows only, isn't it? Drawing a blank on Mac after Apple purchased the drivers for the OpenNI stuff and then discontinued it; seems like a bit of a dick move to me! :) Might just have to use Processing, but I don't know it nearly as well as Max.
    • Feb 16 2017 | 8:13 pm
      NI mate just started supporting Kinect 2 on OS X. I'm sure Dale has commented that they use the NiTE component that Apple bought out before, so I'm not sure where it stands legally. Dale?
    • Feb 17 2017 | 3:25 am
      I'm not an attorney, can't give legal advice, and I provide a competitive alternative technology. With all those caveats...
      I don't know. I've not downloaded their v2 product. Their v1 product uses NITE. According to an email conversation I had with PrimeSense in March 2014, no one can sell or distribute NITE after April 2014, when Apple canceled everyone's NITE license. You see, NITE was always "pre-production" and PrimeSense never granted redistribution rights. One could only download it directly to one's own PC and play with it there. One could never redistribute NITE as this company appears to be doing. That's why my Mac solution, jit.openni, is orphaned. I respect people's licensing rules. I want devs to get paid, pay rent, buy food, ...
      Perhaps with their v2 product they found another solution. Perhaps they have a Microsoft-quality research team and wrote their own technology. Kudos to them in that case. It's also possible that they are using the open-source solutions for the easy RGB, depth, and IR data. Those data feeds are available across many platforms and sensors. The difficult data feeds are the body joints, facial recognition, audio triangulation, speech, etc. I haven't seen open-source solutions for those.