Using Kinect with jit.freenect.grab

Johnnyc777's icon

I know you can use jit.freenect.grab to get the visual and depth data from the Kinect, but how would you get a single numerical set of X, Y, and Z coordinates from it? For example, the depth coordinates of a person walking around a room?
I'm guessing you could use blob tracking to get an average reading of the person, with the background depth cancelled out.
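For what it's worth, the blob/background-subtraction idea can be sketched outside Max. This is a minimal Python sketch (not a Max patch) on a tiny synthetic depth frame; the threshold value and frame size are made-up numbers for illustration, and a real Kinect frame would be 640x480:

```python
# Sketch of the "average the person, cancel the background" idea.
# Pixel values are depth in millimetres; 0 means no reading.

def person_centroid(depth, background_mm=3000):
    """Average the (x, y, z) of all pixels closer than the background."""
    sx = sy = sz = n = 0
    for y, row in enumerate(depth):
        for x, z in enumerate(row):
            if 0 < z < background_mm:   # ignore background and dropouts
                sx += x; sy += y; sz += z; n += 1
    if n == 0:
        return None                      # nothing in front of the background
    return (sx / n, sy / n, sz / n)

# Tiny 4x4 "depth frame": a 2x2 blob at ~1500 mm against a 3500 mm wall.
frame = [
    [3500, 3500, 3500, 3500],
    [3500, 1500, 1500, 3500],
    [3500, 1500, 1500, 3500],
    [3500, 3500, 3500, 3500],
]
print(person_centroid(frame))  # (1.5, 1.5, 1500.0)
```

In Max itself you'd do something similar with cv.jit.blobs.centroids or jit.3m on a thresholded depth matrix, but this shows the arithmetic.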

I've searched around for about a day, even tried to collect the Kinect data through the Processing language and send it to Max and have so far got nowhere.

Any help here would be much appreciated.

diablodale's icon

jit.freenect.grab does not have that capability.

Another option is dp.kinect or dp.kinect2, which provide skeleton joint XYZ coordinates and much more.
Downloads, examples, docs, and setup instructions are all at:
https://hidale.com/shop/dp-kinect/
https://hidale.com/shop/dp-kinect2/

Johnnyc777's icon

Yeah, that's Windows-only, isn't it? I'm drawing a blank on Mac after Apple bought the drivers for the OpenNI stuff and then discontinued it - seems like a bit of a dick move to me! :)
Might just have to use Processing, but I don't know it nearly as well as Max.

Luke Woodbury's icon

NI mate just started supporting Kinect 2 on OS X. I'm sure Dale has commented that they use the NITE component that Apple bought up before, so I'm not sure where it stands legally. Dale?

diablodale's icon

I'm not an attorney, can't give legal advice, and I provide a competitive alternative technology. With all those caveats...

I don't know. I've not downloaded their v2 product. Their v1 product is using NITE. According to an email conversation I had with PrimeSense in March 2014, no one can sell/distribute NITE after April 2014, when Apple canceled everyone's NITE license. You see...NITE was always "pre-production" and PrimeSense never granted redistribution rights. One could only do a direct download to one's own PC and play with it there. One could never redistribute NITE as this company appears to be doing. That's why my Mac solution, jit.openni, is orphaned. I respect people's licensing rules. I want devs to get paid, pay rent, buy food, ...

Perhaps with their v2 product they found another solution. Perhaps, they have a Microsoft-quality research team and wrote their own technology. Kudos to them in that case. What is also possible is that they are using the open-source solutions for the easy rgb, depth, and ir data. Those data feeds are available across many platforms and sensors. The difficult data feeds are the body joints, facial recognition, audio triangulation, speech, etc. I haven't seen open source solutions for those.

E_S_'s icon

Hi all,

I've been scouring for the last couple days to find a way to use my Kinect 1414 with Max on my 2019 MacBook Pro running Sonoma 14.4.1. Last posts on all these forums seem to be from quite a few years ago. Anyone know how to do it in 2024?

I'm specifically after skeleton joint info which, as the above user mentioned, jit.freenect.grab can't provide.

Thanks!

Rob Ramirez's icon
E_S_'s icon

Thanks Rob - funnily enough, I downloaded jit.freenect.grab from that page yesterday and have it up and working. The issue is that all the other Max externals/software that let you use skeleton points, such as simpleKinect and Synapse, no longer seem to work on Mac. Can you confirm jit.freenect.grab is the only currently usable option?

Thanks!

Rob Ramirez's icon

I believe that's the case. I guess many folks have moved on to MediaPipe - https://cycling74.com/forums/n4m-facemesh-handpose-google-mediapipe#reply-647a66915acdcc0014f28fce

E_S_'s icon

Good to know - thank you very much. I was just hoping my old Kinect might be useful for something, aha!

TFL's icon

E_S_, playing with the point cloud is still a lot of fun!
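If anyone wants to see the math behind that point cloud: each depth pixel can be back-projected to an XYZ point with the standard pinhole camera model. A minimal Python sketch follows; the intrinsics below (focal length ~594 px, principal point at the centre of a 640x480 image) are commonly quoted approximations for the Kinect v1, not calibrated values:

```python
# Back-project a Kinect v1 depth pixel to camera-space XYZ (pinhole model).
# Intrinsics are rough community-quoted values, not a real calibration.

FX = FY = 594.0          # approximate focal lengths in pixels
CX, CY = 320.0, 240.0    # principal point for a 640x480 depth image

def depth_to_xyz(u, v, depth_mm):
    """Pixel (u, v) with depth in millimetres -> (x, y, z) in metres."""
    z = depth_mm / 1000.0
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return (x, y, z)

# A pixel at the image centre lands straight down the optical axis:
print(depth_to_xyz(320, 240, 2000))  # (0.0, 0.0, 2.0)
```

Running this over every pixel of a depth frame gives you the point cloud that jit.freenect.grab-based patches render with jit.gl.mesh.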