jit.openni (inc. Kinect)
This is a Max/Jitter external for OpenNI on Windows and Mac OS X.
I caution everyone that OpenNI is dead. In April 2014, the OpenNI website is closing. When this happens, there may be no legal place to download NITE (the essential component of OpenNI that does skeletal tracking). If you are on Windows, please use dp.kinect instead.
Currently supports:
Configuration of OpenNI by an OpenNI XML configuration file; see OpenNI documentation for format
ImageMap of RGB24 output in a 4-plane char jitter matrix
DepthMap output in a 1-plane long, float32, or float64 jitter matrix
IrMap output in a 1-plane long, float32, or float64 jitter matrix
UserPixelMap output in a 1-plane long, float32, or float64 jitter matrix
Skeleton joints in an OSC, native max route, or legacy OSCeleton format
Skeleton events in an OSC, native max route, or legacy OSCeleton format (e.g. user seen, user lost, calibration success, etc.)
User centers of mass
Scene floor identification and data
Values for user centers of mass and joints in OpenNI native, projective coordinate, or OSCeleton legacy "normalized" values
Attributes to filter data based on position or orientation confidence, to show or hide the orientation data, and to smooth skeleton data using OpenNI's smoothing API
Depth camera field of view
Compiled as a Win32 and Mac OS X Max/Jitter external
https://github.com/diablodale/jit.openni/wiki is the location of the project documentation and setup instructions.
Example of some OSC joint output:
/userid/jointname x y z confidPosition [Xx Xy Xz Yx Yy Yz Zx Zy Zz orientPosition]
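For readers consuming this OSC output outside of Max, a small sketch of how such a joint message could be unpacked follows. The address pattern and argument layout mirror the example above; the function and dictionary key names are illustrative, not part of jit.openni.

```python
# Hypothetical helper: unpack a jit.openni-style OSC joint message,
# "/userid/jointname x y z confidPosition [9 orientation values, orientPosition]".
def parse_joint_message(address, args):
    """Split an address like '/1/head' plus numeric args into a dict."""
    _, userid, jointname = address.split("/")
    joint = {
        "user": int(userid),
        "joint": jointname,
        "pos": tuple(args[0:3]),       # x, y, z
        "pos_confidence": args[3],     # position confidence
    }
    if len(args) > 4:                  # orientation output enabled
        joint["orient"] = tuple(args[4:13])   # 3x3 rotation matrix, row-major
        joint["orient_confidence"] = args[13]
    return joint

msg = parse_joint_message("/1/head", [0.10, 0.25, 1.80, 1.0])
print(msg["user"], msg["joint"], msg["pos"])
```

The same split-on-`/` idea is what `route` or `OSC-route` does for you inside a Max patch.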
It has been casually tested with Max 5 and 6, SensorKinect, and OpenNI binaries on Windows and Mac OS X. If you find problems and can reproduce them with clear steps, I encourage you to open an issue at https://github.com/diablodale/jit.openni/issues
This external's output is very similar to jit.freenect.grab and therefore it can often be used in its place with small changes. My object outputs depth in mm and jit.freenect.grab outputs in cm. A simple math operation can resolve this. Note, my object does not provide the "raw" values of jit.freenect.grab; instead it provides the mm depth values via OpenNI.
The OSC skeleton data should be easy to use if you are familiar with OSCeleton.
I would like to see support for other generators (gestures, hand point, etc.) in the future added by myself or with the assistance of others.
Download this tool
stoersignal
Jun 29 2011 | 10:18 AM
any chance to get a Mac version of this?
diablodale
Jun 29 2011 | 11:06 AM
I would like to see a Mac developer join the jit.openni project and make any needed changes for it to be cross-platform. I do not have the equipment or resources for any Mac development myself.
steve
Oct 31 2011 | 4:36 PM
i was wondering if you had anyone join the mac side of dev? i would like to get involved with this.
your patch looks amazing but i can not use it on mac.
let me know what i can do. where are you at? have you had any mac forks started?
talk soon
diablodale
Nov 01 2011 | 5:04 AM
I haven't yet connected with an interested Mac developer. They keep using separate programs like OSCeleton or freenect, which doesn't have skeletons. I understand it's easier to use what is already there.
Physically, I'm located in Berlin. Virtually, I'm worldwide. ;-)
steve
Nov 01 2011 | 3:54 PM
Hey ,
i was reading some of your documentation (as thats all i can do at this point on mac)
and was wondering where this jit.simple came from?
and if maybe i could get that working on the mac,
if i can get to some documentation about the maxsdk (which i can not find)
let me know.
i also emailed you.
geddonn
Mar 17 2012 | 7:37 PM
Any further news on a Mac version? Using freenect.grab at the moment and being tortured by the memory leak bug.
div
May 27 2012 | 10:34 PM
Would also love to see a Mac version if there's one about!
Looks great!
diablodale
May 29 2012 | 11:40 AM
Still no Mac developer has volunteered or contacted me to compile it, check for errors, and make any needed tweaks.
Parmerud
Aug 21 2012 | 1:50 PM
Is there any Mac developer out there who wants to make a bit of money? I will pay to have this great external moved into Mac land if no one cares to move it just for knowledge and fun. Come on! All the Mac community that have a Kinect will jump for joy.... Just go to my website to get my contact info.
PS I HATE CAPTCHA!!!!!
ControlFreak
Oct 02 2012 | 12:57 AM
Hey DiabloDale. Thanks for sharing this great Max object. I'm unfortunately on the Mac side of things so I haven't used it myself yet.
About a year ago, I spent a little time trying to get this to compile for Mac and got pretty close without having to make many tweaks. Eventually I hit a roadblock and started working on something else. First let me say, I'm not a hardcore coder, I can sort of tweak stuff from time to time, but I couldn't create what you did from scratch. That's why I use Max.
So after commenting out the #include "targetver.h" instances, there were only a few spots where I had to replace "boolean" with "bool". Besides that, it seems to line up.
Where I hit the wall was a few errors: "Redefinition of typedef XnModuleExportedProductionNodeInterface". I did some searches on the OpenNI forums and such and saw a few other people asking about this error, but never getting an answer. I was so determined to just see it compile, I went into the OpenNI includes and changed a few to get rid of these errors and got it to compile. The object does show up in a patch fine and the summary command works, but I couldn't load the jit.openni_config.xml file; even when I pointed right at it, I would get the file not found error. I suspect that had to do with all my tweaks to the ProductionNodeInterface stuff. Without that, the camera never tries to initialize. That's when I moved onto another project, where I made a bunch of tweaks to Synapse to output the depth data in more formats.
Anyways, I'm back and am going to give it another go. You have already done the hard stuff. The OpenNI/NITE libraries are huge and at times hard to follow. I'm hoping once I can get past this "Redefinition of typedef XnModuleExportedProductionNodeInterface" error it should compile on the Mac. Thanks again for sharing the source.
Stuart
ControlFreak
Oct 05 2012 | 2:53 AM
Hello again DiabloDale.
I'm happy to report I now have jit.openni running on my Mac in OS X 10.6.8 (Snow Leopard). I'm currently testing on Max 5.1.8 + Max 6.0.7, OpenNI v1.5.4.0, NITE v1.5.2.21. All seems good so far: no crashes, launches correctly each time. (knock on wood) I even tried reloading the xml file over and over and it just takes a second or two to come back online each time. Skeleton data starts outputting within seconds.
At the end of the day, I didn't have to change much to get it functioning, I just had to learn a lot more about OpenNI as well as debugging Max externals. This was a good first project for compiling a Max external from PC source in OSX. Once I got past the Redefinition errors by commenting out one of the includes in the OpenNI headers, "XnUtils.h" to be specific, I at least had it building the .mxo which loaded into the example patch.
From there I had to figure out why I was getting "File not Found" from OpenNI. After inserting a bunch of breakpoints in the source and stepping through them, I noticed that the file path for the jit.openni_config.xml was in a format that OpenNI couldn't understand, i.e. Macintosh HD:/folder/jit.openni_config.xml. I then just replaced the s->s_name field with the filename in quotes in the xnContextRunXmlScriptFromFileEx call. At that point I noticed that it seemed to take longer to error, and with a different error than I got before. It almost seemed like OpenNI was going through the OpenNI Context + ProductionNode stuff and then getting stuck.
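The path problem described above boils down to converting Max's volume-prefixed path style into the POSIX form OpenNI expects. A minimal sketch of that conversion, shown in Python for clarity (the real fix belongs in the external's C code, and the function name here is hypothetical; it also assumes the file lives on the boot volume):

```python
# Convert a Max/classic-style path "Macintosh HD:/folder/file.xml"
# into a POSIX path "/folder/file.xml" by dropping the "Volume:" prefix.
def max_path_to_posix(max_path):
    if ":" in max_path:
        _volume, rest = max_path.split(":", 1)
        return rest if rest.startswith("/") else "/" + rest
    return max_path   # already a POSIX path

print(max_path_to_posix("Macintosh HD:/folder/jit.openni_config.xml"))
# /folder/jit.openni_config.xml
```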
Then I remembered you mentioning in one of the posts that the new version of OpenNI caused an issue whereby you had to modify the xml file to remove the Scene Node stuff. I did that, tried it again, and was excited to see the laser fire up. I didn't notice that the metro in the help patch was set to 500 ms by default, which caused it to run at 2 fps. I didn't care at that moment; I at least had it all running and then left to pick up the kids with a smile on my face. As soon as I got home I spotted the metro and was totally relieved I didn't have to spend the night figuring out why it was running slow. It runs nice and fast, 30 fps and higher, with all the stuff on at 640x480.
I really like that your object takes advantage of auto skeleton acquire, which no longer needs the cactus pose; that came with last year's updates to OpenNI/NITE. I just installed the latest version the other night and am really impressed how much they improved the skeleton tracking since v1 + v2. I also noticed they got it booting up much faster now. When I launch one of the OpenNI samples such as "Sample-NiUserTracker", it boots up the Kinect and is tracking a skeleton in 2-3 seconds tops. It used to take like 10 seconds to launch and fire up the Kinect, then 5 seconds - 30 minutes to acquire a skeleton ;-)
So although it seems to be running well, I currently have it hacked to find the "jit.openni_config.xml" file at the root level of the HD, i.e. "Macintosh HD". I need to figure out how to format the name/path so OpenNI can read an OS X path correctly. In the meantime, Mac users can just put it at the root of their HD like I did. I also want to figure out how to include all the binaries and bits so users won't have to do the whole install of MacPorts, OpenNI, NITE, and the Sensor Driver in the terminal. Since I'm not very good in C and Xcode, more of a node based guy, I will have to stumble through it like I did with getting this object going. I think that step will be worth it since that is such a painful process for the guys who just want to plug in the Kinect and start Maxing, not spend hours installing all that stuff. I'll learn more about that by looking at the ofx OpenFrameworks examples, which have an external bin folder of the needed bits. That would even be acceptable, say in the "max-externals" folder. In the meantime, users can use it if they have the OpenNI/NITE libraries installed on their machines.
I think tonight I'll expand on your example patch and make a proper .maxhelp patch. I noticed you already wrote message handlers for all the things I'm most interested in. Perhaps I'll add in the tilt motor control some day and some more skeleton output formats. Before I took up this project again, I was using NI-Mate for skeleton data and working out the rotation angles of each joint to make them bvh mocap compatible and better suited to control rigged 3d models. Using position to animate a 3d character is pretty ugly imo. Even though NI-Mate is a wonderful app, I wanted all the data at the same time, including the 11-bit depth data, which none of them supply as far as I know: NI-Mate, Synapse, OSCeleton.
Let me know how you want to release this, it's your baby. Do you want to make separate forks on GitHub, or would you rather modify one set of source files that has built-in platform checks? I'll clean up what I'm working on and can pass it to you at any time.
Cheers
Stuart
stuart@controlfreaksystems.tv
diablodale
Oct 05 2012 | 4:18 AM
Stuart, your email address stuart@controlfreaksystems.tv is bouncing. Do you have another email?
ControlFreak
Oct 05 2012 | 1:18 PM
My bad, I was pretty tired last night stuart@controlfreak.tv
diablodale
Nov 14 2012 | 9:30 AM
We got a working Mac OS X port of this external. It's ready for broader testing/release. Full info in the normal locations and as listed here in this toolbox.
Beep
Jan 28 2013 | 3:42 AM
This is amazing, thanks so much guys! Quick recommendation for anyone that cares: I found the installation process on GitHub to be a little confusing owing to my lack of knowledge regarding the terminal. Can I recommend Zigfu to anyone else who is also having problems in this regard: http://zigfu.com/en/downloads/browserplugin/ It installs everything you need to get going through a more "conventional" installer process, i.e. one download, one installation. After that one installation you can open up the Max patch and away you go!
Vagelis
Mar 04 2013 | 12:37 AM
I have downloaded and used jit.openni, and it's really amazing and very useful combined with the Kinect. I have created an application in Max/MSP in which I use the Kinect to get data. I have the PMD CamBoard nano depth sensor and I have already connected it with OpenNI. I would like to know if I can use your jit.openni with this specific depth sensor. I have already tried to connect it but without any results. What could I do to make it work? Any help would be appreciated.
Thanks
Vagelis
diablodale
Mar 04 2013 | 8:55 AM
@Vagelis, please open an issue at https://github.com/diablodale/jit.openni/issues where I can better support you.
vogt
Nov 11 2013 | 5:40 PM
a truly awesome external, just one question:
is it possible to extend the range (i.e. depth/distance from camera) of the tracked area?
doesn't matter for me if it's imprecise at high distances, but i need to track a bigger area, and since this external doesn't allow two Kinects i can't use it unless the depth can be enlarged.
i know that the depth image itself produces reasonable/usable results up to about 8 meters; at least the old freenect external allowed that, alas, without skeleton tracking.
is this a limit of openni itself, or do you have any influence on this?
would be great!
kind regards
dtr
Nov 11 2013 | 9:53 PM
Correct me if I'm wrong but AFAIK both OpenNI and Kinect (MS) SDK limit the skeleton tracking range. It's not a changeable parameter, sadly. I would also prefer a degraded longer range than the cutoff we have now.
Melinka
Jan 11 2014 | 1:35 PM
Hi, jit.openni is great! However, I'm trying to filter the position of individual joints, so that I can, for example, then apply algorithms only to the x-position of the left hand. How can I do that?
Thanks in advance!
diablodale
Jan 11 2014 | 2:51 PM
@Melinka, if using the OSC format (skeleton_format=1), then look at the demo patch provided with the external. You can see it filtering only head data.
If using the native Max format (skeleton_format=0), then use all the usual Max message filtering techniques like the (route) object.
The depthmap goes out to about 8m (4-8m is less reliable). Unfortunately, NITE v1.x limits skeleton joint recognition to 4m.
Melinka
Jan 16 2014 | 9:36 PM
Thanks a lot @diablodale! Your suggestions have been very helpful. For some reason OSC-route in the demo patch is not working on my Mac. Could it be that there is a compatibility problem with OS X 10.7.5? I have filtered the x,y,z position of single joints by using regular expressions. Is there a better way of filtering what skeleton/user I'm targeting than using regex, OR is there a version of OSC-route that is compatible with OS X Lion, which I could possibly switch to? Cheers.
diablodale
Jan 16 2014 | 10:26 PM
OSC-route is a 3rd party external http://cnmat.berkeley.edu/downloads
Referencing the wiki for jit.openni, it provides other formats for message output https://github.com/diablodale/jit.openni/wiki#skeleton-joint-data
Melinka
Aug 27 2014 | 6:51 PM
Hi Diablodale,
Sorry about the delay in responding. I got the joint tracking working. Thanks a lot! Could you also tell me how I can restrict the tracking to one user at a time, so that when the user disappears and a new user enters, the new user will be tracked as user 1 again?
diablodale
Aug 27 2014 | 7:10 PM
You will need to do your own patching to implement restrictions, selecting (watching) a single person, etc. OpenNI does not provide that functionality, and the "rules" for watching a single person will be highly specific to your application/patch. Is it the closest person? Is it the tallest person? Is it the person waving their hands? You will need to make those choices yourself and develop your patch for those rules.
You will also need to do your own user number/ID management. OpenNI does not provide that functionality. jit.openni and dp.kinect both provide you a unique ID for each person that is currently in the view of the Kinect and recognized by its human body tracking software. Once that person leaves the view of the Kinect, that ID is available again for reuse by the same or a different human that the Kinect sees.
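The ID management described above can be sketched in a few lines. This is illustrative pseudologic in Python, not part of jit.openni; the class and method names are made up, and in practice this bookkeeping would live in your patch, driven by the external's user-seen/user-lost events:

```python
# Map transient OpenNI user IDs onto stable local slots, so the first
# person seen is always "user 1" from the patch's point of view.
class UserSlots:
    def __init__(self):
        self.slots = {}                  # openni_id -> local slot number

    def on_user_seen(self, openni_id):
        """Assign the lowest free slot to a newly seen OpenNI user ID."""
        if openni_id not in self.slots:
            used = set(self.slots.values())
            slot = 1
            while slot in used:
                slot += 1
            self.slots[openni_id] = slot
        return self.slots[openni_id]

    def on_user_lost(self, openni_id):
        """Free the slot so the next newcomer can become 'user 1' again."""
        self.slots.pop(openni_id, None)

s = UserSlots()
print(s.on_user_seen(7))    # first OpenNI id seen -> slot 1
s.on_user_lost(7)
print(s.on_user_seen(12))   # newcomer after the first left -> slot 1 again
```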
If you can, I recommend you move to using dp.kinect (it is Windows only). OpenNI was killed by Apple and it may be increasingly hard to find support, drivers, etc. for it.
Anthony Palomba
Aug 28 2014 | 8:50 PM
Hey Dale, do you know of any existing OSX drivers for the new Kinect2 camera?
diablodale
Aug 28 2014 | 10:59 PM
Microsoft to date has only released solutions for Windows.
There is an open-source project to get basic depthmap, rgb, and ir images from the Kinect2. It is a very young project: libfreenect2.
Like its sibling (libfreenect), libfreenect2 may never get things like skeleton, joints, facial recognition, speech, etc.
didjefuz
Feb 10 2015 | 1:22 PM
Hello Diablodale,
I'm on Mac and I cannot run the jit.openni external in Max 7, as it looks like it is accepting only 32-bit externals, even if I run the application in 32-bit mode or reboot my Mac to work at 32-bit. Other externals are working just fine though. Do you maybe know any way to make it run anyway?
Thanks a lot!
diablodale
Feb 10 2015 | 2:24 PM
Please consider what I wrote at the top of this post:
I caution everyone that OpenNI is dead. In April 2014, the OpenNI website is closing. When this happens, there may be no legal place to download NITE (the essential component of OpenNI that does skeletal tracking). If you are on Windows, please use dp.kinect instead.
Apple legally killed NITE, the skeleton tracking part of OpenNI. There is no method to run NITE on any computer anywhere in the world. Therefore, I provide no support for jit.openni because it is no longer possible to run it without NITE.
I encourage you to use dp.kinect or dp.kinect2 on a Windows machine for a well supported solution.