Here is my working jit.openni Max/Jitter OpenNI external for Windows.
Also, I installed a fresh copy of Windows 7 on another partition and installed only the following configuration:
OpenNI 126.96.36.199/NITE 188.8.131.52 for Win32 + SensorKinect Win32 Device Driver v0.7 (184.108.40.206)
and now when I open the Max test file you’ve provided, Max crashes immediately.
Here is a screenshot of today’s try on fresh Windows 7 with Max and the same set of drivers. It just keeps giving the same error… any ideas why that might be?
KinectSDK 1.5 + OpenNI/NITE at the same time
I just found that while looking at how to sort out my problem with the PrimeSense drivers.
It might be interesting for you to look at.
@diablodale: maybe it’s possible for you to do incremental releases of the K4W version? I mean, if audio support isn’t available at the moment but the main features like depth, RGB, and skeleton tracking are working, maybe for many people that’s enough to start working, and we’d be saved all the pain of OpenNI driver installation issues. If you implement new features in the future, like skeleton orientation or face tracking, then you can push updates to the public object…
It’s just an idea, without knowing all the insides of object development.
Of course, all respect for your decision as the kind developer of the object! :)
Well guys, I just tested the kinect-mssdk-openni-bridge with the Kinect 1.5 SDK and the openni-win32-220.127.116.11-dev & nite-win32-18.104.22.168-dev,
and everything works perfectly! jit.openni is getting the data! My nightmare from the last few days is over :)
Sounds great. Are you using K4W or K4Xbox? (I couldn’t see that from a quick scan of your posts.)
Is it possible to post a walkthrough of the steps to get jit.openni working on Windows? It would ease the process of using it, because right now it’s a little bit obscure…
@lembert.dome with Kinect for Xbox
I downloaded the latest external, put it in the Jitter externals folder, and downloaded all the drivers, but the red light on the Kinect does not come on when I bang in the test file…
Does somebody have an idea?
The Max window keeps saying config initialization open failed (this operation is invalid!)
Thanks a lot Diablodale for your help; the external works great… I was using the jit.openni.xml instead of an OpenNI XML…
@diablodale well, I will argue about that, because I installed the OpenNI and NITE SDKs by themselves and it didn’t run; once I installed the Kinect SDK and the kinect-mssdk-openni-bridge, it started working. Also, I was wondering: is there a way to configure the colours for tracked objects/persons in the UserPixelMap?
Another thing I’ve been noticing is glitches in my audio. I suspect this is caused by the kinect-mssdk-openni-bridge, but I just wanted to mention it in case someone has experienced it. Basically, while the Kinect is running, if I play music from any player the whole sound starts glitching. Another question: is there a way to close the Kinect when I stop the patch?
One thing I forgot to mention: if you are using the Kinect SDK & OpenNI & NITE with the bridge, you have to remove the Scene node.
@EletronicElement, I have no testing or support for the mssdk-openni-bridge. You will want to speak to that developer. I have never gotten reports of glitching of audio. Such behavior tends to be due to excessive CPU usage and/or USB bottlenecks.
If you want to use the OpenNI and NITE code (not the bridge) then remove the Microsoft SDK, remove the bridge. Install only the OpenNI and NITE *full sdks* and then test several of the samples they provide. If these samples don’t work, then nothing else (like jit.openni) will work.
The UserPixelMap has no color. It’s just data which you can interpret as you want.
Max patches never stop; they are always running. If instead you are asking what happens when you close your patcher, the answer is that I call the OpenNI functions to stop the attached sensor. The same happens if you delete the jit.openni object. At this time, there is no separate "close" or "stop" command. I think you could simulate it by sending a read message with an invalid filename.
The Scene issue is known. It’s documented above in this thread and in the jit.openni wiki’s issues tab. OpenNI or NITE has a bug unrelated to the bridge.
I’m back with more Kinect tests using the latest version, and now I’m trying to get floor coords, but instead of normal values I’m getting these:
-1.#IND -1.#IND 0. 0. 0. 0.
The XML is configured only to get skeleton data, and I added Scene1 to get floor messages…
Am I missing something? Any ideas?
@carsol, it is possible for you to get undefined values like that in OpenNI. I have seen it. It happens in situations where the OpenNI stack can’t reliably determine where the floor is. You need to bang the jit.openni external at a reasonable rate (at least once per 50ms) for it to derive the floor. If the floor is not visible to the Kinect, that is more challenging for their algorithms. It can still work, but becomes more unreliable.
——–> JIT.OPENNI RUNNING ON OSX <——–
@diablodale I’m happy to report I now have jit.openni running on my Mac on OSX 10.6.8 (Snow Leopard). I’m currently testing on Max 5.1.8 + Max 6.0.7, OpenNI v1.5.4.0, NITE v22.214.171.124. All seems good so far: no crashes, and it launches correctly each time (knock on wood). I even tried reloading the XML file over and over, and it just takes a second or two to come back online each time. Skeleton data starts outputting within seconds as well.
See my comments on bottom of your Toolbox page. http://cycling74.com/toolbox/jit-openni/
I left you my contact info so I can pass it along to you, so we can get all the Mac users doing OpenNI directly in Max. It’s so nice, and it gives more OpenNI data simultaneously than any other app I’ve seen.
Great news! Let me know if you need some more testing…
@dtr and other OSX types. Send me your email address and I can send you the compiled object, help patch, and modified xml file to get you started. I would really like to release the source to diablodale first and let him decide how, when, and where he wants to post that.
As far as testing goes, it’s running great so far. Still no crashes or hiccups. It works every time, which is amazing after spending two years using the Kinect and the various libs and tools, which were more prone to crashes or not starting up each time. Perhaps that has more to do with OpenNI/NITE library improvements, as well as better implementation of NI calls by developers. I must say I’m also impressed with how well Cycling74 has done with their cross-platform implementation. At the end of the day, I barely had to change anything to get it to compile and run on OSX. diablodale has really written a pretty robust object. Hats off to him for sharing this with us.
stu that is awesome.
@controlfreak: I finally found the time to give your OSX port a try. Got it running fine on my 10.8.2 hackintosh and 10.7.5 macbookpro, both in Max6.0.7. The included help patch is great and incorporating it in my system, replacing the SimpleOpenNI Processing app I had previously, was a breeze. Time ran out before I could do serious stress testing but that will surely happen shortly.
One thingy: you’re using the slice object in the help patch, which seems not to be a standard Max object and thus doesn’t load. It’s of minor concern as it’s just used for displaying the data.
Oh and the user map window doesn’t display anything. Is that expected?
Many many thanks for this! It’s great to finally be able to integrate skeleton tracking entirely in my Max workflow.
I also tried to find a way to address multiple Kinects connected to the same computer for skeleton tracking. I didn’t succeed by messing with the XML file, and from what I read on the openni-dev Google group it is not possible with the XML config file approach, but it is in C++ code. See David Menard’s post: https://groups.google.com/d/msg/openni-dev/u3uWYWwJid0/tyOtePCejGsJ
Diablodale and ControlFreak, do you think this could be incorporated, so that the jit.openni object can be given an argument/message for selecting which camera to use? That would fix the issue that skeleton tracking with multiple Kinects needs as many computers as Kinects.
If there are issues with multiple of those OpenNI processes all running in the Max process (I have too little programming experience to formulate this better), I’d be totally fine with it if it requires exporting the patches as standalone apps so they run in individual processes.
@dtr, where do you see a (slice) object? I can’t find one. I see (zl slice), which the documentation says is supported on both Max 5 and 6.
What are the Max version and OS under which you see this error? And what is the specific error message that you receive? Meanwhile, you are welcome to use the original patch that I created for jit.openni: https://github.com/diablodale/jit.openni/blob/v0.7.3/bin/jit.openni_test1.maxpat
The usermap window should show data only after a user is detected. Did you stand away from the Kinect so it can see your whole body? Also, in some cases the color it draws the body in will be a dark grey, so look closely if your monitor can’t display the distinction between black and dark grey. In both demo patches, you will only get a usermap if you also get skeleton joints. If you are unfamiliar with usermaps, I encourage you to read over what a user generator does/generates in the OpenNI documentation http://openni.org/Documentation/ProgrammerGuide.html#Concepts
I have integrated controlfreak’s changes into the main codebase of jit.openni. He and I are coordinating some future testing and updates, likely to come in the next few weeks. I *CAUTION* people using the tilt functionality in controlfreak’s test build: you may damage your Kinect with frequent use. He and I are discussing this and will likely include some changes in an upcoming update.
Personally, I have no intention for jit.openni to support selecting and initializing OpenNI devices (like Kinect, Xtion Pro, etc.) outside the functionality of the XML file. The core method of choosing which device, what resolutions, fps, which middleware, etc. is all via the XML config file. I encourage avin (maker of SensorKinect) or someone else to enhance the SensorKinect driver so that it supports multiple Kinects via the XML file. After that is addressed, there is a potential issue with having multiple user generators in the same Max process.
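To make that concrete, here is a hedged sketch of how the XML file drives those choices. The node and attribute names follow the OpenNI 1.x production-node schema; the specific values are examples only, not taken from any shipped config:

```xml
<!-- Example OpenNI 1.x production node; values are illustrative only -->
<Node type="Depth" name="Depth1">
  <Configuration>
    <!-- resolution and frame rate are chosen here, not in jit.openni -->
    <MapOutputMode xRes="640" yRes="480" FPS="30"/>
    <Mirror on="true"/>
  </Configuration>
</Node>
```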
For Windows users, dp.kinect supports multiple Kinects and I encourage its use over jit.openni on the Windows platform.
The slice objects (not zl slice) are in the OSC_JOINTS subpatch. From the way it’s patched, it looks like it should be zl slice indeed. Error:
newobj: slice: No such object
If you don’t have it in your patch perhaps we’re using a different version of controlfreak’s help patch? I received this one on the 6th of october.
About user maps, I’m indeed seeing no output while skeleton tracking is active. I’ll doublecheck when I’m back in the lab tomorrow.
As stated, my systems are 10.8.2 hackintosh and 10.7.5 macbookpro, both Max6.0.7.
@dtr, ok thanks for confirming the OS/Max. Just use the demo patch I provided you via the link (it’s on GitHub). It has had nearly a year of testing. Remember that you are running a private branch of v0.7.3, so use the link I provided or navigate to the v0.7.3 version on GitHub. Reason being, there is a newer version of the demo patch, but it requires a newer version of jit.openni which hasn’t been compiled/tested on Mac yet.
Using that demo patch, stand away from your kinect and ensure you are getting OSC output in the Max window. If you are, you should also be getting an outline of your shape in the usermap pwindow in a dim grey. Could be bright grey, but likely dim.
If that still doesn’t work, get the updated XML config file at https://github.com/diablodale/jit.openni/blob/master/bin/jit.openni_config.xml, as there might be an error in the config file he distributed privately. Use the current version of the XML file, because you do need an OpenNI bug workaround that I’ve included in it.
This will all be sorted out with a publicly released version on https://github.com/diablodale/jit.openni very soon. Until then, I don’t recommend that you use the files he distributed for anything other than a proof of concept. I greatly appreciate and recognize controlfreak’s work and collaboration on this! :-) I have long hoped for an OSX partner! We now just need a week or two to sync, compile, and test the code to release it for everyone’s general use.
Splendid ;-) !
Alright, good to hear it’s developing! I’m going to hold off on further testing till the new version’s out. I have several shows in the coming weeks and would rather rely on my tried-and-tested Processing solution for now. Feel free to send stuff over if you want some extra feedback.
Btw, I had attempted to compile jit.openni for OSX a while back. I got help from a programmer friend and we managed to get a basic object to compile, but as soon as he left I was lost. Too complex for a novice coder like me, so I left it at that.
Hello, I’m back on the grid.
dtr, I believe you have an older release I sent out to people I saw asking about this object for OSX. It wasn’t an official release, more to share with others that wanted OpenNI natively inside Max on OSX, as I did. My early help patch did use the slice object, which shows how long I’ve been using Max; I got used to that one even when zl was available. Excuse my not keeping up with the times. I’ve changed it in the newer help patch, which will get some more updates soon as I’ve had more time to play with the object myself. To be honest, I kept using Max 5.x until just recently, when I started using 6 more. I have just dived into jit.gen, which is faster for some of the math I was doing to reformat the depth information to what I prefer. Dale has already begun the merge of the platforms, and it shouldn’t be long before we have a nice clean Xcode project on GitHub with the features I added, only implemented more correctly.
So a question to anyone on the forum: have any of you been able to get a real PrimeSense sensor?
I haven’t, after submitting many requests over the past 2 years. It has higher resolutions on both the RGB and depth sensors than the Kinect, so its improvements will be welcome additions.
Sorry, a bit frustrated from building my own solution since last night. This is perfect news; I will build a new Max app that will have big similarities with Synapse, but with all of the updated OpenNI driver goodness + tilt + Syphon output.
Since it is in this app that I will continue, could somebody maybe help me out with the problems I now have?
The port of jit.openni to Mac OSx is done and we’re doing some private testing now. Based on feedback from the testing, we could release a compiled build of it next week. :-)
Thanks for your efforts; I’m very much looking forward to seeing it working. I tried to compile version 0.8.5 on my Mac, but I’m not getting any output. I installed OpenNI 1.5.4 using the Zigfu installer, but when I try to open the config.xml file, I get "XML config initialization open fail (the file is corrupted!)". Would you have any idea, or should I just wait for the final version? The SimpleNI program in Processing works, so I guess the installation using Zigfu worked.
@nimar, your timing is amazing. Today I was to post the following and it directly addresses your inquiry.
There is a bug that I have not yet been able to resolve on Mac OSX. The bug is related to an interaction between OpenNI running in a library (a Max external), loading files/XML config/defaults, and the current working directory. Adding to this, OpenNI’s logging to console and to file both fail when OpenNI runs in a library.
Loading the debug XML file (the 2nd message box in the distributed patcher), which loads a prerecorded ONI, works on all platforms. I can see that the external works after casually testing that. However, a real Kinect continues to be out of reach on Mac OSX.
Because I cannot get any debugging information from OpenNI’s internals, I am going to try to compile OpenNI today on Windows. I can cause a similar error on Windows. Perhaps if I resolve the issue on that platform, I can resolve it on Mac. I am far more skilled on Windows.
If anyone can assist in resolving the issue, it will be appreciated by all.
Hi all. The Mac OSX port of jit.openni is ready for a wider set of testing. I believe we have a fix for the issue that has blocked us for weeks.
Full doc and setup instructions are referenced in the README file and the wiki at https://github.com/diablodale/jit.openni/wiki
Downloads are at https://github.com/diablodale/jit.openni/downloads
If you find issues, please report them at https://github.com/diablodale/jit.openni/issues
I may sound noobish, but does that mean it’s (maybe going to be) possible to use the supposedly Windows-only Kinect with MaxMSP on a Macintosh?
Great! Though I don’t have a Kinect for now, I always thought it would be impossible to use a Windows Kinect on a Mac… so this is good news :)
Sorry I’m not entirely clear about this yet. There’s a Kinect for Xbox and a Kinect for Windows sold by Microsoft. Will jit.openni also work for the Win version? Or does it need the KinectSDK?
Ah eh… in fact, same question… I read too fast, Diablodale, and I thought you were talking about the Kinect for Windows, which is designed to work with Windows and not with the Xbox. IIRC, the Kinect for Windows has better accuracy (?) or something like that… so that’s why it would be awesome…
The Kinect hardware is manufactured in two forms: Kinect for XBox (KX) -and- Kinect for Windows (K4W). Both of them are pieces of electronics which attach to other electronics via USB. So if you can talk USB, then you can use the hardware.
Drivers exist which allow the KX hardware to run on Windows and Mac. https://github.com/avin2/SensorKinect
Those same drivers have preliminary support for the K4W hardware. I do not know how stable it is.
There are very subtle disclosed differences between the two pieces of Kinect hardware. The K4W firmware allows for much closer use ("near mode"), which trades long-distance usage for near-distance usage. This is probably what you have heard about. Other than that, there are no known differences in accuracy. In normal (not near) mode, the two Kinects have the same accuracy.
There are licensing differences between the two pieces of hardware. Microsoft publishes those licenses and I leave it to you to be compliant with them.
Thanks diablodale! I got it working now.
Does anyone have a clue about this error: "jit_openni: XML config initialization open failed (Failed to open the file!)"?
Is it correct that there is no way to terminate or pause the tracking process? One can stop banging jit.openni and it will not output matrices, but as far as I can tell NITE keeps running in the background. It would be a handy addition if it could be stopped without closing the patch.
Talking of matrices, it seems that when the depth, user map, etc. outputs are turned off, the corresponding outlets still fire matrices, though empty ones. This can eat up CPU cycles while users think they turned those off, especially when stuff is connected downstream, like a jit.pwindow.
I’m using the Mac version.
Oh, and here’s another suggestion: an additional mode where joints are output as a 15×1 matrix with 4 planes: confidence, x, y, z. For each bang it would output the currently stored matrix instead of all the separate messages it does now. Now that I think of it, I’m not sure how this should work for multiple users/skeletons. I guess it could output a variable-dim matrix with 1 row for each tracked skeleton.
I propose this ’cause I do all my processing on the joint data with matrix operations. Much more efficient/convenient than messages, colls, etc. It can also be run straight into a jit.gl.mesh for visualization.
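A rough sketch in plain Python of how such per-bang packing could behave (the joint names and sample values are invented for illustration; this is not part of jit.openni): one row per tracked user, 15 joints, each carrying (confidence, x, y, z):

```python
# Hypothetical packing of per-joint skeleton data into one fixed-shape
# matrix-like structure, one row per tracked user. JOINTS and the sample
# values below are invented for illustration.

JOINTS = ["head", "neck", "torso",
          "l_shoulder", "l_elbow", "l_hand",
          "r_shoulder", "r_elbow", "r_hand",
          "l_hip", "l_knee", "l_foot",
          "r_hip", "r_knee", "r_foot"]  # 15 joints, as in the suggestion

def pack_users(users):
    """users: {user_id: {joint_name: (confidence, x, y, z)}} ->
    list of rows, one per tracked user, each row 15 joints x 4 values."""
    rows = []
    for uid in sorted(users):
        joints = users[uid]
        # Missing joints get confidence 0 so every row stays fixed-size.
        row = [joints.get(name, (0.0, 0.0, 0.0, 0.0)) for name in JOINTS]
        rows.append(row)
    return rows

matrix = pack_users({1: {"head": (0.9, 0.1, 1.6, 2.3)}})
# matrix[0] has 15 entries; untracked joints are zeroed
```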
Can anyone get the IR stream from jit.openni on Mac?
If I try to comment out the "Image1" node in the configuration file, I get an error.
There have been people successfully using the IR stream on the Mac.
Perhaps there is a language/translation difficulty. I don’t understand what you wrote above. Did you mean "uncomment"?
Please remember, the Kinect does not support simultaneous color, depth, and IR streams all at the same time. The Kinect does not have enough bandwidth. You can have color -or- IR; you cannot have color -and- IR.
Thanks for the reply, I’ll try next week,
but now I have a new question, if you can help.
I’m on Windows now with two Xtion Pro Live connected.
I create two jit.openni objects and send each a "read jit.openni_config.xml" message,
but I get the same image from both jit.openni objects.
I can’t start the second Xtion.
I’m using a MacBook Pro with Boot Camp.
You will need to configure two XML files, one for each sensor, and send distinct read messages for each.
In each configuration file, you will need to configure it to connect and attach to a distinct Xtion. I do not know how that is done for the Xtion. The OpenNI hardware driver for the Xtion defines how this is done, and I recommend you consult the ASUS documentation to see how to configure OpenNI for it. It is likely something that goes within a <query> section of a <node>.
Perhaps also search around on Google. I found at least this: http://answers.ros.org/question/61211/problem-with-xtion-pro-live-and-openni_camera/
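For illustration only, a second config file might look like the sketch below. The <Query> sub-elements are documented in the OpenNI 1.x schema, but whether the Xtion driver exposes anything that distinguishes one physical unit from another is exactly the open question:

```xml
<!-- jit.openni_config_2.xml (hypothetical second-sensor file) -->
<Node type="Depth" name="Depth2">
  <Query>
    <!-- documented query fields; these select a vendor/driver,
         not necessarily a specific physical unit -->
    <Vendor>PrimeSense</Vendor>
    <Name>SensorV2</Name>
  </Query>
</Node>
```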
OK, you are so kind.
I’ll try to play with the configuration file.
I googled a lot and tried many configuration files, but without luck.
I’m trying to use two Xtions at the same time;
this is my test patch:
<code>-- Pasted Max Patch, click to expand. --Copy all of the following text. Then, in Max, select New From Clipboard.----------begin_max5_patcher---------- 623.3oc6VsriaBCEcM7UX40oQ17JIyt9cTUgLAOLNBrYLNSncz7uWigjPZBI LITqYQ2fEW+5bN9360u65.SD0zJH3IvO.NNu653XB0Dvo6eGXAodcNoxLLXA sphjQgyZ6SQqUl3RJIErgolKJobNKdsf+LKKFOutHe+fEaU4Tk5Wkz1cDBA+ rqqmEbUE62lNvdyQcg4aKXb8bL6sWWvRhZ8KLdVrjtV0tR9d954.vgsMQHSy x4nCafdkZ2dyRg6hxRMfWjr4a3.XOrvIEFr.+tjQxgMc7gqaymY+SDJ6HSQ8 Eov6Qj7mVQpQGJ2w3ohcCy9Y8UfSnJ9lNBuv.CKCZIKJXP15cA1h9ZwIbziy oU2Ck3zc54dlUtTx3J3mvZdsyqP77PM2vqZHkuggdnAYF5blsbZslVfyd9nG jyKrBmOlsZPCsdHwEDkjU2XsG+emdQ3Q0yksWLVYtlrLxHtCqmQmqmQVQOUf DPxfRYBgm0HKl1oRZ5RcfQ3doNth1bgLGg+2qcAA898ZS76MThrrb5fhVSVq 6sfSvgLSWkiGWHolMJpLlxII4zAxc4OgtlW06mT.7QW+V0T8NpNcYgQWhBGo tz+kEi6r2rfvbF+ueitASMwOUppDakq2y3t7YfvCvJkVoXbhhI38FittI3Hz egklR48OxRYUMmhFvit3Q1mAMnafF8atrEbzIz.3a.mHqhlaINAVCMAi.Mqr JZtkMdg0Pi+XrwVCM3wbTYOaL1+KkOFOBzDdmnoM6Lor7MprpaIM.QWxZiPd nrOTWTo8WSk.nj9Fa+3MEYfDotBiRWdYqrsRP8xHcc.897g6e72CY7B -----------end_max5_patcher-----------
and this is my simplified configuration file:
<Node type="Depth" name="Depth1">
<Mirror on="true" />
The first device works correctly, but when I try to open the second device by sending "read jit.openni_config_2.xml" I get this error:
jit_openni: XML config initialization open failed (Failed to set USB interface!)
Based on the OpenNI documentation, the "query" element has these properties:
- "Vendor", specifying the requested node vendor.
- "Name", specifying the requested node name.
- "MinVersion", specifying the requested node minimum version.
- "MaxVersion", specifying the requested node maximum version.
- "Capabilities", specifying a list of capabilities the node must support, each under a "Capability" sub-element.
- "MapOutputModes", specifying a list of map output modes that should be supported by the map generator, each under a "MapOutputMode" object, containing three attributes: "xRes", "yRes" and "fps".
- "MinUserPositions", specifying the minimum number of user positions supported by a depth generator that has the "UserPosition" capability.
- "ExistingNodeOnly", specifying that only existing nodes (i.e. nodes that were already created) will enumerate.
- "NonExistingNodeOnly", specifying that only non-existing nodes (i.e. nodes that were not created yet) will enumerate.
- "NeededNodes", specifying that only production trees containing specific nodes are valid. Those nodes are declared using a sub-element named "Node".
None of this says anything about a USB interface.
I also tried modifying the file (on Mac OS X) /usr/etc/primesense/GlobalDefaults.ini, uncommenting the line
; USB interface to be used. 0 - FW Default, 1 - ISO endpoints, 2 - BULK endpoints. Default: Arm - 2, other platforms - 1
but got the same error.
I tried my patch on a MacBook Pro and on a Mac mini with 4 USB ports and got the same error.
Can anyone use two Xtions or two Kinects with jit.openni?
If someone has managed it, could you post your configuration file?
Thanks so much.
It is not possible to use two Kinects with jit.openni because the OpenNI hack hardware driver for the Kinect did not fully implement handling for multiple devices.
The XML file that you posted above doesn’t have anything in the query area that would suggest a specific device: no ID, no index number, no USB hub, nothing. And you list PrimeSense… but isn’t your sensor an ASUS? I’m surprised that one even works.
If ASUS wrote the driver to support multiple devices, then I would hope they also documented how to target a specific device through the query. That’s how OpenNI was designed to work. If you find someone successful or get support from ASUS (have you called them to ask how to configure the OpenNI XML file?), please do post here so everyone can benefit.
The Asus Xtion is a PrimeSense/OpenNI sensor/camera, just like the Kinect.
If 2 depth maps are all you’re after, then jit.freenect.grab supports multiple Kinects, though I don’t know if it works for the Xtions as well. If you want 2 skeletons, then you’re outta luck: not supported by OpenNI, the Kinect SDK, or freenect (the latter doesn’t do skeletons at all). I have 2 computers for my 2 Kinects because of this.
Some people on the OpenNI forums found a way to run 2 instances of OpenNI skeleton tracking on 1 computer, but it’s too technical for me/most.
The topic is about the Xtion and OpenNI. However, I would like to correct some out-of-date information that dtr shares with good intention. :-)
Good news: things are better than dtr suggests with the Kinect SDK and dp.kinect.
Because dp.kinect is based on the Microsoft SDK and supports multiple Kinect sensors, the functionality has improved substantially. Up to 4 Kinect sensors can be used together for skeletal tracking. Each sensor can track 6 people and 2 detailed skeletons. That means if you have a powerful CPU, then…
4 Kinect sensors on one computer can…
detect (4 x 6) = 24 people
track detailed joints on (4 x 2) = 8 skeletons
Details at http://msdn.microsoft.com/en-us/library/dn188677.aspx
I appreciate your interest in multiple sensors. The Mac continues to be a world of hacks and half-implemented hardware. It’s just the way it is. :-/ I hope that you have luck contacting ASUS and that they provide you the way to configure OpenNI for multiple ASUS sensors. Don’t forget to tell them this is OpenNI 1.x (not 2.0).
> Because dp.kinect is based on the Microsoft SDK and supports multiple Kinect sensors…
You serious?! Damn you PrimeSense! Why the f*ck can’t you do it if M$ can…?
Btw, do they aggregate into 1 tracking process or do they stay separate? I.e., do they automatically merge their tracked data?
No aggregation; that same MSDN URL speaks to that. ;-) No matter whose tech you use, if you get down to tracking users (or further down to skeleton joints) across sensors, the challenges to solve are at least:
- Aligning the coordinate spaces across the sensors
- Merging seen logical users if 2+ Kinect observe the same physical user
- Handoff/merging of a user as they leave one Kinect’s range and enter another
- Managing the delay that occurs for a Kinect to identify and track a user. For fast-moving people, it’s problematic if you have little overlap between Kinects
I started creating some Max code to tackle some of these, but stopped as I chose to take another approach than using skeleton tracking. (I am releasing an update to dp.kinect within the next 2 weeks that does help -a little- with this.)
I see… (sorry, in all my excitement I missed that link you posted)
Well at least both Kinects can be attached to the same computer and do skeletons now. I already have skeleton merging logics (though merging pointcloud data *before* skeleton tracking would surely yield better results).
Drop me a message at dtr(d0t)vndrn(@t)gmail(d0t)com if you’d like to see how I’m merging. Perhaps we can exchange techniques.
Off to test my system on Windows… And now I stop spoiling this jit.openni thread with dp.kinect stuff.
haha, no worries. I know you’re not dumb. Just a friendly "ribbing" along w/ the info. ;-)
I’m still working on how to use multiple devices (Xtions) with jit.openni.
My results so far:
– On a MacBook Pro with OS X 10.8.3, I successfully tested two Xtions at the same time using two different applications (an openFrameworks example + drgb openni), so now I can exclude a USB bus problem
– I found on the internet two working examples of OpenNI with multiple sensors;
here are the links:
multi sensor example working code
Openframework multi sensors version
– I downloaded the jit.openni source code and successfully compiled it, but I’m not skilled enough to add the code to list available devices and open the second device when the first is already open.
I’m not sure it is enough to add XML to the config file to force jit.openni to open the second device; maybe we need some code in the object itself.
I hope someone can help me.
I am also aware of custom hacked C code that has been written to connect two Kinects at the same time. I have not investigated that code. Instead, I am using the SDKs released by OpenNI and the hardware drivers that come from the manufacturers or other people, without hacking beyond them. If you find a way to easily adjust the jit.openni code that falls in line with the SDKs, I don’t mind partnering with you to add that feature.
I caution you about running multiple sensors on the same USB controller. For example, *one* Kinect capturing depth, color, and IR all at the same time is impossible at 30fps:
USB 2.0 theoretical bandwidth = 60 MB/sec (it’s actually less due to overhead)
depth 640×480 = 18.432 MB/sec
color 640×480 = 27.648 MB/sec
IR 640×480 = 18.432 MB/sec
Total needed = 64.512 MB/sec (it’s too much… just *one* Kinect)
The authors of hardware drivers for the sensors can each choose whether to limit multiple sensors in code or leave the error handling for bandwidth congestion to us. If you limit the data types you collect and your hardware driver doesn’t force separate hubs due to the bandwidth limitations, it could be possible to have them on the same hub.
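The arithmetic above can be sketched as a quick back-of-envelope check, assuming 16-bit depth/IR pixels and 24-bit RGB at 30 fps (which matches the figures quoted):

```python
# Back-of-envelope USB 2.0 bandwidth check for one sensor.
# Assumes 2-byte depth/IR pixels and 3-byte RGB pixels at 30 fps,
# consistent with the figures above (1 MB = 1,000,000 bytes).

def stream_mb_per_sec(width, height, bytes_per_pixel, fps):
    """Raw stream payload in MB/sec, ignoring protocol overhead."""
    return width * height * bytes_per_pixel * fps / 1_000_000

depth = stream_mb_per_sec(640, 480, 2, 30)   # 18.432
color = stream_mb_per_sec(640, 480, 3, 30)   # 27.648
ir    = stream_mb_per_sec(640, 480, 2, 30)   # 18.432

total = depth + color + ir
print(f"total {total:.3f} MB/sec vs ~60 MB/sec USB 2.0 ceiling")
# 64.512 MB/sec exceeds the bus even before protocol overhead
```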
Very good news:
– uninstall OpenNI 1.5
– download OpenNI2 for OSX
– run sudo ./install.sh
– copy all the files from the Redist directory to /usr/lib
– go to the OpenNI2/Samples/bin folder
– double-click "MultiDepthViewer"
and voilà, the two Xtions are working together!
Tomorrow I’ll look for the same in OpenNI 1.5.
Did you manage to have 2 Xtions (Pro or Pro Live?) on the same Mac OSX machine (MBP?)?
With 2 jit.openni in 1 or 2 Max apps?
It is possible to use two Xtions on an MBP, but not with two jit.openni objects.
I used the example apps you can find in the OpenNI2 SDK for Mac.
I don’t get it. You are not using it in Max?
Do you mean you have 2 Xtion Pro Live in 1 jit.openni in Max in OSX???
Can you still use a Kinect, or does the ASUS driver only work for the Xtion?
With the Live version, do you have the RGB stream?
I tried to use two Xtions in Max but with no success.
jit.openni can manage only 1 Xtion right now;
you can’t create two objects to manage two Xtions because you will get an error.
So my idea was to modify the multi-camera Xtion example (in the Xtion Mac SDK) to send the two Xtion frames via Syphon to Max, but right now I don’t have time.
I can use the Kinect; I installed the driver via the sensecast dmg.
I caution everyone that OpenNI is dead. Next month, the OpenNI website is closing. When this happens, there may be no legal place to download NITE (the essential component of OpenNI that does skeletal tracking). This could lead to software piracy and illegal distribution. NiMATE, jit.openni, Synapse: they all use OpenNI.
Thank you Diablodale for the caution; this is a big problem for me and other people who are using the ASUS Xtion on OSX.
I’m thinking of going back to using the Kinect with libfreenect; probably the problem with the 1473 model has been fixed by now.
Any news on this subject is appreciated.