
Here is my working jit.openni Max/Jitter OpenNI external for Windows

June 2, 2012 | 7:18 pm

Also, I installed a fresh copy of Windows 7 on another partition and installed only the following configuration:
Max/MSP/Jitter 6.0.5
OpenNI 1.3.2.3 / NITE 1.4.1.2 for Win32 + SensorKinect Win32 Device Driver v0.7 (5.0.3.4)

and now, when I open the Max test file you’ve provided, it just crashes Max directly.


June 3, 2012 | 2:13 pm

Here is a screenshot of today’s try on a fresh Windows 7 with Max and the same set of drivers. It just keeps giving the same error… any ideas why that might be?


Attachments:
  1. problem.jpg

June 4, 2012 | 3:56 am

KinectSDK 1.5 + OpenNI/NITE at the same time

http://code.google.com/p/kinect-mssdk-openni-bridge/

http://groups.google.com/group/openni-dev/browse_thread/thread/b56f8587a1217aa1

I just found that while looking for how to sort out my problem with the PrimeSense drivers.
It might be interesting for you to look at.


June 4, 2012 | 12:20 pm

@diablodale: maybe it’s possible for you to do incremental releases for the K4W version? I mean, if audio support isn’t available at the moment but the main features like depth, RGB, and skeleton tracking are working, maybe that’s enough for many people to start working with, and we are saved from all the pain of the OpenNI driver installation issues. If you implement new features like skeleton orientation or face tracking in the future, then you can push updates to the public object…

It’s just an idea, without knowing all the insights of object development.
Of course, all respect for your decision as the kind developer of the object! :)

thanks!


June 4, 2012 | 4:07 pm

Well guys, I just tested the kinect-mssdk-openni-bridge with the Kinect 1.5 SDK and openni-win32-1.5.2.23-dev & nite-win32-1.5.2.21-dev,
and everything works perfectly! jit.openni is getting the data! My nightmare from the last few days is over :)


June 4, 2012 | 4:17 pm

Sounds great. Are you using K4W or K4Xbox? (I couldn’t see that from a quick scan of your posts.)


June 4, 2012 | 5:44 pm

Is it possible to post a walkthrough of the steps to get jit.openni working on Windows? It would ease the process of using it, because right now it’s a little bit obscure…

Thanx Ben


June 4, 2012 | 5:53 pm

@lembert.dome with Kinect for Xbox


June 4, 2012 | 7:49 pm

@benoit-1842 check this -> https://github.com/diablodale/jit.openni/wiki


June 4, 2012 | 8:28 pm

I have downloaded the latest external, put it in the Jitter externals folder, and downloaded all the drivers, but the red light of the Kinect does not come on when I bang in the test file…

Does somebody have an idea?


June 4, 2012 | 9:25 pm

The Max window is always saying that config initialization open failed (this operation is invalid!)


June 4, 2012 | 10:38 pm

@benoit-1842 follow the discussion between me and diablodale above; I went through some issues and wrote about everything.


June 10, 2012 | 1:33 am

@benoit-1842, do you have the OpenNI XML config file required?


June 10, 2012 | 1:34 am

@ElectronicElement, it reads as if you have not installed the SDK. Only the OpenNI and NITE SDKs have the samples.


June 10, 2012 | 3:00 am

Thanx a lot Diablodale for your help, and the external works great… I was using the jit.openni.xml instead of an OpenNI XML…

thanx again


June 10, 2012 | 11:15 am

@diablodale well, I will argue about that, because I installed the OpenNI and NITE SDKs by themselves and it didn’t run; once I put in the Kinect SDK and that kinect-mssdk-openni-bridge, it started working. Also, I was wondering: is there a way to configure the colours for tracked objects/persons in the UserPixelMap?
Another thing I’ve been noticing is glitches in my audio. I suspect this is caused by the kinect-mssdk-openni-bridge, but I just wanted to mention it in case someone has experienced it. Basically, while the Kinect is running, if I play music from any player the whole sound starts glitching. Another question: is there a way to close the Kinect when I stop the patch?
Thanks


June 10, 2012 | 5:07 pm

One thing I forgot to mention: if you are using the Kinect SDK & OpenNI & NITE with the bridge, you have to remove the Scene node from the jit.openni_config.xml.
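
For anyone who hits this later: the element to delete is the Scene production node. In my copy of the config it looks roughly like this (a sketch only; your file may differ, but the thread later refers to the node as Scene1):

_________________________________
<!-- remove this node when using the Kinect SDK + bridge -->
<Node type="Scene" name="Scene1">
<Configuration>
<Mirror on="true" />
</Configuration>
</Node>
_________________________________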


June 11, 2012 | 12:00 pm

@ElectronicElement, I have no testing or support for the mssdk-openni-bridge. You will want to speak to that developer. I have never gotten reports of audio glitching. Such behavior tends to be due to excessive CPU usage and/or USB bottlenecks.

If you want to use the OpenNI and NITE code (not the bridge) then remove the Microsoft SDK, remove the bridge. Install only the OpenNI and NITE *full sdks* and then test several of the samples they provide. If these samples don’t work, then nothing else (like jit.openni) will work.

The userPixelMap has no color. It’s just data which you can interpret as you want.

Max patches never stop; they are always running. If instead you are asking what happens when you close your patcher, the answer is that I call the OpenNI functions to stop the attached sensor. The same happens if you delete the jit.openni object. At this time, there is no separate "close" or "stop" command. I think you could simulate it by sending a read message with an invalid filename (e.g., a message like "read nofilehere.xml").

The Scene issue is known. It’s documented above in this thread and in the jit.openni wiki issues tab. OpenNI or NITE has a bug unrelated to the bridge.


July 4, 2012 | 7:09 pm

Hi…
I’m back with more Kinect tests using the latest version, and now I’m trying to get floor coords, but instead of normal values I’m getting these:

-1.#IND -1.#IND 0. 0. 0. 0.

The XML is configured only to get skeleton data, and I added Scene1 to get the floor messages…


Am I missing something? Any ideas?
thanks!


September 6, 2012 | 7:35 pm

Hi all. Here is the Microsoft SDK version of this external.
Please see the post at http://cycling74.com/forums/topic.php?id=42319


September 8, 2012 | 6:32 pm

@carsol, it is possible to get undefined values like that from OpenNI. I have seen it. It happens in situations where the OpenNI stack can’t reliably determine where the floor is. You need to bang the jit.openni external at a reasonable rate (at least once per 50ms; e.g., a metro 33 will do) for it to derive the floor. If the floor is not visible to the Kinect, that is more challenging for their algorithms. It can still work, but it becomes more unreliable.


October 5, 2012 | 1:28 pm

--------> JIT.OPENNI RUNNING ON OSX <--------

@diablodale I’m happy to report I now have jit.openni running on my Mac in OS X 10.6.8 (Snow Leopard). I’m currently testing on Max 5.1.8 + Max 6.0.7, OpenNI v1.5.4.0, NITE v1.5.2.21. All seems good so far, no crashes, launches correctly each time. (knock on wood) I even tried reloading the XML file over and over, and it just takes a second or two to come back online each time. Skeleton data starts outputting within seconds as well.

See my comments on bottom of your Toolbox page. http://cycling74.com/toolbox/jit-openni/

I left you my contact info so I can pass it along to you, so we can get all the Mac users doing OpenNI directly in Max. It’s so nice, and it gives more OpenNI data simultaneously than any other app I’ve seen.

Cheers

Stuart White



dtr
October 5, 2012 | 2:50 pm

Great news! Let me know if you need some more testing…


October 5, 2012 | 9:47 pm

@dtr and other OSX types: send me your email address and I can send you the compiled object, help patch, and modified XML file to get you started. I would really like to release the source to diablodale first and let him decide how, when, and where he wants to post it.

As far as testing goes, it’s running great so far. Still no crashes or hiccups. Works every time, which is amazing after spending 2 years using the Kinect and the various libs and tools, which were more prone to crashing or not starting up each time. Perhaps that has more to do with OpenNI/NITE library improvements, as well as better implementation of NI calls by developers. I must say I’m also impressed with how well Cycling74 has done with their cross-platform implementation. At the end of the day, I barely had to change anything to get it to compile and run on OSX. diablodale has really written a pretty robust object. Hats off to him for sharing this with us.

Stuart stuart@controlfreak.tv


October 6, 2012 | 2:29 am

stu that is awesome.

you rock



dtr
October 21, 2012 | 9:16 am

@controlfreak: I finally found the time to give your OSX port a try. Got it running fine on my 10.8.2 hackintosh and 10.7.5 MacBook Pro, both in Max 6.0.7. The included help patch is great, and incorporating it in my system, replacing the SimpleOpenNI Processing app I had previously, was a breeze. Time ran out before I could do serious stress testing, but that will surely happen shortly.

One thingy: you’re using the slice object in the help patch, which seems not to be a standard Max object and thus doesn’t load. It’s of minor concern as it’s just used for displaying the data.

Oh and the user map window doesn’t display anything. Is that expected?

Many many thanks for this! It’s great to finally be able to integrate skeleton tracking entirely in my Max workflow.



dtr
October 21, 2012 | 9:31 am

I also tried to find a way to address multiple Kinects connected to the same computer for skeleton tracking. I didn’t succeed by messing with the XML file, and from what I read on the openni-dev Google group it is not possible with the XML config file approach, but it is in C++ code. See David Menard’s post: https://groups.google.com/d/msg/openni-dev/u3uWYWwJid0/tyOtePCejGsJ

Diablodale and ControlFreak, do you think this could be incorporated so that the jit.openni object can be given an argument/message for selecting which camera to use? That would fix the issue that, for skeleton tracking with multiple Kinects, you need as many computers as Kinects.

If there are issues with multiple of those OpenNI processes all running in the Max process (I have too little programming experience to formulate this better), I’d be totally fine with it if it required exporting the patches as standalone apps so they run in individual processes.
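
For reference, here is a rough, untested C sketch (against the OpenNI 1.x C API, not jit.openni’s actual code; the helper name and index argument are invented for illustration) of the enumerate-then-pick-a-device approach described in David Menard’s post:

<code>
#include <XnOpenNI.h>

/* Hypothetical helper: instantiate only the n-th Device node that
   OpenNI enumerates. Error handling omitted for brevity. */
XnStatus pick_device(XnContext* ctx, int wanted)
{
    XnNodeInfoList* devices = NULL;
    XnNodeInfoListIterator it;
    XnNodeHandle hDevice = NULL;
    int i = 0;

    /* ask OpenNI for every Device production tree it can see */
    xnEnumerateProductionTrees(ctx, XN_NODE_TYPE_DEVICE, NULL,
                               &devices, NULL);

    for (it = xnNodeInfoListGetFirst(devices);
         xnNodeInfoListIteratorIsValid(it);
         it = xnNodeInfoListGetNext(it), i++)
    {
        if (i == wanted) {
            /* create only the chosen device; depth/user generators
               would then be created and bound against this context */
            XnNodeInfo* info = xnNodeInfoListGetCurrent(it);
            xnCreateProductionTree(ctx, info, &hDevice);
            break;
        }
    }
    xnNodeInfoListFree(devices);
    return (hDevice != NULL) ? XN_STATUS_OK : XN_STATUS_ERROR;
}
</code>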


October 21, 2012 | 12:01 pm

@dtr, where do you see a (slice) object? I can’t find one. I see (zl slice), which the documentation says is supported on both Max 5 and 6.
What is the Max version and OS under which you see this error? And what is the specific error message that you receive? Meanwhile, you are welcome to use the original patch that I created for jit.openni: https://github.com/diablodale/jit.openni/blob/v0.7.3/bin/jit.openni_test1.maxpat

The usermap window should show data after a user is detected; only after then. Did you stand away from the Kinect so it can see your whole body? Also, in some cases the color it draws the body will be a dark grey, so look closely if your monitor can’t display the distinction between black and dark grey. In both of the demo patches, you will only get a usermap if you also get skeleton joints. If you are unfamiliar with usermaps, I encourage you to read over what a user generator does/generates in the OpenNI documentation http://openni.org/Documentation/ProgrammerGuide.html#Concepts

I have integrated controlfreak’s changes into the main codebase of jit.openni. He and I are coordinating some future testing and updates to come, likely in the next few weeks. I *CAUTION* people using the tilt functionality in controlfreak’s test build: you may damage your Kinect with frequent use. He and I are discussing this and will likely include some changes in an upcoming update.

Personally, I have no intention for jit.openni to support selecting and initializing OpenNI devices (like the Kinect, Xtion Pro, etc.) outside the functionality of the XML file. The core method of choosing which device, what resolutions, fps, which middleware, etc. is all via the XML config file. I encourage avin (maker of SensorKinect) or someone else to enhance the SensorKinect driver so that it will support multiple Kinects via the XML file. After that is addressed, there is still a potential issue with having multiple user generators in the same Max process.

For Windows users, dp.kinect supports multiple Kinects and I encourage its use over jit.openni on the Windows platform.



dtr
October 21, 2012 | 12:29 pm

The slice objects (not zl slice) are in the OSC_JOINTS subpatch. From the way it’s patched it looks like it should be zl slice indeed. Error:

newobj: slice: No such object

If you don’t have it in your patch, perhaps we’re using a different version of controlfreak’s help patch? I received this one on the 6th of October.

About user maps, I’m indeed seeing no output while skeleton tracking is active. I’ll double-check when I’m back in the lab tomorrow.

As stated, my systems are a 10.8.2 hackintosh and a 10.7.5 MacBook Pro, both on Max 6.0.7.


October 21, 2012 | 1:07 pm

@dtr, ok, thanks for confirming the OS/Max. Just use the demo patch I provided you via the link (and which is on github). It has had nearly a year of testing on it. Remember that you are running a private branch of v0.7.3, so use the link I provided or navigate to the v0.7.3 version on github. Reason being, there is a newer version of the demo patch, but it requires a newer version of jit.openni which hasn’t been compiled/tested on Mac yet.

Using that demo patch, stand away from your Kinect and ensure you are getting OSC output in the Max window. If you are, you should also be getting an outline of your shape in the usermap pwindow in a dim grey. It could be bright grey, but likely dim.

If that still doesn’t work, get the updated XML config file at https://github.com/diablodale/jit.openni/blob/master/bin/jit.openni_config.xml as there might be an error in the config file he distributed privately. Use the current version of the XML file, because you do need an OpenNI bug workaround that I’ve included in it.

This will all be sorted out with a publicly released version on https://github.com/diablodale/jit.openni very soon. Until then, I don’t recommend that you use the files he distributed for anything other than a proof-of-concept. I greatly appreciate and recognize controlfreak’s work and collaboration on this! :-) I have long hoped for an OSX partner! We now just need a week or two to sync, compile, and test the code to release it for everyone’s general use.


October 21, 2012 | 1:45 pm

Splendid ;-) !



dtr
October 21, 2012 | 2:45 pm

Alright, good to hear it’s developing! I’m gonna hold off on further testing till the new version’s out. I have several shows in the coming weeks and would rather rely on my tried and tested Processing solution for now. Feel free to send stuff over if you want some extra feedback.

Btw, I had attempted to compile jit.openni for OSX a while back. I got help from a programmer friend and we managed to get a basic object to compile, but as soon as he left I was lost. Too complex for a novice coder like me, so I left it at that.


October 22, 2012 | 12:11 pm

Hello, I’m back on the grid.

dtr, I believe you have an older release I sent out to people I saw asking about this object for OSX. It wasn’t an official release, more to share with others that wanted OpenNI natively inside Max on OSX, as I did. My early help patch did use the slice object, which shows how long I’ve been using Max; I got so used to that one even when zl was available. Excuse my not keeping up with the times. I’ve changed it in the newer help patch, which will get some more updates soon as I’ve had more time to play with the object myself. To be honest, I kept using Max 5.x until just recently, when I started using 6 more. I have just dived into jit.gen, which is faster for some of the math I was doing to reformat the depth information to what I prefer. Dale has already begun the merge of the platforms, and it shouldn’t be long before we have a nice clean Xcode project on github with the features I added, only implemented more correctly.

So, a question for anyone on the forum: have any of you been able to get a real PrimeSense sensor?

http://www.primesense.com/en/solutions/solsensor

I haven’t, after submitting many requests over the past 2 years. It has higher resolution on both the RGB and depth sensors than the Kinect, so its improvements will be welcome additions.

Cheers S.


October 24, 2012 | 9:51 am

YESYESYES!

Sorry, a bit frustrated from building my own solution since last night. This is perfect news; I will build a new Max app that will have big similarities with Synapse, but with all of the updated OpenNI driver goodness + tilt + Syphon output.

Since this app is where I will continue, could somebody maybe help me out with the problems I now have?

http://cycling74.com/forums/topic.php?id=43238


November 2, 2012 | 10:23 pm

The port of jit.openni to Mac OS X is done and we’re doing some private testing now. Based on feedback from the testing, we could release a compiled build of it next week. :-)


November 13, 2012 | 12:58 pm

Hi diablodale,

Thanks for your efforts, I’m very much looking forward to seeing it working. I tried to compile version 0.8.5 on my Mac, but I’m not getting any output. I installed OpenNI 1.5.4 using the Zigfu installer, but when I try to open the config.xml file, I get "XML config initialization open failed (the file is corrupted!)". Would you have any idea, or should I just wait for the final version? The SimpleNI program in Processing works, so I guess the installation using Zigfu worked.

Nicolas


November 13, 2012 | 1:22 pm

@nimar, your timing is amazing. Today I was going to post the following, and it directly addresses your inquiry.

There is a bug that I have not yet been able to resolve on Mac OS X. The bug is related to an interaction between OpenNI running in a library (a Max external), loading files/XML config/defaults, and the current working directory. Adding to this, OpenNI logging to console and to file both fail when OpenNI runs in a library.

Loading the debug XML file (the 2nd message box in the distributed patcher), which loads a prerecorded ONI, works on all platforms. I can see that the external works after casually testing that. However, a real Kinect continues to be out of reach on Mac OS X.

Because I cannot get any debugging information from OpenNI’s internals, I am going to try to compile OpenNI today on Windows. I can cause a similar error on Windows. Perhaps if I resolve the issue on that platform, I can resolve it on Mac. I am far more skilled on Windows.

If anyone can assist in resolving the issue, all will appreciate it.


November 14, 2012 | 5:16 pm

Hi all. The Mac OS X port of jit.openni is ready for a wider round of testing. I believe we have a fix for the bug that has blocked us for weeks.
Full doc and setup instructions are referenced in the README file and the wiki at https://github.com/diablodale/jit.openni/wiki
Downloads are at https://github.com/diablodale/jit.openni/downloads

If you find issues, please report them at https://github.com/diablodale/jit.openni/issues


November 15, 2012 | 12:59 am

Hello,
I may sound noobish, but does that mean it’s (maybe/going to be) possible to use the supposedly Windows-only Kinect with MaxMSP on a Macintosh?


November 15, 2012 | 12:03 pm

@vichug, that is correct. You can use the Kinect for Xbox on a Mac with MaxMSP. This ported external makes it easier than other solutions. :-)


November 15, 2012 | 12:57 pm

Great! Though I don’t have a Kinect for now, I always thought it would be impossible to use a Windows Kinect on a Mac… so this is good news :)



dtr
November 15, 2012 | 1:24 pm

Sorry I’m not entirely clear about this yet. There’s a Kinect for Xbox and a Kinect for Windows sold by Microsoft. Will jit.openni also work for the Win version? Or does it need the KinectSDK?


November 15, 2012 | 1:59 pm

Ah eh… in fact, same question… I read too fast, Diablodale, and I thought you were talking about the Kinect for Windows, which is designed to work with Windows and not with the Xbox. IIRC, the Kinect for Windows has better accuracy (?) or something like that… so that’s why it would be awesome…


November 15, 2012 | 4:06 pm

The Kinect hardware is manufactured in two forms: Kinect for XBox (KX) -and- Kinect for Windows (K4W). Both of them are pieces of electronics which attach to other electronics via USB. So if you can talk USB, then you can use the hardware.

Drivers exist which allow the KX hardware to run on Windows and Mac. https://github.com/avin2/SensorKinect
Those same drivers have preliminary support for the K4W hardware. I do not know how stable it is.

There are very subtle differences disclosed between the two pieces of Kinect hardware. The K4W firmware allows for much closer use ("near mode"), which trades long-distance usage for near-distance usage. This is probably what you have heard about. Other than that, there are no known differences in accuracy. In normal (not near) mode, the two Kinects have the same accuracy.

There are licensing differences between the two pieces of hardware. Microsoft publishes those licenses and I leave it to you to be compliant with them.


November 15, 2012 | 5:05 pm

Thanks diablodale! I got it working now.


January 5, 2013 | 6:16 pm

Anyone have a clue about this error? "jit_openni: XML config initialization open failed (Failed to open the file!)"


January 7, 2013 | 1:55 pm

@oceanmachine89, I request you open a ticket at https://github.com/diablodale/jit.openni/issues
In your ticket, please provide the full computer type, config, OS, how you installed all the jit.openni software components, and finally repro steps.



dtr
March 31, 2013 | 1:12 pm

Hey,
Is it correct that there is no way to terminate or pause the tracking process? One can stop banging jit.openni and it will not output matrices, but as far as I can tell NITE keeps running in the background. It would be a handy addition if it could be stopped without closing the patch.

Talking of matrices, it seems that when depth, user map, etc. outputs are turned off, the corresponding outlets still fire matrices, though empty ones. This can eat up CPU cycles while users think they turned those off, especially when stuff is connected downstream, like a jit.pwindow.

I’m using the Mac version.



dtr
April 1, 2013 | 1:25 pm

Oh, and here’s another suggestion: an additional mode where joints are output as a 15×1 matrix with 4 planes: confidence, x, y, z. For each bang it would output the currently stored matrix instead of all the separate messages it outputs now. Now that I think of it, I’m not sure how this should work for multiple users/skeletons. I guess it could output a variable-dim matrix with 1 row for each tracked skeleton.

I propose this ’cause I do all my processing on the joint data with matrix operations. Much more efficient/convenient than messages, colls, etc. It can also be run straight into a jit.gl.mesh for visualization.


April 2, 2013 | 2:05 pm

@dtr, these are good ideas. I request that you open issues for them at https://github.com/diablodale/jit.openni/issues so they can be tracked and you kept updated. I recommend 3 issues, as they are independent and some focused discussion on them is needed.


May 30, 2013 | 2:28 am

Can anyone get the IR stream from jit.openni on Mac?

If I try to comment out the "Image1" node in the configuration file, I get an error.


May 30, 2013 | 4:22 am

There have been people successfully using the IR stream on the Mac.

Perhaps there is a language/translation difficulty. I don’t understand what you wrote above. Did you mean "uncomment"?

Please remember, the Kinect does not support simultaneous Color, Depth, and IR streams all at the same time. The Kinect does not have enough bandwidth. You can have color -or- IR. You cannot have color -and- IR.
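
If you meant getting IR instead of color: as a rough sketch (node names assumed; I can’t verify your file), the swap in the XML config would look something like this:

_________________________________
<!-- replace the Image1 color node with an IR node -->
<Node type="IR" name="IR1">
<Configuration>
<MapOutputMode xRes="640" yRes="480" FPS="30"/>
</Configuration>
</Node>
_________________________________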


June 6, 2013 | 3:48 am

Thanks for the reply, I’ll try next week,

but now I have a new question, if you can help.

I’m on Windows now with two Xtion Pro Lives connected.

I create two jit.openni objects and send each a "read jit.openni_config.xml" message,

but I get the same image from both jit.openni objects.

I can’t start the second Xtion.

I’m using a MacBook Pro with Boot Camp.

Any ideas?


June 6, 2013 | 4:25 am

You will need to configure two XML files, one for each sensor, and send distinct read messages for each.

In each configuration file, you will need to configure it to connect and attach to a distinct Xtion. I do not know how that is done for the Xtion. The OpenNI hardware driver for the Xtion defines how this is done, and I recommend you consult the ASUS documentation to see how to configure OpenNI for it. It is likely something that goes within a <Query> section of a <Node>.

Perhaps also search around on Google. I found at least this: http://answers.ros.org/question/61211/problem-with-xtion-pro-live-and-openni_camera/


June 6, 2013 | 6:58 am

OK, you are so kind.

I’ll try to play with the configuration file.


June 8, 2013 | 4:13 am

I googled a lot and tried many configuration files, but without luck:

I’m trying to use two Xtions at the same time;
this is my test patch

<code>
– Pasted Max Patch –
</code>

and this is my simplified configuration file

_________________________________
<OpenNI>

<ProductionNodes startGenerating="true">
<Node type="Device">
<Query>
<Vendor>PrimeSense</Vendor>
</Query>
</Node>

<Node type="Depth" name="Depth1">
<Configuration>
<Mirror on="true" />
</Configuration>
</Node>
</ProductionNodes>

</OpenNI>
_________________________________

The first device works correctly, but when I try to open the second device by sending "read jit.openni_config_2.xml" I get this error:

_________________________________
jit_openni: XML config initialization open failed (Failed to set USB interface!)
_________________________________

Based on the OpenNI documentation, the Query element has these properties:

_________________________________
"Vendor", specifying the requested node vendor.
"Name", specifying the requested node name.
"MinVersion", specifying the requested node minimum version.
"MaxVersion", specifying the requested node maximum version.
"Capabilities", specifying a list of capabilities the node must support, each under a "Capability" sub-element.
"MapOutputModes", specifying a list of map output mode that should be supported by the map generator, each under a "MapOutputMode" object, containing three attributes: "xRes", "yRes" and "fps".
"MinUserPositions", specifying the minimum number of user positions supported by a depth generator that has the "UserPosition" capability.
"ExistingNodeOnly", specifying that only existing nodes (e.g. nodes that were already created) will enumerate.
"NonExistingNodeOnly", specifying that only non-existing nodes (e.g. nodes that were not created yet) will enumerate.
"NeededNodes", specifying only production trees that contains specific nodes are valid. Those nodes are declared using a sub-element named "Node".
_________________________________

Nothing in there says anything about a USB interface.
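
For example, the most specific query I could build from those documented properties is something like this (a sketch only; it still cannot tell two identical sensors apart):

_________________________________
<Node type="Depth" name="Depth1">
<Query>
<Vendor>PrimeSense</Vendor>
<MapOutputModes>
<MapOutputMode xRes="640" yRes="480" fps="30"/>
</MapOutputModes>
</Query>
</Node>
_________________________________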

I also tried to modify the file (on Mac OS X) /usr/etc/primesense/GlobalDefaults.ini, uncommenting the line

_________________________________
; USB interface to be used. 0 – FW Default, 1 – ISO endpoints, 2 – BULK endpoints. Default: Arm – 2, other platforms – 1
UsbInterface=2
_________________________________

but got the same error.

I tried my patch on a MacBook Pro and on a Mac Mini with 4 USB ports and got the same error.

Can anyone use two Xtions or two Kinects with jit.openni?
If someone is able to, can you post your configuration file?
Thanks so much


June 8, 2013 | 5:39 am

It is not possible to use two Kinects with jit.openni, because the OpenNI hack hardware driver for the Kinect did not fully implement handling for multiple devices.

The XML file that you posted above doesn’t have anything in the query area that would suggest a specific device. No ID, no index number, no USB hub, nothing. And you list PrimeSense… but isn’t your sensor an ASUS? I’m surprised that one even works.

If ASUS wrote the driver to support multiple devices, then I would hope they also documented how to direct the query to a specific device. That’s how OpenNI was designed to work. If you find someone successful or get support from ASUS (have you called them to ask how to configure the OpenNI XML file?), please do post here so everyone can benefit.



dtr
June 8, 2013 | 6:22 am

The Asus Xtion is a PrimeSense/OpenNI sensor/camera, just like the Kinect.

If 2 depth maps are all you’re after, then jit.freenect.grab supports multiple Kinects, though I don’t know if it works with the Xtions as well. If you want 2 skeletons then you’re outta luck: not supported by OpenNI, the Kinect SDK, or Freenect (the latter doesn’t do skeletons at all). I have 2 computers for my 2 Kinects because of this.

Some people on the OpenNI forums found a way to run 2 instances of OpenNI skeleton tracking on 1 computer, but it’s too technical for me/most.


June 8, 2013 | 7:23 am

The topic is about the Xtion and OpenNI. However, I would like to correct some out-of-date information that dtr shares with good intentions. :-)

Good news: things are better than dtr suggests with the Kinect SDK and dp.kinect.

Because dp.kinect is based on the Microsoft SDK and supports multiple Kinect sensors, the functionality has improved substantially. Up to 4 Kinect sensors can be used together for skeletal tracking. Each sensor can detect 6 people and track 2 detailed skeletons. That means, if you have a powerful CPU, then…
4 Kinect sensors on one computer can…
detect (4 x 6) = 24 people
track detailed joints on (4 x 2) = 8 skeletons
Details at http://msdn.microsoft.com/en-us/library/dn188677.aspx

I appreciate your interest in multiple sensors. The Mac continues to be a world of hacks and half-implemented hardware. It’s just the way it is. :-/ I hope that you have luck contacting ASUS and that they provide you with the way to configure OpenNI for multiple ASUS sensors. Don’t forget to tell them this is OpenNI 1.x (not 2.0).



dtr
June 8, 2013 | 8:54 am

> Because dp.kinect is based on the Microsoft SDK and supports multiple Kinect sensors…

You serious?! Damn you PrimeSense! Why the f*ck can’t you do it if M$ can…?



dtr
June 8, 2013 | 8:56 am

Btw, do they aggregate in 1 tracking process or do they stay separate? I.e., do they automatically merge their tracked data?


June 8, 2013 | 10:00 am

No aggregation; that same MSDN URL speaks to that. ;-) No matter whose tech you use, if you get down to tracking users (or further down to skeleton joints) across sensors, the challenges to solve are at least:

  1. Aligning the coordinate spaces across the sensors
  2. Merging seen logical users if 2+ Kinects observe the same physical user
  3. Handoff/merging of a user as they leave one Kinect’s range and enter another
  4. Managing the delay that occurs for a Kinect to identify and track a user. For fast-moving people, it’s problematic if you have little overlap between Kinects

I started creating some Max code to tackle some of these, but stopped as I chose to take another approach than using skeleton tracking. (I am releasing an update to dp.kinect within the next 2 weeks that does help -a little- with this.)



dtr
June 10, 2013 | 6:31 am

I see… (sorry, in all my excitement I missed that link you posted)

Well, at least both Kinects can be attached to the same computer and do skeletons now. I already have skeleton-merging logic (though merging pointcloud data *before* skeleton tracking would surely yield better results).

Drop me a message at dtr(d0t)vndrn(@t)gmail(d0t)com if you’d like to see how I’m merging. Perhaps we can exchange techniques.

Off to test my system on Windows… And now I’ll stop spoiling this jit.openni thread with dp.kinect stuff.


June 10, 2013 | 4:03 pm

haha, no worries. I know you’re not dumb. Just a friendly "ribbing" along w/ the info. ;-)


June 14, 2013 | 2:16 am

I’m still working on how to use multiple devices (Xtions) with jit.openni.

My results so far:
- On a MacBook Pro with OS X 10.8.3 I successfully tested two Xtions at the same time using two different applications (openFrameworks example + drgb openni), so now I can rule out a USB bus problem
- I found two working examples of OpenNI with multiple sensors on the internet;
here are the links:

multi sensor example working code

https://groups.google.com/d/msg/openni-dev/_e5p56HGZvk/Va52GLL8TqkJ

Openframework multi sensors version

https://github.com/gameoverhack/ofxOpenNI/blob/master/examples/opeNI-SimpleExamples/src-ImageAndDepthMutltDevice-Medium/testApp.cpp

- I downloaded the jit.openni source code and successfully compiled it, but I’m not skilled enough to add the code to list available devices and open the second device if the first is already open

I’m not sure it is enough to add XML to the config file to force jit.openni to open the second device; maybe we need some code in the object itself.

Hope someone can help me


June 14, 2013 | 3:06 am

I am also aware of custom hacked C code that has been written to connect two Kinects at the same time. I have not investigated that code. Instead, I am using the SDKs released by OpenNI and the hardware drivers that come from the manufacturers or other people, without hacking beyond them. If you find a way to easily adjust the jit.openni code that falls in line with the SDKs, I don’t mind partnering with you to add that feature.

I caution you about running multiple sensors on the same USB controller. For example, *one* Kinect capturing depth, color, and IR all at the same time is impossible at 30fps:
USB 2.0 theoretical bandwidth = 60 MB/sec (it’s actually less due to overhead)
depth 640×480 = 18.432 MB/sec
color 640×480 = 27.648 MB/sec
IR 640×480 = 18.432 MB/sec
Total needed = 64.512 MB/sec (it’s too much… just *one* Kinect)
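
If you want to check other modes, the math is just width × height × bytes-per-pixel × fps. A throwaway sketch in C (bytes-per-pixel assumed from the figures above: 2 for depth and IR, 3 for color):

<code>
#include <stdio.h>

/* decimal MB/sec for one uncompressed video stream */
static double stream_mb_per_sec(int w, int h, int bpp, int fps)
{
    return (double)w * h * bpp * fps / 1000000.0;
}

int main(void)
{
    double depth = stream_mb_per_sec(640, 480, 2, 30); /* 18.432 */
    double color = stream_mb_per_sec(640, 480, 3, 30); /* 27.648 */
    double ir    = stream_mb_per_sec(640, 480, 2, 30); /* 18.432 */

    printf("total = %.3f MB/sec\n", depth + color + ir); /* 64.512 */
    return 0;
}
</code>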

The authors of the hardware drivers for the sensors can each choose whether they want to limit multiple sensors in code or leave the error handling due to bandwidth congestion to us. If you limit the data types you collect, and your hardware driver doesn’t force separate hubs due to the bandwidth limitations, it could be possible to have them on the same hub.


June 14, 2013 | 5:40 pm

Very good news:
- uninstall OpenNI 1.5
- download OpenNI2 for OSX
- run sudo ./install.sh
- copy all files from the Redist directory to /usr/lib
- go to the OpenNI2/Samples/bin folder
- double-click "MultiDepthViewer"
and voilà, two Xtions are working together

Tomorrow I’ll look for the same in OpenNI 1.5

Attachments:
  1. Schermata-2013-06-15-alle-02.37.33


Spa
August 5, 2013 | 5:14 pm

Hi micron,
did you manage to have 2 Xtions (Pro or Pro Live?) on the same Mac OSX machine (an MBP?)?
With 2 jit.openni objects, in 1 or 2 Max apps?
Which config?
Thanks


August 6, 2013 | 1:47 pm

Hi spa,
It is possible to use two Xtions on an MBP, but not with two jit.openni objects.
I used the example apps you can find in the OpenNI2 SDK for Mac.


Spa
August 7, 2013 | 2:44 am

Hi micron,
I don’t get it. You are not using it in Max?
Do you mean you have 2 Xtion Pro Lives with 1 jit.openni in Max on OSX???
Can you still use a Kinect, or does the ASUS driver only work for the Xtion?
With the Live version, do you have the RGB stream?
Thanks


August 7, 2013 | 3:29 am

I tried to use two Xtions in Max but with no success.
jit.openni can manage only 1 Xtion right now;
you can’t create two objects to manage two Xtions, because you will get an error.
So my idea was to modify the multi-camera Xtion example (in the Xtion Mac SDK) to send the two Xtion frames to Max via Syphon, but right now I don’t have time.
I can use the Kinect; I installed the driver from the sensecast dmg.


March 17, 2014 | 9:27 am

I caution everyone that OpenNI is dead. Next month, the OpenNI website is closing. When this happens, there may be no legal place to download NITE (the essential component of OpenNI that does skeletal tracking). This could lead to software piracy and illegal distribution. NiMATE, jit.openni, Synapse… they all use OpenNI.


April 2, 2014 | 2:40 am

Thank you Diablodale for the caution; this is a big problem for me and other people who are using the ASUS Xtion on OSX.
I’m thinking of going back to using the Kinect with libfreenect; probably the problem with the 1473 model has been fixed by now.
Any news on this subject is appreciated.
Thanks again
bye

