Here is my working jit.openni Max Jitter OpenNI external for Windows
I have written a Windows Max Jitter external for OpenNI. I reached a
major milestone today.
https://github.com/diablodale/jit.openni is the location of the
project and within the bin directory are the Win32 external, XML
config file, and a sample patcher.
It has been casually tested with SensorKinect and OpenNI binaries.
Please see the README file for install instructions.
Currently it supports:
-ImageMap of RGB24 output in a 4-plane char Jitter matrix
-DepthMap output in a 1-plane long, float32, or float64 matrix
Its output is very similar to jit.freenect.grab and therefore it can
often be used in its place with small changes. My object outputs depth
in mm and jit.freenect.grab outputs in cm. A simple math op can
resolve this. Note, my object does not provide the "raw" values of
jit.freenect.grab; instead it provides the mm depth values via OpenNI.
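The mm-to-cm conversion above is just a division by 10; in a patcher something like [jit.op @op / @val 10.] would do it per cell. As a standalone sketch of the same arithmetic (Python, with made-up sample values):

```python
# Minimal sketch of the mm -> cm conversion mentioned above, outside Max.
# Sample values are hypothetical; jit.openni outputs depth in millimeters.
def depth_mm_to_cm(depth_mm):
    return [d / 10.0 for d in depth_mm]

print(depth_mm_to_cm([850, 1200, 2450]))  # [85.0, 120.0, 245.0]
```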
I would like to see support for other generators (skeletons, gestures, hand points, etc.) added in the future, by myself or with the assistance of others.
If you find problems and can reproduce them with clear steps, I
encourage you to open an issue at https://github.com/diablodale/jit.openni/issues
Do I have to throw out the MS sdk for this to work?
I suspect you would need to uninstall Microsoft’s Kinect device drivers since OpenNI needs OpenNI-compatible drivers (e.g. SensorKinect).
This is a Win32 Jitter external. No other Win32 external exists of which I know. I wrote this because I needed an external and could no longer wait for Microsoft’s SDK. Also, this external is intended to work with any OpenNI device.
I’ve posted an update. It now supports rgb, depth, and ir cameras with outputs in flexible matrix types. Next up…skeletons.
It is likely I will port this to Microsoft’s SDK; I like their non-pose needed calibration. However, my priority is getting an object working and tested for my next art installation.
I tried to install but kept getting an error that OpenNI.dll wasn’t found. I copied the OpenNI.dll file to the local directory but that didn’t help.
Do I have to throw out the MS sdk for this to work?
Yeah, I think because you don’t need a calibration pose with the MS SDK, that makes it a lot better for interactive art installations.
I have heard some say that the NI skeleton is faster, though on my computer (Windows 7 64-bit) the MS SDK skeleton seems faster (I haven’t tried it with many applications though).
dambik, would you please send me detailed repro steps as well as your computer software/OS setup. For example (this is only an example; you will need to write your own setup and repro steps)…
Windows 7 Ultimate x64 SP1
Max/MSP/Jitter 5.1.8 for Windows
1. I first installed OpenNI 22.214.171.124 for Win32
2. Then I installed PrimeSense NITE 126.96.36.199 for Win32
3. then I installed PrimeSense Sensor KinectMod 188.8.131.52 for Win32
4. then I copied the jit.openni.mxe, jit.openni_test1.maxpat, and jit.openni_config.xml into the same directory.
5. then I double clicked on jit.openni_test1.maxpat
I got an error from (windows, max, etc) saying "blah blah"
blah blah blah
I uninstalled everything and reinstalled as outlined below. The OpenNI.dll not found error is gone, but now there’s an XML initialization error. Is the Kinect SUPPOSED to appear as three unknown devices in Device Manager? Or are there other drivers I’m supposed to install?
Windows Vista x64 Ultimate SP2
Max/MSP/Jitter 5.1.8 for Windows
Downloaded and installed the following in order:
Extracted all files from diablodale-jit.openni-1cd3781 and ran jit.openni_test1.maxpat from /bin directory
Clicked on read jit.openni_config.xml message object
Max then returns the following error:
jit_openni: XMLconfig initialization failed (One or more of the following nodes could not be enumerated:
Device: PrimeSense/SensorV2/184.108.40.206: The device is not connected!
Image: PrimeSense/SensorV2/220.127.116.11: Can’t create any node of the requested type!
Device: PrimeSense/SensorV2/18.104.22.168: The device is not connected!
OK, I’ve got jit.openni working fine now. I needed avin2-SensorKinect-28738dc.zip for the necessary Windows drivers. Was Sensor-Win32-22.214.171.124-Redist.msi the wrong package to install?
I use the same install components written up in the README; part of the distribution at https://github.com/diablodale/jit.openni
I always encourage people to look there because that will have the current info while the very post I’m writing now will become out of date.
You appear to be using different OpenNI and SensorKinect versions than I am. At the moment, as in the README file, I am using
– OpenNI 126.96.36.199 for Win32
– PrimeSense NITE 188.8.131.52 for Win32
– PrimeSense Sensor KinectMod 184.108.40.206 for Win32
Good to hear it’s working for you. Skeleton support will likely start showing up in the codebase next week. I have been reading the needed sections of the two SDKs, thinking through my coding approach and how the output of the data should look.
Do you have an opinion on how the skeleton data should be output? My first thought is to output some block of data which would hold all joints and all locations for those joints.
If I used a matrix of type long, I would have to require the patcher to know the integer-to-friendly-joint-name mapping. I hesitate at this approach.
If I used a long list, the 1st element would be the friendly name for a joint, next the location of that joint, 3rd the friendly name for another joint, next… and so on. This would work but then requires list management to split it up into separate messages (zl iter 2) for storage and action. I lean towards this approach.
Anyone else have an opinion on how the external should output skeleton data?
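A rough sketch of that paired-list idea (hypothetical joint names and values; Python stands in for the patcher-side splitting that [zl iter 2] would do):

```python
# Hypothetical sketch of the list format above: friendly joint name,
# then that joint's location, repeated. zip() splits it into pairs
# the way [zl iter 2] would in a patcher.
skel_list = ["left_hand", (0.23, 0.45, 1.20), "right_hand", (0.67, 0.78, 1.10)]

def iter_pairs(flat):
    # pair each name (even index) with the following location (odd index)
    return list(zip(flat[0::2], flat[1::2]))

for name, location in iter_pairs(skel_list):
    print(name, location)
```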
Not having access to a PC ’cos I’m moving now…
Thanks for the good work.
From what I tested in OSCeleton, it’s difficult to know, in the series of joints, when a new frame is starting.
And it sends only active joints, independently, with the user ID appended.
I sometimes reformatted the TUIO into a one-line list, sending it as one frame through OSC from my tracking analysis to the main patch, then cutting it at the "|" with Lchunk. This way I can store all data in one list for coll…
| frame 145 | active 0 2 4 5 | 0 0.345 0.567 | 2 0.24 etc…
perhaps an approach with:
frame 145 (or new frame)
active left_hand right_hand etc…
left_hand 0.23 0.45
right_hand 0.67 0.78
bang (end of sequence)
that could easily be repacked with a subpatch:
| 1 145 | active left_hand right_hand | left_hand 0.23 0.45 | right_hand 0.67 0.78 |
| 2 145 | etc…
Well, what seems important to me is to be notified by your external of the beginning and end of the frame.
And the fact that some joints appear or disappear makes it difficult to pack into a numeric-only list or matrix.
my 2 …
By the way, I would be really happy to have an external that could output at the same time:
depth map + skeleton (+ eventually the RGB).
This would allow two parallel processing chains of the Kinect data for a richer output:
depth map > fluid3D analysis in matrixes
skeleton > joints processing (interact, 3D object, particles…)
Nice! Anyone up for making a Mac version? I wish I could do this myself…
"You appear to be using using differerent OpenNI and SensorKinect versions that I am using."
I actually looked for the same versions listed in your README file but couldn’t find them for some reason. Perhaps I was looking in the wrong place? Luckily, the most recent versions (unstable) worked fine.
"Do you have an opinion on how the skeleton data should be output?"
I do like the simplicity of being able to use ROUTE and/or UNPACK objects to parse the data. Maybe a fixed-size implicit list with a header (sequence number, player number) and a series of location coords?
Diablodale, a BIG thanks for doing this :-)
@offceplus: google "register dll", it is necessary to tell windows where a dll is located.
"Officeplus" appears to be a spambot parroting random posts and inserting commercial links.
v0.5.0 is up on GitHub. I now output user pixel maps, aka xnGetUserPixels(). This was a good stepping stone on my way to supporting skeletons.
I am going to try a few skeleton output approaches privately and then release one of them for everyone’s feedback. After I hear your feedback, I may change the way skeletons are output and it may not be backwards compatible.
** I don’t promise any backwards compatibility as this is all pre-release, pre-version-1.0 coding. **
I recognize that I can’t predict everyone’s uses. That is why I want your feedback. I ask that you consider I may not be able to accommodate everyone’s wishes. However, I do hope the final skeleton output can be massaged by Max to be whatever you want it to be.
COOL! Got a user map after doing a quick calibration dance. Very nice. How do I get the user ID value(s)?
BTW, what’s the purpose of the jit.op operators in the example?
v0.6.0 is now up on GitHub in the normal place. I now output user skeletons.
This preliminary version uses OSC format and leaves all the values as floating point, as OpenNI outputs them.
/userid/jointname x y z confidPosition x1 x2 x3 y1 y2 y3 z1 z2 z3 orientPosition
This release also has attributes to filter data based on position or orientation confidence, to show or hide the orientation data, and to smooth skeleton data using OpenNI’s smoothing API.
– I do not yet output user seen/lost, calibration started/failed, or pose events via OSC.
– Are you getting valid skeleton data?
– Do you like this OSC format for skeleton data?
– For position data, should I keep it as a float or change it to an integer? If change it, why?
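To make the layout above concrete, here is a hypothetical sketch of unpacking one such message outside Max (it only reads the first four arguments; the real message carries the orientation values as well):

```python
# Hypothetical sketch: unpacking a skeleton message in the layout
# described above ("/userid/jointname x y z confidPosition ...").
def parse_skel(address, args):
    _, userid, joint = address.split("/")   # "/3/right_hand" -> "", "3", "right_hand"
    position = tuple(args[0:3])             # x, y, z in mm (floats)
    confid_position = args[3]               # position confidence
    return int(userid), joint, position, confid_position

print(parse_skel("/3/right_hand", [120.5, -40.2, 1830.0, 1.0]))
```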
@dambik, the jit.ops are scaling values so they look like a smooth ramp of greyscale rather than psychedelic waves of black/white.
I got v0.6.0 working – happy :-)
To share my installation experiences:
– I did not find the versions from your readme on the OpenNI site, but these did it (Win 7 Ultimate, 32 bit):
avin2-SensorKinect-28738dc.zip (used this one according to dambik’s hint)
Initially, I got error messages about the XML file being corrupted, and parsing errors. The cause was obviously that I downloaded the three files by right-clicking and using "save as…"
Downloading the complete zip cured this.
As a start, I now moved around in front of the Kinect with my daughter a bit and watched the patcher window. I see output in the jit.pwindows, but not in the 3rd one.
In the 4th we both were clearly distinguishable as separate persons. Does this mean I am getting valid skeletal data?
@dambik: I almost think it is obligatory to post something naive/silly or even stupid, when it is the very first contribution to a new forum. Regard it as my way of showing that I am not a bot :-)
"I almost think it is obligatory to post something naive/silly or even stupid, when it is the very first contribution to a new forum. Regard it as my way of showing that I am not a bot :-)"
Welcome aboard, Not-a-Bot. :) Heh, I recently got caught by that silly spambot too.
OK, I got v0.6.0 setup and running today. No skeleton data all afternoon so I finally dragged the Kinect back to my desk and BINGO! I think I had the same thing happen when I tried the Nite examples come to think of it.
So, here are some observations from my tests:
– Are you getting valid skeleton data?
I checked the right_hand skeleton data against the center pixel value in the depth map and they did appear to agree when they overlapped. I didn’t seem to get any data on the right_finger at all, and the left_finger didn’t appear to be tracking. That’s all I was able to test today.
- Do you like this OSC format for skeleton data?
Not really. I haven’t figured out how to parse the forward slashes in Max, so I ended up using statements like ROUTE /3/RIGHT_HAND, which treats everything from the first / to RIGHT_HAND as a single word. No user number unless I can parse the forward slashes.
By the way, I was wondering if a sequence number or frame number might be helpful for grouping data across players.
- For position data, should I keep it as a float or change it to an integer? If change it, why?
You’re outputting millimeters, right? In that case, I don’t see that it matters that much.
Speaking of coordinates, if I move 3 feet directly to my right relative to the Kinect, does the Z distance also change or just X? In other words, are the coordinates completely orthogonal?
@transponderfish, the 3rd window is likely black unless you reconfigured the XML configuration file. That pwindow is hooked to the IR output. The Kinect (or its drivers) does not allow IR output at the same time as RGB output. You have to choose only one.
@dambik, I too am not getting finger data. I ask for it in code, but it returns zeros. It is possible that NITE or SensorKinect does not support that joint at this time. I’ve opened an issue at https://github.com/diablodale/jit.openni/issues so I can continue to look at this and if resolved have the answer available for others.
@dambik, the object that is soon to be your best friend is OSC-route. UC Berkeley has written and released a fantastic package of Max externals. You can get them at http://cnmat.berkeley.edu/downloads. You can download just the OSC ones (they start with "OSC-"), or at the top of the page is the Everything package. The OSC-route object will do all the splitting of the message you want: routing, ranges of values, wildcards; it’s great!
@dambik and @transponderfish, previous versions of software on GitHub can usually be viewed/downloaded by clicking on "Commits" in the top menu bar on GitHub. There you will find all the commits (uploaded snapshots) that an author has uploaded. On each line is usually a version statement. On the version that you want, click the "tree" hex number in the right column and there you can view the entire tree for that particular version. Download from that tree whatever you would like.
@everyone, a private release of mine has changed the skeleton data to start with /skel and fixed the outlet ordering. I am close to having the user seen/lost/etc. events output once I’ve fixed a few bugs related to new callback functionality I added.
@dambik, I have been thinking about the sequence number or frame number request. I ask to understand more about your need. The reason is, there is some timestamp and frame data that OpenNI exposes for a given frame of map data (rgb, depth, and ir). The usefulness of that data in an environment outside the external (e.g. in Max) is in question.
You mention grouping data across players. Can you describe a scenario more fully? Or perhaps a few more examples so I can get my head around the request?
Last night I spent several hours with your external and Vizzie (to get a feel for it, before I ask specific questions or even make requests), the results are amazing!
Right now I have tested it with 3 players; all 3 of us were tracked without problems. (I am working with the 4th output mostly, feeding a sketchr to get outlines. We never merged into one large blob, and what pleased me most was that my honey’s skirt was always there, which is amazing to me considering the mess in my room. I stayed in the background and often disappeared, but it was sufficient when the computer recognized my head and an arm. As I said, never a large blob or confusion between our limbs.)
I looked into the XML, I assume this is explained in the OpenNI documentation, right? I read it some weeks ago and will do so again the next days.
- I get output in the Max window when I click "summary". But: yesterday I suddenly got continuous skeletal data there (no idea what I did…), today not.
– Everything looks fine and stable, but I still need a few attempts to get things running (I still haven’t found out whether I have to load the XML file with "read" or if the "read jit…" button is sufficient; everything is in the same folder, in the Max folder in my Windows documents path). Right now I have managed to crash the whole thing (no idea how, hahaha). I will post my observations on how to restart everything, because ending Max and un-/replugging the Kinect is a bit annoying. AND it is almost impossible to crash a program you have written yourself; you always know what not to do… I know that from experience.
v0.6.1 is up on GitHub. I now output useful skeleton events (new user, lost user, calibrating, etc.).
I’m going to stop coding new features for now. What features are there need to be used, tested, and I would like everyone’s feedback. If you find bugs or crashes that you can reproduce, please open an issue at https://github.com/diablodale/jit.openni/issues
@transponderfish, yes the OpenNI documentation goes into its format. There is not a lot of tweaking you can do other than add/remove/switching to different nodes (image, ir, depth, user). The SensorKinect driver and the Kinect itself impose them. If you remove nodes that you do not need, you will lessen your CPU and memory usage.
@transponderfish, the sample patcher I provided printed out all the OSC messages. You have to be calibrated to get OSC joint info. The NITE driver requires you to stand in a Psi pose. Once calibrated, the joint data is output. Also, in high-bandwidth situations the Max window doesn’t always keep up with printing them; that could have been what you saw. The new sample patcher I provide doesn’t print them all out, rather just the events. It does flash a button to show the OSC joint data. There’s a lot of it.
@transponderfish, if you can get a repro of the crash with clear steps, I can try to track it down. Until then, you can a) send a read message (you will be prompted for the file) or b) send read and the filename and it will auto-load for you.
Is it possible to have a clip or a visual to see what’s happening when it’s working?
> @dambik, the object that is soon-to-be-your-best-friend is OSC-route. UC
> Berkeley has written and released a fantastic package of Max
> externals. …. The OSC-route object will do all the splitting of the
> message you want: routing, ranges of values, wildcards, its great!
I was hoping you weren’t going to suggest that. :) I have numerous machines to install it on and I didn’t really plan on using OSC. You might be able to convince me of the advantages, however…
> @dambik, i have been thinking about the sequence number or frame number
> request. I ask to understand more about your need. Reason is, there is some
> timestamp and frame data that OpenNI exposes for a given frame of mapdata
>(rgb, depth, and ir). The usefulness of that data in an environment outside
> the external (e.g. in max) is in question.
> You mention grouping data across players. Can you describe a scenario more
> fully? Or perhaps a few more examples so I can get my head around the
One concern is whether player skeleton data always arrives in order for the same frame. For example, frame 1 gives me player 1 then player 2 data, frame 2 gives me player 2 then player 1, frame 3 gives me player 2 then player 1.
Another issue is missing data. For example, suppose I’m calculating the lateral velocity of right hands by determining their change in X position and dividing by the framerate period. If a user lost a frame of data, a sequence number would tell me to either invalidate the current calculation OR to use a different time period for the velocity calculation. A sequence number might also help provide a warning if there are a high number of missing data sets (e.g., player 2 disappears for 10 seconds then reappears for whatever reason).
Perhaps I’m a bit paranoid but when working with Arduino and Xbee, sending a sequence number really helped me address problems with missing, duplicate and invalid data sets.
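The velocity idea above could be sketched like this (a hypothetical helper, not part of the external; the sequence number widens the time base when frames were missed and invalidates duplicates):

```python
# Hypothetical sketch of the idea above: use a sequence number so a
# missed frame widens the time base instead of corrupting the velocity.
def lateral_velocity(x_prev, x_now, seq_prev, seq_now, frame_period):
    frames_elapsed = seq_now - seq_prev
    if frames_elapsed <= 0:
        return None  # duplicate or out-of-order data: invalidate
    return (x_now - x_prev) / (frames_elapsed * frame_period)

print(lateral_velocity(0.0, 30.0, 1, 2, 0.033))  # one frame elapsed
print(lateral_velocity(0.0, 30.0, 1, 3, 0.033))  # one frame missed: halved
```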
Everyone, I have put install/usage documentation on the Wiki at https://github.com/diablodale/jit.openni/wiki
@dambik, it’s not difficult for me to add an alternate output format for the skeleton data. One approach I am considering is to have 3 output formats controllable via an attribute:
1) my personal preference via OSC
2) OSCeleton-puppet format
3) a Max-only list format (this is probably what you are desiring)
For the 3rd, I would likely take the OSC of (1) and just remove the slashes. Would something like that work for you?
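A sketch of what that slash-stripping could look like (a hypothetical Python helper for illustration, not part of the external):

```python
# Hypothetical sketch of option 3 above: take the OSC address of option 1
# and strip the slashes to form a plain Max-style list.
def osc_to_max_list(address, args):
    # "/skel/3/right_hand" -> ["skel", "3", "right_hand"] followed by the values
    return address.strip("/").split("/") + list(args)

print(osc_to_max_list("/skel/3/right_hand", [120.5, -40.2, 1830.0]))
```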
Dambik, now it’s time to geek out so we can get our heads around this timing topic. BTW, it’s interesting you mention Arduino & XBee; I think we have some similar project pieces. ;-)
The data output from the outlets is initiated by a bang on the jit.openni inlet, not by a specific frame rate/output of the Kinect. The Kinect is sending data constantly to the computer. It’s only when you bang that I ask OpenNI for a snapshot of that data. You could, however, bang the jit.openni object at a rate equal to or faster than the FPS configured, which would *likely* get you every possible frame.
The data output from the outlets of jit.openni is done in the standard Max ordering (right to left). If it is not, then that’s a bug I need to fix.
When a bang is received, here is the flow that occurs:
1. Get snapshot of whatever data is currently available, do not block and wait for new data
2. create matrix of depthmap (if defined in XML) and queue for output
3. create matrix of imagemap (if defined in XML) and queue for output
4. create matrix of irmap (if defined in XML) and queue for output
5. create matrix of userpixmap (if defined in XML) and queue for output
6. output tracked skeletons (if defined in XML via user node) and queue for output
for each tracked user skeleton...
for each of the 24 OpenNI joint types...
check against confidence filters, if doesn't pass then go to next joint without output
queue output for joint OSC data
7. output all data via outlets; should be in standard right->left ordering
The UserID values in OpenNI (and therefore jit.openni) are not guaranteed to be in sequence. It’s easy to get out of sequence by people disappearing and reappearing, causing the skeleton to no longer be tracked and therefore have no output. For example, in (6) above, it is possible to have a user (seen in the userpix map) who is not being tracked for skeletons due to calibration failure, etc. I recommend using the userID value to track a user, not the sequence of data output by OpenNI for a given frame snapshot.
I do believe as I currently have it coded that for every bang to jit.openni you should get:
1) all tracked skeleton data output for a given frame snapshot
2) matrices output for any configured depth, image, ir, or usermap node in XML
You shouldn’t have a scenario where in one frame you get user3 and user2, then in the next frame get only user 3, then in the next frame go back to getting user3 and user2. The only scenario in which that would occur (which is highly unlikely) is if between those frames user 2 was lost, then seen and calibrated in one frame. It is possible for your code to catch this scenario by watching for the user events like "lost_user" and "calib_success".
I may implement in the future an attribute which if enabled causes old (repeated) data to not be output. Not to worry, the default will be as it is today which is to always output data even if it is old/repeated.
Since there is a continuous flow of data from the Kinect, the timestamp between frames increases steadily. It is measured in microseconds and I cannot find documentation defining what timestamp=zero was. I did a quick debug code change and was surprised to find that the frameIDs start at 1, increment by 1 with every snapshot I take in code (a bang to jit.openni), and do not skip numbers. I cannot find any documentation explaining the behavior of the frameID, so what I’ve observed could be circumstantial rather than guaranteed.
It isn’t practical for me to embed a frameID in the matrices themselves; I would only provide it on an outlet like the skeleton OSC, or make a dedicated outlet on the far right just for a frameID. However, I wonder if instead you should generate your own frameIDs or timestamps. When you bang jit.openni, also create a frameID that is associated with your initiated bang, perhaps using a counter specific to your implementation.
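That counter suggestion could be as simple as the following hypothetical sketch (in a patcher, a [counter] object triggered by the same bang would do it):

```python
# Hypothetical sketch of the suggestion above: pair every bang you send
# to jit.openni with your own patcher-side frame ID from a counter.
class FrameCounter:
    def __init__(self):
        self.frame_id = 0

    def bang(self):
        # call this once per bang sent to jit.openni
        self.frame_id += 1
        return self.frame_id

fc = FrameCounter()
print(fc.bang(), fc.bang())  # successive IDs starting at 1
```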
OK enough technical info from me. What is your take on all of this?
Thanks for documenting! I have not had the time so far to try out v6.1, but I will certainly do so on Monday.
To give some visual feedback I have uploaded my first tests to YouTube:
I will also check where in the chain the upper 5% of the image is lost…
@transponderfish, perhaps one contribution to your issue is this configuration. It does the math to enable an OpenNI user to "overlay" the depth data onto the RGB data pixel by pixel. Using that translation, the depth data is smaller in width and height than the RGB data. Depth will no longer be a full 640×480.
"its not difficult for me to add an alternate output format for the skeleton data. One approach I am considering is to have 3 output formats controllable via an attribute:
3) a Max-only list format (this is probably what you are desiring)
The 3rd I would likely take the OSC of (1) and just remove the slashes. Would something like that work for you?"
That would be perfect.
"now time to geek-out so we can get our heads around this timing topic."
I’m going to take a little time to think over what you wrote…
Today I have tested v6.3; it works like a charm, and it is using less CPU than v6.0.
What I definitely need is support for TWO Kinects running at the same time on the same machine. The requirements for this are described elsewhere on this forum; I have sorted that out and verified that it is set up correctly here (they need to run on two different USB controllers, which is not to be confused with ports/plugs. It can be checked with a tool named USBlyzer; trial and error can do it too, but why rely on that when there is a proper tool available?)
Now, I have tried to create a patch with two jit.openni modules to see what happens. It looks like the patch prefers the Kinect which is listed first in USBlyzer. (Both Kinects are not working at the same time and I cannot choose, but I did not expect that.) It also looks like both jit.openni modules work completely independently from each other (I can tell that from the little glitches that happen in the 4th output; they are different), which is 100% positive and promising for what I need to do with it, yippie!!!
Reading the OpenNI documentation (page 9) implies to me that an easy solution would be to edit two XML input files, which say something like "Kinect 1" and "Kinect 2" at the beginning, and feed them to two jit.openni modules – what do you think?
(the Mac-only external by JMPelletier works with an "open 1" and "open 2". Does your external understand any similar commands?)
I am personally not interested in the OSC-output, but for giving you feedback: after doing the necessary pose, I got output. Works. Please ask if I should check more things.
NOTE! OpenNI and PrimeSense released a lot of new code two days ago. Some of it includes changes to the XML configuration support; some APIs were deprecated. I have not tested my code against this new drop of OpenNI, NITE, and SensorKinect. I am definitely testing and updating everything to this new code by 20 July because of new features I want.
@transponderfish, I’m interested to hear of any success with two simultaneous Kinects using OpenNI. I did have this in mind when I was writing; however, I do not have 2 Kinects myself for any testing. This is the approach I took:
- leverage the documented OpenNI XML config files; this allows rich configuration with inherent compatibility at no development cost to me, and allows arbitrary numbers of devices, generators, etc.
- allow the default "sharing" of devices mode rather than exclusive locking
- each jit.openni creates an independent OpenNI session which directly maps to the XML configuration file. You can point to the same config files or separate config files.
Given that, create two different XML config files. Two separate jit.openni objects. Then send a read message to each jit.openni referencing the independent XML files. Now…..
What I do not know is how the OpenNI middleware, NITE, or the SensorKinect driver will behave with two Kinects. Given SensorKinect is a hack, I would look there first if it doesn’t work. The simplest thing I suggest testing is an ImageMap. Create two separate XML files, simplifying them to only have one node each.
You can specify Queries for Nodes which can isolate a specifically desired device. But the docs I’ve found so far would only differentiate between different types of sensors, not multiples of the same type. I can see in the C code that they’ve added low-level support, but I can’t yet find the XML parsing that utilizes it.
https://github.com/avin2/SensorKinect is the driver of which I’m speaking and OpenNI.org has all the middleware.
Hey diablodale, nice work on getting a PC max external happening for the Kinect! I’m really looking forward to getting it running.
I’ve happened upon an issue when trying to create your jit.openni object. I receive an error message:
"tooltip: Max.exe – Entry Point Not Found
The procedure entry point xnGetBytesPerPixelForPixelFormat could not be located in the dynamic link library OpenNi.dll"
…causing the object to be disabled.
I was running 3-month-old OpenNI and NITE middleware, which I thought could be the issue. So I’ve updated to the latest stable builds and receive the same error. Potentially I’ve missed something else along the way.
I’m running: Win7(64-bit), Max 5.1.8, PrimeSensor 220.127.116.11, OpenNI 18.104.22.168
I installed your object by copying jit.openni.mxe and jit.openni_config.xml into my user object library. But I haven’t touched anything in the config file.
@Diablodale, I have done as you said: created two simplified XML files, read from two jit.openni modules. The result is the same as before; I get the same output from both of them.
I tried a bit of hacking on the second one:
As you see, changing the name to "Device2" at least works insofar as it does not crash. The serial number is there because it is the only difference between the two Kinects I found in USBlyzer (it is commented out because Max gave me the error message "corrupted XML"). I then searched the web for a hint on the correct XML syntax here; unfortunately I found nothing yet, but I stumbled upon this thread:
Now, I do not trust everything I read on the web, but somebody there says that XML is not working for this purpose and that the key is in the driver/external. On the other hand, the more I read the OpenNI documentation, the more I believe it must be possible (I found more discussions by programmers, if it helps I can post links here).
Question on the MAX window, when I get: dumpout: read jit.openni_HACK.xml 1
The "1" means "success", right?
@kwijy, now that’s an odd error. I call that function extremely often and it’s needed. Why your OpenNI.dll wouldn’t support it is unknown. I recommend one of three things:
1) use the exact DLL versions that I describe in the README (note that while you are running on Win64, you must have OpenNI 32-bit installed)
2) wait until I update the codebase to support the newer OpenNI DLLs released in the last 4 days.
3) rather than moving those 2 files into your user object library, put the three files jit.openni.mxe, jit.openni_config.xml, and jit.openni_test1.maxpat all into a folder on your desktop. Run that patch, click the read xml file message box, and then start the metro. What happens?
@transponderfish, it is possible some part of the codepath OpenNI or SensorKinect hasn’t fully exposed enough information to differentiate between two Kinects in XML. OpenNI’s XML config spec allows arbitrary data that is driver specific…perhaps some code work needs to be done here.
@diablodale, I can report a major success in my dual-kinect-research: your external is definitely working with two kinects attached :-)
Here’s what I did:
– Considering the info USBlyzer gives me, the first Kinect listed is KinectA, the second KinectB
– as you suggested, I set up a minimal MAX-patch and two XML-configs.
– I unplugged KinectA, opened the patch, loaded the config for KinectB, started it
– plugged in KinectA again, loaded the config, started it: both are working independently.
I made some guesses at what the XML config is expecting; nothing is working here – at least I found out what the external accepts as valid. Obviously it ignores anything unintelligible, as long as it is well-formed. This is accepted, but has no effect:
(some of it is based on step 11, found here: http://www.kinecthacks.nl/kinect-tutorial-4-setting-up-mt-kinect-package-of-openexhibits/
Well, it was worth a try…)
Maybe this is relevant for you, maybe not, in the Microsoft SDK (5 days old) they give info on how to access a specific device by code (page 16):
I will now set up the whole thing on my notebook, test your new external and see what happens. Then I’ll google on about XML.
@diablodale Thanks for the reply mate. I found and installed the version of OpenNI you listed, but couldn’t find the compatible NITE version. I tried to load it anyway as you said in your 3rd potential fix but it just crashed Max. But no worries, I’ll sit tight for your next build.
Thanks again for your help. Looking forward to your update! Cheers.
@Diablodale & @all: Today I managed to set it up on my notebook (2.2GHz dual-core, Vista); it was a bit of a hassle, so here are my experiences:
– like kwijy, I did not find the exact versions listed, but I checked those from my desktop PC, installed these, and it worked (OpenNI 22.214.171.124, NITE 126.96.36.199, SensorKinectMod 188.8.131.52).
– I uninstalled all other drivers first, rebooted, installed everything, and rebooted. It did not work at first; I found out that the SensorKinectMod drivers were not active. Go to the Device Manager and see if there are XBoxNUI drivers left under HID devices; if so, uninstall/delete them. Replug the Kinect and point Windows to the extracted SensorKinect folder. It takes a while until Windows sorts everything out; wait until you see a folder of its own named "PrimeSense", listing Kinect Audio, Kinect Motor, and Kinect Camera.
@Diablodale: I managed to run both Kinects on the Notebook using the trick mentioned above. It is running for over 2hrs now without crashing :-) CPU is at 60-90%.
Things look far more nervous now than the results I posted in the video (I think it was 6.0 or 6.1 running on my 3 GHz Quadcore desktop with win 7):
Looking at the 4th output, I am gone very often – sometimes a single limb, sometimes completely. It now needs my whole body to track me; when I sit down, I am gone completely. The previous version only needed my upper body with arms to find me – which would make it wheelchair-compatible.
I will test 6.5 now on my desktop – maybe it is the slower CPU, maybe the USB controller that is causing this flakiness.
i have a win7 64bit with
openni 184.108.40.206 win32
nite 220.127.116.11 win32
primesense sensor kinectmod 18.104.22.168 (the avin package)
and max 5.1.8
when i create the jit.openni object, an error comes up from a max.exe tooltip saying that the external isn’t designed for windows or has an error… in the max window a message appears saying:
Error 139 loading external
i didn’t see anybody else having this kind of issue… do you have an idea about what happens?
thank you very much!
@Diablodale: After 2hrs testing on my desktop machine I can say for sure that CPU-power is an issue!
- I tried the patch you provide, everything worked fine, it even recognized me while sitting. When the MAX-window told me to strike a pose, I did so and it worked on the second attempt (saying "calibration good").
Please tell me if I can provide any specific info here.
- next I tested it with two Kinects, which definitely works too, the glitches (lost limbs or me disappearing totally for a moment) on output 4 are bearable, everything is FAR better than on the notebook. I will see what happens when I try to optimize behaviour by tweaking resolution and framerates in the XML.
- a general hint: I have a glass door in my room with frosted/uneven glass, the kind they used in the ’50s to "upgrade" early 20th-century doors – this disturbs the Kinect quite a bit, the IR is reflected in all directions at once. I compared this to our modern windows with insulated glass: here the IR goes through with no reflections. So avoid old glass ;-)
The NITE you are using is older than the one the external was developed with. If you compare with what I am using at the moment: obviously they do not have to match the version number 100%, but at least slightly newer works.
From experiments with other Kinect-programmes I remember that not everything is compatible (I e.g. failed miserably mixing up stable/non-stable drivers).
Heads up, we are working with hi-tech in an alpha-stage, some trial and error is needed ;-)
the problem is that i can’t find NITE 22.214.171.124, the most similar i found was 126.96.36.199 … i already tried with the last stable version of NITE (188.8.131.52) and the last version of OpenNI (184.108.40.206) and no luck, same error…
do you have a download link for nite 1.3.14? i googled it a couple of times but nothing came up…
I am new to this forum, maybe it does not allow PM, maybe I have not found the button.
I can mail the nite 220.127.116.11 to you if you wish.
I have posted v0.6.6 of jit.openni object in the usual https://github.com/diablodale/jit.openni
v0.6.6 *requires* the new generation of OpenNI, NITE, SensorKinect software. Full details of the versions required are on the Wiki at the same URL. A few other features were added in this and the immediately prior version. All in the README and on the Wiki.
@transponderfish, thank you for your investigation. I see on the OpenNI forum that people can get both working if everything is in C; I haven’t seen anyone speak to XML support :-/ I suspect it just hasn’t been a priority; a shame, since the XML support allows configuration outside of C/C++ coding.
I have been testing on and off the past week and had pretty good results but with occasional drop outs. I think I need to build a test patch that helps me keep better track of status from the OSC messages. But, at the moment, I’m off on a trip and intend to pick up the latest version when I get back.
I have windows 7 64bit
I have downloaded the exact drivers specified in the wiki
OpenNI 18.104.22.168 for Win32
PrimeSense NITE 22.214.171.124 for Win32
SensorKinect Win32 Device Driver v0.7 (based on 126.96.36.199 Primesense driver)
I have uninstalled all kinect drivers that I had previously
Now I can install the openNI 188.8.131.52 driver
but when I try to install the NITE 184.108.40.206 driver I get a message that I need to install the openNI 220.127.116.11 driver (which like I said, I have already)
Anybody had this problem?
hi edsonedge.. seems today we are in the same…
i’ve been able to install (on win7 64) openNI 18.104.22.168 and nite 22.214.171.124
but my problem is that i can’t execute the msi installer from the avin2 github
windows says something like "bad installer".. did you run it successfully?
is this the file mentioned as (SensorKinect Win32 Device Driver v0.7 (based on 126.96.36.199 Primesense driver)) in the diablo instructions?
@edsonedge, carsol; I run everything on Win7 x64. I recommend removing all CL NUI, Kinect SDK, and all OpenNI, NITE, and SensorKinect software. Then, go into your C:\Program Files (x86) directory and verify that you have no PrimeSense or OpenNI subdirectories. If you do, then something didn’t uninstall; resolve that first.
Then give it another shot. You *must* use the Win32 versions of ALL THREE components (OpenNI, NITE, and SensorKinect). The x64 versions are not supported because Max is a Win32 app.
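Since mixing 32-bit and 64-bit pieces is a common failure mode here, a quick way to check whether a given OpenNI.dll (or any Windows DLL/EXE) is actually 32-bit is to read the PE header's Machine field. This is a small sketch, not part of jit.openni:

```python
import struct

def pe_machine(path):
    """Return 'x86' or 'x64' for a Windows PE file by reading the
    IMAGE_FILE_HEADER Machine field. The PE header offset is stored
    as a 32-bit value at file offset 0x3C (e_lfanew)."""
    with open(path, "rb") as f:
        f.seek(0x3C)
        pe_offset = struct.unpack("<I", f.read(4))[0]
        f.seek(pe_offset + 4)  # skip the 4-byte "PE\0\0" signature
        machine = struct.unpack("<H", f.read(2))[0]
    return {0x014C: "x86", 0x8664: "x64"}.get(machine, hex(machine))
```

A report of "x64" for a DLL that Max (a Win32 app) tries to load would explain a load failure immediately.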
@diablodale: thanks for the answer.. i followed your guidelines using all win32 versions and the "SensorKinect-Win-OpenSource32-188.8.131.52.msi" won’t execute on my machine.. it’s like the msi is corrupted…. weird…
but the old SensorKinect-Win-OpenSource32-5.0.1.msi works.. i mean at least i can install it, but jit.openni doesn’t work…
@kwijy, carsol: that is puzzling. The SensorKinect-Win-OpenSource32-184.108.40.206.msi that I downloaded and successfully installed from his github is 4,341,760 bytes in size. Do you both have the same one?
I am running Win7x64 SP1 Ultimate with all current Windows Updates on an Intel Core2Duo laptop.
If you continue to have problems installing that driver, perhaps start a dialog with avin (the driver’s author) or one of the forums like the OpenNI forums on Google Groups. I suggest that because his installers have always worked for me; I’ve never encountered a problem.
@diablodale: hold up mate. not sure what i was trying to install before – but i’ve got it up and running now. on the first test it’s running well and i can’t wait to get stuck into converting over from OSCeleton!
for the record, these are the files i’ve downloaded and installed to get it running:
@carsol + @edsonedge: if you have troubles finding the files i can email them to you. jump on the contact page on my blog and i’ll rar the files up and send them across… http://chrisvik.wordpress.com/
thanks again dale. i’ll make sure to keep you posted with what i end up doing with your tool.
thanks kwiy :)
glups, mea culpa… the avin2 msi was the wrong file, my mistake downloading! grgr…
with all last versions, everything works fine!
thanks all for the help!
@diablodale: everything works fine here with 6.6. Great!
I have tested on the desktop machine for over an hour and noticed nothing unexpected. It is amazing how stable the tracking is, I managed to confuse it a bit by moving furniture around etc, but once the pointer/focus is on me, I will stay in window #4. I tried to trick it by rolling into a ball on my chair (the idea was to hide head & limbs to confuse the algorithms), but I stayed in the picture without a single glitch.
Quick tests of the output-enabling attributes: works. I think it saves a few percent of CPU (not a ‘scientific’ test yet), which is a great feature, of course.
And of course I am very curious now how it will perform with 2 Kinects, especially with several users. :-)
A little update, while I am demystifying my notebook’s behaviour:
– In my enthusiasm I did not really read the MAX window’s output; it gave the error message
"newobj OSC-route: No such object"
I thought I had the same installation here (I updated Max the same day), so all I watched was the patcher and the system monitor. What was missing here was the CNMAT external. (@diablodale: I hereby suggest an update to the Wiki concerning the requirements.)
– @all: Everything is fine if you read the following after opening the patcher:
"OSC-route ("OpenSoundControl route") object version 1.17.1 by Matt Wright, Michael Zbyszynski.
Copyright (c) 1999,2000,01,02,03,04,05,06,07,08 Regents of the University of California. All Rights Reserved.
Jitter 1.7.0 installed
jit.openni v0.6.6, Copyright (c) 2011 Dale Phurrough. This program comes with ABSOLUTELY NO WARRANTY.
jit.openni v0.6.6, Licensed under the GNU General Public License v3.0 (GPLv3) available at http://www.gnu.org/licenses/gpl-3.0.html"
and after reading the XML: "dumpout: read jit.openni_config.xml 1"
- OSC output and calibration are now fine on the notebook, but there are still lots of glitches on the camera output. I am now running a complete virus-scan, because I suspect either the virus-scanner is running in the background or (worst case) I have caught a virus. (blaming Vista would be too easy)
@diablodale, I made some detailed measurements of processor load, on both machines I set metro to 75, this gives me a good compromise between visual fluency and CPU-load, the results are as follows (one player):
(I read the results from Resource Monitor, it is more detailed than what you see in Task Manager)
- Desktop, 3GHz Quadcore, Win 7: ca. 11% Max, ca 7% PrimeSense Device Development Kit. Deactivating features I am not needing does not make a great difference, saving 1, maybe 2% load.
- Notebook, 2.2 GHz Dualcore, Vista: 19% Max, 9% PrimeSense Dev. Here, the difference in load after deactivating features is again hard to notice.
It crashes regularly on the notebook, but I cannot really give a pattern, sometimes it runs for over 2 hours, sometimes after a few seconds or minutes.
I always get an error from Vista saying that PrimeSense Device Development Kit has stopped working (I am translating here; my Windows is in German).
Sometimes I get the following error in the Max Window after a crash (no idea what causes it, I cannot say if it has something to do with having calibrated or not):
jit.openni Failed updating generator nodes (Xiron OS got an event timeout!)
jit.openni data unavailable for request
(This is printed repeatedly, like being output by a loop)
I get lots of glitches in window #2 (regular cam); they resemble those on an old defective CRT TV – distortions that are horizontal stripes for a few milliseconds. I do not need to wait for them, they are always there; the length of undistorted sequences is 2 seconds or so at maximum.
I hope this helps a bit, I mostly tested unattended – staring at a machine waiting for a fail would be a bit too tedious ;-)
The virus theory weakens the more I think about it; if that were the case, the desktop would be affected too. There definitely were background processes when I made the last post – maybe the virus scan, maybe defragmenting, who knows. It was cured after a complete scan.
It possibly can be connected to the update, I think everything worked better before, but I frankly cannot be sure, as I usually do things on my desktop and later transfer to the notebook, testing only briefly here. I do not remember a crash before on the notebook.
I remember having read a warning concerning notebooks and USB cameras in a forum when I researched how to connect two Kinects: somebody said that you cannot be sure it will work on your machine without really trying, even when you have checked the hardware and USB ports with USBlyzer. The reason is that the components on a notebook’s mainboard are so closely crammed together that errors cannot be excluded, as the data rate is horrendously high and the drivers are of varying quality. It sounds plausible to me, because I have made lots of experiments with different webcams and Processing. It works with the internal cam, but I cannot find a common pattern for which models will work on USB – the Logitech flagship does not, a cheapo no-name does, another cheapo does not. Max/Jitter behaves far better here, I have to say – the said Logitech works.
i saw some mentions in the thread about osc-route…
if you don’t like it, you can remove the slashes from the output of jit.openni with a [regexp (/) @substitute " "] and then you can use the standard route object…
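For reference, the transformation carsol describes could be sketched like this (a Python sketch of what the [regexp] object does inside the patch; the example address is illustrative, not jit.openni's actual output format):

```python
def osc_to_route(message: str) -> str:
    """Turn an OSC-style address like '/skeleton/joint/head 1 0.5'
    into a plain space-separated message that Max's standard [route]
    object can match, mirroring [regexp (/) @substitute " "]:
    every slash becomes a space."""
    return message.replace("/", " ").strip()

print(osc_to_route("/skeleton/joint/head 1 0.5"))  # skeleton joint head 1 0.5
```

After this, [route skeleton] can peel off the first symbol just as OSC-route would match the first address element.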
@carsol: OpenNI supports 24 joints. However, not all joints are supported by NITE, nor are they all stable; e.g., they document that leg tracking is unstable and noisy. I recommend using the confidence values to help filter out the worst values, and consider using smoothing.
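The filtering suggested above could be sketched as follows (Python; the 0.5 confidence threshold and the smoothing factor are illustrative choices, not values taken from jit.openni or NITE):

```python
def smooth_joint(prev, new, confidence, threshold=0.5, alpha=0.3):
    """Keep the previous estimate when the tracker reports low confidence,
    otherwise blend the new position in with simple exponential smoothing.
    prev/new are (x, y, z) tuples; alpha controls responsiveness."""
    if confidence < threshold:
        return prev  # discard unreliable readings (e.g. noisy leg joints)
    return tuple(p + alpha * (n - p) for p, n in zip(prev, new))
```

A higher alpha follows the raw data more closely; a lower alpha trades lag for stability.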
@transponderfish: OSC-route is not a requirement for jit.openni. Instead, I use the OSC-route external in the example patcher I provided. You can process OSC output however you would like, ignore it, or even disable it.
@transponderfish: In your last posts I saw you were not using the required versions of OpenNI, NITE, and SensorKinect. Have you updated to them now? The errors you are experiencing are due to lower-level components (like OpenNI and SensorKinect) that I don’t have control over. I try to catch errors when I can. One error that I catch is "Failed updating generator nodes". That is my code seeing lower-level software failing, and me trying to skip the failing code so I can continue running. That particular error is severe: it means that when I call the API to get updated data from OpenNI, OpenNI tells me it can’t give me anything and is failing. That suggests to me that there has been a complete breakdown in some lower component. You see this error repeated because there is one such error log for each bang you send jit.openni.
@transponderfish: Receiving all feeds from jit.openni is compute and USB intensive. Put the Kinect on a USB controller of its own. I also recommend receiving (aka generating) only what you need by editing the config XML file. Disabling a generator in the XML is the best way to reduce load: OpenNI and the Kinect send and process data whenever it is configured in the XML. Turning it off using a jit.openni attribute does not yield the same reduction in load; the attribute only disables creating and outputting the Jitter matrices, while in the background the Kinect and OpenNI are still calculating the massive amounts of data. This attribute functionality is still desirable because there are times when a person might want the skeleton output but not need the depthmap output. In that case a depthmap generator is required (NITE needs it to calculate skeletons), and the attribute lets you save a few CPU cycles by disabling the output of the depthmap matrices rather than the default behavior, which is to generate output for everything configured.
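As a sketch of that approach, an OpenNI XML config reduced to depth plus image only might look like this (the element names follow the standard OpenNI config format; the node names and resolutions are illustrative – check the jit.openni_config.xml shipped with the external for the authoritative layout):

```xml
<OpenNI>
  <ProductionNodes>
    <!-- Only the generators you actually need; deleting a Node here
         stops OpenNI from computing that stream at all. -->
    <Node type="Depth" name="Depth1">
      <Configuration>
        <MapOutputMode xRes="640" yRes="480" FPS="30"/>
        <Mirror on="true"/>
      </Configuration>
    </Node>
    <Node type="Image" name="Image1">
      <Configuration>
        <MapOutputMode xRes="640" yRes="480" FPS="30"/>
        <Mirror on="true"/>
      </Configuration>
    </Node>
  </ProductionNodes>
</OpenNI>
```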
@diablodale: Yes, I have updated everything. I tried to be verbose as an aid for you debugging and optimising. Thus, I have used everything "as is" (except for metro 75) and I document my own errors as a help for others.
I have now done a test on the desktop PC, running 2 Kinects using the trick described above:
- Resource Monitor displays 19% for Max, 8% for xnSensorServer.exe/PrimeSense Development Kit. This is less than I expected; the total CPU load shown in Task Manager is around 35%, with spikes to 40%.
There is another 11% used by system interrupts (displayed as "Systemunterbrechungen" – why do they have to translate specific terms?)
- The behaviour is again flaky, specifically on output #4 (silhouettes). My first assumption was that the IR projections were interfering, so I set up the 2 Kinects back-to-back, pointing in opposite directions. It gets a bit better, especially when one user is seen in full. (I tested with 2 persons, each on one of the Kinects.)
- the OSC-route display is very valuable for assessing what is happening here: in the Max window I see it counting up to user 10 quickly, but it is not able to keep tracking who is who ("I see x"/"I lost y"). I assume both Kinects share the same process, which totally confuses OpenNI’s tracking.
- I have not yet experimented with two XMLs, but I will do.
- Switching off Kinect B in the patcher cures it.
- Right now (after running for 30+ minutes) the IR window of Kinect B is behaving erratically (its size is "pumping" between the normal reduced size, full scale, and a bit smaller).
About my notebook: yes, I have assumed that it is a lower level component thing – I do not remember reading anything about Vista-compatibility anywhere. And we can still blame it on its mainboard ;-)
@diablodale: another question, is there a way of getting a 4-plane char matrix of the depthmap?
i was expecting that, the typical colored depthmap.. it isn’t that i spent a lot of time, but in the first tests i did with the greyscale depthmap i can’t really simulate a good 3d mesh image, i found a really flat mesh with little Z value range…. you know what i mean? my english, you know…. :)
or maybe i can post a couple of sample images from previous tests i did with a colored depthmap and the one with your external…
A question for everyone, I have a build ready to release that has 3 output formats settable by attribute:
1: the current OSC format (default)
2: a max route object compatible format, same as (1) without the slashes
3: OSCeleton legacy format (no orientation, no normalized values)
For output #3, should it be the normalized values or OpenNI native "raw" values?
When OSCeleton was first written, the author didn’t understand that the OpenNI values were millimeters. Instead, they watched the sample output and chose to apply a normalizing formula. This formula can be troublesome as it doesn’t support the full range of Kinect data. It’s also based on a false assumption the author made. The author did the best they could at the time. However, that mistake is now legacy.
I can copy the normalize formulas from his codebase and apply them only in the (3) format output, *OR* I can leave the OpenNI (aka Kinect) values in their native "raw" form. The latter would be equivalent to using the "-xr" switch on OSCeleton.exe.
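For readers weighing the two options, OSCeleton's default normalization looks roughly like this (a Python sketch reconstructed from memory of the OSCeleton source, with its default offsets of 0 and multipliers of 1; treat the constants as assumptions and verify against the OSCeleton repo before relying on them):

```python
def osceleton_normalize(x_mm, y_mm, z_mm):
    """Approximate OSCeleton default normalization of OpenNI real-world
    millimeter coordinates. x/y are squeezed into roughly 0..1 (and
    clip outside roughly +-1280 mm, which is why the full Kinect range
    is not supported); z is scaled into roughly 0..7.8."""
    nx = (1280 - x_mm) / 2560.0
    ny = (1280 - y_mm) / 2560.0
    nz = z_mm * 7.8125 / 10000.0
    return nx, ny, nz
```

The "-xr" alternative simply skips this step and emits the millimeter values unchanged.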
@carsol: I do not intend to create a 4-plane char depthmap. There is no data that would fill a 4-plane char matrix. Kinect and OpenNI return a single long integer for each (x,y) pixel; there is only one plane of depth data. I already support changing the output format of the depthmap to a long, float32, or float64 – all of them 1-plane data.
@carsol: By combining the output of the depthmap matrix and the imagemap matrix in your own patcher, you can create voxels and display them in amazing 3d mesh images using OpenGL.
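Although jit.openni itself only outputs the 1-plane depth, a colored depthmap like the one carsol describes can be built downstream in the patcher. The per-pixel mapping could be sketched like this (Python; the near/far range and the red-to-blue ramp are arbitrary illustrative choices):

```python
def depth_to_argb(depth_mm, near=500, far=3500):
    """Map a single millimeter depth value to a crude ARGB false color
    (Jitter's 4-plane char order): near pixels red, far pixels blue.
    The near/far limits are arbitrary; tune them to your scene."""
    if depth_mm == 0:                 # OpenNI reports 0 for 'no reading'
        return (255, 0, 0, 0)
    t = min(max((depth_mm - near) / float(far - near), 0.0), 1.0)
    return (255, int(255 * (1 - t)), 0, int(255 * t))

print(depth_to_argb(500))   # (255, 255, 0, 0) -> nearest: full red
print(depth_to_argb(3500))  # (255, 0, 0, 255) -> farthest: full blue
```

In a patch, the equivalent would be scaling the long depth matrix into char range and feeding it through a colormap rather than computing per-pixel in Max.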
nice to have these new features!
about your question, i think it’s interesting to have them with the OSCeleton normalize formula, so those who worked with OSCeleton before can migrate to your object easily…
i don’t see the way to change the output format for out1 to float32 or char.. ?
if i want to get just depth and image output i need to erase the other nodes in the xml file, right? i did it, but outlets 3 and 4 of jit.openni keep banging, is this normal?
More tests done (quadcore), but first I have to correct an error in my post from yesterday: I did not mean the IR-window, I meant the depth-map window.
- I left the two Kinects unattended yesterday (back to back as described), for more than 5 hrs not a crash. Yeah!
- today, I checked my hypothesis that the players between both Kinects would be confused: I was wrong. I checked by changing the two "print OSC" in my testpatch into "print OSC_A" and "print OSC_B". I saw that both devices/processes are cleanly separated.
- still, the behaviour is rather nervous (as described above, players are lost quickly and assigned a new number), nothing compared to one Kinect running alone, where things run very smoothly and the system reacts very forgivingly (it even recognizes me as a player while sitting and only waving one arm). I found out that it helps when the player is in full view (all limbs clearly seen). I checked this with two players (each on one Kinect).
- I used two XML configs, in the second I renamed all items (e.g. Depth2).
I also tried to save CPU by deactivating nodes, but, of course, since I am interested in output #4 I will definitely need these…
- Somehow I produced the following error, I hope it helps (I think in this case the screenshot is more useful than typing it)
New version v0.6.9 is up at the usual https://github.com/diablodale/jit.openni and the wiki was updated to document the new features areas of alternate user/skeleton output formats, user center of mass output.
@transponderfish: that’s a cool assertion. ;-) That particular assertion is within Max itself, far beyond my code. An assertion is where a developer asserts (in the normal English meaning) that some logic should always be true. If it doesn’t hold for some reason, it "throws an assertion". I recommend you submit this error to the Max support team. It is possible that my code somehow triggers this in their code. Regardless, the assertion should never occur and should instead be caught in error handling – error handling that I can then catch and manage. Perhaps it is something they would like to see so they can address it in Max 6.0.
i haven’t done extensive testing using multiple kinects, however i’ve been in performance situations where 4-5 kinects were setup from the same spot controlling different computers – we had to cover up the kinects that weren’t being used for each performance as it seems the IR signals interfere with each other – ie. clearly visible distortion around the edge of the user’s ‘cutout’ image where it should be solid, loosing the user easily, loosing limbs and in general, very jumpy data. (this was using OSCeleton however). i’m not sure if this is related to your issue or not.
"- still, the behaviour is rather nervous (as described above, players are lost quickly, assigned a new number), nothing compared to one Kinect running alone, where things run very smoothly and the system reacts very pardoning (it even recognizes me as a player while sitting and only waving one arm). I found out that it helps when the player is in full view (all limbs are clearly seen). I checked this with two players (each on one Kinect)"
i know very little of jitter (i’m coming from an audio background), but for the past few days i’ve been spending a lot of time going through the tutorials and examples to find a way to do what you’ve mentioned below, but with little success.
i’m trying to layer 3d objects together with the RGB image, so the objects may pass behind and in front of my body, but so far i’m at a loss on which way to do it. i’ve looked into chromakey masking but i’m struggling to get any results, and quite possibly i’m barking up the wrong tree.
making a mesh sounds like a much better way to go. would you mind pointing me in the right direction in terms of which objects are needed to create the 3d mesh?
"By combining the output of the depthmap matrix and the imagemap matrix in your own patcher, you can create voxels and display them in amazing 3d mesh images using OpenGL."
just a report on the external.. i’ve had it running mostly without issue. i’ve had a couple of instances where the RGB image has ended up (after even only half an hour to an hour of running) with green and purple(ish) chunks of noise and distortion that take up around 50% of the image.
the skeletal tracking seems a bit jumpier when this occurs. it’s fixed by simply closing the patch and re-opening. no error messages pop up.
– I have triple-checked that both Kinects are on different USB-controllers, both are USB 2.0. If I use different inputs/controllers, they simply do not install (this is shown in USBlyzer with an exclamation mark in front of the device)
- to check if there is IR-interference between the two devices (they are set up back-to-back, but who knows, the human eye does not detect IR, maybe something is reflected by something) I have modified a test patch for IR (with two XML-configs, everything renamed, e.g. "IR2")
# nope, there is no interference. I have checked this by covering Kinect_B with a cloth (a Newspaper is a reflecting surface and thus not so good), then viewing the IR-output of Kinect_A. To make sure nothing eludes me, I have boosted/brightened the window with a "jit.op @op * @val 1.5".
# the jumpy behaviour starts _exactly_ when Kinect_B is activated in the patch (not after the XML is loaded), when it is switched off, it goes back to normal (switching it off in the patch does not stop the IR-projector, as you can see).
I possibly have found a bug in the XML-configuration. When I tried to find out if anything was wrong with the IR-node, I saw that both these lines are mandatory:
If one of them or both are missing, the following error is produced:
"jit_openni: XML config initialization open failed (Device Protocol: Bad Parameter sent!)"
And: it does not mirror.
Next, I will experiment with the smoothing and confidence options.
May i butt in ?
How does it run on Snow Leopard?
What do I need to do ?
@diablodale – thank you very much, great contribution
included is a simple demo which demonstrates mapping of jit.openni to a point mesh with texture.
----------begin_max5_patcher---------- 2516.3oc6bkziihjE9rao9+PHq4znLwDarTmpV0LGpSyg9XoRVgMgcRmrMP3 bYZM+2mfH.avowFiACYqoVRRf.389h2d7f+7W+kYyWE+FOaN3Kfe.lM6OkGY l5X4GYV4AlMOj815.VlZfyWGGFxiDyen3jB9aB0I73gwf3MfPVRhezVvy9Q7 0BfGSv.wQhX.CjD6GI.g7rmL1e8A4CKdWj5lfJO5F4UDwB4p67uk5yB1eAQ6 Bi2IB3BE8XVdXeO0XiW8GOhvyqdex7+Op6CDYrezILw5mjT4xTIMpA.hIVdd .zzUsgZluASLLA+rxi1Op7ICUG7+9q+R9V4lGZMD9G9BijW8i7hesAtZONnO l38DtlHmO+A4+OPQU3ZHZ+MKIkmImiXB+3nJbHDCy4IpKMeiihAsLqxfm.Vf Xq7wgrcZ5h5ITIh+pjM9nbUNZsMvHe+cob.9e7hjouZ4jlQT48e41fkE2+lw WmtHTQIDkXTgPkoFKumvWBfms9wM6BBxVmx4QWOzAaVX7j.ks0oQJngTxC5X eF3BQcLbPDariTTyxvk5Po1O.bMu.rUeJfmV.TkHkjP7C3uvSyjJDUYoYyO7 rcrzSQJ4bDlpejzJOS4bv1f30Oy8p9LmIs7s45tMwI7nCWgh4p8iiFpeTUE5 O7rY6BDKqgzlFmd.aXq4Me4mVXX17so9dwQ4DR8qM+3kORocBp1rYMpWMjHV xotbQbbvJV5K9Y9qB30mUjx5rH+PlfK70DEx7vU5Glj5qcXb3f7Hl717jTHO NHn9cSepWN0o73u3ul+pum3I0sqBvUy2XUgoZ5h0Ow48UVWw7elslkvA+1u+ su+cv2h830F1YlPZ1I3GzD+vsrIkwlTH0RyV1p4VH4CpgMYAa1AqXE+5CcFE 0LZGXez7KvcHMeQsMPTIio0Uomm8pdlx427G37dmukwIkw1xOszyAC5f+Fr6 RN0XmyXe+XnEeyRVEXORi81ZqlNmE6Q8NDW2oYcD9Y96cGWImEWyMd8vo1zH dS5I7Fq9IwpEJxl2UzNiGjms.xt6XN5rX9JVz15A2cLFS6IL1RKRSsaAH2+h zh3saC3cVk+rRgVsyfpag8TyavcQkfHxyPrIGwJZHe.M.MYw6RWWxaEJRfin GOdlvOZerU+XuzvwC7IeOuihhQA+9dpDaKnQvOa+b5US9z1R9VSRx2psjOdR R931R9ndm7KOQNmTjqSF6Et2R4iSp9sjIDo9q1IzZJ0x8oMw3etycwTLNcpE MNbYNTqXAEYps+gN+7Yrrm66bBxGEN0AWO44VItCoES6RADflpftIU7dWKJo S4NY.p9xSrHufhhu.9pejblyWrTjxhx1DmFBf8U875RkurQcCY00xRknNDiF rx8sIHVdmtgRVI0rk4JpyQ1zvsQDR9fXpPIUw5bRnxxsc0lY010wAwokEnvx 0D5ZkWaBGLE4RTUofZ55HQtKT+PhiJsJhxD.fZNXE.qQ8aVXRkbjFNEbnSmq 6EoHtbMHQFG07HohLKHmp64pqFxjRvu0r9qUmJsJlpsLBUE01ld2qrZNlweK IE7U0OyxwueX9Sviv+t92g+D3Gk+yxy.Mdr3D2cH1raPrJQQBVYiDibt.FOL xkZdCPAJ6aXDvhXJMqXB9JyikH.l2avjZ2MvTIhhcvZTzdTDXk9xyW+sRO4d orWWFF6wAEQgdugR5sHW5nT8QpQeNnzs2gRAXU9eud3B2HbUVfhZaOMnQ5z5 aRprtlj6+BQkkv4dA9g.r4v6Kl1ok.1VIOA09RtnGk92Z2+NjKRiAn9DeNiX TmBaVWUGhxU.896PPq3wSYY7gT4S+.NMrA6NrokorFGS+5NNn+W.8OVY0ZUt tSA2gn5ELWGbm6n4qLkG4wSK8VpjJVpxLRl.Tw+fF2Qvzt6xdEX49kj9t1CB 
wYqm2K81C18FDmPN5xn3dIDnGZmfCKnuLbUsISyqsWBt38Xnaj.z+uQBlDMR vYWJ30xj1WkplRAqXdix5ASLadwypHD0Tno5Tif597AZRGkECt0f7133wAkw N2BJiw0PYcVSSTTNZ63.vV2B.iHUAXmIH9lIyF+YtpSby3iCBiuEDFZqkcsT abmf1I9NHHNSLNP6MYCFpiv.h00oGOEg1L93HzhtI6tNS.f8bsdiJFHv+52+ V+2.hnaxdJTuzQEYrXgGktOb0NgHNpyBU0K+xGvG3k5nFZUCd2TG0LLBORwl GSkbMGrH6Yd.XgbvK2kIyYdQtgvheU4WeoxoSwumIXoh86ra8Zole4taX9Ac WVz4ryGA9YpkHsSaabdDdSd0JliMKV7uwYRVcuGBuR1sSDW69oU8fq401CtC UyjkyasqgffSyFZJ2wVK6nooYCkkGyymaF.+YmAr9ry.e10AHe10ATFGs+ry .VsgAN0b0TgAnspwbcltL.oULvDVDB2JF.OcY.TqXfIrN.rUVglv5.syO.7u 78mN555Ocz32e5M9gU3IgHI6KKVr0W7ztUFxwsvymsJH1iEvWjuP04qIVj+h W8e1ueVkUja29BJnJ4s9c4Cgct+s5a1tvPV562gd8saKDuSwaPqU4WQg6ba9 pK0m2tvDIe1OhJvt0YoZQkhEj2gLJ82QbB3qx+u.70WXA.D5d27inN0GeE8v PQShCcGrVvZX9LkbwuRIGZfAbQ8NIW5aTRAjTHFQrLrj+wF8ftBx8tLU8Wwx VaIo1qVYMUnCxAM18NHXCE.dvsolxYdfCdYVtNNZi+Vi2BCFdyrVcRAwshQV 3.9xTbVHa3AmN0hqHR01OGSGk2zDsjz0iPVszba626LeNktAQuBLFiLnCj1Z iAKtINEr+ESQHCfkk7tZcS+BHONRYXjqeesLqhs1DUjjHSn4BS7BSmEd7v2k YKr4c4YeL+5kh2RSyYO5G8n7VJ3oKFzuZWcBww5NQlfUPN15ZeM9zCq9xBTL EbbBXkSDeLwqZIN1zJ.T+cB9D4a0PtVmT7n0zlU8bUa5sE2bpSbvqg3vV3bM OBlna2Zx985Gx1rMjMkNJXpSqoM70PaTpBRwVJKbV5Vuo9dDqBUQaCGaZw98 BOgHshoruV.20RwSZ4CrEY+d8BUeL8zfzsynHlzJaVWMoAIPCnq7O14ew.Ul jKDbT60OTdqfU30R5HWMcpaSGWm86zKDMdXnYJZ.oYqAgl6Mo2KRZGa03NRa zKBaiiC21YHEMZ3F4RjV9q+wHPaTTqb.MNJC4uNnUV4klHN73PbCikOGSssN k8uRCendxwMrcgaLNpIzV44lNN19rba0r8HExiyvDOFgnBijfTACe5M5OGrk 4kX1WAFqDFtnpOwd7jTMmnDmMZJSbz1Pb1iiAH6VEWH410wyOfby+CTc8ODK -----------end_max5_patcher-----------
awesome. thanks very much for sharing!
edit: i’ve noticed the mesh seems to be inverted on the z axis. i really have no idea what i’m doing, but managed to change it by altering the jit.expr:
jit.expr @expr -snorm -1.*snorm in norm 1.-norm
i’ve inverted the "snorm" to a "-snorm", then rotated through the jit.gl.handle object to 180 degrees on the x plane. seemed to fix it for now.
edit: nvm =) setting the "amp" to a negative number also does this.
@diablodale: Yesterday I took the two kinects and my notebook with me and tested your new version in a place that provides more of a "laboratory condition", i.e. more ample space, less "visual noise" (no bookshelves & stuff) around. We were two, each moved in front of one Kinect.
The results were better than expected. I had two crashes (test time 1 hr) and the user-tracking is anything but stable (which means it loses a user and finds another one all the time, although it is the same person, as described above), but it is not a catastrophic problem, most of the time we were recognized, if one was gone, he was there again after a second. Lost limbs were not a big issue.
- I tried giving Max and the xnSensorServer.exe the highest priority possible in Task Manager, but did not notice an effect.
- Also, I did not see an effect after changing the values for skeleton smoothing etc. So these are only affecting the OSC-output, right?
@yair: Thank you for sharing! Here, the patch is closing, no problems noticed (and it is running stable).
What seems weird to me is the behaviour of the jit.openni outputs. I cannot get the cam (#2) and the user tracking (#4) working at the same time. (Plus, the OSC output is only updating when the Max window has the focus.) I assume there is something I do not know about Max/Jitter. If somebody could point me in the right direction, I would highly appreciate it.
@kwijy, the noise and distortion you describe is very odd. I am unsure if I could be the cause of it. The code I’ve written is a very thin layer that exposes OpenNI to Max/Jitter. The code is very repetitive; it’s a loop doing the same thing each time you bang it. Given that, I think any distortion/coloring would appear immediately if it were within my code. I’ll think on it more. A possible scenario is a memory leak or other coding problem in the OpenNI, Jitter, or SensorKinect code. Or it could even be in your graphics drivers (Max uses the OpenGL portion of your graphics driver, which tends to get less rigorous testing on Windows compared to Mac). It’s not that I want to point fingers at someone else; rather, I think I do almost nothing that would slowly deteriorate the graphics like that. There is one section of code where I very heavily manipulate the RGB values from OpenNI into a Jitter-compatible matrix. However, any error there should show up immediately on the first bang and continue in all bangs.
@transponderfish. Problems with XML config reading should be reported to the OpenNI group. My code never opens or looks at the XML file. Instead I ask OpenNI to do it all and rely on their functionality. It is odd that your computer needs MapOutputMode. I have never needed that, and I always mirror. It could be that two Kinects plugged in cause lots of problems in SensorKinect and OpenNI. The authors of each may not be doing much (or any) testing on that scenario. I know I haven’t because I don’t have two Kinects. Please do report any OpenNI XML configuration bugs to the OpenNI google group at https://groups.google.com/forum/#!forum/openni-dev
@livemax, this Max external is Windows-only. I welcome a Mac developer to join me in updating any needed code to make it cross-platform. Until then, there are other methods available for Mac users, like jit.freenect.grab.
@carsol, you also are experiencing an assert in Segmentation.cpp. That’s Max’s code and should be reported to the Max support team. Meanwhile, what were you doing in Max right before this Assert occurred? Also, I have noticed that both you and transponderfish do not use English as your primary language on your computer. So far, you are the only two who have reported this problem, and both of you have non-English in common. If I could understand what these Assertions mean (Max will have to tell me), then I might be able to reverse-think it.
@carsol. Thank you for reporting that outlets still bang even if they are not listed in the XML config. For now, that is expected behavior. It is due to the Max Jitter APIs for MOPs (matrix operators) that I use. Since these outlets exist, they default to a 1×1 matrix. And then later in their APIs, they output whatever matrix is current…even if it is a blank 1×1 matrix. I have requested in another post on this forum the code in their API so that I can modify it to not output on outlets that I disable. Unfortunately, they have not replied…yet.
If I want to get just the depth and image outputs, do I need to erase the other nodes in the XML file? I did that, but outlets 3 and 4 of jit.openni keep banging. Is this normal?
@transponderfish. Correct, the smoothing and confidence attributes affect only the user/skeleton output. If you have the XML configured and have not disabled the outputs, it should be very easy to get the imagemap (#2) and the user/skeleton (#4) outlets working. Every test I run outputs depthmap, imagemap, and user/skeletons. For me, it’s the most tested thing. What do you mean by OSC output? If you mean seeing the OSC messages printed in the Max console window, then not seeing them is normal. Max puts updating that window at a very, very low priority. Not to worry, the OSC output is being sent out the outlet and your patch can act on it.
@diablodale: ok, i will send the assertion error to c74 support.
A while ago I found a bug in Jitter (already fixed) related to the international settings of Windows: if I set Spanish for international conversions, Jitter patches crash; if I set another country, everything works fine. I don’t know if there is also some connection with this assertion, but if somebody is having many issues with this (I have had only one crash so far), give changing the international settings in Windows a try.
@diablodale: I started to dive into the XML config file to do some mods, but I’m having constant crashes of Max if I change the XML file on the fly… do you?
Besides this, I’m having different issues. If I add this line to the Image1 or Depth1 node in the default XML file:
If I comment out the depth node entry with (), I still get the depthmap image on the first outlet of jit.openni, but if I comment out the Image1 node, the image doesn’t come through as it should…
What am I doing wrong? Any special considerations for editing the XML file?
thank you very much!
@carsol, I use Windows 7 x64 Ultimate. I should be able to switch my primary display language to Spanish. Do you think this would simulate your setup?
@carsol, the Kinect itself and SensorKinect do not support all possible values of xres, yres, and fps. The possible values are more restrictive than the PrimeSense Sensor’s. It is possible this combination is disallowed. Also, if there are both an IMAGE and a DEPTH node, then both must have the same resolution.
@carsol, you discovered my primary reason for the output_depthmap attribute. It is likely that when you removed the DEPTH node, you retained the USER node. In this scenario, the USER generator itself requires a DEPTH node and automatically creates one. This is all done automatically by OpenNI and NITE. I call an OpenNI API that reads the XML and initializes OpenNI with it. After that call, I receive back a list of generators. In your case, it was likely a USER and a DEPTH (which the user node required). I also noticed this behavior and recognize it as intended by OpenNI. However, I saw an opportunity to reduce some CPU load by allowing you and me to disable my code which does the matrix calculation to convert that unwanted depthmap into a Jitter-compatible format.
@carsol, repeatedly reloading the XML on the jit.openni object is unreliable and prone to crashes. Known issue, please see details at https://github.com/diablodale/jit.openni/issues/4
@diablodale: the bug was with my old machine, WinXP 64-bit and Max 5.1.4; I don’t know if it happens with Win7 and Max 5.1.4. You need to set "Spanish" in the regional settings (where the config about date format etc. is) and you will get a straightforward crash in Max patches with Jitter objects…
About resolution modes: ouh yeah, the Kinect doesn’t support QVGA, what a pity for latency maniacs… Have you tested your external with the PrimeSense sensor? Do you think it will work? It seems PrimeSense has some advantages, like 60fps at 320×240…
If I want the best performance for skeleton tracking only, I need only the user node, set the output_depthmap attribute to 0, and also set the other outputs (image, IR, and user) to 0… right?
Following up on the XML issues: if the comments around the IR node are removed, I get a bad parameter error… Is it only me?
@carsol, to be able to assist you, I request that you visit https://github.com/diablodale/jit.openni/issues and open an issue for each of your XML problems. Please include a full description of your OS, Max software, specific openni related software and versions, hardware with all version numbers, languages, etc. Then include reproduction steps of what is installed, any code running, steps that I should take to reproduce the problem, etc. Then include the full XML that you are experiencing problems with.
You can look at issue#1 as an example of the information that I request https://github.com/diablodale/jit.openni/issues/1
Yes, if you want the least amount of code to run and get skeleton tracking, then I suggest you have:
1) only a USER generator node in your XML
2) disable all the output attributes except for skeleton
3) disable the orientation data if you don’t need it
4) set the confidence filters to at least 0.6. In the current version of NITE they only output 0, 0.5, and 1.0.
5) only bang jit.openni as many times per second as you need.
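To illustrate item 4, here is a minimal Python sketch (not jit.openni code; the threshold name is made up) of what a confidence filter does: joint updates below the threshold are dropped, and since NITE emits only 0, 0.5, and 1.0, a 0.6 threshold keeps only fully confident joints.

```python
# Illustrative only -- not jit.openni source. CONFIDENCE_MIN is a
# hypothetical name for the filter threshold set in the attributes.
CONFIDENCE_MIN = 0.6

def keep_joint(confidence):
    # drop any joint update whose confidence is below the threshold
    return confidence >= CONFIDENCE_MIN

# NITE only reports 0, 0.5, or 1.0, so 0.6 passes only fully tracked joints
print([c for c in (0.0, 0.5, 1.0) if keep_joint(c)])  # -> [1.0]
```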
@yair: Thank you for the sample mesh code! VERY cool.
Can someone explain to me why the mesh consists of a couple of different flat planes that overlap. Why isn’t my head round, for example? :)
OH, I see now – I have to set the AMP parameter.
Ah, the mesh code works MUCH better if you use the values directly from jit.openni rather than from the [jit.op @op / @val 22] object. Dividing by 22 caused Z to be limited to one of about 11 possible values making a sliced effect. My head is now round.
@diablodale: at the moment it isn’t a big deal not having IR, but if you want I’ll open the case on your GitHub page. And thanks for the tips!
@all: does it happen only to me that with yair’s patch (which connects a float32 matrix to the first outlet) the image has a lot of flicks and glitches? If I connect a char matrix I don’t have this issue, but then there are others…
"does it happen only to me that with yair’s patch (which connects a float32 matrix to the first outlet) the image has a lot of flicks and glitches? If I connect a char matrix I don’t have this issue, but then there are others…"
Yes, I’m also seeing "flicks and glitches". I expect using a char matrix limits some of this by quantizing the Z values (like the /22 jit.op) so small changes aren’t noticed. The drawback is that you lose some of the smoothness.
Perhaps setting the confidence filters higher in the XML file would help.
@all: and nobody else? Glitches from the depth output into float matrices?
@dambik: connecting a char matrix you will get a strange effect; the image does some kind of white rotation while you move closer to/further from the camera… grgr, I don’t know how to explain it, but give it a try… I guess it is something about long values going to char 0–255?
btw, the confidence filter is only for user/skeleton outs.
"connecting a char matrix you will get a strange effect; the image does some kind of white rotation while you move closer to/further from the camera… grgr, I don’t know how to explain it, but give it a try… I guess it is something about long values going to char 0–255?"
I expect the Z value in a char matrix would be continually wrapping around back to zero as you approach the camera, i.e., 255,254,…,2,1,0,255,254,…,2,1,0,255,…, etc. Since Z is in millimeters, that amounts to a range of every 10 or so inches before it wraps around once again.
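A quick Python sketch of that wrap-around (illustrative only; Max performs this conversion internally when coercing a long matrix into char):

```python
# Forcing millimeter depth into an 8-bit char cell keeps only the value
# modulo 256, so the displayed brightness cycles every 256 mm (~10 in).
for depth_mm in (1000, 999, 768, 512, 256, 255, 0):
    print(depth_mm, "->", depth_mm % 256)  # e.g. 1000 -> 232, 256 -> 0
```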
Until someone provides me a repro(duction) case, I can’t track down or explain much. So far, all the behaviors people describe here are expected behaviors or so unclear I can’t assist.
If you think there is an issue, I encourage opening a new issue with lots of details at https://github.com/diablodale/jit.openni/issues
Lots of details would include the patch itself also. Even a screenshot so I can see what you are seeing on your screen.
Help me help you.
@diablodale: you are right… I don’t know where the issue is, whether it’s in the external or in the way the matrices get transformed… or maybe it just is what it is…
In the picture you can see a white spot on my face; if I go closer to the camera it disappears (goes to grey), then the white comes back and disappears again, and so on, wrapping around…
and here is the patch.
thank you very very very much!
----------begin_max5_patcher---------- 1002.3ocyY0siahCF85To9N7ItNMK1fAxdy184X0pQd.mD2EriLNMIaUe2qw FllzNSBDfAFoLVw3eNeGe72Oju8wOrv6Y4IVoG7mv+.KV7MSOKr8U0yhlNV3 UPOklSKsCzKUVTvDZuk0OTyNosOXGSw.Nri9UFPyKkPpTTpoBMrMmqS2Y1nl 4rQJzBZAyNu+Vwo4u7HdlsS4ye4SH+W5UbnfKxYZKDPM8tWwJMPgp4RwSJVp 1YIQn0q7WBI93pF+5+A+6k6dI++s6NJw73etIxC5lc4kt2SMfmK1d4N3aW03 0jpFbbhqoZSrS56e7CUsllkslW+BWuZ+QtHSd70Hi0OJWDWgsnDKDCvVbiCu hOt1tw2xtQVCt17wgQqhL+EiWZ3QxkKoa8zm2ybSzyao4SOXGA6ngE9cQWEq IjpBZd0AZ2zWjdIuhHQsPdgeK4UKnYRrSk4DY92jeM7vSETsheZLYZ2N.HXS tjpCvPTnODl32QhOteDOdzI9P6hiHn2Ul+M8rdjq2A5c7Rnfo2IyLdYMeS7e vY4AHWJKYvVE6bYJMmAeklefUtDxLrQ0.9qWVubtfkJOHrKZP2NxvQ24Lq+N Ui8sjeHw4bgDXOrW2Kmplc+YlpixSLoK15CIzpibfb1ZsEeacFuRY.dOSEa8 FuvL3fG6tY8gWPHZHhyTuZXBYNEmQtG9r4ye.e1bGCR5ppB+5LKd.UUNZCUK trKxLv8kKwvRiOKFbTQ2u2fZfpLdhxp8VAG2wDFuZERStiaNnLiTAREPOROC aTxB6bSMbrhNLtyPit6rvg2aVqR8t2YcmL5TSPj+kbSv5AH84wHAQz3GHnlJ vXxDjp2chEfdvXAgNoe.AODwBZXn4UrfAJSXD98Rg0DT3cNm16IwPOlDiDXK qsQo0SIVHYNqvR2QUOn7JbzUWgWkwQsm72KwUAqrjtk867W4ghBp5bWuLFOH Yngt0kQ2aNA4Dr02LStsFqGDjVtcaN6Us06IMZuM0jnSyQeqpjYDtzXpMVIg ffNdpmL5G5NpYsqTujVPP8sFu27dghQy.WoLLgf+jIWwM7sqNUj2QRKZzIsn KunfBiG4aJ2jx5H4DL5jSc5h0rS.YjImaWTbkRpqdZGj3RQ2OtjqBijnUjNF Vp8e60Ch41HaMp+5uuhEsUO3Wn7R4AUZCZpe+zvEHNiUp4BalPWLn0WOnc7r Ll3pBvJ3Y6kFet033V5fVCspWlw8wV7jfs31.Mxj.MTqvFZRvVv7EZQyWnkL egVUJd2GaISC1ZCzBmDn0JVCgluzFBOMXqUgDPSji2VEKEELeOUw3Ybfd7zP bXRqYNTu.WUGllefY8NOM -----------end_max5_patcher-----------
I’ve been working on a Max application that applies a video effect at the location of a user’s right hand when their hand approaches the camera to within a certain Z distance. This works BUT, at times, it seems the skeleton values become "stuck" and stop changing for a few minutes. I can still see the depth map changing properly at these times but the OSC skeleton values returned do not. Is there an error situation where invalid OSC skeleton data could possibly be returned? (Is the skeleton data stored in a queue by any chance? Sometimes it acted like data wasn’t being removed fast enough from a FIFO queue and was falling behind.)
Segmentation.cpp is not a C74 managed source code file.
Please ignore my previous post – boneheaded coding error on my part.
I have noticed the RGB image has a strange offset every other row of pixels in 640×480 that would degrade any computer vision used on that matrix: http://i.imgur.com/J7YR1.png
After some research, it looks like a known issue to do with the driver: http://openni-discussions.979934.n3.nabble.com/OpenNI-dev-Quality-of-Kinect-RGB-Camera-td2678960.html
It sounds like the missing step will be added to Avin’s driver at some point, but until then I’ve found this shader which might be able to correct the image: http://graphics.cs.williams.edu/papers/BayerJGT09/#shaders Unfortunately I don’t know where to begin using that shader (and specifically customising the settings for the Kinect’s particular issue).
We had this discussion a few years ago. The color mosaic is an inherent problem with any single-sensor color camera. I’ve included Andrew B.’s code with the relevant shader.
Still, doing it on the GPU will help aesthetically, but the readback to the CPU for extra processing will give a hit in performance. Last time I checked.
----------begin_max5_patcher---------- 1125.3ocyYtsrZpCFG+Z8ofgqWafDN2q5dl9FzK2ydbBRTyZAIzPrZam9tuy AzpshqHnrcFkEPhI+yu7cBV+X9L2B1dbqqyGb9GmYy9w7Yyz2RciYcWOysFs eYEpU2M2UUL51Z2WLMIOksUTgE5Fgc2kTp6Jq30+BlbnqqXTAEUi0M82bBp5 PKlQP7sFrQHp4.Ibewws.QW657uc8qgiawTARPXzEb7Rgo6wvXufWbBCTGOb 33ORMssjuqGZ.T1Z2XgDK2PnqOcbhiT+3rT0wX8.AOcjjqUB8vREnt2OmOWc 3kGD5he7nKTiNPP3nYmYfh.OKvK5gCunbM0RSGK6Lz+Ihcg2O1YOFhLq7GHF n3cxk2gUf.uWOyRs7lSChipcPRvPEN.uietLy.WfYACfYCASgFqNPtFSfjnq yojAvoZbaKZM9O.EGiJc1HDMev2e2tcdT72kSkWItlQ8Vx719leyFlf4Wf9F l6SpkCRpWizbvVJBxGEE6wQEjqCNECuniZw5krJFuqqlVyxC.4I590cFnGW6 7qsWc1DCiyz6Xw8sUAuelzuRDdqq7ZqPENqXLmOVRpaWhpvNPO4GqiD.FRTT 4juXc0BkX1xwpPBCwNGjpsygoQF67zqame+g2WIkXVSEhhMHTvQz1ULd8BkU lvAZOECGHEqQBNY+HIXXf1HDBCmrHpBmBGLG0hsGQvAfHclFIbLS0PhkBM.x TwW5jwGkE1NBsjsyXZoTqrHr.0WqQVzHH1fLmBLzJKw3PlLo3R5PxwzRLWQL qYT3DynTc1kvHcj97folP+Jd+JhLVOpPlaVmK1608smjDHvKV809BB+eLKfI 2Yjw4LGNYYAjkAZeYKCfODpvc.UTXxFZdhWPt9X58z.ylZ+Z2rsQVuWs+mkm 3GEmDjJOEUhaWrRVFMtbg1nSFL6lJ9KdT098.KiKRa0kYbmSe.UwobeaLIDr 1mLoelbluFi1tjiwTUiGViELtLRpda3pFaw4mZsEDb7O8Xu0cyRbiXSw1UqL yQv.HhfsdcE1ZSmT6by5uDfjCapWaq8NFv9K0XAm0WMjWXElMzLY2t4tAIwZ jDkb8PL24xt+hvql8Uxwhgb9HpD0zan3K3Vj+fJ01BhcvAw7hHuRX4taZc7p AvXYz4Z4SB+GPtKBNlJq+7MRCtjf7X709pq7+j7A4aQjkj9BZGbgRCrH2m0b LSyQyK.Bz8feVkcS2C2JB82eW15IQc+yYUKaKe4gs+NyFmeMOxLYBBU+ZDNo Ovy5yFRY44QUqIkMLY7lNIzy9lsJR8Bkd1jjEJR8FClPI8aSWOZJbZwThM6b wOgZJ5ITSS7dWrMZR4ZFMcZJxVMENcZJzVMAmLMYyVW9sYMEC7hkOHeVWZot yGoNUZ.7dgrtMcBhgJwAflmON73E2Ao9dHM41jpAi.X2+9hriWMRolc+2861 wCOY2Obz5Dje+E5DfNvzVpgd5dOmjvoWRAOURJ0BEkMsPxBEkNBEIu3my+O. cgaQE -----------end_max5_patcher-----------
Hi Yair, thanks for that patch & shader, it will certainly work as a stopgap until its implemented in the driver (I’m using 1. 2. 1. as the shader settings). It doesn’t impact performance too much for me – still getting 60fps while running some jit-ogre models and videoplanes.
@carsol, all the behaviors you describe, and that I see in the patcher you provided, are expected behavior.
top-left: the matrix output from the depthmap is by default a long 1-plane matrix. The values that OpenNI and the Kinect save in cells of that matrix vary between 0 and 10000. When you force it into another matrix format with your jit.matrix, you lose a large amount of data. A char matrix cell only holds 8 bits (0–255). Since the max value in decimal is 10000, you need 14 bits to hold the info. Bits 9–14 are thrown away by Max in the forced jit.matrix conversion, and you are seeing the remaining lower-order bits of information.
top-middle: I see tiny changes in values all over the screen. This is expected behavior. The depth sensor is trying to measure in millimeters, so even the slightest movement is seen. Also, even though it gives values in mm, its measurements are not precise. You see variations from frame to frame due to the technology. When surfaces are hot or very reflective (like a mirror), you can see a lot of variation in values. All expected behavior.
top-right: in this example, you are using math in a jit.op to remove the lower 3 bits of information by shifting them to the right (the same as dividing by 2^3 = 8). The remaining bits are then used for display. All expected behavior due to your math.
bottom-right: expected behavior. You have the bits shifted by 3 (same as / 8) and then normalized for display in the window. Any subtle flickering is an inherent, expected behavior of the Kinect sensor as it tries to create millimeter-precise values from a $150 device.
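The bit arithmetic above can be checked with a short Python sketch (illustrative; Max does the equivalent conversions internally):

```python
# Kinect depth tops out around 10000 mm, which needs 14 bits to represent.
depth = 10000
print(depth.bit_length())      # 14 -- bits needed for the max value
print(depth & 0xFF)            # 16 -- what survives a forced 8-bit char conversion
print(depth >> 3, depth // 8)  # 1250 1250 -- a >> 3 shift equals integer division by 8
```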
@Andrew Pask, earlier in this thread transponderfish posted an Assert screenshot which identifies in text:
Program: C:\Program Files\Cycling ’74\Max 5.0\Max.exe
Expression: xOfAreaPercentage20 >= 0 && xOfAreaPercentage80 >= 0
If you are not the owner of the 5000+ line file (wow, that’s big), do you have an idea who is? Is it a file that is part of msvcrt.dll? Part of OpenGL?
The two Asserts we’ve seen look like either thread related code or graphics related code to me.
Thank you for your input. I previously used a similar google query. I couldn’t find anything with that filename that has that many line numbers nor those variable names. Leads me to believe it is a codepath within Max.exe’s process space which you (or I) indirectly call. Some code that is closed source.
I wonder if Windbg.exe could see the call stack when this assert is thrown?
@everyone: I am replacing the osceleton_legacy_raw attribute in the next release. The functionality will remain but exposed in a different way. The wiki has been updated with this notice. It will be a trivial update for anyone that is using that attribute now.
When I click on download at github, it downloads what appears to be version v0.6.6-4 from file diablodale-jit.openni-v0.6.6-4-g120a839.zip
Is this correct?
Awww. I didn’t see that limitation of GitHub. I’ve fixed it.
Issue is that GitHub only exposes "tagged" commits in the easy download button. I removed a bunch of old tags and added a new one for v0.7.1. You can see it in the download button now.
In the future, you can download any files in a commit (aka version) past or present that you want by clicking the "commit" in the top menu bar. Scroll to the version you want. Then click the tree hex number on the right side. There you can see all the files included in that commit (aka version).
We both learned something today. And it was all brought to you by the color red and the number 8. :-)
BTW, what is "projective coordinate support"?
You can have your OSC/skeleton data in one of three values types. See the wiki at https://github.com/diablodale/jit.openni for the documentation on how to activate it on jit.openni. I can imagine times when someone doesn’t want real-world coordinates from the OSC data; rather they want the projective coordinates (x,y pixel coordinates).
If you want to go down into the computer graphics rabbit hole, google for projective space, coordinate transforms, etc. One example is http://en.wikipedia.org/wiki/Projective_space
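For intuition, here is a hedged Python sketch of the pinhole math behind a real-world-to-projective conversion. OpenNI does this internally (ConvertRealWorldToProjective); the field-of-view constants below are rough Kinect approximations, not exact values.

```python
import math

XRES, YRES = 640, 480
H_FOV = math.radians(57.0)  # assumed horizontal FOV of the Kinect depth camera
V_FOV = math.radians(43.0)  # assumed vertical FOV

def real_world_to_projective(x_mm, y_mm, z_mm):
    # focal lengths in pixels, derived from the field of view
    fx = XRES / (2 * math.tan(H_FOV / 2))
    fy = YRES / (2 * math.tan(V_FOV / 2))
    # project onto the image plane; y is flipped (pixel y grows downward)
    px = x_mm * fx / z_mm + XRES / 2
    py = YRES / 2 - y_mm * fy / z_mm
    return px, py

# a point 1 m straight ahead of the sensor lands at the image center
print(real_world_to_projective(0, 0, 1000))  # -> (320.0, 240.0)
```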
Thanks! I was wondering about the coordinate system. Projective coordinates are exactly what I need for one current project.
Projective coordinates work great!
btw. here is a link to an automated way of installing everything needed to start development for kinect on mac/pc > http://zigfu.com/
The Zigfu OpenNI installer lets you set up your entire development environment in one click. It bundles together OpenNI, NITE, and SensorKinect and configures them automatically.
THANK YOU for zigfu. I couldn’t get it to work before this!! I LOVE YOU.
Shhhh, just between us (haha), I have a private working Max external that directly uses Microsoft’s Kinect SDK. Skeletons and motor control work; no rgb/depth images at this time. It is outlet- and message-compatible with my jit.openni, so you likely won’t need to do any rewrites.
The Kinect SDK has a bug which requires you to kill the Max process after closing it. That and its lack of the image/depth matrices will keep it private for now.
Cool! Can’t wait to try the new one.
By the way, I have had a Max/Jitter jit.openni Kinect app running for the past couple of weeks in a Univ gallery space. Seems to be working fairly well without requiring daily restarts or reboots. Now if visitors would only "read" and follow the cartoon directions… :P
Great work jit.openni !
Does jit.openni work with the Asus Xtion Pro?
I have never tested it with the Asus. The only two OpenNI "drivers" I’ve used are SensorKinect and Recorder. I only make OpenNI calls, so I would hope it works transparently.
Do an install of OpenNI as per the README (or Wiki) instructions. Install your Asus drivers. And try it. If you have problems, confirm that the OpenNI sample apps also work. If the OpenNI sample apps don’t work, there’s something beyond my control going wrong.
I’m interested to hear of your results.
diablodale & transponderfish, what’s the current status of multiple Kinect operation? any developments after the last bits mentioned in this thread?
thanks for the extensive investigation of the issue! right now i’m using a 2nd computer for the 2nd Kinect but I hope I’ll be able to run it all on 1 machine at some point.
btw, I found a programmer who wants to help me port the external to OS X. More on that asap.
I have not pursued multiple Kinects via OpenNI after our investigation together. OpenNI via PrimeSense has limitations in supporting multiple Kinects with skeletons; the overlapping lasers and resulting IR dots are difficult to discern. That’s the source of the jitter and poor tracking one gets. (FYI, the Microsoft SDK has the same limitation.)
OpenNI just today released a new version of their SDK. I don’t see any fixes addressing multiple Kinect usage.
I did see that someone out there is working on a wrapper API to allow easier multiple-Kinect usage. I’m unsure of its value because the Microsoft SDK already easily allows multiple-Kinect usage, and OpenNI can be hacked to get at least depth/imagemaps from multiple Kinects. But perhaps there is some value for your specific needs.
Over the last month, I was focused on building a Max external for the Microsoft SDK. I have that done. It provides only skeletons and is compatible with projects written with jit.openni. I’m keeping it private at the moment because of audio.
You see, the Microsoft SDK provides *rich* audio support. Max/MSP requires audio outlets to be the first outlets on an external. So I want to get that MSP part working before I release it and people start making projects against outlets that would change.
I have made a small set of private changes to jit.openni. I’m interested in your Mac developer assisting in making any needed code adjustments to make jit.openni cross-platform.
This is great. Other OSC solutions work fine as well, but it is preferable to have it all in Max rather than running other external software.
I am facing a couple of issues though..
a> I find it challenging to calibrate a user. Is it possible to track an upper body model only or just the hands, and how?
b> I am not sure how to receive each Joint data. Is it "OSC-route/*/limbname" ?
c> Can I use the attributes as messages?
I recommend you open the example patcher included in my distribution. It exercises much of the functionality and you can see how to use OSC-route with it.
Personally, and anecdotally from others, I have heard that calibration is pretty good with the release I recommend in the wiki: OpenNI 220.127.116.11 for Win32, PrimeSense NITE 18.104.22.168 for Win32.
Heads up: I have not tested jit.openni with the latest binaries that PrimeSense released within the last two weeks. If they maintained compatibility, it should work. ;-) I plan to test it out myself next week.
If you’re using something other than jit.openni, I can’t speak for its reliability. In all cases, you must assume the PSI pose. No option. But once you do, it takes about 1 second. It’s very fast. Oh…and be sure that you have a fast enough framerate. Slow framerates tend to calibrate and track skeletons poorly. I usually have at least one frame every 50 ms.
Can you track an upper body only? If you are asking about skeleton profiles, then I already have support for that in a private build of jit.openni (v0.7.4). I will be releasing it this weekend or early next week. I suspect there will be little CPU change by not tracking all the joints. I recommend writing your patcher to only use the joints you need. Then with the coming jit.openni release with skeleton profiles, you can switch to a profile matching your needs.
1. Hi everybody. I’m trying to download the example on GitHub but I can’t. Can someone post the example on the forum?
2. I have downloaded the external and my Kinect is working with my PC, so what’s the next step to make it work with Jitter?
3. Do I need an OSC program to make it work with Max?
Thank you very much…
"Over the last month, I was focused on building a Max external for the Microsoft SDK. I have that done. It provides only skeletons and is compatible with projects written with jit.openni. I’m keeping it private at the moment because of audio."
I am VERY interested in this. What’s the status?
I’ve got an idea!
A little bit off topic, but I will post it anyway :)
I thought it would be great to get skeleton data output in matrix format, like in jit.cv. It could be faster and easier to use, for example for a simple skeleton drawing. What do you think?
Any developments on "multiple kinects"?
Very interested in it as well :-)
Hi, everyone. I’m in the deep focus of an upcoming commission piece (3rd March here in Berlin) that uses the Kinect SDK version of my Kinect object. I’ve made the minor updates to support Kinect SDK v1.0 and all is well with it. Much stabler than earlier SDKs.
After I finish this art piece, I will turn to updating the object to support the same features as my OpenNI version. I don’t foresee any technical issues at this time. But first, I must focus on the upcoming show.
The good news is that it will have more features than the OpenNI version due to the Kinect v1.0 SDK’s functionality. I will likely release the object in 2–3 major steps: first feature parity, then support for multiple Kinects. I may update it later to support asynchronous data output, but I am not prioritizing that since Max patches tend to be synchronous, with a bang generating output (movies, gl, kinect, etc.).
Thank you Diablodale and I wish you success with your commissioned piece in Berlin.
Looking forward to hearing more news on further developments.
wow, it works very effectively :)
It doesn’t work with Windows XP, am I wrong? Because of the Kinect SDK compatibility?
after a lot of attempts, always:
"Can’t create any node of the requested type!"
No solution with xp though ?
Does anyone know how to get previous versions of the drivers?
I can’t find these files:
– OpenNI 22.214.171.124 for Win32
– PrimeSense NITE 126.96.36.199 for Win32
– PrimeSense Sensor KinectMod 188.8.131.52 for Win32
and with the most recent release it doesn’t work…
try this pack
edit: sorry, didn’t fully read your comment.
try here and there
great to hear that there will be a version with multiple Kinect support :-)
I am having a §$%&/ time "solving" this with an RTSP stream…
If you are in the Jitter world, just use jit.net.send/recv. We are currently streaming depth and rgb images from 6 Kinects (attached to 6 CPUs) to one central CPU with about 10 ms of latency.
Thank you so much for telling me – I got it working now after a good deal of trial & error. Everything is fine now with a latency of 10ms, 13fps on a matrix of 640 x 480.
I had tested net.send a few weeks ago and had really depressing results (latency ca 7 secs), so I went for the rtsp method, where I was able to reach ca. 50ms using VLC, so this looked to be the way to go.
For anybody fighting against the quirkiness of jit.net.send/recv: it looks like it is a bit unforgiving about the binding to a set IP address and port, and about how the workgroup is configured (public/private/workplace). At least these are my guesses so far – patience and lots of reboots were the key after all…
Thanks yair and diablodale!
I also find that when you install the latest Zigfu and replace the config file with my attachment, everything works great, even skeleton data without the T-pose. Weird :)
Anyway diablodale, how is the work going on the new version? I look forward to the upcoming update and can’t wait to test it :))
…and I’m back. Got all the after-work and video done at http://hidale.com/balloons/ for those interested.
I will start working on an update to the Kinect SDK version of my object late this week. I’m unsure if I will keep it closed-source or open-source. Reason being…
I haven’t had any contributors to my open-source jit.openni. :-/ I invite others to contribute to any needed updates to the jit.openni version. For now, it isn’t a priority for me to make code updates myself.
Will enjoy your latest development
if c, i will ++
always keep going
FYI, my existing jit.openni object appears to work with no code changes. On my windows machine, I downloaded and installed
sensorkinect 091 (which is based on 184.108.40.206)
The only issue is in the XML config file. You need to remove the Scene node. Somewhere in the newer OpenNI code (yet not in their changelog), they have decided to create a scene generator even if not requested. So putting a scene node in the XML file seems to create a conflict. Remove it, and you have a working system, with no pose needed.
I don’t understand the X, Y, Z coordinates that I receive from jit.openni.
I have values like 1300. to -1400.
I tried another type of OSCeleton a long time ago, and it went from -1. to 1.
And when I try to map these values to a 3D skeleton, it doesn’t work…
So my questions are: what is the range of the output values? And do you have any tricks for mapping the coordinates from jit.openni to a 3D skeleton, like in Diablodale’s installation Balloons?
Please read the wiki at https://github.com/diablodale/jit.openni/wiki, which should answer your questions on the coordinates. In short, they are real-world millimeters.
I do not recommend using OSCeleton's -1 to 1 values. The author guessed at the Kinect's behavior when coding the -1 to 1 scaling, and guessed wrong. No insult intended; it was just a wrong guess which led to incorrect functionality. My object does have legacy support to simulate this behavior, but I don't recommend it because that old OSCeleton behavior is inherently buggy. It's also covered in the wiki.
Tricks for mapping? Haha, that's the special sauce. I recommend you first create a simple 3D world with, perhaps, spheres for each joint. Once you get that working, start building toward your desired goal.
OK, thanks, I will try that.
I am trying to get the jit.openni external to work, but no luck… I remember clearly that maybe 6 months ago I was able to work with this external. Weird; or do we have to do something new to make it work?
Hi, I am getting this message
jit_openni: XMLconfig initialization failed (One or more of the following nodes could not be enumerated:
Device: PrimeSense/SensorKinect/220.127.116.11: The device is not connected!
Image: PrimeSense/SensorKinect/18.104.22.168: Can’t create any node of the requested type!
Device: PrimeSense/SensorKinect/22.214.171.124: The device is not connected!
jit_openni: XML config initialization open failed (Can’t create any node of the requested type!)
dumpout: read jit.openni_config.xml 0
I can't find what the problem might be; it was working last night before I went to bed, and when I turned on my PC today it just wouldn't work. I uninstalled all of the drivers and installed them again to be sure that's not the problem, but it doesn't seem to make any difference. Any ideas how I can trace the problem?
It seems the error is in layers outside my codebase. From what I can tell, the Kinect itself is not successfully connected and running at the device level. Be sure that you've plugged it into wall power and into a dedicated USB port, and that all 3 SensorKinect devices show up and are working in Device Manager.
@diablodale I reinstalled all the drivers again and checked whether I had left the external power supply unplugged, but I hadn't. I've got the 3 SensorKinect devices under PrimeSense in Device Manager working fine. To make sure there's nothing wrong with the Kinect itself, I installed the official Kinect SDK on another computer and it works fine there. I even tried the
sensorkinect 091 (which is based on 126.96.36.199)
you suggested, and that didn't work either. Is there any way to test whether OpenNI and NITE are working on their own?
Are you thinking of upgrading your object to work with Kinect SDK 1.5? It seems there are a lot of new features and better efficiency… :)
@ElectronicElement: The OpenNI and NITE SDKs come with samples. You could try one of those samples to see if they work for you.
@carsol: The Microsoft Kinect SDK version of this object, which I have written for my private use, works unchanged with Microsoft's new v1.5 SDK. I haven't released this object to the public yet. They added orientation, which allows me to make the object even closer in behavior to the OpenNI/NITE version. Also of interest to me is the work Microsoft did on face tracking.
The main reason I haven't released it yet is the audio support in the Microsoft Kinect SDK. I am learning how to write an MSP object in C so that I can expose this data (or part of it). My current challenge is learning how to be an initial source of audio data and thereby control the sample rate. If anyone knows, please do point me to docs or educate me.
@diablodale I guess it isn't normal that in the OpenNI directory ("C:\Program Files (x86)\OpenNI") most of the folders are empty? All folders apart from Bin, Data, and Driver are empty… there isn't a single file in the Samples directory, only empty folders…
That's for OpenNI 188.8.131.52/NITE 184.108.40.206 for Win32 + SensorKinect Win32 Device Driver v0.7 (220.127.116.11).