Here is my working jit.openni Max Jitter OpenNI external for Windows
I have written a Windows Max Jitter external for OpenNI. I reached a
major milestone today.
https://github.com/diablodale/jit.openni is the location of the
project and within the bin directory are the Win32 external, XML
config file, and a sample patcher.
It has been casually tested with SensorKinect and OpenNI binaries.
Please see the README file for install instructions.
Currently it supports:
- ImageMap RGB24 output in a 4-plane char Jitter matrix
- DepthMap output in a 1-plane long, float32, or float64 matrix
Its output is very similar to jit.freenect.grab and therefore it can
often be used in its place with small changes. My object outputs depth
in mm and jit.freenect.grab outputs in cm. A simple math op can
resolve this. Note, my object does not provide the "raw" values of
jit.freenect.grab; instead it provides the mm depth values via OpenNI.
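For example, a minimal shim (a sketch, assuming you configured the depthmap as a float matrix): insert [jit.op @op / @val 10.] after my depth outlet, and downstream patching built for jit.freenect.grab's cm values should work largely unchanged.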
I would like to see support for other generators (skeletons, gestures,
hand point, etc.) in the future added by myself or with the assistance
of others.
If you find problems and can reproduce them with clear steps, I
encourage you to open an issue at https://github.com/diablodale/jit.openni/issues
Awesome! ;)
Do I have to throw out the MS sdk for this to work?
I suspect you would need to uninstall Microsoft's Kinect device drivers since OpenNI needs OpenNI compatible drivers (e.g. SensorKinect).
This is a Win32 Jitter external. No other Win32 external exists of which I know. I wrote this because I needed an external and could no longer wait for Microsoft's SDK. Also, this external is intended to work with any OpenNI device.
I've posted an update. It now supports rgb, depth, and ir cameras with outputs in flexible matrix types. Next up...skeletons.
It is likely I will port this to Microsoft's SDK; I like their non-pose needed calibration. However, my priority is getting an object working and tested for my next art installation.
I tried to install but kept getting an error that OpenNI.dll wasn't found. I copied the OpenNI.dll file to the local directory but that didn't help.
Do I have to throw out the MS sdk for this to work?
@diablodale
Awesome
Yeah, I think because you don't need a calibration pose using the MS SDK - that makes it a lot better for interactive art installations.
I have heard some say that the NI skeleton is faster, though, but on my computer (Windows 7 64-bit) the MS SDK skeleton seems faster (I haven't tried it with many applications though).
dambik, would you please send me detailed repro steps as well as your computer software/OS setup. For example (this is only an example, you will need to write your own setup and repro steps)...
setup---
Windows 7 Ultimate x64 SP1
Max/MSP/Jitter 5.1.8 for Windows
repro steps----
1. I first installed OpenNI 1.1.0.39 for Win32
2. Then I installed PrimeSense NITE 1.3.1.4 for Win32
3. then I installed PrimeSense Sensor KinectMod 5.0.1.32 for Win32
4. then I copied the jit.openni.mxe, jit.openni_test1.maxpat, and jit.openni_config.xml into the same directory.
5. then I double clicked on jit.openni_test1.maxpat
result---
I got an error from (windows, max, etc) saying "blah blah"
expected--
blah blah blah
I uninstalled everything and reinstalled as outlined below. The OpenNI.dll not found error is gone, but now there's an XML initialization error. Is the Kinect SUPPOSED to appear as three unknown devices in Device Manager? Or are there other drivers I'm supposed to install?
Windows Vista x64 Ultimate SP2
Max/MSP/Jitter 5.1.8 for Windows
Downloaded and installed the following in order:
OpenNI-Win32-1.1.0.41-Redist.msi
NITE-Win32-1.3.1.5-Redist.msi
Sensor-Win32-5.0.1.32-Redist.msi
Extracted all files from diablodale-jit.openni-1cd3781 and ran jit.openni_test1.maxpat from /bin directory
Clicked on read jit.openni_config.xml message object
Max then returns the following error:
jit_openni: XMLconfig initialization failed (One or more of the following nodes could not be enumerated:
Device: PrimeSense/SensorV2/5.0.1.32: The device is not connected!
Image: PrimeSense/SensorV2/5.0.1.32: Can't create any node of the requested type!
Device: PrimeSense/SensorV2/5.0.1.32: The device is not connected!
)
OK, I've got jit.openni working fine now. I needed avin2-SensorKinect-28738dc.zip for the necessary Windows drivers. Was Sensor-Win32-5.0.1.32-Redist.msi the wrong package to install?
I use the same install components written up in the README, part of the distribution at https://github.com/diablodale/jit.openni
I always encourage people to look there because that will have the current info, while the very post I'm writing now will become out of date.
You appear to be using different OpenNI and SensorKinect versions than I am. At the moment, as in the README file, I am using
- OpenNI 1.1.0.39 for Win32
- PrimeSense NITE 1.3.1.4 for Win32
- PrimeSense Sensor KinectMod 5.0.1.32 for Win32
Good to hear it's working for you. Skeleton will likely start showing up in the codebase next week. I have been reading the needed sections of the two SDKs, thinking through my coding approach and how the output of the data should look.
Do you have an opinion on how the skeleton data should be output? My first thought is to output some block of data which would hold all joints and all locations for those joints.
If I used a matrix of type long, I would require the patcher to know the integer-to-friendly-joint-name mapping. I hesitate at this approach.
I could instead use a long list where the 1st element is the friendly name for a joint, next is the location of that joint, 3rd is the friendly name for another joint, next is... and so on. This would work but then requires list management to split it up into separate messages (zl iter 2) for storage and action. I lean towards this approach.
Anyone else have an opinion on how the external should output skeleton data?
I don't have access to a PC at the moment because I'm moving now...
Thanks for the good work.
From what I tested with OSCeleton, it's difficult to know, within the series of joints, when a new frame is starting.
And it sends only the active joints, each independently, with the user ID appended.
I sometimes reformatted the TUIO into a one-line list, sending it as one frame through OSC from my tracking analysis to the main patch, then cutting it with [Lchunk |]. This way I can store all the data in one list for a coll...
| frame 145 | active 0 2 4 5 | 0 0.345 0.567 | 2 0.24 etc...
perhaps an approach with:
frame 145 (or new frame)
user 1
active left_hand right_hand etc...
left_hand 0.23 0.45
right_hand 0.67 0.78
user 2
etc..
bang (end of sequence)
that could easily be repacked with a subpatch:
| 1 145 | active left_hand right_hand | left_hand 0.23 0.45 | right_hand 0.67 0.78 |
| 2 145 | etc...
Well, what seems important to me is to be notified by your external of the beginning and end of the frame.
And the fact that some joints appear or disappear makes it difficult to pack into a numeric-only list or matrix.
my 2 ...
By the way, I would be really happy to have an external that could output at the same time:
depth map + skeleton (+ eventually the rgb).
That would allow spanning 2 parallel processing chains of the Kinect data for a richer output:
depth map > fluid3D analysis in matrixes
skeleton > joints processing (interact, 3D object, particles...)
nice! anyone up for making a Mac version? i wish i could do this myself...
"You appear to be using using differerent OpenNI and SensorKinect versions that I am using."
I actually looked for the same versions listed in your README file but couldn't find them for some reason. Perhaps I looking in the wrong place? Luckily, the most recent verions (unstable) worked fine.
"Do you have an opinion on how the skeleton data should be output?"
I do like the simplicity of being able to use ROUTE and/or UNPACK objects to parse the data. Maybe a fixed-size implicit list with a header (sequence number, player number) and a series of location coords?
Diablodale, a BIG thanks for doing this :-)
@offceplus: google "register dll"; it is necessary to tell Windows where a DLL is located.
"Officeplus" appears to be a spambot parroting random posts and inserting commercial links.
v0.5.0 is up on github. I now output userpixel maps aka xnGetUserPixels(). This was a good stepping stone on my way to support skeletons.
I am going to try a few skeleton output approaches privately and then release one of them for everyone's feedback. After I hear your feedback, I may change the way skeletons are output and it may not be backwards compatible.
** I don't promise any backwards compatibility as this is all prerelease, pre-version-1.0 coding. **
I recognize that I can't predict everyone's uses. That is why I want your feedback. I ask that you consider I may not be able to accommodate everyone's wishes. However, I do hope the final skeleton output can be massaged by Max to be whatever you want it to be.
COOL! Got a user map after doing a quick calibration dance. Very nice. How do I get the user ID value(s)?
BTW, what's the purpose of the jit.op operators in the example?
v0.6.0 is now up on GitHub in the normal place. I now output user skeletons.
This preliminary version uses an OSC format and leaves all the values as floating point, as OpenNI outputs them.
/userid/jointname x y z confidPosition x1 x2 x3 y1 y2 y3 z1 z2 z3 confidOrientation
This release also has attributes to filter data based on position or orientation confidence, to display or not display the orientation data, and to smooth skeleton data using OpenNI's smoothing API.
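For example, a single joint message for user 1's right hand might look like this (hypothetical values, and assuming the joint naming used in the sample patcher): /1/right_hand 123.5 310.2 2045.7 1. 1. 0. 0. 0. 1. 0. 0. 0. 1. 0.5 - that is, three position values in mm, position confidence, the nine orientation values, then orientation confidence.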
Caveats:
- I do not yet output user seen/lost, calibration started/failed, or pose events via OSC.
Questions:
- Are you getting valid skeleton data?
- Do you like this OSC format for skeleton data?
- For position data, should I keep it as a float or change it to an integer? If change it, why?
@dambik, the jit.op objects are scaling values so they look like a smooth ramp of greyscale rather than psychedelic waves of black/white.
I got v0.6.0 working - happy :-)
To share my installation experiences:
- I did not find the versions from your readme on the OpenNI site, but these did it (Win 7 Ultimate, 32 bit):
OpenNI-Win32-1.1.0.41-Redist.msi
NITE-Win32-1.3.1.5-Redist.msi
avin2-SensorKinect-28738dc.zip (used this one according to dambik's hint)
Initially, I got error messages about the XML file being corrupted and parsing errors. The cause was obviously that I downloaded the three files by right-clicking and using "save as..."
Downloading the complete zip cured this.
As a start, I now moved around in front of the Kinect with my daughter a bit and watched the patcher window. I see output in the jit.pwindows, but not in the 3rd one.
In the 4th we both were clearly distinguishable as separate persons. Does this mean I am getting valid skeletal data?
@dambik: I almost think it is obligatory to post something naive/silly or even stupid, when it is the very first contribution to a new forum. Regard it as my way of showing that I am not a bot :-)
"I almost think it is obligatory to post something naive/silly or even stupid, when it is the very first contribution to a new forum. Regard it as my way of showing that I am not a bot :-)"
Welcome aboard, Not-a-Bot. :) Heh, I recently got caught by that silly spambot too.
OK, I got v0.6.0 set up and running today. No skeleton data all afternoon, so I finally dragged the Kinect back to my desk and BINGO! I think I had the same thing happen when I tried the NITE examples, come to think of it.
So, here are some observations from my tests:
- Are you getting valid skeleton data?
I checked the right_hand skeleton data against the center pixel value in the depth map and they did appear to agree when they overlapped. Didn't seem to get any data on the right_finger at all and the left_finger didn't appear to be tracking. That's all I was able to test today.
- Do you like this OSC format for skeleton data?
Not really. I haven't figured out how to parse the forward slashes in Max, so I ended up using statements like ROUTE /3/RIGHT_HAND, which treats everything from the first / to RIGHT_HAND as a single word. No user number unless I can parse the forward slashes.
By the way, I was wondering if a sequence number or frame number might be helpful for grouping data across players.
- For position data, should I keep it as a float or change it to an integer? If change it, why?
You're outputting millimeters, right? In that case, I don't see that it matters that much.
Speaking of coordinates, if I move 3 feet directly to my right relative to the Kinect, does the Z distance also change or just X? In other words, are the coordinates completely orthogonal?
@transponderfish, the 3rd window is likely black unless you reconfigured the XML configuration file. That pwindow is hooked to the IR output. The Kinect (or its drivers) does not allow IR output at the same time as RGB output. You have to choose only one.
@dambik, I too am not getting finger data. I ask for it in code, but it returns zeros. It is possible that NITE or SensorKinect does not support that joint at this time. I've opened an issue at https://github.com/diablodale/jit.openni/issues so I can continue to look at this and if resolved have the answer available for others.
@dambik, the object that is soon-to-be-your-best-friend is OSC-route. UC Berkeley has written and released a fantastic package of Max externals. You can get them at http://cnmat.berkeley.edu/downloads. You can download just the OSC ones (they start with "OSC-") or at the top of the page is the Everything package. The OSC-route object will do all the splitting of the message you want: routing, ranges of values, wildcards; it's great!
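To sketch the idea (assuming the current /userid/jointname format): [OSC-route /3] passes only user 3's messages and strips that prefix, so a following [OSC-route /right_hand] leaves just the x y z and confidence values for that joint.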
@dambik and @transponderfish, previous versions of software on github can usually be viewed/downloaded by clicking on "Commits" on the top menu bar of github. Listed there are all the commits (uploaded snapshots) that an author uploads. On a line is usually a version statement. On the version that you want, click the "tree" hex number in the right column and there you can view the entire tree for a particular version. Download from that tree whatever you would like.
@everyone, a private release of mine has changed the skeleton data to start with /skel and fixed the outlet ordering. I am close to having the user seen/lost/etc. events output once I fix a few bugs related to new callback functionality I added.
@dambik, I have been thinking about the sequence number or frame number request. I ask to understand more about your need. The reason is, there is some timestamp and frame data that OpenNI exposes for a given frame of mapdata (rgb, depth, and ir). The usefulness of that data in an environment outside the external (e.g. in Max) is in question.
You mention grouping data across players. Can you describe a scenario more fully? Or perhaps a few more examples so I can get my head around the request?
Last night I spent several hours with your external and Vizzie (to get a feel for it, before I ask specific questions or even make requests), the results are amazing!
Right now I have tested it with 3 players; all 3 of us were tracked without problems. (I am working with the 4th output mostly, feeding a SKETCHR to get outlines. We never merged into a large blob, and what pleased me most was that my honey's skirt was always there - which is amazing to me, considering the mess in my room. I stayed in the background and often disappeared, but it was sufficient when the computer recognized my head and an arm. As I said, never a large blob or confusion between our limbs.)
I looked into the XML; I assume this is explained in the OpenNI documentation, right? I read it some weeks ago and will do so again in the next few days.
More observations:
- I get output in the Max window when I click "summary". But: yesterday I suddenly got continuous skeletal data there (no idea what I did...), today not.
- Everything looks fine and stable, but I still need a few attempts to get things running (I still haven't found out if I have to load the XML file with "read" or if the "read jit..." button is sufficient; everything is in the same folder, in the Max folder in my Windows documents path). Right now I have managed to crash the whole thing (no idea how, hahaha) - I will post my observations on how to restart everything, because ending Max and un-/replugging the Kinect is a bit annoying. AND it is almost impossible to crash a program you have written yourself; you always know what not to do... I know that from experience.
v0.6.1 is up on GitHub. I now output useful skeleton events (new user, lost user, calibrating, etc.).
I'm going to stop coding new features for now. The features that are there need to be used and tested, and I would like everyone's feedback. If you find bugs or crashes that you can reproduce, please open an issue at https://github.com/diablodale/jit.openni/issues
@transponderfish, yes, the OpenNI documentation goes into its format. There is not a lot of tweaking you can do other than adding/removing/switching to different nodes (image, ir, depth, user). The SensorKinect driver and the Kinect itself impose them. If you remove nodes that you do not need, you will lessen your CPU and memory usage.
@transponderfish, the sample patcher I provided printed out all the OSC messages. You have to be calibrated to get OSC joint info. The NITE driver requires you to stand in a psi pose. Once calibrated, the joint data is output. Also, in high bandwidth situations the Max window doesn't always keep up with printing them. That could have been what you saw. The new sample patcher I provide doesn't print them all out, rather just the events. It does flash a button to show the OSC joint data. There's a lot of it.
@transponderfish, if you can get a repro of the crash with clear steps, I can try and track it down. Until then, you can a) send a read message (you will be prompted for the file) or b) send read and the filename and it will auto-load for you.
Is it possible to have a clip or a visual to see what's happening when it's working?
Thanx, Ben
> @dambik, the object that is soon-to-be-your-best-friend is OSC-route. UC
> Berkeley has written and released a fantastic package of Max
> externals. .... The OSC-route object will do all the splitting of the
> message you want: routing, ranges of values, wildcards; it's great!
I was hoping you weren't going to suggest that. :) I have numerous machines to install it on and I didn't really plan on using OSC. You might be able to convince me of the advantages, however...
> @dambik, I have been thinking about the sequence number or frame number
> request. I ask to understand more about your need. The reason is, there is some
> timestamp and frame data that OpenNI exposes for a given frame of mapdata
> (rgb, depth, and ir). The usefulness of that data in an environment outside
> the external (e.g. in Max) is in question.
>
> You mention grouping data across players. Can you describe a scenario more
> fully? Or perhaps a few more examples so I can get my head around the
> request?
One concern is whether player skeleton data always arrives in order for the same frame. For example, frame 1 gives me player 1 then player 2 data, frame 2 gives me player 2 then player 1, frame 3 gives me player 2 then player 1.
Another issue is missing data. For example, suppose I'm calculating the lateral velocity of right hands by determining their change in X position and dividing by the framerate period. If a user lost a frame of data, a sequence number would tell me to either invalidate the current calculation OR to use a different time period for the velocity calculation. A sequence number might also help provide a warning if there are a high number of missing data sets (e.g., player 2 disappears for 10 seconds then reappears for whatever reason).
Perhaps I'm a bit paranoid, but when working with Arduino and XBee, sending a sequence number really helped me address problems with missing, duplicate and invalid data sets.
Everyone, I have put install/usage documentation on the Wiki at https://github.com/diablodale/jit.openni/wiki
@dambik, it's not difficult for me to add an alternate output format for the skeleton data. One approach I am considering is to have 3 output formats controllable via an attribute:
1) my personal preference via OSC
2) OSCeleton-puppet format
3) a Max-only list format (this is probably what you are desiring)
For the 3rd, I would likely take the OSC of (1) and just remove the slashes. Would something like that work for you?
Dambik, now time to geek out so we can get our heads around this timing topic. BTW, it's interesting you mention Arduino & XBee; I think we have some similar project pieces. ;-)
The data output from the outlets is initiated by a bang on the jit.openni inlet, not by a specific frame rate/output of the Kinect. The Kinect is sending data constantly to the computer. It's only when you bang that I ask OpenNI for a snapshot of that data. You could, however, bang the jit.openni object at a rate equal to or faster than the FPS configured, which would *likely* get you every possible frame.
The data output from the outlets of jit.openni is done in the standard Max ordering (right to left). If it is not, then that's a bug I need to fix.
When a bang is received, here is the flow that occurs (a code sketch follows the list):
1. Get a snapshot of whatever data is currently available; do not block and wait for new data
2. create matrix of depthmap (if defined in XML) and queue for output
3. create matrix of imagemap (if defined in XML) and queue for output
4. create matrix of irmap (if defined in XML) and queue for output
5. create matrix of userpixmap (if defined in XML) and queue for output
6. output tracked skeletons (if defined in XML via user node) and queue for output
for each tracked user skeleton...
for each of the 24 OpenNI joint types...
check against confidence filters; if a joint doesn't pass, go to the next joint without output
queue output for joint OSC data
7. output all data via outlets; should be in standard right->left ordering
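For anyone curious how that flow looks in code, here is a minimal C sketch against the OpenNI 1.x C API (xnWaitNoneUpdateAll, xnGetDepthMap, xnGetUsers, and xnGetSkeletonJointPosition are the real OpenNI calls; the Max matrix/outlet plumbing is omitted, and the actual jit.openni source differs):

/* hedged sketch of the per-bang flow; not the actual jit.openni source */
#include <XnOpenNI.h>

void on_bang(XnContext *ctx, XnNodeHandle hDepth, XnNodeHandle hUser)
{
    XnUserID users[15];
    XnUInt16 nUsers = 15;
    int i, j;

    /* 1. snapshot whatever data is available; never block waiting for a new frame */
    if (xnWaitNoneUpdateAll(ctx) != XN_STATUS_OK)
        return; /* -> "jit.openni data unavailable for request" */

    /* 2. depthmap: one plane of XnDepthPixel (millimeters) per (x,y) */
    const XnDepthPixel *dmap = xnGetDepthMap(hDepth);
    /* ...copy dmap into a 1-plane long/float32/float64 Jitter matrix and queue it... */

    /* 6. tracked skeletons */
    xnGetUsers(hUser, users, &nUsers);
    for (i = 0; i < nUsers; i++) {
        if (!xnIsSkeletonTracking(hUser, users[i]))
            continue; /* user seen in the userpixmap but not calibrated/tracked */
        for (j = XN_SKEL_HEAD; j <= XN_SKEL_RIGHT_FOOT; j++) { /* the 24 joint types */
            XnSkeletonJointPosition pos;
            xnGetSkeletonJointPosition(hUser, users[i], (XnSkeletonJoint)j, &pos);
            if (pos.fConfidence < 0.5f)
                continue; /* confidence filter: skip joint, no output */
            /* queue OSC: /userid/jointname pos.position.X .Y .Z pos.fConfidence ... */
        }
    }
    /* 7. send all queued matrices and messages out the outlets, right to left */
}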
The UserID values in OpenNI (and therefore jit.openni) are not guaranteed to be in sequence. It's easy to get out of sequence by people disappearing and reappearing, causing the skeleton to no longer be tracked and therefore have no output. For example, in (6) above, it is possible to have a user (seen in the userpix map) but not being tracked for skeletons due to calibration failure, etc. I recommend using the userID value to track a user, not the sequence of data output by OpenNI for a given frame snapshot.
I do believe as I currently have it coded that for every bang to jit.openni you should get:
1) all tracked skeleton data output for a given frame snapshot
2) matrices output for any configured depth, image, ir, or usermap node in XML
You shouldn't have a scenario where in one frame you get user3 and user2, then in the next frame get only user 3, then in the next frame go back to getting user3 and user2. The only scenario in which that would occur (which is highly unlikely) is if between those frames user 2 was lost, then seen and calibrated in one frame. It is possible for your code to catch this scenario by watching for the user events like "lost_user" and "calib_success".
I may implement in the future an attribute which if enabled causes old (repeated) data to not be output. Not to worry, the default will be as it is today which is to always output data even if it is old/repeated.
Since there is a continuous flow of data from the Kinect, the timestamp between frames steadily increases. It is measured in microseconds, and I cannot find documentation defining what timestamp=zero was. I did a quick debug code change and was surprised to find that the frameIDs start with 1, increment by 1 with every snapshot I take in code (a bang to jit.openni), and do not skip numbers. I cannot find any documentation explaining the behavior of the frameID, so what I've observed could be circumstantial rather than guaranteed.
It isn't practical for me to embed a frameID in the matrices themselves; I would only provide it on an outlet like the skeleton OSC, or make a dedicated outlet on the far right just for a frameID. However, I wonder if instead you should generate your own frameIDs or timestamps. When you bang jit.openni, also create a frameID that is associated with your initiated bang, perhaps using a counter specific to your implementation.
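As a sketch of that last idea (ordinary Max objects, nothing jit.openni-specific): a [t b b] driven by your metro can send its right bang to a [counter] and its left bang to jit.openni, so every snapshot you pull is paired with a sequence number that you generate and control.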
OK enough technical info from me. What is your take on all of this?
Thanks for documenting! I have not had the time so far to try out v0.6.1, but I certainly will on Monday.
To give some visual feedback I have uploaded my first tests to YouTube:
https://www.youtube.com/watch?v=y2u3C_s1BME
I will also check where in the chain the upper 5% of the image is lost...
@transponderfish, perhaps one contribution to your issue is the translation that is occurring. In the sample config XML file I provide, I include a setting on the depth node that points it at Image1 (the forum stripped the XML snippet from this post).
This configuration does the math to enable an OpenNI user to "overlay" the depth data onto the RGB data pixel by pixel. Using that translation, the depth data is smaller in width and height than the RGB data. Depth will no longer be a full 640x480.
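(For the curious: in the OpenNI C API this translation corresponds to the AlternateViewPoint capability, i.e. a call like xnSetViewPoint(hDepth, hImage); I believe the XML property maps down to that call, though I haven't traced it.)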
"its not difficult for me to add an alternate output format for the skeleton data. One approach I am considering is to have 3 output formats controllable via an attribute:
....
3) a Max-only list format (this is probably what you are desiring)
The 3rd I would likely take the OSC of (1) and just remove the slashes. Would something like that work for you?"
That would be perfect.
"now time to geek-out so we can get our heads around this timing topic."
I'm going to take a little time to think over what you wrote...
@Diablodale:
Today I have tested v0.6.3; it works like a charm, and it is using less processor than v0.6.0.
What I definitely need is support for TWO Kinects running at the same time on the same machine. The requirements for this are described somewhere else on this forum; I have sorted that out and confirmed that this is set up correctly here (they need to run on two different USB controllers, which is not to be mistaken for inputs/plugs. It can be checked with a piece of software named USBlyzer; trial and error can do it too, but why rely on that when there is a proper tool available?)
Now, I have tried to create a patch with two jit.openni modules to see what happens. It looks like the patch is preferring the Kinect which is listed first in USBlyzer. (Both Kinects are not working at the same time and I cannot choose, but I did not expect that.) It also looks like both jit.openni modules are working completely independently of each other (I can tell that from the little glitches that happen in the 4th output; they are different), which is 100% positive and promising for what I need to do with it, yippie!!!
Reading the OpenNI documentation (page 9) implies to me that an easy solution would be to edit two XML input files, which say something like "Kinect 1" and "Kinect 2" at the beginning, and feed them to two jit.openni modules - what do you think?
(the Mac-only external by JMPelletier works with an "open 1" and "open 2". Does your external understand any similar commands?)
I am personally not interested in the OSC output, but to give you feedback: after doing the necessary pose, I got output. Works. Please ask if I should check more things.
NOTE! OpenNI and PrimeSense released a lot of new code two days ago. Some of it includes changes to the XML configuration support; some APIs were deprecated. I have not tested my code against this new drop of OpenNI, NITE, and SensorKinect. I am definitely testing and updating everything to this new code by 20 July because of new features I want.
@transponderfish, I'm interested to hear of any success on two simultaneous Kinects using OpenNI. I did have this in mind when I was writing; however, I do not have 2 Kinects myself for any testing. This is the approach I took:
- leverage the documented OpenNI XML config files; this allows rich configuration with inherent compatibility at no development cost to me, and allows arbitrary numbers of devices, generators, etc.
- allow the default "sharing" of devices mode rather than exclusive locking
- each jit.openni creates an independent OpenNI session which directly maps to the XML configuration file. You can point to the same config file or separate config files.
Given that, create two different XML config files. Two separate jit.openni objects. Then send a read message to each jit.openni referencing the independent XML files. Now.....
What I do not know is how the OpenNI middleware, NITE, or the SensorKinect driver will behave with two Kinects. Given SensorKinect is a hack, I would look there first if something doesn't work. The simplest thing that I suggest testing is an ImageMap. Create two separate XML files, simplifying them to only have one node element for an image (the forum stripped the XML tag from this post). Also, in those separate config files, there has got to be a way to describe which device is associated with which node rather than the default behavior you are experiencing. What I am unable to find is how, in XML, to specify a specific Kinect.
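For reference, the kind of minimal file I mean would look something like this (a sketch following the documented OpenNI XML schema; the node name and output mode here are arbitrary):

<OpenNI>
  <ProductionNodes>
    <Node type="Image" name="Image1">
      <Configuration>
        <MapOutputMode xRes="640" yRes="480" FPS="30"/>
      </Configuration>
    </Node>
  </ProductionNodes>
</OpenNI>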
You can specify Queries for Nodes which can isolate a specifically desired device. But the docs I've found so far would only differentiate between different types of sensors, not multiples of the same type. I can see in the C code that they've added low-level support, but I can't yet find the XML parsing that utilizes it.
https://github.com/avin2/SensorKinect is the driver of which I'm speaking and OpenNI.org has all the middleware.
Hey diablodale, nice work on getting a PC max external happening for the Kinect! I'm really looking forward to getting it running.
I've happened upon an issue when trying to create your jit.openni object. I receive an error message:
"tooltip: Max.exe - Entry Point Not Found
The procedure entry point xnGetBytesPerPixelForPixelFormat could not be located in the dynamic link library OpenNi.dll"
...causing the object to be disabled.
I was running 3-month-old OpenNI and NITE middleware, which I thought could be the issue. So I've updated to the latest stable builds and receive the same error. Potentially I've missed something else along the way.
I'm running: Win7(64-bit), Max 5.1.8, PrimeSensor 5.0.3.3, OpenNI 1.3.2.1
I installed your object by copying jit.openni.mxe and jit.openni_config.xml into my user object library. But I haven't touched anything in the config file.
@Diablodale, I have done as you said: created two simplified XML files, read from two jit.openni modules. The result is the same as before; I get the same output from both of them.
I tried a bit of hacking on the second one (the forum stripped my XML from this post):
As you see, changing the name into "Device2" at least works so far that it does not crash. The serial number is there because it is the only difference between both Kinects I found in USBlyzer (it is commented out because Max gave me the error message "corrupted XML"). I then searched the web for a hint on the correct XML syntax here, unfortunately found nothing yet, but I stumbled upon this thread:
http://groups.google.com/group/openni-dev/browse_thread/thread/a667d6066d144d71/a5fff1a47170a132
Now, I do not trust everything I read on the web, but somebody there says that XML is not working for this purpose and that the key is in the driver/external. On the other hand, the more I read the OpenNI documentation, the more I believe it must be possible (I found more discussions by programmers; if it helps I can post links here).
Question on the MAX window, when I get: dumpout: read jit.openni_HACK.xml 1
The "1" means "success", right?
@kwijy, now that's an odd error. I call that function extremely often and it's needed. Why your OpenNI.dll wouldn't support it is unknown. I recommend one of three things:
1) use the exact DLL versions that I describe in the README (note that while you are running on Win64, you must have OpenNI 32-bit installed)
2) wait until I update the codebase to support the newer OpenNI DLLs released in the last 4 days.
3) rather than moving those 2 files into your user object library, put the three files jit.openni.mxe, jit.openni_config.xml, and jit.openni_test1.maxpat all into a folder on your desktop. Run that patch, click the read xml file message box, and then start the metro. What happens?
@transponderfish, it is possible that some part of the codepath in OpenNI or SensorKinect hasn't fully exposed enough information to differentiate between two Kinects in XML. OpenNI's XML config spec allows arbitrary data that is driver-specific... perhaps some code work needs to be done here.
@transponderfish, correct. it is modeled after the jit.qt external. I updated the wiki with the info https://github.com/diablodale/jit.openni/wiki
@diablodale, I can report a major success in my dual-kinect-research: your external is definitely working with two kinects attached :-)
Here's what I did:
- Considering the info USBlyzer gives me, the first Kinect listed is KinectA, the second KinectB
- as you suggested, I set up a minimal MAX-patch and two XML-configs.
- I unplugged KinectA, opened the patch, loaded the config for KinectB, started it
- plugged in KinectA again, loaded the config, started it: both are working independently.
I made some guesses on what the XML config is expecting; nothing is working here - at least I found out what the external accepts as valid. Obviously it ignores anything non-intelligible, as long as it is well-formed. This is accepted, but has no effect (unfortunately the forum stripped the XML tags from this post, leaving only the values "1"):
(some of it is based on step 11, found here: http://www.kinecthacks.nl/kinect-tutorial-4-setting-up-mt-kinect-package-of-openexhibits/
Well, it was worth a try...)
Maybe this is relevant for you, maybe not: in the Microsoft SDK (5 days old) they give info on how to access a specific device by code (page 16):
http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/docs/ProgrammingGuide_KinectSDK.pdf
I will now set up the whole thing on my notebook, test your new external and see what happens. Then I'll google on about XML.
@diablodale Thanks for the reply, mate. I found and installed the version of OpenNI you listed, but couldn't find the compatible NITE version. I tried to load it anyway, as you said in your 3rd potential fix, but it just crashed Max. But no worries, I'll sit tight for your next build.
Thanks again for your help. Looking forward to your update! Cheers.
@Diablodale & @all: Today I managed to set up it up on my notebook (2.2GHz Dualcore, Vista), it was a bit of a hassle, so here are my experiences:
- like kwijy, I did not find the exact versions listed, but when I checked those from my desktop PC and installed these, it worked (OpenNI 1.1.0.41, NITE 1.3.1.5, SensorKinectMod 5.0.1.32).
- I uninstalled all other drivers first, rebooted, installed everything, rebooted. It did not work at first; I found out that the SensorKinectMod drivers were not active. Go to Device Manager and see if there are XBox NUI drivers left under HID devices; if so, uninstall/delete them. Replug the Kinect and point Windows to the extracted SensorKinect folder. It takes a while until Windows sorts everything out; wait until you see its own folder "PrimeSense", listing Kinect Audio, Kinect Motor and Kinect Camera.
@Diablodale: I managed to run both Kinects on the Notebook using the trick mentioned above. It is running for over 2hrs now without crashing :-) CPU is at 60-90%.
Things look far more nervous now than the results I posted in the video (I think it was v0.6.0 or v0.6.1 running on my 3 GHz quad-core desktop with Win 7):
Looking at the 4th output, I am gone very often - sometimes a single limb, sometimes completely. It now needs my whole body to track me; when I sit down, I am gone completely. The previous version only needed my upper body with arms to find me - which would make it wheelchair-compatible.
I will test v0.6.5 now on my desktop - maybe it is the slower CPU, maybe the USB controller that is causing this flakiness.
hi diablodale,
i have a win7 64bit with
openni 1.1.0.39 win32
nite 1.3.1.3 win32
primesense sensor kinectmod 5.0.1.32 (the avin package)
and max 5.1.8
when I create the jit.openni object, an error comes up from
tooltip: max.exe saying that the external isn't designed for Windows or has an error... in the Max window a message appears saying:
Error 139 loading external
I didn't see anybody else having this kind of issue... do you have an idea about what is happening?
thank you very much!
@Diablodale: After 2hrs testing on my desktop machine I can say for sure that CPU-power is an issue!
- I tried the patch you provide; everything worked fine, it even recognized me while sitting. When the Max window told me to strike a pose, I did so and it worked on the second attempt (saying "calibration good").
Please tell me if I can provide any specific info here.
- Next I tested it with two Kinects, which definitely works too; the glitches (lost limbs, or me disappearing totally for a moment) on output 4 are bearable, and everything is FAR better than on the notebook. I will see what happens when I try to optimize behaviour by tweaking resolution and framerates in the XML.
- A general hint: I have a glass door in my room, with frosted/uneven glass, the kind they used in the '50s to "upgrade" early-20th-century doors - this disturbs the Kinect quite a bit; the IR is reflected in all directions at once. I compared this to our modern windows with insulated glass: there the IR goes through with no reflections. So avoid old glass ;-)
@carsol:
The NITE you are using is older than the one the external was developed with. If you compare with what I am using atm: obviously they do not have to be 100% the same version number, but at least slightly newer does it.
From experiments with other Kinect-programmes I remember that not everything is compatible (I e.g. failed miserably mixing up stable/non-stable drivers).
Heads up, we are working with hi-tech in an alpha stage; some trial and error is needed ;-)
@transponderfish:
the problem is that I can't find NITE 1.3.1.4; the most similar I found was 1.3.1.3... I already tried with the latest stable version of NITE (1.4.0.5) and the latest version of OpenNI (1.3.2.1) and no luck, same error...
do you have a download link for NITE 1.3.1.4? I googled it a couple of times but nothing came up...
thanks!
@carsol
I am new to this forum, maybe it does not allow PM, maybe I have not found the button.
I can mail the nite 1.3.1.5 to you if you wish.
I have posted v0.6.6 of jit.openni object in the usual https://github.com/diablodale/jit.openni
v0.6.6 *requires* the new generation of OpenNI, NITE, SensorKinect software. Full details of the versions required are on the Wiki at the same URL. A few other features were added in this and the immediately prior version. All in the README and on the Wiki.
@transponderfish, thank you for your investigation. I see on the openni forum that people can get both working if everything is in C; I haven't seen anyone speak to XML support :-/ I suspect it just hasn't been a priority; a shame, since XML support allows configuration outside of C/C++ coding.
I have been testing on and off the past week and had pretty good results but with occasional drop outs. I think I need to build a test patch that helps me keep better track of status from the OSC messages. But, at the moment, I'm off on a trip and intend to pick up the latest version when I get back.
Hi
I have windows 7 64bit
I have downloaded the exact drivers specified in the wiki
OpenNI 1.3.2.3 for Win32
PrimeSense NITE 1.4.1.2 for Win32
SensorKinect Win32 Device Driver v0.7 (based on 5.0.3.4 Primesense driver)
I have uninstalled all kinect drivers that I had previously
Now I can install the OpenNI 1.3.2.3 driver,
but when I try to install the NITE 1.4.1.2 driver I get a message that I need to install the OpenNI 1.3.2.3 driver (which, like I said, I have already).
Anybody had this problem?
hi edsonedge.. seems today we are in the same boat...
I've been able to install (in Win7 64) OpenNI 1.3.2.3 and NITE 1.4.1.2
but my problem is that I can't execute the MSI installer from avin2's github
SensorKinect-Win-OpenSource32-5.0.3.4.msi
Windows says something like "bad installer".. did you run it successfully?
is this the file mentioned as "SensorKinect Win32 Device Driver v0.7 (based on 5.0.3.4 Primesense driver)" in the diablo instructions?
thanks!
@edsonedge, carsol; I run everything on Win7x64. I recommend removing all CL NUI, Kinect SDK, and all OpenNI, NITE, and SensorKinect software. Then go into your C:\Program Files (x86) directory and verify that you have no PrimeSense or OpenNI subdirectories. If you do, then something didn't uninstall; resolve that.
Then give it another shot. You *must* use the Win32 versions of ALL THREE components (OpenNI, NITE, and SensorKinect). The x64 versions are not supported because Max is a Win32 app.
@diablodale: thanks for the answer.. I followed your guidelines using all Win32 versions, and "SensorKinect-Win-OpenSource32-5.0.3.4.msi" doesn't want to execute on my machine.. it's like the MSI is corrupted.... weird...
but the old SensorKinect-Win-OpenSource32-5.0.1.msi works.. I mean at least I can install it, but jit.openni doesn't work...
best!
@diablodale: I have the same issue as carsol. I followed your instructions; however, the SensorKinect 32-bit version won't install, as it says it's incompatible with my 64-bit operating system. Both the OpenNI and NITE 32-bit versions install fine.
@kwijy, carsol: that is puzzling. The SensorKinect-Win-OpenSource32-5.0.3.4.msi that I downloaded and successfully installed from his github is 4,341,760 bytes in size. Do you both have the same one?
I am running Win7x64 SP1 Ultimate with all current Windows Updates on an Intel Core2Duo laptop.
If you continue to have problems installing that driver, perhaps start a dialog with avin (the driver's author) or one of the forums like the OpenNI forums on Google Groups. I suggest that because his installers have always worked for me; I've never encountered a problem.
@diablodale: hold up mate. Not sure what I was trying to install before - but I've got it up and running now. On the first test it's running well and I can't wait to get stuck into converting over from OSCeleton!
For the record, these are the files I've downloaded and installed to get it running:
OpenNI-Win32-1.3.2.3-redist.msi
NITE-Win32-1.4.0.5-redist.msi
avin2-SensorKinect-2d12967.zip (SensorKinect-Win-OpenSource32-5.0.3.4.msi)
@carsol + @edsonedge: if you have trouble finding the files, I can email them to you. Jump on the contact page on my blog and I'll rar the files up and send them across... http://chrisvik.wordpress.com/
Thanks again, dale. I'll make sure to keep you posted with what I end up doing with your tool.
thanks kwijy :)
gulp, mea culpa... the avin2 MSI was the wrong file, I made a mistake downloading! grgr...
with all the latest versions, everything works fine!
thanks all for the help!
@diablodale: everything works fine here with v0.6.6. Great!
I have tested on the desktop machine for over an hour and noticed nothing unexpected. It is amazing how stable the tracking is; I managed to confuse it a bit by moving furniture around etc., but once the pointer/focus is on me, I stay in window #4. I tried to trick it by rolling into a ball on my chair (the idea was to hide head & limbs to confuse the algorithms), but I stayed in the picture without a single glitch.
Quick tests of the output-enabling attributes: works. I think it saves a few percent of CPU (not a 'scientific' test yet), which is a great feature, of course.
And of course I am very curious now how it will perform with 2 Kinects, especially with several users. :-)
@diablodale: yeah.. same here; some hours playing this afternoon and 100% stable..
I'm getting good readings on only 15 joints; on the other ones I'm getting crazy values. Is this normal?
thanks for this great external!
A little update, while I am demystifying my notebook's behaviour:
- In my enthusiasm I did not really read the Max window's output; it gave the error message
"newobj OSC-route: No such object"
I had thought I had the same installation here (updated Max the same day), so all I watched was the patcher and the system monitor. What was missing here was the CNMAT external. (@diablodale: I do hereby suggest an update in the Wiki concerning the requirements.)
- @all: Everything is fine if you read the following after opening the patcher:
"OSC-route ("OpenSoundControl route") object version 1.17.1 by Matt Wright, Michael Zbyszynski.
Copyright (c) 1999,2000,01,02,03,04,05,06,07,08 Regents of the University of California. All Rights Reserved.
Jitter 1.7.0 installed
jit.openni v0.6.6, Copyright (c) 2011 Dale Phurrough. This program comes with ABSOLUTELY NO WARRANTY.
jit.openni v0.6.6, Licensed under the GNU General Public License v3.0 (GPLv3) available at http://www.gnu.org/licenses/gpl-3.0.html"
and after reading the XML: "dumpout: read jit.openni_config.xml 1"
- OSC output and calibration are now fine on the notebook, but there are still lots of glitches on the camera output. I am now running a complete virus scan, because I suspect either the virus scanner is running in the background or (worst case) I have caught a virus. (Blaming Vista would be too easy.)
@diablodale, I made some detailed measurements of processor load. On both machines I set the metro to 75, which gives me a good compromise between visual fluency and CPU load. The results are as follows (one player):
(I read the results from Resource Monitor, it is more detailed than what you see in Task Manager)
- Desktop, 3GHz quad-core, Win 7: ca. 11% Max, ca. 7% PrimeSense Device Development Kit. Deactivating features I do not need does not make a great difference, saving 1, maybe 2% load.
- Notebook, 2.2 GHz dual-core, Vista: 19% Max, 9% PrimeSense Dev. Here, the difference in load after deactivating features is again hard to notice.
It crashes regularly on the notebook, but I cannot really give a pattern: sometimes it runs for over 2 hours, sometimes it crashes after a few seconds or minutes.
I always get an error from Vista saying that the PrimeSense Device Development Kit has stopped working (I am translating here; my Windows is German-language).
Sometimes I get the following error in the Max Window after a crash (no idea what causes it, I cannot say if it has something to do with having calibrated or not):
jit.openni Failed updating generator nodes (Xiron OS got an event timeout!)
jit.openni data unavailable for request
(This is printed repeatedly, like being output by a loop)
I get lots of glitches in window #2 (regular cam); they resemble those on an old, defective CRT TV - distortions that are horizontal stripes for a few milliseconds. I do not need to wait for them, they are always there; the length of undistorted sequences is 2 secs or so at maximum.
I hope this helps a bit, I mostly tested unattended - staring at a machine waiting for a fail would be a bit too tedious ;-)
The virus theory weakens the more I think about it; if it was the case, the desktop would be affected too. There definitely were background processes when I made the last post, maybe the virus scan, maybe defragmenting - who knows. It was cured after a complete scan.
It can possibly be connected to the update; I think everything worked better before, but I frankly cannot be sure, as I usually do things on my desktop and later transfer to the notebook, testing only briefly here. I do not remember a crash on the notebook before.
I remember having read a warning concerning notebooks and USB-cameras in a forum when I researched how to connect two Kinects: Somebody said that you cannot be sure it will work on your machine without really trying, even when you have checked the hardware and USB-ports with USBlyzer. The reason is that the components on a Notebook's mainboard are so closely crammed that it cannot be excluded that errors occur, as the data-rate is horrendously high and the drivers are of varying quality. It sounds plausible to me, because I have made lots of experiments with different webcams and Processing. It works with the internal cam, but I cannot find a common pattern on what model will work on USB - the Logitech flagship does not, a cheapo no-name does, another cheapo does not. Max/Jitter behaves far better here, I have to say - the said Logitech works.
hi all..
I saw some mentions in the thread about OSC-route...
if you don't like it, you can remove the slashes from the output of jit.openni with a [regexp (/) @substitute " "] and then you can use the standard route object...
cheers!
@carsol: OpenNI supports 24 joints. However, not all joints are supported by NITE, nor are all stable; e.g., they document that leg tracking is unstable and noisy. I recommend using the confidence values to aid in filtering out the worst values, and consider using smoothing.
@transponderfish: OSC-route is not a requirement for jit.openni. Instead, I use the OSC-route external in the example patcher I provided. You can process OSC output however you would like, ignore it, or even disable it.
@transponderfish: In your last posts I saw you were not using the required versions of OpenNI, NITE, and SensorKinect. Have you updated to them now? The errors you are experiencing are due to lower-level components (like OpenNI and SensorKinect) that I don't have control over. I try to catch errors when I can. One error that I catch is "Failed updating generator nodes". That is my code seeing lower-level software failing and trying to skip the failing code so I can continue running. That particular error is severe: it means that when I call the API to get updated data from OpenNI, OpenNI tells me it can't give me anything and is failing. That suggests to me that there has been a complete breakdown in some lower component. You see this error repeated because you will see one such error log for each bang you send jit.openni.
@transponderfish: Receiving all feeds from jit.openni is compute and USB intensive. Put the Kinect on a USB controller of its own. I also recommend receiving (aka generating) only what you need by editing the config XML file. Disabling it in XML is the best way to reduce load. OpenNI and Kinect send and process data if you have it configured in XML. Turning it off using the jit.openni attribute does not yield the same reduction in load. The attribute only disables creating and outputting the jitter matrices; in the background Kinect and OpenNI are still calculating the massive amounts of data. This attribute functionality is desired because there are times when a person might want the skeleton output but not need the depthmap output. In this case, a depthmap generator is required (for NITE to calculate skeletons). Because I have the attribute functionality, it allows you to save a few CPU cycles by disabling the output of jitter depthmap matrices rather than the default behavior which is to generate output for anything configured.
@diablodale: Yes, I have updated everything. I tried to be verbose as an aid for you debugging and optimising. Thus, I have used everything "as is" (except for metro 75) and I document my own errors as a help for others.
I have now done a test on the desktop PC, running 2 Kinects using the trick described above:
- Resource Monitor displays 19% for Max, 8% for xnSensorServer.exe/PrimeSense Development Kit. This is less than I expected; the total CPU load shown in Task Manager is around 35%, with spikes to 40%.
There is another 11% used by system interrupts (displayed as "Systemunterbrechungen" - why the $%&/?! do they have to translate specific terms???)
- The behaviour is again flakey, to be specific on output #4 (silhouettes). My first assumption was that the IR projections were interfering, thus I set up the 2 Kinects back-to-back, pointing in two opposite directions. It gets a bit better, especially when one user is seen in total. (I have tested with 2 persons, each on one of the Kinects.)
- The OSC-route display is very valuable for assessing what is happening here: in the Max window I see it counting up to user 10 quickly, but it is not able to keep tracking who is who ("I see x" / "I lost y"). I assume both Kinects are sharing the same process, which totally confuses OpenNI's tracking.
- I have not yet experimented with two XMLs, but I will do.
- Switching off Kinect B in the patcher cures it.
- Right now (after running for 30+ mins), the IR window of Kinect B is behaving erratically (the size is "pumping" between the normal reduced size, full scale, and a bit smaller).
About my notebook: yes, I have assumed that it is a lower level component thing - I do not remember reading anything about Vista-compatibility anywhere. And we can still blame it on its mainboard ;-)
@diablodale: another question: is there a way of getting a 4-plane char matrix of the depthmap?
I was expecting that - the typical colored depthmap.. it isn't that I spent a lot of time, but in the first tests I did with the greyscale depthmap I can't really simulate a good 3D mesh image; I get a really flat mesh with a small range of Z values.... you know what I mean? my English, you know.... :)
or maybe I can post a couple of sample images from previous tests I did with a colored depthmap and the one with your external...
cheers!
A question for everyone, I have a build ready to release that has 3 output formats settable by attribute:
1: the current OSC format (default)
2: a Max route-object-compatible format, the same as (1) without the slashes
3: OSCeleton legacy format (no orientation, no normalized values)
For output #3, should it be the normalized values or OpenNI native "raw" values?
When OSCeleton was first written, the author didn't understand that the OpenNI values were millimeters. Instead, they watched the sample output and chose to apply a normalizing formula. This formula can be troublesome as it doesn't support the full range of Kinect data. It's also based on a false assumption the author made. The author did the best they could at the time. However, that mistake is now legacy.
I can copy the normalizing formulas from his codebase and apply them only in the (3) format output. *OR* I can leave the OpenNI (aka Kinect) values in their native "raw" values. This would be equivalent to using the "-xr" switch on OSCeleton.exe
@carsol: I do not intend to create a 4 char depthmap. There is no data that would fill a 4 char matrix. Kinect and OpenNI return a single long integer for each (x,y) pixel. There is only one plane of depth data. I already support changing the output format of the depthmap to a long, float32, or float64. All of them 1 plane data.
@carsol: By combining the output of the depthmap matrix and the imagemap matrix in your own patcher, you can create voxels and display them in amazing 3d mesh images using OpenGL.
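If it helps as a starting point (a sketch, not the only way): use [jit.expr] to build a 3-plane float32 matrix of X, Y and depth-derived Z from the depthmap, feed that to [jit.gl.mesh] (e.g. @draw_mode points or tri_grid), and apply the imagemap as a texture. The Jitter OpenGL tutorials cover the mesh/texture side.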
hi diablodale..
nice to have these new features!
about your question i think its interesting to have them with the oscleton normalize formula, so for those who work before with OSCleton can migrate to your object easyly...
i dont see the way to change the output format for out1 to float32 or char.. ?
if i want just get depth and image output i need to erase the other nodes in the xml file right? i did it but the outlets 3 and 4 of jitopenni keep banging is this normal?
thanks!
@diablodale
More tests done (quadcore), but first I have to correct an error in my post from yesterday: I did not mean the IR-window, I meant the depth-map window.
- I left the two Kinects unattended yesterday (back to back as described), for more than 5 hrs not a crash. Yeah!
- today, I checked my hypothesis that the players between both Kinects would be confused: I was wrong. I checked by changing the two "print OSC" in my testpatch into "print OSC_A" and "print OSC_B". I saw that both devices/processes are cleanly separated.
- Still, the behaviour is rather nervous (as described above, players are lost quickly and assigned a new number), nothing compared to one Kinect running alone, where things run very smoothly and the system reacts very forgivingly (it even recognizes me as a player while sitting and only waving one arm). I found out that it helps when the player is in full view (all limbs clearly seen). I checked this with two players (each on one Kinect).
- I used two XML configs, in the second I renamed all items (e.g. Depth2).
I also tried to save CPU by deactivating nodes, but of course, since I am interested in output #4, I will definitely need these...
- Somehow I produced the following error; I hope it helps (I think in this case the screenshot is more useful than typing it out).
New version v0.6.9 is up at the usual https://github.com/diablodale/jit.openni and the wiki was updated to document the new feature areas: alternate user/skeleton output formats and user center-of-mass output.
@transponderfish: that's a cool assertion. ;-) That particular assertion is within Max itself, far beyond my code. An assertion is where a developer asserts (normal English meaning) that some logic should always be true. If it isn't true for some reason, the code "throws an assertion". I recommend you submit this error to the Max support team. It is possible that my code somehow triggers this in their code. Regardless, the assertion should never occur and should instead be caught in error handling - error handling that I can then catch and manage. Perhaps it is something they would like to see so they can address it in Max 6.0.
@transponderfish:
I haven't done extensive testing using multiple Kinects; however, I've been in performance situations where 4-5 Kinects were set up in the same spot controlling different computers. We had to cover up the Kinects that weren't being used for each performance, as it seems the IR signals interfere with each other: clearly visible distortion around the edge of the user's 'cutout' image where it should be solid, losing the user easily, losing limbs, and in general very jumpy data (this was using OSCeleton, however). I'm not sure if this is related to your issue or not.
quote transponderfish:
"- still, the behaviour is rather nervous (as described above, players are lost quickly, assigned a new number), nothing compared to one Kinect running alone, where things run very smoothly and the system reacts very pardoning (it even recognizes me as a player while sitting and only waving one arm). I found out that it helps when the player is in full view (all limbs are clearly seen). I checked this with two players (each on one Kinect)"
@diablodale:
I know very little of Jitter (I'm coming from an audio background), but for the past few days I've been spending a lot of time going through the tutorials and examples to find a way to do what you've mentioned below, with little success.
I'm trying to layer 3D objects together with the RGB image, so the objects may pass behind and in front of my body, but so far I'm at a loss as to how to do it. I've looked into chromakey masking, but I'm struggling to get any results, and quite possibly I'm barking up the wrong tree.
Making a mesh sounds like a much better way to go. Would you mind pointing me in the right direction in terms of which objects are needed to create the 3D mesh?
cheers!
quote diablodale:
"By combining the output of the depthmap matrix and the imagemap matrix in your own patcher, you can create voxels and display them in amazing 3d mesh images using OpenGL."
@diablodale:
Just a report on the external: I've had it running mostly without issue. I've had a couple of instances where the RGB image has ended up (after even only half an hour to an hour of running) with green and purple(ish) chunks of noise and distortion that take up around 50% of the image.
The skeletal tracking seems a bit jumpier when this occurs. It's fixed by simply closing the patch and re-opening. No error messages pop up.
@all:
- I have triple-checked that both Kinects are on different USB controllers, both USB 2.0. If I use other inputs/controllers, they simply do not install (this shows in USBlyzer as an exclamation mark in front of the device)
- to check if there is IR interference between the two devices (they are set up back-to-back, but who knows; the human eye does not detect IR, and maybe something is reflected somewhere), I have modified a test patch for IR (with two XML configs, everything renamed, e.g. "IR2")
Results are:
# nope, there is no interference. I checked this by covering Kinect_B with a cloth (a newspaper is a reflecting surface and thus not so good), then viewing the IR output of Kinect_A. To make sure nothing eluded me, I boosted/brightened the window with a "jit.op @op * @val 1.5".
# the jumpy behaviour starts _exactly_ when Kinect_B is activated in the patch (not when the XML is loaded); when it is switched off, things go back to normal (switching it off in the patch does not stop the IR projector, as you can see).
@diablodale:
I may have found a bug in the XML configuration. When I tried to find out if anything was wrong with the IR node, I saw that both of these lines are mandatory:
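(The XML lines themselves did not survive the forum posting; judging from the MapOutputMode and mirroring discussion that follows, they were presumably of this shape, with the exact values being an assumption:)

<MapOutputMode xRes="640" yRes="480" FPS="30"/>
<Mirror on="true"/>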
If one or both of them are missing, the following error is produced:
"jit_openni: XML config initialization open failed (Device Protocol: Bad Parameter sent!)"
And: it does not mirror.
Next, I will experiment with the smoothing and confidence options.
May I butt in?
How does it run on Snow Leopard?
What do I need to do?
@diablodale - thank you very much, great contribution
Included is a simple demo which demonstrates mapping the jit.openni output onto a point mesh with texture.
@yair r.:
awesome. thanks very much for sharing!
Edit: I've noticed the mesh seems to be inverted on the z axis. I really have no idea what I'm doing, but I managed to change it by altering the jit.expr:
jit.expr @expr -snorm[0] -1.*snorm[1] in[1] norm[0] 1.-norm[1]
I've inverted "snorm[0]" to "-snorm[0]", then rotated 180 degrees on the x axis via the jit.gl.handle object. That seemed to fix it for now.
Edit: nvm =) Setting the "amp" to a negative number also does this.
@Kwijy - i blame the heat and the late hour :)
A different approach is to "flip" the Z axis in the third expression:
expr snorm[0] -1*snorm[1] -1.*in[1] norm[0] 1.-norm[1]
@all - I find it difficult to close the patch. Hitting alt-F4 or close will freeze interaction with the patch. Weird.
@diablodale: Yesterday I took the two Kinects and my notebook with me and tested your new version in a place that provides more of a "laboratory condition", i.e. more ample space and less "visual noise" (no bookshelves & stuff) around. We were two people, each moving in front of one Kinect.
The results were better than expected. I had two crashes (test time 1 hr), and the user tracking is anything but stable (it loses a user and finds another one all the time, although it is the same person, as described above), but it is not a catastrophic problem; most of the time we were recognized, and if one of us was lost, he was there again after a second. Lost limbs were not a big issue.
And it was really a thrill to remember the technical leap from our first Ataris to our first PCs. That was not ages ago, and now I can carry the equipment for such extraordinary computational tasks in a backpack.
- I tried giving Max and the xnSensorServer.exe the highest priority possible in Task Manager, but did not notice an effect.
- Also, I did not see an effect after changing the values for skeleton smoothing etc. So these only affect the OSC output, right?
@yair: Thank you for sharing! Here, the patch closes fine, no problems noticed (and it runs stably).
What seems weird to me is the behaviour of the jit.openni outputs. I cannot get the cam (#2) and the user tracking (#4) working at the same time. (Plus, the OSC output only updates when the Max window has focus.) I assume there is something I do not know about Max/Jitter. If somebody could point me in the right direction, I would highly appreciate it.
@kwijy, the noise and distortion you describe is very odd. I am unsure if I could be the cause of it. The code I've written is a very thin layer that exposes OpenNI to Max/Jitter. The code is very repetitive; it's a loop doing the same thing each time you bang it. Given that, I think any distortion/coloring would be immediate if it were within my code. I'll think on it more. A possible scenario is a memory leak or other coding problem in the OpenNI, Jitter, or SensorKinect code. Or it could even be in your graphics drivers (Max uses the OpenGL portion of your graphics driver, which tends to get less rigorous testing on Windows compared to Mac). It's not that I want to point fingers at someone else; rather, I think I do almost nothing that would slowly deteriorate the graphics like that. There is one section of code where I very heavily manipulate the RGB values from OpenNI into a Jitter-compatible matrix. However, any error there should show up immediately on the first bang and continue on all bangs.
@transponderfish, problems with XML config reading should be reported to the OpenNI group. My code never opens or looks at the XML file; instead, I ask OpenNI to do it all and rely on their functionality. It is odd that your computer needs MapOutputMode; I have never needed it, and I always mirror. It could be that two plugged-in Kinects cause lots of problems in SensorKinect and OpenNI. The authors of each may not be doing much (or any) testing of that scenario. I know I haven't, because I don't have two Kinects. Please do report any OpenNI XML configuration bugs to the OpenNI Google group at https://groups.google.com/forum/#!forum/openni-dev
@livemax, this Max external is Windows-only. I welcome a Mac developer to join me in updating any needed code to make it cross-platform. Until then, there are other methods available for Mac users, like jit.freenect.
@carsol, you also are experiencing an assert in Segmentation.cpp. That's Max's code and should be reported to the Max support team. Meanwhile, what were you doing in Max right before this assert occurred? Also, I have noticed that neither you nor transponderfish uses English as the primary language on your computer. So far, you are the only two who have reported this problem, and non-English settings are what you have in common. If I could understand what these assertions mean (Max will have to tell me), then I might be able to reverse-think it.
@carsol, thank you for reporting that outlets still bang even if they are not listed in the XML config. For now, that is expected behavior. It is due to the Max Jitter APIs for MOPs (matrix operators) that I use. Since these outlets exist, they default to a 1x1 matrix, and later their APIs output whatever matrix is current, even if it is a blank 1x1 matrix. I have requested, in another post on this forum, the code in their API so that I can modify it to not output on outlets that I disable. Unfortunately, they have not replied...yet.
quote carsol:
"If I want just the depth and image outputs, I need to erase the other nodes in the XML file, right? I did that, but outlets 3 and 4 of jit.openni keep banging. Is this normal?"
@transponderfish, correct: the smoothing and confidence attributes affect only the user/skeleton output. If you have the XML configured and have not disabled the outputs, it should be very easy to get the imagemap (#2) and the user/skeleton (#4) outlets working at the same time. Every test I run outputs depthmap, imagemap, and user/skeletons; for me, it's the most tested thing. What do you mean by OSC output? If you mean seeing the OSC messages printed in the Max console window, then not seeing them update is normal: Max gives updating that window a very, very low priority. Not to worry; the OSC output is being sent out the outlet, and your patch can act on it.
@diablodale: OK, I will send the assertion error to C74 support.
Some time ago I found a bug in Jitter (already fixed) related to the international settings of Windows: if I set Spanish for international conversions, Jitter patches crash; if I set another country, everything works fine. I don't know if there's a connection with this assertion, but anyone having many issues with this (I have only one crash at the moment) can try changing the international settings in Windows.
@diablodale: I started to dive into the XML config file to do some mods, but I'm getting constant crashes of Max if I change the XML file on the fly... do you?
Besides this, I'm having a different issue: if I add this line to node Image1 or Depth1 in the default XML file:
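(The line itself was stripped in the forum post; given the QVGA/60 fps discussion below, it was presumably a mode line along these lines, values assumed:)

<MapOutputMode xRes="320" yRes="240" FPS="60"/>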
I'm getting an error in the Max window saying something about a bad parameter...? And no image is coming to the outlets...
If I comment out the depth node entry with (), I still get the depthmap image on the first outlet of jit.openni, but if I comment out the Image1 node, the image stops coming, as expected...
What am I doing wrong? Are there any special considerations for editing the XML file?
thank you very much!
@carsol, I use Windows 7x64 Ultimate. I should be able to switch my primary display language to Spanish. Do you think this would simulate your setup?
@carsol, the Kinect itself and SensorKinect do not support all possible values of xRes, yRes, and FPS; the possible values are more restrictive than for the PrimeSense sensor. It is possible this combination is disallowed. Also, if there are both an IMAGE and a DEPTH node, then both must have the same resolution.
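For example, matching modes on both nodes would look something like this sketch (structure per the standard OpenNI sample config; 640x480 at 30 fps is the usual Kinect mode, not values taken from your file):

<Node type="Depth" name="Depth1">
  <Configuration>
    <MapOutputMode xRes="640" yRes="480" FPS="30"/>
  </Configuration>
</Node>
<Node type="Image" name="Image1">
  <Configuration>
    <MapOutputMode xRes="640" yRes="480" FPS="30"/>
  </Configuration>
</Node>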
@carsol, you discovered my primary reason for the output_depthmap attribute. It is likely that when you removed the DEPTH node, you retained the USER node. In this scenario, the USER generator itself requires a DEPTH node and automatically creates one. This is all done automatically by OpenNI and NITE. I call an OpenNI API that reads the XML and initializes OpenNI; after that call, I receive back a list of generators. In your case, it was likely a USER and a DEPTH (which the USER node required). I also noticed this behavior and recognize it as intended by OpenNI. However, I saw an opportunity to reduce some CPU load by allowing you and me to disable my code which does the matrix calculation converting that unwanted depthmap into a Jitter-compatible format.
@carsol, repeatedly reloading the XML on the jit.openni object is unreliable and prone to crashes. Known issue, please see details at https://github.com/diablodale/jit.openni/issues/4
@diablodale: the bug was with my old machine, WinXP 64-bit and Max 5.1.4; I don't know if it happens with Win7 and Max 5.1.4. You need to set "Spanish" in the regional settings (where the config for date formats etc. is) and you will get a straightforward crash in Max patches with Jitter objects...
About resolution modes: oh yeah, the Kinect doesn't support QVGA; what a pity for latency maniacs... Have you tested your external with the PrimeSense sensor? Do you think it will work? It seems PrimeSense has some advantages, like 60 fps at 320x240...
If I want the best performance for skeleton tracking only, I need only the USER node, and I should set the output_depthmap attribute to 0 and also the other outputs (image, IR, and user) to 0... right?
Following up on the XML issues: if the comments on the IR node are removed, I get a bad parameter error... is it only me?
thanks!
@carsol, to be able to assist you, I request that you visit https://github.com/diablodale/jit.openni/issues and open an issue for each of your XML problems. Please include a full description of your OS, Max software, specific OpenNI-related software and versions, hardware with all version numbers, languages, etc. Then include reproduction steps: what is installed, any code running, steps that I should take to reproduce the problem, etc. Then include the full XML with which you are experiencing problems.
You can look at issue#1 as an example of the information that I request https://github.com/diablodale/jit.openni/issues/1
Yes, if you want the least amount of code to run and still get skeleton tracking, then I suggest you have (see the sketch after this list):
1) only a USER generator node in your XML
2) disable all the output attributes except for skeleton
3) disable the orientation data if you don't need it
4) set the confidence filters to at least 0.6; in the current version of NITE, confidence values are only ever 0, 0.5, or 1.0.
5) only bang jit.openni as many times/sec as you need.
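A minimal sketch of such an XML, assuming the standard OpenNI layout and the public NITE license key that ships in the OpenNI sample configs (adapt to your install):

<OpenNI>
  <Licenses>
    <License vendor="PrimeSense" key="0KOIk2JeIBYClPWVnMoRKn5cdY4="/>
  </Licenses>
  <ProductionNodes>
    <Node type="User" name="User1"/>
  </ProductionNodes>
</OpenNI>

Then set output_depthmap to 0 on the jit.openni object so that the depthmap the USER generator auto-creates (as I described above) is not converted into a Jitter matrix.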
@yair: Thank you for the sample mesh code! VERY cool.
Can someone explain to me why the mesh consists of a couple of different flat planes that overlap? Why isn't my head round, for example? :)
OH, I see now - I have to set the AMP parameter.
Ah, the mesh code works MUCH better if you use the values directly from jit.openni rather than from the [jit.op @op / @val 22] object. Dividing by 22 limited Z to one of about 11 possible values, making a sliced effect. My head is now round.
@diablodale: at the moment it isn't a big deal not to have IR, but if you want, I'll open a case on your GitHub page. And thanks for the tips!
@all: does it happen only to me that with yair's patch (which connects a float32 matrix to the first outlet) the image has a lot of flicks and glitches? If I connect a char matrix I don't have this issue, but then there are others...
thanks!
"it happens only to me that with yair patch (that connects a float32 matrix to the first outlet) the image have a lot of flicks and glitches? if i connect to a char matrix i dont have this issue, but then theres others.."
Yes, I'm also seeing "flicks and glitches". I expect using a char matrix limits some of this by quantizing the Z values (like the /22 jit.op) so small changes aren't noticed. The drawback is that you loose some of the smoothness.
Perhaps setting the confidence filters higher in the XML file would help.
@all: and nobody else? glitch glitch from depth output to float matrices?
@dambik: connecting a char matrix, you get a strange effect: the image does a kind of white rotation as you move closer to and further from the camera... grgr, I don't know how to explain it, but give it a try... I guess it's something about long values going into a char's 0-255 range?
By the way, the confidence filter is only for the user/skeleton outputs.
best!
"connecting a char matrix you will get a strange effect, the image does some kind of white rotation while you moving close/further to the camera,, grgr dont know how to explain it, but give it a round... i guess is something about long values going to char 0-255?"
I expect the Z value in a char matrix would be continually wrapping around back to zero as you approach the camera, i.e., 255,254,...,2,1,0,255,254,...,2,1,0,255,..., etc. Since Z is in millimeters, that amounts to a range of every 10 or so inches before it wraps around once again.
Until someone provides me a repro(duction) case, I can't track down or explain much. So far, all the behaviors people describe here are either expected behaviors or so unclear that I can't assist.
If you think there is an issue, I encourage opening a new issue with lots of details at https://github.com/diablodale/jit.openni/issues
Lots of details would include the patch itself, and even a screenshot so I can see what you are seeing on your screen.
Help me help you.
@diablodale: you are right.. I don't know where the issue is, whether it's in the external or in the way the matrices get transformed.. or it's just the way it is..
In the picture you can see a white spot on my face; if I go closer to the camera, it disappears (goes to grey), then the white comes back and disappears again, and so on, wrapping around...
and here is the patch.
thank you very very very much!
I've been working on a Max application that applies a video effect at the location of a user's right hand when the hand approaches the camera to within a certain Z distance. This works, BUT at times the skeleton values seem to become "stuck" and stop changing for a few minutes. I can still see the depth map changing properly at these times, but the OSC skeleton values returned do not. Is there an error situation where invalid OSC skeleton data could be returned? (Is the skeleton data stored in a queue, by any chance? Sometimes it acted as if data wasn't being removed fast enough from a FIFO queue and was falling behind.)
Hi Guys,
Segmentation.cpp is not a C74 managed source code file.
Cheers
Andrew
Please ignore my previous post - boneheaded coding error on my part.
I have noticed the RGB image has a strange offset on every other row of pixels at 640x480, which would degrade any computer vision used on that matrix: http://i.imgur.com/J7YR1.png
After some research, it looks like a known issue to do with the driver: http://openni-discussions.979934.n3.nabble.com/OpenNI-dev-Quality-of-Kinect-RGB-Camera-td2678960.html
It sounds like the missing step will be added to Avin's driver at some point, but until then I've found this shader which might be able to correct the image: http://graphics.cs.williams.edu/papers/BayerJGT09/#shaders Unfortunately I don't know where to begin using that shader (and specifically customising the settings for the Kinect's particular issue).
@bferns
We had this discussion a few years ago. The Bayer mosaic is an inherent problem with any single-sensor color camera. I've included Andrew B.'s code with the relevant shader.
Still, doing it on the GPU will help aesthetically, but the readback to the CPU for extra processing takes a performance hit, last I checked.
Hi Yair, thanks for that patch & shader; it will certainly work as a stopgap until it's implemented in the driver (I'm using 1. 2. 1. as the shader settings). It doesn't impact performance too much for me: I'm still getting 60 fps while running some jit-ogre models and videoplanes.
@carsol, all the behaviors you describe, and that I see in the patcher you provided, are expected behavior.
top-left: the matrix output from the depthmap is, by default, a 1-plane long matrix. The values that OpenNI and Kinect store in the cells of that matrix vary between 0 and 10000. When you force it into another matrix format with your jit.matrix, you lose a large amount of data. A char matrix cell holds only 8 bits (0-255). Since the max value in decimal is 10000, you need 14 bits to hold the info. Bits 9-14 are thrown away by Max in the forced jit.matrix conversion, and you are seeing the remaining lower-order bits of information.
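To make that concrete with a worked example: 10000 in binary is 10 0111 0001 0000, which needs 14 bits. A char keeps only the low 8 bits, 0001 0000 = 16 (i.e., 10000 mod 256 = 16). So a cell at the 10 m limit displays as 16, and the displayed value wraps around once for every 256 mm of depth.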
top-middle: I see tiny changes in values all over the screen. This is expected behavior. The depth sensor is trying to measure in millimeters, so even the slightest movement is seen. Also, even though it gives values in mm, its measurements are not that precise; you see variations from frame to frame due to the technology. When surfaces are hot or very reflective (mirrors), you can see a lot of variation in values. All expected behavior.
top-right: in this example, you are using math in a jit.op to remove the lower 3 bits of information, shifting the values right by 3 (a divide by 2^3 = 8). The remaining bits are then used for display. All expected behavior due to your math.
bottom-right: expected behavior. You have the bits shifted right by 3 (the same as dividing by 8) and then normalized for display in the window. Any subtle flickering is an inherent, expected behavior of the Kinect sensor as it tries to produce millimeter-precise values from a $150 device.
@Andrew Pask, earlier in this thread transponderfish posted an assert screenshot which identifies, in text:
Program: c:\program files\cycling '74\Max 5.0\Max.exe
File: .\Segmentation.cpp
line: 5250
Expression: xOfAreaPercentage20 >= 0 && xOfAreaPercentage80 >= 0
If you are not the owner of this 5000+ line file (wow, that's big), do you have an idea who is? Is it a file that is part of msvcrt.dll? Part of OpenGL?
The two asserts we've seen look like either thread-related or graphics-related code to me.