
here is my working jit.openni max jitter openni external for windows

June 23, 2011 | 2:06 am

I have written a Windows Max Jitter external for OpenNI. I reached a
major milestone today.

https://github.com/diablodale/jit.openni is the location of the
project and within the bin directory are the Win32 external, XML
config file, and a sample patcher.

It has been casually tested with SensorKinect and OpenNI binaries.
Please see the README file for install instructions.

Currently it supports:
-ImageMap of RGB24 output in a 4-plane char jitter matrix
-DepthMap output in a 1-plane long, float32, or float64 matrix

Its output is very similar to jit.freenect.grab and therefore it can
often be used in its place with small changes. My object outputs depth
in mm and jit.freenect.grab outputs in cm. A simple math op can
resolve this. Note, my object does not provide the "raw" values of
jit.freenect.grab; instead it provides the mm depth values via OpenNI.
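
(For example, assuming a float32 depthmap, something like [jit.op @op / @val 10.] between the depth outlet and a patch expecting jit.freenect.grab-style centimeters should do the conversion; an untested sketch.)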

I would like to see support for other generators (skeletons, gestures,
hand point, etc.) in the future added by myself or with the assistance
of others.

If you find problems and can reproduce them with clear steps, I
encourage you to open an issue at https://github.com/diablodale/jit.openni/issues


June 23, 2011 | 3:12 am

Awesome! ;)

Do I have to throw out the MS SDK for this to work?


June 25, 2011 | 12:06 am

I suspect you would need to uninstall Microsoft's Kinect device drivers, since OpenNI needs OpenNI-compatible drivers (e.g. SensorKinect).

This is a Win32 Jitter external. No other Win32 Kinect external exists of which I know. I wrote this because I needed an external and could no longer wait for Microsoft's SDK. Also, this external is intended to work with any OpenNI device.

I’ve posted an update. It now supports rgb, depth, and ir cameras with outputs in flexible matrix types. Next up…skeletons.

It is likely I will port this to Microsoft's SDK; I like that their calibration doesn't require a pose. However, my priority is getting an object working and tested for my next art installation.



Ed
June 25, 2011 | 3:53 am

I tried to install but kept getting an error that OpenNI.dll wasn’t found. I copied the OpenNI.dll file to the local directory but that didn’t help.


June 26, 2011 | 3:13 pm

@diablodale

Awesome

Yeah, I think not needing a calibration pose with the MS SDK makes it a lot better for interactive art installations.

I have heard some say that the NI skeleton is faster, but on my computer (Windows 7 64-bit) the MS SDK skeleton seems faster (I haven't tried it with many applications, though).


June 28, 2011 | 3:06 am

dambik, would you please send me detailed repro steps as well as your computer software/OS setup? For example (this is only an example; you will need to write your own setup and repro steps)…

setup—
Windows 7 Ultimate x64 SP1
Max/MSP/Jitter 5.1.8 for Windows

repro steps—-
1. I first installed OpenNI 1.1.0.39 for Win32
2. Then I installed PrimeSense NITE 1.3.1.4 for Win32
3. then I installed PrimeSense Sensor KinectMod 5.0.1.32 for Win32
4. then I copied the jit.openni.mxe, jit.openni_test1.maxpat, and jit.openni_config.xml into the same directory.
5. then I double clicked on jit.openni_test1.maxpat

result—
I got an error from (windows, max, etc) saying "blah blah"

expected–
blah blah blah



Ed
June 28, 2011 | 10:47 pm

I uninstalled everything and reinstalled as outlined below. The OpenNI.dll not found error is gone, but now there's an XML initialization error. Is the Kinect SUPPOSED to appear as three unknown devices in Device Manager? Or are there other drivers I'm supposed to install?

Windows Vista x64 Ultimate SP2
Max/MSP/Jitter 5.1.8 for Windows

Downloaded and installed the following in order:
OpenNI-Win32-1.1.0.41-Redist.msi
NITE-Win32-1.3.1.5-Redist.msi
Sensor-Win32-5.0.1.32-Redist.msi

Extracted all files from diablodale-jit.openni-1cd3781 and ran jit.openni_test1.maxpat from /bin directory

Clicked on read jit.openni_config.xml message object

Max then returns the following error:
jit_openni: XMLconfig initialization failed (One or more of the following nodes could not be enumerated:

Device: PrimeSense/SensorV2/5.0.1.32: The device is not connected!
Image: PrimeSense/SensorV2/5.0.1.32: Can’t create any node of the requested type!
Device: PrimeSense/SensorV2/5.0.1.32: The device is not connected!
)



Ed
June 29, 2011 | 10:43 pm

OK, I've got jit.openni working fine now. I needed avin2-SensorKinect-28738dc.zip for the necessary Windows drivers. Was Sensor-Win32-5.0.1.32-Redist.msi the wrong package to install?


July 1, 2011 | 12:12 am

I use the same install components written up in the README; part of the distribution at https://github.com/diablodale/jit.openni

I always encourage people to look there because that will have the current info while the very post I’m writing now will become out of date.

You appear to be using different OpenNI and SensorKinect versions than I am. At the moment, as in the README file, I am using
- OpenNI 1.1.0.39 for Win32
- PrimeSense NITE 1.3.1.4 for Win32
- PrimeSense Sensor KinectMod 5.0.1.32 for Win32

Good to hear it's working for you. Skeleton support will likely start showing up in the codebase next week. I have been reading the needed sections of the two SDKs, thinking through my coding approach and how the output of the data should look.

Do you have an opinion on how the skeleton data should be output? My first thought is to output some block of data which would hold all joints and all locations for those joints.

If I used a matrix of type long, I would require the patcher to know the integer-to-friendly-joint-name mapping. I hesitate at this approach.

If I used a long list where the 1st element is the friendly name for a joint, the next is the location of that joint, the 3rd is the friendly name for another joint, the next is… and so on. This would work but then requires list management to split it up into separate messages (zl iter 2) for storage and action. I lean towards this approach.

Anyone else have an opinion on how the external should output skeleton data?



Spa
July 1, 2011 | 4:31 am

Not having access to a PC because I'm moving now…
Thanks for the good work.

From what I tested in OSCeleton, it's difficult to know in the series of joints when a new frame is starting,
and it sends only active joints independently, with the user ID appended.

I sometimes reformatted the TUIO into a 1-line list, sending it as 1 frame through OSC from my tracking analysis to the main patch, then cutting it with [Lchunk |]. This way, I can store all the data in 1 list for coll…
| frame 145 | active 0 2 4 5 | 0 0.345 0.567 | 2 0.24 etc…

perhaps an approach with:
frame 145 (or new frame)
user 1
active left_hand right_hand etc…
left_hand 0.23 0.45
right_hand 0.67 0.78
user 2
etc..
bang (end of sequence)

that could easily be repacked with a subpatch:
| 1 145 | active left_hand right_hand | left_hand 0.23 0.45 | right_hand 0.67 0.78 |
| 2 145 | etc…

What seems important to me is to be notified by your external of the beginning and end of the frame.
And the fact that some joints appear or disappear makes it difficult to pack into a numeric-only list or matrix.

my 2 …



Spa
July 1, 2011 | 4:54 am

By the way, I would be really happy to have an external that could output at the same time:

depth map + skeleton (+ eventually the rgb).

Spawning 2 parallel processing chains of the Kinect data for a richer output:
depth map > fluid3D analysis in matrixes
skeleton > joints processing (interact, 3D object, particles…)



dtr
July 1, 2011 | 10:06 am

nice! anyone up for making a Mac version? i wish i could do this myself…



Ed
July 3, 2011 | 4:12 am

"You appear to be using using differerent OpenNI and SensorKinect versions that I am using."

I actually looked for the same versions listed in your README file but couldn't find them for some reason. Perhaps I was looking in the wrong place? Luckily, the most recent versions (unstable) worked fine.

"Do you have an opinion on how the skeleton data should be output?"

I do like the simplicity of being able to use ROUTE and/or UNPACK objects to parse the data. Maybe a fixed-size implicit list with a header (sequence number, player number) and a series of location coords?


July 3, 2011 | 9:27 pm

Diablodale, a BIG thanks for doing this :-)

@officeplus: google "register dll"; it is necessary to tell Windows where a DLL is located.



Ed
July 3, 2011 | 11:03 pm

"Officeplus" appears to be a spambot parroting random posts and inserting commercial links.


July 4, 2011 | 9:05 pm

v0.5.0 is up on GitHub. I now output user pixel maps, aka xnGetUserPixels(). This was a good stepping stone on my way to supporting skeletons.

I am going to try a few skeleton output approaches privately and then release one of them for everyone’s feedback. After I hear your feedback, I may change the way skeletons are output and it may not be backwards compatible.

** I don’t promise any backwards compatibility as this is all prelease pre-version 1.0 coding. **

I recognize that I can’t predict everyone’s uses. That is why I want your feedback. I ask that you consider I may not be able to accommodate everyone’s wishes. However, I do hope the final skeleton output can be massaged by Max to be whatever you want it to be.



Ed
July 6, 2011 | 10:58 pm

COOL! Got a user map after doing a quick calibration dance. Very nice. How do I get the user ID value(s)?

BTW, what’s the purpose of the jit.op operators in the example?


July 7, 2011 | 7:48 am

v0.6.0 is now up on GitHub in the normal place. I now output user skeletons.

This preliminary version uses an OSC format and leaves all the values as floating point, as OpenNI outputs them.

/userid/jointname x y z confidPosition x1 x2 x3 y1 y2 y3 z1 z2 z3 orientPosition
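
(As an illustration with made-up values, a single message for user 1 might look like: /1/right_hand 230.1 455.7 1834.2 1.0 0.99 0.01 0.03 -0.01 0.98 0.12 -0.04 -0.11 0.99 1.0 — that is x y z, position confidence, the nine orientation values, and orientation confidence; check the actual output in the Max window.)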

This release also has attributes to filter data based on position or orientation confidence, to show or hide the orientation data, and to smooth skeleton data using OpenNI's smoothing API.

Caveats:
- I do not yet output user seen/lost, calibration started/failed, or pose events via OSC.

Questions:
- Are you getting valid skeleton data?
- Do you like this OSC format for skeleton data?
- For position data, should I keep it as a float or change it to an integer? If change it, why?

@dambik, the jit.ops are scaling values so they look like a smooth ramp of greyscale rather than psychedelic waves of black/white.


July 7, 2011 | 2:46 pm

I got v0.6.0 working – happy :-)

To share my installation experiences:
- I did not find the versions from your readme on the OpenNI site, but these did it (Win 7 Ultimate, 32 bit):
OpenNI-Win32-1.1.0.41-Redist.msi
NITE-Win32-1.3.1.5-Redist.msi
avin2-SensorKinect-28738dc.zip (used this one according to dambik’s hint)

Initially, I got error messages about the XML file being corrupted, and parsing errors. The cause was obviously that I DLed the three files by right-clicking and using "save as…"
DLing the complete zip cured this.

As a start, I now moved around in front of the Kinect with my daughter a bit and watched the patcher window. I see output in the jit.pwindows, but not in the 3rd one.
In the 4th we both were clearly distinguishable as separate persons. Does this mean I am getting valid skeletal data?

@dambik: I almost think it is obligatory to post something naive/silly or even stupid, when it is the very first contribution to a new forum. Regard it as my way of showing that I am not a bot :-)



Ed
July 8, 2011 | 2:55 am

"I almost think it is obligatory to post something naive/silly or even stupid, when it is the very first contribution to a new forum. Regard it as my way of showing that I am not a bot :-)"

Welcome aboard, Not-a-Bot. :) Heh, I recently got caught by that silly spambot too.



Ed
July 8, 2011 | 4:03 am

OK, I got v0.6.0 set up and running today. No skeleton data all afternoon, so I finally dragged the Kinect back to my desk and BINGO! Come to think of it, I think I had the same thing happen when I tried the NITE examples.

So, here are some observations from my tests:
- Are you getting valid skeleton data?

I checked the right_hand skeleton data against the center pixel value in the depth map and they did appear to agree when they overlapped. Didn’t seem to get any data on the right_finger at all and the left_finger didn’t appear to be tracking. That’s all I was able to test today.

- Do you like this OSC format for skeleton data?

Not really. I haven't figured out how to parse the forward slashes in Max, so I ended up using statements like ROUTE /3/RIGHT_HAND, which treats everything from the first / to the RIGHT_HAND as a single word. No user number unless I can parse the forward slashes.

By the way, I was wondering if a sequence number or frame number might be helpful for grouping data across players.

- For position data, should I keep it as a float or change it to an integer? If change it, why?

You’re outputting millimeters, right? In that case, I don’t see that it matters that much.

Speaking of coordinates, if I move 3 feet directly to my right relative to the Kinect, does the Z distance also change or just X? In other words, are the coordinates completely orthogonal?


July 8, 2011 | 10:46 am

@transponderfish, the 3rd window is likely black unless you reconfigured the XML configuration file. That pwindow is hooked to the IR output. The Kinect (or its drivers) does not allow IR output at the same time as RGB output. You have to choose only one.

@dambik, I too am not getting finger data. I ask for it in code, but it returns zeros. It is possible that NITE or SensorKinect does not support that joint at this time. I've opened an issue at https://github.com/diablodale/jit.openni/issues so I can continue to look at this and, if resolved, have the answer available for others.

@dambik, the object that is soon-to-be-your-best-friend is OSC-route. UC Berkeley has written and released a fantastic package of Max externals. You can get them at http://cnmat.berkeley.edu/downloads. You can download just the OSC ones (they start with "OSC-") or, at the top of the page, the Everything package. The OSC-route object will do all the splitting of the message you want: routing, ranges of values, wildcards, it's great!
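
(A hypothetical parsing chain, assuming the /userid/jointname format above: an [OSC-route /1 /2] to split messages by user, followed by an [OSC-route /right_hand] to isolate one joint; each match strips the matched part of the address and passes the rest through.)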

@dambik and @transponderfish, previous versions of software on GitHub can usually be viewed/downloaded by clicking "Commits" on the top menu bar of GitHub. There, all the commits (uploaded snapshots) that an author uploads are listed, usually with a version statement on each line. On the version that you want, click the "tree" hex number in the right column, and there you can view the entire tree for that particular version. Download from that tree whatever you would like.

@everyone, a private release of mine has changed the skeleton data to start with /skel and fixed the outlet ordering. I am close to having the user seen/lost/etc. events output once I fix a few bugs related to new callback functionality I added.


July 8, 2011 | 10:54 am

@dambik, I have been thinking about the sequence number or frame number request. I ask to understand more about your need. The reason is, there is some timestamp and frame data that OpenNI exposes for a given frame of map data (rgb, depth, and ir). The usefulness of that data in an environment outside the external (e.g. in Max) is in question.

You mention grouping data across players. Can you describe a scenario more fully? Or perhaps a few more examples so I can get my head around the request?


July 8, 2011 | 1:36 pm

Last night I spent several hours with your external and Vizzie (to get a feel for it, before I ask specific questions or even make requests); the results are amazing!

Right now I have tested it with 3 players; all 3 of us were tracked without problems. (I am working with the 4th output mostly, feeding a sketchr to get outlines. We never merged into one large blob, and what pleased me most was that my honey's skirt was always there, which is amazing to me considering the mess in my room. I stayed in the background and often disappeared, but it was sufficient when the computer recognized my head and an arm. As I said, never a large blob or confusion between our limbs.)

I looked into the XML; I assume this is explained in the OpenNI documentation, right? I read it some weeks ago and will do so again in the next days.

More observations:
- I get output in the Max window when I click "summary". But yesterday I suddenly got continuous skeletal data there (no idea what I did…); today not.
- Everything looks fine and stable, but I still need a few attempts to get things running. I still haven't found out if I have to load the XML file with "read" or if the "read jit…" button is sufficient (everything is in the same folder, in the Max folder in my Windows documents path). Right now I have managed to crash the whole thing (no idea how, hahaha); I will post my observations on how to restart everything, because quitting Max and un-/replugging the Kinect is a bit annoying. AND it is almost impossible to crash a program you have written yourself; you always know what not to do… I know that from experience.


July 8, 2011 | 4:32 pm

v0.6.1 is up on GitHub. I now output useful skeleton events (new user, lost user, calibrating, etc.).

I'm going to stop coding new features for now. The features that are there need to be used and tested, and I would like everyone's feedback. If you find bugs or crashes that you can reproduce, please open an issue at https://github.com/diablodale/jit.openni/issues

@transponderfish, yes, the OpenNI documentation goes into its format. There is not a lot of tweaking you can do other than adding/removing/switching to different nodes (image, ir, depth, user). The SensorKinect driver and the Kinect itself impose the limits. If you remove nodes that you do not need, you will lessen your CPU and memory usage.

@transponderfish, the sample patcher I provided printed out all the OSC messages. You have to be calibrated to get OSC joint info. The NITE driver requires you to stand in a Psi pose. Once calibrated, the joint data is output. Also, in high-bandwidth situations the Max window doesn't always keep up with printing them. That could have been what you saw. The new sample patcher I provide doesn't print them all out, rather just the events. It does flash a button to show the OSC joint data. There's a lot of it.

@transponderfish, if you can get a repro of the crash with clear steps, I can try to track it down. Until then, you can a) send a read message (you will be prompted for the file) or b) send read and the filename and it will auto-load for you.


July 8, 2011 | 5:41 pm

Is it possible to have a clip or a visual to see what's happening when it's working?

Thanx Ben



Ed
July 8, 2011 | 10:06 pm

> @dambik, the object that is soon-to-be-your-best-friend is OSC-route. UC
> Berkeley has written and released a fantastic package of Max
> externals. …. The OSC-route object will do all the splitting of the
> message you want: routing, ranges of values, wildcards, its great!

I was hoping you weren’t going to suggest that. :) I have numerous machines to install it on and I didn’t really plan on using OSC. You might be able to convince me of the advantages, however…

> @dambik, i have been thinking about the sequence number or frame number
> request. I ask to understand more about your need. Reason is, there is some
> timestamp and frame data that OpenNI exposes for a given frame of mapdata
>(rgb, depth, and ir). The usefulness of that data in an environment outside
> the external (e.g. in max) is in question.
>
> You mention grouping data across players. Can you describe a scenario more
> fully? Or perhaps a few more examples so I can get my head around the
> request?

One concern is whether player skeleton data always arrives in order for the same frame. For example, frame 1 gives me player 1 then player 2 data, frame 2 gives me player 2 then player 1, frame 3 gives me player 2 then player 1.

Another issue is missing data. For example, suppose I’m calculating the lateral velocity of right hands by determining their change in X position and dividing by the framerate period. If a user lost a frame of data, a sequence number would tell me to either invalidate the current calculation OR to use a different time period for the velocity calculation. A sequence number might also help provide a warning if there are a high number of missing data sets (e.g., player 2 disappears for 10 seconds then reappears for whatever reason).

Perhaps I’m a bit paranoid but when working with Arduino and Xbee, sending a sequence number really helped me address problems with missing, duplicate and invalid data sets.


July 9, 2011 | 5:08 pm

Everyone, I have put install/usage documentation on the Wiki at https://github.com/diablodale/jit.openni/wiki

@dambik, it's not difficult for me to add an alternate output format for the skeleton data. One approach I am considering is to have 3 output formats controllable via an attribute:
1) my personal preference via OSC
2) OSCeleton-puppet format
3) a Max-only list format (this is probably what you are desiring)
For the 3rd, I would likely take the OSC of (1) and just remove the slashes. Would something like that work for you?

Dambik, now it's time to geek out so we can get our heads around this timing topic. BTW, it's interesting you mention Arduino & XBee; I think we have some similar project pieces. ;-)

The data output from the outlets is initiated by a bang on the jit.openni inlet, not by a specific frame rate/output of the Kinect. The Kinect is sending data constantly to the computer. It's only when you bang that I ask OpenNI for a snapshot of that data. You could, however, bang the jit.openni object equal to or faster than the FPS configured, which would *likely* get you every possible frame.

The data output from the outlets of jit.openni is done in the standard Max ordering (right to left). If it is not, then that's a bug I need to fix.

When a bang is received, here is the flow that occurs:
1. Get snapshot of whatever data is currently available, do not block and wait for new data
2. create matrix of depthmap (if defined in XML) and queue for output
3. create matrix of imagemap (if defined in XML) and queue for output
4. create matrix of irmap (if defined in XML) and queue for output
5. create matrix of userpixmap (if defined in XML) and queue for output
6. output tracked skeletons (if defined in XML via user node) and queue for output
for each tracked user skeleton...
for each of the 24 OpenNI joint types...
check against confidence filters, if doesn't pass then go to next joint without output
queue output for joint OSC data
7. output all data via outlets; should be in standard right->left ordering

The UserID values in OpenNI (and therefore jit.openni) are not guaranteed to be in sequence. It's easy to get out of sequence by people disappearing and reappearing, causing the skeleton to no longer be tracked and therefore have no output. For example, in (6) above, it is possible to have a user (seen in the userpix map) but not being tracked for skeletons due to calibration failure, etc. I recommend using the userID value to track a user, not the sequence of data output by OpenNI for a given frame snapshot.

I do believe as I currently have it coded that for every bang to jit.openni you should get:
1) all tracked skeleton data output for a given frame snapshot
2) matrices output for any configured depth, image, ir, or usermap node in XML

You shouldn't have a scenario where in one frame you get user 3 and user 2, then in the next frame get only user 3, then in the next frame go back to getting user 3 and user 2. The only scenario in which that would occur (which is highly unlikely) is if between those frames user 2 was lost, then seen and calibrated in one frame. It is possible for your code to catch this scenario by watching for the user events like "lost_user" and "calib_success".

I may implement in the future an attribute which if enabled causes old (repeated) data to not be output. Not to worry, the default will be as it is today which is to always output data even if it is old/repeated.

Since there is a continuous flow of data from the Kinect, the timestamp steadily increases between frames. It is measured in microseconds, and I cannot find documentation defining what timestamp=zero was. I did a quick debug code change and was surprised to find that the frameIDs start with 1, increment by 1 with every snapshot I take in code (a bang to jit.openni), and do not skip numbers. I cannot find any documentation explaining the behavior of the frameID, so what I've observed could be circumstantial rather than guaranteed.

It isn't practical for me to embed a frameID in the matrices themselves; I would only provide it on an outlet like the skeleton OSC, or make a dedicated outlet on the far right just for a frameID. However, I wonder if instead you should generate your own frameIDs or timestamps. When you bang jit.openni, also create a frameID that is associated with your initiated bang, perhaps using a counter specific to your implementation.
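
(One hedged patch sketch for that: drive everything from a single [metro] into [t b b], where the right bang first increments a [counter] that stamps your stored records and the left bang then hits jit.openni; Max's right-to-left order guarantees the counter updates before the data arrives.)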

OK enough technical info from me. What is your take on all of this?


July 9, 2011 | 9:44 pm

Thanks for documenting! I have not had the time so far to try out v0.6.1, but I will certainly do so on Monday.

To give some visual feedback I have uploaded my first tests to YouTube:

https://www.youtube.com/watch?v=y2u3C_s1BME

I will also check where in the chain the upper 5% of the image is lost…


July 9, 2011 | 10:55 pm

@transponderfish, perhaps one contribution to your issue is the translation that is occurring. In the sample config XML file I provide, the depth node references the Image1 node as its alternative viewpoint.

This configuration does the math to enable an OpenNI user to "overlay" the depth data onto the RGB data pixel by pixel. Using that translation, the depth data is smaller in width and height than the RGB data. Depth will no longer be a full 640×480.



Ed
July 11, 2011 | 11:05 pm

"its not difficult for me to add an alternate output format for the skeleton data. One approach I am considering is to have 3 output formats controllable via an attribute:
….
3) a Max-only list format (this is probably what you are desiring)
The 3rd I would likely take the OSC of (1) and just remove the slashes. Would something like that work for you?"

That would be perfect.

"now time to geek-out so we can get our heads around this timing topic."

I’m going to take a little time to think over what you wrote…


July 14, 2011 | 7:53 pm

@Diablodale:
Today I have tested v0.6.3; it works like a charm and is using less processor than v0.6.0.

What I definitely need is support for TWO Kinects running at the same time on the same machine. The requirements for this are described somewhere else on this forum; I have sorted that out and verified that this is set up correctly here (they need to run on two different USB controllers, which is not to be confused with inputs/plugs; it can be checked with a tool named USBlyzer. Trial and error can do it too, but why rely on that when there is a proper tool available?)

Now, I have tried to create a patch with two jit.openni modules to see what happens. It looks like the patch prefers the Kinect which is listed first in USBlyzer. (Both Kinects are not working at the same time and I cannot choose, but I did not expect that.) It also looks like both jit.openni modules work completely independently of each other (I can tell that from the little glitches that happen in the 4th output; they are different), which is 100% positive and promising for what I need to do with it, yippie!!!

Reading the OpenNI documentation (page 9) implies to me that an easy solution would be to edit two XML input files, which say something like "Kinect 1" and "Kinect 2" at the beginning, and feed them to two jit.openni modules. What do you think?
(The Mac-only external by JMPelletier works with an "open 1" and "open 2". Does your external understand any similar commands?)

I am personally not interested in the OSC output, but to give you feedback: after doing the necessary pose, I got output. It works. Please ask if I should check more things.


July 15, 2011 | 4:10 pm

NOTE! OpenNI and PrimeSense released a lot of new code two days ago. Some of it includes changes to the XML configuration support; some APIs were deprecated. I have not tested my code against this new drop of OpenNI, NITE, and SensorKinect. I am definitely testing and updating everything to this new code by 20 July because of new features I want.

@transponderfish, I'm interested to hear of any success on two simultaneous Kinects using OpenNI. I did have this in mind when I was writing. However, I do not have 2 Kinects myself for any testing. This is the approach I took:

  • leverage the documented OpenNI XML config files; this allows rich configuration with inherent compatibility at no development cost to me, and allows arbitrary numbers of devices, generators, etc.
  • allow the default "sharing" of devices mode rather than exclusive locking
  • each jit.openni creates an independent OpenNI session which directly maps to the XML configuration file. You can point to the same config files or separate config files.

Given that, create two different XML config files. Two separate jit.openni objects. Then send a read message to each jit.openni referencing the independent XML files. Now…..

What I do not know is how the OpenNI middleware, NITE, or the SensorKinect driver will behave with two Kinects. Given SensorKinect is a hack, I would look there first if it doesn't work. The simplest thing that I suggest testing is an ImageMap. Create two separate XML files, simplifying them to only have one image node each (a sketch follows below). Also, in those separate config files, there has got to be a way to describe which device is associated with which node, rather than the default behavior you are experiencing. What I am unable to find is how, in XML, to specify a specific Kinect.
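
Something like this in each file (an untested sketch following the standard OpenNI SamplesConfig structure; your install may also need a Licenses section):

<OpenNI>
  <ProductionNodes>
    <Node type="Image" name="Image1">
      <Configuration>
        <MapOutputMode xRes="640" yRes="480" FPS="30"/>
        <Mirror on="true"/>
      </Configuration>
    </Node>
  </ProductionNodes>
</OpenNI>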

You can specify Queries for Nodes which can isolate a specifically desired device. But the docs I've found so far would only differentiate between different types of sensors, not multiples of the same type. I can see in the C code that they've added low-level support, but I can't yet find the XML parsing that utilizes it.

https://github.com/avin2/SensorKinect is the driver of which I’m speaking and OpenNI.org has all the middleware.


July 17, 2011 | 2:53 am

Hey diablodale, nice work on getting a PC max external happening for the Kinect! I’m really looking forward to getting it running.

I've happened upon an issue when trying to create your jit.openni object. I receive an error message:

"tooltip: Max.exe – Entry Point Not Found

The procedure entry point xnGetBytesPerPixelForPixelFormat could not be located in the dynamic link library OpenNi.dll"

…causing the object to be disabled.

I was running 3-month-old OpenNI and NITE middleware, which I thought could be the issue. So I've updated to the latest stable builds and receive the same error. Potentially I've missed something else along the way.

I’m running: Win7(64-bit), Max 5.1.8, PrimeSensor 5.0.3.3, OpenNI 1.3.2.1

I installed your object by copying jit.openni.mxe and jit.openni_config.xml into my user object library. But I haven’t touched anything in the config file.


July 18, 2011 | 3:12 pm

@Diablodale, I have done as you said: created two simplified XML files, read from two jit.openni modules. The result is the same as before; I get the same output from both of them.
I tried a bit of hacking on the second one:


If I comment out the depth node entry I'm still getting a depthmap image in the first outlet of jit.openni, but if I comment out the Image1 node the image isn't coming, as it must be…

What am I doing wrong? Any special considerations for editing the XML file?

thank you very much!


July 31, 2011 | 8:48 am

@carsol, I use Windows 7×64 Ultimate. I should be able to switch my primary display language to Spanish. Do you think this would simulate your setup?

@carsol, the Kinect itself and SensorKinect do not support all possible values of xres, yres, and fps. The possible values are more restrictive than the PrimeSense Sensor's. It is possible this combination is disallowed. Also, if there are both an IMAGE and a DEPTH node, then both must have the same resolution.

@carsol, you discovered my primary reason for the output_depthmap attribute. It is likely that when you removed the DEPTH node, you retained the USER node. In this scenario, the USER generator itself requires a DEPTH node and automatically creates one. This is all done automatically by OpenNI and NITE. I call an OpenNI API that reads and initializes OpenNI using XML. After that call, I receive back a list of generators. In your case, it was likely a USER and a DEPTH (that the user node required). I also noticed this behavior and recognize it as intended by OpenNI. However, I saw an opportunity to reduce some CPU load by allowing you and me to disable my code which does the matrix calculation to convert that unwanted depthmap into a Jitter-compatible format.

@carsol, repeatedly reloading the XML on the jit.openni object is unreliable and prone to crashes. Known issue, please see details at https://github.com/diablodale/jit.openni/issues/4


August 1, 2011 | 7:58 pm

@diablodale: the bug was with my old machine, WinXP 64-bit and Max 5.1.4; I don't know if it happens with Win7 and Max 5.1.4. You need to put "Spanish" in the regional settings (where the config about date format etc. is) and you will get a straightforward crash in Max patches with Jitter objects…

About resolution modes: oh yeah, the Kinect doesn't support QVGA, what a pity for latency maniacs… Have you tested your external with the PrimeSense sensor? Do you think it will work? It seems PrimeSense has some advantages, like 60fps at 320×240…

If I want the best performance for skeleton tracking only, I need only the user node, and to set the output_depthmap attribute to 0 and also the other outputs (image, IR and user) to 0… right?

Following up with the XML issues: if the comments in the IR node are removed I'm getting a bad-parameter error… Is it only me?

thanks!


August 1, 2011 | 9:41 pm

@carsol, to be able to assist you, I request that you visit https://github.com/diablodale/jit.openni/issues and open an issue for each of your XML problems. Please include a full description of your OS, Max software, specific OpenNI-related software and versions, hardware with all version numbers, languages, etc. Then include reproduction steps of what is installed, any code running, steps that I should take to reproduce the problem, etc. Then include the full XML that you are experiencing problems with.

You can look at issue#1 as an example of the information that I request https://github.com/diablodale/jit.openni/issues/1

Yes, if you want the least amount of code to run and still get skeleton tracking, then I suggest you have:
1) only a USER generator node in your XML (see the sketch below)
2) disable all the output attributes except for skeleton
3) disable the orientation data if you don't need it
4) set the confidence filters to at least 0.6. In the current version of NITE they only output 0, 0.5, and 1.0.
5) only bang jit.openni as many times/sec as you need.
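
As a sketch of (1), assuming the standard OpenNI XML structure, the whole ProductionNodes section could be reduced to something like:

<OpenNI>
  <ProductionNodes>
    <Node type="User" name="User1"/>
  </ProductionNodes>
</OpenNI>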



Ed
August 1, 2011 | 9:53 pm

@yair: Thank you for the sample mesh code! VERY cool.

Can someone explain to me why the mesh consists of a couple of different flat planes that overlap? Why isn't my head round, for example? :)
OH, I see now: I have to set the AMP parameter.



Ed
August 1, 2011 | 11:06 pm

Ah, the mesh code works MUCH better if you use the values directly from jit.openni rather than from the [jit.op @op / @val 22] object. Dividing by 22 caused Z to be limited to one of about 11 possible values making a sliced effect. My head is now round.


August 2, 2011 | 12:11 am

@diablodale: at the moment it isn't a big deal not having IR, but if you want I'll open the case at your GitHub page. And thanks for the tips!

@all: does it happen only to me that with yair's patch (which connects a float32 matrix to the first outlet) the image has a lot of flicks and glitches? If I connect a char matrix I don't have this issue, but then there are others…

thanks!



Ed
August 2, 2011 | 8:20 pm

"it happens only to me that with yair patch (that connects a float32 matrix to the first outlet) the image have a lot of flicks and glitches? if i connect to a char matrix i dont have this issue, but then theres others.."

Yes, I'm also seeing "flicks and glitches". I expect using a char matrix limits some of this by quantizing the Z values (like the /22 jit.op) so small changes aren't noticed. The drawback is that you lose some of the smoothness.

Perhaps setting the confidence filters higher in the XML file would help.


August 3, 2011 | 12:55 am

@all: and nobody else? glitch glitch from depth output to float matrices?

@dambik: connecting a char matrix you will get a strange effect, the image does some kind of white rotation while you moving close/further to the camera,, grgr dont know how to explain it, but give it a round… i guess is something about long values going to char 0-255?

btw, the confidence filter is only for user/skeleton outs.

best!



Ed
August 3, 2011 | 3:34 am

"connecting a char matrix you will get a strange effect, the image does some kind of white rotation while you moving close/further to the camera,, grgr dont know how to explain it, but give it a round… i guess is something about long values going to char 0-255?"

I expect the Z value in a char matrix would be continually wrapping around back to zero as you approach the camera, i.e., 255,254,…,2,1,0,255,254,…,2,1,0,255,…, etc. Since Z is in millimeters, that amounts to a range of every 10 or so inches before it wraps around once again.
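
(256 distinct char values at 1 mm each = 256 mm per wrap, or about 10.1 inches.)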


August 3, 2011 | 11:31 pm

Until someone provides me a repro(duction) case, I can't track down or explain much. So far, all the behaviors people describe here are expected behaviors, or are so unclear that I can't assist.

If you think there is an issue, I encourage opening a new issue with lots of details at https://github.com/diablodale/jit.openni/issues

Lots of details would include the patch itself also. Even a screenshot so I can see what you are seeing on your screen.

Help me help you.


August 4, 2011 | 1:56 am

@diablodale: you are right… I don't know where the issue is, whether it's in the external or in the way the matrices get transformed… or it just is like it is…

In the picture you can see a white spot on my face; if I go closer to the camera it disappears (goes to grey), then the white comes back and disappears and so on, wrapping around…

and here is the patch.

thank you very very very much!

[Pasted Max patch; screenshot attached as kincap2.JPG]


Ed
August 4, 2011 | 2:42 am

I’ve been working on a Max application that applies a video effect at the location of a user’s right hand when their hand approaches the camera to within a certain Z distance. This works BUT, at times, it seems the skeleton values become "stuck" and stop changing for a few minutes. I can still see the depth map changing properly at these times but the OSC skeleton values returned do not. Is there an error situation where invalid OSC skeleton data could possibly be returned? (Is the skeleton data stored in a queue by any chance? Sometimes it acted like data wasn’t being removed fast enough from a FIFO queue and was falling behind.)


August 4, 2011 | 9:54 pm

Hi Guys,

Segmentation.cpp is not a C74 managed source code file.

Cheers

Andrew



Ed
August 4, 2011 | 10:27 pm

Please ignore my previous post – boneheaded coding error on my part.


August 5, 2011 | 8:42 am

I have noticed the RGB image has a strange offset every other row of pixels in 640×480 that would degrade any computer vision used on that matrix: http://i.imgur.com/J7YR1.png

After some research, it looks like a known issue to do with the driver: http://openni-discussions.979934.n3.nabble.com/OpenNI-dev-Quality-of-Kinect-RGB-Camera-td2678960.html

It sounds like the missing step will be added to Avin’s driver at some point, but until then I’ve found this shader which might be able to correct the image: http://graphics.cs.williams.edu/papers/BayerJGT09/#shaders Unfortunately I don’t know where to begin using that shader (and specifically customising the settings for the Kinect’s particular issue).


August 5, 2011 | 8:57 am

@bferns
we had this discussion a few years ago. Demosaicing is an inherent problem with any color camera. I include Andrew B.'s code with the relevant shader.
Still, doing it on the GPU will help aesthetically, but the readback to the CPU for extra processing will give a hit in performance, last time I checked.

http://cycling74.com/forums/topic.php?id=14028

[Pasted Max patch; shader attached as ab.debayer.jxs]

August 5, 2011 | 12:16 pm

Hi Yair, thanks for that patch & shader; it will certainly work as a stopgap until it's implemented in the driver (I'm using 1. 2. 1. as the shader settings). It doesn't impact performance too much for me; still getting 60fps while running some jit-ogre models and videoplanes.


August 5, 2011 | 1:12 pm

@carsol, all the behaviors you describe and that I see in the patcher you provided are expected behavior.

top-left: the matrix output from the depthmap is by default a long 1-plane matrix. The values that OpenNI and the Kinect save in cells of that matrix vary between 0->10000. When you force it into another matrix format with your jit.matrix, you lose a large amount of data. A char matrix cell only holds 8 bits (0->255). Since the max value in decimal is 10000, you need 14 bits to hold the info. Bits 9->14 are being thrown away by Max in the jit.matrix forced conversion, and you are seeing the lower-order remaining bits of information.

top-middle: I see tiny changes in values all over the screen. This is expected behavior. The depth sensor is trying to measure in millimeters, so even the slightest movement is seen. Also, even though it gives values in mm, its measurements are not precise. You see variations from frame to frame due to the technology. When surfaces are hot or very reflective (mirror), you can see a lot of variation in values. All expected behavior.

top-right: In this example, you are using math in a jit.op to remove the lower 3 bits of information, shifting them to the right (the same as dividing by 2^3 = 8). The remaining bits are then used for display. All expected behavior due to your math.

bottom-right: expected behavior. You have bits shifted by 3 (same as /8) and then normalized for display in the window. Any subtle flickering is inherent expected behavior in the Kinect sensor as it tries to create millimeter-precise values from a $150 device.


August 5, 2011 | 1:55 pm

@Andrew Pask, earlier in this thread transponderfish posted an Assert screenshot which identifies in text:
Program: c:\program files\cycling '74\Max 5.0\Max.exe
File: .\Segmentation.cpp
line: 5250
Expression: xOfAreaPercentage20 >= 0 && xOfAreaPercentage80 >= 0

If you are not the owner of this 5000+ line file (wow, that's big), do you have an idea who is? Is it a file that is part of msvcrt.dll? Part of OpenGL?

The two Asserts we’ve seen look like either thread related code or graphics related code to me.


August 5, 2011 | 4:40 pm

It’s not ours.

http://www.google.com/#q=segmentation.cpp

OpenCV or some other library?

-A


August 5, 2011 | 9:17 pm

Thank you for your input. I previously used a similar Google query. I couldn't find anything with that filename that has that many lines or those variable names. That leads me to believe it is a codepath within Max.exe's process space which you (or I) indirectly call. Some code that is closed source.

I wonder if Windbg.exe could see the call stack when this assert is thrown?

@everyone: I am replacing the osceleton_legacy_raw attribute in the next release. The functionality will remain but will be exposed in a different way. The wiki has been updated with this notice. It will be a trivial update for anyone that is using that attribute now.


August 8, 2011 | 1:29 pm

v0.7.1 is up at the normal https://github.com/diablodale/jit.openni and the Wiki there has been updated. This is the release that:
1) removes osceleton_legacy_raw & replaces with skeleton_value_type attribute
2) adds projective coordinate support



Ed
August 9, 2011 | 7:50 pm

When I click on download at github, it downloads what appears to be version v0.6.6-4 from file diablodale-jit.openni-v0.6.6-4-g120a839.zip

Is this correct?


August 10, 2011 | 1:24 am

Awww. I didn’t see that limitation of GitHub. I’ve fixed it.

Issue is that GitHub only exposes "tagged" commits in the easy download button. I removed a bunch of old tags and added a new one for v0.7.1. You can see it in the download button now.

In the future, you can download any files in a commit (aka version), past or present, that you want by clicking "Commits" in the top menu bar. Scroll to the version you want. Then click the tree hex number on the right side. There you can see all the files included in that commit (aka version).

We both learned something today. And it was all brought to you by the color red and the number 8. :-)



Ed
August 10, 2011 | 5:27 am

Thanks!

BTW, what is "projective coordinate support"?


August 10, 2011 | 5:44 pm

You can have your OSC/skeleton data in one of three value types. See the wiki at https://github.com/diablodale/jit.openni for the documentation on how to activate it on jit.openni. I can imagine times when someone doesn't want real-world coordinates from the OSC data; rather, they want the projective coordinates (x,y pixel coordinates).
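
(A worked example under OpenNI's usual convention: a point straight ahead of the sensor at 2 meters is real-world (0, 0, 2000) in millimeters, while its projective form on a 640×480 depthmap is roughly the center pixel (320, 240) with z still 2000 mm; the exact mapping depends on the camera's field of view.)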

If you want to go down into the computer graphics rabbit hole, google for projective space, coordinate transforms, etc. One example is http://en.wikipedia.org/wiki/Projective_space



Ed
August 10, 2011 | 7:13 pm

Thanks! I was wondering about the coordinate system. Projective coordinates are exactly what I need for one current project.



Ed
August 10, 2011 | 9:28 pm

Projective coordinates work great!


August 26, 2011 | 10:48 pm

v0.7.3 is up at the normal https://github.com/diablodale/jit.openni and the Wiki there has been updated.

This release adds:
1) scene floor identification
2) depth camera field of view


August 27, 2011 | 9:21 am

thanks
btw, here is a link to an automated way of installing everything needed to start development for Kinect on Mac/PC > http://zigfu.com/

The Zigfu OpenNI installer lets you set up your entire development environment in one click. It bundles together OpenNI, NITE, and SensorKinect and configures them automatically.


September 23, 2011 | 4:55 pm

THANK YOU for zigfu. I couldn’t get it to work before this!! I LOVE YOU.


September 24, 2011 | 2:19 pm

Shhhh, just between us (haha), I have a private working Max external that directly uses Microsoft's Kinect SDK. Skeletons and motor control work; no rgb/depth images at this time. It is outlet- and message-compatible with my jit.openni, therefore you likely won't need to do any rewrites.

The Kinect SDK has a bug which requires you to kill the Max process after closing it. That and its lack of the image/depth matrices will keep it private for now.



Ed
September 28, 2011 | 4:32 pm

Cool! Can’t wait to try the new one.

By the way, I have had a Max/Jitter jit.openni Kinect app running for the past couple of weeks in a Univ gallery space. Seems to be working fairly well without requiring daily restarts or reboots. Now if visitors would only "read" and follow the cartoon directions… :P


September 30, 2011 | 10:48 am

Great work on jit.openni!
Does jit.openni work with the Asus Xtion Pro?


October 9, 2011 | 12:42 am

I have never tested it with the Asus. The only two OpenNI "drivers" I've used are SensorKinect and Recorder. I only make OpenNI calls, so I would hope it works transparently.

Do an install of OpenNI as per the README (or Wiki) instructions. Install your Asus drivers. And try it. If you have problems, confirm that the OpenNI sample apps also work. If the OpenNI sample apps don’t work, there’s something beyond my control going wrong.

I'm interested to hear of your results.



dtr
October 9, 2011 | 11:50 am

diablodale & transponderfish, what’s the current status of multiple Kinect operation? any developments after the last bits mentioned in this thread?

thanks for the extensive investigation of the issue! right now i’m using a 2nd computer for the 2nd Kinect but I hope I’ll be able to run it all on 1 machine at some point.

btw, i found a programmer who wants to help me port the external to OS X. more on that asap.


October 10, 2011 | 11:19 pm

I have not pursued multiple Kinects via OpenNI after our investigation together. OpenNI via PrimeSense has limitations in supporting multiple Kinects w/ skeletons; the overlapping lasers and resulting IR dots are difficult to discern. That is the source of all the jitter and poor tracking that one gets. (FYI, the Microsoft SDK has the same limitation.)

OpenNI just today released a new version of their SDK. I don’t see any fixes addressing multiple Kinect usage.

I did see someone out there working on a wrapper API to allow easier multiple-Kinect usage. I am unsure of its value because the Microsoft SDK already easily allows multiple-Kinect usage, and OpenNI can be hacked to get at least the depth/imagemap from multiple Kinects. But perhaps there is some value for your specific needs.

http://www.cadet.at/2011/10/03/2realkinectwrapper/

Over the last month, I was focused on building a Max external for the Microsoft SDK. I have that done. It provides only skeletons, and it's compatible with projects written with jit.openni. I'm keeping that private at the moment because of audio.

You see, the Microsoft SDK provides *rich* audio support. Max/MSP requires audio outlets to be the first outlets on an external. So I want to get that MSP part working before I release it and people start making projects against outlets that would change.

I have made a small set of private changes to jit.openni. I’m interested in your Mac developer assisting in making any needed code adjustments to make jit.openni cross-platform.


October 18, 2011 | 1:06 am

Hello,

This is great. Other OSC solutions work fine as well, but it is preferable to have it all in Max rather than running other external software.
I am facing a couple of issues though…
a> I find it challenging to calibrate a user. Is it possible to track an upper-body model only, or just the hands, and how?
b> I am not sure how to receive each joint's data. Is it "OSC-route /*/limbname"?
c> Can I use the attributes as messages?
Thanks
V.


October 19, 2011 | 11:49 pm

I recommend you open the example patcher included in my distribution. It exercises much of the functionality and you can see how to use OSC-route with it.

Personally, and anecdotally from others, I have heard that calibration is pretty good with the release I recommend in the wiki: OpenNI 1.3.2.3 for Win32, PrimeSense NITE 1.4.1.2 for Win32.

Heads up: I have not tested jit.openni with the latest binaries that PrimeSense released within the last two weeks. If they maintained compatibility, it should work. ;-) I plan to test it out myself next week.

If you're using something other than jit.openni, I can't speak for its reliability. In all cases, you must assume the PSI pose. No option. But once you do, it takes about 1 second. It's very fast. Oh… and be sure that you have a fast enough framerate. Slow framerates tend to calibrate and track skeletons poorly. I usually have at least one frame every 50ms.
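
(One frame every 50ms is 20 fps; for example, banging jit.openni from a [qmetro 50] would keep you at that rate.)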

Can you track an upper body only? If you are asking about skeleton profiles, then I already have support for that in a private build of jit.openni (v0.7.4). I will be releasing it this weekend or early next week. I suspect there will be little CPU change by not tracking all the joints. I recommend writing your patcher to only use the joints you need. Then with the coming jit.openni release with skeleton profiles, you can switch to a profile matching your needs.


November 1, 2011 | 2:50 pm

1. Hi everybody, I'm trying to download the example on GitHub but I can't. Can someone post the example on the forum?

2. I have downloaded the external and my Kinect is working with my PC, so what's the next step to make it work with Jitter?

3. Do I need an OSC program to make it work with Max?

Thank you very much…

Ben



Ed
February 9, 2012 | 6:50 am

"Over the last month, I was focused on building a max external for the Microsoft SDK. I have that done. It provides only skeleton’s and its compatible with projects written with jit.openni. I’m keeping that private at the moment because of audio."

I am VERY interested in this. What’s the status?


February 12, 2012 | 9:30 pm

I've got an idea!
A little bit off topic, but I will post it anyway :)
I thought that it would be great to get skeleton data output in matrix format, like in cv.jit. It could be faster, and easier to use, for example in a simple skeleton draw. What do you think?


February 17, 2012 | 12:07 am

Any developments on "multiple Kinects"?

Very interested in it as well :-)


February 18, 2012 | 2:39 pm

Hi, everyone. I'm in deep focus on an upcoming commission piece (3rd March here in Berlin) that uses the Kinect SDK version of my Kinect object. I've made the minor updates to support Kinect SDK v1.0 and all is well with it. Much more stable than earlier SDKs.

After I finish this art piece, I will turn to updating the object to support the same features as my OpenNI version. I don’t foresee any technical issues at this time. But first, I must focus on the upcoming show.

The good news is that it will have more features than the OpenNI version due to the Kinect v1.0 SDK's functionality. I will likely release the object in 2-3 major steps to first have feature parity, then support multiple Kinects. I may update it later to support asynchronous data output, but I am not prioritizing that since Max patches tend to be synchronous, with a bang generating output (movies, gl, kinect, etc.)


February 23, 2012 | 4:22 pm

Thank you Diablodale, and I wish you success with your commissioned piece in Berlin.
Looking forward to more news on further developments.


February 27, 2012 | 5:58 pm

Work in progress… http://youtu.be/fCRm4AyVFkY
Max-a-licious 6 with lighting, materials, physics, and OpenGL transparency. I’m learning a lot which is 1/2 the fun. ;-)


February 28, 2012 | 7:04 pm

wow, it works very effectively :)


March 5, 2012 | 7:35 am

hello,
it doesn't work with Windows XP, am I wrong? Because of the Kinect SDK compatibility?
after a lot of attempts, always:

"Can't create any node of the requested type!"

No solution with XP, then?

Thx.
Fratzen


March 5, 2012 | 12:54 pm

@Fratzen, I do not support Windows XP for anything.


March 7, 2012 | 3:44 pm

Does anyone know how to get previous versions of the drivers?
I can't find these files:
- OpenNI 1.1.0.39 for Win32
- PrimeSense NITE 1.3.1.4 for Win32
- PrimeSense Sensor KinectMod 5.0.1.32 for Win32
and with the most recent release it doesn't work…


March 7, 2012 | 3:48 pm

Try this pack:
http://code.google.com/p/simple-openni/downloads/list
(32-bit version)

Edit: sorry, I didn't fully read your comment.
Try here and there:
https://github.com/OpenNI/

http://www.openni.org/Downloads/OpenNIModules.aspx


March 7, 2012 | 5:29 pm

Hello Dale,
great to hear that there will be a version with multiple Kinect support :-)

I am having a §$%&/ time "solving" this with an RTSP stream…


March 7, 2012 | 5:34 pm

If you are in the Jitter world, just use jit.net.send/recv. We are currently streaming depth and rgb images from 6 Kinects (attached to 6 CPUs) to one central CPU with about 10ms of latency.

David


March 7, 2012 | 6:44 pm

@napentro the wiki has links to the downloads

https://github.com/diablodale/jit.openni/wiki


March 8, 2012 | 1:34 pm

@David
Thank you so much for telling me; I got it working now after a good deal of trial & error. Everything is fine now with a latency of 10ms and 13fps on a matrix of 640 x 480.

I had tested jit.net.send a few weeks ago and had really depressing results (latency ca. 7 secs), so I went for the RTSP method, where I was able to reach ca. 50ms using VLC, so that looked to be the way to go.

For anybody fighting against the quirkiness of jit.net.send/recv: it looks like it is a bit unforgiving about the binding to a set IP address and port, and about how the workgroup is configured (public/private/workplace). At least these are my guesses so far; patience and lots of reboots were the key after all…
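
(For anyone replicating this, a minimal pair under assumed values would be [jit.net.send @ip 192.168.0.10 @port 7474] after each Kinect's matrix output on the senders and [jit.net.recv @port 7474] on the receiving machine; the IP is the receiver's address, and the port is arbitrary but must match on both ends.)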


March 11, 2012 | 9:33 pm

Thanks yair and diablodale!
I also find that when you install the latest Zigfu and replace the config file with my attachment, everything works great, even skeleton data without a T-pose. Weird :)
Anyway diablodale, how goes the work on the new version? I look forward to the upcoming update and cannot wait to test it :))


March 13, 2012 | 4:33 pm

…and I’m back. Got all the after-work and video done at http://hidale.com/balloons/ for those interested.

I will start working on an update to the Kinect SDK version of my object late this week. I’m unsure if I will keep it closed-source or open-source. Reason being…

I haven’t had any contributors to my open-source jit.openni. :-/ I invite others to contribute to any needed updates to the jit.openni version. For now, it isn’t a priority for me to make code updates myself.



Spa
March 14, 2012 | 1:15 am

nayah
will enjoy your latest development
if C, I will ++
keep going ever further


May 5, 2012 | 7:01 pm

FYI, my existing jit.openni object appears to work with no code changes. On my Windows machine, I downloaded and installed
OpenNI 1.5.2.23
NITE 1.5.2.21
SensorKinect 091 (which is based on 5.1.0.25)

The only issue is in the XML config file. You need to remove the Scene node. Somewhere in the newer OpenNI code (yet not in their changelog), they have decided to create a scene generator even if not requested. So putting a scene node in the XML file seems to create a conflict. Remove it, and you have a working system, with no pose needed.
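
(If your config follows the stock sample layout, the node to delete should look something like <Node type="Scene" name="Scene1"/> inside the ProductionNodes section.)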


May 16, 2012 | 3:10 pm

Hi
I don't understand the X, Y, Z coordinates that I receive from jit.openni.
I have values like 1300. to -1400.
I tried another flavor of OSCeleton a long time ago and it went from -1. to 1.
And when I try to map these values to a 3D skeleton it doesn't work…

So my question is: what is the range of the output values?

And do you have some tricks for mapping the coordinates from jit.openni to a 3D skeleton, like Diablodale's installation Balloons?

Thx
Arthur


May 21, 2012 | 11:29 am

Please read the wiki at https://github.com/diablodale/jit.openni/wiki which should answer your questions on the coordinates. In short, they are real-world millimeters. Wiki…

I do not recommend you use OSCeleton's -1 to 1 values. The author guessed at the Kinect's behavior when coding it for -1 to 1. And they guessed wrong. No insult… it was just a wrong guess which led to incorrect functionality. I do have legacy support in my object to simulate this bad behavior. I don't recommend it because the very nature of that old OSCeleton behavior is bug-filled. It's also in the wiki.

Tricks for mapping? Haha, that's the special sauce. I recommend you first create a simple 3D world with, perhaps, spheres for each joint. Once you get that working, then start building towards your desired goal.
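
(A hedged starting point: scale the millimeter values down, e.g. multiply x, y, z by 0.001 to work in meters, and feed each joint's scaled position to the position attribute of its own jit.gl.gridshape sphere in a jit.gl.render context.)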


May 23, 2012 | 9:58 am

Ok thanx i will try that
Best regards

Arthur


May 23, 2012 | 10:36 pm

I am trying to make the jit.openni external work but no luck… I remember clearly that maybe 6 months ago I was able to work with that external… Weird, or do we have to do new stuff to make it work…

Ben


May 31, 2012 | 1:31 am

Hi, I am getting this message

jit_openni: XMLconfig initialization failed (One or more of the following nodes could not be enumerated:
jit_openni: XMLconfig initialization failed (One or more of the following nodes could not be enumerated:
Device: PrimeSense/SensorKinect/5.0.3.4: The device is not connected!
Image: PrimeSense/SensorKinect/5.0.3.4: Can’t create any node of the requested type!
Device: PrimeSense/SensorKinect/5.0.3.4: The device is not connected!
)
jit_openni: XML config initialization open failed (Can’t create any node of the requested type!)
dumpout: read jit.openni_config.xml 0

I can't find what the problem might be; it was working last night before I went to bed, and when I turned on my PC today it just wouldn't work. I uninstalled all of the drivers and installed them again to be sure that's not the problem, but it doesn't seem to make any difference. Any ideas how I can trace the problem?


May 31, 2012 | 12:30 pm

It seems the error is in layers outside my codebase. From what I can tell, it's that the Kinect itself is not successfully connected and running at the device level. Be sure that you've plugged it into wall power and a dedicated USB port, and that all 3 SensorKinect devices show up and are working in Device Manager.


May 31, 2012 | 3:56 pm

@diablodale I reinstalled all the drivers again and checked whether I had left the external power supply unplugged, but I hadn't. I've got the 3 SensorKinect devices under PrimeSense in Device Manager working fine. To make sure it's nothing wrong with the Kinect, I installed the official Kinect SDK on another computer and it's working fine there. I even tried the
OpenNI 1.5.2.23
NITE 1.5.2.21
SensorKinect 091 (which is based on 5.1.0.25)

versions you suggested and they didn't work either. Is there any way to test whether OpenNI and NITE are working on their own?


May 31, 2012 | 7:18 pm

Hi diablodale,

are you thinking of upgrading your object to be used with the Kinect SDK 1.5? It seems there are a lot of new features and better efficiency… :)

thanks!


June 1, 2012 | 1:33 pm

@ElectronicElement: The OpenNI and NITE SDKs come with samples. You could try one of those samples to see if they work for you.

@carsol: The Microsoft Kinect SDK version of this object, which I have written for my private use, works unchanged with the new v1.5 SDK from Microsoft. I haven't released this object to the public yet. They added orientation, which allows me to make the object even closer in behavior to the OpenNI/NITE version. Of interest to me is the work Microsoft did to include face tracking.

The main reason I haven't released it yet is the audio support in the Microsoft Kinect SDK. I am learning how to write an MSP object in C so that I can expose this data (or part of it). My current challenge is learning how to be an initial source of audio data and thereby control the sample rate. If anyone knows… please do point me to docs or educate me.


June 1, 2012 | 2:27 pm

@diablodale I guess it isn't normal that in the OpenNI directory "C:\Program Files (x86)\OpenNI" most of the folders are empty? All folders apart from the Bin, Data and Driver folders are empty… there isn't a single file in the Samples directory, only empty folders…

That's for OpenNI 1.3.2.3/NITE 1.4.1.2 for Win32 + SensorKinect Win32 Device Driver v0.7 (5.0.3.4)

