OSCeleton Coordinate System
I’ve gotten OSCeleton set up and am receiving messages in Max. However, I’m having a tough time understanding how the coordinate system works.
I can see l_hand moving in a reasonable way if I look at that joint in isolation. However, comparing the coordinates for l_hand against head, I’m not sure how to interpret what I’m seeing. I would expect that moving the left hand from hip level to above the head should make the Y coordinate for l_hand cross the Y coordinate for head: if l_hand’s Y is less than head’s at hip level, it should be greater once the hand is above the head (and vice versa if the axis runs the other way).
That doesn’t seem to happen; I get a range of values for l_hand, but that range doesn’t seem to have anything to do with where head is.
Am I missing something? Are the coordinates for a joint relative in some way, rather than absolute positions within the field of view?
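For what it’s worth, if I remember OSCeleton’s README correctly, its joint messages arrive as /joint &lt;name&gt; &lt;user&gt; &lt;x&gt; &lt;y&gt; &lt;z&gt;, with x and y normalised to roughly 0..1 by default; since those are screen-style coordinates, y may grow *downward*, which would explain the confusion. A minimal Python sketch of the comparison you describe, assuming that layout (the numbers are made up for illustration):

```python
# Sketch: compare head vs l_hand Y from OSCeleton-style joint data.
# Assumption: x/y are normalised screen coordinates where y may grow
# downward (check your own build; flags like -r change the scaling).

def hand_above_head(joints, y_down=True):
    """Return True if l_hand is above head.

    joints: dict mapping joint name -> (x, y, z).
    y_down: True if the y axis grows downward (image convention).
    """
    hand_y = joints["l_hand"][1]
    head_y = joints["head"][1]
    return hand_y < head_y if y_down else hand_y > head_y

# Made-up example readings:
joints = {"head": (0.5, 0.30, 2.1), "l_hand": (0.4, 0.12, 2.0)}
print(hand_above_head(joints))  # True: smaller y means higher on screen
```

If y really is inverted in your output, the crossing you expect is still there, just with the comparison flipped.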
No experience with OSCeleton, but I would expect the z coordinate to be the one to watch as either hand moves *up* over the head. Typically (though not universally) in Cartesian space, x is across (i.e. left to right), y is in or out (back/forward on a 2D plane), and z is up or down. Is there any chance you are looking at the correct results in the wrong way?
I don’t fully understand how this data works either, but z is the third dimension: the distance from the Kinect. I think the floats from tracking are relative to each initial point tracked in OSCeleton, not to the distance/space itself.
Anyway, I wrote a set of abstractions called KOSCeleton that you can put in your abstractions folder, so you can use only the joint you’re interested in within your patch. Just prepend the letter k to the joint name, typed as you would a regular Max object: khead, kl_hand, and so on. The outlets give you x, y, and z. This way you avoid always having to work with the huge patch that tracks the whole body.
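For anyone following along outside Max, here’s a rough Python equivalent of what such a per-joint abstraction does, assuming OSCeleton’s /joint &lt;name&gt; &lt;user&gt; &lt;x&gt; &lt;y&gt; &lt;z&gt; message layout (an assumption; check your own output):

```python
# Sketch of the k-abstraction idea in Python terms: filter the raw
# /joint stream down to one named joint and emit only its x, y, z.
# The message layout (/joint name user x y z) is assumed, not verified.

def joint_filter(wanted, messages):
    """Yield (x, y, z) for every /joint message matching `wanted`."""
    for address, *args in messages:
        if address == "/joint" and args and args[0] == wanted:
            name, user, x, y, z = args
            yield (x, y, z)

stream = [
    ("/joint", "head", 1, 0.5, 0.3, 2.1),
    ("/joint", "l_hand", 1, 0.4, 0.6, 2.0),
]
print(list(joint_filter("l_hand", stream)))  # [(0.4, 0.6, 2.0)]
```

The point, as with the abstractions, is that the rest of the patch only ever sees the one joint it cares about.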
I can’t seem to get any OSCeleton data into my patches using the regular udpreceive / OSC-route method. I’ve just tried your handy abstractions as well; still no joy!
Weird thing is, I am getting data into Max using Bellona’s Kinect-Via-OSC interface.
Have you any idea why it works through his interface but not through your abstractions?
Figured it out… I had to alter the route object slightly, from (route r_hand) to (route /r_hand /1).
It seems it needs to be told explicitly which user’s data it’s taking, and the forward slashes were required as well.
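A Python sketch of that routing logic, assuming (as the fix suggests) that each message carries a slash-prefixed joint address followed by a user id and then x, y, z:

```python
# Sketch of the routing fix: match on both the slash-prefixed address
# and the user id, mirroring (route /r_hand /1). The exact message
# layout here is an assumption based on the fix described above.

def route(messages, address, user=1):
    """Pass through (x, y, z) only for the given address and user."""
    for addr, uid, x, y, z in messages:
        if addr == address and uid == user:
            yield (x, y, z)

stream = [
    ("/r_hand", 1, 0.4, 0.6, 2.0),
    ("/r_hand", 2, 0.1, 0.2, 1.5),  # a second user, filtered out
]
print(list(route(stream, "/r_hand", user=1)))  # [(0.4, 0.6, 2.0)]
```

Matching on the user id is what keeps data from a second tracked skeleton from leaking into the wrong part of the patch.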