
trouble with animating a 3D puppet from kinect


pry
October 8, 2013 | 4:01 pm

Hi,

I’m trying to animate a 3D humanoid puppet from dp.kinect. In the patch below, quaternions from dp.kinect are connected to the jit.anim.node objects of the Humanoid from Recipe 45. Gestures are recorded and played back with a coll. The joints (shoulder, elbow and wrist) are displayed in the jit.pwindow, but the Humanoid’s gestures don’t match them.

Any advice is welcome.

Thanks

The patch is downloadable HERE


diablodale
October 15, 2013 | 10:52 am

I recommend you reference https://github.com/diablodale/dp.kinect/wiki/Message-based-Data#skeleton-joints and the MSDN article that is linked from there. That article goes into detail on how to use the orientation data.

I don’t have the time now to deeply debug your patch. However, I think that you are not using the correct rotation data for a given bone. According to the Microsoft MSDN documentation linked, the rotation of a given bone is stored in the child joint.

When I look at your patch, it appears that you are using shoulder rotation data on the bone between the shoulder and the elbow. I don’t think that is correct according to the MSDN documentation. Additionally, the rotations are hierarchical. I believe you will need to use all of them in sequence to build up the progressive/hierarchical rotation…or use the absolute rotations.
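
As an illustration only (plain Python, not dp.kinect or patch code, with quaternions written as (w, x, y, z) tuples by assumption): building the absolute rotation of a bone from hierarchical rotations means multiplying the quaternions in order from the root of the chain down to that bone.

```python
# Sketch of hierarchical-to-absolute rotation. The (w, x, y, z)
# ordering and the root-to-leaf chain are assumptions for this example.

def quat_mul(a, b):
    """Hamilton product: apply rotation b in the frame of rotation a."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    )

def absolute_rotation(hierarchical_quats):
    """Accumulate local rotations along the chain, root first."""
    q = (1.0, 0.0, 0.0, 0.0)  # identity
    for local_q in hierarchical_quats:
        q = quat_mul(q, local_q)
    return q
```

Chaining two 90-degree rotations about the same axis this way yields the expected single 180-degree rotation, which is the "progressive/hierarchical" build-up described above.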

If you don’t immediately have more progress, I recommend you create a skeleton which matches the bones/joints of the Kinect data and apply the rotations to each. I do caution you…rotation data is not as reliable as joint positions, and it might serve you better to use the positions as the primary data for your avatar and then use rotation data to fine-tune it.
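
To illustrate the positions-first approach (a sketch with assumed joint tuples, not part of the patch): the aim and length of a bone fall directly out of two tracked joint positions, leaving only the twist about the bone axis for rotation data to fine-tune.

```python
import math

def bone_from_positions(parent_pos, child_pos):
    """Return the unit direction and length of the bone joining two
    tracked joints. Positions are (x, y, z) tuples in any consistent
    unit; the joint pairing is an assumption for this example."""
    dx = child_pos[0] - parent_pos[0]
    dy = child_pos[1] - parent_pos[1]
    dz = child_pos[2] - parent_pos[2]
    length = math.sqrt(dx*dx + dy*dy + dz*dz)
    return (dx/length, dy/length, dz/length), length
```

Note that this recovers where the bone points but not its roll, which is why rotation data remains useful as a secondary refinement.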



pry
October 15, 2013 | 3:53 pm

Hi Diablodale,

Many thanks for your comment and advice. I’m going to follow them.

The reason I chose to use rotation data instead of joint positions is that it seems more logical to me for animating puppets whose limbs are not the same size as the tracked skeleton’s. Am I wrong?
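
(For illustration only, with names and conventions assumed:) the retargeting idea behind using rotations is that the tracked bone rotation is applied to the puppet's own rest-pose bone vector, scaled by the puppet's own limb length, so the tracked and puppet skeletons never need matching proportions.

```python
# Sketch of rotation-driven retargeting. Quaternions are (w, x, y, z)
# and the rest-pose bone direction is an assumption for this example.

def rotate_vec(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * cross(q.xyz, v)
    tx = 2.0 * (y*vz - z*vy)
    ty = 2.0 * (z*vx - x*vz)
    tz = 2.0 * (x*vy - y*vx)
    # v' = v + w*t + cross(q.xyz, t)
    return (vx + w*tx + (y*tz - z*ty),
            vy + w*ty + (z*tx - x*tz),
            vz + w*tz + (x*ty - y*tx))

def child_joint(parent_pos, bone_rot, rest_dir, puppet_len):
    """Place the child joint using the puppet's own bone length,
    driven only by the tracked rotation (names are illustrative)."""
    bx, by, bz = rotate_vec(bone_rot, rest_dir)
    return (parent_pos[0] + bx * puppet_len,
            parent_pos[1] + by * puppet_len,
            parent_pos[2] + bz * puppet_len)
```

With the identity rotation the child joint simply sits one puppet bone length along the rest direction; any tracked rotation then swings it around the parent without ever consulting the tracked skeleton's limb lengths.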

I thought that the orientation data provided were not measured but calculated from the joint positions. Is that not the case?

Thanks again.





diablodale
October 15, 2013 | 4:16 pm

I pass whatever orientation data the API provides. I do not change it.
In my experience, few people use rotation data.


Matmat
October 17, 2013 | 4:38 am

Hi
Maybe the easiest method is not to use the orientation from the Kinect, only the positions.
Physics objects can then do the orientation job for you :)

– Pasted Max Patch, click to expand. –


pry
October 17, 2013 | 4:42 pm

Many thanks Matmat (I love your Silicat project !),
I’m going to explore this approach.

