jit.gl.model camera perspective
Quick question on how to adjust the camera perspective of a loaded character from a third-person perspective to a first-person perspective. Something like what happens about 1:15 into this video:
Well, the simple bit would be setting the position of the camera: you would just take the Kinect x, y, z position for the head and feed it (packed) into a (position $1 $2 $3) message going into the jit.gl.camera object.
However, the hard part is the first-person view itself: getting the camera to rotate with the head. With the way I use the Kinect (Synapse as middleware), I wouldn't be able to map the rotation of the head, so I would have to work out some math to convert the shoulder positions into a rotation. If you are retrieving the Kinect data by other means, though, see if you can get the rotation of the head directly and feed it in the same way as the position.
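To sketch the shoulder-to-rotation math mentioned above: a minimal, hypothetical helper (the function name, the coordinate conventions, and the assumption that y is the vertical axis are mine, not anything Synapse provides) that estimates yaw from the two shoulder positions:

```python
import math

def shoulder_yaw_degrees(left, right):
    """Estimate the body's yaw (rotation about the vertical y axis)
    from left/right shoulder positions given as (x, y, z) tuples.
    When the shoulders differ only along x, the yaw is 0."""
    dx = right[0] - left[0]
    dz = right[2] - left[2]
    return math.degrees(math.atan2(dz, dx))

# Facing the sensor squarely: shoulders differ only in x -> yaw 0
print(shoulder_yaw_degrees((-0.2, 1.4, 2.0), (0.2, 1.4, 2.0)))
```

The resulting angle could then be packed into whatever rotation message the camera expects.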
Hope that helped.
The gl.camera orientation can be set in one of six ways (see the jit.gl.camera reference for the full list). You need to determine what format the rotation is coming from the device in, and translate it to one of those options.
Also, the jit.quat object will let you specify x, y, and z axes and give you the quaternion values for those axes.
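jit.quat handles this conversion natively inside Max; purely as a sketch of the underlying math (Euler-to-quaternion conversion depends on rotation order, so the x-y-z order used here is an assumption, not necessarily jit.quat's convention):

```python
import math

def euler_to_quat(x_deg, y_deg, z_deg):
    """Convert x/y/z Euler angles (degrees, applied in x-y-z order)
    to a quaternion (x, y, z, w)."""
    hx, hy, hz = (math.radians(a) / 2 for a in (x_deg, y_deg, z_deg))
    cx, sx = math.cos(hx), math.sin(hx)
    cy, sy = math.cos(hy), math.sin(hy)
    cz, sz = math.cos(hz), math.sin(hz)
    return (
        sx * cy * cz - cx * sy * sz,  # x
        cx * sy * cz + sx * cy * sz,  # y
        cx * cy * sz - sx * sy * cz,  # z
        cx * cy * cz + sx * sy * sz,  # w
    )

# No rotation gives the identity quaternion (0, 0, 0, 1)
print(euler_to_quat(0, 0, 0))
```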
Thanks, this info really helps. The good news is that I'm not going to try to get the head tracking working; I'm going to use the joystick on the Wiimote nunchuk for that instead (panning the joystick while holding Z will turn the camera and the model's head up to a natural constraint limit, whereas panning without holding Z will trigger the turning animation and rotate the camera endlessly in either direction).
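That mapping could be sketched roughly like this (the function name, the ±60° head limit, and the turn rate are made-up placeholders for illustration, not Wiimote or Max values):

```python
def camera_turn(joy_x, z_held, head_angle, max_head=60.0, body_rate=2.0):
    """Map a joystick pan to camera motion.
    Z held: the pan turns the head, clamped to +/- max_head degrees.
    Z released: the pan drives continuous body rotation instead,
    returning the per-step rotation delta in degrees."""
    if z_held:
        head_angle = max(-max_head, min(max_head, head_angle + joy_x * body_rate))
        body_delta = 0.0
    else:
        body_delta = joy_x * body_rate
    return head_angle, body_delta

# Head turn hits its constraint; body rotation is unconstrained
print(camera_turn(1.0, True, 59.5))
print(camera_turn(1.0, False, 0.0))
```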
My only other question for now has to do with collision detection: how are collisions detected and handled in Jitter in Max 6? I'm working with .b3d models at the moment.
Anyone know of relevant threads and/or tutorials? Thanks!
For collision detection you can use the jit.phys family of objects.
jit.phys.world @dynamics 0 will make the physics engine perform only collisions.
Take a look, and post any specific questions to the forum for more help.
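jit.phys does the real work inside Max; just to illustrate what a collision-only pass boils down to, here is a generic axis-aligned bounding-box overlap test (an illustrative technique, not the jit.phys API):

```python
def aabb_overlap(a, b):
    """Axis-aligned bounding-box test. Each box is a (min_xyz, max_xyz)
    pair with min/max given as (x, y, z) tuples. Returns True when the
    boxes intersect on all three axes simultaneously."""
    amin, amax = a
    bmin, bmax = b
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

print(aabb_overlap(((0, 0, 0), (1, 1, 1)), ((0.5, 0.5, 0.5), (2, 2, 2))))  # overlapping
print(aabb_overlap(((0, 0, 0), (1, 1, 1)), ((2, 2, 2), (3, 3, 3))))        # separated
```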
I wanted to explore what you were originally after, and came up with this little patch, which works out the difference between the shoulders to create a rotation.
If you're carrying on with the nunchuk joystick, though, that's cool ;)
----------begin_max5_patcher---------- 1400.3oc0ZssbhaCF9Z3oPim8RfUR9bmocZeNZ5vHvBPIFYpsXSxty9tWcvN 3MwFrEFS5jIFjvV96+S+ms+wzINqxdgV3.9MveClL4GSmLQOkZhIkim3rm7x 5TRg9zb1jlwOt2Yl4mjeM6nHkJz+HtbVyThWOPMqr5hHBmY.mUD9VGv+Tddr D8Rls5w49AUK4lLtfS1quTm+JmQRq9kCDw5cL91k4z0ByJ6EFu.NCf8TG8g5 uKO91c3.IWtTBZ9RJmrJUunvZ2nB120ygvKfmDIFuRhPp494zopCy5HCsmVT P1RqPsf9hFrN4YBhbvqeG.AeAAfMygn13vloMnUzluKRSUg5OhCTGQQ03stv MXK3FN8YIr+.07HSrXa5h0R7mS5otk7ZWtMcoZsNlSUJYMxTdQ1wTXiBVbn5 iP+OngcqzhVcTHx38TGoUyKOuyHitnRsg3E9ZaoxiusPWsvb9s8hmdUc48ea eOQjydo8sbeq1xQn.EC3FXLKfQZxA01dt6vsmeAqiiqn6IG.+4FVJEbfvyj9 1HKUSKYhCKNH26KuzTFmtN6HWTmBGX6I6bXi07EvE6q0xfZCJ2fdRtAVPtDg H+HqBZpQ5oY7MLNSPGBuwdgWkBmgShg1ouUSqY4yrDwNMQIWlgSQ7H+.Y8Sf Mp+ZlubqjxbZAkKi2wx3028Mgn80VVvxCuIlsm1v69RibO1N8QWMLhzHJHdz bue0L70QVHqHK26DYIx1tMsuVnLdyxta34TGB0YRhipjsdjO4sRm3IkybQtT .o4svA9s5lZlk+2HyYW7zRN0XzigQVo1LvIa9Limj8LXSV1pdl2gNEq1XHrk IY5qC.f7LlW+O0vBGdYQzTn1mDCq+UdCyyjd0Fprrw9Wy1uulaB7u21GBvJf rLrBpkVFlqsQ9w6Z3GSXG+v6cL5xzwyo7DZ9v6Bw8pbgXRuNd7Ho151w7nAo AGX6xTwCWuAGlrNumM3nXMQVy1bjOTkCLXdLTlm+PvOnX63GerleLc2.gsSe wlxvVmseurxfO1cLoQAE.mifvG3OvkjzuCjU3tmj9UvFxZI7klZ4OSxSdf2X YtdMxlvFnL6BRGDdpSiHiFU.9dZfgjadgn3.nWOJ6umZW1UjuaTnNXuQ6xvU t36m02b.ZQOM1Zu1Ij60XwghBKYC+6eT9MVF35LjC1REFiGZiWaKImAsW8rs 6DCh+Y6heg8hq4edDie0p6YEg.J1kcLMosxPgmu4OtXSCUao6O0HMKioAwm5 NQTznkATaTVJcigwRRo4Ovc5Z7I6BOgLZLkRO578P71mjLaC3KaPf+PdDCD6 nbflNnoETvfYbYY+HJc1X7DiFwBttQO6T7UQCAHKd1oRhfsWJJpoVbgGnpjB pNWYxcWVySC.ctLu6ASqEB07+JEWjcLecEaUEgFbRNRnEBFW6Mp9IE+KmzNV RBkWWDSXEJoVSyM2C8Nim3NfGUgN.zmG7ndxxmke1yRNjw3hxMIOodjJpser oWFv2Fc5FMXRfpSj.7Ej.0SDnORfKRi4HCxciqFbqvu6kvOpm32TahA+t2V7 G1AMHsPNNVXp2djtgmwwB6cFOsr+FMZ7iWW3Guwa+xK7SFd75BdBFO7f6f+M 73AGTGfCZ7vipOx0rjagdb6m6y.no0NPSIWUu8Gw0xHZfk.30JACHd76Bd7F O7zEGD3wyAAtKYLMdIT1E3z2rehzILn5roJ8mZitEp+cIA8dZ+VA4HW0Gkoi pGcCDftUhQvER3nkTnQ5tm5U9RHnFcKj.bG7gpEy9HAPWSo0ZmmdlWWR8n6k DLh470EMBaKxyTSL4vguQyKJWSMRjUW+Xl9MYKX1TSg4lg5UzIm9MV046MUs Z+b5+ADU.QZD -----------end_max5_patcher-----------
Thanks Chris! Actually, I have to experiment a lot before I decide on the mapping scheme I'm going to dump serious time into, so this really helps! I've just been working on setting up the basic hardware for the experiment, which it turns out is not so basic.
I'm going with a three-monitor setup of equal-sized screens for peripheral vision, because I haven't yet seen an HMD headset on the market that I feel would be worth the money, though 2012 might reveal the first half-decent HMDs.
The point is to thoroughly test the gestural/surround-sound framework I'm working on before releasing it (it'll be open source for Mac and Windows). It's designed for interacting gesturally with sound objects in 2D/3D space, with or without graphics representing those objects in 2D/3D space as well. I've explored its applications as a DMI this past year; now I'd like to explore the possibilities for gaming. I'm hoping to model an interactive environment in Jitter that I can later rebuild in Panda3D.
Anyway, thinking about it a bit more, I will probably try to use head tracking for peripheral panning and the joystick for full-body rotation. It's a shame that Sony is the only manufacturer whose hardware could do all of this by itself, yet they make it so hard to develop with their stuff. Right now I'm finding the Kinect and Wii hardware together to be a great combination.
So that code will be useful – thanks!