Creating rotating geometry from euler angles?
I recently got an X-IMU motion sensor (http://www.x-io.co.uk), and am setting it up to use on Mac (it comes with a Windows GUI for control/output).
Part of what I want to do is be able to visually see the state of the sensor in Max. Like this:
I’m assuming it’s possible, but how difficult is that to pull off in Jitter? In terms of the actual rotating shape, ideally with the same graphic overlays.
(my math isn’t great… nor is my Jitter… hence my asking how hard this is to do)
Sure, can be done. I do something similar with my accelerometers. You mention euler angles. Is that the output of your sensors?
It can spit out raw/calibrated data for each axis of each sensor (16-bit 3-axis gyro, 12-bit 3-axis accel, 12-bit 3-axis mag), as well as Euler angles (and I believe quaternions too, if those are different from Euler).
The sensor basically does all the fusion algorithm stuff onboard, so it can spit out absolute angles on its own.
Looks like the angles coming in (from the Euler output) are -180 to 180, -90 to 90, and -180 to 180.
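Those ranges are what you'd expect from the usual aerospace (ZYX) Euler convention: the middle angle (pitch) comes out of an asin(), so it's confined to ±90°, while roll and yaw come from atan2() and span ±180°. A rough Python sketch of that decomposition from a quaternion, under that convention (your sensor's axis labels may differ):

```python
import math

def quat_to_euler(w, x, y, z):
    """Unit quaternion -> aerospace (ZYX) Euler angles in degrees.
    Roll/yaw use atan2 (range +/-180), pitch uses asin (range +/-90)."""
    roll  = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw   = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

# 90 degree rotation about the vertical (z) axis:
s = math.sin(math.radians(45))
c = math.cos(math.radians(45))
print(quat_to_euler(c, 0, 0, s))  # roll and pitch stay 0, yaw is 90
```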
I’ve mapped it to a multislider-looking thing, and for the most part the angles kind of make sense given the way it’s physically moving, but it’s obviously been flattened to a 2D display.
You can use both the euler and quat rotation pretty much directly on jitter opengl (3d) objects. Convert the Euler angles to degrees and use them on the jit.gl objects’ ‘rotatexyz’ attribute. Or take the quat rotation, convert it to axis-angle format with [jit.quat2axis] and use it on the ‘rotate’ attribute. The latter should save you from the singularities that occur with Euler angles. With this you should be able to visualize your sensor rotation in 3D quite easily.
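The conversion [jit.quat2axis] performs is simple enough to sketch in a few lines. Here's a hedged Python version, assuming unit quaternions and the ‘angle x y z’ format (angle in degrees) that the ‘rotate’ attribute takes:

```python
import math

def quat_to_axis_angle(w, x, y, z):
    """Unit quaternion (w, x, y, z) -> (angle_degrees, axis_x, axis_y, axis_z),
    i.e. the 'rotate' format used by jitter opengl objects."""
    angle = 2 * math.acos(max(-1.0, min(1.0, w)))
    s = math.sin(angle / 2)
    if s < 1e-9:            # (near-)identity rotation: axis is arbitrary
        return (0.0, 0.0, 0.0, 1.0)
    return (math.degrees(angle), x / s, y / s, z / s)

# 90 degree rotation about z:
s45 = math.sin(math.radians(45))
print(quat_to_axis_angle(math.cos(math.radians(45)), 0, 0, s45))
```

In the patch you'd just prepend ‘rotate’ to that list and send it to the jit.gl object.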
Btw, I thought ob3d objects could take quat rotations directly, but I don’t find that in the reference right now.
Ah! See, I’m not crazy… It’s the [jit.anim.node] object that has a quat rotation attribute. So you could link your jit.gl shape to a node and input the quat data there.
Hmm. A bit over my head for the time being.
I’ve bookmarked all the objects to have a more thorough look though.
How about this?
----------begin_max5_patcher---------- 759.3oc0X9saaBCEF+ZxSgEJWlEAl+krqxdIxMSUQFvi3JvlZLsospu6ybLz xVRSfJVTlpDDaLG+4e197Y5qyrriEGnU1nui9Ixx50YVVPUMUX0V1xtfbHIm TAMyNQTTP4J6ElmonGTP8a2hX7xZE5gZhBIjHZcNUhjBEQwDbzdpjh1ts68J Ipj8Ld1NIMQY5+.G7RmEHb.b0yGt4rzAcW66vqKX7bpBDhaakrTn6Ew2+s0c A+WBthSJnvS9gjQxs+HDhZUWLb5ZuLKFJuzoWDpXu.QvUKqlZea1rlKKFHnJ nUUjL5QfB3ybWzbLZtGZt+YHhWHPjUM5B453B2V8YDAeLQ7GKQ5nZbVhHWHM xvz88u7t.Lup54Rpoo11u+nI.gb5S5QwQD7dlZYV9x8DdZN8L3a8JPtXP+QP Ar6HVOg8FK9veJVV7onw6e.Zxjrzp8jRJRwxUnMlemTGSQafIVjKxQ+2lRQ9 y6JDoTcEtnMDcx.j6kWQ5F.278bFMTCmLlpGq6JHJI6v0jtDNqXIWSrAPIWe yF2wSof+KW48DimJdpaMmtazotbPgNNmiUl8kNqATE4OZT4imLVES3YSNuTh rryllpk.Ac9ciz0Cu9iXK0idEUtixIw4z9VbmNM+QDfoc1Q2McqJdPqGoPOf t33O.7VhBO+7+I73vQeUStSO+eE1qnPwHpjTc4UEFq+Puw6cEN46JLJ95YhI o7T8IHMYSRzi.IALsVMDeJi0eX6YIWOd94eamUgnTxZ1YHf+pdmcDaLjFWdk tf2zUuex0KSk1iNhu3YGG3lxI8DkWDaQQd8vla3nwVzQXC9DH5gme4FkcPar yY7+9CAA7zT+eBzJQsLoqGZGxnOjVJsRw3vG80qMA+Qa1yRSo799SorpFKKf imdhcnpw+lRM3aJ0DL.0Dd8Xi2.jC1+5omlgt6Mldtk3yP1niuhqeVOD8D8E 0iIsHor7QprpMlfTzNJ2a9WiDt.Jx3lhPDskzGYcsOZVSzda1uA4m22W -----------end_max5_patcher-----------
As dtr says, using quaternions is absolutely advisable to avoid strange things happening to your rotations, so I would focus on them. It’s actually quite rewarding (but a bit difficult) trying to understand what quaternions actually are. These articles helped me a lot:
Be sure to read about quat multiplication, as you are probably going to need this…
If you prepend ‘rotate’ to data in axis-angle format, you can connect it to all jitter opengl objects (so without jit.anim.node).
I would then suggest putting all your sensor reading and data parsing in a separate subpatcher or abstraction. At the end of the cycle there, you would convert your quat data to axis-angle using [jit.quat2axis], prepend ‘rotate’ to it, and feed this to an outlet. This way it’s just a matter of loading the abstraction/subpatcher and connecting it to the desired objects in your future patchers…
I’ve looked at some of the math stuff a bit, but as you’ve mentioned, it can be quite complex.
Thankfully the sensor I got does all the calculations/heavy lifting; the sensor fusion happens on the onboard CPU, so it spits out the crunched numbers.
Thanks for that dtr, I’ll plug the sensor to that tomorrow and give it a whirl.