Creating rotating geometry from euler angles?

    Nov 30 2012 | 8:38 am
    I recently got an X-IMU motion sensor, and am setting it up to use on a Mac (it comes with a Windows GUI for control/output).
    Part of what I want to do is be able to visually see the state of the sensor in Max. Like this:
    I'm assuming it's possible, but how difficult is that to pull off in Jitter? In terms of the actual rotating shape, ideally with the same graphic overlays.
    (my math isn't great... nor is my Jitter... which is why I'm asking how hard this is to do)

    • Nov 30 2012 | 8:42 am
      Sure, can be done. I do something similar with my accelerometers. You mention euler angles. Is that the output of your sensors?
    • Nov 30 2012 | 8:54 am
      It can spit out raw/calibrated data for each axis of each sensor (16-bit 3-axis gyro, 12-bit 3-axis accel, 12-bit 3-axis mag) as well as Euler angles (and I believe quaternions too, if that's different from Euler).
      The sensor basically does all the fusion algorithm stuff onboard, so it can spit out absolute angles on its own.
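      For what it's worth, Euler angles and quaternions are two ways of describing the same orientation: Euler angles are three sequential axis rotations, while a quaternion is a single 4-component value. A rough illustration of the relationship, in Python rather than Max (the ZYX yaw-pitch-roll rotation order here is an assumption; check the sensor docs for its actual convention):

```python
import math

def euler_to_quat(roll, pitch, yaw):
    """Convert Euler angles (radians, assumed ZYX yaw-pitch-roll order)
    to a unit quaternion (w, x, y, z)."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return (w, x, y, z)
```

      Zero rotation gives the identity quaternion (1, 0, 0, 0), and a 180-degree roll gives (0, 1, 0, 0).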
    • Dec 01 2012 | 4:19 pm
      Looks like the angles coming in (from the Euler output) are -180 to 180, -90 to 90, and -180 to 180.
      I've mapped it to a multislider-looking thing, and for the most part the angles roughly match the way it's physically moving, but it's obviously been flattened to a 2D display.
    • Dec 01 2012 | 4:42 pm
      You can use both the Euler and quat rotation pretty much directly on Jitter OpenGL (3D) objects. Convert the Euler angles to degrees and use them on the objects' 'rotatexyz' attribute. Or take the quat rotation, convert it to axis-angle format with [jit.quat2axis] and use it on the 'rotate' attribute. The latter should save you from the singularities (gimbal lock) that occur with Euler angles. With this you should be able to visualize your sensor rotation in 3D quite easily.
      Btw, I thought ob3d objects could take quat rotations directly, but I can't find that in the reference right now.
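      For anyone curious what the quat-to-axis-angle step actually does, here's a sketch of the standard formula in Python (just to illustrate the math; this is not Cycling '74's actual [jit.quat2axis] code):

```python
import math

def quat_to_axis_angle(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to axis-angle form:
    (angle_in_degrees, ax, ay, az), the format the 'rotate'
    attribute expects."""
    # Clamp w to guard against floating-point drift outside [-1, 1].
    w = max(-1.0, min(1.0, w))
    angle = 2.0 * math.acos(w)
    s = math.sqrt(max(0.0, 1.0 - w * w))
    if s < 1e-9:
        # Angle is ~0, so the axis is arbitrary; pick the x axis.
        return (0.0, 1.0, 0.0, 0.0)
    return (math.degrees(angle), x / s, y / s, z / s)
```

      The four output values map straight onto 'rotate' (angle followed by the rotation axis).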
    • Dec 01 2012 | 4:45 pm
      Ah! See, I'm not crazy... It's the [jit.anim.node] object that has a quat rotation attribute. So you could link your shape to a node and input the quat data there.
    • Dec 01 2012 | 5:15 pm
      Hmm. A bit over my head for the time being.
      I've bookmarked all the objects to have a more thorough look though.
    • Dec 02 2012 | 11:28 am
      How about this?
    • Dec 02 2012 | 12:04 pm
      As dtr says, using quaternions is absolutely advisable to avoid strange things happening to your rotations! So I would focus on them. It's actually quite rewarding (but a bit difficult) trying to understand what quaternions actually are. These articles helped me a lot: (basic understanding)
      Be sure to read about quat multiplication, as you are probably going to need this...
      If you prepend 'rotate' to data in axis-angle format, you can connect it to all Jitter OpenGL objects (so without [jit.anim.node]).
      I would then suggest putting all your sensor reading and data parsing in a separate subpatcher or abstraction. At the end of the cycle there, convert your quat data to axis-angle using [jit.quat2axis], prepend 'rotate' to it and feed it to an outlet. This way it's just a matter of loading the abstraction/subpatcher and connecting it to the desired objects in your future patchers...
    • Dec 02 2012 | 6:43 pm
      I've looked at some of the math stuff a bit, but as you've mentioned, it can be quite complex.
      Thankfully the sensor I got does all the calculations/heavy lifting: the sensor fusion happens on the onboard CPU, and it spits out the crunched numbers.
      Thanks for that dtr, I'll plug the sensor to that tomorrow and give it a whirl.