Filtering gestural control data from an M5Stick with the Digital Orchestra Toolbox (in Max MSP... duh)
Hi All
I have been working on gesturally controlled audio and video with my M5Sticks, and I have a question that is way beyond my knowledge of Max or math!
In my patch I have recorded the yaw data in an mtr object so you can see what kind of drift I'm getting from my controllers. I also recorded all the data in other mtr objects below, in case you want to see what the controller generates while it is sitting still on my desk.
I'm using the Digital Orchestra Toolbox to filter my results, and I was hoping an equation could stabilize this data enough to make it usable for gesturally controlled whatever (perhaps with a calibration button to return it to 0)!
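To make the idea concrete outside of Max, here is a rough sketch (in Python, purely for illustration; the function names and the alpha value are my own assumptions, not anything from the toolbox) of the two-part approach I'm imagining: the "calibration button" averages a short at-rest window of yaw readings into a zero offset, and a one-pole low-pass smooths the jitter on top of that. Note this only re-zeroes drift at the moment you press calibrate; slow drift between presses would still creep through a low-pass.

```python
# Hypothetical sketch: re-zero yaw with a calibration offset, then smooth.
# "calibrate" plays the role of the calibration button; "smooth" is the filter.

def calibrate(samples):
    """Average a short window of at-rest yaw readings to get the zero offset."""
    return sum(samples) / len(samples)

def smooth(stream, offset, alpha=0.1):
    """One-pole low-pass: y[n] = y[n-1] + alpha * ((x[n] - offset) - y[n-1])."""
    y = 0.0
    out = []
    for x in stream:
        y += alpha * ((x - offset) - y)
        out.append(y)
    return out

# A yaw stream that has drifted to hover around 12.0 degrees while at rest:
rest = [12.1, 11.9, 12.0, 12.05, 11.95]
offset = calibrate(rest)          # offset comes out to 12.0
stabilized = smooth([12.0, 12.2, 11.8, 12.1], offset)
# stabilized values now jitter near 0 instead of near 12
```

In Max terms I imagine this would be an at-rest average stored on a button press, a subtraction, and then something like the toolbox's smoothing abstractions doing the low-pass part.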
Thanks for any leads and just anything!
M