Max was used in just about every facet of this project. The eMotion Technologies Twist sensor suite runs entirely in Max (the software that ships with the Twist was written in Max). I started from the Twist's native software, customized it, and then wrote my own series of Max patches to process, shape, and map the data so that it interacted properly with my sounds in Ableton Live.

I used the X and Y axes from the Twist's accelerometer, as well as two distinct threshold detections on two other data streams. The X-axis stream was mapped to the dry/wet parameter of an audio-trashing plugin in Ableton Live, which I customized to my liking. The Y-axis stream was mapped to the dry/wet parameter of a granulation plugin that I also customized within Ableton.

To add another method of gestural articulation, I used threshold detection on two further data streams, which added two more gestures to my palette. These two gestures also let me incorporate triggers, in the form of note-on/off messages, alongside the continuous control messages already being sent by the first two data streams.

The up/down gesture acted as one triggering mechanism: it stepped through a predetermined sequence that determined which tracks were heard, and when. At the onset of each sequenced trigger, a custom-designed sound played to highlight the trigger, and the panning control shifted to the newest track so it was more apparent which track had just been turned on. After the sequenced triggering, the left/right gesture randomly set each track's state to on, off, or partly on, with the panning controls shifting to the master track.

If you have any other questions about my data mapping and use of Max, please feel free to contact me!
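The two mapping ideas above (continuous axis-to-dry/wet control, and threshold detection that produces note-on/off triggers) can be sketched outside of Max as well. The following Python sketch is purely illustrative, not the actual patches: the function and class names are my own, the value ranges are assumed, and a hysteresis band is used so sensor jitter near the threshold doesn't re-trigger notes.

```python
# Hypothetical sketch of the two mapping strategies described above.
# Assumes raw axis readings in [-1.0, 1.0]; the real Twist data range
# and the Max patch internals may differ.

def axis_to_cc(value, lo=-1.0, hi=1.0):
    """Map a raw axis reading in [lo, hi] to a MIDI CC value (0-127),
    e.g. for driving a plugin's dry/wet parameter."""
    clamped = max(lo, min(hi, value))
    return round((clamped - lo) / (hi - lo) * 127)

class ThresholdTrigger:
    """Turn a continuous data stream into note-on/off triggers.

    Emits 'note_on' when the stream rises above on_level and
    'note_off' when it falls back below off_level; the gap between
    the two levels (hysteresis) prevents chatter around a single
    threshold."""

    def __init__(self, on_level=0.7, off_level=0.3):
        self.on_level = on_level
        self.off_level = off_level
        self.active = False

    def update(self, value):
        if not self.active and value > self.on_level:
            self.active = True
            return "note_on"
        if self.active and value < self.off_level:
            self.active = False
            return "note_off"
        return None  # no state change


# Example: one gesture sweep crossing up through the threshold and back.
trigger = ThresholdTrigger()
events = [trigger.update(v) for v in (0.1, 0.9, 0.5, 0.1)]
# events -> [None, 'note_on', None, 'note_off']
```

The same pattern scales to the two trigger gestures in the piece: one `ThresholdTrigger` per data stream, with each emitted event routed to its own action (sequenced track changes for one, randomized track states for the other).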