Create musical patterns using 4 layers of gestures.
This MIDI Max for Live device helps you create musical patterns with mouse gestures. When you drag the mouse on the interface, the particles’ motion triggers notes, and your gesture is looped right away, in sync with the current song’s tempo. This lets you explore musical ideas in a way that is more instrumental than programmatic, picking up expressivity from human gestures.
The idea draws inspiration from pioneers like Len Lye and, more recently, Scott Snibbe’s Motion Phone, in the way sequences are created from mouse gestures.
You can re-explore your Live instruments very intuitively, coming up with original polyrhythmic patterns, or push them to extreme triggering speeds so that they sound grittier and more granular, depending on the max-speed value.
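One way to read the max-speed value is as a cap on how many notes per second the particles may trigger. Here is a minimal sketch of such a rate gate in Java (the language of the device's external); the class and method names (`NoteGate`, `tryTrigger`) are illustrative assumptions, not the device's actual code.

```java
// Hypothetical sketch: gating particle-driven note triggers by a
// max-speed value, read as a maximum number of notes per second.
public class NoteGate {
    private final long minIntervalMs; // minimum time between two triggers
    private long lastTriggerMs;

    public NoteGate(double maxNotesPerSecond) {
        this.minIntervalMs = (long) (1000.0 / maxNotesPerSecond);
        this.lastTriggerMs = -minIntervalMs; // allow the very first trigger
    }

    /** Returns true if a note may fire at time nowMs, and records it. */
    public boolean tryTrigger(long nowMs) {
        if (nowMs - lastTriggerMs >= minIntervalMs) {
            lastTriggerMs = nowMs;
            return true;
        }
        return false; // too soon: this collision is silently dropped
    }
}
```

With a high max-speed, almost every particle collision fires a note and the output turns granular; with a low one, the same gesture yields a sparser pattern.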
Although I have tested this a lot over the past weeks, I consider it a first beta. Use it at your own risk, and check the project’s page for downloads and updates.
How did this project use Max?
Max is used to create the user interface for this Live device. The drawing primitives of the LCD object made it easy to come up with the particle interface quickly. The physics are updated within a custom Java external that uses separate threads for the graphical commands and for the more time-critical note-event triggering.
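The two-thread split described above could be sketched as follows: drawing commands are queued for a low-priority thread, so the time-critical note path is never blocked by graphics. This is a minimal Java sketch under stated assumptions; the class names and the string-based draw commands are illustrative, not the external's actual API.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical sketch of separating graphics from note triggering:
// draw commands go into a queue drained by a low-priority thread,
// while note events are dispatched directly by the caller.
public class TwoThreadDispatch {
    private final BlockingQueue<String> drawQueue = new LinkedBlockingQueue<>();
    private final Thread drawThread;

    public TwoThreadDispatch() {
        drawThread = new Thread(() -> {
            try {
                while (true) {
                    String cmd = drawQueue.take(); // blocks until a command arrives
                    // ... forward cmd to the LCD object here ...
                }
            } catch (InterruptedException e) {
                // shutdown requested
            }
        }, "drawing");
        drawThread.setPriority(Thread.MIN_PRIORITY); // graphics can wait
        drawThread.setDaemon(true);
        drawThread.start();
    }

    /** Time-critical path: emit the note event right away. */
    public void noteOn(int pitch, int velocity) {
        // ... send the MIDI note out here, on the caller's thread ...
    }

    /** Non-critical path: enqueue and return immediately. */
    public boolean draw(String lcdCommand) {
        return drawQueue.offer(lcdCommand); // never blocks the physics loop
    }
}
```

The design choice is that jitter in the drawing thread only delays pixels, while note timing stays tied to the physics update.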