I am building a gesture-controlled interface in Max/MSP using the Kinect, and I need somebody to help me find an interesting approach to combining the Kinect with music. This is for a bachelor thesis.
Here are some of my ideas:
- build a prototype that makes live performance more engaging by using gesture recognition to trigger effects on sounds/music
- build a prototype that uses body/hand movements to control effects in a sequencer, instead of using the mouse or the knobs on a synth/mixer/MIDI module
- any more ideas? (KEYWORDS: Kinect + sound + Max/MSP)
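For the second idea, one common approach (not the only one) is to run the Kinect skeleton tracking outside of Max and stream joint positions into Max/MSP as OSC messages over UDP, where a `[udpreceive]` object picks them up. Below is a minimal, hedged sketch of that bridge in Python using only the standard library: `osc_message` and `hand_to_cc` are hypothetical helper names I made up for illustration, and the normalized hand height would in practice come from whatever Kinect SDK or OpenNI binding you end up using.

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying one float argument.
    OSC strings are null-terminated and padded to a multiple of 4 bytes;
    floats are big-endian 32-bit. (Hypothetical helper, not a full OSC
    implementation -- enough for Max's [udpreceive] to decode.)"""
    def pad(b: bytes) -> bytes:
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def hand_to_cc(y: float) -> int:
    """Map a normalized hand height (0.0 = bottom, 1.0 = top) to a
    MIDI-CC-style control value in 0..127, clamped at the edges."""
    return max(0, min(127, round(y * 127)))

if __name__ == "__main__":
    # Send one control value to Max/MSP listening on localhost:7400
    # (port and address pattern are assumptions -- match your patch).
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message("/hand/cc", float(hand_to_cc(0.75))),
                ("127.0.0.1", 7400))
```

In the Max patch, `[udpreceive 7400]` followed by an OSC-unpacking object would route `/hand/cc` to the effect parameter you want to control, so the body replaces the mouse or hardware knob.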