A gestural music interface for controlling MIDI events in Ableton Live. Previous Leap Motion interfaces relied only on the limited set of gestures the Leap Motion recognizes out of the box (pinch, grab, pitch, yaw, roll).
This project instead uses machine learning (support vector machines) to recognize and classify hand gestures in real time and use them to control music. The predictions are accurate and can be mapped to multiple parameters within Max.
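To make the classification step concrete, here is a minimal sketch of the same idea outside of Max: a linear SVM trained by stochastic subgradient descent on the hinge loss (Pegasos-style), classifying hand-pose feature vectors frame by frame. The 4-D feature layout (palm x/y/z plus finger spread) and the two gesture classes are illustrative assumptions, not the project's actual feature set; in the patch itself this role is played by the ml.* externals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: two made-up gesture classes, 4-D feature vectors
# (hypothetical layout: normalized palm x/y/z plus average finger spread).
open_hand = rng.normal([0.0, 0.5, 0.0, 0.9], 0.05, size=(50, 4))
fist = rng.normal([0.0, 0.5, 0.0, 0.1], 0.05, size=(50, 4))
X = np.vstack([open_hand, fist])
y = np.array([1] * 50 + [-1] * 50)  # +1 = open hand, -1 = fist

# Fold the bias into the weight vector by appending a constant-1 feature.
X_aug = np.hstack([X, np.ones((len(X), 1))])

# Train a linear SVM with Pegasos-style stochastic subgradient descent
# on the hinge loss: update only when a sample violates the margin.
w = np.zeros(X_aug.shape[1])
lam = 0.01  # regularization strength
for t in range(1, 5001):
    i = rng.integers(len(X_aug))
    eta = 1.0 / (lam * t)  # decaying step size
    if y[i] * (X_aug[i] @ w) < 1:
        w = (1 - eta * lam) * w + eta * y[i] * X_aug[i]
    else:
        w = (1 - eta * lam) * w

def classify_frame(features):
    """Classify one incoming frame of hand features; in the real patch the
    resulting label would be mapped to a MIDI parameter in Ableton."""
    x = np.append(features, 1.0)  # add the bias feature
    return 1 if x @ w >= 0 else -1

print(classify_frame([0.0, 0.5, 0.0, 0.9]))
```

In the actual project this training and prediction loop runs inside Max via ml-lib, which handles the feature streaming and model state; the sketch above only shows the underlying classifier logic.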
Max was a supportive environment to work in throughout the project, and the ml.* externals from ml-lib are ideal for this kind of machine learning. There were a few 32-bit/64-bit hiccups, mostly because my outdated Max 7 does not work with Ableton Live 10.