The only thing I can suggest is to try compiling aka.leapmotion for the 64-bit architecture. You can find the sources here.
Yes, I have seen Geco and it seems really interesting, especially regarding gesture support.
It outputs MIDI data based on gestures, while I was much more interested in generating continuous control data for devices, and in the possibility of creating complex, subtle and somewhat unpredictable patterns rather than quantized values.
@Gusadel so can you confirm it is working for you in 32-bit Live?
Unfortunately, I don't have the programming skills to dig into akamatsu's code and find a better solution for getting the external to work... I've tried to build aka.leapmotion against the latest SDKs, but I get this error in the Max window:
aka.leapmotion: unable to load extern, object has no 'main' function
Of course I'll share it, I just asked if you cared. BTW, it seems the Leap Motion has a hard time tracking the Y (vertical) axis. So I changed the minimum input scale value for all the Y coordinates from 10 to 30. That seemed to ease things a bit in the lower part of the range, but this axis is still the least precise one. Feel free to play with the values (Leap Motion drivers V2.05).
Thanks so much for this, it's terrific stuff! The only problem that I have is that I'm unable to record automation data for any parameter mapped from LeapModulation's "map" buttons. So for example, suppose I map left-palm-X to the volume of a softsynth (added to the same track, immediately to the left of LeapModulation). Then I can play notes on the softsynth, changing the volume throughout by moving my left hand. But if I try to record (whether I'm recording in a session clip or directly to the arrangement), the notes record but the volume changes don't.
Is there any way to get around this that you know of? Is it a bug, or a known limitation with some part of the toolchain?