hey Bas - snap. I got mine a couple of weeks ago, played with it for an hour, was similarly underwhelmed, and it hasn't been out of the box since... I'll play with it more when I get some time, but I don't see it as accurate enough for doing anything musical unless you're happy having random elements introduced as well (which some people may be)
FWIW, I've been using the Leap as my main performance interface for as long as it's been around, mostly with a customized version of aka.leapmotion. I've been looking at leapformax, hoping to get the v2 beta data out of it (much better tracking), so far without complete success, but I can tell you that it's giving me a faster update rate than aka.leapmotion, somewhat to my surprise. Frames in aka.leapmotion come in at around 80-90 fps; in leapformax I'm getting 95-105. It's a bit fiddly to install, and you have to go to the terminal to run it, but the results are good, and it's only about a 4 on the geekitude scale. I've also found you can throw the .json data into a dict, which speeds things up a bit as well -- I can post the code if anyone is interested.
It doesn't cover all possible data, or gestures, but you should be able to use it as a model to get what you want. The tricky part is the changing size of the hand and finger arrays, but that's handled. You may just need to alter the [dict.unpack] objects...
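Along those lines, here's a minimal JavaScript sketch of the JSON-to-dict idea (my own illustration, not the actual posted code): parse a Leap-style frame and flatten the variable-length hands/pointables arrays into flat keys, roughly what you'd hand to a Max [dict]. The field names follow the Leap websocket JSON format; the flattening scheme is an assumption.

```javascript
// Parse a (simplified) Leap-style JSON frame. hands/pointables can be
// any length from frame to frame, which is the tricky part mentioned above.
const frame = JSON.parse(`{
  "id": 1234,
  "hands": [ { "id": 7, "palmPosition": [12.3, 180.0, -4.5] } ],
  "pointables": [
    { "id": 70, "handId": 7, "tipPosition": [30.1, 210.2, -10.0] },
    { "id": 71, "handId": 7, "tipPosition": [45.6, 205.8, -12.3] }
  ]
}`);

// Flatten into indexed keys so a downstream [dict.unpack] always has
// predictable names to look up, however many hands/fingers show up.
const dict = {};
frame.hands.forEach((hand, h) => {
  dict[`hand.${h}.palmPosition`] = hand.palmPosition;
});
frame.pointables.forEach((p, i) => {
  dict[`finger.${i}.tipPosition`] = p.tipPosition;
});
console.log(dict);
```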
In the spirit of Leapformax's release, here's a compiled Processing patch I'm calling LeapOSC (JeremyLeapOSC). It's just Leap Motion for Processing's basic example mashed up with oscP5, so that it sends hand and finger orientation and position out over OSC on port 5001 (more info on Leap for Processing here: https://github.com/voidplus/leap-motion-processing)
- Complete revolution, first of all - absolutely NO MORE MEMORY ISSUES.
- The tcpigolo Max patches are now independent from the Leap Motion; they can be used for ANY JSON OBJECT.
- dict approach - a bit slower but stable - great for unpredictable JSON objects where you cannot predict which fields you will be interested in reading; also very good for integrating with preexisting dict-based systems.
- props approach - very fast - select exactly the fields you want and have them output as an array, then you can do whatever you want with it... for example, unpack it.
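To illustrate the props idea in plain JavaScript (a sketch of my own, not the actual dataparser code): walk dotted paths into the parsed JSON and emit the selected values as one flat array, ready to unpack downstream. The helper name and path syntax are assumptions.

```javascript
// Resolve each dotted path (e.g. "hands.0.palmPosition") against a parsed
// JSON object and return the values in order as one array.
function props(obj, paths) {
  return paths.map(path =>
    path.split('.').reduce((o, key) => (o == null ? undefined : o[key]), obj)
  );
}

const frame = { hands: [{ palmPosition: [1, 2, 3], sphereRadius: 55 }] };
const out = props(frame, ['hands.0.palmPosition', 'hands.0.sphereRadius']);
console.log(out); // → [[1, 2, 3], 55]
```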
I'm not having any trouble with the dict patch. You might want to confirm that you are getting data out of [sadam.Udpreceive] with a message box or something.
There is an issue with the props patch -- if there is no hand visible when you start banging it, or when you take your hand away, you get that error. I've been in touch with the developer about it. It's in the dataparser-props js -- it needs some checking added for when one of the props comes up null.
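For the curious, the fix amounts to something like this (a JavaScript sketch of the kind of null check needed, not the actual dataparser-props code): when no hand is in view, the hands array is empty, so any per-hand lookup comes up null or undefined; substituting a fallback value avoids the error on each bang.

```javascript
// Resolve dotted paths against a parsed JSON object, but replace any
// null/undefined result with a fallback instead of letting it error out.
function props(obj, paths, fallback = 0) {
  return paths.map(path => {
    const v = path
      .split('.')
      .reduce((o, key) => (o == null ? undefined : o[key]), obj);
    return v == null ? fallback : v; // null/undefined -> fallback
  });
}

const emptyFrame = { hands: [] }; // hand just left the field of view
console.log(props(emptyFrame, ['hands.0.palmPosition'])); // [0], no error
```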
v0.4 worked for me. When running tcpigolo-ascii.js in the terminal I get "Error: Cannot find module 'websocket'". This didn't happen with tcpigolo.js, but I deleted it, so I'm not sure whether it would still work. Everything else was the same, though; I have since run the installer for websockets again but still get the same error.
also, you should try having a look at the renewed setup pages, if you haven't already.
If you're using Windows, I highly recommend the auto setup script.
If you're using some other OS, I strongly suggest using npm now.
Whether portable or installed, the "npm" tool is available on every OS.
Try uninstalling the websockets module and reinstalling it using "npm".
EDIT: watch out for the "s" in "websockets".
EDIT: there are significant improvements in the patches alone. If you need tcpigolo-ascii.js, it is UNCHANGED from the v0.4 PREVIEW. You can use the v0.5 patches with the tcpigolo.js coming from the v0.4 PREVIEW.
Hi there, I finally installed leapformax and it works really nicely. I'm using the ascii file, but I don't know how to retrieve other movements such as swipe, circle, and tap. I tried the logical attempts, like "props hands.swipeStrength", with no success. Is there a precise list of these somewhere?