How do I get palm positions from aka.leapmotion?
I need to get the palm positions of both hands in aka.leapmotion at the same time.
But when I put both hands above the Leap Motion, aka.leapmotion outputs the position of only one hand.
It seems that aka.leapmotion does not output a separate position list for each hand at the same time.
How can I get the positions of both hands simultaneously?
I wrote an external to do just that. Create two instances of the external and throw a 0 or 1 into the right inlet depending on whether you're using your left or right hand for that instance (0 = left). The zip file also includes a M4L device that I made where you can map your palm position to any device parameter. Here it is. Let me know what you think.
D'oh! Just realized that I didn't include the help file with the object. It's fairly straightforward: just set the hand flag on the right inlet (0 or 1) and bang the left inlet to get the X, Y, and Z coordinates for the specific hand. Here's the help patch.
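As a rough illustration of the two-inlet interface described above (this is a hypothetical Python mock, not the external's actual code, and the palm coordinates are made up):

```python
class PalmTracker:
    """Mimics the external's interface: the right inlet sets the hand flag
    (0 = left, 1 = right); banging the left inlet outputs the (x, y, z)
    palm position for that hand."""

    def __init__(self):
        self.hand_flag = 0      # defaults to the left hand
        self.frame = {}         # latest palm positions keyed by hand flag

    def set_hand(self, flag):
        # right inlet: choose which hand this instance tracks
        assert flag in (0, 1)
        self.hand_flag = flag

    def update(self, left_palm=None, right_palm=None):
        # stand-in for a new Leap frame arriving
        if left_palm is not None:
            self.frame[0] = left_palm
        if right_palm is not None:
            self.frame[1] = right_palm

    def bang(self):
        # left inlet: output the (x, y, z) position, or None if that
        # hand is not currently visible
        return self.frame.get(self.hand_flag)

tracker = PalmTracker()
tracker.update(left_palm=(-80.0, 150.0, 10.0), right_palm=(75.0, 160.0, 5.0))
tracker.set_hand(1)
print(tracker.bang())   # right palm position: (75.0, 160.0, 5.0)
```

In the real patch you would simply create two instances, one with flag 0 and one with flag 1, to get both palms at once.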
----------begin_max5_patcher---------- 786.3ocyXtrbaBCEFdM9oPCqc8.hqN6Z61tnSW0KSlNxfhQofDUHRbal7tWc AbvI3XLgPyBCVRGI8e9zQGI66VXYugsCWYCt.7Cfk0cKrrzUopvporkcAZWR NpRalcBqn.SE1KMsIv6D55+daMz5BBMGKzl61TYIRjjQna+IGmHLSmejyJmk .3Z8q.SA4SvkM84JFUTQ9KVONPYy6GeVsncBZqkjpUAay0uCFY2YDnnB8HX+ dNAk21RIGWI8BjfvnGnIesLhCUubZd.tT0o6WrP8X4KjSe6b4jW75WINENZN 4EGY3z5WMN8YTdAnjUQTS8EsMmSn3DVMUaC7LIID1kjQ5Bd9SAICdFRNgL4q mcriS7qTri273wvUfOfnaAhLLHGek.ncaffAj5qrVnaHolykcFT1Mjo2HFuy jetNtJj4FpSK3Z.ne7T.P24Aftq.eLiwpvfayHIYfLDMUgOAGk7qIYWUPT3S Yzjrsxc8zxnZIgpG4wTstmyf2CUh3REKv7ehonM43tN4gtdaTIQfKZNK19Sx fc6k.6kpGegrMSJm1IzzUweJwMFSnZa0e1aUWPNw4mjxeCle1IiBMjz8LSFM bPBOAf1HSkzOf7mV.IXa2JE53xV2.H3SAzvIg6yQhdI.bTAB3akc9IIc9sTj bF.1K.fmF.l8ZQti63pi56Ge42YZW92TKDObBzfudRP2Ca5Y4+k5lQiwWtJm Im2wd49f3YeutTvnme2dv3u2qWfwsBdw26cbbs8GC7ljq9ilqvPyd8fn+Wbs 4hxuI45DeW6ijz9ZwpbLprfr6HGse5D2PnICV7Hyb6cRF8nuzax7gQKcW027 8Q+8GZ2SU+gHrhUySZ0S6cF.O3fo3JAgpCr6Zj7jEf6dixHooXZ2fjTRkJtQ KemdWLGrdfCUONyidbFhdlO4DLP4LOqVQS.bJHokL4E5pZSh5rJPssSmD0yM ZeoGlmoCmRsAOg9CluE2CV25WM9ypZN0Zq2HUiIsEpr7FLupYH0BQli+ZFWU LbotHgZJpGQaN9FRq8dKTi18K9mFN2Qt -----------end_max5_patcher-----------
It works well!
Thank you, John:D
This is definitely what I wanted.
The help patch is very helpful to understand your external.
I really want to know how the external works, because I'll extend it later to track positions of other parts besides the palms.
The key is how you classify the left and right hands. In your external, when both hands are above the Leap Motion, a hand flag of 0 detects the left hand and 1 detects the right hand. I'd like to know the main structure your external uses to tell the two hands apart.
Thank you, again.
I'd love to say that I worked out a complicated algorithm to check which hand is being used, but it's actually part of the Leap Motion API. That said, aka.leapmotion is a better external, and the author was kind enough to put his source code online; my stuff is really just an extension of that code that implements a few features I needed from the Leap, such as keeping hands independent and gesture recognition. I'm actually designing a set of externals for the Leap that I'll post as a project whenever I get the chance to finish and debug them. I'll post the source code with the externals as well.
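For context: in later versions of the Leap SDK each Hand object reports its handedness directly (e.g. `Hand::isLeft()` in the C++ API), so no custom classification is needed. With earlier SDKs a common fallback heuristic was simply to sort the visible hands by their palm x-coordinate. A minimal sketch of that fallback, in Python for illustration (not code from the external):

```python
def classify_hands(palms):
    """Given up to two palm positions as (x, y, z) tuples, return a dict
    mapping 0 (left) / 1 (right) to a palm, assuming the hand with the
    smaller x-coordinate is the left one. This is only a heuristic;
    the Leap API (SDK v2+) labels hands for you."""
    ordered = sorted(palms, key=lambda p: p[0])  # sort by x
    if len(ordered) == 2:
        return {0: ordered[0], 1: ordered[1]}
    if len(ordered) == 1:
        # With one hand visible the heuristic is ambiguous;
        # guess from the sign of x (origin is the device center).
        flag = 0 if ordered[0][0] < 0 else 1
        return {flag: ordered[0]}
    return {}

print(classify_hands([(60.0, 140.0, 0.0), (-55.0, 150.0, 5.0)]))
# → {0: (-55.0, 150.0, 5.0), 1: (60.0, 140.0, 0.0)}
```

The API-provided labels are much more robust (they survive crossed hands, for instance), which is why the external relies on them instead.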
That is a nice job, I think.
I really hope to see your externals soon.
I just wanted a simple introduction to how you track both hands at the same time :) because I couldn't manage it with aka.leapmotion alone, without an additional external.
(Actually, I'm not familiar with writing externals.)
Now, your external is very helpful for me though.
I really appreciate it.
And thanks for your kind post, again.
May I ask, what is the best current solution for working with the Leap Motion on Windows?
I haven't written my externals for Windows yet and, last I checked, aka.leapmotion is also OSX-only. So you'll probably have to go the MIDI/OSC route (no pun intended). A quick survey of the Leap Motion App Store suggests your best bets are going to be Geco MIDI and ManosOSC. You'd just run one of those programs and use UDP routing (possibly in combination with the CNMAT OSC externals) or normal MIDI routing to get data from the Leap.
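Inside Max you would receive the data with [udpreceive] plus the CNMAT OSC objects, so you never parse packets by hand. Just to show what travels over the wire, here is a minimal sketch in Python that decodes a simple OSC message containing only float arguments; the `/palm` address is hypothetical (the real address pattern depends on how the sending app is configured):

```python
import struct

def parse_osc(packet):
    """Decode a minimal OSC message with only float32 arguments.
    OSC strings are null-terminated and padded to 4-byte boundaries;
    floats are big-endian 32-bit."""
    def read_padded_string(data, offset):
        end = data.index(b'\x00', offset)
        s = data[offset:end].decode('ascii')
        next_offset = (end + 4) & ~3   # skip padding to 4-byte boundary
        return s, next_offset

    address, offset = read_padded_string(packet, 0)
    typetags, offset = read_padded_string(packet, offset)
    args = []
    for tag in typetags.lstrip(','):
        if tag == 'f':
            args.append(struct.unpack_from('>f', packet, offset)[0])
            offset += 4
    return address, args

# Build a hypothetical /palm message like a sender such as ManosOSC might emit.
msg = (b'/palm\x00\x00\x00'                    # address, padded to 8 bytes
       + b',fff\x00\x00\x00\x00'               # typetags, padded to 8 bytes
       + struct.pack('>fff', 10.0, 150.0, -5.0))
print(parse_osc(msg))   # → ('/palm', [10.0, 150.0, -5.0])
```

In practice the CNMAT `OSC-route` object does this decoding for you; the sketch is only to demystify the format.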
I found a Windows external for the Leap Motion in the Max object database:
It's called "sekd.leapmotion" and was built in Java.
I haven't used it yet, so I can't tell you how it differs from the OSX externals.
You can download it here:
Or you can grab it from the file I attached.
Thanks Kiske4, I'm trying it already. It does work on Windows.
Will this object work in Max 6.1?
It’s very nice
I'm really new to coding and to using the Leap Motion. I can get numbers/positions from my Leap Motion into Max, but how do you get Max to recognize gestures? If I want Max to treat a pinch between my forefinger and thumb as a command, how do I get Max to interpret "my finger and thumb start at these coordinates and end at these other coordinates" as a pinch? Any help is appreciated. :) Thanks!
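One simple approach is not to track start and end coordinates at all, but to compute the distance between the thumb tip and index tip on every frame and compare it against a threshold, with a little hysteresis so the gesture doesn't flicker on and off near the cutoff. A minimal sketch in Python (the millimeter thresholds are illustrative guesses, not values from any SDK; in Max you could build the same logic with [expr] or a [js] object):

```python
import math

class PinchDetector:
    """Reports a pinch once the thumb and index fingertips come within
    `close_mm` of each other, and releases only when they separate past
    `open_mm`. Using two thresholds (hysteresis) prevents the state
    from toggling rapidly when the distance hovers near one cutoff."""

    def __init__(self, close_mm=25.0, open_mm=40.0):
        self.close_mm = close_mm
        self.open_mm = open_mm
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        d = math.dist(thumb_tip, index_tip)  # Euclidean distance (Python 3.8+)
        if not self.pinching and d < self.close_mm:
            self.pinching = True
        elif self.pinching and d > self.open_mm:
            self.pinching = False
        return self.pinching

det = PinchDetector()
print(det.update((0, 0, 0), (100, 0, 0)))   # far apart → False
print(det.update((0, 0, 0), (10, 0, 0)))    # close → True (pinch begins)
print(det.update((0, 0, 0), (30, 0, 0)))    # between thresholds → still True
print(det.update((0, 0, 0), (60, 0, 0)))    # separated → False
```

When the detector transitions from False to True, send your command (a bang, a note, a parameter change). Note that newer Leap SDKs expose pinch detection directly (e.g. a per-hand pinch strength value), so if your external or OSC bridge passes that along, you can threshold it instead of computing distances yourself.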