May 21, 2012 at 5:01pm

Wow – can't wait to get this gesture interface !

#63692
May 21, 2012 at 6:43pm

It really is a great time to be involved in interactive media; so many great tools. And a tiny price tag too.

#229655
May 22, 2012 at 2:04am

Hmmm… Looks awesome. My inherent… uhh… skepticism – which I hope is wrong – kicks in, though…

#229656
May 22, 2012 at 6:35am

Skepticism about…?

That, just as conventional acoustic instruments have yet to be supplanted by new technologies, the computer keyboard and mouse won't be either? That this device will enjoy a brief moment of glory before being subsumed by the games/Kindle industry and then disappearing (RIP TouchCo)?

Just curious, spectro

Brendan

#229657
May 22, 2012 at 7:16am

I think he’s talking about how accurate the demonstration is to what it can actually do.

#229658
May 22, 2012 at 7:28am

What Rodrigo said…

It really looks good. If it can do that kind of tracking and gesture recognition with low latency and fine granularity, I'm sold. But, perhaps begrudgingly to others, my skepticism remains until more evidence (other than an in-house video) is available outlining what one might expect in a practical situation, on a less-than-stellar and not totally optimised system, etc…

#229659
May 22, 2012 at 8:11am

Please make me one with a range of about 10 meters ;)

#229660
May 22, 2012 at 10:52am

Holding my sceptical hands up in anticipation here!

#229661
May 22, 2012 at 1:47pm

OMG and other internet expletives. That sure does look sweet, but yeah, the cynicism – me too. Hopefully it gets released soon so reviews, feedback and all that jazz can sort out how good it actually is. £45 is cheap… like I-could-actually-buy-one cheap :P

#229662
May 22, 2012 at 3:04pm

Add to my skepticism a hope that it doesn’t have to work with annoying buggy proprietary software that makes it difficult to interface with Max!!

#229663
May 22, 2012 at 5:14pm

I have bought much less interesting gear for much higher prices. The makers claim to have been working on this stuff for four years, so I think the results are credible.

#229664
May 23, 2012 at 1:20am

Maybe being a developer for the project would allow a Maxer who can code externals (or js or java) to make that interface to Max more straightforward. Or, even better, they’ve already done it…maybe [hi] can see it, and all we need is a parser patch to get whatever we want out of it.

Gotta say, the potential is awesome and the price is a steal.

#229665
May 23, 2012 at 2:10am

From the FAQ:

What are the tech specifications for the LEAP?
TBD.

I am afraid skepticism is not misplaced.

I also call total bs on (from the order page)

A limited number of Leap devices are available for immediate pre-order. Your credit card will not be charged until your Leap is shipped. You can cancel your order at any time.

There is no way on God's green earth that the number of preorders they will take is "limited". It is possible that they will not be able to fill them all at once, but they should not pretend they will not take every preorder they can get. This is marketing claptrap, and transparent marketing claptrap at that.

That said, their investors seem like some heavy hitters. Whether or not all of them are interested in bringing this tech to market is another question entirely.

Another bothersome thing is that they say they’re still hiring machine vision specialists. This puts the existence of a solution to the core problem into question.

Kinect does all this with a z-plane. Not planning on trashing my code yet – but then again, my code is hardware agnostic.

#229666
May 23, 2012 at 6:38am

The French website Futura Sciences interviewed the CEO of the company. The technology is based on infrared.

Read it here: http://www.futura-sciences.com/fr/news/t/informatique/d/leap-motion-linterface-de-controle-en-3d-qui-menace-la-souris_38871/

#229667
May 23, 2012 at 9:37am

Well one thing they clearly got working already is the hype machine…

#229668
May 23, 2012 at 9:51am

IR technology with an 8-cubic-foot sensing volume and a resolution of 0.01 mm? I clearly haven't been keeping up to date with my own research in interaction technology.

Brendan

#229669
May 23, 2012 at 2:16pm

As IR wavelengths run from 780 nm to 500 µm, there's nothing so surprising here.

#229670
May 24, 2012 at 7:36am

Seems like Wired had a hands-on with one of these – http://www.wired.com/gadgetlab/2012/05/why-the-leap-is-the-best-gesture-control-system-weve-ever-tested/

Watch the video.

“It’s… really impressive. And.. I.. don’t want to leave, I want to stay here – and I’ve already asked the guys who run the place if they’re aliens and… and they won’t tell me yes or no, so… I’m gonna go with ‘yes.’”

#229671
May 24, 2012 at 9:43am

I seriously hope they're working on a room-size variant next :)

#229672
May 24, 2012 at 10:36am

to quote primer: “Every half meter, everywhere.”

Though, I suppose that would only get you the edges of your room.

#229673
May 24, 2012 at 4:05pm

!

#229674
May 24, 2012 at 9:40pm

I emailed them and they said it appears as a regular HID device with 20ms latency, which they hope to get down to 10ms in later orders.

#229675
May 25, 2012 at 12:38am

WTF? Hope to get it down in later orders? As in, "we will ship a lesser product to the people who were the first to support us"?

Smells like Apple.

#229676
May 25, 2012 at 7:20am

And 90% of the software industry

bug fix
revision list
update
upgrade

all industry-speak for “not quite ready yet”

It’s a necessary evil, but when’s the last time you bought a car or other hardware product that had minor faults that required you to regularly and consistently return to the manufacturer for improvements?

Brendan
(woke up cynical)

#229677
May 25, 2012 at 11:58am

Horsepuckey. Intel and AMD never did this. Only Apple likes to do this stuff – one of the many reasons my hatred for them has grown exponentially in recent years. Software? Sure. Hardware? Hell no.

#229678
May 25, 2012 at 6:58pm

Their only mistake was telling you that they hope to get it down to 10ms later – if they had just done it without announcing it earlier, nobody would care.

#229679
May 25, 2012 at 7:41pm

MuShoo – well, more accurate to say that nobody would be able to do anything. People who preordered would still be pissed, but short of buying a new unit they wouldn't be able to do anything about it. Now preorders are going to go off a cliff.

1) open mouth

2) insert foot

#229680
May 25, 2012 at 8:03pm

I do not wish to divert from the main topic, but you can wave your hands in the air all you like – I certainly won't be buying one; nothing compares to Randy Jones' (Madrona Labs), TouchCo's (RIP) or Ruud van der Wel's (MagicFlute) innovations.

Brendan

#229681
May 25, 2012 at 8:28pm

For $70 it looks like a bit of craic… 20ms means it's not going to be very accurate, but for general sweeping motions and some quantized switching it should be fine.

#229682
Jun 1, 2012 at 1:13pm

I sent them an e-mail asking about the tech-specs of the Leap to see what kind of data it sends to the computer and if it is compatible with Max/MSP, and this is their first reply:

“I can totally understand your skepticism – it’s a good thing to have in this day and age! Keep in mind, though, that Ocuspec is funded through the Founders Fund, which is a very important organization and they wouldn’t be fooled. Also keep in mind that all of the major news sites covered the release and many personally traveled to the company and did on site interviews and saw the technology. Scams will only have advertising done by them, not by well known news agencies. Here are some good news articles to learn more about us:” (a few links followed)

I told them I didn’t ask them to justify themselves, but I asked a question about tech specs which remained unanswered and which I would like answered, and their reply was:

“The technology is something we are keeping a secret for now. Let’s just say its a lot of very creative innovation.

The info we are planning to send to computer is aimed at being very generic mostly, like that of a mouse. “

And I still don't understand why they talk to me in simple English and can't summon a single technical term in their answers. I haven't sensed a single bit of honesty in either of their replies; their website looks fancy enough to draw attention and look professional, but there is no substance there either.

If they managed to build one of those, then they used it for the demonstrations with the media to attract attention, but I'm afraid the Leap will disappear as soon as people realise this is a scam, and no one will ever get their hands on one of those for $70. (But the Leap developers will get their hands on everyone's $70…)

That’s my hunch for the moment – I really would love more than anyone for Leap to be true, but I just want to make sure it really is true.

#229683
Jun 6, 2012 at 1:58pm

I had the same impression without emailing anyone – shocker! I doubt they will accept many orders, I think they will make some beta version that will disappear into the vaults of whatever funders, to be used (or not) in their own tech.

#229684
Jun 6, 2012 at 9:41pm

Well, they did say that cards wouldn’t be charged until the units shipped. I preordered one, no charge yet, we’ll see what happens.

20ms is pretty good IMO, though of course the faster the better, especially for musical and percussive things… I wonder why they can't get it to 10ms now. Just a hunch: maybe they are waiting for a next-gen chip price to drop? Then again, maybe it's on the programming side, or USB communication? Just speculating…

Also… anyone else notice that their company is literally a ten-minute walk from C74's? :)

#229685
Jun 6, 2012 at 11:45pm

The geography factoid is pretty funny. Re data rate, you may well be right. The biggest pain in the ass associated with making a physical product is setting up manufacturing. Maybe there's a minimum order for the 10ms chip.

#229686
Nov 27, 2012 at 12:41am

I have a dev unit of the device. I was thinking of making a Max external for it, but I just haven't had the time. (I've never had to interface with an external API in Max, so I'm not sure how involved this would be.) The device is nice so far, and every couple of weeks they make some really great improvements to the software. A Max external would make it a breeze to prototype applications for this. Maybe someone else will beat me to it. Anyone else get a dev unit?

#229687
Nov 29, 2012 at 2:12pm

Well, I pre-purchased but was not chosen for a dev unit… As far as:
"(I've never had to interface with an external API in Max, so I'm not sure how involved this would be.)"

From my experience wrapping APIs in other languages: it is as involved as you make it. You provide wrapper calls for as many of the API calls as you *need*, up to all of them.

I would imagine that for Max the process is simple: the input parameters are filled from Max inlet data, and the results are sent out Max outlets. A response method is created that calls the embedded API when the "driving" data arrives at the correct inlet.

For a complex library, you will probably want several Max objects, dividing up the API's methods by the kind of input data they deal with. It is easy to expose different API calls in one object (via new symbolic messages), but harder to interpret the same inlet as greatly different input data. For example, if your object's inlet 2 is an x screen location, it should probably be an x screen location for every method that object exposes – though that object can still offer several API calls that all take that screen location and return the same data type.
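To make that wrapper pattern concrete, here is a minimal sketch in plain JavaScript (the same logic could live in a Max [js] object or a C external). The `fakeLeapApi` object and all the names here are made up for illustration, not part of any real SDK:

```javascript
// A hypothetical stand-in for the device SDK: each call takes a
// screen location and returns the same kind of data.
const fakeLeapApi = {
  fingerAt: (x, y) => ({ kind: "finger", x, y }),
  handAt: (x, y) => ({ kind: "hand", x, y }),
};

// The wrapper object: "inlet 2" always carries an x screen location,
// and symbolic messages select which API call to forward it to,
// mirroring the "one inlet meaning, several methods" design above.
function makeWrapper(api, outlet) {
  let x = 0; // last value seen on "inlet 2"
  return {
    setX(value) { x = value; },   // inlet 2 handler
    message(name, y) {            // symbolic message + argument
      const method = api[name];
      if (!method) throw new Error("unknown message: " + name);
      outlet(method(x, y));       // result goes out an "outlet"
    },
  };
}

// Usage: collect whatever comes out of the "outlet".
const results = [];
const wrapper = makeWrapper(fakeLeapApi, (r) => results.push(r));
wrapper.setX(100);
wrapper.message("fingerAt", 50);
wrapper.message("handAt", 60);
```

The point is only the shape: one stored inlet value, many symbolic messages dispatching to API calls that agree on what that inlet means.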

OK, more coffee, more property insurance, less fun.
Wish I had a copy of the Leap SDK… :)
cfb aka j2k

#229688
Dec 6, 2012 at 6:54am

I built a Max object for Leap Motion. It’s at the early stage at this time but I’ll publish it soon.

#229689
Dec 6, 2012 at 4:32pm

See the sneak preview at http://twitpic.com/bjlx3f

#229690
Dec 6, 2012 at 10:14pm

Great, I won’t have to do it then! When are you going to post?

#229691
Dec 7, 2012 at 3:56am

Sneak preview 2 http://twitpic.com/bjq4ll

It’s soon. > Marcos

#229692
Dec 7, 2012 at 10:06am

aka.leapmotion is available now for downloading. http://akamatsu.org/aka/max/aka-objects/

#229693
Dec 7, 2012 at 10:46am

Thank you, Masayuki Akamatsu!
I wonder when the Leap Motion will be shipping in quantity… I want one! Or two ;-)
By the way, do the SDK and your object allow for several devices simultaneously?

#229694
Dec 7, 2012 at 6:46pm

Thanks, this will make prototyping much easier!

#229695
Dec 7, 2012 at 7:13pm

masa you rule. I wish I had one of these things!

#229696
Dec 8, 2012 at 3:51am

Pedro,

As far as I can tell from reading the current SDK/API, there is no such information as a device ID. Maybe we can use only one device at this time.

#229697
Dec 11, 2012 at 11:21pm

Thank you for your response, Masayuki! If we really need it, we could always use 2 computers and share the control data by OSC or something…
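For what it's worth, the wire format for that kind of OSC sharing is simple enough to sketch. This is a minimal single-float OSC 1.0 encoder in JavaScript – in Max you would just use [udpsend], so this only shows what travels between the two machines (the `/leap/x` address is an arbitrary example):

```javascript
// Minimal OSC 1.0 encoder for a single-float message. Strings are
// NUL-terminated and padded to a multiple of 4 bytes; the type tag
// string starts with ','; the float is big-endian IEEE 754.
function padString(s) {
  const len = Math.ceil((s.length + 1) / 4) * 4; // at least one NUL
  const buf = Buffer.alloc(len);                 // zero-filled
  buf.write(s, 0, "ascii");
  return buf;
}

function oscFloatMessage(address, value) {
  const f = Buffer.alloc(4);
  f.writeFloatBE(value, 0);
  return Buffer.concat([padString(address), padString(",f"), f]);
}

const msg = oscFloatMessage("/leap/x", 0.5);
// Sending it is then one call with node's dgram module, e.g.:
//   require("dgram").createSocket("udp4").send(msg, 7400, "192.168.1.2");
// (port and address here are placeholders)
```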

#229698
Dec 17, 2012 at 10:35pm

Just got accepted to the Leap dev program. Looking forward to getting my hands on it…

#229699
Dec 18, 2012 at 3:40pm

URL changed. aka.leapmotion http://akamatsu.org/aka/max/objects/

#229700
Dec 18, 2012 at 4:05pm

OK, I'm beginning to get excited. I have a question: in the video (around 00:50) it looks like they are taking in hundreds of data points. Is that real? Can we have that in Max?!?! That could mean some amazing stuff…

Oh fuck it… *Pre-order*

#229701
Dec 18, 2012 at 4:13pm

Masayuki you are the man! I can not wait to give this a try.

#229702
Dec 18, 2012 at 7:04pm

I look forward to them being for sale. Not into the pre-order idea.

#229703
Dec 21, 2012 at 2:05pm

Hi Masayuki
Thank you for building this. If you have already started working with the Leap Motion, could you share your sense of its accuracy and responsiveness? Happy holidays to everyone.

#229704
Dec 21, 2012 at 5:44pm

"OK, I'm beginning to get excited. I have a question: in the video (around 00:50) it looks like they are taking in hundreds of data points. Is that real? Can we have that in Max?!?! That could mean some amazing stuff…"

Every release they add new features to the API. Right now you cannot access the thousands of skeletal data points, but they say it's planned for the future. Currently you have palm orientation, finger position, finger orientation, the size of a virtual ball that rests in your palm, etc…

“…if possible could you share your sense of its accuracy and responsivenes.”

For a dev unit, it works very well. They shipped the dev units with cheaper lenses, which reduce the range by half (lighting performance is also reduced). The commercial units will obviously not be this way…

#229705
Dec 21, 2012 at 6:03pm

got it. Thank you!

#229706
Dec 23, 2012 at 10:50am

aka.leapmotion doesn’t work with the Leap firmware 0.7.0 (released on 23rd Dec.) because the APIs were changed. I’ll try to create a new version soon (or later?).

#229707
Dec 23, 2012 at 11:52am

Done.

aka.leapmotion 0.2 for The Leap 0.7.0

http://akamatsu.org/aka/max/objects/

…though the new features are not implemented yet.

#229708
Jan 18, 2013 at 11:11pm

Masayuki -

This is excellent. I just received my developer unit yesterday and will be using your external to interface with Max.

Out of curiosity, how will you be using this in Max? I plan to start by creating some form of interactive performance instrument.

Thanks again and again for this external and for jump-starting me into the creation stage!

#229709
Jan 18, 2013 at 11:44pm

I’m actually starting to get excited about this (I pre-ordered one).

#229710
Jan 20, 2013 at 5:49am

Hi Masayuki

Any plans for a windows version?

All the best

#229711
Jan 21, 2013 at 6:53am

Rodrigo, I am very much looking forward to seeing what you might do with Leap and The Party Van. I’m hoping it will spark off a whole new raft of monome/leap apps from the community too! I am waiting patiently (sort of) for mine (I pre-ordered about 8 months ago!)

#229712
Jan 22, 2013 at 4:01pm

Hi TheJaphyRyder,

I think it's good for an interactive instrument, even though I have no ideas at this time ;-)

Hi edsonedge,

I have no experience on Windows, but I'll open up the source code if someone wants to port it.

Thanks.

#229713
Jan 23, 2013 at 2:51am

Wow. Now I really really want my pre-ordered LEAP!!

aka, thank you SO MUCH for providing this. You have yet again freely given of your time and expertise and allowed so many others to leapfrog over the technical details, and to be able to quickly dive into the fun stuff. Such details can be a real deal-breaker for many people who don’t have the programming savvy to make such hardware interfacing work…they just want the data, it’s next to impossible to figure out how to get it, and the whole process becomes hugely frustrating. All that becomes a non-issue with your contributions!

Can’t wait to see the possibilities of this little gadget in Max!

#229714
Jan 25, 2013 at 2:59am

Thank you ;-) seejayjames

#229715
Feb 3, 2013 at 9:18am

I published the source code of aka.leapmotion at:

https://github.com/akamatsu/aka.leapmotion

Maybe someone creates the windows version ;-)

#229716
Feb 3, 2013 at 1:51pm

Thanks Masayuki :)

#229717
Feb 3, 2013 at 2:45pm

The new dev units look and feel really nice. Unfortunately, due to a flaky cable, mine does not work yet, and it uses micro-USB connectors, which I don't have. Looking forward to getting a working cable and testing this thing out.


Attachments:
  1. SANY0162.JPG
#229718
Feb 7, 2013 at 10:16am

Can’t wait to get one!

#229719
Feb 15, 2013 at 9:15pm

Got my dev unit up and running! Thanks Leap for having me in the program!

A couple of quick thoughts to start regarding the Leap and integrating it with Max; I will post more (and some videos, hopefully) as I explore…

First, again, hats off to aka, his external made this totally straightforward! I can’t thank you enough!

Tracking is definitely smooth and fast, very low latency, and supposedly the commercial units will have half the latency of these units, so there should be no worries about timing for tempo-intensive applications. I experimented with [qmetro] between 10 and 100, and found that 40–50 is a good medium for everything except really tempo-critical stuff; using [line] to smooth the values helps in that case. I'm running this on an aging MacBook, so there isn't a ton of spare resources, but it seems reasonably OK to bang for the data that often. When I streamline the specific data I'm interested in (so I don't have the overhead of the visualization and the [coll]s), that should help processing too.
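The [line]-style smoothing between polls is just linear interpolation; a sketch in JavaScript, with made-up function names and timings:

```javascript
// A sketch of what [line] does between [qmetro] polls: ramp linearly
// from the current value to each new target over `rampMs`, sampled
// every `stepMs`. All names here are illustrative, not a real API.
function lineRamp(from, to, rampMs, stepMs) {
  const steps = Math.max(1, Math.round(rampMs / stepMs));
  const out = [];
  for (let i = 1; i <= steps; i++) {
    out.push(from + ((to - from) * i) / steps);
  }
  return out; // ends exactly on `to`
}

// A 40 ms poll delivers a jump from 0 to 10; smooth it in 10 ms steps.
const ramp = lineRamp(0, 10, 40, 10);
```

In the patch this is just `[line]` with a ramp time matching the poll interval; the sketch only shows why the jumpy polled values come out smooth.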

The usable area is not terribly large, and it's not uniform in how well it picks things up, so you need to experiment and pretty much stay in a "sweet spot". It detects in an inverted pyramid, so if you want to spread out to the sides, you have to be a little way above it. Comfortable elbow-resting height is good.

Trying one hand with closed fingers, you can go to about two and a half feet relatively consistently; beyond that you start getting false reports of a second hand and/or dropouts of the one hand. So it's pretty decent in that regard. Position, direction, and the normals will all be very useful for controlling.

Fingers are a somewhat different story. If you're near the middle and you move them very carefully (like playing a piano or typing), things work pretty well. But it's a fairly small range if you want to use multiple fingers on each hand, because they tend to shadow each other and cause dropouts. When there are dropouts, the finger indices shift – and sometimes they shift even without apparent dropouts. So for broad-stroke gestures in a paint or audio-FX application, it would probably be OK. However, if you're trying to keep consistent, precise control over MIDI CC messages (especially with two or more fingers per hand), it doesn't seem like it will work very well. I can try more experiments with different lighting to see if that helps, because I know some people are using this for typing, which is very precise. I might need a light table or something so there aren't shadows from overhead light sources.
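One possible workaround for the shifting indices – not something the SDK provides, just a sketch of the idea – is to re-match each new frame's fingers to the previous frame by nearest position, so your own labels survive the index shuffles:

```javascript
// Re-label each new frame's fingers by matching them to the nearest
// finger from the previous frame. Positions are [x, y, z]; greedy
// nearest-neighbour matching is enough for a handful of fingers.
function dist2(a, b) {
  return a.reduce((s, v, i) => s + (v - b[i]) ** 2, 0);
}

function matchFingers(prev, curr) {
  // prev: { id: position }, curr: array of positions in arbitrary order
  const unused = new Set(Object.keys(prev));
  return curr.map((pos) => {
    let best = null, bestD = Infinity;
    for (const id of unused) {
      const d = dist2(prev[id], pos);
      if (d < bestD) { bestD = d; best = id; }
    }
    if (best !== null) unused.delete(best);
    return { id: best, pos }; // id === null means a brand-new finger
  });
}

// Two fingers swap their report order between frames but keep their ids.
const prev = { a: [0, 0, 0], b: [100, 0, 0] };
const curr = [[101, 1, 0], [1, -1, 0]]; // b reported first, then a
const matched = matchFingers(prev, curr);
```

The same logic could sit in a [js] object between aka.leapmotion and the rest of the patch.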

That said, using tools or just one finger on each hand is pretty solid, so apps built around that (especially with gesture-recognition possibilities) should work well. For now, I'm interested in seeing what I can do with MIDI CC messages, because those can easily be sent to so many other apps like Ableton. I think I'll start with full-hand motions, because the tilt, rotation, and position data seem very solid and are intuitive (and actually less tiring than using the fingers in some ways). Will post updates as I get something going.
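The position-to-CC mapping can be sketched like this; the millimetre range is a guess that would need tuning to the actual sweet spot:

```javascript
// Map a palm coordinate to a MIDI CC value: clamp to a hand-tuned
// "sweet spot" range (the mm values below are guesses), scale to
// 0..127, and only report a value when it actually changes, to
// avoid flooding the MIDI stream with duplicates.
function makeCcMapper(lo, hi) {
  let last = -1;
  return (mm) => {
    const clamped = Math.min(hi, Math.max(lo, mm));
    const cc = Math.round(((clamped - lo) / (hi - lo)) * 127);
    if (cc === last) return null; // no change, send nothing
    last = cc;
    return cc;
  };
}

// Palm height in mm, assuming ~100–400 mm covers the full CC range.
const heightToCc = makeCcMapper(100, 400);
const a = heightToCc(250); // mid-range
const b = heightToCc(250); // unchanged -> suppressed
const c = heightToCc(50);  // below range -> clamps to the bottom
```

In a patch this is essentially [clip] → [scale] → [change] → [ctlout]; the sketch just makes the arithmetic explicit.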

Again, many thanks to Leap, Max, and aka!

#229720
Feb 16, 2013 at 4:45am

Having conned Leap into passing as a developer myself, and having received mine probably the same day seejay did, I can pretty much confirm everything he has to say (including the thank yous!). However, from my testing, this revision at least has a few other problems:

– the sweet spot is small – their claim of an 8-cubic-foot operational area is a bit optimistic. Towards the edges, the data gets not only less accurate (your palm is registered as tilting if you move too far off the x or z axes) but also, especially as you climb the y axis, quite noisy.

– I had one session working with it when low sunlight was coming onto my desk, and that light seemed to confuse it quite a bit – false hands appearing and disappearing, mostly. Being accustomed to working with infrared-based positioning, this seems particularly un-robust, which makes me nervous about its reliability onstage, where the ambient lighting can be unpredictable and changing. I've already banned lighting people from using red anywhere near me, but this seems much more sensitive, which is a bit of a concern.

– the disappearing fingers are definitely a problem, but the SDK promises full joint recognition in a future release, so I'm holding on to optimism about that. However, I am somewhat concerned about its long-term possibilities for finer-tuned degrees of control. Like most Hot New Gizmos, don't expect it to solve all your problems…

M

#229721
Feb 26, 2013 at 10:19am

Things are getting interesting in the gesturesphere:

https://getmyo.com

#229722
Feb 26, 2013 at 4:31pm

Another nice find – if the reality is, erm, realized!

And a nice big up to Western Military Industries represented in the video too, so many more brown/heathen children to kill and maim with impunity!

Brendan

#229723
Feb 26, 2013 at 7:39pm

Don't worry, Brendan – Bluetooth 4.0 seems to only have a range of around 160 ft, a bit close for Soldier 90… I hope.
As usual, I wonder about the latency of the MYO.

All getting very interesting if you want to hold/wear/sense gesture.
Yippee!

#229724
Feb 26, 2013 at 7:56pm

awesome. I’ll go for the MYO :-)

#229725
Feb 26, 2013 at 9:02pm

It is an interesting tradeoff: having to wear something vs. more mobile range. I'm curious about the kind of accuracy they can get from muscle analysis. I'd expect a camera to be more accurate, but underneath it all, it's the muscles driving the motion.
Maybe there is some kind of calibration where you do fixed gestures to give it min/max values.
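That calibration idea can be sketched in a few lines; everything here is hypothetical, just to show the shape of it:

```javascript
// Sketch of the calibration idea: during a fixed "stretch and relax"
// gesture we record the extremes the sensor reports, then use them
// to normalize live readings to 0..1.
function makeCalibrator() {
  let min = Infinity, max = -Infinity;
  return {
    feed(v) { min = Math.min(min, v); max = Math.max(max, v); },
    normalize(v) {
      if (max <= min) return 0; // not calibrated yet
      const n = (v - min) / (max - min);
      return Math.min(1, Math.max(0, n)); // clamp drift outside range
    },
  };
}

// Calibrate on a few samples from the fixed gestures, then normalize.
const cal = makeCalibrator();
[12, 80, 45, 5, 95].forEach((v) => cal.feed(v));
const mid = cal.normalize(50);   // sits halfway between 5 and 95
const over = cal.normalize(120); // drift past the recorded max clamps
```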

#229726
Feb 27, 2013 at 1:48pm

I hadn't thought about the accuracy yet – good point. I like the idea of being able to wear it while playing guitar… it should allow for some interesting possibilities.

#229727
Feb 27, 2013 at 6:34pm

Jeez, the guy ‘Heil Drones’-ing on their front page is almost enough to put me off the entire project!

#229728
Mar 17, 2013 at 1:37am

Thanks for the object, Masayuki! You are one of my favourite people in the Max scene.
I have seen some really accurate stuff done with the object (e.g. Eric Samothrakis on Vimeo), but I myself cannot seem to get the same accuracy as the Visualizer with the aka.leapmotion object.

There are times when I can see one finger in the Visualizer, but it is not seen by the object.
With the object, I often have to first place my four fingers in an easily readable position before it will accurately begin tracking.
(The object will usually do fine after it begins reading one hand.)

has anyone had these issues?

#229729
Apr 24, 2013 at 8:16pm

Has anyone done a Windows version?

#229730
Jul 23, 2013 at 10:40am

Got my non-beta unit this morning. Tried it with Leap 1.0.2 and aka.leapmotion (compiled with SDK 0.7.0) under Snow Leopard. The hand tracking is robust, but finger tracking is fragile. It takes something like 20% CPU (quad-core 2.3 GHz i7).

#256923
Jul 25, 2013 at 2:25am

I've just received mine and could immediately test it in Max; thanks very much, Mr Akamatsu :-)
It works like a charm – incredibly fast and reactive!
I'm very excited! Definitely very promising; we're gonna have a lot of fun with this thing ;-p

#257080
Jul 25, 2013 at 10:06am

This may be a stupid question, but I just got my Leap Motion controller and am trying to get it to work within Max. Thank you for the external, Masayuki – well done. However, I still cannot get it to work. What application should I be running in the background? Is it the Leap Motion Airspace app?

lol. Edit: just realised it's the Leap Motion app… facepalm!

Good work on the external, by the way, Masayuki! And thank you again…

#257117
Jul 26, 2013 at 1:40am

aka.leapmotion doesn't seem to recognize tools – not yet implemented?
(The isTool flag always stays at 0, and chopsticks/pencils do not trigger any output.)
(In Leap Motion > Settings > Troubleshooting > Diagnostic Visualizer, I can see that tools are tracked by the device.)

anyway, many thanks again for this external :-)

#257184
Jul 27, 2013 at 3:05am

Does anyone know if this would work with an Intel Core 2 Duo processor? I have a late-2008 unibody MacBook, and the Leap website says the minimum requirement is an Intel i3 processor or better. I would love to buy it and try it, but I would be incredibly frustrated if the machine were lying on my desk and I just couldn't use it!

#257294
Jul 27, 2013 at 9:37am

@Laonikoss: I could try this configuration with my older MBP if you can wait a few days.

#257317
Jul 29, 2013 at 1:41am

That’d be awesome, thanks :)

#257417
Jul 29, 2013 at 4:35am

Just got mine in the mail a few days ago. This thing is fucking cool. For real. I’m amazed at how well it tracks. Some serious potential here…

#257432
Jul 29, 2013 at 6:43am

@Laonikoss: I have a late-2009 Core Duo MBP, and the Leap Motion works well with it.

#257445
Aug 2, 2013 at 6:15am

Just want to mention that aka.leapmotion works well under 10.6.8 (2 GHz i7 MBP). Thanks, Masayuki!

#257853
Aug 3, 2013 at 9:44am

@laonikoss: I tried it on my 2009/2010 MacBook Air (Core2Duo), and that was just too slow. It only has 4GB RAM, which I guess doesn’t help, but the difference between the MBA and my 2012 MacBook Pro was significant in terms of performance.

I'm also quite disappointed in the product in general, not least because of the "phantom fingers" problem. I can hold out one finger, completely still, yet a second "finger" will regularly flicker on and off. I guess this will make precision work with aka.leapmotion/MIDI controllers a bit difficult.

#257929
Aug 22, 2013 at 7:18am

The phantom fingers are a problem, especially under 10.6 on an Intel Core 2 Duo. Performance is better on later systems, I have found.

#259633
Aug 22, 2013 at 7:31am

Works fine on my 2011 iMac… but the data needs analysis/filtering to be really useful – phantom fingers, bad data at the edge of the range, smoothing the results… but it is neat to play with!

#259635
Oct 18, 2013 at 2:27am

"Gesture1" is a quick-and-dirty Max for Live patch to control any parameter within Ableton Live with gestures, using the Leap Motion controller.
Thanks again to Masayuki (aka.leapmotion: http://akamatsu.org/aka/max/objects)

Demo video: http://youtu.be/d6bmkCSqMiY

Download: http://www.maxforlive.com/library/device/1976/gesture1

#268358
