Wow - can't wait to get this gesture interface!
It really is a great time to be involved in interactive media; so many great tools. And a tiny price tag too.
Hmmm... Looks awesome. My inherent.. uhh.. skepticism - which I hope is wrong - kicks in though...
skepticism about........?
Just as conventional acoustic instruments have yet to be supplanted by the new technologies, so too the computer keyboard and mouse? That this device will enjoy a brief moment of glory before being subsumed by the games/Kindle industry and then disappearing (RIP TouchCo)?
Just curious, spectro
Brendan
I think he's talking about how accurate the demonstration is to what it can actually do.
What Rodrigo said...
It really looks good. If it can do that kind of tracking and gesture recognition with low latency and fine granularity, I'm sold. But, perhaps to others' annoyance, my skepticism remains until more evidence (other than an in-house video) is available outlining what one might expect in a practical situation, on a less than stellar and totally optimised system, etc...
please make me one with a range of about 10 meters ;)
Holding my sceptical hands up in anticipation here!
OMG and other internet expletives. That sure does look sweet, but also the cynicism... hmmm, yeah, me too. Hopefully it gets released soon so reviews, feedback and all that jazz can sort out how good it actually is. £45 is cheap... like I could actually buy one cheap :P
Add to my skepticism a hope that it doesn't have to work with annoying buggy proprietary software that makes it difficult to interface with Max!!
I have bought much less interesting gear for much higher prices. The makers claim to have been working on this stuff for four years, so I think the results are credible.
Maybe being a developer for the project would allow a Maxer who can code externals (or js or java) to make that interface to Max more straightforward. Or, even better, they've already done it...maybe [hi] can see it, and all we need is a parser patch to get whatever we want out of it.
Gotta say, the potential is awesome and the price is a steal.
From the FAQ:
What are the tech specifications for the LEAP?
TBD.
I am afraid skepticism is not misplaced.
I also call total BS on this (from the order page):
A limited number of Leap devices are available for immediate pre-order. Your credit card will not be charged until your Leap is shipped. You can cancel your order at any time.
There is no way on God's green earth that the number of preorders they will take is "limited". It is possible that they will not be able to fill them all at once, but they should not even try to pretend they will not take all the preorders they can get. This is marketing claptrap, and transparent marketing claptrap at that.
That said, their investors seem like some heavy hitters. Whether or not all of them are interested in bringing this tech to market is another question entirely.
Another bothersome thing is that they say they're still hiring machine vision specialists. This puts the existence of a solution to the core problem into question.
Kinect does all this with a z-plane. Not planning on trashing my code yet - but then again, my code is hardware agnostic.
The French website Futura Sciences interviewed the CEO of the company. The technology is based on infrared.
Well one thing they clearly got working already is the hype machine...
IR technology with an interaction volume of eight cubic feet and a resolution of 0.01 mm? I clearly haven't been keeping up to date with my own research in interaction technology.
Brendan
As IR's wavelength is between 780 nm and 500 µm, there's nothing that surprising here.
Seems like Wired had a hands-on with one of these - http://www.wired.com/gadgetlab/2012/05/why-the-leap-is-the-best-gesture-control-system-weve-ever-tested/
Watch the video.
"It's... really impressive. And.. I.. don't want to leave, I want to stay here - and I've already asked the guys who run the place if they're aliens and... and they won't tell me yes or no, so... I'm gonna go with 'yes.'"
I seriously hope they're working on a room-size variant next :)
to quote primer: "Every half meter, everywhere."
Though, I suppose that would only get you the edges of your room.
I emailed them and they said it appears as a regular HID device with a 20ms latency which they hope to get down to 10ms in later orders.
wtf? Hope to get down in later orders? As in, we will ship a defective product to the ppl who were the first to support us?
Smells like Apple
And 90% of the software industry
bug fix
revision list
update
upgrade
all industry-speak for "not quite ready yet"
It's a necessary evil, but when's the last time you bought a car or other hardware product that had minor faults that required you to regularly and consistently return to the manufacturer for improvements?
Brendan
(woke up cynical)
Horsepuckey. Intel and AMD never did this. Only Apple likes to do this stuff, one of the many reasons my hatred for them has grown exponentially in recent years. Software? Sure. Hardware? Hell no.
Their only mistake was telling you that they hope to get it down to 10ms later - if they had just done it without announcing it earlier, nobody would care.
MuShoo, well, it's more accurate to say that nobody would be able to do anything. Ppl who preordered would still be pissed, but short of buying a new unit they wouldn't be able to do anything about it. Now preorders are gonna go off a cliff.
1) open mouth
2) insert foot
I do not wish to divert from the main topic, but you can wave your hands in the air all you like - I certainly won't be buying one; nothing compares to either Randy Jones' (Madrona Labs), TouchCo's (RIP) or Ruud van der Wel's (MagicFlute) innovations
Brendan
For $70 it looks like a bit of craic... 20ms means it's not going to be very accurate, but for general sweeping motions and some quantized switching it should be fine.
I sent them an e-mail asking about the tech specs of the Leap to see what kind of data it sends to the computer and whether it is compatible with Max/MSP, and this is their first reply:
"I can totally understand your skepticism - it's a good thing to have in this day and age! Keep in mind, though, that Ocuspec is funded through the Founders Fund, which is a very important organization and they wouldn't be fooled. Also keep in mind that all of the major news sites covered the release and many personally traveled to the company and did on site interviews and saw the technology. Scams will only have advertising done by them, not by well known news agencies. Here are some good news articles to learn more about us:" (a few links followed)
I told them I didn't ask them to justify themselves, but I asked a question about tech specs which remained unanswered and which I would like answered, and their reply was:
"The technology is something we are keeping a secret for now. Let's just say its a lot of very creative innovation.
The info we are planning to send to computer is aimed at being very generic mostly, like that of a mouse. "
And I still don't understand why they talk to me in Simple English and can't summon one technical term in their answers. I haven't sensed a single bit of honesty in either of their replies, and their website looks fancy enough to draw attention and look professional, but there is no substance there either.
If they managed to build one of those, then they used it for the demonstrations with media to attract attention, but I'm afraid the hype will disappear as soon as people realise this is a scam, and no one will ever get their hands on one of those for $70. (But the Leap developers will get their hands on everyone's $70..)
That's my hunch for the moment - I really would love more than anyone for Leap to be true, but I just want to make sure it really is true.
I had the same impression without emailing anyone - shocker! I doubt they will accept many orders; I think they will make some beta version that will disappear into the vaults of whatever funders, to be used (or not) in their own tech.
Well, they did say that cards wouldn't be charged until the units shipped. I preordered one, no charge yet, we'll see what happens.
20ms is pretty good IMO, though of course, the faster the better, especially for musical and percussive things...I wonder why they can't get it to 10ms now. Just a hunch, maybe they are waiting for a next-gen chip price to drop? Then again, maybe it's on the programming side, or USB communication? Just speculating...
Also... anyone else notice that their company is literally a ten-minute walk from C74's? :)
The geography factoid is pretty funny. Re: the data rate, you may well be right. The biggest pain in the ass associated with making a physical product is setting up manufacturing. Maybe they have a minimum order for the 10ms chip.
I have a dev unit of the device. I was thinking of making a Max external for it, but I just haven't had the time. (I've never had to interface with an external API in Max, so I'm not sure how involved this would be.) The device is nice so far, and every couple of weeks they make some really great improvements to the software. A Max external would make it a breeze to prototype applications for this. Maybe someone else will beat me to it. Anyone else get a dev unit?
Well, I pre-purchased, but was not chosen for a dev unit... as far as:
"(Never had to interface with an external api in Max, so I'm not sure how involved this would be)."
From my experience wrapping APIs in other languages: it is as involved as you make it: you provide wrapper calls to as many of the API calls as you *need* to, up to all of them.
I would imagine for Max the process is simple: the input parameters are filled from Max inlet data, and the results are sent out Max outlets. A response method is created that calls the embedded API when the "driving" data arrives at the correct inlet. On a complex lib, you will probably want several Max objects, dividing up the API's methods by the kind of input data they deal with: it's easy to make different API calls from one object (with new symbolic messages), but it's more difficult to interpret your Max inputs as greatly different input data. E.g. if your object's inlet 2 is an X screen loc, it should probably be an X screen loc for all methods called from that object; you might well have several different API calls available in that specific object, all dealing with that screen loc and returning the same data type. Something like the skeleton below.
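A minimal skeleton of what I mean (a sketch only, written against the current Max SDK; `some_api_call` is a hypothetical stand-in for whatever the real library exposes, so none of these names are from any actual SDK):

```cpp
// apiwrap: minimal Max external that calls a wrapped API method on "bang"
// and sends the result out an outlet.
#include "ext.h"       // Max SDK basics
#include "ext_obex.h"  // object/class registration

static double some_api_call(void) { return 42.0; }  // stand-in for the real API

typedef struct _apiwrap {
    t_object ob;
    void *outlet;                        // results go out here
} t_apiwrap;

static t_class *apiwrap_class;

static void *apiwrap_new(void)
{
    t_apiwrap *x = (t_apiwrap *)object_alloc(apiwrap_class);
    x->outlet = outlet_new(x, NULL);     // generic outlet
    return x;
}

static void apiwrap_bang(t_apiwrap *x)
{
    // the "driving" data arrived: call the embedded API,
    // push the result out the outlet as a float
    outlet_float(x->outlet, some_api_call());
}

void ext_main(void *r)
{
    t_class *c = class_new("apiwrap", (method)apiwrap_new, (method)NULL,
                           (long)sizeof(t_apiwrap), 0L, 0);
    class_addmethod(c, (method)apiwrap_bang, "bang", 0);
    class_register(CLASS_BOX, c);
    apiwrap_class = c;
}
```

Each additional API call would just be another `class_addmethod` entry bound to its own handler; the fan-out into multiple objects only matters once the inlet meanings start to diverge.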
ok, more coffee, more property insurance, less fun.
wish i had a copy of the leap sdk...:)
cfb aka j2k
I built a Max object for Leap Motion. It's at an early stage right now but I'll publish it soon.
See the sneak preview at http://twitpic.com/bjlx3f
Great, I won't have to do it then! When are you going to post?
Sneak preview 2 http://twitpic.com/bjq4ll
It's soon. > Marcos
aka.leapmotion is available now for downloading. http://akamatsu.org/aka/max/aka-objects/
Thank you, Masayuki Akamatsu!
I wonder when the leapmotion will be shipping in quantity... I want one! Or two ;-)
By the way, do the SDK and your object allow for several devices simultaneously?
Thanks, this will make prototyping much easier!
masa you rule. I wish I had one of these things!
Pedro,
As far as I can tell from reading the current SDK/API, there is no information such as a device ID. Maybe we can use only one device at this time.
Thank you for your response, Masayuki! If we really need it, we could always use 2 computers and share the control data by OSC or something...
Just got accepted for the Leap dev program. Looking forward to getting my hands over it......
URL changed. aka.leapmotion http://akamatsu.org/aka/max/objects/
Ok I'm beginning to get excited. I have a question. In the video (around 00:50) it looks like they are taking in hundreds of datapoints. Is that real? Can we have that in Max?!?!?! That could mean some amazing stuff.....
Oh fuck it... *Pre-order*
Masayuki you are the man! I can't wait to give this a try.
I look forward to them being for sale. Not into the pre-order idea.
Hi Masayuki
Thank you for building this. If you have already started working with the Leap Motion, could you share your sense of its accuracy and responsiveness? Happy holidays to everyone.
"Ok I'm beginning to get excited. I have a question. In the video (around 00:50) in looks like they are taking in hundreds of datapoints. Is that real? Can we have that in max?!?!?! That could mean some amazing stuff....."
Every release they are adding new features to the API... right now you cannot access the thousands of skeletal data points, but they say it's planned for the future. You currently have palm orientation, finger position, finger orientation, the size of a virtual ball that rests in your palm, etc. (see the sketch below for roughly how that reads straight from the SDK)...
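For the curious, pulling that data directly from the SDK looks roughly like this - a sketch assuming the v1 C++ API, so check the headers that ship with your SDK version, since the API keeps changing between releases:

```cpp
#include <iostream>
#include "Leap.h"   // Leap Motion SDK header

// Listener whose onFrame callback fires whenever a new tracking frame arrives.
class FrameListener : public Leap::Listener {
public:
    void onFrame(const Leap::Controller &c) {
        const Leap::Frame frame = c.frame();
        if (frame.hands().isEmpty())
            return;
        const Leap::Hand hand = frame.hands()[0];
        std::cout << "palm "    << hand.palmPosition()   // mm, device origin
                  << " normal " << hand.palmNormal()     // palm orientation
                  << " sphere " << hand.sphereRadius()   // the "virtual ball"
                  << "\n";
        for (int i = 0; i < hand.fingers().count(); ++i)
            std::cout << "  finger " << i << " tip "
                      << hand.fingers()[i].tipPosition() << "\n";
    }
};

int main() {
    Leap::Controller controller;
    FrameListener listener;
    controller.addListener(listener);   // onFrame runs on the SDK's thread
    std::cin.get();                     // run until Enter is pressed
    controller.removeListener(listener);
}
```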
"...if possible could you share your sense of its accuracy and responsivenes."
For a dev unit, it works very well. They shipped the dev units with cheaper lenses which reduce the range by half (lighting performance is also reduced). The commercial units will obviously not be this way...
got it. Thank you!
aka.leapmotion doesn't work with the Leap firmware 0.7.0 (released on 23rd Dec.) because the APIs were changed. I'll try to create a new version soon (or later?).
Done.
aka.leapmotion 0.2 for The Leap 0.7.0
http://akamatsu.org/aka/max/objects/
Even though the new features are not implemented...
Masayuki -
This is excellent. I just received my developers unit yesterday and will be using your external to interface with Max.
Out of curiosity, how will you be using this in Max? I plan on starting by creating some form of interactive performance instrument.
Thanks again and again for this external and jump starting me to the creation stage!
I'm actually starting to get excited about this (I pre-ordered one).
Hi Masayuki
Any plans for a windows version?
All the best
Rodrigo, I am very much looking forward to seeing what you might do with Leap and The Party Van. I'm hoping it will spark off a whole new raft of monome/leap apps from the community too! I am waiting patiently (sort of) for mine (I pre-ordered about 8 months ago!)
Hi TheJaphyRyder,
I think it's good for an interactive instrument even though I have no ideas at this time ;-)
Hi edsonedge,
I have no experience on Windows but I'll open the source code if someone wants to port it.
Thanks.
Wow. Now I really really want my pre-ordered LEAP!!
aka, thank you SO MUCH for providing this. You have yet again freely given of your time and expertise and allowed so many others to leapfrog over the technical details, and to be able to quickly dive into the fun stuff. Such details can be a real deal-breaker for many people who don't have the programming savvy to make such hardware interfacing work...they just want the data, it's next to impossible to figure out how to get it, and the whole process becomes hugely frustrating. All that becomes a non-issue with your contributions!
Can't wait to see the possibilities of this little gadget in Max!
Thank you ;-) seejayjames
I published the source code of aka.leapmotion at:
https://github.com/akamatsu/aka.leapmotion
Maybe someone creates the windows version ;-)
Thanks Masayuki :)
Can't wait to get one!
Got my dev unit up and running! Thanks Leap for having me in the program!
A couple quick thoughts to start regarding the Leap and integrating with Max, will post more (and some videos hopefully) soon as I explore more...
First, again, hats off to aka, his external made this totally straightforward! I can't thank you enough!
Tracking is definitely smooth and fast, very low latency, and supposedly the commercial units will have half the latency of these units, so there should be no worries about timing for controlling tempo-intensive applications. I experimented with [qmetro] between 10 and 100, and found that 40-50 is a good medium for everything except really tempo-critical stuff. Using [line] to smooth it out helps in that case. I'm running this on an aging MacBook so there's not a ton of extra resources, but it seems reasonably OK to bang for the data that often. When I streamline the specific data I'm interested in (so I don't have the overhead of the visualization and the [coll]s), that should help processing too.
The usable area is not terribly large, and it's not uniform regarding how well it picks things up, so you need to experiment and pretty much stay in a "sweet spot" or area. It detects in an inverted pyramid, so if you want to spread out to the sides, you have to be a little ways above it. Comfortable elbow-resting height is good.
Trying one hand with closed fingers, you can go to about 2 and a half feet or so relatively consistently, then you start getting false reports of a second hand and/or dropouts of the one hand. So it's pretty decent in that regard. Position, direction, and the normals will all be very useful for controlling.
Fingers are a somewhat different story. If you're near the middle and you move them very carefully (like playing a piano or typing), things work pretty well. But it's a fairly small range if you want to use multiple fingers on each hand, because they tend to shadow each other and cause dropouts. When there are dropouts, the indices shift; sometimes they shift even without apparent dropouts (one workaround idea is sketched below). So, if this is for broad-stroke gestures for a paint or audio FX application, that would probably be OK. However, if you're trying to keep consistent and precise control over MIDI CC messages (especially with two or more fingers per hand), it doesn't seem like it will work very well. I can try more experiments with different lighting to see if it helps, because I know some people are using this for typing, which is very precise. I might need a light table or something so there aren't shadows from overhead light sources.
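The workaround I may try for the shifting indices: keep my own stable IDs by matching each frame's raw tips to the nearest tips from the previous frame. A generic sketch, nothing Leap-specific, and it assumes fingers only move a little between polls:

```cpp
#include <map>
#include <vector>

struct Tip { float x, y, z; };

static float dist2(const Tip &a, const Tip &b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// prev: my stable ID -> last known tip; tips: this frame's raw list.
// Greedy nearest-neighbour match: unmatched old fingers drop out,
// unmatched new tips get fresh IDs. maxJumpMm stops a finger from
// "teleporting" across the field and stealing another finger's ID.
std::map<int, Tip> matchFingers(const std::map<int, Tip> &prev,
                                const std::vector<Tip> &tips,
                                int &nextId, float maxJumpMm = 40.0f)
{
    std::map<int, Tip> out;
    std::vector<bool> used(tips.size(), false);
    for (const auto &p : prev) {
        int best = -1;
        float bestD = maxJumpMm * maxJumpMm;
        for (std::size_t i = 0; i < tips.size(); ++i) {
            if (used[i]) continue;
            float d = dist2(p.second, tips[i]);
            if (d < bestD) { bestD = d; best = (int)i; }
        }
        if (best >= 0) { out[p.first] = tips[best]; used[best] = true; }
    }
    for (std::size_t i = 0; i < tips.size(); ++i)
        if (!used[i]) out[nextId++] = tips[i];   // new finger: new stable ID
    return out;
}
```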
That said, using tools or just one finger on each hand is pretty solid, so if you make your app work with that (especially with gesture recognition possibilities), that would be great. For now, I'm interested in seeing what I can do with MIDI CC messages, because these can easily be sent to so many other apps like Ableton. I think I'll start with full-hand motions, because the tipping, rotation, and position elements seem very solid and are intuitive (and actually less tiring than using the fingers in some ways). Will post updates as I get something going.
Again, many thanks to Leap, Max, and aka!
Having conned Leap into letting me pass as a developer myself, and having received mine probably the same day seejay did, I can pretty much confirm everything he has to say (including the thank yous!). However, from my testing, this revision at least has a few other problems:
-- the sweet spot is small -- their claim of an 8-cubic-foot operational area is a bit optimistic. Towards the edges, the data not only gets less accurate (your palm is registered as tilting if you move too far off the x or z axes) but also, especially as you climb the y axis, gets quite noisy.
-- I had one round of working with it when low sunlight was coming onto my desk, and that light seemed to confuse it quite a bit -- false hands appearing and disappearing, mostly. Being accustomed to working with infrared-based positioning, this seems particularly un-robust, which makes me a bit nervous about its reliability onstage, when the ambient lighting can be unpredictable and changing. I've already banned lighting people from using red anywhere near me, but this seems much more sensitive, which is a bit of a concern.
-- the disappearing fingers are definitely a problem, but at least in the SDK, they are promising full joint recognition in a future release, so I'm holding on to optimism about that. However, I am somewhat concerned about its long-term possibilities for finer-tuned degrees of control. Like most Hot New Gizmos, don't expect it to solve all your problems...
M
Things are getting interesting in the gesturesphere:
Another nice find - if the reality is, erm, realized!
And a nice big up to Western Military Industries represented in the video too, so many more brown/heathen children to kill and maim with impunity!
Brendan
Don't worry Brendan, Bluetooth 4.0 seems to only have a range of around 160 ft; a bit close for Soldier 90... I hope
As usual, I wonder about the latency of MYO.
All getting very interesting if you want to hold/wear/sense gesture.
Yippee!
awesome. I'll go for the MYO :-)
It is an interesting tradeoff: having to wear something vs. more mobile range. I'm curious to see the kind of accuracy they can get from muscle analysis. I'd expect a camera to be more accurate, but underneath it all, it's muscles driving the motion.
Maybe there is some kind of calibration where you do fixed gestures to give it min/max values - something like the sketch below.
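A generic sketch of the idea only, not based on any actual MYO API: sample while the user performs the calibration gesture, track the running min/max, then rescale live values to 0..1:

```cpp
#include <algorithm>

// Min/max calibration: call sample() while the user performs the
// "show me your extremes" gesture, then normalize() live values to 0..1.
struct Calibrator {
    float lo, hi;
    Calibrator() : lo(1e9f), hi(-1e9f) {}

    void sample(float v) {               // during the calibration gesture
        lo = std::min(lo, v);
        hi = std::max(hi, v);
    }
    float normalize(float v) const {     // map live input into 0..1
        if (hi <= lo) return 0.0f;       // not calibrated yet
        float n = (v - lo) / (hi - lo);
        return std::max(0.0f, std::min(1.0f, n));
    }
};
```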
I hadn't thought about the accuracy yet, good point. I like the idea of being able to wear it while playing guitar... should allow for some interesting possibilities
Jeez, the guy 'Heil Drones'-ing on their front page is almost enough to put me off the entire project!
Thanks for the object Masayuki! You are one of my fav ppl in the Max scene.
I have seen some really accurate stuff done with the object (e.g. Eric Samothrakis on Vimeo), but I myself cannot seem to get the same accuracy with the aka.leapmotion object as with the Visualizer.
There are times when I can see one finger in the visualizer, but it is not seen by the object.
For the object, I often have to first place my 4 fingers in an easily readable position, before it will accurately begin tracking.
(the object will usually do fine after it begins reading one hand)
has anyone had these issues?
has anyone done a windows version?
Got my non-beta unit this morning. Tried it with Leap 1.0.2 and aka.leapmotion (compiled with SDK 0.7.0) under Snow Leopard. The hand tracking is robust but finger tracking is fragile. It takes something like 20% CPU (quad-core 2.3 GHz i7).
I've just received mine and could immediately test it in Max; thanks very much Mr Akamatsu :-)
works like a charm, incredibly fast and reactive!
I'm very excited! Definitely very promising - we're gonna have a lot of fun with this thing ;-p
this may be a stupid question, but I just got my Leap Motion controller and am trying to get it to work within Max. Thank you for the external Masayuki, well done. However, I still cannot get it to work. What application should I be running in the background? Is it the Leap Motion Airspace app?
lol. edit. just realised it's the Leap Motion app.... face palm!
good work on the external by the way Masayuki! and thank you again.....
aka.leapmotion doesn't seem to recognize tools.
not yet implemented ?
(the isTool flag always stays at 0, and chopsticks / pencils do not trigger any output)
(in LeapMotion / settings / Troubleshooting / Diagnostic Visualizer : I can see that tools are tracked by the device)
anyway, many thanks again for this external :-)
does anyone know if this would work with an Intel Core 2 Duo processor? I have a late-2008 unibody MacBook, and the Leap website says the minimum requirement is at least an Intel i3 processor.. I would love to buy it and try it, but I would be incredibly frustrated if I had the machine lying on my desk and just wasn't able to use it!
@Laonikoss: I could try this configuration with my older MBP if you can wait a few days.
That'd be awesome, thanks :)
Just got mine in the mail a few days ago. This thing is fucking cool. For real. I'm amazed at how well it tracks. Some serious potential here...
@Laonikoss: I have a late-2009 Core Duo MBP, and the Leap Motion works well with it.
Just want to mention, that the aka.leapmotion works well under 10.6.8 (2ghz i7 MBP). Thanks Masayuki!
@laonikoss: I tried it on my 2009/2010 MacBook Air (Core2Duo), and that was just too slow. It only has 4GB RAM, which I guess doesn't help, but the difference between the MBA and my 2012 MacBook Pro was significant in terms of performance.
I'm also quite disappointed in the product in general, not least because of the "phantom fingers" problems. I can hold out one finger, completely still, yet a second "finger" will regularly flicker on/off. I guess this will make precision work with aka.leapmotion/midi controllers a bit difficult.
the phantom fingers are a problem, especially under 10.6 on an Intel Core 2 Duo. Performance is better on later systems, I have found.
Works fine on my 2011 iMac... but the data needs analysis/filtering to be really useful... phantom fingers, bad data at the edge of the range... smoothing results (a sketch of that kind of conditioning is below)... but it is neat to play with!
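For anyone fighting the same data, the conditioning needed is nothing fancy. A sketch - the numbers are illustrative, tune them for your own setup - with a one-pole low-pass per axis to tame jitter, plus a trusted-region test to drop samples from the noisy rim:

```cpp
// One-pole low-pass: y += a * (x - y). Smaller a = smoother but laggier.
struct Smoother {
    float y, a;
    bool primed;
    explicit Smoother(float alpha = 0.2f) : y(0.0f), a(alpha), primed(false) {}

    float step(float x) {
        if (!primed) { y = x; primed = true; }  // start from the first sample
        y += a * (x - y);
        return y;
    }
};

// Reject samples outside a conservative working box (values in mm,
// illustrative), since data near the edge of the field of view is the
// least trustworthy.
inline bool inTrustedRegion(float x, float y, float z) {
    return x > -120.0f && x < 120.0f &&
           y >   80.0f && y < 350.0f &&
           z > -120.0f && z < 120.0f;
}
```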
"Gesture1" is a quick and dirty Max4Live-patch to control any parameter within Ableton Live with gestures using the Leap-Motion-controller.
Thanks to Masayuki again (aka.leapmotion (http://akamatsu.org/aka/max/objects))
Demo video: http://youtu.be/d6bmkCSqMiY