Ultraleap / Leapmotion
I just got the new Ultraleap device. The control panel is working with an M1 Mac / Monterey. Is somebody already working on a way to integrate it into Max?
Oooh, where'd you get it from? I only saw it as a pre-order on the webpage.
I guess GECO has been updated, if you don't mind middleware:
https://uwyn.com/geco/
(this is from a recent cdm article):
https://cdm.link/2023/07/geco-for-ultraleap/
I too am interested in this subject as I'm hoping to incorporate one into a drum setup.
Got it from Robotsound in Quebec ($139)
Unfortunately, from looking at the GECO docs, it's a pretty blunt instrument. No finger/joint tracking, so you can't calculate finger bends/touches or anything like that. I also don't see any indication that the OSC output is anything other than midi-style 7-bit resolution. Maybe there's more info on their Discord server, but for some reason it's 'unable to accept invite', for me at least...
Does look like Leap does have a beta at least for hand tracking, so there's hope...
\M
Was looking at the GECO page yesterday and it looks like you can send 14-bit MIDI, so I imagine the OSC would be something similar (an assumption, as I've not tested it).
edit: from the webpage:
High-resolution 14-bit MIDI support with customizable fine message number
High-resolution OSC support with customizable paths and OSC server selection
Also had a look: there used to be a couple of different leapmotion externals, but I think those are all abandonware at the moment since support for the Leap Motion 1 got discontinued by Leap Motion.
It would be great to have some kind of native (external?) way to parse the data from it as I'm always a bit dubious of middleware for performance and consistency reasons.
Leapmotion was great - there was aka.leapmotion for Max. I built a theremin patch with it that worked very well, but then it stopped working with Mac updates. I hope we find a way again with the new controller.
I use the IRCAM leapmotion object with my V1, works great in Big Sur, at least. Perhaps they're working on an update for Gemini....
Hi Rodrigo.
Also had a look: there used to be a couple of different leapmotion externals, but I think those are all abandonware at the moment since support for the Leap Motion 1 got discontinued by Leap Motion.
It is true that the older drivers got discontinued, but fortunately the new drivers also work for the older Leap Motion 1 controller. I've just tried it and it's working better than before. I've bought GECO and it's working, but I would prefer the "external" solution, since it gives you access to a lot more...
I only briefly played with the externals back in the day. It's promising if the new drivers work. I wonder what that means for the v2 version. Hopefully IRCAM gets one and updates things for it.
Waiting on that as well....
Years ago, I used GECO successfully for a while, but it was lacking some of the things I needed (see MATTYO's comments above). However, you guys might want to have a look at GLOVER by MIMU (both OSX and Windows), designed for use with the MIMU gloves and the Leap Motion, now also supporting Gemini.
I've got Leap 2 running under Gemini on M1, via Unity -> OSC -> Max 8. Not all parameters are useful for me and choices had to be made since there is definitely a data bottleneck. At the moment, I'm using speedlim to curtail the stream but that is not ideal - should be done at the C# source. In my clip, you can see the hand tracking in Unity on the left, and I built a calibrator, as you can see on the right side. The latter is needed to set up "playable" areas. Now comes the testing phase. I will report ;-)
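For illustration, the "limit at the source" idea is just a minimum-interval gate. A sketch in C (the actual pipeline here is Unity/C#; all names below are hypothetical):

```c
/* Minimal sketch (C, not the actual Unity C# code): drop frames that arrive
   faster than a minimum interval, so the full-rate stream never reaches the
   network in the first place. */
#include <stdint.h>
#include <time.h>

static uint64_t now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000u + (uint64_t)ts.tv_nsec / 1000000u;
}

/* Call once per tracking frame; returns 1 if the frame should be sent. */
static int should_send(uint64_t *last_sent, uint64_t min_interval_ms)
{
    uint64_t t = now_ms();
    if (t - *last_sent < min_interval_ms)
        return 0;               /* too soon: drop this frame */
    *last_sent = t;
    return 1;
}
```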
Ah nice, the latency/tracking looks pretty tight.
I did look at GLOVER but it looks like you can't get finger positions and smaller details, with most of it being focused on the broader gesture stuff (similar to GECO).
Since you have one set up, do you find it tracks even remotely ok if you have it oriented in this manner? (I saw something about being able to define where the sensor was relative to the hands like if you have it mounted on a headset)
Thanks, Rodrigo.
I haven't looked at different orientations yet, but I do know there is a dedicated "screentop" mode next to the desktop and VR modes. There is no dedicated app though, like there used to be in earlier versions. Instead, setting it up in another mode means changing my Unity setup (or C, or Unreal). As soon as I find the time, I will look at that.
The general latency is OK, but as said, for now I need to use speedlim to limit the amount of data. We're working on that one.
Just in general: I liked the Ircam external very much, but there were also things missing.
Awesome, do let me know if you have a chance to check. I basically haven't bought a v2 yet as my v1 has comfortably sat in a drawer unused for years...
My current/intended use case is mounting one sideways on a drum with the idea of picking up my hands moving in its FOV, like so:
So if it works in that orientation, that's hopeful/promising, and if it doesn't, it moves far lower in priority of things to get/test.
BTW, Alisa Kobzar and I are more than willing to release the Unity C# script on Github, but without the connecting maxpat that wouldn't make sense and I still have huge issues to deal with bringing the OSC into M4L - under time pressure. Guess? Yes, indeed, the dreaded live.remote~ to LOM-map things. So please bear with me.
It would be great to have a maxpat like aka.leapmotion - I think the Ultraleap could be a great tool for gesture-controlled music - I used the old Leap Motion version e.g. for a very well-functioning theremin.
I know, Bruno. But the aka.leapmotion was a Max external, so that needs someone diving into C/C++.
The Unity approach (C#) has the advantage that it is relatively easy to integrate it with games & 3D animation. There are so many people using Unity for different purposes. For many younger programmers, it is their very first platform...
I'd love to have the Unity C# sooner as a thing to start with :)
Sorry - in my old age I am a slow learner... so it seems I have to wait until somebody cares to write a new Max external with the same functions as aka.leapmotion....
Geco looks great, seems maintained and updated, and should work perfectly with Max. Curious what about it is keeping folks from using it as a solution for modern Leap Motion usage?
I've not downloaded/played with it, but I think GECO doesn't offer the full level of detail/granularity that the external versions offered (in terms of tracking individual fingers etc...).
aha, that's unfortunate. can't imagine why they wouldn't just pipe everything over based on some setting.
That would be useful, like an "advanced mode". Maybe it has something like that - the webpage is quite sparse on info - but as far as I can tell it's more oriented around piping over gesture-to-MIDI mappings.
Hey LeapMotion lovers !
the short version : I'm working on a new UltraLeap / LeapMotion external ;
available : hopefully soon :-)
the long version :
I’ve been using LeapMotions A LOT since 2014 in many different projects (music, installation, VR, ..)
http://www.mathieuchamagne.com/apertures
http://www.mathieuchamagne.com/volumes
I think I used all the available Max externals and other "bridge" software (leap to MIDI, OSC, …)!
Lots of fun... and frustrations, because almost every new version of the drivers breaks the externals, which most of the time were never updated…
The Leap Motion hardware hasn't changed for almost 10 years, but drivers have received lots of incredible updates over the years, improving tracking drastically. This device is still amazing, I really love it !
+ the new Leap Motion 2 is arriving on the market, and up-to-date drivers are finally available on macOS :-)
so :
Tired of waiting for some hypothetical new external (or official support of this device in Max), I've decided to hire a developer to help me create a new leapmotion / ultraleap external.
I’m unable to create such an external from scratch myself, but I think I should be able to fine tune it once I have something that connects to the SDK and returns some data...
I want to make an open source external and distribute it for free.
I saved some $€ to start paying the dev, and I’m looking for a way to fund this project.
I don’t want to make money with this project, just fairly pay a couple of days of work to a talented developer and share the external with the community.
(the talented developer /friend of mine is : https://jcelerier.name/ )
We’ve just started working on it.
We will share the external and source code on github as soon as we get something working.
We'll be using github sponsor to pay the dev :
https://github.com/sponsors/jcelerier
Would any of you be interested and OK to give some £€$ to contribute to this project ?
Here is a quick roadmap :
1) make something very similar to aka.leapmotion / ircam leapmotion / j.leapmotion ...
(fingerTips and hands id, pos, orientation, velocity, ...)
2) return all fingers joints / bones position & orientation (probably as a dict)
(--> draw full hand in VR, ..)
3) optimization to control a hand model
(hierarchy of jit.anim.node bound to the bones of a 3D hand model)
4) output leapmotion cameras as textures (à la jit.leapmotion)
...
________________
Mathieu Chamagne
mathieu.chamagne@gmail.com
This is great! If you succeed in making a really working Max external, I am happy to pay for it! For me the Leap Motion was a very important part of my performances - I hope to be able to use it again with the Ultraleap.
Ah very exciting!
It's also fantastic to have the source open/available as a way of, hopefully, future proofing things a bit.
Definitely down to chip in some money towards it (even though I don't own a v2 yet).
FYI your links don't work with the httpS in them.
Wow, thanks Mathieu for getting it together to take some action on this -- I'm very excited. I've pitched in!
\M
@RODRIGO : about your idea :
My current/intended use case is mounting one sideways on a drum with the idea of picking up my hands moving in its FOV
- no need for a leap motion v2 : v1 works fine on Mac with the new drivers.
v2 just adds a wider FOV and is a bit smaller, but basically it's the same.
- I already tried things like this: place the leap on its side on a table and track fingers: it works quite well :-)
(it's basically what they are doing in their touch-free solution: https://www.ultraleap.com/enterprise/touchless-experiences/touchfree-solution/ )
But if you grab some objects on your drum, you might lose tracking...
(there used to be a very cool tool-tracking feature in the v2 drivers... but it's gone)
( placing the leap upside down, on a mic stand or so, above the drum, and using "screen top" mode may give better results...)
Worth a try :-)
Ah cool.
I tried something similar years ago with the V1 with disappointing results (hence being excited about the increased FOV in the V2), though I imagine it has improved some with the drivers.
Someone on the Leap discord (one of the devs, I think) had responded to a similar question showing them moving around and holding "sticks" (pencils in their case, I think) and it seemed to work OK, though the objects did interfere with the tracking.
From their response:
And here's a list of things that will impact the tracking in this setup:
Holding sticks
Being closer than about 20cm from the device
The angle of the device (it needs to be facing almost entirely straight down)
The surface texture of the drum may impact tracking if it's overly IR reflective
I spent some time last week hacking together an arm64-only external that uses the new Ultraleap SDK, that seems to work so far. It currently just implements palm-positions and fingertip positions, since that's all I needed for my own purposes, but it should be fairly straightforward to implement all of the other tracking endpoints in their API. I'm attaching a current build here, in case anyone wants to give it a try, and have posted the source code up on github: https://github.com/pixlpa/ultraleap-for-max
I hope to document the code and dev process a little more soon, since there are some tricks to getting it to build with their SDK dylib and headers.
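For anyone curious, the core of talking to the Gemini service via LeapC looks roughly like this - a minimal sketch of the polling API, not the actual source from the repo:

```c
/* Minimal LeapC sketch: poll the Gemini service and print palm and
   fingertip positions. Error handling is kept to a minimum; link
   against the LeapC library from the Ultraleap SDK. */
#include <LeapC.h>
#include <stdio.h>

int main(void)
{
    LEAP_CONNECTION conn;
    if (LeapCreateConnection(NULL, &conn) != eLeapRS_Success) return 1;
    if (LeapOpenConnection(conn) != eLeapRS_Success) return 1;

    for (int i = 0; i < 1000; ++i) {  /* short demo run */
        LEAP_CONNECTION_MESSAGE msg;
        if (LeapPollConnection(conn, 1000, &msg) != eLeapRS_Success) continue;
        if (msg.type != eLeapEventType_Tracking) continue;

        const LEAP_TRACKING_EVENT *frame = msg.tracking_event;
        for (uint32_t h = 0; h < frame->nHands; ++h) {
            const LEAP_HAND *hand = &frame->pHands[h];
            const char *side = (hand->type == eLeapHandType_Left) ? "left" : "right";
            printf("%s palm: %.1f %.1f %.1f\n", side,
                   hand->palm.position.x, hand->palm.position.y,
                   hand->palm.position.z);
            for (int f = 0; f < 5; ++f) {
                /* the distal bone's next_joint is the fingertip */
                LEAP_VECTOR tip = hand->digits[f].distal.next_joint;
                printf("%s finger %d tip: %.1f %.1f %.1f\n",
                       side, f, tip.x, tip.y, tip.z);
            }
        }
    }
    LeapCloseConnection(conn);
    LeapDestroyConnection(conn);
    return 0;
}
```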
Dear fellow Leapers.
That's great, Andrew. And Mathieu, too. An external such as IRCAM's would be very elegant.
Sorry for the radio silence from my part, recently.
Our C# Unity version is running already - in fact I need to present new works with it in mid-November here at IEM in Graz, a bit of a race against the clock. For that source code, I'm collaborating with Alisa Kobzar.
Palm position, direction and velocity are fully implemented, as well as pitch, yaw, roll, and w (the quaternion) and grab and pinch, and Y axis of the fingers.
My m4l interface is covering all that and enables internal Live mapping with live.objects.
Extensive. At a price, though: we still have serious issues with orientations suddenly reversing, probably due to the C# LERP function, needed to keep the processing load of the OSC stream down. As you can probably confirm, identifying the individual hands - and that is the culprit - can be quite a challenge. But without LERP, my M1 completely stalls, and a speedlim object in m4l drops too many frames to be useful.
Anyway, we're hoping to get that sorted in the next 2 weeks.
Thanks!!! Works on M1 Mac. Would you tell us what x, y and z are in the palm and finger data specification? Great work.
The position lists come out as, for example, left/right [x] [y] [z], and the fingertip positions are output using a 0-4 index to identify the finger followed by [x] [y] [z] list. Ideally, time allowing, I'd like to add a dictionary output for the object that exposes more of the complete tracking frame output rather than dividing it up across more outlets.
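For anyone picturing the C side of that, the messages could be assembled roughly like this - a hypothetical sketch using the Max SDK, with made-up struct members, not the actual source:

```c
/* Hypothetical sketch of assembling those messages in a Max external:
   a "left x y z" palm message on one outlet, and an "index x y z"
   fingertip list on another. The t_ul struct members are assumptions. */
#include "ext.h"
#include "ext_obex.h"

typedef struct _ul {
    t_object ob;
    void *palm_out;    /* outlet for palm positions */
    void *finger_out;  /* outlet for fingertip positions */
} t_ul;

static void ul_output_palm(t_ul *x, const char *side, float px, float py, float pz)
{
    t_atom av[3];
    atom_setfloat(av + 0, px);
    atom_setfloat(av + 1, py);
    atom_setfloat(av + 2, pz);
    /* e.g. "left 12.3 200.5 -40.1" */
    outlet_anything(x->palm_out, gensym(side), 3, av);
}

static void ul_output_fingertip(t_ul *x, long index, float px, float py, float pz)
{
    t_atom av[4];
    atom_setlong(av + 0, index);   /* 0 = thumb ... 4 = pinky */
    atom_setfloat(av + 1, px);
    atom_setfloat(av + 2, py);
    atom_setfloat(av + 3, pz);
    outlet_list(x->finger_out, NULL, 4, av);
}
```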
I have to say that from my experience, dictionary outputs are considerably slower than old-fashioned data -> route, so keep an eye on that.
/M
I second that, Matt. Most solutions based on the old SDK also followed the route approach, for good reason.
Andrew: Am I stupid? (Or just getting too old?) My M1 Monterey / Max 8.5.0 sees the ultraleap.mxo, but it seems that it doesn't find a file called libLeapC.5.dylib. What's wrong? Thank you for your help.
Dear all,
Here is a first public version of our new Ultraleap Max external :
(Max Package available in the release section : https://github.com/celtera/ultraleap/releases )
Compatible with the Leap Motion Controller 1 & 2, Ultraleap 3Di Stereo, IR 170 Evaluation Kit.
Requires Gemini: Ultraleap Hand Tracking Software
Built for Max 8 on Windows and Mac (Intel & arm64).
This external returns :
frame_info : frame_id, left & right hands tracking status, device_framerate
hands (palms) : position(xyz), orientation(quat), velocity(xyz), pinch, grab
Fingers (tips) : position(xyz), orientation(quat), velocity(xyz), fingerExtended, fingerLength
This external is built thanks to the Avendish library, which allows (among other amazing things) automatic generation of Max/MSP and PureData objects.
Some notes about this new external :
Previous Leapmotion externals (aka.leapmotion, j.leapmotion, ircam leapmotion, …), based on drivers v3 & v4, used to return raw hand & finger ids because in the Leapmotion SDK up to v4 the leapmotion service was able to track up to 4 hands… so it was probably easier to output everything based on ids and let the user do the routing in Max.
Now in Gemini (driver v5), tracking is limited to 2 hands (left & right). It's too bad, I'll miss this "multiplayer" feature... (tool tracking is gone as well), but (according to the devs) these limitations allow faster hand detection and smaller latency, so we have to deal with it. The good point: as the device can track "only" 1 left hand + 1 right hand, we decided to separate data onto different outlets for left & right hands and fingers in order to ease routing in Max :-)
This external outputs frames as soon as they are returned by the device/API, allowing higher framerate and minimum latency.
On our todo list :
Add a « poll » mechanism : send a bang to return latest frame (to sync data to jitter rendering / VR / limit frame rate…)
Output all fingers joints / bones position & orientation (maybe as a dict, or as jitter matrices ?…)
Add a @unit attribute:
@unit mm (millimeter) (default): returns raw values, as in the current version
@unit m (meter): all position and velocity values are divided by 1000 (= can be used directly in Jitter / OpenGL)
Output stereo camera images (jitter matrix / texture)
Support multiple devices: Gemini allows multiple devices connected to the same computer. Select device by id.
Known issue :
- Finger velocity is currently not working (returns 0. 0. 0.); the Ultraleap SDK doesn't return finger velocities, so it has to be computed internally (see the sketch below).
- attributes do not work yet in this first version.
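For reference, the internal computation can be as simple as a finite difference between consecutive frames, using the frame timestamps (microseconds in LeapC). A sketch with illustrative names, not the actual implementation:

```c
/* Sketch: fingertip velocity as a finite difference between two frames.
   prev/now are positions in mm; t_prev/t_now are LeapC frame timestamps
   in microseconds. Result is in mm/s. Names are illustrative only. */
#include <stdint.h>

typedef struct { float x, y, z; } vec3;

static vec3 finger_velocity(vec3 prev, vec3 now, int64_t t_prev, int64_t t_now)
{
    vec3 v = {0.f, 0.f, 0.f};
    double dt = (double)(t_now - t_prev) * 1e-6;  /* µs -> s */
    if (dt <= 0.0) return v;                      /* guard duplicate frames */
    v.x = (float)((now.x - prev.x) / dt);
    v.y = (float)((now.y - prev.y) / dt);
    v.z = (float)((now.z - prev.z) / dt);
    return v;
}
```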
If you like it, please consider sponsoring this project https://github.com/sponsors/jcelerier :-)
Wow. Amazing!
So excited to try this, thank you so much!
Amazing! I was wondering where this was up to.
@mathieu chamagne Congrats! That's very cool. I'm glad that someone is working on a more full-featured external. Mine is admittedly a pretty casual endeavor. I'm curious how you find it to work in instant-output mode. With my attempt, I ended up using a worker thread to service the queue and having it output tracking data only when it received a bang, because anything else I tried was giving me a lot of latency.
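In outline, that worker-thread pattern looks something like this - a hedged sketch with assumed names, not the actual source of either external:

```c
/* Sketch of the worker-thread pattern (assumed names): one thread drains
   the LeapC queue and keeps only the newest frame; the Max bang method
   copies it out under a mutex. LEAP_HAND is a plain struct, so copying it
   is safe; the pHands pointer itself would dangle and must not be kept. */
#include <LeapC.h>
#include <pthread.h>
#include <stdint.h>

typedef struct {
    LEAP_CONNECTION conn;
    LEAP_HAND hands[2];    /* newest frame only, never a queue */
    uint32_t nhands;
    int64_t timestamp;     /* microseconds, from the frame header */
    pthread_mutex_t lock;
    int running;
} tracker_state;

static void *worker(void *arg)
{
    tracker_state *st = (tracker_state *)arg;
    while (st->running) {
        LEAP_CONNECTION_MESSAGE msg;
        if (LeapPollConnection(st->conn, 100, &msg) != eLeapRS_Success) continue;
        if (msg.type != eLeapEventType_Tracking) continue;
        const LEAP_TRACKING_EVENT *ev = msg.tracking_event;
        pthread_mutex_lock(&st->lock);
        st->nhands = ev->nHands > 2 ? 2 : ev->nHands;
        for (uint32_t i = 0; i < st->nhands; ++i)
            st->hands[i] = ev->pHands[i];  /* struct copy: no pointers inside */
        st->timestamp = ev->info.timestamp;
        pthread_mutex_unlock(&st->lock);
    }
    return NULL;
}

/* Called from the Max bang method: copy the latest frame out. */
static uint32_t latest_hands(tracker_state *st, LEAP_HAND out[2], int64_t *ts)
{
    pthread_mutex_lock(&st->lock);
    uint32_t n = st->nhands;
    for (uint32_t i = 0; i < n; ++i) out[i] = st->hands[i];
    *ts = st->timestamp;
    pthread_mutex_unlock(&st->lock);
    return n;
}
```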
@bruno make sure to install the Gemini software first (linked by Mathieu above).
For anyone curious, I also created a "dict.ultraleap" object that outputs a dictionary of tracking data instead of outputting messages. It doesn't seem to be noticeably slow in my testing. I mostly just wanted to see if it would work, and I do think it is handy (lol) for referencing the more detailed tracking data.
edit: I renamed my objects to avoid any naming conflicts with the other ultraleap externals in development.
@andrew : I wanted to try to get the smallest latency and highest frame rate possible, so I asked Jean-Michaël (the main dev) to try this method.
I think the point was to return data in the main Max scheduler thread to make it work as expected (have a look at the source code :-)
In my tests, it works really well, and I can feel that the latency is a bit better than with the polling method. No issues in Max & Max for Live, with lots of mappings and note triggering.
We will add a « poll » mode soon, in order to sync frames output to Jitter rendering, or reduce frame rate when needed.
About dict output: at one point I was thinking about returning everything in a dict, but parsing a big dictionary seems not very efficient in Max, so for the time being we preferred to stick with the multiple-outlets solution.
But for the finger bones, we will probably add a dict output, or maybe jitter matrices, as that would probably be the most straightforward way to draw all the finger joints in Jitter...
MATHIEU CHAMAGNE thanks for this object. I have problems with it on a Mac M1 with Ventura OS and the latest Max/MSP. Error: "No such object", while px.ultraleap works. I have tried to remove the quarantine flag but the problem remains.
@balam : looks like you didn’t copy the package in your documents/Max 8/packages folder…
did you ?
@mathieu that's very cool. I'll have to check the source for how you do it. My personal use case is Jitter-specific, so I opted for taking a polling approach first. If you're curious, you can check my px.dict.ultraleap object, which builds a dictionary of tracking data in the bang method. I have to admit it's a bit trickier to get used to dictionary usage in C vs. the classic message-passing approach, but I felt it made more sense for supporting the extended API data than to keep adding more outlets.
@Mathieu. Thanks for the reply. I have properly added the folder into the packages folder of Max objects. Andrew's objects work fine.
It's working now - I copied only the ultraleap folder into the Max folder.
I noticed a bug: the right-hand palm in a vertical position makes the object stop reading the device; the left hand works fine.
@BALAM : can you see any discrepancy between the Ultraleap hand tracking service / control panel and what you get in Max?
There shouldn't be any difference; the external just returns the exact same data that is provided by the SDK / service.
If hand tracking looks unreliable, check your tracking mode in the Ultraleap control panel, clean your device, close your curtains (because of the sun's IR light)... or remove your glasses 🤓! (In some cases I noticed that my glasses could produce some "ghost" hands.)
If I wasn't on tour, and didn't want to change horses in midstream, I'd be trying it!
By the way, has anyone got a sense of whether the v2 hardware is worth it? Anyone notice a significant improvement on the hardware end?
\M
@mattyo : I have a bunch of leapmotion v1 controllers, and one v2.
IMHO the only thing that has been improved is the field of view of the V2.
(and maybe the tracking distance)
For VR stuff, it can make a difference. When used in desktop mode : not so much.
The device itself is "just" a couple of IR cameras, the tracking is achieved on the computer, so nothing new on this side.
@MATTHIEU: I do precision stuff with V2 in desktop mode and experience less susceptibility to external infrared. No idea how Ultraleap has achieved that. But the real test case is being on the road with it, playing under all kinds of different conditions, like I did with V1. A photographer in the venue using infrared to focus on me was a problem, as well as security movement sensors. Performing at open air festivals is a big deal on a sunny day, or under moonlight. @MATTYO: Yes, the hardware end is ultimately responsible for good reception. The tracking is driver software and works fine for me.
Tip: Nowadays, I always bring double-necked music-stand clip-on LED lights, taped with some white gaffer tape to dim them. Attached to my laptop stand, I aim them upwards against the palms of my hands at an angle right in front of me, so that the audience does not look straight into them (which can add the collateral fun of a shadow play against the back wall ;-). Furthermore, I absolutely insist on having all the stage lights in the grid dimmed to the bare minimum if they are LED, and if they are tungsten, I cannot perform. All that has made a difference, basically taking the light situation of the venue into my own hands.
@JBBOLLEN: Wow, that's a pretty rigorous setup! Frankly, probably because I try to travel as light as possible, I never even thought of local illumination, and I guess if you didn't want the light visible, there turn out to be all kinds of IR lights for the VR crowd. You definitely have to keep outside light from coming in - a million years ago, when I was still using the P5 glove, I once had a gig go completely south because the lighting guy decided to go crazy with dramatic red illumination...
@MATTYO. Yeah, external IR is trouble. Strangely enough, sometimes I play from a score, and I even have to make sure the light of my iPad is not reaching my hands... Often, I stand right IN the visuals (like here https://youtu.be/lBatyd0k0L8?si=lWaawxW4ZAxjulFR and here https://youtu.be/DRoWYApiWKI?si=izie4i8QUEDvt895&t=280) and with reasonably new projector technology that does not seem to be a big deal. I prefer solid-state laser DLP's - they disturb the Leap the least. A big LED screen behind me is also great, just sometimes a little overpowering (see photo).
It's quite impressive how the Leap tracking software has evolved over time since its first release.
When I started using Leap Motions (2014) the device was VERY sensitive to external lights; I had a couple of bad experiences on stage with tungsten lights, or daylight :-s
But software updates really improved the tracking in many situations. ("robust mode" in V3 & 4)
When using a leapmotion on stage, I usually ask for LED lights only, no daylight.
Other devices emitting IR light can be an issue as well : Kinect, HTC VR lighthouse, ...
Thanks, Mathieu. I have been testing your external this morning and haven't come across any issues. Great work, congrats! Much snappier than via Unity/OSC - a project we will complete anyway since it's quite useful in general. But I was wondering why you are considering implementing the finger joints / bones position & orientation differently (dict, jitter matrices)? Is it because of the sheer amount of data? For me, your approach, i.e. in line with the tradition of the old V1 external, works absolutely fine. Any timeline on implementing them finger bones? I am going to support this; your contribution to this small community of users is very valuable. Thanks again!
Thanks JBB :-)
about finger joints / bones: yes, the main reason why I wanted to try a different approach is that it's quite a big amount of messages, and I'd like to optimize parsing. But I'll have to test different methods to know if it's worth doing it with dict...
I like the jitter matrices idea, as finger-bone data will most of the time be used to draw a full hand in Jitter / OpenGL.
But maybe you're right: I'll probably first try to add separate outputs for finger joints, and see how it goes...
We have a running version with a poll mechanism and multi-devices support, a new release will be available very soon :-)
Fingers joints will be the next step.
@Mathieu. From my Unity/OSC experience I know that the Y-axis of the first two bones of the fingers is very useful for both switching and continuous data, rather than the fingertips. This is, of course, because they are less subject to what the rest of the hand does in terms of Y output.
Anyway, the great thing about the new external is that it is relatively easy to adapt legacy Max/MaxforLive patches, as I was able to establish today... especially since it is backward compatible with the old hardware.
@Jbbollen: For continuous finger data I usually use the distance from fingertip to palm, but I'd be interested to hear how you work with the first two finger bones - sounds interesting! I usually use some kind of ML for switching rather than calculating anything - just train a model on, for example, thumb touching index from various angles, and only training on a couple of points rather than the whole hand (I forget which I use exactly) so I don't have to think about other finger positions, and that works pretty well for me...
\M
Hello. I downloaded the ultraleap package @MATHIEU CHAMAGNE and saved the folder into the Max 8 packages folder. But when I try to use the ultraleap object I still get the "No such object" error message in my Max console. I already installed the Ultraleap control panel (Gemini) and it runs fine. I'm using a MacBook Pro, Intel, macOS Ventura 13.5.2.
Any advice would be greatly appreciated. Thank you.
Hello Mei-Ling
you downloaded a folder named UltraLeap-1.0.0-Max-package ;
it contains a sub-folder named "UltraLeap" : this one must go in the Max package folder.
@ Mathieu, Thank you SO much. It worked. I copied the wrong folder before. Now I can go back to my music writing. Thank you!
@MATTYO. I use the finger direction, and the only really useful one for me is the 2nd value, Y.
How do you ML your setup? Are you using Glover?
Interesting -- I never tried that -- I should have a look.
I can't find my training patch, but looking at my parser, I'm using mubu.gmm, and trained it on the 3-d distances between all four of my fingers and my thumb, nothing else. Just five states: 1-4 for a finger touching my thumb, and 5 for no touch.
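Computing that feature vector is cheap. For anyone wanting to try the same approach, here's a sketch (illustrative names; the classification itself would still happen in mubu.gmm or similar):

```c
/* Sketch: the four thumb-to-fingertip distances used as a feature vector
   for a GMM classifier. tips[0] = thumb, tips[1..4] = index..pinky. */
#include <math.h>

typedef struct { float x, y, z; } vec3;

static float dist3(vec3 a, vec3 b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return sqrtf(dx * dx + dy * dy + dz * dz);
}

/* out[0..3] = distance from thumb tip to index/middle/ring/pinky tips */
static void thumb_distances(const vec3 tips[5], float out[4])
{
    for (int i = 0; i < 4; ++i)
        out[i] = dist3(tips[0], tips[i + 1]);
}
```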
@MATTYO I looked at Mubu and it looks very interesting (albeit a tad cumbersome ;-). I never got round to investigating the ML path properly. As far as my setup is concerned, mine is focused just on 'cause & effect' in the basic sense of the word. My maxforlive internal mapping schemes differ per piece but they are always very complex. I have been of the opinion that with proper in- and output scaling I could get a long way. That turned out to be true, but I think there will come a point where I will have to deal with ML...
95% of what I do is also with just scaling and our good friend the ease object, as my approach is pretty much like yours -- I just found ML more reliable for triggers than doing math. I've worked a lot with MuBu over the years, so I'm used to it, although it is a bit of a battleship, no question. You can do more or less the same things with ml.* or FluCoMa....
Thanks Matt, have seen ml.* and FluCoMa around and will check them out, again.
Haven't tested this with Leap Motion V1, but with V2 under Gemini, grab and pinch work differently - and not for the better, since there is a large area where they are not properly separated. You can try this for yourself. Here we go:
Bring a hand into the space horizontally, stretched, and then pinch - it is not reliable. It gets worse when the hand is further removed from the Leap. Then grab. This is reliable ONLY if the thumb is stretched out.
Now bring the hand in vertically, with the thumb at the top. Now pinch. That is more reliable. But when you grab in this position, the pinch joins in.
The result of this is that grab is only reliable horizontally and pinch only reliable vertically.
With my Unity implementation, it is exactly the same..
Anyway, this was slightly better under 2.3.1.
Do you have similar experiences?
Dear leapmotioners
I've just released a new version :
https://github.com/celtera/ultraleap/releases/tag/1.0.0-beta2
Lots of new features :
bones: 2 outlets added, one for each hand; returns:
fingerIndex boneIndex prev_joint(xyz) rotation(quat) next_joint(xyz) width(f) length(f)
poll mechanism added: a bang returns the latest frame
multi-devices support:
@device_index: (default: -1: returns all frames from ALL devices)
@device_index 1: use first device plugged in
@device_index 2: use second device plugged in
...
@device_serial: specify the device's serial number
@unit attribute: (default: mm = native millimeters)
@unit m: all distances are multiplied by 0.001 (= allows drawing directly in OpenGL in world units (meters))
Have fun, and let me know how it works for you :-)
@Mathieu. Have been playing around with the new version of the external this afternoon and it behaves very stable. The polling is a real bonus. Well done!!!
Hey -- just opening this stuff up, now that my tour is done...
Does everyone's computer fan immediately rev up to 5000 and the core temp hits 90º as soon as you launch the Hand Tracking Service?
on a 2021 MBP...
\M
@MATTYO You're on M1, right? What do you mean by "launch the Hand Tracking Service"? Ultraleap told me that once it is installed, plugging in your Leap will start the service (libtrack_server - see Activity Monitor). Yes, it is expensive; I reach peaks of 44% CPU (idle, it is around 5%). This was even more the case on my Intel, which ran incredibly hot, and for which I installed TG Pro to be able to keep a good eye on the demands and have manual control over the behaviour of the fans. I thought I wouldn't need it for the M1, since the CPU is far more efficient and there is more room inside the laptop's casing to deal with the heat, but now that I'm designing new works with lots of Leap mappings inside m4l, I think I will also install it on this machine.
@MATTYO. I installed TG Pro on my M1 just now and see no changes in fan behaviour or temperature with the Leap plugged in. Must be something else going on with your machine :-(
I've got an Intel still - so maybe it has to do with that. I use Macs Fan Control for the same reason - I'm constantly running my machine hot, what with all that crazy DSP and Jitter. Maybe time to upgrade my machine. Sigh.
@MATTYO Definitely an Intel issue, imho. The good thing about getting an M1 is that you can reserve the Intel for gig backups. Heavy though, those 2 machines + adaptors in one's backpack. Heavy on the wallet, too.
Had a successful final presentation at IEM, Graz, tonight, with 3 new audiovisual solo pieces using Mathieu's external. Thanks, Mathieu! Thanks also to Andrew Pask @Cycling for his support. The only few glitches in the hand tracking were indeed from the reflections of my hands in my glasses...! The Leap Motion 2 might just be too sensitive! Maybe I need to switch to lenses for these gigs, but with +6.5 (& 1.5 reading addition) that is a bit steep... ;-) Will keep you guys posted.
That's great JBB !!
(I have anti-reflective glasses and I think it really helps ;-)
I've also started using Ultraleap in concerts, seems very reliable. :-)
In some situations (when the Ultraleap hand tracking service requires too much CPU, or because USB 3 cables are too short!) it's more convenient to dedicate a computer (mini-PC, Mac mini, ...) to the Leap Motion and send data over the network via OSC.
Here is the easiest / most efficient way I've found to achieve it, using the spat5 OSC externals.
Each leapmotion frame is sent as one OSC bundle, so it doesn't congest the network.
This patch requires the IRCAM Spat5 package:
with the CNMAT openSoundControl or o.dot packages, one OSC bundle CANNOT contain multiple messages with the same address... so each finger & bone would need a unique OSC address;
with spat5.osc.collect it's OK to build a bundle containing the same address several times, and messages are kept in the same order as they were collected.
(this patch will be included in the next release)
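For comparison, outside Max the same one-bundle-per-frame idea could be sketched with liblo - an assumption, since the patch above uses spat5.osc.collect - taking advantage of the fact that an OSC bundle may legally contain several messages with the same address:

```c
/* Sketch: one OSC bundle per tracking frame, using liblo (hypothetical
   addresses). A bundle may contain the same address several times, which
   is exactly what the CNMAT packages refuse to build. */
#include <lo/lo.h>

static void send_frame(lo_address dest, int nhands, float palms[][3])
{
    lo_bundle b = lo_bundle_new(LO_TT_IMMEDIATE);
    for (int h = 0; h < nhands; ++h) {
        lo_message m = lo_message_new();
        lo_message_add_float(m, palms[h][0]);
        lo_message_add_float(m, palms[h][1]);
        lo_message_add_float(m, palms[h][2]);
        lo_bundle_add_message(b, "/leap/palm", m);  /* same address is fine */
    }
    lo_send_bundle(dest, b);       /* one UDP packet per frame */
    lo_bundle_free_recursive(b);   /* frees the messages too */
}

int main(void)
{
    lo_address dest = lo_address_new("127.0.0.1", "9000");  /* hypothetical */
    float palms[2][3] = {{0.f, 200.f, 0.f}, {50.f, 180.f, 10.f}};
    send_frame(dest, 2, palms);
    lo_address_free(dest);
    return 0;
}
```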
For those who maybe don't want to drag around a second computer, I definitely got better responsiveness by creating an app that does nothing but read the Leap data (and do whatever my standard parsing is) and send it out via OSC to my performance patch. Since I did all my parsing in the app, vanilla OSC worked fine for me...
\M
@Mathieu. Thnx, I looked at SPAT anyway for use in the Cube@IEM but did not have the time to further explore the new version (so I used simple surround methods). Thanks for the patch! When Gemini was not yet available for Mac, I bought a NUC to run it, and it transmitted the data over an ethernet cable via OSC. So I'm all set up for your suggestion. I've basically been on the road for 7 months, so I'm looking forward to finally returning home on Friday and giving the SPAT approach a try.
@MATTYO @Mathieu. Didn't realise you were running PD for your Leap projects, Matthew. How nice & hardcore, bless you ;-) I have no issues with Mathieu's Max external and responsiveness on my M1. I use the polling option set to 5 (ms). But the 2nd-hand NUC I bought is tiny and weighs nothing, so if you have any problems, that might be a solution, indeed.
Oh, no, Jan-Bas -- it's all in Max! And this is all with the old version -- I'm currently in the midst of the nightmare of migrating to a new computer, so that first, then a serious look at Ultraleap...
Is anyone else having basic performance issues with the tracking on the Leap 2? I'm finding it's barely usable - I get much better performance from MediaPipe etc. I struggle to believe that something so buggy could ship, so I'm wondering if others have had similar experiences, or whether I'm doing something wrong?
@IRONSIDE. No basic issues w Leap 2 here. What are your system specs?
Hi - I'm using a MacBook Pro 16” M2 Max 32GB. I find my Leap's performance unusable; hands keep appearing and disappearing every few seconds. For now I'm just using MediaPipe and also an OAK-D on a PC, but I'm pretty baffled. I have tried a load of different lighting conditions to try to remove that variable.
@IRONSIDE. Your problem might be specific to M2? Sorry, can't help you - running Intel & M1, here. Could be just as much a software tracking problem as a hardware failure. Did you contact Ultraleap?
I used the Leapmotion controller for several years and loved it, until it didn't work anymore. I bought the new Ultraleap and found the performance unsatisfying. Now I am using the current software with the old Leapmotion and this combination works absolutely fine.
@IRONSIDE
I'm using the exact same computer (MacBook Pro 16” M2 Max 32GB), both leapmotion controller 1 & 2 do work very nicely here.
I managed to connect up to 3 Leapmotions simultaneously on this laptop, very high frame rate and low cpu. 👍
Are you experiencing these issues only with our Ultraleap external, or can you observe the same problems in the Ultraleap control panel as well? (What is your device frame rate?)
I noticed that leapmotions do not like USB extenders or low-end USB hubs... (the frame rate is OK, but after a while it randomly just stops working without any warning, and needs to be unplugged / replugged.)
About multiple devices: on a MacBook Pro M2, each leapmotion must be connected to a different port on the computer (so the maximum number of devices is 3).
I'm also using leapmotion v1 controllers on a quite old Mac mini (2012) at 90 fps, which is very satisfying :-)
I've found the new tracking code to be kind of mixed (I'm using an MBP M3, and the original hardware). This is all related to things I can see in the Ultraleap app itself, so it's not the Max object, which seems to track the data as it's coming from the Leap just fine.
It is much more sensitive to lighting conditions than it once was. I could use the old tracking software in all kinds of lighting conditions, which is no longer true (I have to close my blinds now to work with it during the day, for example).
It is much more likely to identify random bright(er) objects in its field of vision as hands, so you have to be careful to make sure it's not seeing anything it might find attractive.
I only use my right hand, and it often identifies my right hand as my left
The finger tracking is far inferior to the original version. In perfect conditions it's smoother, but it's likely to confuse adjacent fingers.
You're more likely to lose a hand (or get crap finger data) when your hand is pitched/rolled. I've had to rescale everything dependent on those values so I don't tilt too far and lose a hand.
On the plus side, when everything is tracking, data is smoother and more accurate.
Personally, I found it far more usable in the older version (and the Leap has been my primary performance interface since the hardware was still in beta). I have had to change my playing style somewhat to adjust, and be much fussier about lighting conditions. I find the finger tracking almost useless now, and am considering going back to a glove with bend sensors. However, these are all things that need to be taken up with the Leap people themselves, as - at least on Mac - the ultraleap object just repeats the data coming from the Leap faithfully.
\M
I've been trying to get all the fingertips' Y positions. Any suggestions?
Hi @MATHIEU CHAMAGNE
I already installed the Ultra Leap hand tracking software
I downloaded the UltraLeap package and saved the folder into the Max 8 packages folder.
But when I try to use the ultraleap object I still get "ultraleap: unable to load object bundle executable".
I'm using an i7 MacBook Pro on 10.13.6.
I would appreciate your help!
have a good day
Hi Thomas
looks like Ultraleap / Gemini requires macOS version 11.0+ ...
Could you see your hands in "Ultraleap Hand Tracking.app" ?
Thanks @Mathieu, I don't have the controller yet;
I wanted to know before buying it whether it would work on my computer.
The Ultraleap 3Di is not supported under macOS with the latest drivers. Would be great if it were.
Has anybody had experience with the Ultraleap 3Di?
Dear Leap Motion aficionados
I'm happy to announce that our Ultraleap package (v 1.0.0) is now available in the Package Manager :-)
Please delete any previous version, install it from the package manager and give it a try !
Cheers
Mathieu
Great, Mathieu. Looking forward to it. In the meantime, here's a clip of some work that I did last month with our joint previous version. As I mentioned before, I scale all data twice, both at input and output, before it goes to my audio and visual parameters. I also built a preset to provide backward compatibility with the LM V1.
Thanks JB, sounds and looks great !
@Mathieu
Brilliant work mate. I've been using the Leap since 2012 in interactive installations and have found that successive updates of Windows and Max have caused a lot of problems, so I was hoping somebody would do something like this!
@Mathieu.
Only now will I have some time to look at the package.
But before I touch it: did you guys change anything since the beta2?
No breaking changes, but some improvements in v1.0 :
fingers velocity fixed
dump outlet added (reports info on all connected devices)
maxref file added
updated .maxhelp
Patches made with beta2 should work just fine.
Great! Thank you very much! And congratulations to JB Bollen on his performance!