Link Leapmotion data to rigged 3D hand
I want to control a rigged 3D hand with the Leap Motion. I tried to link the data from the leapmotion object to anim.nodes, but without any luck. The rotation of the whole hand looks similar, but if I try to get the fingers in sync, I get some very weird hand moves that would be anatomically very challenging :-)
I found this JavaScript tool which I want to recreate in Max:
https://github.com/leapmotion/leapjs-rigged-hand but unfortunately my JavaScript knowledge is very poor.
There is also a live demo online (if your leap is plugged in) : http://leapmotion.github.io/leapjs-rigged-hand/
or you can see it here without a Leap Motion: http://leapmotion.github.io/leapjs-rigged-hand/?playback=1
I attached a patch with a rigged hand model; could someone point out how to control the fingers so the whole hand moves like the Leap Motion sketch example? The patch is an edited leapmotion help patch sending the data to anim.nodes.
Thanks a lot in advance!
For the rotation of the whole hand I now use the palmnormal data of the leapmotion object and convert it to rotatexyz data going to the lefthand_Wrist jit.anim.node; would that be the best way to do it?
And for the fingers, is there a smart way to convert this data? Which data from the fingers, and which attribute, control this in the jit.anim.node?
In Unity there is a setup to get the hands into a 3D world: http://blog.leapmotion.com/image-hands-bring-your-own-hands-into-virtual-reality/
Unfortunately I've never used Unity, but I imagine the code can show how it's done. It would be great, like in this Unity example, to combine the virtual hand with the video from jit.leapmotion.
I also found an image which explains the terminology of the finger data:
Hi
I'm also trying to achieve something very similar, so let's join forces !
In your patch, you try to use palmnormal to control rotatexyz; it can't work this way: palmnormal represents the normal vector of the palm; it's not the full 3D orientation.
In fact, I'm quite surprised to see that the leapmotion external doesn't seem to output hand orientation as a quat.
we can only get palmnormal + direction.
jit.anim.node has a 'direction' attribute, no normal or up_vector ...
There must be a way to combine these 2 values (direction + normal) to get a proper quat or rotate or rotatexyz ...
but actually, I'm stuck on this problem !
and then, we will have to do the same for fingers, with one more potential problem: we will need to set each phalange's orientation as a relative orientation to its parent...
(and I'm afraid that the direction + normal given by the leapmotion external are absolute, not relative ; but I may be wrong !)
I'll continue investigating!
Mathieu
yo, i'm not looking srsly at this but maybe, if the features of the leapmotion external object are insufficient, you could use the tcpigolo js bridge? at www.leapformax.com . There are more options accessible iirc, but it's waaaay harder to install and use
Thanks Vichug
I used tcpigolo a bit last year, but AFAIK, the Leap API simply does not return orientations of hands & fingers in any other way.
here is a frame of one hand captured with Leap WebSocket JSON Viewer :
http://pastebin.com/P7QtXwhR
The IRCAM leapmotion external also returns hand orientation as a 'basis':
basis, -0.981261 0.045569 -0.187215 -0.078338 0.79335 0.603704 -0.176038 -0.607058 0.774914;
... but I don't know how to transform this list into a valid quat :-s
(this is only for hands ; fingers have only direction)
Hey Mathieu and Vichug!
Nice to have some extra forces to try to fix this!
This reference made some things clearer for me: http://digitus.itk.ppke.hu/~horan/LeapImageJ/LeapSDK/docs/java/devguide/Leap_Overview.html but unfortunately the tests I did today had no success...
In another post I saw that Rob Ramirez came into possession of a leapmotion and planned to look into this. Rob (and/or Jeremy and Andrew), would this also be part of what you're looking into? That would be great!
It must be that one can get access to more than just the basis vectors.
According to the link below and basic animation principles, you need to calculate the orientation from the derived rotation matrix.
If the API provides this to other consumers it should be available in the Leap API; the question now is whether any of the current Max objects expose this or not:
I'm interested in sensor fusion to extend the node graph of the Kinect to the Leap Motion and drive a complete avatar including the hands. So I will be glad to help where I can, hopefully generating more help and example patcher files to add to the Worldmaking package at the same time.
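For reference, turning a basis like the one posted above into a quaternion is the textbook rotation-matrix-to-quaternion conversion. A JS sketch (assuming the nine values form a row-major 3x3 rotation matrix; depending on how the external orders its basis vectors, a transpose may be needed):

function matrixToQuat(m) {
    // m: 9-element row-major 3x3 rotation matrix; returns [x, y, z, w] (jit.quat order)
    var x, y, z, w, s;
    var trace = m[0] + m[4] + m[8];
    if (trace > 0) {
        s = Math.sqrt(trace + 1) * 2;                // s = 4w
        w = 0.25 * s;
        x = (m[7] - m[5]) / s;
        y = (m[2] - m[6]) / s;
        z = (m[3] - m[1]) / s;
    } else if (m[0] > m[4] && m[0] > m[8]) {
        s = Math.sqrt(1 + m[0] - m[4] - m[8]) * 2;   // s = 4x
        w = (m[7] - m[5]) / s;
        x = 0.25 * s;
        y = (m[1] + m[3]) / s;
        z = (m[2] + m[6]) / s;
    } else if (m[4] > m[8]) {
        s = Math.sqrt(1 + m[4] - m[0] - m[8]) * 2;   // s = 4y
        w = (m[2] - m[6]) / s;
        x = (m[1] + m[3]) / s;
        y = 0.25 * s;
        z = (m[5] + m[7]) / s;
    } else {
        s = Math.sqrt(1 + m[8] - m[0] - m[4]) * 2;   // s = 4z
        w = (m[3] - m[1]) / s;
        x = (m[2] + m[6]) / s;
        y = (m[5] + m[7]) / s;
        z = 0.25 * s;
    }
    return [x, y, z, w];
}

// e.g. with the basis list posted above:
var basis = [-0.981261, 0.045569, -0.187215, -0.078338, 0.79335, 0.603704, -0.176038, -0.607058, 0.774914];
var quat = matrixToQuat(basis);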
hey guys, i've attached an example using the Ircam leapmotion external, of how to properly orient the hands.
basically, you take the hand's "palmnormal" and "direction" properties as the yaxis and zaxis attributes of jit.quat. you can then take the cross product of these two axes to get the xaxis (my example uses jit.gen for this, but it's a simple math function to do in JS for example).
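in JS, that cross product would be something like this (a sketch; the variable names and example values are just illustrative, assuming palmnormal and direction arrive as 3-element lists):

function cross(a, b) {
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]];
}

var palmnormal = [0, 1, 0];    // -> yaxis attribute of jit.quat
var direction  = [0, 0, -1];   // -> zaxis attribute of jit.quat
var xaxis = cross(palmnormal, direction);  // -> xaxis attribute of jit.quat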
after providing these axes, you then multiply the quat output with an offset quaternion which will vary depending on the default orientation of your model files. this quat goes directly to the jit.gl.model object.
fingers are of course another story. i suppose it's possible to derive finger-bone orientations from the previous and next joint position values output by the leapmotion external. however it would be better to simply get those values from the Leap SDK directly, using bone.Matrix(). i'm not sure if there's any way currently to retrieve those values.
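for the joint-position route, deriving a bone direction from the two end joints would look something like this in JS (just a sketch; the joint values are illustrative):

function sub(a, b) { return [a[0] - b[0], a[1] - b[1], a[2] - b[2]]; }
function normalize(v) {
    var len = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    return len > 0 ? [v[0] / len, v[1] / len, v[2] / len] : [0, 0, 0];
}

var prevJoint = [0, 180, 30];   // example joint positions
var nextJoint = [10, 182, 25];
var boneDirection = normalize(sub(nextJoint, prevJoint));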
Hey Rob!
Great!! Thanks a lot! Works great; I'm gonna check the patch so I can fully understand what's going on.
I hope the fingers can eventually also join the party...
Thanks again for the effort!
Hey guys,
Here is a patch I made and an example video.
Hope this helps.
When I get the time, for version 2 I will parse each finger, create a physical dynamic mesh string, and dress each finger up with jit.gl.path.
I just need the time to do it.
patch in attachment
phiol
Rob, thank you very much !
I can't count the hours I spent trying to do this... my math sucks !
and about fingers: yes, I'm pretty sure that we are missing something in the existing leap max externals.
ideally, we should probably have a dedicated outlet to directly drive a hand model's bones, with relative orientation of phalanges...
Graham, can you read us ? :)
Hi Phiol,
Thanks for your patch, looks great!
The only thing is, I don't see the blue balls representing my hand; the physics does work, just with an invisible hand.
I don't get it: I see the gridshape connected to the jit.gl.multiple but it shows nothing. Is it something I just don't see, or am I missing something?
The mouse sphere works like it should and shows up when enabled. I also see the cube.
I just emailed Jules Françoise (maker of the leapmotion object) to ask if he can help with how to get the finger orientation data; hope he can help.
Hello Aartcore,
yeah I know, it's a little bug I gotta fix: you must put out your left hand first, then the right hand will appear as well.
has to do with a pack somewhere (left inlet puts out the message, kind of thing). I gotta fix it. I put this together super quick for a prototype I had to do.
let me know if (left hand first) works.
phiol
Hi Phiol,
it works sort of... I see the blue spheres now, but it stutters. My hand(s) (and now also the physics) very quickly show up and then disappear (and the physics won't work anymore). I don't see my hands anymore; resetting the metro works, but only for a very short period. It is very unstable behaviour, so it's also hard to describe properly.
here is a small video from an installation I made recently using an HTC Vive + LeapMotion:
https://youtu.be/Jh_WobA35hk
the hands & fingers are drawn using several jit.gl.path
I like this appearance! but it's quite CPU-consuming... that's why I would love to use a simple hand model with bones; it would certainly be more efficient.
BTW: there are several parts of this patch that I'm willing to share (why not in the worldmaking package?..)
just need to strip it down & clean up a bit to turn it into intelligible patches !
Looks very cool Mathieu!!
Would be very nice to be able to see parts of the patch.
Bravo Mathieu !
@aartcore: I don't know what to say, what I get is what you see in the video.
Hi all
Just catching up on this thread.
Bravo indeed Mathieu, that video looks awesome. (Mind if I use a linked/credited screenshot on the worldmaking package page?)
I have a half-finished leapmotion external that I would be happy to make part of that open source package (and even happier if others can contribute -- definitely appreciate the offer Mathieu!).
The stage I got to included both video-through and most of the joint tracking data. The goal was being able to drop an abstraction into a patcher and get good hands for VR, including basic gestures and collisions, and reasonably aligned video overlay if desired.
The point I got stuck on was the sheer amount of data that the LM produces, and how to reduce this to what is really useful in a patcher. I was exposing all of this via output dicts, but it gets expensive to filter and extract this data for every joint of every hand in a patcher. So my question is: what's the optimal data to get out of the leap for modeling/animating a hand in Jitter?
If there isn't a one-best-way perhaps there could be 'modes' with different output styles.
Graham
thanks guys :)
Graham : screenshot on worldmaking page : please do ! ... I should credit you as well :-)
in another project I used an external called j.leapmotion
https://github.com/theod/j.leapmotion
we made this one 2 years ago with some developers from the Jamoma team. It's quite similar to aka.leapmotion, but with different outputs for fingers, hands, gestures, ... (just like the IRCAM / Jules Françoise external)
it doesn't output all phalanges (because I didn't need that at the time!), but it has some useful info that does not work or does not exist in IRCAM's external, such as the 'finger extended' info, or a working Tools outlet (only available with leap driver 2.x)
Of course, feel free to grab anything from this code if needed (but I think it's really very simple..)
When I use this external, most of the time I start by storing all the data into dictionaries.
it's then quite easy to query the desired info for each hand / finger, ...
so I like your idea of outputting dicts directly.
but you're right: it can be expensive to get and route all the bone info for all fingers on each frame...
maybe a dedicated outlet sending orientations directly to a model's anim.nodes could bypass some complicated Max parsing & routing & geometry & math steps?
I like the way dp.kinect works: data can be sent in many different ways; the external can be configured to send only what the user really needs (matrices in different formats, joints with all sorts of syntaxes & coordinate types, etc.).
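to illustrate the dict idea, a purely hypothetical per-frame layout (structure and key names are mine, not from any of the externals), written here as a JS object:

var frame = {
    hands: [{
        id: 1,
        side: "left",
        palmposition: [0.0, 180.0, 30.0],
        palmnormal:   [0.0, -1.0, 0.0],
        direction:    [0.0, 0.0, -1.0],
        fingers: [{
            type: "index",
            tipposition: [40.0, 190.0, -20.0],
            direction:   [0.1, 0.0, -0.99]
        }]
    }]
};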
the ircam leapmotion external just outputs lists, and seems to work great. it's just missing quaternion info for each finger joint. not sure if packing into dictionaries will help or hinder.
i've attached another patch to demonstrate using leap data with jit.phys. probably nothing new to you experts, but hopefully helpful to some.
Mathieu, this also addresses your 3d point picking feature request by using a jit.phys.ghost to do the picking. this is really the best way i can think of to do this, so it would be good to know if there's something about this technique that doesn't work for you.
the patch simply creates a phys.body around the left hand (which could obviously be made more accurate using Phiol's technique above). it also has a basic detection for a "pinch" gesture (simply measuring thumb and index tip proximity), and when connected replaces the phys.body with a phys.ghost. when a collision is detected, a simple phys.point2point constraint is created for the picking action.
for added sauce, it triggers a little pinching animation using the anim.nodes from the hand mesh.
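the pinch test itself is just a distance check; a minimal JS sketch (the threshold value is arbitrary and depends on your coordinate scaling):

function dist(a, b) {
    var dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
    return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

var PINCH_ON = 25;  // hypothetical threshold, in whatever units your patch uses

function isPinching(thumbTip, indexTip) {
    return dist(thumbTip, indexTip) < PINCH_ON;
}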
Thanks Rob for this new patch !
using jit.phys.ghost to test the position of fingers / collisions with bodies: that's also what I tried to make. But I encountered some limitations that finally made me think that a 'test3D' function / message in jit.world would be helpful:
- the position of the contact point is returned in world coordinates
in our example, in order to set the position1 & position2 of a jit.phys.point2point (to grab the body by its exact colliding point),
we'll then need to perform a world-to-local coordinate conversion ...
I can't see any easy way to do this on an arbitrary body without extra complicated patching around jit.phys.body...
or how to do this when working with jit.phys.multiple? ...
- jit.phys.ghost reports collisions only with the surface of a body
(if the ghost is inside a body, the position of the collision will be outside of the colliding body)
sometimes it's useful to grab / attach a body from a point that is not on its surface...
I would like to have an easy way to test arbitrary 3d points and get name of colliding objects, with world + local coordinate.
here is the feature request I sent off list :
I would like to submit a feature request about jit.phys.picker :
I need to pick objects (bodies) in a 3D space, not using a mouse or a touchscreen (2D), but using 3D controllers (VR).
the current 'touch' message works in screen coordinates only, so it's not usable in 3D.
I think we're missing a solution to dynamically grab bodies in 3D, and I can imagine 2 different approaches :
- adding a 'touch3d' message / function to jit.phys.picker : it would work just like the current touch message... but in 3D, and in world coordinates.
a more general solution :
- adding a 'test' (or 'testpoint' or .. ) message / function to jit.phys.world :
it would work just like raytest, but for a point instead of a ray.
it would return colliding body with world and local coordinates.
(raytest cannot be used to test a point because it doesn't work (returns nothing) if the ray length == 0 or if the ray starts inside a body)
It would then be easy to use the result of this test to set a jit.phys.point2point to grab the desired body.
here are some discussions on this topic (but they don't answer my needs)
https://cycling74.com/forums/jit-phys-picker-touch
https://cycling74.com/forums/how-to-simulate-jit-phys-picker-without-using-mouse
Thanks !
@graham: As you can see in my patch I'm parsing the joints using zl.lookup and it works quite well. In my first version, I got it to work simply using a bunch of jit.gl.gridshape; then I made a jit.gl.multiple version. Surprisingly, the performance results were not much better, contrary to what I would have expected.
Also, I thought using a jit.gl.material would up the performance as well, but nope.
Anyways, I was planning on doing a V2 using jit.gl.path as well.
I'll post it here when I get time to do it.
Thanks Rob for the phys patch, really nice!!
Phiol, maybe it's my computer; I get an fps of about 12 with your patch, with drops under 10 fps. My MacBook Retina 2.6 (mid 2012, first generation) is getting old, I guess. Especially my graphics card (NVIDIA GeForce GT 650M 1024 MB) is falling behind, not even able to send 4K resolution to a display. Rob's patch is running at 55 fps btw.
Graham, I prefer list output like the leapmotion external does now, but maybe I have to use dicts more to get used to them. Different modes and a dedicated outlet for orientation like Mathieu suggested sound ideal!
Hi
here is a patch using jit.gl.path to draw hands & fingers.
Playing with gl.path attributes and materials can produce some cool looking (non-realistic!) hands :)
Wow, really nice Mathieu!
Indeed very funny to play with!!
hey guys, i've made a little tool called jit.phys.3dpicker that should address the feature request Mathieu posted about above. it is also a good demonstration of how to use javascript and jit.anim.node objects to perform complex spatial transform operations necessary in a lot of 3D and AR work.
the abstraction uses phys.ghost for collision detection, and allows you to attach to an external anim.node object (e.g. the finger bone of a hand-model) for convenience. it transforms the world-space collision point into local space for both the colliding body and the phys.ghost. it also creates and manages a phys.point2point constraint, to mimic the behavior of phys.picker.
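for the curious, the core world-to-local step looks roughly like this in JS (a sketch, assuming the body's pose is a position 3-vector plus an orientation quat in [x, y, z, w] order; helper names are mine, not from the package):

// rotate vector v by unit quaternion q = [x, y, z, w]
function quatRotate(q, v) {
    var x = q[0], y = q[1], z = q[2], w = q[3];
    // t = 2 * cross(q.xyz, v)
    var tx = 2 * (y * v[2] - z * v[1]);
    var ty = 2 * (z * v[0] - x * v[2]);
    var tz = 2 * (x * v[1] - y * v[0]);
    // v' = v + w*t + cross(q.xyz, t)
    return [v[0] + w * tx + (y * tz - z * ty),
            v[1] + w * ty + (z * tx - x * tz),
            v[2] + w * tz + (x * ty - y * tx)];
}

function quatConjugate(q) { return [-q[0], -q[1], -q[2], q[3]]; }

// world-space point -> body-local point, given the body's position & rotation
function worldToLocal(point, bodyPos, bodyQuat) {
    var rel = [point[0] - bodyPos[0], point[1] - bodyPos[1], point[2] - bodyPos[2]];
    return quatRotate(quatConjugate(bodyQuat), rel);
}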
simply download and drag into your packages folder, and create a new [jit.phys.3dpicker] object in your patch. you can provide a @drawto argument and it will automatically update. otherwise you must manually bang it to update (middle inlet). left inlet is simply a toggle to enable and disable its functionality. there's a dynamics message (similar to phys.picker) that toggles the constraint dynamics functionality.
download here:
https://github.com/robtherich/jit.phys.3dpicker/releases
there's a help file that should explain it pretty well, but let me know if anything is unclear, broken, or doesn't work like you'd expect.
Rob : thank you very much !
That seems to be exactly what I was looking for!
I'm away from my computer for a couple of days... but I'm really looking forward to trying it!
All the best
Really nice Rob! I can now play my own claw crane machine without inserting coins :-)
This feature will be so cool if the fingers of the 3D hand model are also responsive, so you can really grab things virtually in space!!
Any updates on your leapmotion external, Graham? I emailed Jules Françoise (maker of the leapmotion object) a while ago, but unfortunately no response.
Just some quick little things:
If I place the folder in my Packages folder, the jit.phys.3dpicker object isn't recognized in the maxhelp file; if I place it in my library folder it works fine. With other packages the help files work fine.
And another small thing: when I load the 3D-Picking-example.maxpat I get these two messages in the max window:
jit.gl.model: could not create texture
jit.gl.model: can't find texture /Users/Stephan/Desktop/Freelance/Leap Motion/Hands/Rig/High Quality Base/Maps/leapmotion_basehand_diffuse.jpg
it works flawlessly, thanks again Rob!
and it's a wonderfully great example of Jitter in JavaScript; I'm gonna study it carefully!
there's a small typo: the sub-folder 'patches' should be renamed 'patchers' in order to make it work as a package...
(otherwise, the whole folder jit.phys.3dpicker-1.0 must be added to the Max search path..)
and yes, we're missing the texture :
jit.gl.model: can't find texture /Users/Stephan/Desktop/Freelance/Leap Motion/Hands/Rig/High Quality Base/Maps/leapmotion_basehand_diffuse.jpg
great to hear guys.
re: missing texture - yeah that's just the model file distributed from leap. i'll take a look and see if there's a texture in their package that i can include.
re: leapmotion external - i'm looking at the feasibility of a simple update to support the quaternions. if so, i'll rebuild and post here, and send a pull request to Jules. no promises at this time.
re: can't find abstraction - this is probably due to the typo Mathieu noticed. i'll get that fixed and updated.
thanks much for the feedback!
would be great Rob if you can make it!!
Thanks a lot for all this advice and the examples.
I'm also using Jules Françoise's external with great results, but as an M4L device to interact with Ableton. With a condition system it's a great controller with big potential for expressivity.
Yet I'm more a musician than a programmer; if any of you could check my patch I would be really thankful for any advice on how to improve latency, glitches and other issues you might see.
I will dig into all your submitted patches as well, since I'm sure I can find useful tips in there already.
Thanks again! http://www.maxforlive.com/library/device/3611/leap-motion-mapper-alpha
hey guys, i've got an update to my 3dpicker package that shows how to animate fingers in a model file using the leapmotion finger bone data:
https://github.com/robtherich/jit.phys.3dpicker/releases/tag/v1.0.1
basically how it works is in a javascript, we read in our model file and read in some mapping values that determine how leapmotion is mapped to the nodes of the model file. we create two object maps, one for the model file structure, and one for the leapmotion hand structure. leapmotion data is sent in and direction data for each bone is stored in the appropriate object. then each draw frame we create rotation quaternions from the direction data, and apply those to the appropriate node in the model structure, based on the mapping values we've sent in.
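a simplified sketch of that per-bone step, turning a direction into a rotation quaternion relative to the bone's rest direction (all vectors assumed unit-length 3-element arrays; quat order [x, y, z, w] as in jit.quat; the degenerate opposite-vectors case is omitted, and the rest-pose value is just illustrative):

function cross(a, b) {
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]];
}
function dot(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
function normalize(v) {
    var l = Math.sqrt(dot(v, v));
    return [v[0] / l, v[1] / l, v[2] / l];
}

// quaternion rotating unit vector 'from' onto unit vector 'to'
function quatFromTo(from, to) {
    var axis = normalize(cross(from, to));
    var angle = Math.acos(Math.max(-1, Math.min(1, dot(from, to))));
    var s = Math.sin(angle / 2);
    return [axis[0] * s, axis[1] * s, axis[2] * s, Math.cos(angle / 2)];
}

// e.g. rotate a bone's rest direction onto the measured leap direction:
var restDir = [0, 0, -1];                     // hypothetical rest pose of the bone
var leapDir = normalize([0.1, -0.2, -0.97]);  // direction from the leap data
var boneQuat = quatFromTo(restDir, leapDir);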
check out the Leap-Fingers patch to see it in action. there's lots more that can be done, and in fact the right-hand thumb doesn't work as expected. you could fix this by applying an offset quaternion to that particular mapping. the JS shows how to multiply quaternions using an instance of the jit.quat object.
feel free to ask any questions here regarding its usage and implementation.
Wow great Rob!
Thanks a lot! Although I can't test it properly now without a leapmotion; it's the first thing I will do tomorrow morning (it's already late in the evening here).
I've already got a question: how do I add a video texture over the hand? Unfortunately I'm not a JS programmer, and since the hand.dae file is loaded in the js object, I don't know how to add the texture.
Thanks again!
in your patch:
[ jit.gl.texture @name mytex ]
in the JS:
mymodel.texture = "mytex"; // the texture attribute takes the jit.gl.texture's name
Hi
I think I've found a bug in jit.phys.3dpicker when used with jit.phys.multiple :
if the grabbed body has an instance number > 9, a yellow error is printed in the Max window:
: bad number
and the object is grabbed with a wrong position (generally with a big offset)
I guess it has something to do with the regexp inside the js (the way the object's name is parsed):
it probably truncates the instance number, or tries to use a value that is not a valid integer as the instance number...
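if that guess is right, the difference would be something like this (purely a guess at the pattern; I haven't verified against the actual js, and the name format is hypothetical):

// hypothetical: parsing an instance name like "mybodies.12"
var name = "mybodies.12";
var bad  = name.match(/(\d)$/);   // grabs only "2" -> wrong instance
var good = name.match(/(\d+)$/);  // grabs the whole "12"
var instance = good ? parseInt(good[1], 10) : -1;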
sorry, I couldn't fix it myself, I suck at js as much as I do at math :s
but here is a patch showing the problem :
hey mathieu, thanks for catching.
bug is fixed, download latest release here: https://github.com/robtherich/jit.phys.3dpicker/releases/latest
Hey Leap Motion lovers
looks like since I updated to the latest Leap Orion drivers, 3.2.0 (requires a manual update, since auto-update is broken in the previous driver version), the IRCAM / Jules Françoise external works very badly:
it still works, but the experience is very poor: left and right hands are inverted very often, the elbow position is wrong, fingers sometimes take very unnatural poses ...
so I guess that this external needs some update to work well with the latest Orion 3.2.x
I tried to put the latest version of leap.dll (found in the leap SDK) in my max support folder (as I did with the previous version), but the external refuses to work with dll versions > 2.2.6
(screenshot of error message attached)
so: if you want to keep a fun & smooth experience with your leap motion in Max, don't follow Leap's recommendation, and don't update to the latest Orion version :-s
...or let's find a way to fix this!
Would simply rebuilding the external with the latest SDK resolve the problem?
https://github.com/JulesFrancoise/leapmotion-for-max
I can't build it myself here...
Could someone try to do it ?
thanks
Mathieu
Hey Guys,
Any ideas why my fingers look broken? They get bent in odd directions. Any ideas on how to solve this?
Thank you!
I am using leap motion v.3.2.0. Is this the reason why I have this problem? The hand sketch that comes with the leapmotion external from IRCAM looks decent.
Later Edit: I tried v.2.3.1 and I get the same problem. Fingers point in other directions.
Hi Mathieu, did you find a way to revert to a driver before 3.2.0?
Perhaps Rob can update the leapmotion object for max...
Thanks
I just uninstalled leap drivers, and installed 3.1.2.
(if it's not available any longer online, I can send it to you)
But it's a poor workaround...
I agree we all need an up to date max external !
PLEASE C74 Team, PLEASE make max externals for all hardware-related stuff. It's ridiculous that you take credit for being interactive software (with artist project presentations and interviews), but have no or very, very few vanilla built-in hardware externals.
Hats off to ben bracken for looking into the VR package for a while. But the rest (leap, kinect, zed, etc..) is left to 3rd parties. Never understood that from you guys.
Other small companies (like Derivative) support these.
Yes Mathieu, I will be very interested for the driver 3.1.2.
info at patricecros.com
I tried to reinstall the Orion Installer 3.1.2 but it does not work.
Can you quickly describe the way you uninstalled and reinstalled? I suppose it's simple..
Perhaps because I uninstalled 3.2, then installed 3.1, and only later plugged in the leap.
I should have plugged in the leap first...
Yes PHIOL.
I never understood this Cycling policy of being dependent on 3rd party externals for such important matters.
I'm seriously thinking of moving to TouchDesigner for that..
I've been doing Touch for 3 years now, and like all software it has its good and bad.
But for stuff like this, man, they support everything that comes out.
Because of it, paid gigs have become much, much better paying, directly because of the amount of time you save.
The huge downside for TouchDesigner is "state machines": do this if this is true N amount of times, etc.... Touch is weak for that. Max totally wins on that side.
phiol
Yeap! I am trying to learn TD as well, as I am very disappointed with the Max team for not looking after their video/visual community. And this after about 10 years of Max.
What do you mean by State Machines?
when you do interactive installations or games or etc..
you always have the user go into states
take a photobooth installation for example :
state 1 : screensaver visual
state 2 : user steps into the bounding box and the visuals change (parts are blinking etc)
if the user points for 3 sec, then go to state 3; or if he stares for 3 sec (no pointing), go to state 4.
if he walks out go to state 1
I think you get what I mean.
a way to do this in Touch is with multiple true flags using logical AND & logical OR with the logic operator.
The problem is that Touch is pull-based software and Max is push-based.
Pull is great for visual stuff, but push-style conditions tend to slip through the cracks, so conditions don't update and the state doesn't change.
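in Max, a [js] object keeps this kind of thing robust; a minimal state-machine sketch using the photobooth states above (the event names are purely illustrative):

outlets = 1;

var state = 1;  // 1: screensaver
var transitions = {
    1: { userEnters: 2 },
    2: { points3sec: 3, stares3sec: 4, userLeaves: 1 },
    3: { userLeaves: 1 },
    4: { userLeaves: 1 }
};

// call with an event name, e.g. the message "event userEnters"
function event(name) {
    var next = transitions[state] && transitions[state][name];
    if (next !== undefined) {
        state = next;
        outlet(0, state);  // report the new state to the patch
    }
}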
here is the Orion 3.1.2 installer:
https://www.dropbox.com/s/s5t4xs4hyrn59ko/Leap_Motion_Installer_fix-libsigning_public_win_x86_3.1.2%2B40841_ah1889.exe.zip?dl=0
Thanks, I'm gonna try it.
I knew there was something wrong with the non libsigning version I had
Hey Phiol and Mathieu,
Do you think we should make an official thread for CLAIMing official Cycling objects for sensors?
How would you name it?
forum :
[CLAIM] Official Cycling Objects for SENSORS. Leap Kinect Wacom etc...
We think Cycling should provide and maintain its own set of official objects for common sensors.
Relying on 3rd party externals is unreliable and has been a waste of time for many of us.
It has hindered the use of Max in commercial installations and pushed people to other software that DOES provide official objects, like TouchDesigner.
We are sure many pro people interested in Max just never went into it because they saw there were no objects for their sensor.
Please simply post your sensor name and OS.
Possibly also mention the existence of official objects in other software.
Thanks
@spa i think it's not a bad idea ! though the open sourceness of those things seems kinda cool, in practice an out of the box solution would be cool, and that's the kind of thing they could maintain on github anyway, as they already do with some other objects. Though a [feature request] rather than claim seems more appropriate ;)
@vichug it would not take away the open sourceness at all. Keep both.
But at least having proper working vanilla objects is a must. Max is my bread and butter tool (as well as others), but mostly Max.
I agree with the feature request terminology, but on the other hand it's more than a feature.
Not just an extra add-on. It's a must.
A feature request would be: when you delete an object that is connected in the middle of two other objects (i.e. object a, object b, object c, and you delete object b), well, object a & object c should automatically be connected. All other nodal software I've worked with has that.
phiol
Great ideas, guys! And thank you, Phiol, for the explanation.
Let's do a [REQUIRED]. ;)) As it is a must, like I said previously. I feel the C74 team left us visual designers for dead.
I kind of got the message when Ableton acquired C74. The message being that they will concentrate on taking care of the audio part of Max and silly video engines for DJs. :|
ok posted
Go VOTE
and propose every day a new sensor
to keep the thread alive...
https://cycling74.com/forums/-required-official-cycling-objects-for-sensors-leap-kinect-wacom-etc
Here is a patch using Mathieu Chamagne's draw-leapMotion-hands-with-gl.path patch with the HTC Vive, using the htcvive external from the worldmaking package.
looks great!
Slightly off topic, but I feel like you guys could point me in the right direction.
How can I go about "painting" in this jit.world space? I would like to create a line that comes out of the tip of the pointer finger. I just want to draw a line... like jit.lcd, but I'm unsure how to do this in jit.world.
I'm guessing jit.gl.path would be the object to use? Just don't know how to make the path follow the finger.
jit.gl.path is a great option, especially if you want to use the tube or ribbon drawing modes. I'm pretty sure you already have the finger position data since you're drawing hands with it. simply use one of the fingers as the draw source (possibly adding some offset), and grab position data from the stream at some interval (or controlled by the user) to add to the path (append message).
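a minimal js sketch of that idea (the context name is hypothetical; append is the message named in the jit.gl.path reference, and this assumes jit.gl.path can be instantiated from js like other OB3D objects):

// create a path in your render context and grow it point by point
var path = new JitterObject("jit.gl.path");
path.drawto = "myworld";  // hypothetical jit.world context name

// call this with the fingertip position each time you want to extend the line
function addPoint(x, y, z) {
    path.append(x, y, z);
}

then drive addPoint from a metro (or a user action) sampling the fingertip position at whatever interval feels right.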
Yeah man! Works great. This is so cool. Thanks!