Are Max For Live's JavaScript functions, which run in the low-priority thread, latency compensated?
According to the documentation of Ableton’s Max For Live LiveAPI Object, it is no longer possible to configure JavaScript functions to run in the high-priority thread of Max’s scheduler. So, Max For Live JavaScript functions run in the low-priority thread. However, I could not find any information on whether these functions are latency compensated or not.
Are Max For Live's JavaScript functions, which run in the low-priority thread, latency compensated?
For example, say I create a Max For Live device, which uses JavaScript functions, and the MIDI processing inside those functions takes 50ms. Assuming Live's latency compensation is enabled, will the 50ms of latency my Max For Live device induces be correctly compensated by Live?
My understanding is no - the low-priority thread is the UI or "main" thread, so it has none of the audio-timing plumbing that latency compensation relies on. If you want to run code in Live in the high-priority thread, you can use my Scheme for Max project, which can run in either the high or low thread (you must pick one).
However, it's also worth noting that any Live API calls should come from a low-thread object anyway, as they will get deferred to the low thread, so using JS for Live API calls is fine. This is the same as the Live API for Python control surface scripts btw - all control surface actions are running in a lower priority thread.
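For concreteness, here is a minimal sketch of what such a call looks like from the js object - the path and function name are just illustrative, not anything special:

// Inside a Max for Live [js] object. Max runs js in the low-priority
// (main) thread, which is where Live API calls get deferred to anyway.
function fire_clip(track, slot) {
    var clip = new LiveAPI("live_set tracks " + track
                           + " clip_slots " + slot + " clip");
    clip.call("fire"); // launch the clip via the Live Object Model
}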
This is the same as the Live API for Python control surface scripts btw - all control surface actions are running in a lower priority thread.
Ah! You are talking about the Python MIDI remote scripts, right?
If that is the case, and since I already have my project implemented with Python + MIDI remote scripts and I am happy with its performance, then I should also be fine with the performance I would get if I implement it in Max For Live!
I really appreciate your reply! That is very helpful, and I think I have given myself a green light to invest the time to attempt to implement my project in Max For Live. FWIW, I am interested in porting my project to Max For Live for two reasons: it seems like it can probably run on the new standalone Push (apparently user remote scripts cannot), and if I decide to try to monetize it down the road, that will be easier with Max For Live, since there is a lot more support for selling Max For Live devices than for MIDI remote scripts.
I have seen your Scheme for Max project, and I am going to take a closer look at it.
Thanks again.
I'm actually talking about the Python SDK that is used for commercial control surfaces, but I would imagine that is the same or very similar under the hood. (I made Python scripts in Live years ago when you could only do this with the reverse-engineered tools floating around the web!) I am also not sure whether the kind of latency and jitter you will get running JS in an M4L device is going to be exactly the same, but I would expect it would be similar.
I can say, having been working on this sort of thing a lot over the last couple of years, that you have a lot of options in M4L, so I think it will be well worth the time investment.

One of the tricks I have used when I need something to happen sooner is to have Live tracks using MIDI loopback devices, so that Max devices can listen to the plugsync~ output (which runs in the audio thread) and can update parts of Live through MIDI remoting. This is sometimes a better solution in my work than the Live API calls. I use both approaches a lot. In my own Live setup, I use both low and high priority S4M objects and have them message each other to ask for function calls - something that is very easy in Lisps, as you can just send a list of symbols and have the recipient execute it with eval and apply.
Good luck!
It's also worth pointing out that latency, performance, and jitter are all different. So for example, if you trigger a live API call from a clock running in your high thread, there may be some jitter depending on how long it is until the next pass of the low thread. But as this does not accumulate, it often doesn't matter. Real acoustic musicians have huge jitter, but we don't notice so much because it doesn't get progressively worse.
I'm actually talking about the Python SDK that is used for commercial control surfaces
Yep. That's the one I used.
I am also not sure whether the kind of latency and jitter you will get running JS in an M4L device is going to be exactly the same, but I would expect it would be similar.
It is helpful that you confirmed that the Live API calls via the Python SDK also run in the low-priority thread. So, I think I should expect similar performance. It is at least worth doing a POC.
I can say, having been working on this sort of thing a lot over the last couple of years, that you have a lot of options in M4L, so I think it will be well worth the time investment.
For general programming, I find the M4L patcher thing very off-putting. In any case, my project requires a lot of dynamic programming. That is, I have listeners which register new devices that are affected by my code, and as far as I can tell, that is either difficult or impossible with only the M4L patcher objects. A rough sketch of the kind of listener I mean follows below.
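This is a hedged sketch in js, observing a track's device list so newly added devices can be registered - the path, names, and handler are illustrative only, and the observer has to be created after the device has fully loaded (e.g. triggered by live.thisdevice), since LiveAPI cannot be used in global code at load time:

var observer = null;

// Watch a track's device list so new devices can be registered.
function watch_devices(track) {
    observer = new LiveAPI(on_devices_changed, "live_set tracks " + track);
    observer.property = "devices"; // fire the callback when the list changes
}

function on_devices_changed(args) {
    // args contain the observed property name followed by the
    // current list of device ids
    post("devices changed:", args, "\n");
}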
One of the tricks I have used when I need something to happen sooner is to have Live tracks using midi loopback devices so that Max devices can listen to the plugsync~ output (which runs in the audio thread) and can update parts of live through MIDI remoting.
Interesting. I will keep that in mind. I have done some dastardly things with MIDI loopback and remote scripts. ;) I was hoping I could get away from the need to use MIDI loopback with M4L.
This is sometimes a better solution in my work than the Live API calls.
I can see how it could be, but again, in my case, I think the dynamic nature of my project requires some type of dynamic programming, which, AFAICT, requires API calls.
It's also worth pointing out that latency, performance, and jitter are all different.
Yep yep. I am well versed in all of that stuff. :)
Good luck!
Thanks!
Hi there, if you are looking for dynamic programming capabilities, you might want to try my Scheme for Max project. I use it to run dynamic code in both the low and high priority threads in large Live projects and it works with great timing in the high thread, and is a really pleasant way to do the Live API in the low thread. There's no more dynamic language than Scheme! I made a video of using it within the Live context here.
https://www.youtube.com/watch?v=j0sKBA-Pv2c&t=3s&ab_channel=MusicwithLisp
I need to make a video on using the Live API in Scheme, but it's awesome, because the Live API *is* based on lists of symbols and nothing beats Lisps for manipulating lists of symbols!
Live API calls look like this:
(define (fire-clip track slot)
  (post "(fire-clip)" track slot)
  (live-api 'send-path (list 'live_set 'tracks track 'clip_slots slot 'clip)
            '(call fire)))

(define (stop-clip track slot)
  ; as above, but using back-tick lisp syntax
  (live-api 'send-path `(live_set tracks ,track clip_slots ,slot clip)
            '(call stop)))
Hi Iain,
I appreciate your reply. I just completed a POC of an M4L device which uses ONLY JavaScript to dynamically create live.remote~ objects and interact with the Live API through them. It was a struggle to figure it all out, but it seems to work perfectly. There is no latency and no overflows, since I am passing all of the messages that control Live's parameters through live.remote~, and live.remote~ seems to use the high-priority thread. Now that I have it working, I am shocked at how easy it is, and I am surprised such an example is not more common.
Since I have that working, I need not bother with any third party components, which I would be reluctant to use anyway.
Once I have some time, I will try to post about my solution.
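In the meantime, the core of it looks roughly like this - a stripped-down sketch where the coordinates, paths, and outlet assignments are illustrative, not exactly what my device does:

// Inside the device's [js] object. Outlet 0 carries values,
// outlet 1 carries target ids for live.remote~.
outlets = 2;

var remote = null;

function make_remote(track, device, param) {
    // Script a live.remote~ into the patcher at runtime and wire
    // this js object's outlets into it.
    remote = this.patcher.newdefault(100, 100, "live.remote~");
    this.patcher.connect(this.box, 1, remote, 1); // id -> right inlet
    this.patcher.connect(this.box, 0, remote, 0); // value -> left inlet

    // Resolve the parameter to control and hand its id to live.remote~.
    var p = new LiveAPI("live_set tracks " + track + " devices "
                        + device + " parameters " + param);
    outlet(1, "id", p.id);
}

function value(v) {
    // live.remote~ steers the target parameter in real time,
    // without creating undo history.
    outlet(0, v);
}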
Thanks again.
Glad to help!