Forums > Max For Live

live.remote latency

March 28, 2013 | 10:13 pm

Hi,

I've noticed quite a lot of latency using live.remote, compared to automating Live's own audio effects.

For instance:

If I automate the Rate on Auto Pan, drawing in automation that switches between 1/4 and 1/8, it sounds great and really in time.

However, if I connect a live.dial to a live.remote and map it to the Rate parameter on the Auto Pan, then draw the same automation, it sounds completely different. Quite laggy, in fact. So much so that I can actually see the latency if I compare them side by side on the same track.

Not only that, but there is a distinct skipping sound when using live.remote at this fast rate, as if it is hitting points between the two automated rates, whereas the Live automation is very clean.

My questions are: is anything being done to help Max reach the same speed as Live? Is this a known problem, and are there any foreseeable workarounds?

Cheers

N


March 29, 2013 | 1:19 am

The latency of live.remote~ when used with an audio signal for control is a fixed number of samples; I forget exactly what it is, something like one buffer + 64.

The latency of live.remote when used with Max messages is the same as the latency with live.object, that is, subject to the activity in Live's UI thread. If you can show otherwise in either of these cases, please show us the code and we'll check it out.
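
For a rough sense of scale, that fixed signal latency converts to milliseconds as below. This is a sketch only: the 512-sample buffer is an assumption, and the "+ 64" constant is from memory, not a confirmed figure.

```python
# Hypothetical figures: the buffer size and the "+64" constant are
# assumptions for illustration, not confirmed values from Ableton.
def remote_latency_ms(buffer_size, sample_rate, extra_samples=64):
    """Convert a fixed sample latency (one buffer + extra) to milliseconds."""
    return (buffer_size + extra_samples) / sample_rate * 1000.0

print(round(remote_latency_ms(512, 44100), 2))  # -> 13.06
```

So with those assumed numbers you'd be looking at roughly 13 ms, which is constant and therefore not the source of event-to-event jitter.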

Cheers

-A


March 29, 2013 | 2:58 am

Hi Andrew,

Thanks for explaining, although I think my problem might be slightly different as it is not using an audio signal. I have created the following patch:

– Pasted Max Patch –

This controls the device to its right (works on Auto Pan, Auto Filter, etc.), targeting the sync Rate parameter.

Load a synth (say, Operator) with a plain square tone.

Then load a stock Live Auto Pan and draw in some automation in 16ths, switching from 1/4 to 1/16 every 16th.

Then, next to it, load my device and another Auto Pan (whose Rate parameter is controlled by the device), and again draw exactly the same automation (note that my device removes the triplets).

If you cycle between the two (switching from the Live version to the device-controlled version), you can hear that they are incredibly different. You can see the difference too: there is a noticeable lag on the live.remote-controlled version.

When the live.remote is controlling the parameter, it is much slower and also not completely consistent. There is a distinct skipping sound that I can't quite put my finger on, a bit like Live choking when it's out of memory. When the live.dial moves, it seems to stop halfway and then carry on to its destination.

Whereas when Live is controlling its own device (native to Live), it is very clean-sounding and consistent.

I would really be grateful to find out if we can get close to Live's own automation speed. I do a lot of parameter work, and it's the one thing that gets me every time I've spent a while building a patch.

Cheers for looking into this.


March 29, 2013 | 8:39 am

Hi there,

I assume your problem results from the fact that automation data is not latency-compensated, and a Max device introduces an audio buffer of latency.

Best,
Christian


March 29, 2013 | 2:00 pm

As I understand it, calls to the Live API are asynchronous by design, and thus basically unsuited for time-critical synchronization.
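
A toy simulation makes the difference concrete: a signal connection adds a constant delay, while queued asynchronous messages add a variable one. This is purely illustrative; the delay figures are made up and have nothing to do with Live's actual internals.

```python
import random

random.seed(1)  # reproducible illustration

FIXED_DELAY = 576  # assumed: one 512-sample buffer + 64 samples

def signal_arrival(event_sample):
    # Signal-rate control: always late by the same fixed amount.
    return event_sample + FIXED_DELAY

def message_arrival(event_sample):
    # Message-rate control: deferred by a variable amount, standing in
    # for whatever the low-priority/UI thread happens to be doing.
    return event_sample + random.randint(0, 1024)

events = [0, 5512, 11025, 16537]  # evenly spaced control events (samples)
signal_lag = [signal_arrival(e) - e for e in events]
message_lag = [message_arrival(e) - e for e in events]
print(signal_lag)   # constant: [576, 576, 576, 576]
print(message_lag)  # varies from event to event
```

The constant case just shifts everything uniformly, which could in principle be compensated for; the variable case smears events by a different amount each time, which matches the "skipping" described above.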

BTW, is latency of M4L devices mentioned somewhere in the documentation?


March 31, 2013 | 12:26 am

I made a stupid mistake in my patch and set the "steps" in the inspector way higher than they should have been. The latency is still there, but it's not so bad now. Cheers, N

