[live.remote~] values sent as messages or as an audio signal?

Charles Turner's icon

Hi all-

Read this in the [live.remote~] reference page:

"Integer or float values are send into left inlet of live.remote~, as messages or as an audio signal. The values are applied sample-precise (if sent by the audio thread of Max) with a constant latency of a single audio buffer."
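(For scale, that "constant latency of a single audio buffer" is easy to quantify. A quick back-of-envelope sketch; the vector sizes and sample rate below are just example settings one might have in DSP Status, not anything the reference page specifies:)

```javascript
// One audio buffer of latency = signal vector size / sample rate.
// Example settings only -- substitute your own DSP Status values.
function bufferLatencyMs(vectorSize, sampleRate) {
  return (vectorSize / sampleRate) * 1000;
}

console.log(bufferLatencyMs(64, 44100).toFixed(2) + " ms");  // ~1.45 ms
console.log(bufferLatencyMs(512, 44100).toFixed(2) + " ms"); // ~11.61 ms
```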

Is it possible to get some authoritative clarification/expansion? What controls whether ints and floats are sent as messages? (as opposed to an audio signal?)

Does the quote mean that:

1) Ints and floats whose values are changing faster than the period of the audio vector are sent at audio rate?
2) Ints and floats whose values are changing slower than audio rate are sent as messages?
3) None of the above

I haven't gone back to read all the material about high-priority threads, but I seem to remember that high-priority scheduling can be placed inside the audio interrupt. Is this the default for M4L, or something unique to [live.remote~]?

Or is the quote saying that, depending on whether SIAI (Scheduler in Audio Interrupt) is set in DSP Status, [live.remote~] will send its messages in that fashion?

Thanks, Charles

Andrew Pask's icon

What it means is you can send ints or floats via messages, using the standard message patch cables, or you can send values as an audio stream on MSP cables, like the output of the sig~ object, or any other MSP object.

-A

Charles Turner's icon

Thanks Andrew, so I overlooked the bit: "are send [sic] into [the] left inlet."

But this piece still has me confused: "The values are applied sample-precise (if sent by the audio thread of Max)." Presuming that [live.remote~]'s output is an audio signal, its output will always be sample-precise, correct?

So is this just awkward prose attempting to remind folks that messages are not sample-accurate?

If so, "input is an audio signal" might be more clear than "sent by the audio thread of Max."

Best wishes, Charles

broc's icon

In describing the message input, the documentation mentions that [live.remote~] is not applied in real time if the input is not an audio signal. So I guess in that case the timing is basically the same as with set operations on [live.object]. But overall [live.remote~] is obviously more efficient, as undo and automation are disabled.

Charles Turner's icon

Ah broc, thanks for reminding me of that passage...

"A floating point number value received in the left inlet will be applied to the selected Live DeviceParameter,
if any. Obviously not in realtime."

So this evokes a picture of an object that can be either a Max "control-rate" object or an MSP audio-rate object, depending on what's connected to its input. That was my takeaway from reading the ref page, but I wasn't sure. Andrew's reply then reads as a clarification, which I didn't take it to be at first. (Sorry, Andrew!)

But then, I presume that [live.remote~] with an int/float patch cable connected runs in the high-priority thread, while [live.object] likely doesn't, running instead in the low-priority thread? (So it can deal with undo history, etc.)

Best, Charles

broc's icon

Your assessment makes sense to me. But implementation details regarding priority threads are probably under NDA.

Andrew Pask's icon

Think of sending ints and floats to live.remote~ via messages as the same thing as sending them to live.object, except they bypass Live's undo history. This object was created in this way to solve what we called the "LFO problem", so that automating parameters would not flood the undo history and make lots of people sad.
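To put rough numbers on why that mattered (purely illustrative -- the 10 ms update interval below is an assumption for the sake of the arithmetic, not Live's actual behavior):

```javascript
// If every parameter update became an undo entry, an LFO driving a
// parameter would flood the history. Hypothetical numbers, for scale:
function undoEntriesPerMinute(updateIntervalMs) {
  return (60 * 1000) / updateIntervalMs;
}

// e.g. one update every 10 ms -> 6000 would-be undo entries per minute
console.log(undoEntriesPerMinute(10)); // 6000
```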

-A

Charles Turner's icon

Andrew-

Thanks again for your contribution, and my understanding is now crystal clear. But are you asserting that, aside from undo history entries, [live.object] and [live.remote~] handle messages identically?

That would sure run counter to the impressions that average users have of these objects, such as:

"So you try with live.object instead which does write to the automation lanes and it works ok for one or two at once but the more live.objects you have running the more messed up it gets. Alas live.objects's communication with the API is asynchronous and the bandwidth available to the API appears very limited. It's not meant for this application obviously, it's for one-shot get and set tasks."

broc's suggestion of NDA silence is painful if it's preventing brief, concise descriptions of threading priority for M4L automation, etc., such as what Max users got in version 4.6 on pages 56-60 of "Writing Max Externals in Java" and pages 43-45 of "Javascript in Max." Those descriptions have been very educational for me, and would short-circuit a lot of blind testing and speculation if they could grow to encompass Max for Live.

Best, Charles

Andrew Pask's icon

You're sending messages from the scheduler (main) thread in both cases, so the problems are the same, minus perhaps a little bit of overhead on the Live side for the lack of undo.

Live.remote~'s audio mode was designed to get around these scheduler bottlenecks; the ability to send messages to it is a convenience rather than a "super scheduler mode".

If you like, you could try it with 100 instances of each and let us know what you find.

-A

broc's icon

Thanks Andrew for the explanations.

I think it would be helpful to add a bit to the documentation, something like
"live.remote~ can run in two modes, depending on the input..."

In the overview there is a plain statement
"live.remote~ control Live device parameters in real time"
which may lead to confusion.

Personally I would prefer to have two different objects, i.e. with and without the "~".
But there may be good reasons for the dual mode approach too.

Charles Turner's icon

"If you like, you could try it with 100 instances of each and let us know what you find."

Well, I might. Although we all have to make our tools, it would be nice to make some music instead of puzzling out the rough edges of M4L. :-)

My last question: I presume that the Javascript LiveAPI.set() is the equivalent of [live.remote~] using messages, and not the equivalent of sending [live.object] the "set" message?
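(For reference while testing: the call shape I mean is sketched below. The real LiveAPI constructor exists only inside Max's [js] object, so the tiny stub class here is purely a stand-in to illustrate the shape of the call, not the real implementation -- and whether set() shares [live.object]'s code path is exactly the open question.)

```javascript
// Stand-in stub: mimics only the surface shape of LiveAPI calls in
// M4L's [js] object, recording them so the pattern can be inspected.
class LiveAPI {
  constructor(path) { this.path = path; this.calls = []; }
  set(property, value) { this.calls.push(["set", property, value]); }
}

// The pattern in question: setting a device parameter from js.
const param = new LiveAPI("live_set tracks 0 devices 0 parameters 1");
param.set("value", 0.5);
console.log(param.calls.length); // 1
```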

Best, C.