Massive CPU spikes when controlling VST with M4L

vojko v

Hi, I am having big problems controlling a VST called Chromaphone 2 via M4L. When I do regular automation with Ableton automation curves there are no problems, but when I connect the VST's parameters with Max for Live, there are MASSIVE CPU spikes and lag when automating and controlling it. The system I use to connect these parameters is standard Live API business that is tried and works smoothly on many other VSTs, but in this case it isn't working at all. Has anyone had similar problems controlling certain VSTs, and how did you solve them?

Here is the API control scheme, but I really don't believe the problem is here.

Max Patch
Copy patch and select New From Clipboard in Max.

11OLSEN

Does it make a difference if you don't switch to the signal domain before entering the remote~ object?

vojko v

looks like [downsamp~] before the remote~ helps a lot

Max Gardener

Perhaps some judicious thought about precisely which events you wish to send down the pipe (and how often) might also be in order. I have had considerable success with that, although I will freely confess with a little embarrassment that it was difficult to break the habit of "I'll just send any kind of stuff at fantastic rates of speed to all these parameters because something is likely to happen, maybe..." that I got used to when dealing with MSP alone. Vaulting across the Max/Ableton blood-brain barrier is another matter. :-)
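
(For what it's worth, here is a minimal sketch of that "only send what's needed, only as often as needed" idea, written for the [js] object and the LiveAPI. The parameter path and the 20 ms interval are placeholders, and it assumes the parameter's range in Live is 0. - 1., as it typically is for exposed VST parameters. It's only meant to illustrate the thinning logic; in the actual device you would keep the real-time connection on live.remote~, and in a plain patcher the equivalent thinning would be [change] and [speedlim] in front of [live.remote~], or [downsamp~] if you stay in the signal domain.)

// js sketch: thin the control stream before it reaches the plug-in parameter.
// The parameter path is a placeholder -- point it at the parameter you want.
autowatch = 1;

var param = new LiveAPI("live_set tracks 0 devices 0 parameters 1");
var pending = null;   // most recent incoming value
var lastSent = null;  // last value actually written

// write at most every 20 ms, and only when the value has changed
var flusher = new Task(function () {
    if (pending !== null && pending !== lastSent) {
        param.set("value", pending);
        lastSent = pending;
    }
}, this);
flusher.interval = 20;
flusher.repeat();

// floats from the patcher land here
function msg_float(v) {
    pending = Math.max(0., Math.min(1., v));  // clamp to the 0. - 1. range
}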

vojko v

Is there some documentation on this if it's so basic?

I was convinced that you always send 0. - 1. float or signal values to the attached VST parameters, and that's it.

Roman Thilenius


most VSTs do not support signal-rate modulation and other fancy stuff, and in my opinion that's a good thing.

vojko v

Is there any info on what kind of modulation Ableton's automation is? "Message rate"?

Pedro Santos

Every DAW records automation as discrete messages, like MIDI messages, rather than as signals. VST instruments/effects that need data at signal rate would have to receive it via a sidechain audio channel.
As a reference for the maximum rate of message-based automation, most sequencers have a temporal resolution of 480 ticks per quarter note.
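
(Just to put numbers on that 480-ticks figure: the ceiling on message rate scales with the tempo, and it stays orders of magnitude below audio rate. A quick back-of-the-envelope in js; the 120 BPM figure is only an example.)

// ceiling on automation message rate, assuming 480 ticks per quarter note
function maxAutomationRate(bpm) {
    var quarterNotesPerSecond = bpm / 60;   // 2 at 120 BPM
    return 480 * quarterNotesPerSecond;     // messages per second
}
post("at 120 BPM:", maxAutomationRate(120), "messages/s\n");  // 960
post("compare: 44100 samples/s at audio rate\n");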

Roman Thilenius


pedro, if you allow me, i will quote you in his other thread about recording automation from live devices.
https://cycling74.com/forums/ugly-automation

vojko v

"Some VST instruments/effects that need data at signal rate would need to use a side chain audio channel to receive it."

pls elaborate

Roman Thilenius


VST2 only allowed parameters to be controlled with numbers between 0. and 1.

VST3 allows you to control parameters with anything you like, but of course any parameter that should be controlled by audio signals would need another audio input (in one form or another) - this is why almost nobody uses it in their plug-ins.

and even where it is implemented, it is questionable whether you could reach that from a live device. (i think that was the main message)
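
(In other words, whatever "musical" unit you are thinking in - Hz, dB, ms - gets squeezed into that normalized 0. - 1. range before it reaches the plug-in, which then applies its own scaling internally. A minimal sketch of the mapping for the [js] object, assuming a plain linear curve; the 20 Hz - 20 kHz cutoff is just an example, and many plug-ins actually scale such parameters logarithmically on their side.)

// map a value from its musical range into the normalized 0. - 1. range
// that VST2-style parameters expect; the curve the plug-in applies on
// its side (linear, log, exponential) is up to the plug-in itself.
function normalize(value, min, max) {
    var n = (value - min) / (max - min);
    return Math.max(0., Math.min(1., n));   // clamp, just in case
}
post(normalize(2500, 20, 20000), "\n");     // a 2.5 kHz cutoff -> ~0.124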