Adding an effect to a track on the fly using M4L.

Tom Swirly:

Hello, Live People.

I have a nifty performance controller built in M4L, using a lot of JavaScript.

Given a track, I'd like to add a new effect to it on the fly.

Well, I can change parameters if an effect is already there, but I've been unable to add a new device that wasn't there before. For example, I tried setting the API path to 'live_set tracks 1 devices 0', but if I then try to set the property 'class_name' to 'PingPongDelay' or 'name' to 'MyPingPongPreset', nothing at all happens, and if I dump my LOM into a text file I see no changes to it.
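For reference, this is roughly the kind of thing I'm attempting from the js object (the track index and names are just examples):

// Roughly what I'm trying; both set() calls are effectively ignored --
// no new device ever appears on the track.
var api = new LiveAPI("live_set tracks 1 devices 0");
post("current class_name:", api.get("class_name"), "\n");

api.set("class_name", "PingPongDelay");   // ignored
api.set("name", "MyPingPongPreset");      // at best renames an existing device, never inserts one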

Any hints gratefully received.

broc:

The Live API can only change device parameters; it cannot insert or delete devices. So all the devices you want to use must already be loaded in the Live set, and can then be switched on/off as needed with the 'Device On' parameter.
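For example (a minimal js sketch; the track and device indices are placeholders for wherever your pre-loaded device sits):

// Minimal sketch: switch a device that is already loaded in the set on or off.
// "Device On" is parameter 0 on the native devices.
function setDeviceOn(trackIndex, deviceIndex, on) {
    var param = new LiveAPI("live_set tracks " + trackIndex +
                            " devices " + deviceIndex + " parameters 0");
    param.set("value", on ? 1 : 0);
}

setDeviceOn(1, 0, 1);   // first device on track 2: on
setDeviceOn(1, 0, 0);   // ...and off again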

Tom Swirly:

:-( Just as I feared.

What a terrible, terrible weakness. It means, specifically, that you can't build up a library of effects and just recall them on the fly; nor can you select a patch you haven't already put into your Live Set; and I needed to do both of these.

What sort of "programming" system is it where you can only turn things on and off and not reconfigure them? Did anything stop and think what a "Live" musician might want to do?

I'd ask if this problem will be fixed... but they never fixed the "Max For Live only works on one channel" bug (which is APPALLING and causes me endless trouble) and it's been two years and we've seen a new full version of Max since then.

Thanks for the answer, though - it at least saves me wasting more time!

hems:

I had the same kind of problem when making a personal environment in Max/MSP.

I wanted to dynamically change my patch in order to add and remove FX (or even channels) on the fly, but some objects, like vst~, would stop the sound when created.

That said, one solution is to have a pool of pre-instantiated objects and then route the signal dynamically through this limited pool (rough sketch below), which, let's say, is not the ideal solution.
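Something like this from a js object driving a [matrix~], with the pre-instantiated FX slots sitting on its inlets (the pool size and matrix layout are made up, just to show the idea):

// Sketch of the "fixed pool" idea: the FX are instantiated once, and this js
// only rewires a [matrix~] so the signal passes through one slot at a time.
var POOL_SIZE = 8;    // pre-instantiated effect slots
var current = -1;     // slot currently in the signal path

function select(slot) {
    if (slot < 0 || slot >= POOL_SIZE) return;
    if (current >= 0)
        outlet(0, current, 0, 0.0);   // disconnect old slot: inlet outlet gain
    outlet(0, slot, 0, 1.0);          // connect new slot
    current = slot;
}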

Some people told me Max is going in a more "live patching" direction, and that problem should be solved in the not-so-far future. Let's see.

Porting this "problem" to Live, one solution would be to have a pool of objects in the Live set and somehow route them dynamically. Sounds silly, I know; it might even be impossible to do.

At this point I don't know if the Python API (which "everyone" says is the real API for Live?) would help much, but it might be a solution.

If none of those helps, the solution might be to have your instruments in Max, SuperCollider or whatever, and use Live basically as a clock driver or MIDI controller. Hmm… at that point you would be free to choose another DAW, since Live is then just a simple "sequencer", and as a sequencer you may have better options depending on your focus.

All of this might just be a lot of random bullshit, but honestly, that's what goes through my head when instantiating a device in Max for Live is not a straightforward task. In other words: sorry for thinking out loud.

hems:

I guess asynchronous instantiation would be a "doable" solution at this point.. Well.. just saying..

happy patching

alersito:

I think it's possible, maybe using bpatchers and recalling them when you need an effect. Like MU (JazzMutant), you need to map all effects beforehand.

Tom Swirly:

> Porting this "problem" to Live, one solution would be to have a pool of objects in the Live set and somehow route them dynamically. Sounds silly, I know; it might even be impossible to do.

Not at all, that's a good idea and I think that's what I'm going to do for my instruments at least. It means a large number of tracks, nearly all of which are disabled at any given time, and it isn't flexible, but I really have no choice.

One of the things that frustrates me about using Live is that unless I want to do something that I have more or less completely pre-sequenced, I always have to spend at least a minute staring at and mousing on the computer - which is TOTALLY BORING for a live audience.

I was hoping to build up a library of patches and effects that I could scroll through from my footpedal and select sounds or effects "on the fly". And it works fine - for my external hardware.

I have all the technology written, integrated, tested, ready to go - I have a really sweet organizational system, I can do it all from a pedal or even from program changes in Live, I can display the program names on my footpedal - it's really frustrating to realize I have to have one Live track per audio patch I ever use.

To compare, on the one hardware synth I still bring, I can access 2 banks of 128 sounds, mainly instrumental, which I have organized into about 30 categories in hierarchies like sax/soprano/BrtSopSx.

I only use three buttons on the foot pedal for this, but I can easily get to any patch I choose within 10 seconds - and even more, I can get to the last sound I used within any "category" fast, so if I quickly need to be "a soprano sax" and I'm not picky about the specific patch, I can do this in literally three seconds. (And no memorization - you scroll through categories and subcategories...)

And I have a display window that pops up with just the information I need in BIG LETTERS so I can just "glance" at the computer from several feet away if I have to.

I have the full Live Suite and the full Native Instruments Suite (and Reason, which I don't use much these days - I should do more with it...), and I have made my own patches and even imported patches I made 20 years ago on a TX81Z onto a virtual synth(!) - I don't even want to guess how many software instruments I have.

The various Native Instruments programs have various sorts of browsers and organizers that help me out when I'm composing, and Live has its own mechanisms for storing patches with projects...

Between my own patches, patches I've tweaked and stock patches I've practiced with, there are at least a hundred patches I know and need to jump to in some song, and perhaps as many as a thousand(!) I've tried out that I might want to load on a whim at some key point in a set or playing with other people. And, man, some of these patches are things I only dreamed of when I first started doing computer music (1980! It was a PDP-11).

And yet I'm thinking of getting another hardware unit (very small, please!) just because it's impossible for me to reach any of these thousand ultra-cool patches live without turning my face away from the audience and fiddling with my laptop for a minute or two - so when I pick a new sound to play, it's nearly always from the hardware instrument.

Isn't it ridiculous that I might feel forced to purchase another hardware device because the access to sounds is so much easier in hardware! (And it probably can't happen - I don't want to spend the money and I don't want to carry it around...)

Tom Swirly:

All right.

I have a solution, a strange and somewhat ugly solution but one that will work and not be a lot of work for me... BUT I worry I'll reach some sort of constraint because I'll have to have an immense number of tracks (even though almost all of them will be completely disabled most of the time).

It seems to me that a track that's completely disabled (i.e. has no input, is muted, and has all its devices disabled) will use no CPU at all. There will be RAM usage, but that might well get swapped to disk automatically if it's simply not being touched, and I don't see that my synths and effects really use a lot of memory - it's mainly samplers, but even then we have a lot of memory these days, and most of the samples are things like drums, where the samples are quite short.

So I'm going to create a single master document with a very large number of tracks - though most of them will be hidden in two groups.

One group will be all the "instrument patches" and one group will be all the "audio effects". When I warm up, all of these tracks will have input and output disabled.

When I select an instrument, my Max For Live code will turn that instrument track on and turn off any others that were on. Heck, I'll bet I could manage a fade pretty easily...
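The selection part is basically this (a sketch - the list of instrument-track indices is a placeholder for whatever I end up reading out of the group):

// Sketch: un-mute the chosen instrument track, mute the rest, and flip
// "Device On" on their devices so the disabled tracks really go idle.
// INSTRUMENT_TRACKS is illustrative; in practice it comes from the group scan.
var INSTRUMENT_TRACKS = [2, 3, 4, 5];

function selectInstrument(trackIndex) {
    for (var i = 0; i < INSTRUMENT_TRACKS.length; i++) {
        var t = INSTRUMENT_TRACKS[i];
        var active = (t === trackIndex);

        var track = new LiveAPI("live_set tracks " + t);
        track.set("mute", active ? 0 : 1);

        var devCount = track.getcount("devices");
        for (var d = 0; d < devCount; d++) {
            var devOn = new LiveAPI("live_set tracks " + t +
                                    " devices " + d + " parameters 0");
            devOn.set("value", active ? 1 : 0);
        }
    }
}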

When I select an effect, I turn on that effect track and dynamically route whatever I'm trying to effect to that.
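The routing is the part I'm least sure about. The sketch below assumes the track's 'current_output_routing' property (a string in the LOM) accepts the name of the effect track - indices and names here are illustrative:

// Sketch: point a source track's output at one of the pooled effect tracks,
// and switch that effect track's device on.
function routeToEffect(sourceTrackIndex, effectTrackIndex, effectTrackName) {
    var src = new LiveAPI("live_set tracks " + sourceTrackIndex);
    src.set("current_output_routing", effectTrackName);

    var fxOn = new LiveAPI("live_set tracks " + effectTrackIndex +
                           " devices 0 parameters 0");
    fxOn.set("value", 1);   // enable the (single, assumed) effect device
}

routeToEffect(0, 12, "fx-delay-pingpong");   // hypothetical effect track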

The really neat part about this is that I can dynamically read the contents of those two groups when the program starts up and populate my interface accordingly. I'll probably need a good naming convention for this - something like category-subcategory-name, e.g. drum-conga-tumbadora2 - but it will mean that I can just edit the Live document and not have to change any data in the Max For Live world.
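Reading the names and building the menu structure should be simple enough (a sketch; the naming convention is the only thing it relies on):

// Sketch: scan all track names at startup and build a
// category -> subcategory -> patch map from "category-subcategory-name".
function buildPatchMap() {
    var set = new LiveAPI("live_set");
    var numTracks = set.getcount("tracks");
    var map = {};

    for (var i = 0; i < numTracks; i++) {
        var track = new LiveAPI("live_set tracks " + i);
        var name = String(track.get("name"));   // e.g. "drum-conga-tumbadora2"
        var parts = name.split("-");
        if (parts.length !== 3) continue;       // not one of my pool tracks

        var cat = parts[0], sub = parts[1], patch = parts[2];
        map[cat] = map[cat] || {};
        map[cat][sub] = map[cat][sub] || [];
        map[cat][sub].push({ name: patch, trackIndex: i });
    }
    return map;
}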

Except for reading tracks that are inside a group, I have all of this already working, so the coding should be easy. The question is, what happens when I have 200 tracks?! At least I'll be able to use "quite a few at a time".

Stay tuned...

amounra:

Ableton's answer to this has always been the chain selector. Except for a few devices (like beat repeat), all of the native Live devices are essentially disabled if audio is not passing through them or if they are not part of the active chain in a group device.

Yeah, irritating... instead of merely calling up a preset, you have to have all possible presets loaded and switch between them via the chain selector. But it is nice that when those devices' chains aren't active, they aren't using any resources.
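From M4L, switching chains comes down to setting whatever the chain selector is mapped to - a sketch, assuming it has been mapped to Macro 1 (parameter 1 on a rack), with placeholder indices:

// Sketch: change the active chain of a rack by moving the macro the chain
// selector is mapped to. Indices and the Macro 1 mapping are assumptions.
function selectChain(trackIndex, rackIndex, chainValue) {
    var macro = new LiveAPI("live_set tracks " + trackIndex +
                            " devices " + rackIndex + " parameters 1");
    macro.set("value", chainValue);   // 0-127 by default
}

selectChain(0, 0, 64);   // jump to whichever chain covers value 64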

As far as m4l devices go, I have no idea how this works (and am anxious to find out, if someone has the info).