Forums > Max For Live

Controller python script or M4L ?

Jan 11 2010 | 1:53 pm

Hi !

I’ve got a padKONTROL and I’ve been using personal Mackie and Tranzport emulations, plus Bome MIDI Translator to get even more controls (like cycling through the folders in Browser view, very cool and simple :D)

But now I’m bored with such a mess… too many things running at the same time.

Two ways forward for me now, at least to eliminate the emulations:

-make my own controller Python script, like Korg should have done :D, based on the nanoKONTROL Myr or the FCB1010 looper scripts: they work in AL8 (the Python version changed in AL8), so I just need to add some old decompiled Python from other controller scripts ;)
-use M4L, with Python or regular patches, to access the same API

My question is:

-can I do the same things with the API in M4L as with controller Python scripts?
Are the objects, methods… exactly the same, so that I can integrate controller Python scripts directly inside M4L with minimal recoding?

Maybe this question has been asked a thousand times already, but I can’t access the Search function here: every time I click on Search, it disappears (old IE6 crap at work ;)

Thanks ;)

Jan 11 2010 | 8:11 pm

Before MFL was released I tried to control Ableton Live through the Python interface of the LiveAPI. This works! In fact, the LiveAPI in MFL is essentially a wrapper around the Python interface.

I’m working on Windows XP. When there is a syntax error in the Python script, Ableton Live crashes. In my case the ASIO resources are not freed and my PC has to be rebooted in order to use ASIO again.

I installed IronPython in Visual Studio. That helped me discover syntax errors before Python ever compiled the script inside Live. But every development loop still required me to restart Ableton Live. Tedious and very time consuming.

The LiveAPI is still not officially documented. There are small differences from one minor update of Live to the next. For instance, some boolean value changed from true/false to 1/0 (or the other way around) without any notification. The parent property was silently introduced recently to facilitate the workings of MFL.

Debugging is close to non-existent in this Python environment. You have to set up your own try/except structures and create a way to communicate debugging messages. I have used a UDP stream to an external console for this.

This means that the Python interface is hard to work with and error-prone at best.

After licensing MFL I created a very straightforward Python script for my Novation Nocturn. All it does is forward MIDI CCs to an MFL device and echo the changes from MFL back to the LEDs around the knobs and in the buttons.

From this patch I can do anything I want with the CC changes using the real-time programming possibilities of MFL. No crashes, only messages in the main Max window.

After struggling with Python this sure is a relief!

Jan 11 2010 | 8:27 pm

I had been using the Python backdoor for two years.
With M4L, it all became official, supported, etc.

So now I only go with M4L. Safer…

Jan 11 2010 | 10:58 pm

Thanks for your replies ;)

I was afraid of losing some access to the API (like non-implemented objects, functions…)

That’s clear, the big advantage with M4L is support!!!
When Ableton changes the API, the Python version… they tell nobody,
and some things stop working (like the Lock button in the generic MIDI remote script in AL8, solved now).

Now with MFL, they are obliged to tell at least Cycling ’74 ;)

And debugging/crashing is another very good reason: I want to make music, not spend hours debugging, crashing and restarting :D

You’ve convinced me with your experiences and arguments ;)

Jan 12 2010 | 8:25 am

In June 2008 I had the chance to meet Gerhard B (Ableton CEO) and Robert Henke in France (a little workshop around Ableton Live before a Monolake performance).
We discussed the Python API. At the time they couldn’t tell me anything about Max for Live (announced in January 2009), but they did tell me:
"it works with the current version. If you want to build things, go. But we cannot assure you we’ll support this in the future" … They convinced me right there :)

"And debugging/crashing is another very good reason: I want to make music, not spend hours debugging, crashing and restarting :D"
I agree 200%!

If you’re interested in API/interface/hardware, you can check my page:

and especially this part :
unfinished yet, but I will!

all the best

Jan 12 2010 | 8:06 pm

Pipotron4000 wrote:

"I was afraid of losing some access to the API (like non-implemented objects, functions…)"

I’m not sure each and every feature of the Python interface can be accessed from MFL’s Live Object Model.

If you really need access to a property or function that is not supported in the LOM, you can always write a Python script and bridge it over UDP or OSC.

One more reason in favor of MFL is latency. The Python LiveAPI runs on a clock that ticks only 10 or 12 times per second. You have to work around that latency to put it to good use.

I have not done extensive explorations, but some simple tests show that an MFL timepoint bangs within 10ms of the specified time point. I expect that experienced Max/MSP users can teach us ways to do even better.

Nov 20 2012 | 8:15 pm

Reviving this old thread (awesome, by the way)… So if m4l is a wrapper for the Python Live API, how can it beat the latency issues of Python? I ask because I’m wondering if m4l is able to do anything that the Python API cannot in terms of access to Live functionality.

It would be cool if you could make controller scripts using m4l. I like not having to drop an m4l device in a set to be able to use my interface. A plus I see with the Python API is that you don’t have to own Suite in Live 9 to be able to use it. If I make some controller interface I want everyone to be able to use it. I don’t know if there’s another workaround for this…

Nov 20 2012 | 10:54 pm

A good number of the latency limitations that plague this sort of thing aren’t inherent in m4l or Python: they are due to Live itself. The thread Live uses for Python is locked at a minimum of 60ms latency, and there are certain ways to get things done faster than that within your Python routines, but that doesn’t speed up Live’s interaction with Python at all (for instance, even if you poll the position of a clip at faster intervals, Live’s engine only updates the pointer to that position every 60ms).
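The effect described above can be simulated in plain Python (no Live involved; the 60ms figure is just taken from this post):

```python
def poll_samples(poll_interval_ms, engine_interval_ms=60, duration_ms=300):
    """Values a poller would read from an 'engine' that only publishes
    a new value once per engine_interval_ms."""
    return [t // engine_interval_ms
            for t in range(0, duration_ms, poll_interval_ms)]

# Polling every 10ms just reads the same stale value six times per tick:
fast = poll_samples(10)  # [0, 0, 0, 0, 0, 0, 1, 1, ...]
slow = poll_samples(60)  # [0, 1, 2, 3, 4]
```

Both pollers end up with exactly the same set of distinct values; the fast one just burns cycles re-reading stale data.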

M4L can do lots of things that the Python API can’t, simply because it provides more tools and has a faster timing mechanism. But for the most part it still has the same limitations for interacting with Live (with the exception of the [live.remote~] object, which can be used at audio vector rate) as the native Python processes do. In m4l, there is additional latency induced by having to pass through an extra layer to get to the same processes.

In short:

m4l (C) -> Python -> Live (C) -> Python -> m4l (C), with an extra step to get to JavaScript at each end if you do things like I do and use js for the API interaction:

–Even though you can do things faster in Max than in Python (regarding latency), getting to Live via m4l is slower because it has to pass through several translation layers.

Python -> Live (C) -> Python:

–Less overhead is required for processing events, since there are no extra layers to go through. In addition, MIDI assignments can be made directly through Live to their intended targets, without the extra processing time of going through the MxDCore layer and Max’s own machinations.

(wait, that wasn’t very short ;) )

Calls to Python can be generated faster than the 60ms update barrier, and they are processed as they are received. There’s just (currently) no way to spawn a thread in Live’s Python and have it run any faster than 60ms without prompting from some external call. I’m working on that bit right now, actually…

As far as I can tell, all m4l activity (except for [live.remote~]…I haven’t investigated its mechanism at all) is handled by a single MxDCore instance. Presumably (pure speculation on my part), all Python activity is limited to a single thread.

What does all this MEAN?! Well, I tend to think it is faster and more efficient to include as much API interaction as possible in a Python script, and then do the things that are not possible with it (or are not timing sensitive) in m4l. This works well, but requires a certain amount of "hacking" to get things working. If there were a reliable means to script all of the stuff I do directly from Max, I might consider it… but, in the end, Python is generally faster and more efficient.

YMMV, cheers :)


Nov 20 2012 | 11:54 pm

Thanks heaps for that extensive breakdown.
This raises my next important question: since m4l is only available with Suite from Live 9 onwards, perhaps we are better off with Python anyway, dodging m4l, so that more users can utilise what we build, using Max runtimes and such.

Apart from live.remote~, which is an important one:
Is there no equivalent in the Python API for this? I’d love to know how it does its magic! Maybe it’s skipping the Python step and going straight from m4l into Live’s C core.

On another note, can live.remote~ be recorded into automation? I’m guessing not, since it disables the parameter in Live’s interface.

Nov 21 2012 | 5:35 am

Another question: if an MPC uses the same Python controller scripts by default with MIDI, how come it doesn’t have these latency problems? I’m pretty sure if I press a MIDI note on a controller it doesn’t suffer from large latencies. Thanks again guys.

Nov 21 2012 | 7:30 pm

Apart from live.remote~, which is an important one:
Is there no equivalent in the Python API for this? I’d love to know how it does its magic! Maybe it’s skipping the Python step and going straight from m4l into Live’s C core.

No. That doesn’t enter Python, as far as I can tell. The Abes wrote some hooks (or, more likely, exposed existing ones to the Max extension) that are accessed directly for this.

On another note, can live.remote~ be recorded into automation? I’m guessing not, since it disables the parameter in Live’s interface.

I’ll leave that one for someone else… I honestly don’t use [live.remote~]… ever. Functionality vs. performance hit has never been a fair trade-off for my work, which usually sees my Live CPU @ 40%+ the way things are without it ;)

Another question: if an MPC uses the same Python controller scripts by default with MIDI, how come it doesn’t have these latency problems? I’m pretty sure if I press a MIDI note on a controller it doesn’t suffer from large latencies. Thanks again guys.

Yeah, there’s an important concept to grasp at the center of that: a great deal of the Python _Framework implementation is built around forwarding commands to Live. So, for instance, if I want to map a CC to a parameter, I send Live a message through c_instance and tell it to assign CC#blah to live.parameter[‘blahblah’]. Live does all the work after that, and Python doesn’t see any of the interaction, unless we explicitly forward the CC# back to Python by telling it we want a callback.

In that example, the CC# going out to the control is going to be updated without the 60ms latency limitation. However, if we are forwarding the data to Python and then doing something with it, we get the latency.

So, for MIDI mapping purposes, things are pretty fast. But for scripted functionality, not so much.
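The split described above can be sketched as follows. In a real script both calls happen inside build_midi_map(), via something like Live.MidiMap.map_midi_cc(…) for direct mappings and Live.MidiMap.forward_midi_cc(…) for callbacks; since the Live module only exists inside Live, this stand-in handle just records the assignments:

```python
class FakeMidiMapHandle(object):
    """Stand-in for the handle Live passes to build_midi_map (illustration only)."""
    def __init__(self):
        self.direct = []     # Live services these itself -- no 60ms penalty
        self.forwarded = []  # these come back to the script's receive_midi

def build_midi_map(handle, volume_param):
    # Direct mapping: Live moves the parameter itself when CC 7 arrives.
    handle.direct.append((0, 7, volume_param))  # (channel, cc, target)
    # Forwarded CC: handed back to the script, so any scripted reaction
    # to it is subject to the slow update cycle described above.
    handle.forwarded.append((0, 20))            # (channel, cc)

handle = FakeMidiMapHandle()
build_midi_map(handle, "track_volume")
```

The point is purely architectural: anything in `direct` never touches Python again, while anything in `forwarded` pays the scripting latency.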

Another note: sending MIDI in and having the script process it takes more time than sending the script a message, say, from an m4l object, and telling it to send MIDI out. The latter seems to be instantaneous, and I’ve had faster results sending MIDI loopback through an IAC bus from Live’s Python than going from track to track in Live. (Figure that one out??!)

On the other hand, I might just be crazy.

Nov 21 2012 | 9:02 pm

Enlightening, thank you. So perhaps I could use the Python script to dynamically assign MIDI CC inputs (time-critical) to device parameters, with the target strings given by Max over UDP (not time-critical). The updates for the assignments can come from Max, but the actual data is MIDI map definitions that are taken over by Live and handled with low latency. Are there example scripts around that do something close to this? Maybe LiveControl by ST8 does it?

Nov 21 2012 | 11:17 pm

Not sure how ST8 does things, but that’s how I’ve been doing things most recently (though through m4l instead of OSC). It works well, and is a lot faster than routing MIDI into m4l from the Python and then sending data back to Python for processing. The DeviceComponent in the _Framework shows how to do this, and I’m currently working on a more general solution for this sort of thing for Monomodular b995. You can see the technique in the most current Livid stepp:r devices (the backend is embedded in the CNTRL:R and OhmModes2 scripts… although the Device stuff isn’t commented at all at this point). It’s easier with m4l because you can send live.object pointers through the m4l/Python wall and detect them there, but you can also provide name dictionaries in the PyModule to look them up. We do both.

Nov 22 2012 | 12:52 am

Cool, thanks, I’ll have a look through all of this!

Jan 22 2013 | 1:43 am

Good READ!

So I have a question.
I have been using M4L since it came out, to satisfy my custom APC40 needs.
A few times I tried digesting Python / remote scripting and just abandoned the idea, because I knew enough M4L to get things done, and I didn’t want to spend any more long hours learning and building instead of playing music.

That said, there are results with Python I just can’t get with M4L, and vice versa.
A prime example would be having an M4L device that engages a mode that turns the APC40 matrix into sending notes.
It’s easy to put this device before an instrument and play it, and it’s easy to get a recorded MIDI clip in that track to send LED feedback of its notes, but it’s kind of a lame workaround to actually record notes into the track this device sits on.
With a remote script I could select the APC40 as track input and channel and BOOM.
However, learning enough Python to do these kinds of things, that’s my limitation.

Should I pursue Python? Or pursue a better workaround in M4L?

Jan 22 2013 | 9:49 am


Off the top of my head, here’s what you can do in Python that you can’t do in m4l:

You can (sort of) directly access the MIDI coming into the script. Caveat below.

You can tell Live to translate MIDI coming into the script’s port into different channels and/or id’s.

You can bypass several layers of code translation (this makes things faster).

You can communicate directly with other MIDI Remote Scripts that might be installed.

You can send MIDI data directly out of a connected MIDI port (and since you can communicate between Python scripts, this gives you six possible destinations).

You can change the way the _Framework operates, and modify the base-level functionality of published MIDI Remote Scripts (with m4l you can turn stuff on and off and send/grab/watch values, but you can’t really modify the essential behavior of the components).

That’s not an exhaustive list, but those things are pertinent to what it sounds like you’re trying to do.

What you can’t do with Python (yet, anyway, and I don’t think this will be changing anytime soon):

You can’t "inject" MIDI into Live, i.e. you can’t forward MIDI data to one of Live’s input ports to be received in a Track.

You can’t transform CCs to notes and vice versa.

You can’t schedule messages more often than 60ms (you can process LOTS of them every 60ms, but then you have to wait for the next thread update…this means if you want a button to blink on and off, say, every 10ms, it’s not going to happen).

You can’t send/receive arrays of more than 4 elements between Python and m4l (this is a limitation of the C++ process that ties Max and Python together).

You can’t receive notes from Live, e.g. from the output of a MIDI track. Live’s Python is rather isolated in this way (it can’t send to Live’s tracks, and it can’t receive from them).

You can’t store settings along with Live sets.
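As an aside on the 4-element array limit above: a common-sense workaround (my own sketch, not from any published script) is simply to chunk larger arrays before pushing them across the bridge:

```python
def chunk_for_m4l(values, max_len=4):
    """Split a sequence into <=4-element chunks so each piece fits
    through the Python/m4l bridge described above."""
    return [values[i:i + max_len] for i in range(0, len(values), max_len)]

chunk_for_m4l([1, 2, 3, 4, 5, 6])  # [[1, 2, 3, 4], [5, 6]]
```

The receiving end then reassembles the pieces, e.g. keyed by a sequence number prepended to each chunk.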

What I would do as a solution to the puzzle you mentioned, personally, is to set up the translation in Python so that the notes are forwarded to the proper MIDI channel that your instrument is sitting on, and then use m4l to bump them back to Python so that you can display what you want coming back from the track (it sounds like this is already sort of what you’re doing). It would get slightly complex, but not overly so depending upon exactly what you’re trying to do.
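The channel-translation step suggested above boils down to rewriting the channel nibble of the MIDI status byte; a minimal, Live-free sketch (function name is mine):

```python
NOTE_ON, NOTE_OFF = 0x90, 0x80

def retarget_channel(midi_bytes, new_channel):
    """Move a note-on/off message to new_channel (0-15);
    pass any other message through untouched."""
    status = midi_bytes[0]
    if (status & 0xF0) in (NOTE_ON, NOTE_OFF):
        return ((status & 0xF0) | new_channel,) + tuple(midi_bytes[1:])
    return tuple(midi_bytes)

retarget_channel((0x90, 60, 100), 5)  # (0x95, 60, 100): same note, channel 6
```

In a script, this would run on the translated messages before they reach the instrument’s channel, while LED feedback goes back out on the controller’s original channel.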

FWIW, I think I’ve done most of what you’re talking about already. I can’t offer a completely "out of the box" solution for you, but all my scripts do both of these things rather well: they just don’t do it together. It’s a good idea though now that I think of it (being able to call the translation from m4l), when I have more spare time I’ll pursue it myself.

Unfortunately, my APC20/40 scripts are the only ones I’ve written that don’t use some sort of channel/id translation, so they’re a poor example. But hit me up if you need more info, I’m happy to help if I can.

Sorry for the book I just wrote. I guess my real answer to your question is this: Hopefully you’ve already learned something about Python today, so go make some music and forget about it until tomorrow ;)

Jan 22 2013 | 10:16 am

The Python LiveAPI machinery is, I assume, still totally undocumented and constantly subject to change? (Having said that, many vendors seem to be happily turning out working scripts.)

Jan 22 2013 | 7:21 pm

Commenting has gotten a bit better internally, but there are no official docs or support. Changes in the past have generally been pretty "do no harm"; I’m sure the Abes don’t want to have to rewrite all of their own scripts either. With the imminent release of Push, things are changing quickly and there’s a lot of breakage, but I tend to think the changes affect my own work (I tend to monkeypatch and augment a great deal) more than most other people, who are either not using the _Framework and building from scratch, or using the _Framework largely as it stands. In any case, there are a lot of cool new additions to the Python code used in Live 9.

Keep in mind, their custom build is still based on Python 2.5, even though they’ve implemented some stuff from newer versions and appear to have rewritten things to support new-style classes.

Jan 23 2013 | 7:47 pm

Cool – thanks for the summary.

Jan 23 2013 | 7:52 pm

amounra, thanks for the book, I enjoyed it, seriously!

Mar 18 2013 | 12:16 pm

Hi!

I’m working on an iPad modular-concept controller using the Live API and Python. I started with the LiveOSC scripts and I’ve modified a lot of stuff to fit my needs. Since it’s modular, I need to reference each device not by its path but by a unique id that stays valid across the whole project.
Does anybody know if it’s possible to identify a device not by its path (like track[0].device[0]) but by a unique id that persists in the project? I have problems identifying a device that has been deleted when I have, for example, three identical devices in the same track, or when the user changes positions in the Live set while the app is not listening to the script.

I know this concept from M4L, but I don’t know how to get it in the Python environment:

Persistence: The live.object object has a special entry in its inspector labelled "Use Persistent Mapping". This setting, when enabled, causes the id associated with the object to persist when the Live document is saved and restored, and when the Max Device is moved between the Live application and the Max editor, or within the Live Set. Beginning in Live 8.2.2, Live API ids remain persistent between launches of Live, which in conjunction with the Persistence feature of live.object, and live.remote~, makes it possible to create simpler devices which retain their association with elements in the Live user interface.

I’d really appreciate any help! You guys are the experts in this field!


Mar 18 2013 | 12:47 pm

Hi. The ids are now persistent across the set, i.e. they will not change during their lifetime: once an ‘object’ (be it track, clip, device, etc.) has been created, its id will stay the same… Removing a device and then re-adding it will cause a new id to be created. So yes, you should be able to do what you need…

But I only work in m4l and haven’t done any Python stuff; I would imagine the concept is the same, though…

Mar 19 2013 | 11:12 am

Hi Lee!

Thanks for replying! The problem I have is that I can’t access or set an id attribute on the Device object class in Python to identify it across the Live set. I’m starting to think that this is an internal feature of M4L; I’m a bit desperate. I also need to identify the Live set name or a unique id for it, to load the user config in the app when the project loads. My controller is not like the others I’ve seen: I’m trying to customize my own groups of controls with parameters from different devices in them.

I really appreciate your reply, and I’m hoping someone can help me… ;)

Mar 19 2013 | 11:20 am

I see, it’s quite different then…

As for the set name, I agree, I don’t know how to get this either. At the moment I’m going with a directory selector on my main device, where the user can specify a directory, and then I save that path with the device…

Ideally it would be nice to be able to get the path of the current Live set…

Nov 24 2013 | 1:00 pm


Thanks for bringing up this question, I am also looking for a way to get a persistence reference, in my case to clips.

Say I used add_start_marker_listener to add a listener to a clip. When the start marker is changed, my method clip_start_marker is called with clip, track and clipslot arguments. However, the clip argument does not always seem to be the same object, even though the clip is still the same in Live. For example, when you change the position of the clip in the session view, the clip argument no longer refers to the same memory address (which you can check with the id() function in Python).

Looking at the existing examples, apparently all scripts refer to clips by their clip slot position and track index. This is not always practical, since I want to be able to save the state of my Live set and not query it all over again every time one parameter of one clip changes.

Does anyone know if there is a way to get a persistent reference (of any kind) to a clip object in the Python API?
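One workaround that gets suggested for this (not an API feature; every name below is invented for illustration) is to stash your own id inside clip.name, since the name travels with the clip when it is moved around the session:

```python
import uuid

TAG = "#id:"  # arbitrary marker appended to the clip name

class FakeClip(object):
    """Stand-in for a Live Clip object; only .name matters here."""
    def __init__(self, name):
        self.name = name

def ensure_clip_id(clip):
    """Return the clip's stable id, minting and appending one if missing."""
    if TAG in clip.name:
        return clip.name.rsplit(TAG, 1)[-1]
    clip_id = uuid.uuid4().hex[:8]
    clip.name = "%s %s%s" % (clip.name, TAG, clip_id)
    return clip_id
```

The id survives reordering because Live keeps the name with the clip; it does not survive the user renaming the clip, so it is a workaround rather than a true persistent reference.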


