Max for Live: A Sneak Peek at the Live API features


So far we have talked about how Max for Live will allow you to create your own custom Max devices that run inside of Ableton Live. Most of the examples you’ve seen so far have been pretty similar to your average plugin, with the fundamental difference of being able to edit the device in place. That in itself is pretty spectacular, and probably enough to please a lot of people and keep everyone busy. Now I’d like to talk about a couple of features that really make Max for Live unique and pretty exciting: namely, the Live API objects.

For those of you who aren’t well-versed in geeky acronyms, the Live API provides the ability to access the greater Live user interface from within your own device. This will offer an unprecedented amount of control and interaction, and it will be fully documented.

Download the devices used in this article.

The Live API made something of a public debut in 2007 when a few Live users exposed a Python-scripting interface to control various aspects of a Live set. This API was originally developed by Ableton for testing and creating hardware controller templates. With Max for Live, we have had the opportunity to work with Ableton to fully integrate the features of this API into Max in the form of four objects – live.path, live.observer, live.object, and live.remote~. We have also convinced Ableton to add a few key features to the API to make Max patching and creating simple utilities more straightforward and robust.

What it can do

The Live API provides access to a Live set so that we can gather information about what is happening or change the behavior or state of the set. This means you could write a Max device that triggers clips, randomly generates parameters for other devices, and behaves differently depending on what else is going on in Live. The Live API also provides access to the same tools Ableton uses to create hardware control surface templates and interfaces, with the addition of all the features Max brings to the table. To give you a better idea of how this works, let’s look at the objects themselves and some really simple examples.

live.path

In order to control something in Live using the API, you have to navigate the object hierarchy to find the specific parameter you want access to. For this purpose, we have live.path. This object takes navigation commands as input (goto …) and outputs an ID number for the specific element you navigate to. This ID is used by the other API objects to point them in the right direction. The live.path object can also be used to gather information, like how many tracks are in a set, or how many parameters are in a device. Using the “goto this_device” command, followed by a “getpath” message, you can also find out where in the Live set your Max device is located.
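As a rough sketch of that message flow (the track index here is an arbitrary example, and exact message spellings should be checked against the Live API reference), a live.path object might receive messages like:

```
goto live_set tracks 0 mixer_device volume   <- navigate to the first track's volume; an id is output
getcount tracks                              <- report how many tracks the set contains
goto this_device                             <- navigate to this Max device itself
getpath                                      <- report where in the set the device lives
```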

live.observer

Sometimes you just want to know if a particular clip has been triggered yet, or what the volume settings are for your tracks. For this, we created the live.observer object. This handy object attaches itself to a specific UI element or parameter in Live and reports the state of it. Whenever the value changes, it will output the new value. This allows you to do things like listen for specific clips getting triggered, or modulate values in your device based on the parameters of another device.
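A minimal observer hookup might look like this patch sketch (the track index is arbitrary; the id cord coming from live.path is how the observer is told what to watch):

```
[live.path live_set tracks 0 mixer_device volume]
        |
        | (id)
[live.observer]   <- sent the message "property value"
        |
[number box]      <- updates whenever that track's volume changes
```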

live.object

This object allows you to control the state of various values and trigger events in the Live Set. This is the real workhorse of the API objects, since it allows you to do things like making basic clip alterations, changing the values of different parameters, querying information, and significantly altering the behavior and state of a Live set. Most things you can do with a mouse click in the Live interface, you can do with live.object. This includes manipulating MIDI clips, changing clip colors, triggering events, and altering playback. This object also provides the interface for designing custom control surface mappings for Live.
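As a sketch (assuming the id supplied by live.path points at a clip slot or a device parameter; message spellings should be checked against the reference), live.object might be sent messages like:

```
call fire        <- trigger the clip slot, as if clicking its launch button
get value        <- query the current value of a parameter
set value 0.5    <- set that parameter (this action lands in Live's undo history)
```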

live.remote~

Since live.object is designed to mimic user interactions with the Live Set (and adds to undo history), there are certain things that it probably shouldn’t be used for, like rapidly modulating the parameters of effects. For this purpose, we created an object called live.remote~, which allows you to directly modulate the parameters of any “remoteable” control in Live at signal rate. Those of you familiar with the Pluggo modulator plugins will be astounded at the possibilities opened up when every knob and slider in Live is controllable by a humble Max patch with sample accuracy. For example, one could copy the clever LFO patches Gregory Taylor writes about in his recent articles and use these same processes to modulate the drive on a Saturator device or the transposition of one of the drums in an Impulse device.
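A sketch of that kind of signal-rate modulation (the LFO rate and scaling are arbitrary examples; live.remote~ is aimed at its target by an id from live.path, for instance a Saturator's Drive parameter):

```
[cycle~ 0.5]               <- a slow sine-wave LFO
      |
[scale~ -1. 1. 0. 1.]      <- map the -1..1 LFO into the parameter's 0..1 range
      |
[live.remote~]             <- id from live.path selects the target parameter
```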

Video represents an earlier stage in the Live API development and should not be depended on for programming techniques.


What You Can Do

To see a couple of simple examples of things you can do with the Live API, have a look at the video above. This video shows live.path, live.object, and live.observer in action. In the first instance, we are just using live.path to find out how many tracks we have in our set. The same could be done with clip_slots, devices, parameters, etc. In the next segment, we are using live.observer to monitor the volume of one of our tracks, and then live.object to set the volume. It’s important to note that even though we are just using number boxes as an interface, any number of procedural methods could be employed to set these values. Lastly, live.object is being used to trigger some clips in our track. If we were to query the number of clip_slots in our track, we could easily set up a random number generator or other logic to trigger clips in interesting ways. While we’re at it, we could also alter transposition and scrub the playhead around too.

Video represents an earlier stage in the Live API development and should not be depended on for programming techniques.


The video above shows the live.remote~ object being used as an assignable LFO for the parameters of a Saturator device. While there are a couple of simple things happening behind the scenes (getting the list of device parameters, scaling the LFO to the range of the parameter) you will see the patch itself is pretty straightforward. Extending this little device, we could create all sorts of complex sonic behavior with just a little patching. Since the Integrated Timing features of Max 5 also work with the Live transport, we can also set up perfectly synced waveforms to use as LFOs with live.remote~.

Of course, it’s not all about oscillators. Live.remote~ (and live.object) can also be used to map custom non-MIDI controllers to specific device parameters without having to convert the data to 7-bit MIDI messages. Since the Live API objects bypass MIDI, you can take full advantage of the sample-accurate, floating-point precision of the Live device controls.

We think the Live API objects for Max in Live will open the door to completely new ways of working with Ableton Live, but they will also allow you to create really practical “utility” devices that help you solve specific problems in your Live project. Ever wished Follow Actions did a little more, or that you could just connect one knob to another knob, or add more chaos to the environment? Want to connect a hardware device to Live that doesn’t use MIDI? Do you just love to set up bizarre control structures and interdependent, complex systems for event sequencing? All of these things will be within reach, along with many more that I can’t think of.



djvalentine
January 23, 2010 | 6:52 pm

This is all very great and exciting, but please, where is the actual documentation for the LiveAPI? Like, how did you know that mixer_device was the name you were looking for in the live.path? This must surely be documented somewhere for us humble users, or are we expected to guess through trial-and-error what the various controls are and how they are structured within a live.path hierarchy?

A simple hyperlink to the actual LiveAPI documentation would be extremely apposite here in this otherwise fascinating article.


January 25, 2010 | 7:42 pm

As the title may suggest, this article was written before Max for Live was released, and was never intended to be a tutorial on the Live API. That said, we do have a bit of documentation inside the Max help browser (search for Live API). Here is a link to the online version of that document for reference.

