If you’re a Max for Live user, you may have noticed a number of useful upgrades available to Live 10 users over the past year (some of which you may not know about yet). Live 11 has continued and expanded those changes considerably by adding a number of requested features and some entirely new features:
A live.scope~ UI object
Better/tighter integration of Live 11 with Max for Live
New features for developers of Max for Live devices
New additions to the Live API
In this two-article series, we’re going to run through those new features. In addition, the downloadable file includes example Live devices that demonstrate the features described in this article, along with example Live sets that show them in use.
A New UI Object and New Features for live.* Objects
We’ll begin with the Max external objects created to work in the Live application environment.
Let's start off with a brand new object that was recently added to Max. With live.scope~, you get a performant and accurate oscilloscope, tuned to match Live's visual style by default and ready for your own customizing touches.
Integration upgrades
We’ve worked to make the overall experience of working with Max for Live devices feel more fluid in Live 11: Device interface rendering has been overhauled, which prevents unwanted visual artifacts caused by zooming the Live interface or scrolling the Device view.
There have been numerous updates to Max for Live objects to support better interface integration:
The live.comment object now works seamlessly with the Live interface colors
The live.arrows object is now parameter-enabled (off by default)
The live.text object has a new @blinktime attribute
The live.toggle and live.text objects now respond to the Return key, allowing you to use your keyboard to toggle both objects.
The live.dial and live.slider objects now mimic the behavior of native Live UI objects — the shift+arrow keys now change the dial/slider value by 12 steps.
The live.colors object now includes more native Live colors (e.g., histogram and spectrum).
When you right-click the patcher background in Max, you will now find a Max for Live category that provides direct access to essential building blocks. Here’s one of the new snippets: Global.dB2Value. It shows how to convert a value in dB to the value you send to live.object so that a gain or volume control in Live is set to the correct dB level.
A thispatcher object in your Max for Live device can now get and return the file path of the device it is located in.
Usability and development updates
Live 11 includes some new features that enhance the overall experience of working with Max in Live when developing new devices:
When there is a warning or error associated with a Max object in the Max for Live window, you will now see which specific object caused it. Double-clicking the error line in the Max window takes you to that object's device. Right-clicking a message shows that operations such as filtering per object have been extended to devices. You can now copy text directly from the Max for Live window, too.
As a developer, you now get tools to limit your device to the Live and Max release versions and the platform in which it functions optimally. There’s a new Max for Live section in the Patcher Inspector that you can use to set those limits. In addition, you can also access these attributes via the live.thisdevice Max for Live object.
The Parameters (menu > View > Parameters) and Banks (double-click live.banks) windows have received some love, too: you can now edit the Info Title and Info Text for all your objects in one place in the Parameters window, and editing banks in the Banks window is easier.
MPE support
MIDI in Live 11 supports MPE, and a new feature for Max for Live users reflects the embrace of MPE. In the new Max for Live category in the Patcher Inspector you can use an attribute (@is_mpe) to specify whether your MIDI Effect or Instrument is an MPE device. Setting the @is_mpe attribute enables your device to modify and generate MPE data.
If you’re interested in how you can change MPE data for notes, we’ve provided a sample device in the download for this article that demonstrates how it’s done — the example device adds a pitch slide for every note that passes through:
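The sample device builds its slide in a patcher, but the underlying pitch-bend arithmetic is easy to sketch. In MPE, each note lives on its own MIDI channel, so a per-note slide is just a ramp of 14-bit pitch-bend values sent on that channel. The function names and the 48-semitone bend range below are illustrative assumptions, not anything taken from the download:

```javascript
// Sketch: the pitch-bend math behind a per-note MPE slide.
// Assumes a 48-semitone bend range (a common MPE default); the helper
// names are hypothetical, not part of the Live API or the example device.
function semitonesToBend(semitones, bendRange) {
  // MPE pitch bend is a 14-bit value centered at 8192 (no bend).
  var value = Math.round(8192 + (semitones / bendRange) * 8192);
  return Math.max(0, Math.min(16383, value)); // clamp to the 14-bit range
}

// Build a linear slide from 0 to targetSemitones in `steps` messages.
function pitchSlide(targetSemitones, steps, bendRange) {
  var ramp = [];
  for (var i = 0; i <= steps; i++) {
    ramp.push(semitonesToBend((targetSemitones * i) / steps, bendRange));
  }
  return ramp;
}
```

An actual device would send each ramp value as a pitch-bend message on the note's own channel, spaced over the duration of the slide.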
One new bit of support will be exciting for lots of you out there: just as is the case for VST devices, Max for Live Audio effects can now route MIDI to anywhere within Live and accept MIDI from anywhere in Live. Instruments can also route MIDI output to anywhere within Live.
You’ll find devices in the download for this article that demonstrate how that’s done, too:
API additions
Live 11 includes some exciting new features for the Live API for Max for Live users, as well.
The biggest single addition is an entirely new way to interact with MIDI clips. Those changes are so new and so extensive that we’re going to dedicate the second article in this series entirely to exploring them — so stay tuned!
But that’s not the only stuff that’s new in the Live API. This time out, we’ve picked out some other highlights to share, and included example devices that show them in action in the download example folder.
Simpler slices
You can now query the slices in Simpler. Matching these with incoming MIDI notes, together with another new Simpler property (sample_rate), lets you know exactly which position in your sample will be triggered.
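As a rough sketch of that matching logic, assuming (our assumption, not something the article specifies) that the queried slices come back as sample positions mapped chromatically upward from a base note, the lookup is a small pure function:

```javascript
// Sketch: map an incoming MIDI note to a slice position in Simpler.
// Assumptions: `slices` holds slice start positions in samples, and
// slices are assigned chromatically upward from `baseNote`.
function slicePositionSeconds(note, baseNote, slices, sampleRate) {
  var index = note - baseNote;
  if (index < 0 || index >= slices.length) return null; // note outside the slice range
  return slices[index] / sampleRate; // samples -> seconds into the sample
}
```

In a real device, the slices array and sample_rate would both come from Live API queries on the Simpler device.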
Arrangement Clips
You can now access the clips in Arrangement View with arrangement_clips. Most of the things you have done with clips in Session View are now applicable to the Arrangement View as well.
Warp Markers
Warp markers are no longer a mystery for Max for Live device creators — you’ve got access to them through the Live API. Querying warp_markers gives you a single dictionary with all the information you need to know where and how fast a clip will play back at any time in beats. This enables you to determine the current playback position in warped seconds, as well as a warped clip's current playback rate:
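A minimal sketch of that calculation, assuming each warp marker pairs a beat_time (in beats) with a sample_time (in seconds) and that the markers are sorted: the playback position between two markers is then a linear interpolation, and the slope between them is the local playback rate.

```javascript
// Sketch: derive a warped playback position from warp markers.
// The field names beat_time and sample_time are assumptions about the
// shape of the warp_markers dictionary; check the LOM docs for details.
function warpedSecondsAtBeat(markers, beat) {
  // Find the pair of markers surrounding `beat` and interpolate linearly.
  for (var i = 0; i < markers.length - 1; i++) {
    var a = markers[i], b = markers[i + 1];
    if (beat >= a.beat_time && beat <= b.beat_time) {
      // seconds-per-beat between the two markers = local playback rate
      var rate = (b.sample_time - a.sample_time) / (b.beat_time - a.beat_time);
      return a.sample_time + (beat - a.beat_time) * rate;
    }
  }
  return null; // beat falls outside the warped region
}
```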
More: Clip Launch Properties, Macros and Grooves
And just in case that’s not enough, we’ve got a few more tricks up our API sleeve:
Clips now give you access to their launch properties: Legato, Launch mode, Launch quantization and Velocity amount.
The new Macro Variations can be fully controlled with Max.
Macro knobs can be added, removed, or randomized using Max.
The Groove pool and all of a Groove's properties are now accessible from Max.
The features we’ve described here don’t come close to covering the full list of small improvements recently added to Max for Live, let alone every new Live 11 feature. For the full lists, see the change logs for Live 11 and Max 8.
For example, did you know that freezing now works for Max for Live, even propagating to the editor?
Next Time….
With the arrival of MPE support and features such as velocity deviation, probability and release velocity, the concept of a note in Live 11 is a new and different beast compared to Live 10. We’ve added a completely new way to interact with MIDI clips in Live 11. Those new features are so extensive that we wanted to dedicate a separate article to cover them, and that’s what you’ll be seeing in Part 2 of this series. See you next time!
Corrections
In a previous version, the text stated that all devices can send and receive MIDI to and from anywhere in Live. However only Audio effects can route MIDI inputs and outputs, and Instruments can only route MIDI output. MIDI effects' inputs and outputs and Instruments' MIDI inputs are necessarily fixed to their device chain.
The MIDI Routing Output device did not correctly contain its Routing and Colors abstractions. This should be fixed in the current download.
Re: the MIDI routing devices – reading the comments inside the Routing object: "Devices can currently have only one MIDI input or output channel (index 0). Only Audio Effects and Instruments support MIDI routings as MIDI Effects routings are fixed."
Just to confirm then, it's not possible yet to have multiple 'midiout' objects in the same device, each with an independent routing assignment, is that right? I'd still have to use 'send' & 'receive' objects in order to route to multiple midi tracks from a single device (assuming each destination track has a 'receiving' device)?
Oh, I meant to comment on that as well, re: errors loading Colors & Routing. I ended up grabbing those (plus a third inside of Routing) from the MIDI Inputs amxd, saved them into my Max library, then dragged the device into Live and replaced it (otherwise, the original device from the session wouldn't know where to look). I'll try to attach it here later (the file's on a different computer).
Great news!! Thanks for your effort on the MIDI side. However, it doesn't seem to find the subpatchers, namely "Routing" and "Colors". I've installed Live 11.0 and Max 8.1.10 on a Windows 10 machine, and added the zip contents to the Max 8 Library folder too.
Anyone got it running on windows ?
edit: Interesting, I do have the Routing subpatcher in the MIDI input example, but not in the routing out. Same goes for Colors.
Concerning the missing abstractions: indeed there was an error in the download that should now be corrected (see Corrections at the bottom of the article). Thanks for bringing it to our attention!
Thanks Mattijs! Just wanted to confirm as well, re: the corrections / my 1st question above:
"...only Audio Effects can route MIDI inputs and outputs, and Instruments can only route MIDI output. MIDI effects' inputs and outputs and Instruments' MIDI inputs are necessarily fixed to their device chain."
Is it true that it would be *one* routing assignment per device? For example, I can have an audio effect send MIDI out directly to another track, but it couldn't send multiple outs, is that right?
This is great work. Really love the tighter integration between Max and Live. I am using the new note format in JS and like that too (though I'm getting a dictionary exceed error when reading clips with more than 123 notes; no problem writing clips, though). Looking forward to the next article in the series.
This is all very exciting! One question though: is sequencing/automating macro variations possible? Or would it have to be done by automating the macros themselves? Have tried a bit using the commands/functions in the LOM but apart from counting, randomizing, saving and recalling the macros, I haven't been able to do much else.
Looks like it's not even possible in Ableton. There are no automation parameters available, and there's also no way to map macro presets to a MIDI controller.
I would say that it should indeed be possible to build a Max for Live device that links the selected macro variation to an automatable parameter, or that triggers it with notes.
I also have a question. I started a new thread about it, but maybe this is the right place to ask. I have a little problem with note modification. It looks like live.object's apply_note_modifications only handles dictionary arrays of notes, but if there is only one note entry in the dictionary, no array is created and live.object puts out an 'invalid arguments' error. Is there a way to format a single dict entry like an array? By the way, the new Max for Live integration is really tight. Never had such good performance. The new note modification and writing process is also working very well.
I really like the new note format with dicts. One thing I can't figure out though is how to use it in JS. It would be very cool if someone could post a simple example what syntax to use when getting the notes into a dict object with for example "get_selected_notes_extended" and then setting the notes back with "apply_note_modifications". thanks
'call apply_note_modifications { "notes" : [ { "note_id" : 1, "pitch" : 10, "start_time" : 0.0, "duration" : 0.25, "velocity" : 100.0, "mute" : 0, "probability" : 1.0, "velocity_deviation" : 0.0, "release_velocity" : 64.0 } ] } ' I don't know anything about JS, but maybe this helps. This is the 'raw' message which can be handled by live.object. This dictionary format is called an array; the square brackets are necessary. More notes can be modified at once: they must be appended inside the square brackets and separated by commas.
Thanks DFW. Found out that "mydict.parse" does the trick when getting the notes and then stringifying dicts before using them to set notes worked out, if someone else is wondering.
Sure, here is an example. It uses the "apply_note_modifications" but it works just the same with "add_new_notes" I assume, if you want to create a new note from scratch. So if you create a string like '{"notes":[{...data for 1 note...}]}' and use dict.parse like in the example it should give you a dict with 1 note inside an array. Yes, I'm also wondering how to create a single note in Max.
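For anyone following along, the shape of that dictionary can be sketched in plain JavaScript. Note that a single note still has to be wrapped in the array, which also addresses the 'invalid arguments' question above. In Max's js object you would hand the resulting structure to a Dict or a LiveAPI call; makeNotesDict is just an illustrative helper name, not part of the API:

```javascript
// Sketch: build the "notes" dictionary that apply_note_modifications
// (and add_new_notes) expects. Even one note must sit inside the array.
function makeNotesDict(notes) {
  // notes: array of objects like
  // { note_id, pitch, start_time, duration, velocity, ... }
  return JSON.stringify({ notes: notes });
}

// A single note, still wrapped in an array:
var single = makeNotesDict([
  { note_id: 1, pitch: 60, start_time: 0.0, duration: 0.25, velocity: 100 }
]);
```

In the Max js object you would then do something like `d.parse(single)` on a Dict, or pass the parsed structure along with the `call apply_note_modifications` message.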
All super duper awesome stuff... Still not possible to create a kind of 'Strip Silence' type function using Max for Live, is there?... Maybe Ableton can implement this instead?... Basically: split/trim clips and create fades on any selected audio (or MIDI) clips that contain silence. It would be a very welcome feature, I think.
Also, it's just awesome to have all these examples as a springboard for some great new devices. Looking forward to finishing off a few of my MIDI effects I've got laying around, half built.
I think that is already possible somehow. You can cut clips via API commands. With the new update, arrangement clips are also accessible and you can read out all the necessary data, I think. It would be some work, but not impossible. It would be a kind of 'realtime' process, though.
I've saved a copy of the [Routing] patcher in my search path, but when I copy/paste the Routing + menu section of the patch into one of mine, the menus are empty. Any idea why?
I decided to look up this article to get information about warp marker access, and accidentally found that the new "Paste From" menu immediately solved a very specific, unrelated issue I had been trying to figure out for the last hour 😅 So much really amazing stuff, great job to everyone working on Max for Live! It just keeps getting better and better.
@chapelier fou, assuming the Routing patch was loaded properly; you do need to have a MIDI in or out object in your device before the dropdown will be populated. Also you need to save the device first before the routing options are properly found. Does that help?
Inside the Routing abstraction is some explanation that might help:
Together with two umenu objects, this patcher provides a device routing chooser. The third inlet expects two arguments:

Argument 1 (symbol): channel type, i.e. which type of channel we want to control. One of the following:
- audio_inputs: the audio input channels of this device
- audio_outputs: the audio output channels of this device
- midi_inputs: the MIDI input channels of this device
- midi_outputs: the MIDI output channels of this device

Argument 2 (int): channel index, i.e. which of the available channels for this channel type we want to control.

One routable MIDI input channel and one routable MIDI output channel are available to Audio Effects. Instruments support one routable MIDI output. MIDI inputs and outputs of MIDI effects are not routable since they are fixed to Live. The number of audio input channels available to a device is determined by the channels specified in the plugin~ object; the number of audio output channels is determined by the channels specified in the plugout~ object. Note that after modifying the midi, plugin~ or plugout~ objects in a device, the available channels will update only after the device is saved.
Ah, could the issue be that you are working in a MIDI effect device? Those have no routable MIDI channels at all, see the explanation in the Routing abstraction. Do you see the same if you copy that bit to an Audio effect device?
So here it is, my first device taking advantage of this, it's an Audio Gater with 6 ADSRs triggered by MIDI notes. Thanks for the support ! https://maxforlive.com/library/device.php?id=7120
Playing with the MIDI routing in audio effects and having a great time!! So many new possibilities.
I noticed there's talk above about only having one MIDI output per device. I've been experiencing an issue where my device works perfectly on one track, but if I put a separate instance of the same device on a different track, the MIDI out for both stops working, even though the two instances never communicate with each other. Is this related to the one-MIDI-output limitation discussed above? Thank you :]
edit: turned out to be some rogue send and receive messages that slipped by me! a simple --- fixed this issue. apologies!
I am trying to get MIDI routing in/out to work in a new patch, starting from the M4L Max Audio Effect template. I get the menus to select my inputs/outputs and the channel select works, but I don't actually get any MIDI data, and when I go to save I get these errors. Any ideas what is happening?
Hi Greg, these errors should be gone in Live 11.0.5. They occur because the MIDI mapping ids are reset every time a device is saved or changes location, which was a bug that is now fixed.
I wonder if it's possible to send data to Arrangement Clips and Warp Markers or if only reading is possible? Those patches are cool but how exciting would it be to be able to create and move warp markers around from Max!
Here's your guide: https://docs.cycling74.com/max8/vignettes/live_object_model If it doesn't say "set" for either entry, then there's no way to do it. This isn't a comment on whether it's an interesting idea or not, obviously - you can always request that this feature be added to the Live Object Model, but that would be a request to make to Ableton rather than Cycling '74.
How do I overcome the "MIDI inputs and outputs of MIDI effects are not routable since they are fixed to Live" limitation? I'm trying to send different MIDI channels inside a drum rack, to have the 16 MIDI channels of a device sent to individual drum tracks. Is it not doable?
Hi Emmanuel, that's correct, at the moment it is not possible to route MIDI to anywhere else than to the device output when you are in a MIDI effect.
Also, in a Max device, Live does not give you access to the original channel info from your controller since the channel data is already used to filter input for MIDI tracks. In other words, regardless of the incoming MIDI channel data, in your Max device the channel will always be received as 1.
Perhaps there is another way to split incoming MIDI data to different tracks depending on their channel. The most straightforward way that comes to mind is to use the channel selector drop-down under the MIDI from header in the tracks' IO settings interface, but perhaps that doesn't apply to your use case?
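For reference, the channel information that Live strips lives in the low nibble of a raw MIDI status byte (for example, 0x91 is a note-on on channel 2). This sketch only illustrates what is lost; inside a Max for Live device the nibble has already been normalized, so messages always read as channel 1:

```javascript
// Sketch: how a MIDI channel is encoded in a raw status byte.
// Inside a Max for Live device this information is already gone
// (everything arrives as channel 1), so per-channel splitting has to
// happen before the device, e.g. via the track's "MIDI From" chooser.
function midiChannel(statusByte) {
  return (statusByte & 0x0f) + 1; // low nibble, as a 1-based channel number
}
```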
Great article and wonderful we can get some proper insight in the new features, the ideas behind them and what you can actually do with them. And nice and tight too! Well done!
I was interested in the MIDI channel routing feature and tried it unsuccessfully, so just to make sure: is it a Live 11-only feature that won't work in Live 10 with Max 8.1.3? Is there no equivalent or similar feature for Live 10? I wish to get the MIDI notes of an instrument on a MIDI channel (one that gets MIDI from a slot and outputs audio) into an M4L audio device chained after it. Thanks!
Hello again Mattijs - After buying Live 11 and enjoying your Midi Routing patch, I have a small question -
When using the menu to select a MIDI source to come into my device's channel: if I select a different channel's MIDI source rather than the hosting channel's, and the hosting channel still has a playing MIDI clip, then the remote source gets mixed with the MIDI from the hosting source, which was not chosen (saying this after testing, but can you confirm?).
I would intuitively expect the MIDI from the hosting channel to disappear, because it was not chosen.
Anyway, do you think I can get MIDI into one device from two different sources, let's say with two sets of the patches you have shared, each designated to a different [notein] object? My motivation is to gate out the unwanted MIDI from the hosting channel and to separate 'routed MIDI' from regular (current channel) MIDI. I hope I didn't go too far.. (-;
I don't think I understand your question. If you'd be able to include a patch with comments that demonstrate what you are after, that would be very helpful.
Aah, yup I get your question now, thanks for taking the time to put this together. Indeed at this time, devices only support one MIDI in and out channel, unlike audio channels, where the index after "audio_inputs" defines which channel you're assigning.
For what you are doing, perhaps there can be workarounds where you use two devices that communicate via send/receives, but within 1 device indeed what you're after will not work, both drop-downs will change the same setting.
I was hoping to be able to switch outputs between notes, but it seems very slow to change so far; perhaps I was being too ambitious there. Multiple outputs per device would be a huge help!
I like the smart workaround, but I can indeed imagine that this is too slow for certain purposes. Perhaps until this feature is available there is another potential workaround, using global sends and receives, i.e. using multiple devices with each their own routing setting that are controlled from one device?
Yeah I have made systems like that before, in this case I realised that the use didn't really need separate tracks, just using an instrument rack and changing the chain selector did what I wanted really, and I realised I had already made a device that did that before lol
With the "midi out routing device" in an m4L instrument device.
when I point a track in the menu, even if the destination track has an instrument device inside, the 16 channels are not in the sub-menu. There is only "Track In".
But in the rest of Live, in each sub-menu of a track you can choose to route midi directly in a chossen channel of an instrument..
To make sure I understand what you mean, here are two screenshots. Live's "MIDI To" option allows routing directly to the Zap instrument, whereas the MIDI output option in the Max device doesn't.
Indeed this doesn't seem to be consistent, I'll file a feature request.
Hi! I'm having trouble with the Routing patch. I created an M4L device and am trying to use Routing to send MIDI notes, but I get an error. I need to transfer notes from my M4L device to a MIDI track.
I have read a lot of forums, but I still do not understand how to solve this. Help me please!
It looks like the `Routing.maxpat` file is not found by your device.
After unfreezing an example device that contains it, you should be able to find this file in ~/Documents/Max 8/Max for Live devices/<Example device name> Project. If you move this file next to your device on disk, your device should be able to find it.
Good news: in Live 12, you will no longer need to use this abstraction, there is a new live.routing object included instead.
@roman: sorry, I don't get the joke. With live.routing, I can only get audio ins and outs in a Max Audio Effect. No MIDI in a Max Audio Effect, and nothing at all in a Max MIDI FX.
In the upcoming Live 12 public beta, live.routing works for me for MIDI in/outputs in audio effects like so:
MIDI in/outputs in MIDI devices are known not to be routable in the current API because only one midi in/output can be routed at the moment and in MIDI effects these are already hard-wired to the Live in- and output.
Indeed I also recently found out that audio outputs in MIDI devices cannot be routed to another audio effect, I'm not sure why that is. I logged it so hopefully this gets added in the future.
@roman, if you would be able to make a list of things that you consider partially working, that would be of great help, we can then compare notes and I can make sure anything that is not yet tracked gets added to the list.
@Mattijs: currently, polling for MIDI inputs or outputs gives me no results at all (in an Audio Device).
And yes, MIDI ins/outs within MIDI devices have never worked, and it's something I wish for so badly. I was hoping [live.routing] would "solve" this. As I witnessed that it's only "partially working" (see my example), I kept that hope. I still do... This would open up so many possibilities.
@chapelier, I think you need to provide an index to live.routing before it outputs anything, see the patch I attached in my previous message. Does that work for you?
Hi all, I think I know what may be the issue here. It appears that this only works if the index attribute is set after setting the port attribute. In the example of Chapelier, since both are triggered with a loadbang/loadmess, the order is undetermined (or rather, determined by which object loads first), probably triggering the index attribute first.
I'll log this in Ableton's bug tracker system, hopefully this can be changed because I agree this is unintuitive. For now, as long as you send the index value after the port value (e.g. with a trigger object), it should work. If not, please let me know.