My Perspective on Integrating Max and Live


Nine years ago, Robert Henke told me about the edit button.

Robert was in Anaheim, giving amazing Live 1.0 demos non-stop at Ableton’s first NAMM booth, and before the last day of the show, we were chatting in the topiary-enhanced parking lot of Stovall’s Inn. Having used Max to prototype some of the first effects included with Live, Robert told me he wanted to be able to reprogram his effects on the fly, without stopping the music, just the way everything else worked in Live. At that time, the reality for Robert would have involved translating his revised patch into a C program and rebuilding a new version of Live. This was not exactly the real-time development cycle he was used to as a Max user.

At the time, I told Robert I thought his idea was cool and that we should make it happen. But then I began to immerse myself in the details. Would we have to shoehorn the entire Max environment into Live? If not, how could you edit a patch without hearing it? None of the alternatives seemed terribly attractive. Brooding quickly set in. Fast forward nine years. After a lot of negotiating, specifying, and programming, Robert’s dream is becoming a reality. It seemed appropriate to reflect on this newest evolution of our software, and why it has caused my mood to brighten considerably.

[Image caption: The Max device (Degrader) comes with an edit button, unlike the built-in Live device (Erosion).]

From Ableton’s perspective, Max is the meta-feature. Live’s limits are now your imagination’s limits. (Who knows, maybe they won’t need to add any more features!) But what does Max for Live mean for Cycling ’74 and our users? And why did we want to integrate Max into another piece of software?

I’ll answer the second question first.

As someone whose primary career interest has been software user interfaces, I have to say at the outset that Live has been an ongoing inspiration since I first saw it. The important thing for me is Live’s recognition of fluency as a fundamental goal in an interface for creative work. Particularly with version 5, we’ve tried to incorporate lessons from the Live interface into Max.

After Live appeared, it soon became clear that I wasn’t the only one who was impressed. Live has become a preferred tool of many Max users. Live’s performance orientation attracts a community similar to ours. And here’s an insider tip: Ableton’s MIDI, plug-in, and ReWire implementations were always the most stable we dealt with, and we actually knew people who were using the two programs together with success, on both Mac and Windows. That meant we could imagine that an integration project would have a good chance of actually working when we were finished with it!

Ultimately, it came down to this: my Cycling ’74 co-workers and I have come to believe the unique thing we have to offer the world is fundamentally about programming. In other words, we want to make edit buttons, and if we can put them in places where they have never existed before, all the better. It was clear to me that Ableton understood what it meant to have the Max environment work with their software. They weren’t just talking about more plug-ins.

We’ve been working with Ableton for more than two years to bring Max and Live together. From the outset, our goal was to create the concept of a dynamic Live device that would make the application itself seem editable. The result is not just another plug-in specification but an entirely new kind of workflow that manages to combine the interactivity and fluency of both applications without compromising anything.

Working on a complex task with another company separated by over 5000 miles and a nine-hour time difference has been an interesting challenge. Time and distance were not the only issues, however. Even though we respect each other’s software tremendously, the cultures of Ableton and Cycling ’74 are, within the narrow confines of audio software companies, pretty divergent. I suppose I should be careful in making comparisons between the two organizations, but I think it would be safe to say that Cycling ’74 operates in a manner that, by comparison to Ableton, could be characterized as complete and utter chaos. Yet for me at least, the experience of getting to know another company and its people has been intensely rewarding. After December 2007, when Max for Live was first demonstrated (and yes, of course, it crashed!) at an Ableton company meeting, the Cycling ’74 office began to receive requests for Max authorizations from Ableton employees. That was an encouraging sign for me that maybe we were on to something. For the past several years, we have actually managed to infiltrate the Ableton office in Berlin with one of our developers, Jeremy Bernstein. In retrospect, even with all the other pieces of the puzzle falling into place, it’s hard to imagine how we could have accomplished this task without Jeremy being in the right place at the right time.

Even Max users who never end up using Live have benefited from this project. In addition to some of the Live-influenced changes we made to the UI design, there were features we developed for Max 5 specifically to address the challenges of Live integration. For example, given the size constraints of the Live device view, we needed a better method for displaying a compact interface that wouldn’t distort the logical structure of a patch. The result was presentation mode, which turned out to be a dramatic improvement in UI design for any patch. The task of integrating Max into Live has already prompted a number of innovations within the Max environment, and I can confidently predict more will be forthcoming.

Finally, I want to leave you with a Max-centric perspective on what this project represents.

The most obvious thing you are probably seeing as a Max user is the ease with which you can get your Max stuff into Live, as well as share it with a new user community. But that is not the whole story. Instead of thinking about what Max is going to do for Live, think about what Live is going to do for Max. With this integration, a programming environment has just gained a set of powerful composing and performing tools. In Music-N terms, Max supplies the orchestra while Live holds the score. The “score” however is not just MIDI notes. It can be audio, triggered and manipulated in all the sophisticated ways Live provides. Or it can be automation, drawn inside Live and fed to Max as sample-accurate audio-rate ramps if you like. It’s equally possible to work the other way, where Max represents the score and Live represents the orchestra.

Those are just the raw capabilities. The real magic happens when you see how it can all work together. Because we started with the requirement to support dynamically changing devices, your “score” and your “orchestra” will evolve together seamlessly. For example, if you edit a device and add a parameter to it, you won’t lose the automation data you’ve already created for the device’s existing parameters. Then there is something we have been calling preview mode. Preview mode pipes audio, MIDI, automation, and timing from Live to Max (and back to Live) while you are editing your device. The result is a sound design process that feels completely integrated from the highest to the lowest level in a way nothing else has before.

Once you experience this integration, I think you will see how it has the potential to change the typical usage patterns of both applications. Max is the ultimate workaround for out-of-the-ordinary things you need to do in Live, while Live supplies the sampling and granular audio triggering Max users often find themselves constructing. Our new Live-inspired Max UI objects, with their effortless parameter management, tie everything together, and then you save it all into a single document, ready for tomorrow’s creative explorations.

That, in a nutshell, is why I like edit buttons.



Yeuda Ben-Atar
June 12, 2012 | 10:39 pm

(3 years later)

Wow… you were so right…

