Articles

An Interview with Jeff Rona

It's easy to get caught up in the excitement of the creative and esoteric possibilities of Max. We've all had that surprise of watching the sun come up while putting the finishing touches on some cool new patch that will rock your world. There's a satisfaction and pride that come with such creatively monumental projects. But sometimes it's the little things that bring long-term gratification. One simple solution that solves a nagging problem can open the flow.

It's easy to forget that Max, like a good Swiss Army knife, can be a practical tool for solving prosaic problems as well. You know, all those mundane little tasks that make you mutter under your breath, "there's got to be a better way…"

I’ve had the pleasure of knowing film composer Jeff Rona since the late eighties, and it has been gratifying to watch his career evolve. [ed: Hard work and talent pays off, kids!] He has an extensive background in sound design with Max, from his innovative laptop performances in the mid-'90s to the complex algorithmic rhythm and texture devices he has used in top Hollywood productions. But I thought it might be interesting to take a look at the practical applications that Jeff has built with Max to solve some basic workflow annoyances in his studio.

Photo: JeffRona.com

I know you grew up in LA. Did you study music when you were young?

I played flute in school and in bands, but by college I was actually studying fine art. I was interested in being a photographer, which was my family’s vocation coming out of Hungary. My father and uncle were both photographers. My other uncle was a very successful painter and sculptor. I grew up really steeped in the visual arts, but always loving music.

I was always fascinated with sound. By the time I was barely a teenager, I was finding little amplifiers and circuits and playing with them and getting them to make weird squelching, squeaking noises and finding that utterly delightful.

When I was in college, two things happened that had an impact on my view of electronics, technology, and music. I’d been working a lot with dancers. I was writing a lot of music for a couple of different dance companies, so I was attending concerts at different theaters around L.A.

I was at a concert and I overheard a guy behind me talking about music and computers, and this was still the early ‘80s. Anyway, he turned out to literally be a rocket scientist, from the nearby Jet Propulsion Laboratory. He developed some of the most sophisticated computer data-analysis systems in the world, but his hobby was getting home computers to improvise music.

So, I turned around and introduced myself, and we became friends for many, many years. He was developing a computer system to improvise music using a desktop computer, at a time when this was virtually impossible, when nothing commercial existed that was anything like it.

I worked with him on the development of a sophisticated real-time computer-music language. That got me introduced to all kinds of interesting people. I spoke at the very first TED conference and created a music system for the conference center that reacted to people’s movements through the space. I ended up spending a few days as a visitor at IRCAM, when David Wessel and Miller Puckette were still there and Max was still just code. That’s also where I met David Zicarelli for the first time.

Where did you go to school?

A small university here in Los Angeles called Cal State University at Northridge. I was studying art, and I was studying a lot of music. I was studying composition, conducting, some electronic music classes, and film.

One of the professors there got a grant to acquire serial number 1 of something simply called “The Dartmouth Project.” The Dartmouth Project was a non-real-time computer-music system based on FM synthesis and a custom algorithmic music language. It was an experimental, one-of-a-kind machine from a group of professors and grad students at Dartmouth who were developing several digital music systems, and I was the first outsider to use it.

They eventually spun it off into a business they called New England Digital, which went on to build the Synclavier.

So I was kind of between the rocket scientist from JPL, this new system, and an analog music lab based on a pair of EMS Putneys. I just got steeped in the idea of building systems, building my own experimental music tools, while I was starting to compose music for dance companies, art-gallery installations, and collaborating with choreographers and visual artists, with everything from tape loops and synthesizers, to fully digital systems.

This was all intensely interesting to me. I got really sucked into it. I had an accidental meeting with somebody from Roland in 1982 who said, “Look, we’re looking for somebody to write software for some of our new, secret projects. Would you like the job?”

I’m not a real computer programmer. I never was. I was just working in these high-level kinds of custom music systems, where other people were doing all the low-level hard coding work. But I took the job. I lied, I took the job, and I learned how to code really fast.

Is that how you got involved with the development of the MIDI standard?

It is. I ended up being the one engineer from all the synthesizer companies who had time to help the engineers who had gotten MIDI started in their labs. What started as about ten people sitting around a table turned into a globally accepted music protocol and a true phenomenon. I did software and instrument design at Roland for four years, and continued to help foster MIDI for three more years.

But in the meantime, my love and passion became creating music. That led me to quitting Roland and working as a synthesizer and drum-machine programmer on a lot of different records, but mainly I was doing synth programming for film composers. Without realizing it, that became my music school on how to score films.

I was working with a number of LA-based composers, including Basil Poledouris, Chris Young, Mark Isham and quite a few others. Eventually I met Hans Zimmer, and he and I ended up having a lengthy and very close working relationship for eight or nine years.

Hans introduced me to directors, and I started scoring projects on my own — I’d been ghostwriting so much for several composers for years. I did a bit with Hans, but he also introduced me to many of the projects that got me started scoring for film and television. Mark Isham also helped me find my own projects, after I had worked with him as a co-writer and programmer on some of his work.

You worked with Ridley Scott as well, didn’t you?

As a writer, I wrote the score for White Squall for Ridley, which was a big hit overseas. Not so big in the U.S.

Coincidentally, I just started a movie this month for the screenwriter of White Squall. He’s doing a terrific movie called The Phantom, with Ed Harris. I think it’s a very cool movie, and I’m just getting started on it. It’s a very electronic score, very unlike the White Squall score, which was mainly symphonic.

I worked with Ridley and with Hans on Black Hawk Down as well.

Didn’t you work on Gladiator?

I did work on Gladiator, a little bit. I worked on Mission: Impossible 2, and on Toys with Barry Levinson. I was the sole composer on his television series Homicide: Life on the Street. I did Chicago Hope and Profiler.

In fact, I did a whole lot of TV for several years, including documentaries and TV movies. Then I took a break to do other things — films and a few video game scores. I just finished a project that’ll be out later this year called The Ropes, working with Vin Diesel’s production company and Fox. I did another Fox series last year called Persons Unknown, which ran for one exciting season. [Laughs]

How is scoring for a movie different than scoring for a television show?

It’s only different in a couple of ways, really. It’s not that significant. At the end of the day, music is music. So aesthetically, there typically isn’t a big, big difference.

With a film, you have to do it right the first time. You do it once. You want to give the movie a voice, you want to give it an emotional pitch, but you have to do it the first time. Whereas with a television series, you’re trying to sustain it over an arc that, for the audience, plays out over many months. So there needs to be something familiar about it without being repetitive.

You’re also under a much tighter deadline in television — you have hours instead of days, and days instead of weeks to do what you’re doing.

I know you collaborated with Lisa Gerrard on some projects, composing and performing as well.

I wrote a song with Lisa for Mark Pellington’s next film, Henry Poole Is Here. A very lovely, moving film. I had worked with Mark previously on his film, The Mothman Prophecies.

Lisa and I also did a beautiful movie for the Smithsonian called A Thousand Roads. And I ended up doing a tour with Lisa and her band, Dead Can Dance, along with a full orchestra. I was also part of Jon Hassell’s band for a couple of years, and produced a couple of records with him and Brian Eno, working on some of his cool, ambient art-rock. So it’s been pretty eclectic for me.

Recently I’ve been collaborating with this very interesting British poet by the name of David Whyte, who’s become very well known around the world. And we’ve done two albums of spoken word with music. The second one just came out a couple weeks ago.

When did you first start using Max?

I go back to Max when it was still at IRCAM, so when was that?

The commercial release from Opcode was 1990.

I started using it more consistently not long after that. At first, just for my own little experimental projects. I used Max a bit when I was working with Jon Hassell. Not on stage, but in the studio to process some of the samples and loops. I started my own group called the Luxurious Brothers, with a guitar, electric cello, bass, and myself on laptop running a completely Max-based system. Sadly, it was short-lived due to the death of my partner in that ensemble. But I began to realize Max could do some things that no one else was doing yet.

This was in the mid-‘90s, still pretty early days for doing things live on a laptop. But I wrote my own live performance systems in Max, to be able to do what you can now do in a lot of sequencing and loop-oriented programs. But I was doing it in a more free way, I think. I would take all kinds of music — from loops and ambient compositions I did myself to all kinds of rhythmic world music — all at various tempos, and I would not only beat-match them but started to beat-slice them, and make each slice have its own little world of pitch and time and filter, direction and gating. Impossible back then except in Max. Even next to the tools available now (granular synthesis, which hadn’t really come aboard yet, beat slicing, beat matching, time compression and expansion), it still had its own unique sound and approach. You can hear some of it on Jon Hassell’s City: Works of Fiction album and in the score to the film Traffic.
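[ed: For the curious, here is roughly what that kind of per-slice treatment looks like in code. Jeff's system was a Max patch; this is just a minimal Python/NumPy sketch of the concept, and every slice setting below is invented for illustration.]

```python
import numpy as np

def process_slice(s, pitch_ratio=1.0, reverse=False, gate=1.0):
    """Give one beat slice its own pitch, direction, and gate level."""
    if reverse:
        s = s[::-1]
    # naive pitch shift by linear-interpolation resampling
    positions = np.arange(0, len(s) - 1, pitch_ratio)
    s = np.interp(positions, np.arange(len(s)), s)
    return s * gate

# toy "loop": one second of a 440 Hz tone at 44.1 kHz
sr = 44100
loop = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)

# cut the loop into eight beat slices, then rebuild it with
# per-slice pitch, direction, and gating
slices = np.array_split(loop, 8)
settings = [
    dict(pitch_ratio=1.0),            # unchanged
    dict(pitch_ratio=2.0, gate=0.5),  # up an octave, quieter
    dict(reverse=True),               # played backwards
    dict(pitch_ratio=0.5),            # down an octave
] * 2
out = np.concatenate([process_slice(s, **p) for s, p in zip(slices, settings)])
```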

You know, Max really allowed some of these things to happen way before any customized tools existed: the tools for DJs that do beat matching of audio files, or the granular-synthesis programs that allow taking a piece of audio and slicing and dicing it down to the wave or into smaller chunks, or beat slicing like you can do in Ableton Live.

None of that existed yet. But I was doing it live, blending all kinds of effects and multiple streams of information, and doing it with just a laptop, a little mini keyboard and one little external filter box for some weird effects. And the best part, still, is the ability to create a music environment that is unique. When we all use the same tools, we can start to sound alike. But in creating custom tools we can add distinction to our musical approach and sound. That is important to me.

When did you start integrating Max with your film projects?

In 2000, I was collaborating with Cliff Martinez on the score to the Soderbergh film Traffic, and Cliff said, “Well, you know, we are really afforded the opportunity to be pretty abstract with the music.” And you know, Soderbergh is a huge fan of Eno — that’s always an inspiration for him.

So, I wrote a Max program to start generating a lot of the rhythmic elements of the score to Traffic, and then gave it to Cliff to integrate into some of the synthesizer parts he was doing. We built some cues off of the rhythms, and some cues off of more synth work. But what was going on in Max became very elemental to a lot of that score.

I had a system with three sample players, each in sync but running different processes on the sound. I took incoming MIDI Clock from the main sequencer and used that to lock Max to the rest of the music. Each player was able to slice the audio material and replay it with a lot of processing control. Slices could be played in an order based on tables and multiSliders. Each slice got treated with pitch control, a set of filters for light or heavy processing, panning, enveloping, reversing, changing rhythms, and a lot more. I liked the effect of slowing the audio way down but keeping it in sync with the tempo while grunging up the sound a bit. The three loops were further processed in Max for final EQ, delays, and some granular effects as desired.
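[ed: The patch itself is Max, but the clock-following logic that keeps a slice player locked to the sequencer is easy to show in code. Here is a minimal sketch in Python using the mido library; the port name and slice-order table are made up, and play_slice() stands in for all the per-slice processing Jeff describes.]

```python
import mido  # pip install mido python-rtmidi

TICKS_PER_16TH = 6  # MIDI Clock runs at 24 ticks per quarter note

slice_order = [0, 2, 1, 3, 0, 0, 5, 4]  # illustrative table of slice indices

def play_slice(index):
    # stand-in for triggering slice `index` with its own pitch,
    # filter, pan, and envelope settings
    print(f"trigger slice {index}")

ticks, step = 0, 0
with mido.open_input("IAC Driver Bus 1") as port:  # hypothetical port name
    for msg in port:
        if msg.type == "start":    # sequencer started: reset position
            ticks, step = 0, 0
        elif msg.type == "clock":  # one of 24 ticks per quarter note
            if ticks % TICKS_PER_16TH == 0:
                play_slice(slice_order[step % len(slice_order)])
                step += 1
            ticks += 1
```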

We were also working with the legendary guitarist David Torn, who was providing these awesome textures. To make some of the textures take on a slightly more rhythmic element, I was putting them into Max along with other sorts of far-flung sources for building rhythms.

When I start a film, I like to spend a certain amount of time developing a palette, a sonic palette that becomes the orchestra that I write with. Whether it’s a fully electronic score or just the electronic elements, I like to start with fresh ideas.

In Max, I’ll go through sounds, either patches that I’ve created or patches that other people have shared that I’ve modified, and use them to create some interesting basic ideas. I then sample the results back into Logic or Pro Tools. For me it’s much easier to integrate the elements into a score in the form of samples, versus having it all run in real time while I’m writing to picture, which involves incredibly precise shifts in tempo and bar meter.

I see you use an Avid/Euphonix controller.

I do. I got it less for the faders, more for the programmable touch screen. I love touch screens.

Over on the side of my mixing desk, I have another little touch screen I bought on eBay for 50 bucks. It’s plugged into a Mac mini running a Max system I created to help run parts of my studio.

My current system is based on four computers. I have one Mac Pro running Logic. I have another Mac Pro just for my orchestral samples. I have a third Mac Pro dedicated to Pro Tools, which is how I monitor and record my music; it’s also the driver for the video. And then I have this Mac mini, with the eBay touch screen, driving this custom Max patch that I did.

Everything’s connected over ethernet for sync and MIDI. Over on the side of my writing table I have a small mixing console, which is really just a volume knob for the room, but it’s all in 5.1 surround. So I have a fader group for my surround music mix, a fader for dialogue, a fader for my click track, and a fader for any temp score they’ve put into the video that they might want me to use as a reference.

So that’s my system. It’s four computers, linked through MIDI over ethernet, and audio via Light Pipe.

How are you using the eBay touch screen with Max?

I did a movie coming out this year called The Tortured, working with Darren Lynn Bousman, who directed some of the Saw movies. There were some sounds that I wanted to do which involved being able to have a lot more control over my plug-in synths than I typically have.

I wanted a way to control multiple parameters of a plug-in simultaneously. So I started off by creating a couple of these x-y pads in Max, so that I could be controlling filters or any two paired parameters. I added presets with dropdown menus so I could choose to control tremolo speed and depth together, or filter cutoff and resonance together. Then I created another pad dedicated to pitch bend and mod wheel, so I could control a sound bending up and down while modulating the filter. It allowed me, very quickly, to create some weird, seasick noises. Very handy. Very quick.
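[ed: In MIDI terms, one x-y gesture just becomes two paired controller streams. Here is a hedged sketch of that mapping in Python with mido; the CC numbers are typical but synth-dependent, and the port name is hypothetical.]

```python
import mido  # pip install mido python-rtmidi

# each pad preset pairs two plug-in parameters on one x-y surface
PRESETS = {
    "filter":  (74, 71),  # cutoff / resonance (common, but synth-dependent)
    "tremolo": (92, 95),  # hypothetical speed / depth assignments
}

out = mido.open_output("IAC Driver Bus 1")  # hypothetical port name

def xy_pad(x, y, preset="filter", channel=0):
    """Map one touch position (x and y each 0.0-1.0) to two paired CCs."""
    cc_x, cc_y = PRESETS[preset]
    out.send(mido.Message("control_change", channel=channel,
                          control=cc_x, value=int(x * 127)))
    out.send(mido.Message("control_change", channel=channel,
                          control=cc_y, value=int(y * 127)))

xy_pad(0.8, 0.3)  # one gesture moves cutoff and resonance together
```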

I used the pictslider object for the XY pads. I created some backgrounds in Photoshop and imported them into the pictsliders. I played with adding grids or circles, but simpler was better. Regardless, it's great to be able to customize the look of an object so much. I look at this thing all day, so I spent a little time making it look nice.

Then I added a little sequencer: there’s MIDI Clock coming back into a multiSlider that lets me create little rhythm patterns I can assign to filters or panning, so that I can do some cool ‘sample-and-hold-y’ type stuff.

So, you’re using it mostly for plug-in control?

Actually, where it became even more important was for film mix monitoring. I was always having to reach over to my monitor mixer, which I don’t sit in front of most of the time, to adjust the level between music and dialog. I don’t really want to have to keep moving around so much to make small adjustments.

So in Max I’ve created these little red faders that give me control over the level of the music in the room, the dialog, any temp score and the click track. As I move this fader on the screen, it’s actually moving the physical faders, in sync, on my mixer. Then I have all these mutes to turn things on and off without changing levels.

In terms of operating my dialogue and temp music, I’m having to scale some things a little bit, scaling MIDI from the touch controls so that it has the right usable ranges, and there are a few little tricks behind it. But it’s relatively straightforward.
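[ed: The scaling he mentions is just a linear range map, which is the job of Max's [scale] object. In code it is nearly a one-liner; the example values below are invented.]

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linear range mapping, like Max's [scale] object."""
    normalized = (value - in_lo) / float(in_hi - in_lo)
    return out_lo + normalized * (out_hi - out_lo)

# e.g. confine a touch fader's full 0-127 sweep to a usable monitor range
level = scale(96, 0, 127, 40, 110)  # -> about 92.9
```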

So, some of this Max patch is controlling the mixer and the rest is controlling Logic. I use it for some transport control and to jump to markers. I’m splitting things out to go to different ports. Basically, I needed to use my Avid controller for other things, and this ended up becoming a huge time saver.

Then I realized how to take the idea further. When directors come by, I hate it when they’re sitting on the couch in the back of the room and I’m sitting up front working all the faders like the Wizard of Oz. I don’t like that disconnect; I want to sit in back with them and watch the show.

So, in comes the iPad. Using TouchOSC, I wrote this little program I call Director’s Chair. First, I have to say: I loved working with OSC for the first time. It's great to have actual descriptive names for actions, and to be able to define everything I need for a function. But, of course, this had to be converted into MIDI for Logic control — there would be no other way to do this except through Max.
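[ed: The heart of that conversion is an OSC-to-MIDI bridge. Here is a minimal sketch of the same idea in Python, using the python-osc and mido libraries rather than Max; the OSC addresses, CC numbers, and port name are all invented for illustration.]

```python
from pythonosc import dispatcher, osc_server  # pip install python-osc
import mido                                   # pip install mido python-rtmidi

midi_out = mido.open_output("IAC Driver Bus 1")  # hypothetical port name

# made-up OSC addresses for two TouchOSC faders, mapped to CC numbers
FADER_TO_CC = {"/music": 7, "/dialog": 8}

def fader_to_cc(address, value):
    """Translate one TouchOSC fader move (0.0-1.0) into a MIDI CC."""
    midi_out.send(mido.Message("control_change",
                               control=FADER_TO_CC[address],
                               value=int(value * 127)))

d = dispatcher.Dispatcher()
for addr in FADER_TO_CC:
    d.map(addr, fader_to_cc)

# listen for the iPad; 8000 is TouchOSC's usual default outgoing port
server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 8000), d)
server.serve_forever()
```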

I wrote a little Max patch. It’s relatively simple. It takes commands from my iPad and sends them either to my mixer, to Pro Tools or Logic, depending on what I need to do.

For example, I can sit in the back of the room and run the show: I can hit Play, and I can adjust the level of the music and dialogue. One of the things you’re forever doing when playing music back for a director is adjusting levels; the mix between the dialogue and the music can make or break the meeting. How well they like the music sometimes depends entirely on how nicely it blends with the dialogue and sound effects.

The monitoring on playback can get tricky because, as I’m writing, I have to print all the cues close to full volume. But now I can just sit back there with my director, hand him my iPad and say, “OK, look. Here’s your dialogue. Here’s my music. You mix the movie.” And they love it!

I’ve added provisions so that I can create and jump to markers. He’ll ask, “Well, can we go back to such-and-such a spot in the video?” And I can say, “Of course.” I can just go marker to marker, and as I jump markers, of course, the video jumps with it.

I set markers for those key moments in the scene where I know he’s going to want to go. I’ve been doing this long enough to know where these directors will go: “Play it from where he walks out of the room.” I can do it hands-free now, controlling Pro Tools, Logic and my little room mixer, all by remote control. And I did it all in about an hour and a half of Max programming.

That’s very cool. I was thinking we should start a ‘practical tools with Max’ column on the Cycling site. As opposed to the sound design and more esoteric stuff.

That’s kind of the thing about this: I solved a really boring, routine aspect of something that anybody who’s mixing, who’s composing music to picture with dialogue, needs to do. And I’d never found an elegant solution before this. I’m not saying they aren’t out there. I’ve just never found one — it’s not like I haven’t looked.

There are really expensive ways of doing it. But this was incredibly simple. This is all off the shelf. It’s just a Mac and Max. I see that Lemur is now coming out for the iPad. I’ve looked at it, and I’ve read about it a little bit. It’s pretty amazing, but I’m still not sure it can mix OSC and MIDI through various ports the way this system does.

But, the fact that you had a challenge and could design a solution there on the spot. And didn’t have to spend money!

Exactly! The fact that it was so easy for me to take a very small part of Max and solve just this has actually made a huge difference in how I work.

I would love to get Max running on a couple of computers. I think we can take it much further, integrating more parts of the studio in ways that off-the-shelf tools just don’t. And I think my studio is extremely similar to a lot of other people’s.

Your $50 touch screen fascinates me.

Some guy had a dozen of them that came out of a bank. Legally, I hope. That’s all I know. It has no brand on it. It doesn’t even have a back. It’s just the electronics hanging out. But I wanted to get the smallest touch screen I could find at the time.

At this point, the part that’s doing all my little synthesis tricks I could probably now do in TouchOSC or similar apps. But I don’t think there’s any way to replace what I’m doing here, combining on one screen all of my cool expressive controls with all of the studio automation that makes my life go much easier. And the Avid controller won’t even come close on its own.

I guess this is the era of the customizable controller, isn’t it? Especially as touch screens become more affordable.

I definitely see an iPad replacing my custom touch screen, but I don’t think I’m going to replace Max. In this case, in a way it’s sort of like a traffic cop. It’s taking in information, doing a modest amount of processing, but it’s mostly just a kind of a very smart router.

The controller actually works in two directions. It sends messages to various places, but it also gets MIDI Clock from my sequencer, Logic in this case. I use that clock to drive a 16-step analog-style sequencer that sends out MIDI control messages I can use for cool time-based effects. This allows me to do things from my touch screen that the plug-ins themselves don’t do. By dividing the incoming MIDI Clock into varying rhythm durations, I can create all kinds of cool little rhythm patterns out of any accessible parameter I can think of, on any plug-in. It gives me that much more control.
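[ed: That clock-divided step sequencer is simple enough to sketch. The real version is a Max patch fed by a multiSlider; this Python/mido approximation shows only the core idea: count clock ticks, advance a 16-step table, and send each step out as a controller value. The pattern, division, port names, and CC number are all invented.]

```python
import mido  # pip install mido python-rtmidi

# 16-step pattern, like values drawn into a multiSlider (0-127 each)
steps = [0, 32, 64, 96, 127, 96, 64, 32,
         0, 16, 48, 80, 112, 80, 48, 16]
DIVISION = 12  # ticks per step: 12 of a 24-PPQN clock = eighth notes

out = mido.open_output("IAC Driver Bus 1")     # hypothetical port name
ticks, step = 0, 0
with mido.open_input("Logic Clock") as clock:  # hypothetical port name
    for msg in clock:
        if msg.type == "start":
            ticks, step = 0, 0
        elif msg.type == "clock":
            if ticks % DIVISION == 0:
                # stepped, sample-and-hold value onto e.g. filter cutoff
                out.send(mido.Message("control_change", control=74,
                                      value=steps[step % len(steps)]))
                step += 1
            ticks += 1
```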

And now you’re applying your film audio chops to the gaming industry?

I’ve been getting involved in some video game scoring. I did God of War III for Sony, which came out last year. And I’m now doing a video game in China, for one of their massive online games that they claim gets played by up to 500 million people.

I did use my Max work, because they brought me these rendered scenes. It’s not like a movie in this case, because they’re just showing me renders, so there’s no time scale. But there is dialogue and effects. Plus, we did a trailer for them before we worked on the game.

The video game world does have a whole different set of rigors, for sure. But creatively, music is music, you know. Dark is dark, light is light…

Jeff would love to hear from you. Questions and comments can be sent to: jeff@jeffrona.com

Jeff's Website

Text interview by Marsha Vdovin and Ron MacLeod for Cycling '74. Video by Jeff Rona with additional editing by Ron MacLeod.

by Marsha Vdovin on March 6, 2012

liefellis:

Great interview! Love the intimate look at the setup and the practical applications that Jeff has managed to mold Max to. I think film scoring programs could benefit from teaching students how to utilize Max in this way. Great job.

Amy Knoles:

Noice Jeff!! Great to see your mug again... :0

andremartins:

I have been a huge fan of Jeff's since the early days of his monthly column at Keyboard Mag. He knows it better than anyone else. Great interview, thanks!!!
