An Interview with Owen Pallett
I’m a sucker for well-trained musicians who push the boundaries creatively and aren’t afraid to experiment. Owen Pallett really blows me away with his talent and where he takes it. Classically trained on violin and piano, he has backed the ‘indie band of the moment’ Arcade Fire and is now performing solo, opening for bands such as The National, with a combination of looping, live keyboard and violin. His album Heartland was included in NPR’s Best Music of 2010.
Did you grow up in Canada?
I mostly grew up on a small farm outside of Toronto. My parents split when I was very young and my father moved around a lot, and eventually settled down in rural Québec.
What was your first musical memory?
I had a little Fisher-Price tape deck and I’d carry it around with me everywhere. I had tapes of Bach’s Double Violin Concerto, Purcell’s Abdelazer, Holst’s The Planets and Bartók’s Music for Strings, Percussion and Celesta. I still have the tapes, but they’re worn out; I listened to them constantly.
Do you have formal musical training?
I took violin and piano as a kid, as well as music theory, harmony, counterpoint, composition and analysis. I went to university. But the thing I benefited from most was teaching myself music notation software – Encore in the 90s, Sibelius in the 00s.
So the violin was your primary instrument all along.
My older brother played the cello and so I wanted to start the violin. I took one year of Suzuki at age 3 but my mom decided instead to take me to a more traditional Ukrainian-Canadian teacher, with whom I studied until my mid-teens. I did great in competitions as a kid but also had some serious fundamental technical issues – a bad bow arm and a sideways vibrato.
When I started playing in a youth symphony at age twelve, my ‘sheltered’ musical education and my technical issues made me an easy target for an abusive conductor. So I’d skip out of rehearsal and would instead go to the University of Toronto music library and look at new music scores and listen to CDs. I quit at age fourteen and started focusing on writing music instead. I also played in some high school rock bands.
In my mid-teens I took up country, Irish and Renaissance music and started playing in bars and Renaissance festivals for cash. I continued with this when I moved to Toronto. I was accepted to the composition program at University of Toronto on scholarship, and paid my rent by busking on street corners. I started playing classical violin again, worked hard and corrected my bow arm; I still play with sideways vibrato.
How was the music scene in Toronto back then?
Though Toronto has some fabulous composers and teachers, the new music scene is weak compared to places like New York, and I was fairly disillusioned by graduation. I was more interested in making ‘pop’ music, and started writing and performing some schizophrenic finger-style guitar songs under the name Les Mouches. I joined several bands, including the Hidden Cameras, Picastro, Internet, and The Jim Guthrie Quintet, as well as doing session work and side gigs with Arcade Fire, Royal City, Jane Siberry, October Browne and many others.
At the time, the Toronto music scene was in a pre-Broken Social Scene state of ecstasy. Unknown and awesome bands would play to hundreds of people. There were huge amounts of cross-pollination between the art scene and the music scene. I was playing seven shows a week, but I was broke, starving on my feet, and I took a day job at the CBC [Canadian Broadcasting Corporation]. I quit all my bands, save a solo violin looping project I’d been working on that I called Final Fantasy.
Arcade Fire, who had supported the Jim Guthrie Quintet on our Canadian tours, and with whom I’d done work on their album Funeral, asked me to support their first full US tour in early 2005. I quickly slapped together the first Final Fantasy album Has A Good Home and went out on tour with them, with the encouragement of my employers at the CBC. Six months later I had received so much attention for Final Fantasy that I quit my job and started making music full time.
How would you describe what you do?
Over the last five years I’ve been balancing composition, film scoring and pop arrangement gigs with my own solo performances. I’m focusing now on solo performance. My solo show is a looping show–vocals, violin and some synths–where I endeavor to make the process visible to the audience, keeping it as much a part of the experience as, say, the lyrical or musical content.
When did you get into Max?
I learned Max in late 2007 in an effort to expand my looping process beyond the capabilities of guitar pedals and rack-loopers. I built and performed several ‘multi-phonic’ looping shows, where I tracked up to sixteen channels of violin using the free program Sooperlooper and routed the audio to a speaker array that surrounded the audience. I programmed several choreographed scenes where the channels of audio would move and dance around the listening environment. To do this, I used some objects that were created according to the principles of Ambisonics, which unfortunately no longer work in Max 5. This setup was inspired by a Janet Cardiff installation.
The setup of these Ambisonic shows was extensive and I found myself existing more as a programmer than a songwriter. Furthermore, although the techniques of these shows were impressive, I found that the performance bore more resemblance to a Bose surround sound in-store demo than a “concert”. [laughs.] So I simplified the setup, and now run multiple loops through amplifiers onstage to create a “rock band” feel – “violin 1” goes through the Princeton, the pitched-down “bass” patches run through the bass rig.
How did you learn Max? Was there a learning curve?
I just sat down with the tutorials and crammed. It was Christmas holiday, so I had a week off. By Day 3 I had my patches up and running.
I can’t speak for people who get into Max through a class, but for me, the learning curve was steep. Seeing as I went into Max with a specific objective, I naturally wanted to learn only the tools I needed to accomplish my desires. I still have some inefficient “route 0 … 127” objects in my patches which just look nutty. [laughs.]
Max can be both easy and very difficult. It’s easy to explore, but can be challenging to tune and debug. Seeing as I’m a better cook than a programmer, it didn’t come particularly naturally to me. The best resources, of course, are the forums on the Cycling ‘74 website. Searching those allowed me to find all manner of patching solutions.
Is Max part of your current work?
Max is currently being used strictly to interpret MIDI data and control Sooperlooper. It makes my stops, starts and undos all ‘click-free’ by dampening and scaling every action. For example, when I press “Stop”, rather than send a MIDI command to Sooperlooper, Max receives the command, reads Sooperlooper’s current volume as x, instantly scales Sooperlooper’s wet volume down to 0 using an OSC command, mutes the channel, then resets Sooperlooper’s volume to x. This makes my stops and starts entirely click-free. I’ve programmed similar commands to relocate my violin sound to different channel configurations. Clicks while doing live looping will be recorded and repeated, so it was important to me to make these transitions as smooth as possible.
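The click-free stop he describes can be sketched in ordinary code. This is a reconstruction for illustration, not Owen’s actual patch: the `/sl/<loop>/...` addresses follow Sooperlooper’s OSC convention, but treat the exact paths and control names here as assumptions.

```python
def click_free_stop(loop, current_wet, ramp_steps=8):
    """Return the ordered OSC messages that replace one MIDI 'Stop' press."""
    msgs = []
    # Scale the wet volume down to zero in quick steps, so the stop
    # itself never records or plays back a click.
    for i in range(ramp_steps, -1, -1):
        msgs.append((f"/sl/{loop}/set", ["wet", current_wet * i / ramp_steps]))
    # Mute the now-silent channel...
    msgs.append((f"/sl/{loop}/hit", ["mute"]))
    # ...then restore the stored level x, so the next start comes back
    # at the volume the loop had before the stop.
    msgs.append((f"/sl/{loop}/set", ["wet", current_wet]))
    return msgs
```

A scheduler would send these messages in order; the single incoming MIDI command never reaches the looper directly.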
Although all of these functions are available in Ableton Live, I find that Ableton’s MIDI implementation is too ‘user-friendly’ and doesn’t allow enough detail… not to mention the limitations of Ableton’s non-clocked Looper function. So I’m sticking with Sooperlooper + Max, a great combination.
Do you extend the ‘live-loop’ technique to your recorded projects as well?
Given the live-looped nature of the live show, my albums are exercises in re-interpretation. Almost all the songs I write are written with the live show in mind. So, on record, I’ll attempt to re-envision these songs in other formats. One record I made was recorded entirely out-of-doors. Another was with a big band in a big studio. Another was a bunch of songs revised for string quartet. Another was meant to function as a symphony.
Do you have a main patch that you can share with us and explain?
I play the violin and use simple MIDI foot controllers to control a multiphonic live looping setup. Sawing away, I need a rig that I can use without a keyboard or a screen. One that I can just turn on and go, and control with only my feet.
I settled on using Sooperlooper as my looping base. It’s PWYC [Pay What You Can], its functionality is fantastic, and Jesse’s support is legendary. The fidelity is determined by the quality of your sound card. Furthermore, Sooperlooper, which uses the Echoplex model of looping, is much more musical, to my mind, than the ‘lock-to-tempo’ model of some commercial loopers.
In order to seamlessly transition from one ‘state’ to another, whether it’s a delayed violin in the back of the room or a gigantic wall of distortion across the front, I needed to find a way of better informing Sooperlooper of the specifics of what I wanted. Changing a single MIDI command into a smooth series of MIDI commands. That’s where Max came in.
At first, I was just doing MIDI > MIDI, but then I switched to MIDI > OSC. As I learned more about Max, I was able to develop the Max patch so it would simply set up all the necessary software upon booting. Now I can run a six-channel, 400-second-limit looping rig off a Mac Mini, with all the flexibility of OSC control, but using the simplest foot controller. A press of the button has my violin oscillating round the room, orchestrating fades between loops, rate adjustments, whatever. It’s wonderful.
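The “one foot-press becomes many messages” idea might look like this in code. This is my sketch, not Owen’s patch, and the `pan_1` control name and addresses are assumptions rather than confirmed Sooperlooper API:

```python
import math

def pan_sweep(loop, duration_ms=2000, steps=16):
    """Expand one button press into a timed schedule of pan messages.

    Returns (time_ms, osc_address, args) tuples that sweep the loop's
    pan position around once; a scheduler sends each at its time.
    """
    schedule = []
    for i in range(steps + 1):
        t = i * duration_ms // steps
        # Sine curve takes the pan from center out to each side and back.
        pos = 0.5 + 0.5 * math.sin(2 * math.pi * i / steps)  # pan in 0..1
        schedule.append((t, f"/sl/{loop}/set", ["pan_1", pos]))
    return schedule
```

The foot controller only has to emit one MIDI note; the patch does the choreography.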
The patch itself is so simple that I made it visually appealing enough to be projected on a back wall. Each trigger I press will display the current state to the audience to make the whole process transparent and visible. It’s the antithesis of the mystery of un-labeled Lemur sliders and Reactables but I like letting the audience follow along with what’s happening.
How would you describe your patching style?
Trial-and-error. I’m a composer, not a programmer, and so I come to Max with specific objectives in mind. I would love to sit down and build some automated synthesis mechanisms – but not today.
What is your favorite Max object and why?
I use Max to change simple MIDI commands into complex, time-sensitive OSC messages. As a result, the “expr” object is the most useful to me. I love creating interesting algebraic curves and applying them to panning, volume and rate schemes.
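The sort of algebraic curve you might type into an [expr] box, rendered in Python for illustration — these particular curves are my examples, not ones taken from Owen’s patches:

```python
import math

def fade_gain(x):
    """Volume: a quadratic fade, gentler near silence than a linear ramp."""
    return x * x

def pan_gains(x):
    """Panning: equal-power left/right pair, so loudness stays constant."""
    theta = x * math.pi / 2
    return math.cos(theta), math.sin(theta)

def rate_curve(x):
    """Rate: glide smoothly from normal speed (1.0) down one octave (0.5)."""
    return 2.0 ** (-x)
```

Each takes a normalized position x in [0, 1] — exactly what a scaled MIDI value would supply.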
My new favorite, though, is the ” ; ” message. It allows a message to be sent directly to the Max application itself.
I’ve started running my entire rig headless because of that wonderful little thing.
Visit Owen’s Website.