A Video and Text Interview with Luke DuBois, Educator and Musician
I first became aware of Luke DuBois when I heard a CD by his band, the Freight Elevator Quartet. Later, we met when he did custom MSP programming for the filmmaker Toni Dove. I was intrigued by the contrast between his knowledge and experience and his boyish demeanor. Luke DuBois might look like a college student, but that impression changes the minute he opens his mouth. He is a Fellow at the Computer Music Center at Columbia University in New York City and teaches at NYU. Luke is an expert in Max/MSP/Jitter and in the interaction of music and image.
I began the interview by asking Luke about his childhood...
I grew up in New Jersey, moving to London, England when I was eleven. I have three siblings who are much older, so it was a bit like having five parents. When I was nine I asked my parents for a bike and they bought me a computer, so I guess my subsequent career makes sense.
What was your first music-making experience?
My first music-making experience, I guess, was in elementary school when I tried to build a zither out of rubber bands. Though that wasn't a particularly successful project, it did give me some insight into how difficult it is to create a musical instrument. After that, there was a pretty big lull until high school when I took up playing bass.
What kind of music did you play and listen to in high school?
I played in a lot of rock bands, as well as one pretty serious rave-n-roll (Happy Mondays, etc.) group. A lot of the music I listened to in high school (new wave, etc.) is disturbingly back in vogue now, so I've been spending a lot of time having deja vu when I go see live music in New York.
Where did you do undergrad?
Columbia University in New York.
What was your undergrad experience like?
It started out horribly, as I was a pretty miserable engineering student and had priorities (dating, etc.) other than learning everything there was to know about electricity. My break came when I discovered the Electronic Music Center (now the Computer Music Center) at Columbia and met Brad Garton. It literally changed my life; I jumped ship and started studying music there, eventually graduating ten years later with my doctorate in composition.
Did you have any significant musical experiences (making or experiencing) in high school or undergrad?
In high school the main thing that was important to me was the collaborative process of music-making, which I always work best within... I've never been completely comfortable with solo performance, and I think that's a result of having my formative musical outings in bands instead of alone.
In college, the CMC at Columbia (as I mentioned earlier) literally changed my life in so many ways. Being around a community of people who were designing their own tools (hardware, software, etc.) to make music was incredible, and Brad had created a generous, meritocratic atmosphere to work in, so you never felt awkward about being a nineteen-year-old working at a computer next to a graduate student or faculty member. It was all very open and a wonderful place to be. When I got there, the former directors of the Electronic Music Center (Mario Davidovsky, etc.) were just leaving, so when Brad took over in 1995 we were able to completely transform the place from a tape studio to a computer music facility, while still retaining the charm of the old center and its amazing history. I started out in electronic music playing Buchla and Serge modular synthesizers (hence my affinity for Max!), and that practice of working with hands-on electronic instruments turned out to be a big factor in a lot of the work I do now.
When were you first introduced to Max?
I first saw Max at a demo given by the theater tech department of the Royal Shakespeare Company in London... their lighting engineer was using it to control a serial-controlled lighting console. That would have been around 1992.
When I got to college, I saw a demo of Max in my sophomore-year MIDI production class. At that point I was using analog equipment, so it didn't seem that useful for me, though that would change soon enough.
When did you start to explore it intensively?
When MSP came out in 1997 I was a first-year graduate assistant and learned the software in order to teach it at the Computer Music Center. In 1999 my old band, the Freight Elevator Quartet, was starting to play pretty extensively and I was having a lot of electromechanical problems with the older synths (Buchlas and Serge Modular units) that I was using, and at a certain point I decided to try to switch to laptop, so I bought a Powerbook G3 and started using MSP.
What was it originally that Max/MSP did for you?
Originally I tried to make a close replica of my analog rig, but I realized pretty quickly that Max/MSP could do more useful things than that, so I created a pretty idiosyncratic synth interface driven by a Wacom tablet that I could use onstage. The thing I liked about working this way was that, similar to the analog equipment, it was very difficult to do things precisely the same way twice, so the system had a fluidity that I couldn't really get with a normal digital synthesizer. Max programming is extremely addictive, so before long I was doing all sorts of crazy things...
How have you incorporated Jitter into your work?
These days I use it for pretty much everything, from doing visuals for music to working with generative systems for composition to developing new ways of transcoding media (going from sound to image and vice versa). What appeals to me most about Jitter is its general approach to data... it doesn't treat video as 'video', but simply as a type of information that you can map however you like. This distinction lets me work with media in a way that no other system allows.
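For readers curious what "treating video as data you can map however you like" can mean in practice, here is a minimal sketch of the transcoding idea in Python/NumPy rather than in Jitter itself. The array shapes and scaling are arbitrary choices for illustration, not anything from DuBois's patches.

```python
# Sketch of the "data is data" idea behind sound/image transcoding:
# the same block of numbers can be read as an audio signal or as pixels.
import numpy as np

sr = 44100
t = np.arange(sr) / sr                     # one second of time
audio = 0.5 * np.sin(2 * np.pi * 440 * t)  # a 440 Hz sine, range -0.5..0.5

# Sound -> image: reshape the sample stream into a 2-D matrix and rescale
# -1..1 to 0..255 so it could be displayed as an 8-bit grayscale frame.
frame = ((audio[:210 * 210].reshape(210, 210) + 1.0) * 127.5).astype(np.uint8)

# Image -> sound: flatten the matrix back to 1-D and rescale to -1..1,
# giving a signal you could write to a sound file or feed to a synth.
recovered = frame.astype(np.float32).ravel() / 127.5 - 1.0
print(frame.shape, recovered.shape)        # (210, 210) (44100,)
```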
You received a lot of press about your Billboard project. Can you go into detail about it?
Sure. Billboard is based on a technique I developed called time-lapse phonography, which is basically a technique for aggregating frequencies into a single sonic image, much like long-exposure photography, only for sound. The idea is that I can then radically compress sounds into shorter time frames while preserving their harmonic content. Billboard takes every #1 song that was on the Billboard Hot 100 chart and shrinks it to one second for every week it was at the top of the chart. So in 37 minutes you get the entire history of the Billboard Hot 100, told through the average spectra of its #1 songs.
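As a rough illustration of the time-lapse phonography idea described above, here is a minimal sketch in Python/NumPy: average a track's magnitude spectra over its whole duration, then resynthesize a short burst that shares that average spectrum. This is an approximation of the general technique, not DuBois's actual Billboard code; the window sizes, the random-phase resynthesis, and the one-second output length are all assumptions.

```python
import numpy as np

def average_spectrum(signal, n_fft=4096, hop=1024):
    """Mean magnitude spectrum over all overlapping frames of `signal`."""
    window = np.hanning(n_fft)
    frames = range(0, len(signal) - n_fft, hop)
    mags = [np.abs(np.fft.rfft(signal[i:i + n_fft] * window)) for i in frames]
    return np.mean(mags, axis=0)

def resynthesize(avg_mag, duration_s=1.0, sr=44100, n_fft=4096, hop=1024):
    """Overlap-add frames with the average spectrum but random phases."""
    out = np.zeros(int(duration_s * sr) + n_fft)
    window = np.hanning(n_fft)
    for i in range(0, len(out) - n_fft, hop):
        phases = np.exp(1j * np.random.uniform(0, 2 * np.pi, len(avg_mag)))
        out[i:i + n_fft] += np.fft.irfft(avg_mag * phases, n=n_fft) * window
    return out / (np.abs(out).max() + 1e-9)  # normalize to -1..1

# Example: a fake "song" (noise plus a 440 Hz tone) collapsed to one second.
sr = 44100
song = (np.random.randn(sr * 30) * 0.1
        + np.sin(2 * np.pi * 440 * np.arange(sr * 30) / sr))
one_second = resynthesize(average_spectrum(song), duration_s=1.0, sr=sr)
```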
Do you create your own Max/MSP objects?
Yes. In addition to writing a number of the standard Jitter objects and a couple of Max ones ([bline], [router]), I co-authored (with Dan Trueman) a set of Max/MSP objects called PeRColate, an open-source collection of signal processing objects for doing things like physical modeling and wavetable manipulation.
Interview footage by Sue Costabile and editing by Marsha Vdovin and Ron MacLeod for Cycling '74.
by Marsha Vdovin on April 27, 2007