An Interview with Dan Trueman

We often resort to the old fairytale riff about the trail of breadcrumbs when we try to summarize an artist's biography. If you try that with Dan Trueman, you may come away with the distinct impression that you're chasing two very different people who just happen to have the same, somewhat uncommon name.
For the garden-variety Max trainspotter, the lights will go on right away - he's one of the people behind the PeRColate set of Max/MSP objects.
Those more inclined toward laptoppery in general will know him as the co-founder of PLOrk, the very first laptop orchestra. Fans of the Cycling '74 record label will know his work as half of the duo interface with Curtis Bahn (their c74 release ./swank comes highly recommended).
Oh yeah. He's also the kind of composer who writes down instructions for other musicians to follow (the Brentano, Daedalus, Cassatt and Amernet string quartets, the Non Sequitur ensemble, and so on). More on that in a minute.
And then there's the other Dan Trueman, adept on the Hardanger fiddle, half of the folk duo Trollstilt, and a veritable Catherine wheel of fiddle-tune production in groups as diverse as the Brittany Haas and Dan Trueman band and the largely unclassifiable acoustic outfit QQQ.
The arrival on Cantaloupe Music of his latest release, Neither Anvil Nor Pulley, written for So Percussion, gave me an excuse to talk to both Dan Truemans at the same time.
I sometimes find myself marginalizing the technological side of what I do, even though I absolutely love it...
Since I know you a little bit and know how much of your musical life over the last couple of years has been involved with making tunes and playing the Hardanger fiddle, I wanted to start by asking you about where and how you situate work of the sort that you’ve made for So Percussion….
It’s ongoing, that thing of trying to sort out how to situate my work in software or engineering or new technology. I really do have a lot of hangups with it, and it’s an ongoing problem – some of it having to do with the fact that when I was learning electronic music, I hated most of it. This was academic electronic music in the early 1990s, and I wasn’t interested in most of what was available to me at the time. It’s less so now, but there’s also an ongoing sense that if you do electronic music, you’re somewhat marginalized or somehow not a “real musician.” I wonder how that could possibly be an objection, since I play fiddle every day, but I’m still very sensitive about it. I sometimes find myself marginalizing the technological side of what I do, even though I absolutely love it, and even though so much of what I do these days grows directly out of an engagement with the technology.
This piece for So Percussion would not have happened without a full-on wrestling match with new technologies and programming – it just wouldn’t have happened in any other way. It’s not a piece that could have existed without that. It’s always at the heart of what I do, but the hangups are still there – I’m putting that front and center in terms of how I do things here….
So here I am to talk to you about that programming and technology stuff….
So I had to write up this technical brief about the piece that the record company asked for, and working on it I was thinking, “Wow. There’s some cool stuff here. I should be celebrating this!”
The thing that started me thinking about method even before I heard a note of the piece was the imagery of the title – a title like "Neither Anvil Nor Pulley" seemed to have something to do with the idea of force or work being hidden in some way. But I didn’t imagine that you were thinking of the work in terms which were quite so physical. Those anvils and pulleys are Rube Goldberg devices, not metaphors….
That’s right – even in the first big movement of the piece (the second section), they’re whacking on these pieces of wood the way you’d whack on an anvil in order to reset the phase of these digital metronomes that they’re playing with. But it’s NOT the same thing as an anvil, and in the second half they’re using these tethers – have you tried these?
Hmmm. No.

This is the tether, the magic instrument used in laptop orchestras everywhere. It’s my favorite controller ever. It’s basically two 3D joysticks whose cables stretch something like 12 feet with a little bit of pull on ‘em. Six axes on ‘em. They’re magnificent – made for a golf game, originally. They’re 15 bucks, and you get six high-resolution signals from them. The PLOrk has maybe a dozen pieces now that use these – we get the whole group using them together, doing this sort of choreography inspired by the instrument and the way it’s mapped for controlling sound.
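[For the technically curious: assuming the tethers show up as ordinary USB game controllers, reading their six axes from ChucK looks roughly like the stock HID joystick example – the device index here is just a guess.]

```
// open the first joystick-class HID device (index 0 is an assumption)
Hid hi;
HidMsg msg;

0 => int device;
if( !hi.openJoystick( device ) ) me.exit();
<<< "reading from:", hi.name() >>>;

while( true )
{
    // wait for HID activity
    hi => now;
    while( hi.recv( msg ) )
    {
        if( msg.isAxisMotion() )
        {
            // msg.which is the axis index (0-5 for a pair of 3D sticks);
            // msg.axisPosition runs roughly -1.0 to 1.0
            <<< "axis", msg.which, msg.axisPosition >>>;
            // ...map this onto whatever synthesis parameter you like
        }
    }
}
```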
[a pause ensues, during which the interviewer geeks out waving his arms around, laughing, and imagining control data]
So Percussion uses this in the first movement – all four players are performing this composed kind of sonic choreography. Invariably, people really connect with them in performance – “What was that thing with the strings on it?”
It sounds like their response is somehow connected to the interface making sense in the way that the physicality of percussion performance works – the use of the interface “fits” somehow.
Exactly.
...make something that really got to the heart of what [incredible musicians] can do and what computers can do.
So, if the piece isn’t possible without software, where did the idea come from? Was it about physical movement first?
I spent three or four months trying to start this piece. I couldn’t find the thing that felt physically and rhythmically engaging in a way that would make me feel right about asking these incredible musicians to play it – it was terrifying. I didn’t want to just give them something where they’d press a button and stuff would just happen. I wanted to challenge them, to challenge their training, to challenge their musicianship – to make something that really got to the heart of what they can do and what computers can do.
So about three months in, I was trying various things and I made this little instrument – a piece of wood with a piezo on it – I’d hit it, and there’s a click going by at 120 bpm and my hitting it would reset the phase of the click, creating this little hiccup in the phase of the metronome. I spent something like 3 days just playing that instrument, just getting lost in the thing and exploring the rhythmic possibilities. I realized that I was on to something, because if I could get lost in this – breaking a sweat playing this instrument for a couple of days – well, there’s something there. So that’s where it came from – it came from trying to build something and feeling like I now had something I could see them REALLY being challenged by in an interesting way. In the piece, there are four players working with this thing – I’ve got these bits where they’re playing these hocketed rhythms tekkaTEKkaTAKketa on the wood blocks and the metronomes pick it up and go with what they play and then they’re trying to engage with that metrically. And if you think of that idea of resetting the phase of the metronome, if you keep hitting the block of wood late consistently, you can slow down the tempo in a totally smooth and fluid way – until you hit a Nyquist of sorts and you’re suddenly going too slowly and it catches up and hits double-time again.
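[A rough sketch of that phase-reset idea in ChucK – not the patch from the piece, just the bare mechanism. Each hit cancels the pending click and restarts the cycle, so hits that land consistently late stretch every beat and the felt tempo slows down; the 60 ms lateness below is arbitrary.]

```
// a metronome whose phase is reset by each "hit";
// the simulated hits below land 60 ms late every cycle,
// so the effective beat becomes (beat + 60 ms)

120.0 => float bpm;
(60.0 / bpm)::second => dur beat;

Impulse click => dac;
Event tick;   // fired on every click
Event hit;    // fired when the block is struck (bonk~ via OSC in the piece)

fun void metro()
{
    while( true )
    {
        beat => now;          // one beat from "phase zero"...
        1.0 => click.next;    // ...then click
        tick.broadcast();
    }
}

// start the metronome and keep a handle on it
spork ~ metro() @=> Shred @ m;

// a hit resets the phase: cancel the pending click, restart the cycle
fun void resetter()
{
    while( true )
    {
        hit => now;
        Machine.remove( m.id() );
        spork ~ metro() @=> m;
    }
}
spork ~ resetter();

// simulate a player striking 60 ms after each click
fun void striker()
{
    while( true )
    {
        tick => now;
        60::ms => now;
        hit.broadcast();
    }
}
spork ~ striker();

// keep the VM running
while( true ) 1::minute => now;
```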
So there’s nothing particularly mindbendingly complicated there in terms of the patch or the software that does what you describe, is there? Is it as simple as it seems to me to be?
Well, sort of – it’s done in both Max and in ChucK. In fact, the whole piece is both of them running in tandem and communicating via OSC. The timing is done in ChucK. One of the things that’s a little tricky – I know that I could have done this in Max, but I was programming quite a bit in ChucK at the time – there’s sort of an appendage to the process I just talked about: you’ve got these clicks going, and each has a pair of tuned pipes with pickups on them. I’ve got this system where, when they hit the pipe, the laptop looks at where it is in the current phase and says, “I see that we have 400 ms until the next click. So, for 200 ms of that time, I’m going to sample the sound of the pipe, and for the other 200 I’m going to play it backwards so that it reaches its peak at the next click.” So you get these reverse delays, but they’re synchronized with the phase of this metronome that you’re resetting through another, separate mechanism. So depending on where you hit the bar relative to that click, you’ll get a different kind of swell. That was also an extraordinary amount of fun once I’d gotten it set up the first time – I had it set up with this flowerpot. I’d halve the tempo so that it was going at 30 bpm, so we’re looking at something like two seconds between clicks – you can hit it just before or just after a click and get these long swells or really short swells. These swells are crucial for helping the players feel the metronome’s cycle, especially when the tempos are really slow; you can really feel where the next beat is going to come, rather than trying to just internally predict it.
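[And a sketch of the reverse-swell mechanism using ChucK’s LiSa live sampler – again, not the code from the piece: the hit is simulated here rather than arriving from bonk~ over OSC, and the durations are placeholders.]

```
// record the struck pipe for half the time remaining until the next click,
// then play that recording backwards so it peaks right on the click

adc => LiSa lisa => dac;     // live sampler on the pipe's pickup
Impulse imp => dac;          // stand-in click

120.0 => float bpm;
(60.0 / bpm)::second => dur beat;
beat => lisa.duration;       // buffer big enough for one beat
0 => lisa.loop;              // one-shot playback

now => time lastClick;

fun void metro()
{
    while( true )
    {
        1.0 => imp.next;
        now => lastClick;
        beat => now;
    }
}
spork ~ metro();

fun void swell()
{
    // time left until the next click, split in half
    beat - (now - lastClick) => dur remaining;
    remaining * 0.5 => dur half;

    // sample the pipe for the first half...
    0::ms => lisa.recPos;
    1 => lisa.record;
    half => now;
    0 => lisa.record;

    // ...then play it backwards so it swells into the click
    half => lisa.playPos;
    -1.0 => lisa.rate;
    1 => lisa.play;
    half => now;
    0 => lisa.play;
}

// simulate hits drifting through the beat cycle
while( true )
{
    beat + 150::ms => now;
    spork ~ swell();
}
```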
All of that timing stuff – I didn’t do that in Max. I did it in ChucK, in part because timing in ChucK is so stupid easy to manage. I’m using Max with bonk~ to do all the input tracking, and they’re talking back and forth. This is pretty much how I work now, with Max and ChucK in tandem.
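[On the ChucK side, that glue might look something like this – the OSC port, address, and message format are invented for the example; in Max, bonk~’s output would be packed into an OSC message and sent with something like udpsend.]

```
// listen for onset reports coming from Max (bonk~ -> [udpsend]) over OSC
OscIn oin;
OscMsg msg;

6449 => oin.port;                  // arbitrary port for this sketch
oin.addAddress( "/hit, f" );       // hypothetical address: one float (attack strength)

while( true )
{
    // wait for a message from Max
    oin => now;
    while( oin.recv( msg ) )
    {
        msg.getFloat( 0 ) => float strength;
        // ...reset a metronome's phase, start a reverse swell, etc.
        <<< "hit, strength:", strength >>>;
    }
}
```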
These sorts of combinations are super powerful, with Max acting as a sort of glue.
The timing stuff stays in ChucK, then?
Yeah. ChucK’s got this notion of NOW built into it, and it’s also not vectorized, so everything runs at the sample rate – it’s very inefficient, but it also makes it very transparent in terms of dealing with time. It’s very easy to make timing measurements and to schedule things in the future in a way that I could perhaps do in Max, but have always found a lot more cumbersome there – so things related to scheduling and timing are all done in ChucK. I leverage a lot of the other stuff in Max – bonk~, of course, but I also do all the spectral stuff there. For instance, the tether instrument in this piece uses this phase vocoder instrument where I can freeze-frame through a sample in each hand – it’s got some processing built in – a little smoothing, a little companding – so that you can bring out noisy or more pitched elements. Basically, in this piece they’ve got a fiddle sample “in each hand” with particular pitches and transitions between the pitches, and I can specify “at this point in time, you should be at the transition between these two notes”… you know how the sound is in a phase vocoder, where you’re sort of freezing yourself at a point where there’s this great pitched material, and other points where there’s this gritty transitional material. That’s all Max all the way – the spectral stuff in Max is very efficient, and it’s easy to go in and do some nice processing….
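[The phase vocoder instrument itself lives in Max, but for a rough feel of the freeze idea, here’s a much cruder spectral freeze in ChucK: grab one FFT frame at a point in a sample and keep resynthesizing it. The filename and freeze point are stand-ins, and none of the smoothing or companding Dan describes is here.]

```
// load a sample, take a single spectral frame at a chosen position,
// then resynthesize that one frame over and over: a crude "freeze"

SndBuf buf => FFT fft => blackhole;
IFFT ifft => dac;

"fiddle.wav" => buf.read;          // hypothetical file name

1024 => fft.size;
Windowing.hann(1024) => fft.window;
Windowing.hann(1024) => ifft.window;

complex frame[fft.size()/2];

// pick a freeze point: halfway through the file
// (in the piece, a tether axis would be scrubbing this around)
buf.samples() / 2 => buf.pos;

// let one window's worth of audio flow in, then analyze it
fft.size()::samp => now;
fft.upchuck();
fft.spectrum( frame );

// resynthesize the same frame indefinitely
while( true )
{
    ifft.transform( frame );
    (fft.size()/2)::samp => now;
}
```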
And obviously, I just use Max when I want interface stuff – display of information, sliders. Oh yeah – I love the fiddle~ object for feature extraction, too.
And some of what I’m doing is sort of the opposite of the Max for Live thing; instead of embedding Max in a large GUI-based time system, I embed a text-based timing system inside of Max. These sorts of combinations are super powerful, with Max acting as a sort of glue.
One of the reasons I wanted to talk to you about your use of Max has a lot to do with you being one of the co-creators of the PeRColate objects with Luke DuBois. I expect that there are a lot of folks out there who are primarily aware of you for precisely that reason rather than for your life as a composer or player, and while those objects are great and still work fine, you may not have been on their radar for a while – they’re left trying to infer your life and work from a collection of external objects. [laughs]
They’re ten years old now.
The individual breadcrumbs you dropped along the way have become tourist destinations in their own right….
Well, to answer your question, there are two things that have happened since I worked with Luke on PeRColate: one is that I was using those objects when I was playing with Curtis Bahn, and then we started the Princeton Laptop Orchestra here. One of the things that the laptop orchestra forced or invited me to do – depending on what kind of mood I’m in – was to be more “generous” with what I make so that other people can play it, instead of making things that were entirely idiosyncratic in a way that I and only I could play – where no one else could look at the patch and have any idea of what to do.
There needed to be a kind of transparency in the things you made. Transparency beyond the kind of refpage description of things. More the whole “helpfile as an example of what you might do” approach.
In some ways, that’s something we often don’t get from composers. It may wind up being one of the great gifts of the laptop orchestra era to the rest of us, over and above the music itself.
I think that’s absolutely true – to me, it’s been transformative to have been “forced” to do this. Part of the reason it’s been relevant is that my efforts now, instead of being focused on making another munger~, have been focused on making pieces and interfaces and things that other people can use. That’s where lots of my cycles since PeRColate have been spent. And finally, this piece for So Percussion would simply NOT have happened without that change in focus and effort.
So you’re describing the invisible and intermediate steps from PeRColate to "Neither Anvil Nor Pulley", aren’t you?
Yeah. So I’m learning to make these things for other people to play, and this percussion group comes to me and wants a piece. But I’ve got 4 of the most virtuosic players on the face of the planet, so what am I going to do to engage them?
Does this piece precede "5 ½ gardens"?
No, this is a lot later. That piece had a little laptop component, but it was just me controlling the thing with footpedals – So Percussion didn’t do any laptop stuff in that at all. Jason from So Percussion is the drummer in QQQ, a band project of mine, so I’d worked with them before, but more from the acoustic composer or fiddler/collaborator perspective. We’d had So Percussion play with the PLOrk several times before – the first time was our first concert, which included Zakir Hussain. I invited So Percussion to join us in part just to deal with having the world’s best drummer playing with the world’s first laptop orchestra – no offense to the students in that group, who were terrific. It was just sort of inevitably problematic, since we’d never done this before. So Percussion was great – for them, it was a thrill to play with Zakir, so it worked all around.
So after all that, they came to me and actually wanted something with a more serious laptop component to it. I guess I was at a point where I wanted to reach outside of PLOrk in this process, too – I wanted to see what it would be like to make these kinds of instruments and have them be in the hands of musicians of the quality of So Percussion.
Don’t get me wrong. PLOrk is great – but it’s a student ensemble with new students every year, so there’s only so much you can do. There’s a lot of turnover, and there’s only so much you can expect of them in terms of practicing and learning new music and new ways of working and so on. So this was kind of a “next level” thing for me to strive for – one of the first times I got to take those ideas from the laptop orchestra and put them to use in a piece for a professional ensemble.
So, are you in this piece yourself?
Only secondarily. It starts with this needle drop (which uses Ms. Pinky, by the way), and it’s got this fake antiqued recording of me playing a fiddle tune. It was antiqued using a version of Luke DuBois’ old noise gate patch – you’d look at each bin in an FFT, and if the signal in a bin fell below a certain level, you’d zero it out. I turned it into a reverse noise gate, where you keep only the part you’d normally zero. I ran it on an old Brahms recording, which left me with just the noisy hiss, and added my fiddle tune – which I’d seriously bandpassed – to it. That was all done with Max, and it was really fun. So I’m present – three of the tracks start with these fake old recordings that were done that way. But I don’t play on the piece in performance at all. There are times – on and off – where I feel like I need to write more pieces in which I don’t play. So Percussion wanted a piece they could play without me, so that they could travel more with it, and I wanted to write some more music where I wasn’t required to be there. But it’s kind of hard for me – I like being up there with people making music.
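[A quick sketch of that inside-out gate using ChucK’s spectral tools – the original was a Max patch built on Luke DuBois’ noise gate, and the FFT size and threshold here are arbitrary.]

```
// zero any bin that's loud enough to "count", keeping everything a normal
// gate would throw away – run a recording through it and only the hiss survives

adc => FFT fft => blackhole;       // swap adc for a SndBuf to process a recording
IFFT ifft => dac;

1024 => fft.size;
Windowing.hann(1024) => fft.window;
Windowing.hann(1024) => ifft.window;

complex spec[fft.size()/2];
0.01 => float threshold;           // arbitrary magnitude threshold

while( true )
{
    fft.upchuck();
    fft.spectrum( spec );

    for( 0 => int i; i < spec.size(); i++ )
    {
        // the inverse gate: anything above the threshold gets zeroed
        spec[i] $ polar => polar p;
        if( p.mag > threshold )
            #(0.0, 0.0) => spec[i];
    }

    ifft.transform( spec );
    (fft.size()/2)::samp => now;
}
```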
That’s not a very difficult idea for a Max person, is it? You’re invested in that software “instrument” you’ve made, and it’s only natural to want to be one of the ones up there enjoying interacting with it….
That’s right. The question of maintenance – being around if things overheat – is a lot less of an issue now. Before, whenever I did electronic stuff, I was always there on stage playing. As a fiddler, I’m always up there playing. To try to be the “composer” and make a piece for somebody else has always been a little bit of an uncomfortable thing.
Is the rest of "Neither Anvil Nor Pulley" driven by similar ideas of time and rhythm and behavior? Do the other component parts reflect the idea that you have a different set of processes that you’re going to explore? What you’ve described so far has to do with developing and giving form to some simple ideas – the piece is what happens when those simple ideas become interactive processes. That said, where do the other bits come from?
One of my favorite moments in this piece is actually at the track change from 1 to 2. Track 1 is a good old fiddle tune that I wrote. I literally just gave it to them – it’s a little tune with maybe a couple of counter-lines and some foot-stomping patterns that I do when I play the tune. I gave that to So Percussion and said, “Figure out how you’d like to play this.” They love that sort of stuff – using steel pans and vibraphone and all kinds of things. The tempo’s about 120 bpm, but it’s a fiddler/drum ensemble’s 120 bpm – the end of it’s got this four-on-the-floor thing, Jason’s just got it goin’ in the kick drum, and as soon as it ends, the 120 bpm clicks start for the metronomes in that second movement. There’s this crazy moment when you go from this really grooved 120 bpm to this absolutely solid electronic tempo… it’s like “Whoa. What just happened here?”
You just turned the swing algorithm off in real time….
Right [laughs]. That moment highlights one of the things I was interested in when writing the piece. When I wrote this piece for them – it’s got laptops in it and all that – I just wanted it to be a piece that they would want to play. In my time with them before, the fiddle stuff was just always something we’d done. I remember at one point playing with Adam Sliwinski, who’s this great marimba player. Adam’s maybe the most “classical” guy in the group – he just loves to play Bach on marimba and work on phrasing. At that time, I had this one fiddle tune with some notes in that old-fashioned style, and in rehearsal I could see that he was really getting into it, whereas with a lot of the other pieces he was just struggling with playing drum machines and doing all this weird counting. When it came to this piece, I felt like I wanted there to be more in it for Adam. So these other fiddle tunes in the piece just came out of that impetus – just wanting them to spend time playing the music and not having to wrestle with the technology and the ideas I had for the rest of the piece. As we were working on the piece, having those two things going up against one another really put into sharp focus the tensions that arise when people play with machines, and how machines deal with time very differently than people do. It gave them more to do, and it gave them a way to play together that really contrasted with how I was asking them to play otherwise – and that actually made the piece more powerful and effective when you saw things juxtaposed that way.
What strikes me about this is that it seems so often that when composers or performers work with those ideas, they become the basis of a kind of polemic about one or the other – “human” swing vs. unforgiving machine time. At a certain point, we ought to be able to move past the polemics, to the point where it becomes merely a compositional choice you can make or not. I think there could or should be a lot more pieces out there that have those ideas in dialogue with each other in the presence of an audience.
Yeah – I do feel that way in the sense that when I’m writing pieces I’m pretty shameless about using everything at my disposal without the usual boundaries. This piece reflects that - I like to code things up and play with them, and if I find something that I find really compelling musically, I’m going to use it. But that doesn’t mean that I’m not going to have some fiddle tunes in there as well. I want to use whatever is at my disposal.
As an interviewer, this makes me happy – I hate those situations where it seems like I have to stop the conversation to ask, “How did you use X to do this?” It’d bug me to do that with any software. I much prefer talking about the software as a tool in your composing toolbox – even though the piece couldn’t exist without the technology, the technology isn’t the reason for it.
Right. Can I be brutally honest with you for a moment?
Sure.
The fact that I’m using Max in this piece at all is a bit of a recovery for me. I’d worked with Max for a really, really long time, to the point where I couldn’t stand working with it any more. The reason was that I’m a very capable Max programmer with very bad habits… I found I’d write Max programs that I couldn’t revisit a week later because I didn’t encapsulate soon enough. I’d have these messy, nasty patches – let alone going back to work I’d done a year or more before. So I had this sabbatical in 2007 where I thought, “I am either going to quit computer music completely, or I have to find some other place to work.” That’s when I started learning ChucK, because we were using it in the laptop orchestra – Ge Wang had made it, in part, in the context of a laptop orchestra, and our students were learning it, so I figured I’d better learn it. And I loved it – in fact, my first major project was porting the munger~ object to ChucK. I spent three months trying to make ChucK work like Max, and then I realized that that wasn’t the point. Somewhere between a quarter and halfway through the porting I realized, “Wait, wait – I stop here. I don’t port any more. I give it this basic functionality, and then I work with the rest of it in this other way, where instead of building all this functionality into one object, that functionality becomes part of the programming process or environment.” But I did go through this point where I didn’t want to see Max at all.
Well, thanks for being so polite, but I don’t think of that as a particularly brutal thing to say – you’re talking about the personal process of finding where a tool works well for you and where it doesn’t. As someone who uses software, that’s a good thing and a choice you absolutely get to make for yourself. I’m happy when people use Max to make cool things, but I’m happier when the world fills with cool things of any kind.
This was actually a positive experience for me. Over the last couple of years, there’s been this process of discovering the things about Max that I really DO like – this kind of toolbench model is where it’s been most effective for me. I’ve rediscovered the things that Max is really, really great at – the spectral stuff, for example. Max is really great at feature extraction – pitch and attack detection, things like that. Max is really, really great for building musical interfaces that carry a lot of musical information or interface with other devices. It’s really good at those things.
And teaching - I’m teaching a grad seminar right now on comparative programming practices, where we’re looking at Max and ChucK and SuperCollider. At first, I was brutally honest about my feelings about working in these three environments, and then I had some great programmers come in to teach the students the three languages – Konrad Kaczmarek, who’s a really beautiful Max programmer, and someone like Jascha Narveson to teach SuperCollider. And at the halfway mark, when I asked them what they wanted to work on, they all wanted to work on Max. They all wanted me to show them PeRColate. The last couple of weeks I’ve been programming in front of them, building some things. And man… some things are just so much easier to do in Max than anywhere else! I built one whole thing in a week in the seminar with Max, and the next week I did it in ChucK, except that I got stuck because the documentation’s not nearly as good and I couldn’t find this call that I knew was there… Eventually, after the class, I went and hunted through the ChucK source code to find the damn methods. And then I finally built it and told the students, “You know, for a lot of you, you really want to – or have to – learn some really old-fashioned programming to work in ChucK. You have to want to learn about functions and classes and things like that in a way that you just don’t have to when you’re learning Max.” So I’m never NOT going to work with ChucK - at least I think I won’t – and it’s not like I’m going back to using Max exclusively….
You’re working in a multilingual environment. A multicultural environment too.
Well, it’s true. It’s also that we have FRIENDS in those languages, too. I have a lot of Max friends. I think I only have a few SuperCollider friends [laughs] – Paul (Lansky) is a SuperCollider friend. And Jascha writes SuperCollider in a way where I can actually read the code and make sense of it – he’s trying to write readable code rather than trying to impress me with the impenetrable concision of his code. But people are different, and they’re drawn to different things and find their own ways.
For me, that choice of path is interesting. And of course, it’s always useful to know about the things that impeded progress if you’re involved in creating software other people are going to use.
My issue with SuperCollider and Max is basically the same – the large vocabulary is great for people who are programming in either language every day, in the sense that you retain a familiarity with this huge library and what it does. I don’t program every day; I might spend a week programming intensely, and then go six months without doing any programming at all. For me, a smaller vocabulary tends to be better. So I can write for loops and if statements in ChucK, and if I need a pattern library, I can make it. That’s easier for me than trying to remain versed in a large library of possibilities. I do encounter this a little bit in Max, but it’s an easy problem to work around there – you can just option-click and go, “Oh, and there are all these other objects, too….” Even if I’m out of shape – and I do get out of shape with languages – it’s pretty quick to get back on the horse. “See Also” makes such a big difference – it’s just boom-boom-boom. Oh. That’s right. I’ve gotten to where that’s how I do anything: I use something that’s sort of like what I want and then option-click and hunt around until I find what I’m looking for.
Yeah. That’s one of the things we found in working on Max 6 – it’s some of what drove the development of autocomplete and the refpage work that supported it. The “See Also” stuff is part of the autocomplete now….
Yeah. It’s awesome.