An Interview with Jasper Speicher

    Emerging technologies have blurred the boundaries of art and science, music and coding, sound and image. With a background in both the visual arts and engineering, Jasper Speicher's strong interest in form and function has led him to the creation of interactive objects and experiential environments for public space and educational displays. In this interview he explains his use of Max in prototyping large projects and creating tools.
    You have your B.A. in Engineering and in Visual Arts? Did you have this duality growing up?
    I’ve always been making things. When I was growing up my mom was doing medical illustration and graphic design. She was working at Reader’s Digest when they switched over to computer systems from physical layouts, so we had a Mac computer, and probably the first versions of Photoshop and Illustrator in the house when I was growing up.
    I guess I kind of got familiar with computers growing up because my mom was using them for her work, and specifically with creative stuff in computers because that’s how she was using them.
    I also had an Amiga computer, and I messed around with it a little.
    Those were cool computers!
    They were really cool. I had an Amiga 500. So I messed around a little bit with some basic, simple programming on that. Nothing too serious, though. And I made a lot of paper airplanes.
    Looking back, I guess in most everything that I was doing, I was interested in all aspects of the functions of things and what you might call the spirit of an object. I was really interested in aesthetic considerations, but rarely was that exclusively what I was interested in.
    I was interested in things that were beautiful and worked.
    I started incorporating electronic music and Max patches into the sculptures that I was making, so I started making musical instruments.
    When did you start to learn serious programming?
    While in high school, the internet was just kind of taking off, and I started getting into web programming a little bit, and a little bit of robotics, and a lot into video and special effects and things like that. Still not any serious programming, though, I was still doing a lot of art.
    But then, when I had to apply to schools, I knew that engineering was more likely to get me in somewhere. I knew that I was more likely to be able to get a career in engineering, and I thought, “Well, as long as I still think engineering is kind of cool, I guess I’ll give it a go.”
    I got into school, and I hated it. I felt like the professors were just so frustrating. We were doing problem sets that just didn’t relate to real life at all. It just felt like a total grind, it didn’t really feel like anybody was really paying any attention to the relevancy of engineering outside of the classroom, and I just got really bummed out with it.
    During my freshman year, I was living next door to this girl who was a few years older than me, and she had created her own major in industrial design at Brown, where she was taking engineering and art classes. She was a really strong woman who was willing to push her way into any situation. She told me I should do the same thing.
    So I thought, “Well, it can’t hurt.” So I went to the art department, and I got myself into a sculpture class. I skipped all of the prerequisite studio requirements just by being persistent.
    How did it go with sculpture?
    I totally loved it! I fell completely in love with sculpture; fell in love with the process of creating and critiquing work. It was all I wanted to do. So for a while, that took over my life, and my engineering classes kind of suffered from that.
    Eventually what came around was, Richard Fishman in the art department — I was doing their honors program at Brown — put me in touch with a lab in the engineering department that I could do engineering and art projects with.
    I’d been doing sculpture, but I had taken an initial CS class, so the guy in engineering, David Cooper, was open to basically letting me do a sculpture project in computing. So that was when I started really doing programming seriously.
    So you applied your computer skills to creating art.
    But alongside of that, I was also taking computer music classes. That was how I got introduced to Max. The class wasn’t going to work for either of my majors, but I still took it anyway, because I’d been making music my whole life.
    So when I was a freshman, I used to do all-nighters for engineering, and then the next night I would stay up all night again working in Digital Performer and making songs. But none of my engineering skills ever really made their way over into music until I took my first computer music class at Brown, with a guy named Barry Moon.
    We quickly moved into Max, and I picked it up fairly quickly. Then I started taking other upper-level courses with Max, and I had a friend who was doing a lot of DJ'ing with NATO [Nato.0+55+3d]. We started collaborating, and we started doing networked sound and visuals, where the visuals were controlling the sound and vice versa.
    Did you carry on with sculpture?
    Yes, actually. I started incorporating electronic music and Max patches into the sculptures that I was making, so I started making musical instruments. I started making these soft silicone objects with pressure sensors and things in them, and I could squeeze them and manipulate them, and control sound with them. They had LEDs in them that would light up.
    So I ended up doing this thesis in undergrad as my final Engineering BA project, where I’d written a program that let me take 3-D scanned point-cloud data and manipulate it with a stylus using 3-D input and Fish Tank VR. I then would print that out as a physical object, and cast that with the sensors in it. Then I chopped up this old guitar so I could plug this object into it. There was a Max patch in the back that was kind of holding everything together.
    How was it presented? Like an interactive museum object or… ?
    I did a performance where I manipulated these objects and played music for people. It was a whole combination of sculpture, electrical engineering, and 3-D shape programming.
    Instead of using a normal A to D chain then, I actually used my audio interface as the input for the sensors. So I created this AM modulation patch where I could get extremely high-resolution data from the sculpture into the computer using a 24-bit/96 kHz audio interface. So Max was obviously crucial to something like that.
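    The trick he describes — reading a slow sensor through an audio input by letting it modulate the amplitude of a carrier tone, then recovering the envelope in software — can be sketched outside of Max. The following Python sketch is illustrative only (his actual patch was built in MSP); the carrier frequency and cutoff are arbitrary assumptions, and the demodulator is a textbook envelope follower: full-wave rectify, then a one-pole low-pass.

```python
import math

SR = 96_000          # sample rate of the audio interface
CARRIER_HZ = 1_000   # hypothetical carrier tone routed through the sensor

def demodulate(samples, cutoff_hz=50.0):
    """Envelope-follow an amplitude-modulated carrier:
    full-wave rectify, then one-pole low-pass filter."""
    # one-pole coefficient for the chosen cutoff frequency
    a = math.exp(-2.0 * math.pi * cutoff_hz / SR)
    y = 0.0
    out = []
    for s in samples:
        y = a * y + (1.0 - a) * abs(s)
        out.append(y)
    return out

# Simulate a pressure sensor sweeping slowly from 0.2 to 0.8,
# scaling the carrier's amplitude.
n = SR // 10  # 100 ms of audio
sensor = [0.2 + 0.6 * i / n for i in range(n)]
signal = [sensor[i] * math.sin(2 * math.pi * CARRIER_HZ * i / SR)
          for i in range(n)]

# The recovered envelope tracks the sensor value
# (scaled by 2/pi, the mean of a rectified sine).
envelope = demodulate(signal)
```

    Because the sensor data rides on an audio-rate carrier, it arrives at the full bit depth and sample rate of the interface, which is the high-resolution advantage he mentions.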
    Then you went on to get a Masters?
    I went on to do a Master’s in Computer Music at Brown after that. At that point I was doing a lot of the physical sculptures and manipulating them, and more of the sound and vision works. It was basically an extension of all the stuff I did in undergrad. Once I was in that program, I was patching in Max for a couple of years pretty much every day.
    It’s [Max is] a really great environment for throwing something together and then changing it on the fly.
    It sounds like you took to Max fairly naturally?
    At first I kind of hit a brick wall with Max, where nothing seemed to do exactly what I wanted, and it was totally confusing. That’s often the way it is when learning to program. But I became proficient in it. I had spent so much time working in it when I was an undergrad and in grad school that I really got to know Max and Jitter.
    I’ve been able to use it in all kinds of projects for Tellart. I’m in bands now, and I’m producing music, so I’ve got Max for Live and I’m using it for our live shows as well.
    So I kind of learned it a long time ago, but it keeps coming up and it keeps being useful.
    How did you get involved in the Universal Orchestra project for the Chrome Web Lab exhibit?
    I’ve spent the past seven or eight years at Tellart as the lead engineer. When we developed the Web Lab Project with Google, we went through numerous different phases. Eventually we ended up deciding that we wanted to design it so that people could create music in the museum and collaborate with people on-line.

    Web Lab

    So we made this ensemble of pitched percussion. I worked with my friend Adam Florin, who I was at Brown with. So that project had a lot of different moving parts. In that project, Max was integrated with things that were going on in Pro Tools, and in Node.js, as well as operations that were going on in people’s clients, on the web browsers. There were kiosks in the museum running Chrome, and Max was hugely useful for the prototyping of all that, initially. It’s a really great environment for throwing something together and then changing it on the fly.
    Also at Tellart, we have a combination of designers and engineers, but very few people at Tellart are really 100 percent one thing or another. So Max is a great language because you don’t have to be a programmer to use it.
    I think that it allows a lot of people who couldn’t otherwise program to get something really interesting done, and it helps them to actually see the way something works a lot more clearly.
    Good for prototyping across disciplines or departments.
    I think that Max kind of has its sweet spot there. I think that when projects get large enough in Max, it almost doesn’t matter whether it was made in Max or written in code. In some cases a written language is even better. But there’s this small-to-medium-sized project, and especially something that you’re building from scratch and you’re experimenting with, where Max is really well suited. And this was one of those projects.
    So we did our first prototype at the Boston Science Museum, and we had a bunch of kids and parents come through and try out these Monome controllers that were hooked up to this crazy accordion. It was making this horrible music, but it worked.
    They were all networked and communicating with each other, and then there was a tablet PC that could be used to control them as well. We actually learned a lot from that.
    Then we took that prototype and kind of re-implemented a lot of that into web languages in Chrome. But without Max as the first language to prototype it in, I think it would have been much more difficult. I don’t think the idea would have floated as easily.
    So then in the actual Web Lab, Max is in a workhorse position of scheduling and triggering actuators. In the final incarnation of the Web Lab, I think the scheduler in Max was really the most useful thing.

    Universal Orchestra: Hardware Demo

    A lot of the interface ended up being in Chrome, because that was kind of the centerpiece of the whole project.
    Were there any particular tools that facilitated communication between Max and the network?
    Florin wrote a Python daemon, which ran on the same machine as the Max patch, proxying WebSocket messages into OSC messages for Max.
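    The core of such a bridge is re-encoding each incoming message as an OSC packet and forwarding it over UDP to Max's [udpreceive]. This sketch is not Florin's daemon — his code and message schema aren't shown here — but it illustrates the OSC 1.0 wire format the proxy would have to produce: null-terminated, 4-byte-padded strings for the address and type tags, followed by big-endian arguments.

```python
import json
import struct

def osc_string(s: str) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode one OSC message: address, type-tag string, then arguments."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, bool):
            raise TypeError("bool has no standard OSC 1.0 tag in this sketch")
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit big-endian int
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # 32-bit big-endian float
        elif isinstance(a, str):
            tags += "s"
            payload += osc_string(a)
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return osc_string(address) + osc_string(tags) + payload

# Hypothetical WebSocket payload (the real project's schema is not shown here):
incoming = json.loads('{"address": "/note", "args": [60, 0.5]}')
packet = osc_message(incoming["address"], *incoming["args"])
# In a real daemon, `packet` would now be sent via a UDP socket to the
# port Max's [udpreceive] object is listening on.
```

    Libraries like python-osc handle this encoding for you; the point of spelling it out is that the proxy's job is a mechanical translation, so Max never has to speak WebSockets itself.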
    I’ve read that a lot of the code for the Google Web Lab is open source.
    It is.
    It [Max] can actually convert somebody from a practicing artist to a tool maker, a multimedia tool maker.
    Did that change how you worked, knowing that other people would be looking at the patches?
    It was a big group project, anyway, so it was very important to make things orderly, and comment them. Adam Florin, who really owns that portion of the project, is an exceptional programmer. He’ll do an incredible job of making things clear and commenting them, no matter what.
    Adam's Web Lab patch
    There is always that temptation with Max to create spaghetti. Whenever you go to a conference, there are the people who have this kind of cavalier, bold attitude of throwing a horribly ugly Max patch up on the screen, and sort of relishing how ugly and confusing it looks.
    I think that’s a fabulous part of Max: it’s not really tame, the way that a lot of other languages are. It’s still totally wild. It doesn’t pretend to be optimal. It’s this really strange, extremely versatile tool in so many ways. It’s versatile in terms of all the different functionality that it offers, and versatile now in terms of all the different ways that you can control and interconnect that functionality.
    I mean, you can write Ruby code and have that running inside of objects in your Max patches. I’ve done that.
    It’s also versatile in terms of who can use it. People with technical or non-technical backgrounds can both kind of meet there. So because of that versatility, to me, it’s never going to be the little cog that fits in to the big machine. It’s way cooler than that.
    It’s like this crazy living, breathing beast. It’s constantly evolving, and I think that’s what gives it its power.
    Do you find in your personal work, and when you’re working musically, is your patching style messier than in your commercial work?
    Yes, it is. Definitely. I can tell you, it’s gone through different phases. Like when I was in grad school, I made my patches really neat. I made lots of abstractions, and I embedded things in subpatches and things like that. I made all of my patch cords rectilinear.
    I did that partly because I was sharing with a friend of mine, and because I was hoping to release it to other people, let them use it. In some cases I did that, and it worked well.
    But I think that I was also really interested in making tools then. And I think that that’s another place where Max is really kind of strange and different. I think it attracts the kinds of people who are artists, and they’re making work, but by the time they’ve been thoroughly introduced to Max, they become interested in making tools. It can actually convert somebody from a practicing artist to a tool maker, a multimedia tool maker.
    I’ve seen a lot of people go through those phases with Max, where they’ll start off with a vision for what they want to do, for their first Max project and they’ll just start to do it. Then what happens with a lot of people is they start to think of all of the different things that they might want to do in addition. They start making a patch that could do all of those things and pretty soon, their whole frame of reference has shifted from, “How can I make this piece of artwork?” to “How can I make this tool that will allow me to make artwork?”
    I think when that happens, the patches tend to be a lot neater and tidier. Because the focus is on the craft of the actual patch itself — the patch as a tool for making work in the future. It feels like the patch is going to have a life of its own.
    But I think when people are more focused on the actual artwork or the piece of music, the tendency is for patches to get a little bit more messy. Because people aren’t as focused on the craft of the patching as they are on the end product that their audience is going to see or hear.
    So there are these patches that are totally hairball, crazy, messy things that sort of fly off as a result of a creative process that’s leading toward some goal. And then there are patches that are the goal in themselves, and I think those ones tend to be a lot more orderly.
    Do you have a favorite Max Object?
    That’s a great question. It’s funny, for some reason the first one that comes to mind is slide. I think it’s just a really elegant way to average things — great for smoothing real-time sensor data. The math behind it is really simple, and watching the output can be relaxing. I just like that.
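    The math he's referring to is a one-pole smoother. Per the Max reference, slide computes y(n) = y(n-1) + (x(n) - y(n-1)) / slide, with separate slide values for rising and falling input. A minimal Python sketch of that recurrence (the noisy test signal below is invented for illustration):

```python
def slide(xs, slide_up=20.0, slide_down=20.0):
    """One-pole smoother in the style of Max's slide / slide~:
    y[n] = y[n-1] + (x[n] - y[n-1]) / slide,
    with separate slide factors for rising and falling input."""
    y = 0.0
    out = []
    for x in xs:
        s = slide_up if x > y else slide_down
        y = y + (x - y) / s
        out.append(y)
    return out

# A noisy step, like a pressure sensor being squeezed:
noisy = [0.0] * 5 + [1.0, 0.9, 1.1, 1.0, 1.05] * 20
smooth = slide(noisy, slide_up=10, slide_down=10)
```

    Larger slide values average more aggressively at the cost of lag, and the separate up/down factors let you, say, track a squeeze quickly but release slowly — which is what makes it so handy for sensor data.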
    What was the Sound Affects project?
    It was a project that we did with Mono and Parsons, to encourage creative engagement with the city environment. We constructed this wall at the corner of 5th Avenue and 13th Street in New York City and it had all these cameras and sensors built into it to collect a variety of data input.

    SoundAffects at The New School

    We used Max, MSP, and Jitter to collect real-time data from analog sensors: weather feeds from the web, sound from the neighborhood, cell phone radio noise, as well as information about people and colored objects moving along Fifth Avenue. We sent that data through MIDI to Live, which enabled flexible routing and scaling.
    I worked with Adam on this one as well. He used this system he created called Loom, which is a bunch of Ruby code running inside of Max that produces generative music within Live. The result was a nonstop, 10-day-long piece of music that was accessed via headphone jacks on site, and was also uploaded and available in near real time to the browser on your phone.
    We also used Max to collect a data-snapshot of the entire system every second and upload it to the project site in real time, along with sound and web cam images.
    ... we did a show where I hooked a bicycle up to our synthesizer.
    You mentioned that you’re currently in a band, and you’re doing music. Do you think that was inspired by the orchestra project?
    Actually, my interest in my band Animal Friend was inspired by feeling a real sense of alienation when people played laptop sets. It’s really alienating to have somebody standing in front of you, mousing around on a track pad. It feels like there’s no connection between what they’re doing and what you’re hearing. I think basically the problem is that it’s not even considered a problem.
    I feel like what I was able to do in school was to break out of that a little bit through sculpture and create physical objects I could interact with. But what I realized is, I wasn’t really even feeling satisfied with the level of musical expression I had with those objects. So I decided I should learn to play a musical instrument better and be in a band. So I just kind of did that.
    So what are you playing?
    I spent a few years playing bass, but I’m playing bass less these days and I play more synthesizers and I sing. Over time I’ve been integrating a little bit more of the electronics and multimedia stuff into what we’re doing.
    In March we did a show where I hooked a bicycle up to our synthesizer. We could play notes into an Arduino that was on the bicycle, and then it would send those notes to the synth. I also had a Jitter patch running inside of Ableton Live, and it would basically accept notes and then change the colors of some DMX lighting on stage and change some projections. So as I was pedaling the bicycle there was all kinds of imagery and color coming out of Jitter and through Max to some DMX lighting. The faster I pedaled, the more crazy shit got.
    I did a show last year where I used a light bulb and a light sensor. I had a light sensor and a microphone, and I could play keys and when I brought the light bulb closer to the microphone it would open up the filter on the synth that I was playing. That was a really fun, simple gesture.
    Simple is always best for live work!
    I have been feeling like the simple things are really what I’m most interested in using the technology for right now. I used to make these vast systems of Max patches that could communicate over the network and do all kinds of crazy stuff. But these days, I’m really more interested in creating a simple, fundamental thing, just something raw and basic and well done, then integrating that into the music.
    So that’s what I find Max useful for these days, just kind of building those kinds of tools into our Ableton Live set that we use as a band.