An Interview with Rob Ramirez
Hi Rob! Can you tell us a little about what you do at Cycling ’74?
I primarily work on Jitter development and support. I helped create most of the new GL features introduced with Max 6, and focused primarily on the video engine overhaul in Max 7. I also created the jit.phys family of objects, the GL multiple render target and shadow implementation, and other minor tools like world, cornerpin and path. Recently I assisted with the new Video and Graphics tutorials, and am currently working on the viddll engine that was released this year as a beta package via the Package Manager. My day usually begins by going through the cycling74.com forums and Facebook pages to try and catch any Jitter-related queries that go unanswered.
What got you started in working on programming with visuals, and especially working with 3D graphics?
This goes back to childhood, when my uncle gave me his super-8 film camera and taught me how to make stop motion animations. I pursued that through college, where I studied a variety of subjects related to manipulating video and sound. My dream was to use different media elements in a live performance setting, the same way that I used vinyl records when DJing: any particular element could be played and manipulated at any time during the performance. At that time this was unheard of, but after discovering Max I knew I’d found the magic ticket. After college, I moved to New York City and continued pursuing that objective of real-time media performance in grad school and various artistic collaborations.
One such collaboration with the artist Kurt Hentschlager necessitated heavy immersion in 3D graphics development. He had an amazing installation that involved 3D bodies floating around in a physics simulation, acted on by forces and generating sounds from their motion. Kurt developed it in the Unreal game engine, and he had reached a point where the limitations of the game engine were impeding his creativity. I created an app using the OGRE rendering engine and Bullet physics engine that communicated with Live via a Max patch (Max for Live wasn’t around yet). This experience led to my graduate school thesis, a Max external that loaded and animated 3D models. I started working for Cycling ’74 shortly after graduating, and again used my experience from that project to develop the jit.phys objects. If you’re a fan of any of those objects and ever get a chance to see Kurt’s work, you should thank him for being indirectly responsible!
You interact a lot with the forum and with helping people get on top of their coding. Have you ever thought about teaching? Or do you teach now?
I taught a Max class for a few years at Parsons School of Design. I love interacting with people learning Max, hearing about their crazy ideas and problem-solving ways to achieve them. While I enjoyed teaching, I much prefer the informality of workshops, meetups, online communication, or just hanging with folks over some beers and helping them out. There’s a certain rigidity in working at a university that my personality doesn’t really mesh with, not to mention the incredible financial burden students these days are subject to. I’m always happy to guest lecture, and do quite often, but I have to leave teaching to the professionals. It’s hard work!
What are the kinds of things you do as part of your personal artistic practice?
I create works of performance that involve interactive media elements. My primary focus as an artist is something I call video puppetry. This has taken on many different forms through the years, but involves a human performer augmented in some way with video and sound technology. I collaborate with dancers, sound and video artists, and theater makers to create characters and situations that would be impossible without the technology. One of my collaborators, Phil Soltanoff, dubs this post-human theater. For me, it’s all about using my digital media skills to spark the imagination and reach beyond traditional methods of making performance works, but at the same time never losing touch with the human component.
I am also a DJ, and about 6 years ago that practice morphed into DJing with music videos. I hesitate to use the word VJ, as that has other connotations. This is a perfect example of what I described above. I started out DJing vinyl with two turntables and a mixer, and through the years my setup has changed to incorporate digital technology, thereby allowing me to manipulate music videos along with the audio tracks (as well as many other augmentations). However, the performance is still very much rooted in the techniques I learned as a pre-digital DJ. I see younger DJs today who use timecode vinyl but have never learned how to manipulate an actual piece of vinyl, and it depresses me! The technique developed from the medium, and there’s so much nuance that gets lost if you’ve never trained on that medium. Sorry for ranting. If you’re in NYC, come see me DJ music videos on the first Monday of the month at Trophy Bar in Brooklyn.
Internally, you shared a recent review for a performance called “Steve Of Tomorrow”. Can you tell us a little bit more about that particular work?
This work is the brainchild of my collaborator David Commander. He had been chewing on this idea for a long time of a visitor from the future who disappoints his host with his total lack of utopian characteristics. There’s also a corporate-sponsored man-made hurricane. David came to me with this project because he had a vision for creating the characters as puppets with video faces. Obviously, this was right up my alley! I developed a system for controlling the puppets using the actor’s voice, MIDI controllers and cues, and I performed the voice of one of the puppets (my first acting gig). Naturally, everything was made in Max. In developing this technique, I took inspiration from the animation style of an old Adult Swim show, Tom Goes to the Mayor. This play is a lot of fun and highly accessible, but still intelligent and insightful. Tonally it stands in stark contrast to my last major work (a philosophical treatise on art and technology as told by the video puppet version of Captain Kirk from Star Trek), which usually had a few people walk out during the performance.
We just finished up a short run of this piece, and will be remounting it in 2017 at the resurrected Collapsible Hole performance space in Manhattan’s West Village.
Thanks Rob! Can’t wait to see more of your work – both code and art!