Christopher continues to work and modify his original patches, and I was curious to hear a little more about them and where he saw the work going. He was kind enough to engage in a long-distance chat....
The patching that underlies both the innards of your Drone Creator and Super-Looper patches is clear and well laid-out. I'm curious as to what it was that started you on creating this series of patches - was it the investigation of playback as you became a better programmer, or did you begin from a very specific compositional intention?
Your question on my interest in playback led me to think about my musical past reaching back to my childhood, where I owned a tape player and was fascinated by manipulating it in a way that made it play tapes at very low speeds. And I still love the effect today when I play a vinyl record and forget to switch from 33 to 45 rpm. The music is still the same, but suddenly sounds completely unfamiliar. Using different playback speeds as an effect or compositional technique has nonetheless never been part of the music I produced. I think this is mainly because there was no organic implementation of it in the equipment I used. Of course pitch-shifting and time-stretching are widely used today and can be performed with most digital audio workstations and audio editors, but they always require a time-consuming non-real-time calculation, which is unnatural to my working process.
When I started discovering Max/MSP, sfplay~ was among the first objects I got to know. It is very easy to handle, and the right inlet gives you direct access to the playback speed - which is something I unconsciously always missed in other programs. So I did some experiments with several settings and audio files and was surprised by the results you can achieve with very simple means. In particular, fast and energetic popular music tended to morph into soothing soundscapes when played at about 20% of the original speed. The basic idea of the patch I then made involved playing two sound files at a time and letting the individual playback speeds be picked by a random selection of values between 10 and 30%, which slowly faded to the next value. This created constantly changing sound colors and polyrhythmic layers.
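The random-speed behavior he describes - values picked between 10 and 30% of the original speed, each slowly fading to the next - can be sketched outside of Max. The following Python sketch is mine, not part of his patch; the function name and fade-step granularity are illustrative assumptions:

```python
import random

def speed_envelope(num_targets, steps_per_fade, lo=0.10, hi=0.30, seed=None):
    """Build a playback-speed curve that slowly fades (linearly) between
    randomly chosen target speeds in the range [lo, hi] - a rough stand-in
    for the random selection and slow fade described in the interview."""
    rng = random.Random(seed)
    targets = [rng.uniform(lo, hi) for _ in range(num_targets)]
    curve = []
    for a, b in zip(targets, targets[1:]):
        for i in range(steps_per_fade):
            t = i / steps_per_fade
            curve.append(a + (b - a) * t)  # crossfade from speed a to speed b
    curve.append(targets[-1])
    return curve
```

In the actual patch this interpolation would happen at control or signal rate (e.g. with line~ feeding the playback-speed inlet); the sketch only shows the shape of the control data.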
I was very surprised by the amount of interest and positive feedback I received after uploading an ambient piece whose source material was nothing more than a drum and bass track I had produced previously and then loaded into the patch. Besides applying a highpass filter to the original file to avoid an over-representation of low frequencies when transformed by the patch, no editing at all was done.
A member of the Max community asked me if it was possible to use this in a live performance, and I had to admit that the patch was of little use for this purpose. But I hit on the idea of using variable playback speeds in a more flexible way with audio files or live-recorded material - instead of using sfplay~, I integrated buffers and the phasor~ object, using loops of adjustable length. It was very important for me to keep the idea of creating floating polyrhythmic layers, so I did not include any global meter or tempo control. This was the start of a work-in-progress project that reached its first stage in December 2015, when I decided to use eight layers of loops and had already added some other features such as filtering, delay, and quality reduction for each looper.
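The core of a phasor~-driven looper is just a read pointer that advances at the chosen playback speed and wraps at the loop boundary. This is a minimal numeric sketch (in Python, not Max) of that pointer arithmetic alone - no buffer interpolation or audio I/O; the function name is hypothetical:

```python
def loop_read_positions(loop_len, speed, n):
    """phasor~-style read pointer: advances by `speed` samples per tick
    and wraps at the loop boundary (a loop of `loop_len` samples).
    speed < 1.0 gives slowed-down playback; loop_len is adjustable,
    mirroring the adjustable loop length in the patch."""
    pos = 0.0
    out = []
    for _ in range(n):
        out.append(pos)
        pos = (pos + speed) % loop_len
    return out
```

Because each of the eight loopers would run its own independent pointer with its own loop length and speed, no global tempo emerges - which matches the floating, polyrhythmic result described above.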
I tend to divide the patching world into two basic categories/approaches - those people for whom "the patch is the piece" (a specific patch is intended for a single use, and then the next composition requires another patch), and those people who patch by "creating an instrument." Where would you place yourself in regard to your practice?
I totally agree with your separation of patching for pieces and creating instruments. But for me, this only holds true for the finished patch. I am very interested in both, and I often build drafts of patches unrelated to a certain purpose. That means that I start with the idea of an audio transformation and sketch out different ways to realize it in Max/MSP. These processes then provide me with a steadily growing database of modules - which can be used either as functions in instruments and effects, or in compositions using live electronics.
What led you to choose the controller that's "wired into" your patch? Did you arrive at the controller arrangement in terms of order by a process of trial and error, or does the arrangement reflect your conceptual approach to how the sound is processed?
I am a big fan of hardware gear for one specific reason - namely, the haptic feeling when producing music of any kind. Using a digital audio workstation instead of tapes or software instruments instead of analog synthesizers might sound cheap and comfortable, but working on a screen draws a lot of your attention to the visual feedback when you are actually supposed to be listening. I currently use two USB MIDI controllers - a Behringer BCF2000 and a Novation Nocturn. They both have their individual pros and cons, and it turned out to be reasonable for me to use the Nocturn as a global volume control and the BCF2000 for detailed adjustments. Together with the laptop and sound card, they offer a lot of possibilities while still remaining lightweight - which is perfect for performing in different locations. At the moment, I am planning to implement the hi object for controlling the Super-Looper patch with a customized external computer keyboard. This will give me the greatest flexibility for using the patch in the smallest places.
The Super-Looper seems to be a refined and well thought-out instrument for live performance, and you've said that you will extend or add to the basic design based on what you want to do. Can you talk a little bit about the direction those extensions have tended to take - different post-processing? modifications to the basic Super-Looper functionality in some way? More/fewer channels?
For me, the Super-Looper is an instrument that you have to learn. I am still in the process of getting to know it better and exploring the possibilities it offers.
Sometimes this makes me realize shortcomings, such as the lack of a start/stop function. Recording an audio file live was achieved with an individual MIDI CC message for each channel, triggered from buttons on the controller, but there were situations when using this function by hand was not comfortable. So I added the option to activate one or more channels and start/stop the recording with a trigger pedal.
Another example would be a concert that I played with more than one musician. The original version of the patch was designed to draw all live recorded material from audio input 1, so I had to add a little splitter that allowed me to choose from different sound sources. But these are just minor changes, really.
I was overwhelmed by the interest from the Max community in this patch and also received some advice on how to improve it. Among the things I want to include in a future version are a display of the audio file using waveform~, the option to choose from different audio inputs as mentioned above, and the possibility to record the performance in an eight-channel audio file with an individual channel for each of the loopers. I am also currently thinking about replacing two or more of the eight layers with a recursive delay system - feeding the output of tapout~ back into tapin~ leads to an effect very similar to looping, and it might provide an additional source for creating interesting sounds.
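The tapout~-into-tapin~ idea he mentions behaves like a classic feedback delay line: each input repeats every delay period, decaying by the feedback factor, which is why it resembles looping. A minimal Python sketch of that recurrence (names and parameters are mine, not from the patch):

```python
def feedback_delay(signal, delay, feedback, n_out):
    """Feedback delay line: y[n] = x[n] + feedback * y[n - delay].
    Analogous to feeding tapout~ back into tapin~ - with no fresh input,
    an impulse echoes every `delay` samples, scaled by `feedback` each pass."""
    y = []
    for i in range(n_out):
        x = signal[i] if i < len(signal) else 0.0
        fb = y[i - delay] if i >= delay else 0.0
        y.append(x + feedback * fb)
    return y
```

With feedback close to 1.0 the echoes decay very slowly and the delay line effectively holds a loop; lower feedback values let material gradually fade out instead of repeating indefinitely.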
I'm sure that everyone who looks at a clear and readable Max patch begins to imagine modifying it in some way. For example, in my case I started wondering right away about using pattr to either create presets or as a way to record performance gestures as a series of reproducible preset changes, and I also wondered right away about using the new Max 7 time stretching options (which could be applied to both the drone and Super-Looper patches). Have you thought about these kinds of things?
Using pattr sounds interesting, but I personally would only use this to reset all settings on one channel or globally. From my point of view, the Super-Looper does not work with presets - each file you load in requires its own very specific treatment. It would, of course, be possible to produce an audio file, transform it in the patch and store the settings for this sample to reproduce the sounds at a later point. I would be very interested if someone came up with a convincing concept for that, but my personal approach to performing with the patch is never to know in advance what will happen. Planning as little as possible before a live performance made me realize that a maximum of attention and concentration leads to the ideal workflow. While this kind of real-time composition is exhausting, it is very different from working in the comfortable studio atmosphere. Besides implementing preset storage, you suggested adding time stretching as an audio effect. I really like the stretch~ object in Max, but I think that its calculations are too CPU-intensive and unpredictable for use in a live performance. In particular, greater stretching and shifting applied to larger files can take several minutes, at the expense of flexibility. I try to avoid detached calculations. The audio material should be recorded or loaded as a file and be directly ready for further transformations. For that reason, I would refrain from using this object in improvised live situations. On the other hand, it might be interesting to include the pitch shifter that comes with Max. In combination with the variable playback speed of the Super-Looper you can achieve effects similar to time stretching in real time.
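The combination described at the end - variable playback speed plus a compensating pitch shift - works because changing the speed scales duration and pitch together, so shifting the pitch back by the inverse factor leaves only the time stretch. A small arithmetic sketch of that relationship (the function name is hypothetical):

```python
import math

def stretch_factors(speed):
    """Playing a file at `speed` scales its duration by 1/speed and its
    pitch by `speed`. A compensating pitch shift of 1/speed - i.e.
    12 * log2(1/speed) semitones - restores the original pitch, leaving
    a pure real-time time stretch."""
    duration_factor = 1.0 / speed
    shift_semitones = 12.0 * math.log2(1.0 / speed)
    return duration_factor, shift_semitones
```

For example, half-speed playback doubles the duration and needs an upward shift of one octave (12 semitones) to restore the original pitch.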
Where do you see your work headed in the future?
Loops, chance operations and playback speeds have become central aspects of my musical creation, while improvisation and performance are only a minor part. In the future, I want to focus more on so-called "traditional composition". But even there the patch for creating drones and the Super-Looper can be of great use when designing sounds or transforming audio for later implementation in a work with fixed media.