The Wavefondler Max/MSP patch was developed to provide a unique form of control: using your fingers to touch the waveform, potentially with a number of iPads running simultaneously.
There is broad acceptance of the benefits of multi-touch (MT) for performance and mixing operations, but direct interaction with an audiovisual transformation on a desktop/laptop is yet to come of age; at the time of writing there is no way of using MT control on a visualization of audio that resides on an Apple Mac computer. The release of Mira in summer 2013 offered new possibilities for implementing this. Mira can display a number of Max/MSP objects; however, it does not currently include the waveform~ object, which is normally used within Max/MSP for audio visualization. As such, Mira did not represent a complete solution in itself. Mathieu Chamagne, though, produced an innovative Jitter-based abstraction (based on his MMF-Fantastick) that represents the audio waveform using the multislider object, an object that can be displayed on the iPad using Mira, thus facilitating this sort of control.
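The core idea of driving a multislider as a stand-in waveform display can be illustrated outside Max. The sketch below (in Python, purely for illustration; the function name and parameters are my own, not part of Chamagne's abstraction) reduces an audio buffer to one peak value per slider, which is essentially what a waveform overview needs:

```python
# Sketch: reduce an audio buffer to N peak values, one per "slider",
# mirroring how a waveform overview can be drawn with a generic
# multi-value object when a dedicated waveform display is unavailable.

def waveform_bins(samples, num_sliders):
    """Return num_sliders peak values (max absolute amplitude per bin)."""
    if num_sliders <= 0 or not samples:
        return []
    bin_size = max(1, len(samples) // num_sliders)
    bins = []
    for i in range(num_sliders):
        chunk = samples[i * bin_size:(i + 1) * bin_size]
        bins.append(max(abs(s) for s in chunk) if chunk else 0.0)
    return bins

# Example: an 8-sample buffer reduced to 4 display values
print(waveform_bins([0.0, 0.1, -0.5, 0.2, 0.9, -0.3, 0.4, 0.05], 4))
# -> [0.1, 0.5, 0.9, 0.4]
```

In Max terms, a list like this would simply be sent to a multislider object, which Mira can then mirror on the iPad.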
The video shows the patch in action moving through various modes, mostly glitching beats, although only with a single iPad…
The video only uses a four-bar loop for simplicity, but when running in Max for Live, the patch can work with real-time audio. It works with multiple iPads, but my second iPad, an iPad 2, was too slow at screen redrawing to feature in this video.
This patch was first presented at the Innovation in Music conference in York, UK in December 2013, a great place to publish interesting patches. The next one is in Cambridge, UK next spring.
As Max or M4L in conjunction with Mira
Do you remember the first Max patch you ever made? What was it?
I developed an algorithm that could remove microphone spillage. To test it, I created a Max patch; there was lots of sample-accurate multi-channel work to be done, and it also hosted Altiverb. I only had a week to make it and attempt to use it before a conference presentation… It was a huge week though! The patch only half-worked at the time, but one of my postgraduate students, Steve Massey, developed it a couple of years later and proved that it did work. I keep meaning to revisit things and take it to a conclusive stage, but somehow I always seem to get distracted with a new project instead…
How did you come up with this project idea?
I've always wanted to be able to touch sound. I still remember the amazement when I first saw a visualization of audio – just a picture in a magazine, but I knew that it was the future. Of course, we have been mousing over waveforms for ages now, but multi-touch is the way forward for the moment, until everything gets retina and brain-wave driven. I started this as a laptop project actually. Of course, there are loads of iPad apps that host audio out there now, but I think that this is the first chance anyone has had to interact with audio natively on a Mac with multi-touch control of its visualization; even the commercial systems that boast multi-touch only offer single-point control for editing audio waveforms at the moment. Judging by the new GUIs in Logic Pro X, Apple are preparing for the leap to multi-touch across the board.
What sorts of problems did you have to solve?
There were many layers on the GUI – it was hard work trying to develop it, since even in these days of presentation mode, it is easy to get lost and confused during development when overlaying multiple transparent sliders, all of the same size. Not owning an iPad at that point did not help, and Mira was not out either, although even after they were in the mix, there were still issues. It sounds a bit trivial given the complexity that we all typically get into with Max, but I certainly found it fiddly, perhaps because it was the first time that I had done anything like this at all.
If there were one person who you would want to see your project, who would it be?
Isaac Newton, back then of course. I don't think that he had an iPad and it would be the only way I could maybe impress him…
At the conclusion of this project were you:
b) ready to do a new one
c) thinking of ways to expand it
d) [other, please describe]
The project took me quite a long time, but it was off and on over that duration. In the last few weeks, I had to get it ready to demonstrate live at the Innovation in Music conference in York, UK. Hitting this deadline was quite tiring, and I had to shelve loads of functionality that I had been planning. Immediately after the conference, I had to get on with writing the accompanying paper for publication, and then after that other things happened and I lost momentum with this project; I guess that I had made my point though. I am currently planning my next Max project, and this one is pretty exciting – working as part of a team to develop a prototype of a really innovative App. I have a paid gig in London for anyone who speaks Max and is good with Objective-C as well… get in touch if you are interested!