Articles

An Interview with Bruno Zamborlin

When I first saw videos of Bruno Zamborlin’s Mogees project, I thought "cool, someone has made an app for that old contact-mic trick." I’ve seen plenty of performances done with contact mics, and drummers have been triggering sequencer patterns from pads since the '80s. But then I began watching the expressions of wonderment and joy on the faces of those trying out Mogees. From non-musicians to experienced percussionists, that look of entrancement revealed that there was something much deeper going on here. Mogees gives anyone nearly transparent access to a percussive world of the sounds and objects around them while hiding the incredibly complex algorithms it takes to get them there. That's when I realized that its real magic is in taking some pretty esoteric “computer music” technology and making it resonate with people on a street level.

Bruno, can you please explain, what is Mogees?

I came up with this technology that I call Mogees.  The idea is to turn every physical object into a musical instrument.  To somehow inject musicality into the acoustic properties of the objects that are around us.

The technology is composed of a piezo transducer sensor and software. The idea is that the piezo transducer transforms the vibrations that we make when we touch the object into an audio signal that is then sent to the computer or to the phone.

Together with my friend Carmine Emanuele Cella, I wrote the software that analyses these vibrations and extracts some meaningful information such as frequency, amplitude, time decay and so on. It then applies machine learning techniques to estimate how we are interacting with the object in real time and to decide which note to play.
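To make that pipeline concrete, here is a minimal sketch of the analyse-then-classify idea. Everything in it (the names, the choice of descriptors, the nearest-neighbour classifier) is a hypothetical illustration, not Mogees' actual code: extract a few descriptors from a buffer of piezo vibrations, then pick the closest match among touches learned in a training phase.

```cpp
// Hypothetical sketch of the analyse-then-classify pipeline described above;
// none of these names or feature choices come from Mogees itself.
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

struct Features {
    float amplitude;  // RMS level of the buffer
    float brightness; // zero-crossing rate, a crude frequency estimate
    float decay;      // late-to-early energy ratio, a rough decay measure
};

// Extract descriptors from one buffer of piezo vibrations.
Features extract(const std::vector<float>& buf, float sampleRate) {
    float energy = 0.0f, early = 0.0f, late = 0.0f;
    int crossings = 0;
    for (std::size_t i = 0; i < buf.size(); ++i) {
        float s2 = buf[i] * buf[i];
        energy += s2;
        (i < buf.size() / 2 ? early : late) += s2;
        if (i > 0 && (buf[i] >= 0.0f) != (buf[i - 1] >= 0.0f)) ++crossings;
    }
    return { std::sqrt(energy / buf.size()),
             crossings * sampleRate / (2.0f * buf.size()),
             late / (early + 1e-9f) };
}

// Nearest neighbour over touches recorded during a training phase:
// the index of the closest one decides which note to play.
int classify(const Features& f, const std::vector<Features>& trained) {
    auto sq = [](float v) { return v * v; };
    int best = -1;
    float bestDist = std::numeric_limits<float>::max();
    for (std::size_t i = 0; i < trained.size(); ++i) {
        const Features& t = trained[i];
        float d = sq(f.amplitude - t.amplitude)
                + sq(f.brightness - t.brightness)
                + sq(f.decay - t.decay);
        if (d < bestDist) { bestDist = d; best = int(i); }
    }
    return best;
}
```

A real system would use richer spectral descriptors and a more robust classifier, but the shape of the problem is the same.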

In terms of sound synthesis, Carmine and I employ a technique that we call ‘physically-inspired’, which means that Mogees sounds something like a physical object, and the sound engine is fed by the real vibrations of the object. But instead of attempting to recreate exactly the behavior of a real-world object with complex physical equations, we create a virtual one based on musical rules.
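One way to read ‘physically-inspired’ is as excitation plus resonance: the object's raw vibrations drive a bank of resonators whose tunings follow musical rules rather than physical equations. The sketch below is my own illustration under that assumption, not Mogees' actual engine.

```cpp
// My own reading of "physically-inspired", not Mogees' actual engine: the raw
// piezo signal excites a small bank of resonators tuned to musically chosen
// pitches, so the output keeps the object's dynamics but follows musical rules.
#include <cmath>
#include <vector>

// Classic two-pole resonator: a single damped mode at a chosen frequency.
struct Resonator {
    float a1 = 0, a2 = 0, gain = 0, y1 = 0, y2 = 0;
    void tune(float freqHz, float decaySec, float sampleRate) {
        const float PI = 3.14159265f;
        float r = std::exp(-1.0f / (decaySec * sampleRate)); // pole radius
        a1 = 2.0f * r * std::cos(2.0f * PI * freqHz / sampleRate);
        a2 = -r * r;
        gain = 1.0f - r; // rough level compensation
    }
    float process(float x) { // y[n] = g*x[n] + a1*y[n-1] + a2*y[n-2]
        float y = gain * x + a1 * y1 + a2 * y2;
        y2 = y1;
        y1 = y;
        return y;
    }
};

// Per-sample: feed the object's vibration into every resonator and mix.
float processSample(float input, std::vector<Resonator>& bank) {
    float out = 0.0f;
    for (auto& r : bank) out += r.process(input);
    return out;
}
```

Because the input is the object's real vibration, striking a hard table or a springy branch excites the same resonators in audibly different ways, which is what gives the result its ‘physical’ feel.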

Can you give us a little background on yourself?

I’m 30 years old. I’m Italian. I studied computer science in Italy, and then I moved to Paris, where I worked at IRCAM, which is the music research center where Max/MSP was born.

I worked there for a little more than three years as a developer before starting my PhD between IRCAM and Goldsmiths.

What kind of work were you involved with at IRCAM?

Before IRCAM, I was already familiar with Max. I learned it a bit by myself and then by working at an institute in Florence called Centro Tempo Reale, founded by the composer Luciano Berio.

But my big education came when I moved over to IRCAM.  That’s where I really started specifically studying gesture analysis and audio synthesis in a much more scientific way.

The project I worked on there for three years is called Gesture Follower.  It’s a project by Frédéric Bevilacqua, who is the head of the real-time musical interaction team at IRCAM and would later become one of my PhD supervisors.

He taught me everything I know about Max, gesture recognition and interaction design. He and Norbert Schnell really have been my mentors the whole time; they were like parents to me, really.

So the Gesture Follower existed already by the time I got there, but it was a Max patch that used a lot of FTM, which is a bundle of externals for Max/MSP developed at IRCAM by Norbert Schnell.

It was a big, complex patch, quite hard to decode and to understand. Basically my job, at the beginning at least, was to port all this code over to C++, compiled into one single Max object called gf.

I did that during the first six months of the project, and then for the rest of the time I’ve been improving the technology, adding new features, and working a lot with artists and musicians.

Can you give us an example of how the Gesture Follower works?

Sure. First, you can use any sensor you want to actually capture these gestures. It can be an accelerometer, a gyroscope, a video camera, an audio signal, et cetera. Then you basically teach the system a bunch of gestures that you want it to learn.

So, let’s say you teach it five gestures. Then, when you perform, the system will tell you how similar you are to each of the gestures you recorded previously. So it tells you, “I think your gesture is 80 percent similar to the first one and 20 percent to the second one,” for example.

And it also tracks the temporal position, so it tells you where you are in the gesture — at the beginning, middle, or the end of the gesture.  Then you can use this information to time-stretch audio files and video files and control them through your gestural performance.
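The actual Gesture Follower is built on probabilistic models (hidden Markov models), but its two outputs, a similarity score per recorded gesture and a position within it, can be illustrated with the much cruder dynamic time warping sketched below. This is an illustration, not IRCAM's implementation, and it assumes each gesture has been reduced to a one-dimensional sequence of feature frames.

```cpp
// Illustration of the two outputs described above: a similarity score against
// each recorded template and an estimated position within it. The real
// Gesture Follower uses hidden Markov models; this sketch substitutes plain
// dynamic time warping on one-dimensional feature frames to show the idea.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

struct Match {
    float distance; // lower = more similar
    float position; // 0..1, estimated progress through the template
};

// Align the frames observed so far against one recorded template gesture.
Match followGesture(const std::vector<float>& observed,
                    const std::vector<float>& templ) {
    const std::size_t n = observed.size(), m = templ.size();
    const float INF = std::numeric_limits<float>::max();
    std::vector<std::vector<float>> cost(n + 1, std::vector<float>(m + 1, INF));
    cost[0][0] = 0.0f;
    for (std::size_t i = 1; i <= n; ++i)
        for (std::size_t j = 1; j <= m; ++j) {
            float d = std::fabs(observed[i - 1] - templ[j - 1]);
            cost[i][j] = d + std::min({ cost[i - 1][j],       // input advances alone
                                        cost[i][j - 1],       // template advances alone
                                        cost[i - 1][j - 1] });// both advance
        }
    // The gesture may still be unfolding, so take the cheapest endpoint in the
    // final row; its column is the estimated temporal position in the template.
    std::size_t bestJ = 1;
    for (std::size_t j = 1; j <= m; ++j)
        if (cost[n][j] < cost[n][bestJ]) bestJ = j;
    return { cost[n][bestJ] / float(n), float(bestJ) / float(m) };
}
```

Calling followGesture once per template yields the per-gesture similarities Bruno describes, and the position estimate is what you would map onto the playback point of an audio or video file.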

Violinist Mari Kimura is the only person I’m aware of who used the Gesture Follower by training the gestures into the system in real time, during the performance itself. She was training the Gesture Follower during the concert and then using it as a sort of looper whose tempo was perfectly in sync with her performance. It was quite impressive. She is quite a performer.

And then we did a few sound installations, most of them using the Gesture Follower.  It was actually quite interesting to develop one technology and then apply it to so many different domains.  I used it for dance, percussion, trumpet, interactive installations in a park — different contexts, but all using the same core technology.

That's similar to what I’m doing with Mogees.  If you watch the videos, they are all quite different but they actually rely on exactly the same technology.

So, you're based in England now.  How did that come about?

After three years working at IRCAM, I wanted to do a PhD. I got a full-time scholarship in London at Goldsmiths, University of London. I managed to do a joint PhD in Arts and Computer Science, so I had two supervisors at IRCAM's Real-Time Musical Interactions team and two supervisors at Goldsmiths' Embodied AudioVisual Group.

So Frédéric Bevilacqua had been my boss for three years at IRCAM, and then became a supervisor of my PhD together with the two supervisors at Goldsmiths.

That's a lot of work!

I was going back and forth from Paris to London on the Eurostar train, every month for years!

How did you ever have time to develop Mogees?

Well, around the second year of the PhD, I came up with the basic idea for Mogees and started using it in my own gigs, my own performances. Then, when I started uploading my videos to YouTube, they got quite a few views.
The very first video that I did was in 2012, and it’s the one that basically convinced me to start a company and make Mogees my full-time activity for the last two years.

That first video was actually made with a prototype that I developed in Max/MSP. I remember that I uploaded the video and then went back to Italy for the holidays. When I got back to London about a week later, there were almost 300,000 views on YouTube, and my inbox was full of emails: “Can I buy one?” or “Can I help you sell it? Can we commercialize it?”

At that point I decided, OK, I really want to try to turn this from a Max patch that worked perfectly but was kind of complex into something that was super simple. So I started porting it to an iPhone app.

It was really important for me to make it work on the iPhone. The whole point of Mogees is working in the street, finding an object that you like, sticking the microphone onto it, and then starting to play that object. So being mobile was definitely a requirement.

But basically everything started from that video, really.  And that’s one of the reasons why I still haven't finished my PhD!  [Laughs]

So, you originally designed and prototyped Mogees in Max?

Yes. But the final version is for iPhone, so it’s pure C++ and Objective-C code.

How did you manage start-up costs?

I founded a limited company, and basically at the beginning I got some funding from my parents and friends. A few months later, David [Zicarelli] decided to support the project and became one of the directors of the company.
Then we decided to do a Kickstarter in March, which lasted 30 days, and it went really, really well.  We smashed our target; we got like $160,000, which was excellent.  We presold 1,635 units.  That really was the point where basically everyone said, “OK, let’s go ahead with this.”

So we placed an order with the factory. The app is ready. But, of course, we’re continually trying to improve it and to add new functions. We are hoping to start shipping in early July for the beta testers, and then maybe the end of August for everyone else, on iOS for now. Then we’ll try to do the porting to Android, which is a pain, but we have to do it.

Are you still using the Mogees for your own performance work?

I play with a band called Plaid; they're on Warp Records. We play quite regularly together. For our concerts, we use Max for Live, so all the C++ code that is used for the iPhone app is also wrapped into a single, nice Max external called mogees~. That’s what we use live. It’s super experimental and a bit buggy. It’s for internal usage only.

Any plans for releasing that external?

Well, for now I just don’t have the time to finish and support it.  It would be too much work right now with everything going on, but I hope to one day!

When designing your own objects, were there any other objects that became vital to what you’re doing?

Well, I use Mubu a lot, which is a small set of Max externals developed at IRCAM, used basically for sound descriptors and motion-capture data. I used them a lot on this project.

What first inspired you to learn Max?

I studied computer science, so I was pretty good with C++, and I loved music. I was really into electronic music, old electronic stuff, kind of experimental stuff. I wanted to make music myself, but when I used commercial sequencers like Cubase or Pro Tools, I always found so many things that I couldn’t do.

Like every time I used a MIDI mixer, there were always things that I just could not figure out how to do in the software. And it’s not possible to hack this kind of software, because it’s too closed. You can’t just change it.

I was looking for a solution. First, I spent maybe less than a month with Csound. But I found it really boring, mainly because the process was really slow, because it was not interactive. You need to compile, and then you listen to the sound that you generated. It was very far from what I wanted to do.

Then, when I discovered Max, it was unbelievable. I remember I fell in love, mainly because it was not about reinventing the wheel. I didn’t want to design a new filter, for example. I didn’t want to waste time writing all the bricks that I needed; I wanted to spend time on the idea, on the interaction design. Max was just perfect. It was just so fast. Everything was so much faster than writing in C.

So for prototyping it was just perfect. It was a great discovery, and I have never left Max since then.

What’s your patching style like?

Super-organized. I’m very computer-science oriented. I'm very thorough with all the comments. I could reuse a patch that I wrote ten years ago, easily. That’s the only way you can write code in C, too: if you’re not organized with your comments, then you’re never going to understand them years later.

Do you use any hardware controllers?

Theremins! Actually, my first project with Max was a prepared theremin. I made a few using these little boxes from IKEA, building a theremin with an antenna inside each one.

I used a bunch of these theremins, like five or six, independently. The audio that was coming into the sound card was used as a control signal to actually control other parameters in Max. That was my very, very first Max project, back in 2004.
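That audio-as-control trick is simple to sketch. The snippet below is a hypothetical illustration (the original was a Max patch, not C++): rectify the incoming theremin signal and smooth it with a one-pole low-pass, and the resulting envelope can be mapped onto any other parameter.

```cpp
// Hypothetical sketch of using audio as a control signal: rectify the
// incoming theremin audio and smooth it with a one-pole low-pass, yielding
// a slowly varying envelope that can drive any other parameter.
#include <cmath>

struct EnvelopeFollower {
    float env = 0.0f;
    float coeff; // per-sample smoothing factor, 0..1
    EnvelopeFollower(float smoothingMs, float sampleRate)
        : coeff(std::exp(-1.0f / (smoothingMs * 0.001f * sampleRate))) {}
    float process(float sample) {
        float x = std::fabs(sample);            // rectify
        env = coeff * env + (1.0f - coeff) * x; // one-pole low-pass
        return env;                             // use as a control value
    }
};
```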

What’s your favorite object to use the Mogees on?

Everything that is big.  Everything that is complex.  I like when you find a complex shape, like a tree, for example, where the branches are quite thin, so the vibrations can work really well.  And with many branches, you have a lot of different frequencies in the tree.

It must be fascinating to watch what other performers are doing with your technology.

It really is! I just saw Rodrigo y Gabriela using Mogees in their show at the Royal Albert Hall in London, and that was really satisfying. Imogen Heap is also using Mogees for various projects.

Mogees Website

Text interview by Marsha Vdovin and Ron MacLeod for Cycling '74.

by Marsha Vdovin on June 23, 2014

Stephane Morisse

Hi Bruno, this looks like an amazing project. Any plan to port it to iPad? (I don't have an iPhone and can't afford to buy one...)

vichug

Super interesting and in-depth, thanks!

Christoph Mann

Any chance to get a release of the gf object?

A Gaffney

Hi from much later! I'm looking for a copy of the Mogees VST software for use with a newly purchased sensor, but it appears the mogees.co.uk website and accompanying pages have been down since the product's discontinuance.

I have a new and unused registration code; hoping to use the sensor + software for an upcoming theatre project and would like to take advantage of the VST program in my patching.

Would anyone be willing to share their software copy?