My Wii Life…


My friend Mike Metlay recently asked me to perform with him on a live StillStream webcast scheduled for June 30, 2007. I enjoy working with him, and this sounded like an opportunity to get out of the “office” and work up a decent performance system. Rather than using one of my normal systems, I decided to build an instrument from scratch, and also chose to put some specific strictures in place, hoping that the limitations would help spur a new creative direction.

Goals

My limitations and development goals were pretty specific:

  1. I wanted to make a system for interactive performance, rather than for setup-and-run.
  2. I wanted to have the system be reactive, so that I’d be able to play off Mike’s playing.
  3. I wanted it to be as portable as possible, since I’d be playing off-site (where I could get high-quality Internet service).
  4. I wanted the system to be based around a step sequencer — a familiar and comfortable tool for me.

I’d been reading a lot about people using the Wii Remote Control as a performance controller, and found a number of tools already available that would let me use it with my MacBook Pro. After trying several options, I decided to combine Masayuki Akamatsu’s aka.wiiremote Max object (available at http://www.iamas.ac.jp/~aka/max/) with some Max/MSP programming, and use them to drive a synthesizer from Cycling ‘74’s MODE.

Using the aka.wiiremote object was simplicity itself; the object’s help file is a robust display of its capabilities. Like many users, I had trouble getting a consistent connection with the Wii Remote, but this appears to be related to the Bluetooth implementation on the MacBook Pro. I was hoping to use the add-on Nunchuk controller with the Wii Remote, but I never got a good connection, so it sits in the desk drawer waiting for a far future date when I can put on my Ninja suit and do some serious debugging and meditation.

Wii Implementation

Developing and implementing a system using the Wii Remote as a musical controller was a challenging programming problem. If you aren’t familiar with the device, it is a small handheld unit that connects to the computer via Bluetooth, and features a trigger, a 4-way directional button and several other utility buttons. It also has gestural output based on movement of the controller — something that would provide the physicality and reactivity I wanted for this system. I needed to build a performance application, then map the trigger, button and gesture outputs to control the system.

The first step was to create the application that would interface with the Wii Remote. My application had several parts:

  1. I built a 32-step sequencer (using the multislider UI object) with some randomization and displays for the current step selection and loop points, and tied it all together with a bit of javascript programming to maintain the current state of my application.
  2. The output was sent to another bit of javascript programming that would apply a musical “key” to the notes (there’s a sketch of this idea after the list).
  3. The result was then sent to the mode.mono VST softsynth, and from there to the mode.wash-1chan audio plug-in for delay processing. I chose these two VST devices because I’m very familiar with them, and they would be easy to manipulate in a live performance.
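
To make the “key” step concrete, here is a minimal javascript sketch of the kind of scale quantizing I mean; the scale table and message handling below are illustrative, not the literal contents of my patch:

    // quantize.js -- minimal sketch of the "key" step (hypothetical names)
    // Raw step values are forced onto a major-scale pitch set before being
    // passed along as MIDI note numbers.
    var scale = [0, 2, 4, 5, 7, 9, 11];         // C major, as semitone offsets

    function msg_int(raw)                        // raw pitch from the current step
    {
        var octave = Math.floor(raw / 12) * 12;
        var degree = raw % 12;
        var snapped = scale[0];                  // snap the pitch class down to the
        for (var i = 0; i < scale.length; i++)   // nearest scale tone at or below it
            if (scale[i] <= degree) snapped = scale[i];
        outlet(0, octave + snapped);
    }

Snapping every raw value onto a scale tone means the multislider can stay completely free-form while the audible result always sits in the chosen key.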

My first mapping was for the trigger. Since I wanted to perform (rather than just launch) my sequence, I decided to use the trigger to step through my sequencer. While this is great for “playing” the sequence, I had something else in mind as well: I wanted to be able to fire off stepped playback in the ways that are typical of step sequencers. I decided to use the “A” button on the Wii Remote for this, since it is easily accessible and could be quickly switched on and off. Tying this to a metro object allowed me to switch between clocked and manual sequence triggering.
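
In the patch itself this switching is just a metro and a gate, but the same idea can be sketched in javascript using the js Task object; the trigger and a_button message names below are stand-ins for whatever your Wii Remote parsing actually sends:

    // step_clock.js -- hypothetical sketch of manual vs. clocked stepping
    var clock = new Task(step, this);   // Max js Task object
    clock.interval = 250;               // clocked tempo, in milliseconds

    function trigger()                  // Wii trigger: advance one step by hand
    {
        step();
    }

    function a_button(state)            // "A" button: 1 starts clocked playback, 0 stops it
    {
        if (state) clock.repeat();
        else clock.cancel();
    }

    function step()
    {
        outlet(0, "bang");              // tell the sequencer to move to the next step
    }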

The 4-way switch was a natural for the step selection and transposition functions. I use the left/right switches to move horizontally through the sequence (i.e., to select notes or to pick positions for loop points), and I use the up/down switches to move vertically, transposing the sequence in half-steps. It’s worth noting that I perform the transposition before the javascript scaling function so that I can do in-key transposition during the performance. This proved to be an important way to create movement in the otherwise static output of the step sequencer.
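
A hypothetical sketch of that ordering, reusing the same kind of scale table as above:

    // transpose.js -- hypothetical sketch: transpose first, quantize second
    var scale = [0, 2, 4, 5, 7, 9, 11];         // major-scale table, as before
    var transpose = 0;                           // set by the up/down switches, in half-steps

    function up()   { transpose++; }             // up on the 4-way switch
    function down() { transpose--; }             // down on the 4-way switch

    function msg_int(raw)                        // raw pitch from the current step
    {
        var p = raw + transpose;                 // offset the un-quantized value...
        var octave = Math.floor(p / 12) * 12;
        var degree = p % 12;
        var snapped = scale[0];
        for (var i = 0; i < scale.length; i++)
            if (scale[i] <= degree) snapped = scale[i];
        outlet(0, octave + snapped);             // ...so the result is always in key
    }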

The “+” and “-” keys were naturals for use as direction controls; the plus key sets (or returns to) forward stepping, and the minus key sets (or returns to) reverse stepping. These mappings are particularly useful in performance, since forward-to-reverse switching can add interest to small sequence loops. The “Home” key is my “reset” button: it eliminates all loop points and returns the current step position to the first step. Since the button is somewhat recessed, I don’t have to worry about accidentally hitting it during a good performance.

The “1” and “2” switches are rather inconveniently placed (in my hand, they sit underneath the base of my thumb), so I wanted to use them for settings that don’t change often. I decided to map them to the loop start and end position, since I don’t change those very often in the heat of performance.
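
Taken together, the direction, loop-point and reset mappings are just a little bookkeeping around the current step position. Here is a hypothetical sketch; the message names are made up, not what the patch literally uses:

    // position.js -- hypothetical sketch of the step-position bookkeeping
    var NUM_STEPS = 32;
    var pos = 0;
    var dir = 1;                              // 1 = forward ("+"), -1 = reverse ("-")
    var loopStart = 0;
    var loopEnd = NUM_STEPS - 1;

    function step()                           // from the trigger or the clock
    {
        pos += dir;
        if (pos > loopEnd)   pos = loopStart; // wrap forward around the loop
        if (pos < loopStart) pos = loopEnd;   // wrap backward around the loop
        outlet(0, pos);
    }

    function plus()  { dir = 1; }             // "+" key: forward stepping
    function minus() { dir = -1; }            // "-" key: reverse stepping

    function one() { loopStart = pos; }       // "1" button: loop start at current step
    function two() { loopEnd = pos; }         // "2" button: loop end at current step

    function home()                           // "Home": clear loops, back to step one
    {
        loopStart = 0;
        loopEnd = NUM_STEPS - 1;
        pos = 0;
        outlet(0, pos);
    }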

Finally, there is the gesture control. I was originally going to do extensive mapping of the controller movements to application parameters, but I ran into a few problems while developing my patch:

  • The accelerometers aren’t very precise, so gesture control wasn’t a good candidate for setting detailed parameters such as step values.
  • Since the rotational control has only four output values, it didn’t really map to anything useful at all.
  • Multi-dimensional control got too confusing, and I didn’t want to spend all my time worrying about dimensional space!

After some thought and experimentation, I decided to go with a single gesture mapping technique in which left-to-right transitions would control the velocity output of the generated notes. This turned out to be even more interesting than I had first thought: since nobody worries about exact velocity output, most synthesizers are tolerant of “close enough” note velocities. In addition, the mode.mono synthesizer includes velocity control of the filter settings, so there was a natural connection between the performer’s gesture and the synthesizer’s output. By forgoing any other gesture mapping, I knew that a sweep of my arm would affect only a single parameter.
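
The mapping itself is tiny. Here is a sketch, assuming the left/right (x-axis) acceleration arrives already normalized to a 0. to 1. range; the actual range and scaling in my patch aren’t reproduced here:

    // velocity.js -- hypothetical sketch of the accelerometer-to-velocity mapping,
    // assuming the x-axis acceleration is already normalized to the 0.-1. range
    function accel_x(x)
    {
        var clamped = Math.min(Math.max(x, 0), 1);
        var vel = 10 + Math.round(clamped * 117);   // keep velocities in a 10-127 range
        outlet(0, "velocity", vel);                 // mode.mono maps velocity onto its filter
    }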


The Playing

As always, the proof is in the playing, and I’ve found that the patcher/Wii connection I’ve described here is a very controllable, playable performance instrument. And it’s also quite a lot of fun. While I need to use the computer keyboard for some initial setup (the steps, the scale and the VST settings), most of the actual performance can be handled with the Wii Remote alone. I find that I’m able to discover interesting phrases using the 4-way switch, trigger and loop settings, to use them as playable sub-sequences, and to alter them with directional and transpositional adjustments.

The patch/instrument I created not only met the goals and stayed within the limitations I set for myself, it’s also enjoyable to work with. If you’d like to hear my patch in action, please tune in to the June 30th StillStream webcast, or check for archived recordings of the result if you’re reading this after the date of the performance. Hopefully, it will turn out as well as (or better than) planned.

Download the Max 4.6 performance patch.

(Thanks to Masayuki Akamatsu for his work on the aka.wiiremote object; without that work, none of this would have been tackled!)