Intuitive design ideas for use with the Wiimote?
Although far from professional, I’ve created a (Win32) external that exposes several Wiimote features to Max, including:
- Nunchuck (accelerometer, analog stick, buttons)
- Balance Board (untested)
- IR sensor (untested)
For testing purposes (as well as for fun), I’ve decided to create a MIDI synth controlled by this external. But I’m drawing a blank on the question: what should control what?
My first thought was to use the Wiimote’s pitch (tilt) to control the pitch/frequency of a note, and have the A button trigger note-ons/offs. But then I realized how difficult it would be to accurately cover more than one or two octaves at a time: you would have to bend your arm backwards to reach the outer notes, or else widen the range to the point where hitting a specific note becomes much harder.
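One idea I’ve been toying with for that accuracy problem: instead of mapping tilt continuously to frequency, quantize it to a scale, so small aiming errors still land on a usable note. A minimal sketch in C (the function name, the pentatonic choice, and the -90..+90 degree range are all illustrative assumptions, not part of the external):

```c
#include <assert.h>

/* C major pentatonic offsets within one octave (illustrative choice). */
static const int kPentatonic[] = {0, 2, 4, 7, 9};

/* Map a tilt angle in degrees (-90..+90) onto `octaves` octaves of a
   pentatonic scale starting at `base_note` (a MIDI note number).
   Quantizing to 5 degrees per octave makes each note a wider "target"
   than a straight chromatic mapping would. */
int tilt_to_midi_note(double tilt_deg, int base_note, int octaves)
{
    if (tilt_deg < -90.0) tilt_deg = -90.0;
    if (tilt_deg >  90.0) tilt_deg =  90.0;

    int steps = octaves * 5;                /* 5 scale degrees per octave */
    double t = (tilt_deg + 90.0) / 180.0;   /* normalize to 0..1          */
    int idx = (int)(t * steps);
    if (idx >= steps) idx = steps - 1;

    return base_note + 12 * (idx / 5) + kPentatonic[idx % 5];
}
```

With two octaves from middle C, horizontal (0 degrees) lands on the octave above the base note, and the arm only has to resolve ten positions instead of 25 chromatic ones.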
After a few more design fails, I decided to ask the community what your thoughts were. How would you prefer to have a Wiimote (+nunchuck?) control your musical performances? Possible things to consider are:
- Musical pitch
- Oscillator types
- Filters (EQ, for example)
- Loop-record / Overdub
- Audio Samples
- (Your thoughts here)…
I’d appreciate any sort of advice from anyone, thanks!
Would definitely be into something that improved localization, i.e. the IR aspect; a smooth data transfer to Max would be awesome. I use the Wiimote for spatialising in 3D as well as pitch changes, etc.
the number of responses to this question is inversely proportional to the number of possible solutions! :)
I would begin by deconstructing familiar (and not so familiar) musical ‘gestures’ such as bowing a violin, plucking a guitar string, playing a theremin or Eigenharp, to determine what the more intuitive sound parameters are and how they are mapped to a given gesture feature.
You have begun with a good list of control-gesture input features (Wiimote) and sound-parameter output (synth controls); now decide on a hierarchy of sound features – which are primary (eg, pitch, loudness) and which are secondary (filter Q?). Try modelling a conventional instrument first perhaps (eg, wind).
Of course, digital technology allows us to bend and break existing mapping strategies, so the mapping strategy is entirely up to you. And do please keep us posted on your progress.
Miranda and Wanderley’s "New Digital Musical Instruments: Control and Interaction Beyond the Keyboard" is a good place to start.
I just tested the IR capabilities and they appear to be unresponsive at the moment, so it may take some time before that feature becomes available. Also, I’ll likely need to buy a separate (possibly USB-powered) sensor bar in order to gain more portability and accuracy.
Yeah, I had hoped for a little more feedback than what has been offered thus far… =P
I can technically "record" state changes, but this is currently a native feature within the external itself and has yet to be extended into the realm of Max, since I haven’t figured out a decent way to make these recorded gestures "dynamic". In other words, right now a gesture would need to be performed exactly as originally recorded.

I also haven’t found an elegant way to let Max users specify exactly what to record (accel, pitch/roll, buttons, a certain combination of each, etc.). If anyone can suggest how they’d like this handled from within Max, I could easily update the external to include such a feature, and then work out precision adjustments in a way that wouldn’t conflict with other recorded gestures. Perhaps I could even include a standard set of gestures sometime down the road, or a separate "wiigesture" external that could plug in to this one?
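As a rough sketch of what "dynamic" could mean: time-normalize a recorded one-axis gesture (say, the pitch stream) by resampling it to a fixed length, then score a live gesture against it with mean squared error and accept anything under a threshold, so a performance matches the recording without being sample-for-sample identical. None of this is in the external yet; the names, the fixed length, and linear interpolation are all assumptions for illustration:

```c
#include <stddef.h>

#define GESTURE_LEN 16  /* illustrative fixed comparison length */

/* Linearly resample `in` (n >= 2 samples) into `out` (GESTURE_LEN samples),
   so gestures of different durations become comparable. */
void resample(const double *in, size_t n, double *out)
{
    for (size_t i = 0; i < GESTURE_LEN; i++) {
        double pos  = (double)i * (double)(n - 1) / (double)(GESTURE_LEN - 1);
        size_t lo   = (size_t)pos;
        double frac = pos - (double)lo;
        size_t hi   = (lo + 1 < n) ? lo + 1 : lo;
        out[i] = in[lo] * (1.0 - frac) + in[hi] * frac;
    }
}

/* Mean squared error between two time-normalized gestures; a match
   would be any score below some tunable threshold. */
double gesture_mse(const double *a, const double *b)
{
    double sum = 0.0;
    for (size_t i = 0; i < GESTURE_LEN; i++) {
        double d = a[i] - b[i];
        sum += d * d;
    }
    return sum / GESTURE_LEN;
}
```

A fancier version could use dynamic time warping instead of fixed-length resampling, but even this much would stop gestures from needing an exact replay.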
I like where this is going so far. Keep the suggestions coming, and I’ll keep on coding this external in a fashion that suits the community! =)