For part of my MA thesis I’m looking at different methods of using 3D accelerometers (such as the one in the ubiquitous Wii Remote) as controllers for Max (or anything else, really). My particular area is movement capture – I’m a drummer – but I’d like a bit of a survey of what else is going on, and figured here would be a useful place to start.
So, if you wouldn’t mind me asking: how are people using their 3D accelerometers? Augmented instruments? Bespoke controllers? And what about the data? How is it fed into Max? To control what parameters? Etc., etc.
Anything useful you can share – patches, links, thoughts, anything at all – would be extremely helpful.
|robin wrote on Mon, 27 April 2009 06:38|
How are people using their 3D accelerometers? Augmented instruments? Bespoke controllers? And what about the data? How is it fed into Max? To control what parameters? Etc., etc.
Lots of ways to use them. One great one besides the Wii is the iPhone; I use it with an app called TouchOSC, but there are several others. It sends the data over UDP as OSC, which is easy to parse in Max. There’s also the LilyPad from Arduino; some people use it for embedded sensors like accelerometers in clothing, for dancers or other movement-based performance. Cheap and relatively easy to work with, though a bit more coding is needed to get the data into Max via the [serial] object. The nice thing is that a basic accelerometer sensor is cheap, but you need to do the wiring etc. yourself, rather than having it nicely packaged like the iPhone or Wii.
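To give a feel for what the TouchOSC data looks like on the wire, here’s a minimal sketch (Python, stdlib only) of decoding a raw OSC message from a UDP packet. In Max you’d just use [udpreceive] and [OSC-route] or similar; this is only to show the format. The `/accxyz` address and the sample values here are illustrative assumptions, not a guaranteed TouchOSC layout.

```python
import struct

def parse_osc_message(data: bytes):
    """Parse a simple OSC message (address + float/int args) from raw bytes."""
    # OSC strings are null-terminated and padded to a multiple of 4 bytes.
    def read_string(buf, pos):
        end = buf.index(b"\x00", pos)
        s = buf[pos:end].decode("ascii")
        pos = end + 1
        pos += (-pos) % 4  # skip the padding bytes
        return s, pos

    address, pos = read_string(data, 0)
    typetags, pos = read_string(data, pos)  # e.g. ",fff" for three floats
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":
            (val,) = struct.unpack(">f", data[pos:pos + 4])  # big-endian float32
            args.append(val)
            pos += 4
        elif tag == "i":
            (val,) = struct.unpack(">i", data[pos:pos + 4])  # big-endian int32
            args.append(val)
            pos += 4
    return address, args

# Build a fake accelerometer packet (x, y, z in g) and parse it back.
packet = b"/accxyz\x00" + b",fff\x00\x00\x00\x00" + struct.pack(">fff", 0.1, -0.5, 0.98)
addr, values = parse_osc_message(packet)
```

The float values come back as float32, so expect tiny rounding differences from what was packed.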
Obviously you can control any parameter you want with it, but intuitive ones would be pitch/volume/timbre of basic synthesis, or maybe three similar video effects like brightness/contrast/saturation. Or two video FX and a crossfade on the third axis. Using it in a 3D space like OpenGL also makes a lot of sense: it could be tied directly to the motion of the camera, or of certain objects.
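Whatever the destination, the recurring chore is mapping a raw axis range onto a parameter range, which is exactly what Max’s [scale] object does. Here’s the same idea sketched outside Max; the -1..1 g input range and the pitch range are just illustrative assumptions:

```python
def scale(value, in_lo, in_hi, out_lo, out_hi, clamp=True):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi],
    like Max's [scale] object, optionally clamping out-of-range input."""
    t = (value - in_lo) / (in_hi - in_lo)
    if clamp:
        t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

# Hypothetical mapping: one accelerometer axis (-1..1 g) to a pitch range 36..84
pitch = scale(0.25, -1.0, 1.0, 36.0, 84.0)  # → 66.0
```

Clamping matters with accelerometers, since a sharp shake can easily exceed the nominal range and you usually don’t want the parameter to overshoot.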
TouchOSC is great and easy to get going, there are several patches on the forum to get the data in, and the range is good, though you need a wireless network to use it. I also like the four toggle switches at the bottom of the screen (on the Simple Layout), since you can then easily have 16 different combinations for where a certain parameter goes. With some more fiddling you can also work out 2-way communication, which is great not just for interface updates, but also for using the features of the Max UI. So for example, there’s an 8×8 grid of toggles that’s kind of tough to use on the phone since they’re so small. But with 2-way communication you can use a preset in Max to store whatever patterns you want, then make small adjustments on the phone. You can also have one of the buttons on the phone simply set everything to default values (again, by triggering a preset in the patch), which would take a long time to do by hand.
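The “four toggles = 16 combinations” trick is just binary encoding: treat each toggle as one bit of a routing index. A quick sketch in Python rather than as a patch (in Max you’d sum the weighted toggle values and feed the result into [route] or [gate]):

```python
def toggles_to_index(toggles):
    """Pack a list of 0/1 toggle states into a single routing index.
    The first toggle is bit 0, the second bit 1, and so on."""
    index = 0
    for bit, state in enumerate(toggles):
        index |= (1 if state else 0) << bit
    return index

# Four toggles give 16 destinations (0-15) for routing a parameter.
destination = toggles_to_index([1, 0, 1, 1])  # → 13 (binary 1101)
```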
The sky’s the limit, especially when you combine the Max UI with it. It’s a lot of fun experimenting with things like gesture recording, so you could look into [mtr] for that. Also, I think there’s a Gesture Recorder on the forum somewhere for this sort of thing; I haven’t used it, but it sounds like it would definitely be worth a look for this application.
If you are using a Mac and interested in getting accelerometer data from a Wii Remote, I suggest looking at the aka.wiimote external for Max, as it handles all Wii-related fun.
I have made a patch that alters the playback speed of a buffer and affects filter and other effects parameters using the accelerometer in the Wii Remote. It’s really easy to do. As ever, getting the data is the easy bit… it’s what you do with it ;P
Otherwise, I think the reply above is very comprehensive.
Seejayjames: Have you had any audience feedback on how the phone works as a controller in a live situation? I have the app, but I’ve never taken it that seriously. Doesn’t it just look like you’re waving a phone around?
Rather than aka.wiimote, I use JunXion, from Frank at STEIM. It’s a wicked bit of software that basically does the job that [hi] does, but with loads and loads of extra bits, such as OSC, Wii Remotes, and audio and video to MIDI. It allows rules with variables and conditions, and loads of scaling.
I have a hacked Wii Nunchuk, which gives me a tiny board with the accelerometer, buttons and joystick mounted on it, and I’m working on a glove so I can analyse data from the movement of my hands while drumming.
the_man: if it’s how you use the data that counts, care to enlighten?
I don’t know if I was being a bit vague, but it’s not that I don’t have any experience myself; rather, I was after other people’s views. Kind of like research…
I was really just making the pretty obvious point that getting the accelerometer data into Max is the easy bit, and that your algorithms for processing it are the more challenging part.
A lecturer of mine is working on a project which tracks a conductor’s movements, detects where the beats occur, analyses their conducting style, and provides 3D graphical feedback on their pattern etc. I think they also have it running to analyse the performance of violin players in a similar way.
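For the beat-detection side of a project like that, one common starting point (not necessarily what the project above uses) is peak-picking on the acceleration magnitude: flag local maxima above a threshold, and ignore anything within a short refractory window so one gesture doesn’t register twice. A rough sketch, with the threshold and refractory values as illustrative guesses you’d tune by hand:

```python
import math

def detect_beats(samples, rate_hz, threshold=1.5, refractory_s=0.25):
    """Flag local peaks in acceleration magnitude (in g) above a threshold,
    skipping anything within a refractory period of the previous beat.
    Returns beat times in seconds."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    beats = []
    last = -refractory_s
    for i in range(1, len(mags) - 1):
        t = i / rate_hz
        is_peak = mags[i] > mags[i - 1] and mags[i] >= mags[i + 1]
        if is_peak and mags[i] > threshold and t - last >= refractory_s:
            beats.append(t)
            last = t
    return beats

# Synthetic check: a flat 1 g signal with two sharp spikes at 0.2 s and 0.6 s.
data = [(0.0, 0.0, 1.0)] * 100      # 100 samples at 100 Hz
data[20] = (0.0, 0.0, 2.5)
data[60] = (0.0, 0.0, 2.5)
beats = detect_beats(data, 100.0)   # → [0.2, 0.6]
```

Real conducting data is far messier than this, of course; smoothing the magnitude first (a simple moving average, or [slide] in Max) helps a lot.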
Not all accelerometer data has to be gestural or constant.
I just use the accelerometer info on the iPhone/TouchOSC as an eraser for drum patterns: shake the phone violently, start with a blank slate. Doing anything subtle with it just frustrates me, but for a good example of something more practical, that shaker/tambourine app at the Apple App Store would be a good start. Maybe you could model an MSP instrument where shaking side to side yields one sound, end to end another, and allow morphing of the two (might go try this actually, sounds fun… a perfect driving motion for some of the STK percussion models).
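The shake-to-erase idea boils down to an energy check: how far does the acceleration magnitude stray from the resting 1 g over a short window? A crude sketch (the 0.6 g threshold is just a guess to tune by feel):

```python
import math

def is_shake(window, threshold=0.6):
    """True when the mean deviation of acceleration magnitude from 1 g,
    over a short window of (x, y, z) samples in g, exceeds the threshold."""
    devs = [abs(math.sqrt(x * x + y * y + z * z) - 1.0) for x, y, z in window]
    return sum(devs) / len(devs) > threshold

still = [(0.0, 0.0, 1.0)] * 16                       # phone at rest: no shake
violent = [(2.0, 0.0, 1.0), (-2.0, 0.0, 1.0)] * 8    # vigorous side-to-side
```

In a patch you’d run this over the last few hundred milliseconds of accelerometer values and bang a clear message into the pattern storage when it fires.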
tape a wiimote behind my guitar and rock out!