Early prototype of a movement-based music generator. Instructions below if you want to try it out yourself.
The goal of this project is to create a method of generating sound or music using a movement- or gesture-based system. It is based around a Max/MSP/Jitter patch that analyses the user's movements through hand-mounted sensors and colour tracking. At this halfway point in the project, the focus is on using colour tracking to create a musical output from Max.
The next stage, which will measure the strain and acceleration of movements to articulate and manipulate the musical output, will follow in the second half of the project, as it is considerably more complicated and could not be presented as a stand-alone piece.
The desired outcome of this project is for the user to feel that their movements alone control the sound and should not be conscious of the cameras or sensors following them. Ideally the key relationship should be between the user’s hands and the sound, and as such they should be shielded from or at least unaware of the technology or software behind it. The reasoning behind aiming for a non-technical appearance and operation is that the finished project could be used for sensory therapy or educational uses as well as a performance tool.
**Instructions for use (windows)**:
What you need: a camera or webcam; a red or green light (both is even better); ideally some sort of DAW with MIDI instruments; and MIDI Yoke for routing MIDI from Max to the DAW.
1. Download the Max Runtime for Windows if you don't have Max/MSP/Jitter. Then download and install MIDI Yoke.
2. Send me a quick email, mattwestwick(at)hotmail.co.uk and I will send you the program. Or download from
3. Open the file with the Runtime window.
4. Hit the “getvdevlist” button to list your video devices, i.e. your webcam.
5. Check the square box at the top left and then press the open button. You should now see a video feed on the screen.
6. In the green, red and blue boxes on the screen you will see little rectangles labelled noteout and ctlout. Set the noteout and ctlout in the green and blue boxes to output on MIDI Yoke channel 1, and those in the red box to MIDI Yoke channel 2.
7. Open your DAW with two MIDI channels set up, change their inputs to MIDI Yoke 1 and MIDI Yoke 2 respectively, and select whatever MIDI VST instrument you want.
8. Turn off the lights in your room and use your red and green lights or coloured objects to play the instrument. The sliders on screen should follow the colours; if not, adjust the brightness, contrast and saturation levels until they track well. Turn down your monitor brightness so it doesn't shine light onto you.
Max tracks and inputs all the variables, which are then processed and sent out as MIDI messages.
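A minimal sketch of that idea in Python (the patch itself does this with Max's noteout and ctlout objects; the note range, CC number and normalised 0.0–1.0 tracking coordinates here are assumptions for illustration):

```python
def tracked_position_to_midi(x, y, channel=0):
    """Map a normalised colour-tracking position (x, y in 0.0-1.0)
    to raw MIDI bytes: x picks the pitch, y the mod-wheel value.

    Returns (note_on, control_change) as byte lists, standing in for
    the noteout and ctlout objects in the Max patch."""
    note = 48 + int(x * 24)                  # two octaves from C3 (assumed range)
    cc_value = int(y * 127)                  # CC 1 (mod wheel) from vertical position
    note_on = [0x90 | channel, note, 100]    # status, pitch, velocity
    control = [0xB0 | channel, 1, cc_value]  # status, controller number, value
    return note_on, control
```

Feeding these byte lists to any MIDI output port (one per MIDI Yoke channel) would reproduce the routing described in the instructions above.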
Do you remember the first Max patch you ever made? What was it?
A clever little ear-training tool that quizzed the user on chord structure and basic music theory, such as note intervals.
How did you come up with this project idea?
I designed a board game last year using colour tracking and sound synthesis to build a sonic relationship between different coloured pieces on the game board.
I felt the core idea of movement and the interplay between colours manipulating sounds was quite unique, and could be developed into a performance tool that could be refined and mastered much like a physical instrument.
What sorts of problems did you have to solve?
One aim of this project was to mimic the function and sound of a theremin, but this is somewhat tricky with MIDI, as each note has to be quantised. Typically, MIDI note generation requires a set message with a duration; this was troublesome given that colour tracking as an input is often temperamental.
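One way to picture the quantisation step is snapping a continuous theremin-style pitch to the nearest note in a scale. A sketch (the C-major scale and the two-octave hand-position range are assumptions, not the patch's actual mapping):

```python
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave

def quantise_to_scale(position, low_note=48, octaves=2):
    """Snap a continuous hand position (0.0-1.0) to the nearest
    C-major note in a two-octave range: MIDI needs discrete pitches,
    so the theremin's continuous sweep must land on a real note."""
    scale = [low_note + 12 * o + s for o in range(octaves) for s in C_MAJOR]
    target = low_note + position * 12 * octaves  # continuous "theremin" pitch
    return min(scale, key=lambda n: abs(n - target))
```

Sliding the hand smoothly then produces a stepped run up the scale rather than a continuous glide, which is the compromise MIDI forces.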
To overcome the problem of mimicking a basic MIDI controller with colour tracking, various conditions and rules were put in place to keep it behaving and to stop unintentional movements or stray light sources causing havoc with the synthesiser.
Also, if you are using a video feed to monitor your performance, it is a lot easier to mirror it than to watch your movements go the opposite direction on screen!
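Mirroring just means reversing each row of the video frame left-to-right before display. Treating a frame as a list of pixel rows, that is one slice per row (a generic sketch; the Jitter patch would do this with a matrix operation instead):

```python
def mirror_frame(frame):
    """Flip a frame horizontally so on-screen movement matches the
    performer's own left and right, like looking in a mirror."""
    return [row[::-1] for row in frame]
```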
If there were one person who you would want to see your project, who would it be?
Any of the developers of the new Sony Move. I only discovered it last week and realised how similar some of the techniques used in my project are to theirs. A pat on the head would be nice.
At the conclusion of this project were you:
b) ready to do a new one
c) thinking of ways to expand it
d) [other, please describe]
Well, it's not over yet, and the possibilities with Max are endless, so I can foresee myself being continuously exhausted over the next few months.
Now that I have the basic sound generation running, the real fun starts with gesture recognition and using unpredictable physiological responses as a form of modulation.