Colour Motion


Early prototype of a movement-based music generator. Instructions below if you want to try it out yourself.

Aim

The goal of this project is to create a method of generating sound or music using a movement- or gesture-based system. It is based around a Max/MSP/Jitter patch that analyses the movements of the user through hand-mounted sensors and colour tracking. At this halfway point in the project, the focus is on using colour tracking to create a musical output from Max.
The next stage, which will measure the strain and acceleration of movements to articulate and manipulate the musical output, will follow in the second half of the project, as it is considerably more complicated and could not be presented as a stand-alone object.
The desired outcome of this project is for the user to feel that their movements alone control the sound, without being conscious of the cameras or sensors following them. Ideally the key relationship should be between the user’s hands and the sound, so the user should be shielded from, or at least unaware of, the technology and software behind it. The reasoning behind aiming for a non-technical appearance and operation is that the finished project could be used for sensory therapy or education as well as a performance tool.

**Instructions for use (Windows)**:

What you need: a camera or webcam; a red or green light (both is great); some sort of DAW with MIDI instruments; and MIDI Yoke for routing MIDI from Max to the DAW:

http://www.midiox.com/index.htm?http://www.midiox.com/myoke.htm

1. Download the Max Runtime for Windows if you don't have Max/MSP/Jitter. Then download and install MIDI Yoke.

2. Send me a quick email at mattwestwick(at)hotmail.co.uk and I will send you the program, or download it from:

http://rapidshare.com/files/440805228/matts_colormin.maxpat

3. Open the file with the Max Runtime.

4. Hit the “getvdevlist” button to find your video devices, i.e. your webcam.

5. Check the square box at the top left and then press the open button. You should now see a video feed on the screen.

6. In the green, red and blue boxes on the screen you will see little rectangles that say noteout and ctlout. Set the noteout and ctlout in the green and blue boxes to output on MIDI Yoke channel 1, and set the noteout and ctlout in the red box to MIDI Yoke channel 2.

7. Open your DAW with two MIDI channels set up, change their inputs to MIDI Yoke 1 and MIDI Yoke 2 respectively, and select whatever MIDI VST instrument you want.

8. Turn off the lights in your room and use your red and green lights or colours to play the instrument. The sliders on the screen should follow the colours; if not, adjust the brightness, contrast and saturation levels until they track well. Turn down your monitor brightness so it doesn't shine light onto you.

9. Enjoy

How did this project use Max?

Max tracks and inputs all the variables, which are then processed and sent out as MIDI messages.

Do you remember the first Max patch you ever made? What was it?

A clever little ear-training tool that quizzed the user on chord structure and basic music theory such as note intervals.

How did you come up with this project idea?

I designed a board game last year using colour tracking and sound synthesis to build a sonic relationship between the different coloured pieces on the game board. I felt the core idea of movement and interplay between colours manipulating sounds was quite unique and could be developed into a performance tool that could be refined and mastered much like a physical instrument.

What sorts of problems did you have to solve?

One aspect of this project was to mimic the function and sound of a theremin, but this is somewhat tricky with MIDI, as each note has to be quantised. Typically, MIDI note generation requires a set message with a duration; this was troublesome given that colour tracking as an input is often temperamental. To overcome the problem of mimicking a basic MIDI controller with colour tracking, various conditions and rules were put in place to keep it behaving and to stop unintentional movements or stray light sources causing havoc with the synthesiser (a rough sketch of the idea follows below). Also, if you use a video feed to monitor your performance, it is a lot easier to mirror it rather than seeing your movements go the opposite direction on screen!
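To make the quantisation idea concrete, here is a minimal Python sketch (not the actual Max patch; the note range and dead-band value are assumptions) of turning a continuous 0–1 tracking value into quantised MIDI notes, with a small dead band so jittery input near a note boundary doesn't constantly retrigger:

```python
NOTE_RANGE = list(range(48, 72))  # C3..B4, an assumed playable range

def value_to_note(value):
    """Quantise a 0..1 tracking value onto the note range."""
    index = int(value * len(NOTE_RANGE))
    index = max(0, min(len(NOTE_RANGE) - 1, index))
    return NOTE_RANGE[index]

class Quantiser:
    def __init__(self, deadband=0.02):
        self.deadband = deadband   # ignore wobbles smaller than this
        self.last_value = None
        self.current_note = None

    def update(self, value):
        """Return (note_off, note_on) MIDI events, or (None, None)."""
        if self.last_value is not None and abs(value - self.last_value) < self.deadband:
            return None, None      # movement too small: hold the current note
        self.last_value = value
        note = value_to_note(value)
        if note == self.current_note:
            return None, None
        note_off, self.current_note = self.current_note, note
        return note_off, note      # caller sends note-off, then note-on
```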

If there were one person who you would want to see your project, who would it be?

Any of the developers of the new Sony Move. I only discovered it last week and realised how similar some of the techniques used in my project are to theirs. A pat on the head would be nice.

At the conclusion of this project were you:
a) exhausted
b) ready to do a new one
c) thinking of ways to expand it
d) [other, please describe]

Well, it's not over yet, and the possibilities with Max are endless, so I can foresee myself being continuously exhausted over the next few months. Now that I have the basic sound generation running, the real fun starts with gesture recognition and using unpredictable physiological responses as a form of modulation.



MBD73
December 10, 2010 | 10:26 am

Would it be possible to make a Max for Live version? :-)


December 11, 2010 | 7:41 am

Most likely yes.

I’m working on making a sealed-up, presentable version of the patch so far that I can send to friends, family, and anyone else interested. This won’t be for a few weeks; I have some big exams coming up.

When the project is fully completed in May I will release it in its entirety.

Regards



So. California
January 5, 2011 | 11:12 am

Remarkable, Matt! Not just the project but you as a person. I did work in the early 1970s when the digital age was coming to life, on projects to help people with limited communication skills, some with "only" a finger movement or blinking. Today I study the theremin to share a musical outlet with people who are developmentally disabled or have had a traumatic experience in the war. If the University misses the deeper meaning of your art, don’t worry; you have greater rewards waiting for you in your future. I also applaud the work of your family.



Toon
January 27, 2011 | 8:44 pm

Great software! I want to experiment with the program, and I was wondering: how did you arrive at the colour codes for red and green?


January 28, 2011 | 6:36 pm

Oh, that’s a piece of cake with the findbounds object in Max. It has four planes (brightness, red, green and blue), each with a minimum and maximum threshold. So to track red you set its red thresholds to, say, 0.1 to 1, but for green and blue you use something like 0 to 0.1, so that only red is found.
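In pseudocode terms, the thresholding idea looks something like this (a hypothetical Python sketch, not the findbounds object itself, with channel values normalised to 0–1):

```python
# A pixel counts as "red" only if its red channel is high and its
# green and blue channels are low. Threshold values are illustrative.
RED_RANGE   = (0.1, 1.0)
GREEN_RANGE = (0.0, 0.1)
BLUE_RANGE  = (0.0, 0.1)

def is_red(r, g, b):
    return (RED_RANGE[0] <= r <= RED_RANGE[1]
            and GREEN_RANGE[0] <= g <= GREEN_RANGE[1]
            and BLUE_RANGE[0] <= b <= BLUE_RANGE[1])

print(is_red(0.9, 0.05, 0.02))  # True: a strong, pure red
print(is_red(0.9, 0.60, 0.10))  # False: too much green (orange-ish)
```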

If you want to alter the program you need Max/MSP/Jitter, which you can get a free trial of. If you already have it, apologies for stating the obvious! :)

This tutorial is a great starting point. It’s where I started off about a year ago.

http://cycling74.com/docs/max5/tutorials/jit-tut/jitterchapter25.html

Regards,
Matt

P.S. New demo coming soon



Toon
January 31, 2011 | 1:30 am

Alright, thanks for the info! I’ll be looking forward to the new demo. Thanks again.


February 15, 2011 | 7:24 am

Hi there. This project is awesome. I'm interested in every aspect of connecting sound and colour, but I've been working with Max/MSP for about a year now and creating such a complex patch still looks impossible to me. Still, I like it.


February 15, 2011 | 9:37 am

Thanks! You can view the image up top for some ideas. The core Max patch is fairly simple. Where it gets complicated is linking it with MIDI and getting good frame rates. That’s why the big companies have not yet made video-based instruments: the processing delay makes it very, very hard to play in time. There are also calibration aspects that can be very demanding.
Feel free to download my patch or play around with the tutorial I linked above.

Regards,
Matt


January 5, 2012 | 1:42 am

Would it be compatible with Mac? I guess I would have to change the grab object, but is that the only difference?


January 5, 2012 | 1:44 pm

Yes, it works fine on Macs. Just change the jit.dx.grab object to the usual jit.qt.grab.

Just another example of Windows drivers being a pain.

Matt


January 8, 2012 | 5:36 pm

In one word, RESPECT ;). I thought the project was phenomenal. I'm only starting to play with Max now, working bottom-up … but hey, there's only one way to start. So I was exploring other people's projects (actually taking a break from going through the Max tutorials, hoping my computer engineering degree will make it that much easier ;)), came across yours, and love how you melded your own controllers to produce the sound. I'm sure there are many similar sound-producing concepts out there, but I loved how you brought the colour controllers into the mix.

I'd open the patch and analyse it, but it's a bit too early for me quite yet, so I'm going to bookmark yours and hopefully come back to it sooner rather than later.

So I lied; in yet another word, kudos ;).

Salah


January 9, 2012 | 9:00 am

Thanks very much indeed. I sent you an email with a simple colour tracking patch I made before this one. It’s fun to play around with and might make analysing this contraption a bit easier.

Matt


February 8, 2013 | 3:31 am

Great project! I thought it was very inspiring :)
I have a couple of questions and would be grateful if you could answer them. Firstly, how did you get the colour tracking in Max to be so smooth? There are no glitches at all, certainly none that can be heard. And secondly, apart from Max and LED lights, did you use any other external hardware or software for your project?

Thank you in advance.

Marco


February 8, 2013 | 9:18 am

Hi Marco,

I haven’t used Max much since I made this, so there might be more up-to-date methods for colour or even just movement tracking around now.

To tell the truth, it takes some effort to calibrate the contrast/brightness/saturation of the camera to smoothly track the colours.

The smoothness I achieved was through using primary colours and a consistently lit environment. Pure red, green or blue is easiest to track because you can set the findbounds thresholds to almost completely exclude any undesired colours. Half of the battle is making sure there are no distracting colours or light sources in the room. For the video example above I used a fairly dark room so the light from the LEDs was very strong against the dark background. I also made sure I wasn’t wearing anything red or green, haha.

In terms of glitches, I found that when the tracking loses the desired target it tends to give a very high value or zero. So I just told the patch to ignore these and stay on its previous value until it reacquires the colours. There were a lot of conditional statements involved.

You could smooth out the tracking further with low-pass filtering, but I didn’t have time to experiment with this (a rough sketch of both ideas follows below).
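For illustration, here is a small Python sketch of those two ideas combined: hold the last good position when the tracker returns an obviously bad value (zero or pinned high), and smooth the good values with a one-pole low-pass filter. The threshold and smoothing values are assumptions, not values from the original patch:

```python
class TrackSmoother:
    def __init__(self, alpha=0.3, high_limit=0.98):
        self.alpha = alpha            # 0..1; lower = smoother but slower
        self.high_limit = high_limit  # values above this look like dropouts
        self.value = None

    def update(self, raw):
        # Dropout: tracking lost the colour, so hold the last good value.
        if raw <= 0.0 or raw >= self.high_limit:
            return self.value
        if self.value is None:
            self.value = raw
        else:
            # One-pole low-pass: new = old + alpha * (input - old)
            self.value += self.alpha * (raw - self.value)
        return self.value
```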

As for external stuff, it’s just what I described in the instructions above. The lights I used were just some LEDs hooked to a switch and a battery, all attached to some old skateboarding wrist guards.

To get the MIDI information from Max to Cubase (you can use any MIDI-capable program) I used MIDI Yoke, which is like a driver that runs in the background and can be recognised by most digital studio programs. You could also skip the external software and use your computer’s built-in MIDI sounds, which are accessible directly through Max, but these sound crap.

I used a few VST instruments in Cubase to get the sounds, but these can be anything you want. There’s a slight latency caused by the camera, because cameras don’t normally need to be as fast as MIDI instruments and only capture about 40 frames per second. With this in mind, I used some slow-attack synthesisers that are more suited to big, open sounds.

The green light sent chords to a warm pad synth, and the red light sent notes to a theremin-like tone that would flow between the notes (portamento control).

I hope that answers your questions, you can send me an email if you want more info and I can send you a couple of earlier patches that very simply demonstrate the colour tracking techniques.

Cheers,

Matt


March 11, 2013 | 11:42 am

Hey Matt,

Sorry for the late reply. I've been working on my patch, but decided that instead of colour I am using the Kinect device to track gestures. I was wondering, though: how did you manage to create the nice-sounding chords? Anything I try to build that resembles any kind of chordal formation sounds pretty rubbish, haha. Basically just a bunch of cycle~ objects with some phasors. Kids' stuff.
Any ideas?

Thanks again,

Marco


March 11, 2013 | 5:27 pm

The chord sounds came from a nice MIDI synth in Cubase. All of the output from the patch was just MIDI note values.

The actual MIDI data was a big table of note values with individual note-on and note-off values for each note in the chords. This was probably very inefficient, but it was the best I could do at the time.

If you download the patch and open up the ‘patcher chords’ or ‘patcher noteon’ objects you will see what I mean. It’s just a big selection of conditional statements that check where the tracked colour is and turn on the relevant set of notes, turning them off when it moves to the next point or back to zero (roughly like the first sketch below).
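A loose Python analogue of that chord-table idea (the voicings and position indices here are made up for illustration; the real patch is Max conditional logic):

```python
# Table of MIDI note sets indexed by tracked position.
CHORD_TABLE = {
    0: [],             # zero position = silence
    1: [60, 64, 67],   # C major
    2: [62, 65, 69],   # D minor
    3: [64, 67, 71],   # E minor
}

current_chord = []

def position_changed(pos, send_note_on, send_note_off):
    """Turn the old chord off and the new one on when the position changes."""
    global current_chord
    new_chord = CHORD_TABLE.get(pos, [])
    if new_chord == current_chord:
        return
    for note in current_chord:
        send_note_off(note)
    for note in new_chord:
        send_note_on(note)
    current_chord = new_chord
```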

If you are not using MIDI at all, you would probably need to work out the frequency of each note that makes up the chord and program that into the cycle~ object (see the second sketch below).
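Converting a MIDI note number to a frequency uses the standard equal-temperament formula (A4 = note 69 = 440 Hz); a quick Python example for a C major chord:

```python
def midi_to_hz(note):
    # Each semitone is a factor of 2**(1/12) away from A4 (440 Hz).
    return 440.0 * 2 ** ((note - 69) / 12)

# C major (C4, E4, G4) as frequencies for three oscillators:
print([round(midi_to_hz(n), 2) for n in (60, 64, 67)])
# [261.63, 329.63, 392.0]
```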

Hope that helps, anything else just email me.

Good luck with the kinect, hope it works out better than colour tracking.


March 16, 2013 | 5:19 am

Thanks for the help Matt :)
I'll let you know if there is anything else.

Thanks again


March 17, 2013 | 3:47 am

One thing I actually just thought of: how did you import the synth chords from Cubase into Max/MSP? The only way I can think of is to have lots of different sfplay~ objects and different sound files?

Any ideas on this?

Thanks :)
Marco


March 17, 2013 | 12:04 pm

I didn’t. The patch sends MIDI values to Cubase via the noteout object.

To get between Max and Cubase I used MIDI Yoke, which takes the MIDI values from Max and acts as a MIDI input within Cubase.

On a Mac you don’t need MIDI Yoke; that’s only for PC.


March 17, 2013 | 3:09 pm

Ah, okay, I understand. I use a Mac, but how were you able to get Cubase and Max to communicate via the noteout object?


March 17, 2013 | 4:26 pm

The final noteout objects have a drop-down menu where you can choose where to send the MIDI values. On PC you just right-click ‘noteout’ and choose where you want to send it.

For me, I select a MIDI Yoke channel, then use this as my MIDI input within Cubase.

If you have something like GarageBand or any MIDI-capable digital studio it should work quite easily on a Mac.
Have a quick read of this to see:

http://cycling74.com/docs/max5/vignettes/core/max_and_other_apps.html


March 20, 2013 | 9:00 am

Thanks for the help :) I have it working now with Logic Pro.


March 20, 2013 | 9:18 am

Hey that’s great. Getting the two programs to communicate is always a fun breakthrough moment.

Let me know how you get on, send me a link to your project when it’s done.

Matt

