Remarkable, Matt! Not the project but you as a person. I did work in the early 1970s when the digital age was coming to life. I worked on projects to help people with limited communication skills, some with "only" a finger movement or blinking. Today I study the theremin to share a musical outlet with people who are developmentally disabled or had a traumatic experience in the war. If the University misses the deeper meaning of your art, don't worry, you have greater rewards waiting for you in your future. I also applaud the work of your family.
Oh, that's a piece of cake with the findbounds object in Max. It has threshold values for brightness, red, green and blue, each with a minimum and maximum. So to track red you set its red thresholds to, say, 0.1 to 1, but set green and blue to something like 0 to 0.1, so that only red is found.
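To make the idea concrete, here's a rough Python sketch (my own illustration, not the actual findbounds implementation) of the thresholding described above: a pixel counts as "red" only if its red channel is inside [0.1, 1] and its green and blue channels are inside [0, 0.1], and the tracker reports the bounding box of the pixels that pass.

```python
# Illustrative sketch of findbounds-style colour tracking: keep only pixels
# whose RGB values (normalised 0-1, as in Jitter) fall inside per-channel
# [min, max] thresholds, then report the bounding box of what survives.

def track_colour(pixels, r_range=(0.1, 1.0), g_range=(0.0, 0.1), b_range=(0.0, 0.1)):
    """pixels: list of (x, y, (r, g, b)). Returns bounding box of matches, or None."""
    xs, ys = [], []
    for x, y, (r, g, b) in pixels:
        if (r_range[0] <= r <= r_range[1]
                and g_range[0] <= g <= g_range[1]
                and b_range[0] <= b <= b_range[1]):
            xs.append(x)
            ys.append(y)
    if not xs:
        return None  # nothing matched -- tracking lost this frame
    return (min(xs), min(ys), max(xs), max(ys))

# Two bright red pixels pass; the green one is rejected.
frame = [(10, 20, (0.9, 0.05, 0.02)),
         (40, 50, (0.05, 0.8, 0.1)),
         (12, 22, (0.7, 0.0, 0.0))]
print(track_colour(frame))  # -> (10, 20, 12, 22)
```

The same function tracks green or blue instead by swapping which channel gets the wide range.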
If you want to alter the program you need Max/MSP/Jitter, which you can get as a free trial. If you already have it, apologies for stating the obvious! :)
Hi there. This project is awesome. I'm interested in every aspect of connecting sound and colour, but I've been working with Max/MSP for about a year now and creating such a complex patch still looks impossible to me. Still, I like it.
Thanks! You can view the image up top for some ideas. The core Max patch is fairly simple. Where it gets complicated is linking it with MIDI or getting good frame rates. That's why the big companies have not yet made video-based instruments: the processing delay makes it very, very hard to play in time. There are also calibration aspects that can be very demanding.
Feel free to download my patch or play around with the tutorial I linked above.
In one word, RESPECT ;). I thought the project was phenomenal. I'm only now starting to play with Max, working bottom-up ... but hey, there's only one way to start. So I was venturing through other people's projects (actually taking a break from going through the Max tutorials - hoping my computer engineering degree will make it that much easier ;)), came across yours, and love how you melded your own controllers to produce the sound. I'm sure there are many similar sound-producing concepts out there, but I loved how you brought the colour controllers into the mix.
I'd open the patch and analyse it, but it's a bit too early for me quite yet, so I'm going to bookmark yours and hopefully come back to it sooner rather than later.
Great project! I thought it was very inspiring :)
I have a couple of questions and would be grateful if you answered them. Firstly, how did you get the colour tracking in Max to be so smooth? There are no glitches at all, certainly none that can be heard. And secondly, apart from Max and the LED lights, did you use any other external hardware or software for your project?
I haven't used Max much since I made this so there might be more up to date methods for colour or even just movement tracking around now.
To tell the truth, it takes some effort to calibrate the contrast/brightness/saturation of the camera to smoothly track the colours.
The smoothness I achieved came from using primary colours and a consistently lit environment. Pure red, green or blue is easiest to track because you can set the findbounds thresholds to almost completely exclude the undesired colours. Half of the battle is making sure there are no distracting colours or light sources in the room. For the video example above I used a fairly dark room, so the light from the LEDs was very strong against the dark background. I also made sure I wasn't wearing anything red or green. haha.
In terms of glitches, I found that when the tracking loses the desired target it tends to give a very high value or zero. So I just told the patch to ignore these and stay on the previous value until it reacquires the colours. There were a lot of conditional statements involved.
You could smooth out the tracking with low-pass filtering but I didn't have the time to experiment with this.
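The two clean-up stages described above could be sketched like this in Python (a hypothetical illustration, not the actual patch): reject obvious dropout readings by holding the last good value, then optionally smooth the result with a simple one-pole low-pass filter.

```python
# Sketch of glitch rejection plus optional smoothing for a tracked coordinate.
# Readings are assumed normalised to 0-1; a reading pinned at either extreme
# is treated as a dropout (target lost) and the previous good value is held.

class TrackSmoother:
    def __init__(self, alpha=0.3, lo=0.0, hi=1.0):
        self.alpha = alpha          # smoothing amount: lower = smoother but slower
        self.lo, self.hi = lo, hi   # readings at these extremes count as dropouts
        self.last_good = None
        self.smoothed = None

    def update(self, raw):
        # Stage 1: hold the previous value through dropout frames.
        if raw <= self.lo or raw >= self.hi:
            if self.last_good is None:
                return None         # no valid reading yet
        else:
            self.last_good = raw
        # Stage 2: one-pole low-pass, y[n] = y[n-1] + alpha * (x[n] - y[n-1]).
        if self.smoothed is None:
            self.smoothed = self.last_good
        else:
            self.smoothed += self.alpha * (self.last_good - self.smoothed)
        return self.smoothed

s = TrackSmoother()
readings = [0.5, 0.52, 0.0, 0.54]   # the 0.0 is a dropout frame
print([round(s.update(r), 3) for r in readings])
```

Note the dropout frame does not yank the output to zero; the filter keeps easing toward the last good reading instead.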
As for external stuff, it's just what I described in the instructions above. The lights were just some LEDs hooked to a switch and a battery, all attached to some old skateboarding wrist-guards.
To get the MIDI information from Max to Cubase (you can use any MIDI-capable program) I used MIDI Yoke, a driver that runs in the background and is recognised by most digital studio programs. You could also skip the external software and use your computer's built-in MIDI sounds, which are accessible directly through Max - but these sound crap.
I used a few VST instruments in Cubase to get the sounds, but these can be anything you want. There's a slight latency caused by the camera, because cameras don't normally need to be as fast as MIDI instruments and only capture about 40 frames per second. With this in mind I used some slow-attack synthesisers that are more suited to big, open sounds.
The green light sent chords to a warm-pad synth, and the red light sent notes to a theremin-like tone that would flow between the notes (portamento control).
I hope that answers your questions. You can send me an email if you want more info, and I can send you a couple of earlier patches that very simply demonstrate the colour-tracking techniques.
Sorry for the late reply. I've been working on my patch, but I decided that instead of colour I'm using the Kinect device to track gestures. I was wondering, though: how did you manage to create the nice-sounding chords? Anything I try to build that resembles any kind of chordal formation sounds pretty rubbish haha. Basically just a bunch of cycle~ objects with some phasors. Kids' stuff.
The chord sounds came from a nice MIDI synth in Cubase. All of the output from the patch was just MIDI note values.
The actual MIDI data was a big table of note values, with individual note-on and note-off messages for each note in the chords. This was probably very inefficient, but it was the best I could do at the time.
If you download the patch and open up the 'patcher chords' or 'patcher noteon' objects you will see what I mean. It's just a big selection of conditional statements that checks where the tracked colour is and turns on the relevant set of notes, turning them off when it moves to the next point or back to zero.
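The logic in those subpatches could be sketched roughly like this (a hypothetical Python illustration, not the actual patch): a lookup table maps each tracked-position zone to a set of MIDI note numbers, and moving to a new zone emits note-offs for the old chord before note-ons for the new one.

```python
# Sketch of zone-to-chord mapping with note-on/note-off bookkeeping.
# The chord choices here are made up for illustration.

CHORD_TABLE = {
    1: [60, 64, 67],   # C major
    2: [62, 65, 69],   # D minor
    3: [64, 67, 71],   # E minor
}

def chord_messages(prev_zone, new_zone):
    """Return the (message, note) pairs to send when the tracked zone changes."""
    msgs = []
    if new_zone == prev_zone:
        return msgs                        # still in the same zone: do nothing
    for note in CHORD_TABLE.get(prev_zone, []):
        msgs.append(("note_off", note))    # release the old chord first
    for note in CHORD_TABLE.get(new_zone, []):
        msgs.append(("note_on", note))     # then start the new one
    return msgs

# Moving from zone 1 to zone 2 releases the C chord and starts the D chord;
# zone 0 (no chord) on either side just turns notes off or on.
print(chord_messages(1, 2))
```

Replacing the chain of conditionals with a table like this is one way to tame the "big selection of conditional statements" the patch uses.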
If you are not using MIDI at all, you would probably need to work out the frequencies of each note that makes up the chord and program those into the cycle~ object.
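The standard equal-temperament conversion from MIDI note number to frequency (with A4 = note 69 = 440 Hz) is what you'd use for that; a quick Python sketch:

```python
# Convert a MIDI note number n to a frequency in Hz for driving an
# oscillator like cycle~ directly:  f = 440 * 2 ** ((n - 69) / 12)

def midi_to_hz(note):
    return 440.0 * 2 ** ((note - 69) / 12)

# A C major chord (MIDI notes 60, 64, 67) as frequencies, rounded:
print([round(midi_to_hz(n), 2) for n in (60, 64, 67)])  # -> [261.63, 329.63, 392.0]
```

Max also has an `mtof` object that does this conversion for you inside the patch.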
Hope that helps, anything else just email me.
Good luck with the kinect, hope it works out better than colour tracking.
4 years on and still getting requests and enquiries for this project. Thank you to everyone for getting in touch and sharing your works. I never expected this fun little patch to spread as far as it has.
I have updated the download link, and I enjoy the correspondence, so don't hesitate to get in touch.