Assigning Live Instrument Sounds to Visual Functions in Jitter
Hello Forums!
I've been working in Max since the end of this summer, and I'm now making my first stabs at Jitter. I'm interested in creating something that will:
1. Sense the pitches and intensities played on a live instrument (say, a piano playing middle C at x volume)
2. Assign those pitches and intensities to a visual element that is projected while the live instrument is played. For example, when middle C is played at a certain volume, a visual element (say, a color) is displayed. As the sound of the note(s) decays and gets softer, I would like the projected color or other visual element to fade with it, creating an exact visual parallel to what is happening on the live instrument.
Can anyone help me get started or point me in the right direction? Your help is much appreciated, thanks!
Hi!
I am very interested to know if you have made any progress with this.
Pitch detection is a complicated matter for all but the simplest cases. If you can take that data from MIDI controllers (keyboards and such) instead, it will be much more precise. For true pitch and amplitude detection from the live audio signal, you can try [analyzer~].
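
In case it helps to see the shape of it, the MIDI route could be as simple as [notein] (pitch and velocity) into [stripnote] (to drop the note-offs), with the pitch/velocity pair driving a color that you write into a [jit.matrix] and display with [jit.window]. A convenient place to do the actual mapping is a small [js] object. The script below is only a sketch under those assumptions; the pitch-class-to-hue mapping and the file name note2color.js are made up for illustration:

```javascript
// note2color.js -- a sketch for Max's [js] object (file name is made up).
// Assumed patch context:
//   [notein] -> [stripnote] -> [pack 0 0] -> [js note2color.js]
//   -> [jit.matrix col 4 char 320 240] -> [jit.window col]
inlets = 1;
outlets = 1;

// receives "pitch velocity" pairs from [pack 0 0]
function list(pitch, velocity) {
    var bright = velocity / 127.;   // note intensity -> brightness 0..1
    var hue = (pitch % 12) / 12.;   // pitch class -> hue (arbitrary mapping)
    var rgb = hsv2rgb(hue, 1., bright);
    // "setall A R G B" fills every cell of a 4-plane char jit.matrix
    outlet(0, "setall", 255, rgb[0], rgb[1], rgb[2]);
}

// standard HSV -> RGB conversion, scaled to 0..255 for char matrices
function hsv2rgb(h, s, v) {
    var i = Math.floor(h * 6.);
    var f = h * 6. - i;
    var p = v * (1. - s);
    var q = v * (1. - f * s);
    var t = v * (1. - (1. - f) * s);
    var r, g, b;
    switch (i % 6) {
        case 0: r = v; g = t; b = p; break;
        case 1: r = q; g = v; b = p; break;
        case 2: r = p; g = v; b = t; break;
        case 3: r = p; g = q; b = v; break;
        case 4: r = t; g = p; b = v; break;
        default: r = v; g = p; b = t; break;
    }
    return [Math.round(r * 255.), Math.round(g * 255.), Math.round(b * 255.)];
}
```

You'd still want a [qmetro] banging the [jit.matrix] so that frames actually reach the [jit.window].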
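
The catch for your decay question is that MIDI velocity only arrives at note-on, so the MIDI route can't follow the fade of an acoustic note. For that you need the audio route: [adc~] into [analyzer~]'s loudness output, or just a simple envelope follower like [peakamp~]. Again only a sketch, assuming [peakamp~ 50] upstream; the -60 dB floor and the file name are arbitrary choices:

```javascript
// fade2bright.js -- a sketch, again for [js] (file name is made up).
// Assumed patch context: [adc~] -> [peakamp~ 50] -> [js fade2bright.js]
inlets = 1;
outlets = 1;

var floor_db = -60.;   // treat anything below -60 dB as silence (arbitrary)

// [peakamp~ 50] reports the peak amplitude (0..1) every 50 ms as a float
function msg_float(amp) {
    // convert to dB so the fade follows perceived loudness, then normalize
    var db = 20. * Math.log(Math.max(amp, 0.000001)) / Math.LN10;
    var bright = Math.max(0., Math.min(1., 1. - db / floor_db));
    outlet(0, bright);   // 0..1 brightness for whatever drives the visuals
}
```

Whatever produces that 0..1 brightness, you can multiply it into the color before the setall message, and the projected color will fade along with the note.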