I am working on a project at the moment where I want to analyse an audio signal and use it to modulate something I am making in Jitter. I am a little out of my depth though… I basically want to specify a frequency band to analyse (e.g. bass frequencies, 50 Hz to 150 Hz), analyse its amplitude into a signal between -1. and 1., and then use that to modulate parameters of the visual patch I am making.
I think I need the fft~ object for the analysis, and the capture object… but I don't understand how to turn that into a continuous control signal…
I had a look at the fft~ tutorial and got quite lost :/ any advice anyone can give is GREATLY appreciated!
The thing is, though, I want to analyse multiple bands of the audio signal, but I don't want to run a heap of objects that will chew up my processor… wouldn't that rule out those objects, since I would need to run a whole bunch of them in parallel?
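It may help to see why multiple bands don't need multiple analyses: a single FFT pass already gives you every frequency bin, so any number of band amplitudes can be read out of the same spectrum just by summing different bin ranges (in Max, one fft~ or a single fffb~ filter bank plays the same role). Here is a minimal sketch of that idea in Python with NumPy — the band edges and the RMS-of-bins measure are my own illustrative choices, not anything from the original posts:

```python
import numpy as np

def band_amplitudes(signal, sample_rate, bands):
    """One FFT pass; return one amplitude per (low_hz, high_hz) band."""
    # Magnitude spectrum, normalised by the window length
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    amps = []
    for low, high in bands:
        mask = (freqs >= low) & (freqs < high)
        # RMS of the bin magnitudes falling inside this band
        amps.append(float(np.sqrt(np.mean(spectrum[mask] ** 2))) if mask.any() else 0.0)
    return amps

# A 100 Hz sine should show up in the 50-150 Hz band, not the others
sr = 44100
t = np.arange(4096) / sr
sig = np.sin(2 * np.pi * 100 * t)
bass, mid, high = band_amplitudes(sig, sr, [(50, 150), (150, 2000), (2000, 10000)])
```

However many bands you add to the `bands` list, the FFT itself runs only once, so the per-band cost is just a masked sum — which is the answer to the processor worry.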
"Here is a BasicFFT object with an openFrameworks visualisation that will output the raw FFT data. It's sent by OSC to openFrameworks. The full Xcode openFrameworks project is here."
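For anyone curious what "sent by OSC" means at the byte level, here is a minimal sketch of encoding an OSC 1.0 message of float arguments, following the spec's padding rules. The `/fft` address is just an assumed example, not the address the quoted project actually uses; in practice you would normally reach for Max's udpsend or a library rather than hand-rolling this:

```python
import struct

def osc_message(address, floats):
    """Minimal OSC 1.0 message: address + ',f...' typetags + big-endian float32s.
    Every string is NUL-terminated and padded to a 4-byte boundary."""
    def pad(s):
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    msg = pad(address) + pad("," + "f" * len(floats))
    for v in floats:
        msg += struct.pack(">f", v)  # OSC floats are big-endian IEEE 754
    return msg

packet = osc_message("/fft", [0.25, 0.5])
# The packet could then be sent over UDP with socket.sendto(packet, (host, port))
```

With a 5-byte address padded to 8 bytes, a 4-byte typetag string, and two 4-byte floats, the example packet comes out to 20 bytes.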