Combining Ableton Live with Blender for audioreactive visuals via Max4Live
Jan 10 2022 | 5:03 pm
Dear Max friends,
As described in the topic title, I am using Max for Live to control parameters in my 3D software of choice (Blender). I have quite a lot of experience with Ableton (13 years now) and less with Blender (about 4 years), but I am new to Max. I had been doing visuals for music without Max for as long as I have used Blender ( https://www.youtube.com/channel/UCdOD4P8BSbznZ6fZaRaJFpQ ), but it was a pain to do all of it offline with bounced files ... When I discovered that I could control parameters in realtime via Max and an OSC server, I was completely hooked. At the moment I am a diligent student of Max, and I am already able to control Blender with the amplitude of audio signals coming from Ableton; a rough sketch of the Blender side of that setup is below.
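For context, the Blender side looks roughly like the sketch below: a small Python script run inside Blender that listens for an OSC float from Max and maps it onto an object property. The python-osc package, the /amp address, UDP port 9001 and the object name "Cube" are just placeholders for this example, not my exact script.

# Minimal sketch: drive a Blender object from an OSC float sent by Max for Live.
# Assumes python-osc is installed into Blender's bundled Python and that Max
# sends a float to /amp on UDP port 9001 -- address, port and object name are
# placeholders.
import threading
import bpy
from pythonosc import dispatcher, osc_server

latest = {"amp": 0.0}

def on_amp(address, value):
    # Runs on the OSC server thread: only store the value, don't touch bpy here.
    latest["amp"] = float(value)

disp = dispatcher.Dispatcher()
disp.map("/amp", on_amp)

server = osc_server.ThreadingOSCUDPServer(("127.0.0.1", 9001), disp)
threading.Thread(target=server.serve_forever, daemon=True).start()

def apply_amp():
    # Runs on Blender's main thread via a timer, the safe place for bpy calls.
    obj = bpy.data.objects.get("Cube")
    if obj is not None:
        obj.scale.z = 1.0 + latest["amp"]
    return 1 / 30  # re-run roughly 30 times per second

bpy.app.timers.register(apply_amp)

On the Max side the amplitude just goes out as an OSC message (/amp followed by a float) over UDP, for example with [udpsend].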
But I think there are a lot more things I could analyse in my audio signals and visualise in Blender.
The most important for me are probably the following (a small sketch of the last two comes right after the list):
- the pitch of the signal
- the RMS level
- the stereo width
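To make the last two concrete, here is a tiny numeric sketch of what I mean, written in plain NumPy over a block of stereo samples purely as an illustration; in the actual Max for Live device these values would of course come from MSP objects (I guess average~ in rms mode for the level, and something like fzero~ for pitch, but I have not tried that yet).

# Illustration only: what "RMS level" and "stereo width" boil down to numerically.
import numpy as np

def rms_db(block):
    # Root-mean-square of a sample block, expressed in dBFS.
    rms = np.sqrt(np.mean(np.square(block)))
    return 20 * np.log10(max(rms, 1e-12))

def stereo_width(left, right):
    # Simple mid/side energy ratio: 0 = mono, larger values = wider image.
    mid = 0.5 * (left + right)
    side = 0.5 * (left - right)
    side_rms = np.sqrt(np.mean(np.square(side)))
    mid_rms = np.sqrt(np.mean(np.square(mid)))
    return side_rms / max(mid_rms, 1e-12)

# Example: a mostly-mono noise block should give a high level and a small width.
left = np.random.uniform(-1, 1, 4096)
right = 0.9 * left + 0.1 * np.random.uniform(-1, 1, 4096)
print(rms_db(left), stereo_width(left, right))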
And surely many more that I am not aware of at this point. Which brings me to my actual question:
Does anyone know of comparable projects or tools I could learn from? Does anyone have good ideas about which analysis data could be interesting for me as well?
Most of the tutorials out there are about processing the sound in Max, not analysing it.
Is there anything you would tell a Max beginner with aims like mine?
As you can see, my question is quite unspecific, because I really don't know exactly where to start.
Of course I am learning the Max basics at the moment, but maybe one of you could point me in the right direction. The best-case scenario would be something like iZotope Insight or another analyzer tool from which I could simply take the analysed data and send it as values to Blender.
I promise to be more specific with my questions in the future :)
thx
Forrest