Combining Ableton Live with Blender for audio-reactive visuals via Max for Live

Forrest Mau's icon

Dear Max friends,
As described in the topic title, I am using Max for Live to control parameters in my 3D software of choice (Blender). I have quite a lot of experience with Ableton (13 years now) and less with Blender (about 4 years), but I am new to Max. I was already doing visuals for music without Max, since I use Blender ( https://www.youtube.com/channel/UCdOD4P8BSbznZ6fZaRaJFpQ ), but it was a pain to do all of it offline with bounced files. When I discovered that I can control parameters in real time via Max and an OSC server, I was completely hooked. At the moment I am a diligent student of Max, and I am already able to control Blender with the amplitude of audio signals coming from Ableton.
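For context on the OSC link in this setup: an OSC message carrying one float is a small, fixed-format UDP packet. A minimal stdlib-only Python sketch of encoding and sending one (the address `/blender/amp` and port are made up for illustration, not from the actual patch):

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: padded address, ",f" type tag,
    then the value as a big-endian float32."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def send_amplitude(amp: float, host: str = "127.0.0.1", port: int = 9001) -> None:
    """Fire one UDP packet carrying the current amplitude to an OSC receiver."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(osc_message("/blender/amp", amp), (host, port))
    finally:
        sock.close()
```

In practice Max's `udpsend` object does this encoding for you; the sketch just shows what travels over the wire.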
But I think there are a lot more things I could analyse in my audio signals and visualise in Blender.
The most important for me would be:
- the pitch of the signal
- the RMS level
- the stereo width
And surely many more that I am not aware of at this point. Which brings me to my actual question:
Does anyone know of comparable projects or tools I could learn from? Do any of you have ideas about which analysis data could be interesting for me as well?
Most of the tutorials out there are about processing the sound in Max, not analysing it.
Is there anything you would tell a Max beginner with goals like mine?
As you can see, my question is quite unspecific, because I really don't know exactly where to start.
For sure I am learning the Max basics at the moment, but maybe one of you could point me in the right direction. The best-case scenario would be something like iZotope Insight or another analyzer tool from which I could simply take the analyzed data and send it as values to Blender.
I promise to be more specific with my questions in the future :)
thx
Forrest

Anthony Palomba's icon

You may want to look into zsa.descriptors. It is a library that does real-time sound descriptor analysis.
http://www.e--j.com/index.php/what-is-zsa-descriptors/
It can be found in the Package Manager.

Forrest Mau's icon

Anthony, you are a genius! That's exactly the stuff I am searching for. I wasn't sure at first, but I am currently studying their paper and it looks really promising. I still have a lot to figure out (I don't even know how to add new libraries to Max yet ^^), but it's definitely pointing in the right direction. So thanks a lot <3
If anyone has more suggestions like that, I would appreciate every contribution.

Asher Simon's icon

Hey Forrest, how did you set up Live 11 with Blender in the first place? I'm trying to combine them to make concert visuals.

Forrest Mau's icon

Phew, there is a lot to explain ^^ But as I mentioned, I am sending OSC data from Max to Blender. I receive the OSC data via a Blender plugin called AddRoutes by JPfeP. It was a little bit of a hassle to set up, but in the end it worked for me. You can also try NodeOSC.
In Blender I mostly work with an empty, where I connect all the OSC data and then use drivers to connect it to the things I want to react to the sound.
In the Max patch I recommend using a line object to smooth out the amplitude.
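As an aside, the kind of smoothing `line` provides can be sketched as a simple one-pole filter in plain Python (my own illustration of the idea, not part of the actual Max patch):

```python
class OnePoleSmoother:
    """Exponential smoother: each update moves part-way toward the target,
    so jumpy amplitude values become a smooth ramp."""

    def __init__(self, coeff: float = 0.2):
        self.coeff = coeff  # 0..1; higher = faster response, less smoothing
        self.value = 0.0

    def update(self, target: float) -> float:
        # Move a fixed fraction of the remaining distance toward the target
        self.value += self.coeff * (target - self.value)
        return self.value
```

Running a raw amplitude stream through something like this (on either the Max side or the Blender side) keeps objects from twitching on every transient.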
By the way, I am trying the same thing as you :D I was mainly a live sound engineer before the pandemic, so if I can help you further with anything, don't hesitate to ask. But overall you can figure everything out by watching tutorials for NodeOSC and AddRoutes.
This tutorial was the best foundation for everything I did afterwards:
https://www.youtube.com/watch?v=ssVcU8xsRT8
But as I mentioned, I later switched to AddRoutes because it worked out better for me.
Good luck with your project :D

Asher Simon's icon

Thank you so much for your help!

Pedro Santos's icon

Were you able to install the AddRoutes add-on in Blender 3.1, or are you using an older version? Thanks!

Forrest Mau's icon

Yeah, the last Blender version I got it running on was 3.0 :/

Wes Smith's icon

Hey friend, this is tangential but I think relevant: there is a series of code-level tutorials using Python that go pretty deep into integrating Ableton Live and TouchDesigner. They were used to create the TDAbleton interface and, I believe, subsequently the present-day Ableton Link integration. They are on YouTube from about 4 years ago, with others following up to 2023. You might find that journey useful, since Blender also uses Python scripting. Good luck.