Hi there, I’m new to the forums, so please forgive me if this topic has already been brought up. I have a project idea and was wondering if you guys could help me out. I’m interested in making a Max patch in which the volume faders for live audio tracks also control the opacity of videos, so that the videos get mixed together in time with the audio. The result would be a performance tool that creates a video collage out of audio mixing. A little more specifically, raising the volume of one audio track would increase the % opacity of the corresponding video. I like the way the 4-channel Vizzie mixer does this, but I just want to find a way to have those sliders connected to audio. Let me know what you think!
That’s pretty trivial once you realize that the value of your mixer fader/slider can be sent to the audio gain object ( [*~] ) just as well as to the opacity control of your video ( [jit.alphablend] or whichever method you prefer). One number, two destinations.
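In case a concrete routing helps, here’s a rough text sketch of one way to wire a single track. This is just one possible approach, assuming a 0–127 slider and a simple brightness-multiply for “opacity” via [jit.op]; the exact objects and messages depend on which video-mixing method you choose (e.g. [jit.xfade] or a Vizzie FADR instead):

```
[slider]                       fader for this track, 0..127
    |
[scale 0 127 0. 1.]            normalize to a 0..1 float
    |
    +-------------------------------+
    |                               |
(right inlet of [*~])         [prepend val]
audio gain for the track            |
                              [jit.op @op *]   scales the video matrix,
                                                0. = black, 1. = full
```

The key idea is just the fan-out: one normalized float drives both the audio and the video path, so the mix stays in sync by construction.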
Did you do the Max, MSP and Jitter tutorials? This should be obvious to you then.
Oh awesome! Thank you! Yeah, I’ve been watching the Jitter/Vizzie tutorials on the Cycling ’74 YouTube channel, but it’s been hard for me to think about objects and signal flow in a more abstract way (coming from programs like Logic/Pro Tools). I was getting too caught up in it and didn’t realize it could be so simple. I also just started learning Max this past month for school, so everything is brand new, exciting, and frustrating at the same time haha.