Project Advice: MIDI-Triggered Videos & FX
New user here. I would like to create a patch that would let a user:
create/save/delete a song/project, which would in turn let them add videos to the project and then assign a MIDI note or channel (or any MIDI message, frankly) to each video. The videos need an option to be either one-shots or loops. In addition, the user could assign FX to be applied, and those could be triggered as well. The videos need to include their sound.
In the end, a user could open projects (which are really songs) and either play them by clicking Play or trigger them via MIDI. It's also important that if one video is playing and another is triggered, the first video continues playing after video 2 finishes (assuming video 2 is a short one-shot; this doesn't apply to looped videos). I think of it like keyboard polyphony, if that makes sense.
That's really it: multiple videos, triggered (perhaps) via MIDI, with chosen FX as options.
I've looked at paid solutions like Resolume, EboSuite, TouchScreen, Ableton, Reaper, VirtualDJ, Usine Hollyhock, and RemixVideo (I almost bought it). Most of them are frankly too expensive and overkill, IMO. Others don't quite fit or no longer appear to be in development.
I have programming experience with Arduinos and some work-related stuff (SQL, PowerShell), but Max is very foreign to me.
I've tried Pure Data (where I was able to play a video, but with no sound), watched some video tutorials for Max 8 on YouTube, read about Vizzie, and looked at the VPT examples (but I can't make heads or tails of them, to be honest), and I'm not sure where to start, really. I'm aware that I'll probably need to use poly~ to make sure the videos respond well to MIDI, and I think I'll probably have to use jit.movie. But is there any reason I can't use Vizzie?
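For the "polyphonic" one-shot behavior, here's a rough sketch of the kind of note-to-voice routing I have in mind, written for a [js] object sitting between my MIDI input (note/velocity pairs) and a handful of jit.movie players. The four-voice count and the "done" message coming back from the patch are just assumptions for illustration, not tested code:

```js
// Rough sketch: route incoming MIDI notes to the first free video "voice".
// Assumes 4 jit.movie players, one per outlet, and that the patch sends
// "done <voice>" back to this object when a one-shot finishes playing.
inlets = 1;   // expects "note velocity" pairs (e.g. from [notein] -> [pack 0 0])
outlets = 4;  // one outlet per jit.movie voice

var busy = [false, false, false, false];

function list(note, velocity) {
    if (velocity === 0) return;      // ignore note-offs; one-shots play out
    var v = busy.indexOf(false);     // first free voice
    if (v === -1) v = 0;             // all voices busy: steal voice 0
    busy[v] = true;
    outlet(v, note);                 // the patch maps note -> "read <file>, start"
}

function done(v) {
    busy[v] = false;                 // voice is free again
}
```

Looped clips would just hold on to their voice until they're explicitly stopped.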
At some point I figure I'll start simple and go from there, but some advice sure would be appreciated. I've considered offering this up as a paid job as well, but I don't know if anyone would be interested.
I would start by looking at the jit.playlist object instead of poly~; it's much simpler to get started with. See my reply here.
Enable output_texture on your jit.playlist object and send its output to Vizzie modules for effects processing.
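If you want each clip mapped to a specific note, a tiny [js] between your MIDI input and jit.playlist is one way to do the glue logic. This is just a sketch: the note-to-clip table is something you'd fill in yourself, and check the jit.playlist reference for the exact clip-triggering message and whether clip numbers start at 0 or 1:

```js
// Sketch: map MIDI note numbers to clip-trigger messages for jit.playlist.
// The noteToClip table is hypothetical; fill it in for your own videos.
inlets = 1;   // "note velocity" pairs (e.g. from [notein] -> [pack 0 0])
outlets = 1;  // connect to jit.playlist

var noteToClip = { 60: 1, 61: 2, 62: 3 };    // e.g. C3 -> clip 1, etc.

function list(note, velocity) {
    if (velocity === 0) return;              // ignore note-offs
    if (noteToClip.hasOwnProperty(note)) {
        outlet(0, "clip", noteToClip[note]); // ask jit.playlist to play that clip
    }
}
```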
My only concern is that if it's a playlist, the videos would potentially play one after another, which isn't the desired behavior. I would guess I could somehow turn that feature off, though, if desired.
I couldn't help but also notice mc.jit.movie~, which looks interesting.
I'll check out the sample you posted, thanks!
That's not how playlist functions, so you should be fine.
Some more questions:
1. Is there another way, besides using playlist, that would let me 'build' lists of videos to play AND use the audio from those videos, to be processed by effects?
The reason I ask is that while I can extract just the audio from the clips, keeping it synchronized with the video (if the clips are long) seems to be an issue. I seem to be in the minority in that I actually want to use the audio from the videos. The other issue is loop-point synchronization: setting loop points in the video doesn't automatically set the corresponding loop points in the audio. Thanks!
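For what it's worth, the bookkeeping I imagine needing for the loop points is basically just a unit conversion. Something like this in a [js] object, assuming jit.movie's loop points are expressed in movie time units relative to its timescale attribute, and that the audio player (e.g. groove~) wants milliseconds; both of those are assumptions worth double-checking:

```js
// Sketch: convert jit.movie loop points (movie time units, relative to the
// movie's timescale attribute) into milliseconds for the audio player's loop.
inlets = 1;
outlets = 1;  // outputs "loopstart_ms loopend_ms"

var timescale = 600;    // common default; update it from the movie's actual attribute

function settimescale(ts) {
    timescale = ts;     // e.g. send "settimescale 600" after loading a movie
}

function setloop(start, end) {
    var startMs = (start / timescale) * 1000.0;   // movie units -> milliseconds
    var endMs   = (end   / timescale) * 1000.0;
    outlet(0, startMs, endMs);
}
```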