Music video sync and triggering video events
Hi there. I tried searching and using AI for this (which gave me some insight but also threw me into some long dead ends), so I thought I'd just ask here.
I'm creating some music videos with Max/Jitter, so I'm working with very long audio and video files, and along the way I want to trigger smaller videos to start and stop on selected beats. Quite a few of the smaller videos play simultaneously, so all my vids are encoded in HAP. The videos are all trimmed on the beat to a set tempo.
So I have sfplay~ as my master sync source right now, and video is synced to the position messages it outputs. To trigger videos, I'd like to be able to select a bar/beat start point. Currently I've found a way using those same position messages from sfplay~ into snapshot~ (set to 50 ms), then into translate to get bars/beats. This has added a good chunk of CPU %, though, which I'm trying to keep down. Does anyone have other sync methods that might work here and be more efficient?
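For reference, the position-to-bars/beats conversion that [translate] performs is just arithmetic, so it could also live in a [js] object if that turns out cheaper in a given patch. A minimal sketch of that math, assuming a fixed BPM and a fixed meter (the function and parameter names here are mine, not objects from the patch):

```javascript
// Convert a playback position in milliseconds to 1-indexed bar/beat,
// assuming a constant tempo (bpm) and constant meter (beatsPerBar).
// This mirrors what [translate] does for a fixed transport tempo.
function msToBarsBeats(positionMs, bpm, beatsPerBar) {
  const beatMs = 60000 / bpm;             // duration of one beat in ms
  const totalBeats = positionMs / beatMs; // fractional beats elapsed
  const bar = Math.floor(totalBeats / beatsPerBar) + 1;
  const beat = Math.floor(totalBeats % beatsPerBar) + 1;
  return { bar, beat };
}
```

At 120 BPM in 4/4, for example, a position of 2000 ms is the downbeat of bar 2. Note this only holds while the tempo is constant, which matches the "trimmed on beat to a certain tempo" setup described above.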
sounds like you're on the right path. you're saying sfplay~ into translate is causing a significant cpu increase? I would first make sure that's really the culprit. are you simply syncing start points or are you trying to sync jit.movie frames to the output of sfplay~?
hard to offer much without seeing a bit of how you're syncing. perhaps you can isolate a small test case of what you're doing that isn't performing as well and post here.
Awesome, thanks for the info @Rob Ramirez. Great to know this is generally the right approach; AI was giving some strange answers at one point. Basically I'm just trying to keep the CPU down as much as possible because I'll be playing around 8 HAP videos with gradient masks simultaneously. I opened the project again from scratch the next day, and the CPU wasn't taking nearly as much of a hit, so maybe I had some extra windows open or something else happening in the background.
I guess I am syncing more than just the start points of the movies, since I need them to stay in sync and not drift over what can sometimes be many minutes (could be 10 min sometimes). There's one main video synced to the audio, and the other videos are synced relative to the main video (just delayed precisely in time).
Currently, I have one snapshot~ polling every 5 sec or so to make sure the videos stay in line, and another snapshot~ creating my custom "tempo" to start the videos on exact beats. For that one, I found I need to poll something like every 15 ms or else it can look off-sync initially. Is it common to poll this frequently only momentarily? For example, set it up to poll every ms starting on the beat just before I trigger the video (which I'm currently doing by enabling/disabling a videoplane), and then go back to much less frequent polling?
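One way to reason about that "fast polling only around the trigger" idea: since the tempo is fixed, the time until the next beat boundary is computable, so the poll interval could be chosen based on how close the next beat is. A hedged sketch of that logic, with all names and the window/interval values invented for illustration:

```javascript
// How long (ms) until the next beat boundary at a constant tempo.
function msUntilNextBeat(positionMs, bpm) {
  const beatMs = 60000 / bpm;
  const intoBeat = positionMs % beatMs;
  return intoBeat === 0 ? 0 : beatMs - intoBeat;
}

// Pick a poll interval: fast only inside a small window before each
// beat (where a trigger might fire), slow the rest of the time.
function pollInterval(positionMs, bpm, fastMs, slowMs, windowMs) {
  return msUntilNextBeat(positionMs, bpm) < windowMs ? fastMs : slowMs;
}
```

In a patch, the same idea could be a [metro] whose interval message is switched when the next-beat countdown drops under the window, then switched back after the trigger.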
UPDATE: I've revised the patch to use a single snapshot~ that updates the transport and jit.movie objects in case of drift.
I'm guessing it might be a little more complicated than necessary? It does seem to be working with low CPU, but I'll keep testing over time... at one point I was getting a beat-delayed transport.