Methods for Syncing Audio and Frame Buffer

Matt Romein:

I'm trying to record into continuous audio and frame buffers that stay in sync with one another: basically, grab the last 2 seconds of audio and video and be able to offset them to create various delay effects. I'm a little lost on the best way to do this, though. I'm using poke~ and count~ to write into the buffer~ object, then converting the count~ signal to an integer and scaling it to the equivalent frame index to drive jit.matrixset (I'm actually doing this with a GPU/texture method, but included jit.matrixset in the example because it exemplifies my problem more simply).

My problem is that in converting the signal to an integer, many frame indexes get skipped or dropped, leaving quite a few junk frames in the frame buffer. Normally I would just index each frame as it arrives from the jit.grab object, but since I'm syncing with the audio buffer that option isn't available. I understand why my current solution is less than ideal, but I'm a bit stuck on other ways to approach it. Any ideas?

[Max patch attached: copy the patch and select New From Clipboard in Max to open it.]
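To make the indexing problem concrete, here is a rough sketch of that sample-to-frame scaling outside of Max (plain Python; the 44.1 kHz sample rate, 30 fps frame rate, and 50 ms snapshot interval are assumptions, not values taken from the patch):

```python
# Rough sketch (plain Python, not Max) of the index math described above.
# All concrete numbers are assumptions: 44.1 kHz audio, 30 fps video,
# a 2-second ring buffer for both.
SR = 44100                    # audio sample rate
FPS = 30                      # video frame rate
SECONDS = 2
AUDIO_LEN = SR * SECONDS      # samples held in the audio ring buffer (88200)
FRAME_LEN = FPS * SECONDS     # frames held in the video ring buffer (60)

def frame_index(sample_index: int) -> int:
    """Scale an audio write position to the matching video frame slot."""
    return (sample_index * FPS // SR) % FRAME_LEN

# If the write index is taken by sampling the audio counter at some other
# rate (say every 50 ms), consecutive readings can map to non-adjacent
# frame slots, so some slots never receive a fresh frame:
indices = [frame_index(int(SR * ms / 1000)) for ms in range(0, 300, 50)]
print(indices)   # [0, 1, 3, 4, 6, 7] -- slots 2 and 5 are skipped ("junk frames")
```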

Vincent Goudard:

Hi Matt,

What about using a delay?

[Max patch attached: copy the patch and select New From Clipboard in Max to open it.]
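Conceptually, the delay approach amounts to reading both buffers at a fixed offset behind one shared write position, so the audio and video offsets stay locked together. A minimal sketch of that idea, in the same assumed Python terms as above (not a transcription of the attached patch):

```python
# Conceptual sketch of the delay approach: read both buffers at a fixed
# offset behind one shared write position. Rates are assumptions, as above.
SR, FPS, SECONDS = 44100, 30, 2
AUDIO_LEN, FRAME_LEN = SR * SECONDS, FPS * SECONDS

def delayed_indices(write_sample: int, delay_ms: float) -> tuple[int, int]:
    """Return (audio_read_index, video_read_index) for a given delay."""
    delay_samples = int(SR * delay_ms / 1000)
    audio_read = (write_sample - delay_samples) % AUDIO_LEN
    video_read = ((write_sample - delay_samples) * FPS // SR) % FRAME_LEN
    return audio_read, video_read

print(delayed_indices(write_sample=44100, delay_ms=500))
# -> (22050, 15): half a second behind the write head in both buffers
```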

Rob Ramirez:

Hey Matt, I believe you're going to need to synchronize your read and write triggers. Currently you have them driven by two different snapshot~ objects. I think you need to drive them by either the metro triggering your jit.grab or by the jit.grab output (my guess is the latter), i.e. remove the snapshot~ argument and trigger its output with a bang.

You might want to throw some deferlows after your snapshot~ objects if things are still screwy.
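A rough sketch of what driving everything from one trigger looks like, again in plain Python rather than Max and with assumed rates. In the real patch the audio would still be written continuously by poke~; this only illustrates how deriving every index from a single per-frame counter keeps frame slots from being skipped:

```python
# Conceptual sketch (not Max code): derive every read and write index from
# one clock -- a counter bumped once per captured frame -- instead of two
# free-running snapshot~ rates. Rates below are assumptions, as above.
SR, FPS, SECONDS = 44100, 30, 2
AUDIO_LEN, FRAME_LEN = SR * SECONDS, FPS * SECONDS
SAMPLES_PER_FRAME = SR // FPS

frame_clock = 0   # advances by exactly 1 per captured frame, so no slot skips

def on_new_frame(delay_frames: int = 15) -> dict:
    """Called once per incoming frame; returns all indices for that tick."""
    global frame_clock
    write_frame  = frame_clock % FRAME_LEN
    write_sample = (frame_clock * SAMPLES_PER_FRAME) % AUDIO_LEN
    read_frame   = (frame_clock - delay_frames) % FRAME_LEN
    read_sample  = ((frame_clock - delay_frames) * SAMPLES_PER_FRAME) % AUDIO_LEN
    frame_clock += 1
    return {"write_frame": write_frame, "write_sample": write_sample,
            "read_frame": read_frame, "read_sample": read_sample}

for _ in range(3):
    print(on_new_frame())
# write_frame goes 0, 1, 2, ... with no gaps, and the audio read/write
# positions are computed from the same counter, so they stay in sync.
```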