I’m posting here but I’m not a programming person. I just wanted a quick opinion on whether I’m looking in the right area.
I’ve had a longstanding issue with outputting Jitter frames to an external broadcast card.
I get a stuttery picture whenever I try to output via Jitter. I think this could be partly due to Jitter running on a different clock from my video card.
Searching around, I found a QT routine called "clockgettime" which, strangely enough, allows you to get the time from the video card’s clock.
Is there a way to use this clock to drive my metronome, and if so, would it increase the likelihood of Jitter spitting out frames at the right time for my video card to accept them?
Or have I just got terribly mixed up?