metros & qmetros in jitter, and how to throttle data

    Nov 07 2008 | 6:27 pm
    Hi all, I have a patch for a new show that's doing a whole bunch of fun little things. It was mainly set up to do video tracking (bounds, centroid, blob tracking, etc.)...however, one day I put the IR image of what I was tracking up on the big screen and everyone got excited about using the visuals in this show. Jump forward a month and I'm doing lots of stuff with masks and mattes and mirrored images and dancing blobs and other craziness...it's been quite a journey, especially for a sound guy :)
    OK, so my first question/scenario concerns metros/qmetros, and whether or not I am following the best method here: I have a main qmetro @ 5 ms...this goes to jit.qt.grab, where all the fun begins. I have the interval so low because at certain times, with tracking and video control on at the same time, my frame rate drops quite a bit, from 70-80 fps down to the high 30s/low 40s. Still great for what I need. If I raise the qmetro interval to, say, 33.3 ms, I'll see the frame rate drop to the low teens...not so great when the performers are interacting with the [...], so I keep it fast. All the pre-rendered images/movies are driven by their own metro as needed @ 30 ms. Everything gets rendered to several layers in GL (via [...]), with a separate qmetro @ 20 for [...]. So to summarize:
    qmetro 5 >> jit.qt.grab >> patches for video tracking & displaying of live images >> [...]
    metro 30 >> [...] >> some image manipulation patches >> [...]
    qmetro 20 >> [...]
    Does this seem right? Should everything be fed off the same (q)metro? Am I using qmetro/metro in the right place? I've been going largely off of examples I've found here and in fabulous recipes, without necessarily understanding for sure what I am doing.
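    (For what it's worth, the multi-clock layout above can be sketched outside of Max. The following is only an illustrative Python analogy, not Max code, and the function name is invented: the point is that each stage has its own clock, so each runs at its own rate instead of all being chained to one master bang.)

```python
# Illustrative sketch (NOT Max code): each stage in the patch has its
# own metro/qmetro interval, so each fires at its own independent rate.
def ticks(interval_ms, duration_ms=1000):
    """Number of bangs a metro at `interval_ms` fires in `duration_ms`."""
    return duration_ms // interval_ms

stages = {
    "grab chain (qmetro 5)":   5,   # live camera + tracking
    "movie chain (metro 30)":  30,  # pre-rendered images/movies
    "render chain (qmetro 20)": 20, # GL layers
}
for name, interval in stages.items():
    print(f"{name}: {ticks(interval)} bangs/sec")
# The grab chain gets ~200 bangs/sec while the movie chain only needs ~33,
# which is why chaining everything off one clock would waste work or starve a stage.
```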
    OK, question 2: at one moment in the show I am using jit.slide to create a little ghostly effect. Clearly the quality of the "trail" depends on the rate at which the data is coming in...sometimes 70 fps, sometimes 30 fps. The images originate from the live camera feed (with the qmetro @ 5). We actually like the effect at the lower frame rate (sort of a stop-motion look), so I was wondering what the best method would be for throttling the data going into jit.slide? speedlim/qlim? Feeding the data into a jit.matrix (@thru 0) with its own metro set to the desired frame rate?
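    (As an aside, the speedlim-style option can be sketched in plain Python. This is an illustrative analogy, not Max code, and the class name is invented: the gate passes a frame only if at least the chosen interval has elapsed since the last frame it let through, and silently drops the rest.)

```python
class SpeedLim:
    """Minimal analogue of a speedlim-style throttle: drop messages that
    arrive sooner than `interval_ms` after the last one passed through."""
    def __init__(self, interval_ms):
        self.interval_ms = interval_ms
        self.last = None

    def accept(self, t_ms):
        """Return True if a message arriving at time t_ms may pass."""
        if self.last is None or t_ms - self.last >= self.interval_ms:
            self.last = t_ms
            return True
        return False

# Frames arriving every 14 ms (~71 fps), throttled with a 33 ms gate:
gate = SpeedLim(33)
passed = [t for t in range(0, 1000, 14) if gate.accept(t)]
print(len(passed))  # prints 24: every 3rd frame passes (42 ms apart, ~24 fps)
```

    Note the quantization: because frames only arrive every 14 ms, a 33 ms gate actually yields one frame per 42 ms, not per 33 ms, so the effective rate lands a bit below the target.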
    Thanks in advance for any advice you might be able to give. Hopefully my questions aren't TOO naive ;)
    Best, David

    • Nov 11 2008 | 9:06 am
    • Nov 11 2008 | 5:44 pm
      Hi JS, Thanks for your response...certainly appreciated.
      It's not really messy organization or efficiency that is my problem...I don't think ;) I'm a veteran Max user on the audio side...just a bit new on the Jitter side. I feel I have a good grasp of the concepts and the mix of processing on the CPU vs. the GPU, and I certainly have everything well organized, having long been afflicted with the "everything must line up and look pretty" patching disease. My question was more of a conceptual/architectural/best-practices question WRT metros: where to place them, which type to use, and what intervals. It seems like you are suggesting the same approach that I've been taking, which is reassuring...just slightly different intervals for the (q)metros. WRT jit.qt.grab, it's not that I am trying to defy the physical limitations of my digitizer (although that would be lovely), but rather to make sure I have a nice, fast stream of data feeding the rest of my processing patches further down the chain (and therefore always have > 30 fps on all the processing)...pretty much the same as your suggestion of a "side chain" metro of 16 ms immediately after the grabber.
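      (That "side chain" idea can be sketched the same way. Again a hypothetical Python analogy, not Max code: the grabber keeps a latest-frame slot up to date at its own fast rate, while a slower downstream clock just reads whatever is current, much like banging a jit.matrix with @thru 0 from its own metro.)

```python
# Illustrative sketch (NOT Max code): fast capture clock fills a
# latest-frame slot; a slower downstream clock reads the freshest frame.
latest = None

def grab(frame):      # driven by the fast capture clock (every 5 ms here)
    global latest
    latest = frame    # overwrite: no queue, no backlog

def render_tick():    # driven by the slower downstream clock (every 20 ms)
    return latest     # always the most recent frame at read time

rendered = []
for t in range(0, 100, 5):        # simulate 100 ms of 5 ms capture ticks
    grab(f"frame@{t}ms")
    if t % 20 == 0:               # downstream clock fires every 20 ms
        rendered.append(render_tick())
print(rendered[:3])  # prints ['frame@0ms', 'frame@20ms', 'frame@40ms']
```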
      Based on your reply, and the plethora of examples, I think I'm on the right page here...just wanted to make sure.
      Thanks again for your advice.
      Best, David