Hey all. I’ve been working on a rather large interface and I’m using the
[metro 1] object to count milliseconds to determine buffer sizes for a window.
I swear the metro object does not always run at the same speed. Checking against a
metronome, I’ve seen the biggest difference between when the DAC is on versus off:
the metro runs almost twice as slow in my interface with the DAC on. It also seems
to fluctuate widely depending on whether objects are embedded in the master patch
via bpatcher or running as stand-alone objects.
If anyone has a suggestion it would be helpful, or please at least confirm that I’m not imagining this.
First off, make sure that you have Overdrive on (Options menu). Second, 1 ms is right on the edge of what the scheduler can maintain. If you are trying to achieve 1 ms, make sure that the Scheduler Interval preference is set to 1 ms.
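To see why a 1 ms metro falls apart when the Scheduler Interval is coarser, here is a rough simulation (plain Python, not Max code, and a deliberately simplified model of the scheduler): a metro keeps its logical schedule, but its bangs can only land on scheduler wake-ups, so a period shorter than the tick gets quantized to the tick grid.

```python
import math

def fire_times(period_ms, tick_ms, n=8):
    """Simulate a metro whose bangs can only fire on scheduler wake-ups.

    The metro advances its *logical* time by period_ms each bang (so it
    never drifts), but each bang is delayed to the next scheduler tick.
    """
    times = []
    t = 0.0
    for _ in range(n):
        # next scheduler wake-up at or after the requested time
        t_fired = math.ceil(t / tick_ms) * tick_ms
        times.append(t_fired)
        t += period_ms
    return times

print(fire_times(1, 1))  # [0, 1, 2, 3, 4, 5, 6, 7] -- a 1 ms grid keeps up
print(fire_times(1, 2))  # [0, 2, 2, 4, 4, 6, 6, 8] -- bangs pile up on a 2 ms grid
```

With a 2 ms interval the bangs arrive in pairs on the 2 ms grid, which reads as the metro being roughly half speed, much like the symptom described above.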
Yeah, metro is not the way to go there. You could do it all in audio-rate objects, or if you have to get back to Max-land, maybe a phasor~ + edge~ so the clock is rock solid. You still lose some precision when it translates back to the scheduler, but at least you know the clock will hold steady.