First, I want to thank you for your fast answers.
I love this forum!
I did my tests with
"Overdrive": off
"Scheduler in Audio Interrupt": off
You are right, the "Signal Vector Size" does not have much effect on the timing.
Only the "I/O Vector Size" and the "Sampling Rate" do.
My question is:
Does changing the "I/O Vector Size" or "Sampling Rate" affect the stability of my standalone?
To describe what I am trying to achieve in more detail:
I have built a server/client video player system.
One computer is the server, which generates the video frames and sends them with [udpsend] to the client computers.
The server is triggered by professional media server hardware and has to play in sync with it over minutes.
The sync does not have to be "frame accurate", but should not run totally out of sync over a period of 5 minutes.
So I am trying to generate a master clock (timecode) with the most accurate timing possible in Max.
My test patch is the following: metroTest.zip