Triggering audio grains: timing inaccuracies -- how to fix?
I've made a prototype patch to do some granular synthesis of a real-time stream (not sound-file based). It uses a delay line (tapin~, tapout~) as a buffer to read from.
The subpatch in this posted patch will form the guts of a poly~, but I'm already experiencing bad timing problems even with this simple patch, as illustrated. Up to +-10 ms of inaccuracy (MacBook Pro, OS 10.6.8). As you can see, the timing of the metro that triggers the grains is fine, but the line~ object's output is all wonky.
I've tried all sorts of scheduler and audio settings and cannot figure out what the problem is.
I've actually already implemented this in a poly~, and also done a poly~-fied gen~ version, and still there is the timing problem.
Any suggestions greatly appreciated.
T
Hi Terry
I can't look at your patch just at the moment, but I believe that signal-rate timing (e.g. [phasor~]) is more reliable than control-rate ([metro]). Are you triggering the ramp very fast?
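Something like this in a gen~ codebox would give you a signal-rate trigger (just a rough, untested sketch of the idea; the 0-1 ramp from [phasor~] comes in at in1, and you get a one-sample click out every time the ramp wraps, which could stand in for the metro bang):

History prev(0);
// in1 is the 0..1 ramp from phasor~
// output a single-sample 1 whenever the ramp wraps, i.e. once per cycle
trig = (in1 < prev) ? 1 : 0;
prev = in1;
out1 = trig;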
Brendan
Thanks for the post Brendan.
I'll try the phasor suggestion, but I don't think that's the problem. (hat ready to be eaten)
When you look at the patch you'll see that the metro is used to trigger various grain params as a list (transposition, delay, pan, duration, etc.), so the easiest way to trigger the list is with a bang, aka metro -- which outputs with +-0.25 ms accuracy. I guess I could try train~ for that too, but it doesn't seem to be the source of the problem.
The thing is, the grain-making subpatch has +-10 ms accuracy (or lack thereof) on my machine, even though the bangs are coming in with great regularity.
T
BTW it's irregular even at quite slow speeds like 50 ms.
Back again
I looked at your patch, and it does demonstrate the timing discrepancy as you describe it. However, I'm not sure where the error is coming from (or what issues it creates). So I duplicated the line function you've got, using the method I suggested above (phasor --> trapezoid) in the signal domain. Perhaps you could bolt this onto your granule subpatch and see if the error persists?
HTH
Brendan
Ta, I'll have a look at it now-- I just tried replacing metro with train~ and ... it's just as bad, if not worse!
Well, I made a crummy single-shot ramp with phasor to replace line: no joy, just as bad. Need to rethink this. But it should work anyway!
Thanks again -- that's a better way to make a one-shot, but it hasn't changed my original and persistent problem. I'd be interested to hear whether others get similar timing errors when they load my original patch -- and if they _don't_, I'd be interested in knowing what their scheduler/audio settings are...
T
Terry, trying it here, the biggest variables are i/o vector size (audio) and scheduler interval (prefs/scheduler) - smaller values yield better results in both cases.
Thanks Steve, reducing the IO vector size seems to have worked.
I had always been under the impression that the I/O vector size doesn't affect much except I/O latency, and that it's the signal vector size that matters for precise timing of internal processes. But apparently not so. I changed my I/O vector to 32, resulting in less than 1 ms of jitter (CPU goes from 1% to 7%). I've also enabled 'scheduler in audio interrupt', which seems to create a larger number of timing errors but limits the overall range of error, if you know what I mean (it smooths out the error to some extent).
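For what it's worth, the numbers line up if I assume a 44.1 kHz sample rate and that my previous I/O vector was the usual 512-sample default:

512 samples / 44100 Hz ≈ 11.6 ms per vector
32 samples / 44100 Hz ≈ 0.73 ms per vector

which matches the drop from roughly +-10 ms of jitter to under 1 ms.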
Now to see if repatching as a gen~ will reduce cpu use down to an acceptable level (7% per instance not good enough really)....
T
I notice that when I turn on audio interrupt, I get a very predictable pattern in the error field. Overdrive makes a big difference, as does the IO vector size for me. Does this affect the sound quality for you?
Some things to consider: once you ship it off to poly~ you may not have this problem, since you'll be sending the messages to different targets; so I wouldn't troubleshoot this outside the context of poly~, because that will almost certainly affect things. The timing irregularity may also come down to when line~ outputs its bang. Is it immediately at the end of the ramp, or at the end of the signal vector? (I suspect the latter, but I'm not really sure; I bet you could come up with a test by using a very large signal vector.) Is the irregularity only at the end of the grain? (Can you get a reasonably reliable comb-filtering effect with a grain rate of every 25 ms?...)
Line~ is going to work better than the phasor~ combo because it's possible for phasor~ to loop before the change has registered. If you want something that can reset at signal rate, then I'd recommend +=~ and clip~ 0. 1. You'll need to calculate the value going into the left inlet as 1/timeInSamples (e.g. 0.01 = go to 1 over 100 samples). It would be important, however, to calculate this at signal rate, as control rate will only give you 32-bit precision, which is going to be inadequate.
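Since you're heading for gen~ anyway, the codebox version of that idea would be roughly this (untested sketch, names and inlets just for illustration: in1 is a one-sample trigger, in2 is the grain duration in ms):

History ramp(1);   // start at 1 ("finished") so nothing plays until triggered
History inc(0);
if (in1 != 0) {    // new grain: reset the ramp and compute the increment at signal rate
    ramp = 0;
    inc = 1 / max(mstosamps(in2), 1);
}
ramp = min(ramp + inc, 1);   // accumulate and hold at 1, like +=~ into clip~ 0. 1.
out1 = ramp;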
Couple of small things: I'm a big fan of using trigger and zl nth instead of unpack for these kinds of patches. It means you don't have to worry about the order in which things arrive. Also, you can replace the if/then object with "maximum 0." to create a single-sided clipping function.
Hi Peter,
I'm only listening through laptop speakers, so audio quality isn't discernible right now, but obviously the timing errors are audible, if that's what you mean. With audio interrupt enabled, the timing errors between the source metro and the line bang are almost identical, which makes sense. This makes me think that using the line bang as a way of measuring the audio output 'grain timing' error is valid (at least for errors > 1 ms).
OK, I just tested the timing of line~'s bang output as you suggested, and its error does not depend on the signal vector size, but it _does_ depend on the I/O vector size (with audio interrupt enabled). It's a real eye-opener to me that it's the audio I/O vector size that basically sets the 'control' rate in Max, not the signal vector size. It also means I can't use this method to test audio event timing.
I think I'll go with your +=~ idea-- in fact I might just cut to the chase and try doing everything, including grain parameter production, in the signal domain and see how I go.
Thanks for the other tips too-- I should post my quandaries more often...
T
As a follow-up to what I said (now having had morning coffee...), I'd change the bit about +=~ and do it like this:
sig~ 1
|
+=~
|
/~ yourTimeInSamples-1
|
minimum~ 1.
|
This way you're not having to do the 1/time division at control rate just to get a double-precision value (you can't get that at control rate anyway); you're just adding 1 every sample. There are several libraries out there for signal-rate grains; maxobjects.com can save you some time.
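The gen~ codebox equivalent of that chain would be something like this (again an untested sketch; in1 is a one-sample reset/retrigger click, in2 is your time in samples):

History count(0);
if (in1 != 0) { count = 0; }                // retrigger: reset the accumulator
out1 = min(count / max(in2 - 1, 1), 1);     // count / (time-1), capped at 1, like /~ then minimum~ 1.
count = count + 1;                          // add 1 every sample, like sig~ 1 into +=~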
I'm not entirely sure that line~ is going to (or can) give you an accurate measurement. Which scheduler thread is that bang running in? I'm a bit surprised that it's the IO vector size that makes such a difference, but that's apparently the case.
Do you have a particular usage case for granular that you're going towards? (are the grains and/or grain rate mostly uniform?)
Thanks for the low-level advice, makes sense.
I don't want to use 3rd-party objects, as the aim of doing it in MSP is to prototype it for a gen~ patch (there's an external called shot~ that I've used before that works well as a one-shot ramp). Ultimately the aim is to make an external, but I want to go through these prototyping steps first.
I guess I could make sure the bang from line is high priority by using del 0, but that timing issue is basically solved for me now -- I just have to run with audio interrupt on and a low I/O vector, and it's as good as it's going to get...
Yes, parameter control will be stochastically structured, but I want as fine a control as possible (i.e. not an error of +-x ms when my inter-onset mask is less than that...).
I'll probably have a dynamically controlled window shape as well (using some sort of trapezoid generator), rather than reading from a fixed buffer.
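In codebox terms I'm imagining something like this for the window (sketch only; "fade" is a made-up param for the attack/release fraction, and in1 is the 0..1 grain ramp):

Param fade(0.1);   // fraction of the grain spent fading in and out (0..0.5)
f = clamp(fade, 0.001, 0.5);
// rise over the first f of the ramp, hold at 1, fall over the last f: a trapezoid
w = min(in1 / f, (1 - in1) / f);
out1 = clamp(w, 0, 1);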