Sub-millisecond tapin~/tapout~?
I've got a working looper using tapin~/tapout~, and it works great: click-free, etc. But I want additional loops to be fractions or multiples of the length of the first (some longer, some shorter), all based on the length of the first recorded loop.
However, subsequent loops are drifting by a millisecond at a time because tapin~/tapout~ doesn't interpolate.
It's working so smoothly apart from that, and I really don't want to go down the groove~ / buffer~ / record~ route unless I have to - it's been a big headache for me in the past!
Could someone tell me if buffer recording is the only way? Or is there a syncing technique I need to read about?
Any advice very gratefully received.....
Thanks everyone,
Lee Morgan
I was surprised to learn this, but it does look like tapin~/tapout~ doesn't interpolate with delay times < the signal vector size. It does interpolate for delays longer than the signal vector, so I don't think this is the source of your problem. There may be some drift caused by the interpolation itself, since fractional delays are a compromise solution, but it takes a while to manifest (depending on delay lengths), since the shift shouldn't exceed 1 sample per cycle. If that's still too much, calculate your delay in samples, convert it to an int, then multiply and reconvert to ms. You should get a non-fractional delay (or a very minimally fractional one, depending upon some vagaries of floating-point math? -- let's see if that summons Peter Castine) that will not drift. (See the example on the right.)
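A minimal sketch of that ms -> samples -> int -> ms round trip, written in Python rather than Max just to show the arithmetic (in a patch you'd do the same with expr or the mstosamps~/sampstoms~ objects); the 44.1 kHz sample rate is an assumption:

```python
SR = 44100.0  # assumed sample rate; use whatever your DSP is running at

def quantize_delay_ms(delay_ms, sr=SR):
    """Snap a delay time to a whole number of samples so the delay
    line never has to interpolate a fractional delay."""
    samples = int(delay_ms * sr / 1000.0)  # truncate to an integer sample count
    return samples * 1000.0 / sr           # back to ms: now an exact sample count

base = quantize_delay_ms(473.2)       # quantized master loop length
half = quantize_delay_ms(base / 2.0)  # derive fractional loops the same way
print(base, half)                     # both land on exact sample boundaries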
From your description, though, I think the drift may have a different source: I'm guessing you may be accidentally converting some of your delay times to ints via something like [* 2] (or [/ 2]) instead of [* 2.]. That would account for the drift happening in ms rather than in samples.
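A hypothetical illustration of that gotcha, again in Python for readability: in Max, [* 2] does integer math and truncates incoming floats, while [* 2.] keeps the fraction.

```python
delay_ms = 333.7

int_style = int(delay_ms) * 2   # behaves like [* 2]:  333.7 -> 333 -> 666
float_style = delay_ms * 2.0    # behaves like [* 2.]: 667.4, fraction kept

print(int_style, float_style)   # 666 667.4 -- the lost fraction appears as ms-scale drift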
Thanks Peter - really helpful. And delighted if I don't have to abandon the tapin~/tapout~ method of looping! Will try it later...
Yeah, I'm having a blast of a time looping with tapin~ + tapout~. My performance patch goes from tuned comb-style sounds up to a four-bar loop, and it's incredibly tight - no perceptible drift so far.
"tapin~/tapout~ doesn’t interpolate with delay times < signal vector size."
i am confused by this. it does not interpolate? it should not even be possible to have delay times smaller than one vector, should it?
about the shift: wasn't there a bug which added 1 ms or 1 sample to tapout~ in an earlier version of either Max 5 or Max 6?
-110
I was surprised, too, Roman. It's definitely possible to have delay times shorter than 1 sig vector, but apparently interpolation ceases. (At least it does in the code I posted!)
This is probably a good question for the C'74 folks...
Not sure about the bug; this code is from the latest version.
Hmmm, on my system I don't manage to get delays smaller than the signal vector size.
Concerning the original question, I see drift when the delay time is provided as a signal. Therefore I don't understand this conclusion:
"However, subsequent loops are drifting by a milisecond at a time because tapin/out doesn’t interpolate."
FYI...