As you may be aware, I'm chipping away at this old granular playback engine of mine/ours, based on the idea of one core phasor plus two [play~] objects running 50% out of phase with each other. For an as-yet-unknown reason, when the grain size is a multiple of 40 ms, linear playback sounds damn near perfect: an A/B comparison of the file played back via sfplay versus my granulator reveals (subjectively, anyway) no artefacts, warbling or colouration. I'm not concerned with grain clouds etc. here, just linear playback for the moment. To my question: has anyone encountered (in practice or in the literature) a reference to grain sizes that are multiples of some N? If I use a grain size of 82, 121 or 45 ms, the old Dalek AM/RM warbling occurs. 40, 80 or 120 ms (each halved, obviously, because of the phase-shifted grain) produce no discernible artefacts at all.
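One numerical observation that may or may not be relevant (this assumes a 44.1 kHz sample rate, which may not match your session): the "good" grain sizes above come out to a whole number of samples, while the "bad" ones don't. A throwaway Python sketch, keeping the arithmetic exact with integers:

```python
SR = 44100  # assumption: 44.1 kHz sample rate

def grain_len_samples(ms):
    """Grain length in samples for an integer grain size in ms.

    Returns (length_in_samples, is_whole): ms * SR / 1000, plus a flag
    saying whether that lands on a whole sample.
    """
    num = ms * SR  # exact integer numerator
    return num / 1000, num % 1000 == 0

for ms in (40, 80, 120, 45, 82, 121):
    length, whole = grain_len_samples(ms)
    print(f"{ms:3d} ms -> {length:8.1f} samples, whole={whole}")
```

Caveat: at 44.1 kHz any multiple of 10 ms is also a whole number of samples, so if 50 or 60 ms also play clean, whole-sample grain lengths (rather than the factor 40 itself) might be the real variable.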
I'm driving my grain window(s) from one phasor, and the window duration is obviously tied to grain size. Why should any deviation from 40, 80 or 120 ms produce warbling? I'm not posting a patch yet, for two reasons: (1) it's just a simple phasor --> play * window (plus a phase-shifted copy), and (2) the algorithm itself doesn't offer any clues.
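For reference, the window maths alone shouldn't single out 40 ms: with a periodic Hann window (an assumption about the window shape; substitute your own curve) and the second voice exactly 50% out of phase, the two envelopes sum to a constant 1 for any even grain length in samples. A quick numpy sketch of that overlap sum:

```python
import numpy as np

def hann(N):
    """Periodic Hann window of length N samples."""
    n = np.arange(N)
    return 0.5 - 0.5 * np.cos(2 * np.pi * n / N)

def overlap_envelope(N):
    """Sum of two identical windows, the second offset by N/2 (50% out of phase)."""
    w = hann(N)
    return w + np.roll(w, N // 2)

env = overlap_envelope(1764)  # e.g. 40 ms at 44.1 kHz
print(env.min(), env.max())   # both should be ~1.0, i.e. a flat envelope
```

If the summed envelope is flat regardless of grain length, the amplitude window by itself can't be the source of the warbling, which would point the finger at how the window rate interacts with the playback increment instead.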
I'm familiar with the Curtis Roads text on this subject.