Are there any DSP gurus here who know how I can get rid of the pitch-dependent detuning that appears when my damping (averaging) filter is active in this tuned comb filter / string resonator patch? I'm not worried about using all-pass interpolation for the main feedback loop (I don't mind the varying frequency response in the comb peaks caused by linear interpolation), but I am trying to work out how to have a damping control that doesn't shift the fundamental frequency of the comb filter. Of course this is only really noticeable in the upper octaves, and before anyone says "check your signal vector size": it's in gen~, so the signal vector size isn't a problem.
I've thought about adding a pitch-dependent offset to the comb tuning / delay time, scaled by the damping coefficient, but I expect there's a simpler and more elegant solution, like using a different filter for the damping. Anyone know?
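To make the offset idea concrete, here's a rough sketch (in Python rather than a gen~ codebox, just for illustration). It assumes the damping stage is a one-pole lowpass of the form `y[n] = (1-d)*x[n] + d*y[n-1]`; if yours is a different averager the same idea applies, only the transfer function changes. The lowpass adds a frequency-dependent phase delay inside the loop, and the detuning comes from that extra delay. The compensation is to evaluate the filter's phase delay (in samples) at the comb's fundamental and subtract it from the delay-line length, rather than guessing a scaled offset:

```python
import cmath
import math

def phase_delay_samples(d, freq, sr):
    """Phase delay (in samples) of the assumed damping filter
    y[n] = (1-d)*x[n] + d*y[n-1], evaluated at `freq` Hz."""
    w = 2.0 * math.pi * freq / sr          # normalized radian frequency
    h = (1.0 - d) / (1.0 - d * cmath.exp(-1j * w))  # H(e^jw) of the one-pole
    return -cmath.phase(h) / w             # tau(w) = -arg H(e^jw) / w

def compensated_delay_samples(freq, sr, d):
    """Delay-line length that keeps the comb tuned to `freq`
    with the damping filter in the feedback loop."""
    return sr / freq - phase_delay_samples(d, freq, sr)
```

With `d = 0` the filter is transparent and the compensation is zero; as `d` rises, the phase delay approaches `d / (1 - d)` samples at low frequencies but grows relative to the period as you go up in pitch, which is exactly why the detuning is only obvious in the upper octaves. In gen~ this would just be one `atan2`/`cos`/`sin` evaluation per pitch or damping change feeding the delay-time calculation.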