comb filter + feedback damping filter - tuning issues
Are there any DSP gurus here who know how I can get rid of the pitch-dependent detuning that appears when my damping averaging filter is active in this tuned comb filter / string resonator patch? I'm not concerned about using all-pass interpolation for the main feedback loop (I don't care so much about the varying frequency response in the comb peaks due to linear interpolation), but I am trying to work out how to have a damping control that doesn't alter the fundamental frequency of the comb filter. Of course this is only really noticeable in the upper octaves, and before anyone says "check your signal vector size": it's in gen~, so the signal vector size isn't a problem.
I've thought about adding a pitch-dependent offset to the comb tuning / delay time, scaled by the damping coefficient, but I expect there is a simpler and more elegant solution, such as using a different filter for the damping. Anyone know?
thanks
INAG (I'm not a guru), but you'll get some damping naturally from not using allpass interpolation. How precise does your damping control need to be, and would it be feasible to do this with an FIR filter in the loop?
Since there is no tuning problem when the LPF is inactive, I'm satisfied that the linear interpolation is not the cause. I have considered crossfading between some linear-phase FIRs for the damping control, but actually I have got pretty close just by tuning by ear and offsetting the pitch accordingly (see new patch). It's probably not worth the added complexity of doing it with FIRs, I think. From posts I made on other forums, it seems that if I want to stick with the IIR I am currently using and calculate the tuning offsets precisely, I need to calculate the phase delay of the IIR LPF at the fundamental frequency of the comb filter and subtract that delay from the comb filter's delay time. Not currently sure how to go about doing that!
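For what it's worth, that phase-delay subtraction can be sketched in a few lines of Python. This assumes the damping filter is the usual one-pole lowpass y[n] = (1 - a) * x[n] + a * y[n - 1]; if your averaging filter has a different form, substitute its transfer function, and note the function names here are mine, not from the patch:

```python
import cmath
import math

def lpf_phase_delay(a, f0, fs):
    """Phase delay in samples of the one-pole lowpass
    y[n] = (1 - a) * x[n] + a * y[n - 1], evaluated at f0 Hz."""
    w = 2.0 * math.pi * f0 / fs
    h = (1.0 - a) / (1.0 - a * cmath.exp(-1j * w))  # H(e^jw)
    return -cmath.phase(h) / w

def compensated_delay(f0, a, fs):
    """Comb delay time: the ideal fs/f0 samples minus whatever
    the damping filter already delays at the fundamental."""
    return fs / f0 - lpf_phase_delay(a, f0, fs)
```

With a = 0 the correction is zero and you get the plain fs/f0 delay back; as the damping coefficient rises, the delay line gets shortened by the filter's phase delay at the fundamental, which is what keeps the pitch fixed.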
there is a good topic about it on the NI forums...
(whoops, hadn't seen your previous post before I sent this)
The FIR recommendation may be a bit batty, but it allows for linear phase. Probably too heavy, though.
This is what I suspect the problem is with the tuning: for your IIR filter, the phase delay depends on the frequency. You can find the delay at a particular frequency and use that to tune the delay line. (Maybe look for it in the 2.5-4 kHz band?)
Your delay (in the filter) is effectively being stretched by some exponential factor. If you send in a unit impulse, its level after n passes through the loop is damping^n, so the time it takes to decay to some arbitrary non-zero reference level RT (maybe -30 dB?) follows from solving RT = damping^n for n. You might also be able to come up with a correction factor this way.
Here's an implementation of comb filtering (with saturation) that I built based on a paper from ICST, "Digital Sound Generation -- Part 2". It uses an allpass delay. It doesn't have damping, but they suggest adding a first-order lowpass and don't seem overly concerned with the tuning, so there's probably something there.
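Not the ICST implementation itself, but a minimal Python sketch of the same idea: a feedback comb whose fractional delay is realised with a first-order allpass (no saturation or damping; function and parameter names are my own):

```python
def allpass_comb(x, delay, feedback):
    """Feedback comb: y[n] = x[n] + feedback * y[n - delay].
    The integer part of `delay` is a ring buffer; the fractional
    part is approximated by a first-order allpass."""
    n = int(delay)                       # integer delay in samples (>= 1)
    frac = delay - n
    eta = (1.0 - frac) / (1.0 + frac)    # allpass coefficient
    # (eta -> 1 as frac -> 0 is the classic allpass-interpolation trouble spot)
    buf = [0.0] * n                      # integer-delay ring buffer
    idx = 0
    ap_x1 = ap_y1 = 0.0                  # allpass state
    out = []
    for s in x:
        tapped = buf[idx]                # n-sample-delayed loop output
        # first-order allpass supplies the remaining fractional delay
        ap_y = eta * (tapped - ap_y1) + ap_x1
        ap_x1, ap_y1 = tapped, ap_y
        y = s + feedback * ap_y
        buf[idx] = y                     # write back into the loop
        idx = (idx + 1) % n
        out.append(y)
    return out
```

Feeding an impulse through this gives echoes at multiples of the full fractional delay, and the allpass keeps the comb peaks flat in level where linear interpolation would roll them off, which is presumably why the high frequencies ring so clearly.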
FWIW, I've also found the "Effect Design, Part 2: Delay-Line Modulation and Chorus" paper by Jon Dattorro to be pretty helpful.
I like your allpass-interpolated comb: high freqs ring loud and clear. Have you noticed artifacts when modulating the tuning?
I added an LPF to your comb filter and put it into the patch.
Just tried modulating the pitch: the linearly interpolated comb sounds much better under heavy modulation. I would like to implement the "glissable allpass interpolation" described in this paper...
Can I use an optimal sharpening technique when designing first-order compensators for comb filters, to get a good response?