Spectral Delay Gen
Mar 15 2017 | 3:53 pm
I spent the day porting my (not so nice sounding) feedback spectral delay to Gen. After getting quite close to what I wanted, but not quite there, I looked at the Gen examples that deal with this topic, "spectraldelay" and "spectraldelay_feedback". Unfortunately the provided examples show the same issues I'm facing in my own implementation.
If I send a signal through the delay with no feedback, delaying all bins equally, I should get a delayed version of the input, right? This works fine in the patch without feedback. But if I do the same thing in the implementation that includes a feedback chain, the original signal sounds somewhat blurry, and this happens even with the feedback amount set to 0. I'm having difficulty wrapping my head around this.
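To make sure I'm not misunderstanding the basic structure, here's a rough Python sketch of what I think a single bin's delay-plus-feedback line should do (this is just my mental model, not the actual Gen patch; `spectral_bin_delay` and its arguments are names I made up). My expectation is that with feedback at 0 the feedback write is a no-op and the output is a pure delay of the input:

```python
from collections import deque

def spectral_bin_delay(frames, delay_frames, feedback):
    """Delay one bin's value by a whole number of spectral frames,
    mixing feedback back into the delay line. With feedback == 0
    this should reduce to a pure delay."""
    # circular buffer holding this bin's past values
    buf = deque([0.0] * delay_frames, maxlen=delay_frames)
    out = []
    for x in frames:
        delayed = buf[0]                     # value written delay_frames ago
        out.append(delayed)
        buf.append(x + feedback * delayed)   # write input + feedback into the line
    return out

# feedback = 0: output is just the input shifted by two frames
print(spectral_bin_delay([1, 2, 3, 4, 5], 2, 0.0))  # [0.0, 0.0, 1.0, 2.0, 3.0]
```

If the Gen feedback patch deviates from this (say, an extra frame of delay or an extra resynthesis pass in the loop even at 0 feedback), that might explain the blur, but I can't see where.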
I also noticed that the two example patches calculate the phase differences in two different ways. I might just be having a bad day, but from my understanding these produce different results. The way it's done in the no-feedback example, subtracting the current phase from the previous one, is the way I'm familiar with. What's the reason for the two different calculations?
I have a feeling there's a fundamental flaw in my understanding of how a spectral delay feedback chain works.
thank you! d