Tapout~ losing high frequencies?
If I define the delay time of tapout~ with a constant signal value, the output loses a bit of high frequency info.
If I just use a float, the signal passes through unmolested. What is going on? I've attached a patch that shows this clearly.
crazy example, love this.
it has something to do with the input material, but partly also with the update of spectroscope~...
try 50.1 ms!
also interesting: see what happens when you shorten the envelope to 30 ms at a delay of 50 ms. :)
It's definitely not just a quirk of spectroscope~: I get the same result with Ableton's EQ8, and it's clearly audible in devices that use a signal to control tapout~; you can't feed high frequencies back for very long. I noticed this also happens with comb~ and a simple gen~ delay. They all lose high-frequency info.
This is due to the (not optimal) interpolation schemes used to achieve subsample delay times, which act as a low-pass filter. IIRC, when you specify the delay time as a float, it is truncated internally to an integer number of samples, so no interpolation is done and hence no high-frequency roll-off appears.
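To see the roll-off in numbers, here's a numpy sketch (not Max code) of the math: a fractional delay done with linear interpolation is a weighted average of two adjacent samples, which is a one-zero lowpass. The 50.5-sample delay and the fs/4 test tone are arbitrary choices for illustration.

```python
import numpy as np

fs = 44100
f = 11025                      # fs/4, a high test frequency
n = np.arange(fs)
x = np.sin(2 * np.pi * f / fs * n)

# Fractional delay of 50.5 samples via linear interpolation:
# y[n] = 0.5 * x[n-50] + 0.5 * x[n-51]
frac = 0.5
y = (1 - frac) * np.roll(x, 50) + frac * np.roll(x, 51)

# Compare RMS levels: at a half-sample offset the interpolator
# behaves like a lowpass with gain cos(pi * f / fs).
gain = np.sqrt(np.mean(y**2)) / np.sqrt(np.mean(x**2))
predicted = np.cos(np.pi * f / fs)
print(gain, predicted)  # both ~0.707: about -3 dB at fs/4
```

In a feedback loop this loss compounds on every pass, which is why the highs die off so quickly.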
Thanks, I worked it out by playing with the @interp attribute of the delay operator in gen~. You can set it to "none" and keep the high frequencies, but you get artifacts when you change the delay time; or use "cubic" or another method and you lose high frequencies, but the delay time changes smoothly.
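And the other side of the trade-off, same numpy sketch: truncating the delay to whole samples (the "none" case) is a pure sample shift, so nothing is filtered out, but changing the delay time then repeats or skips samples.

```python
import numpy as np

fs = 44100
f = 11025                      # fs/4 test tone, as before
x = np.sin(2 * np.pi * f / fs * np.arange(fs))

# The 50.5-sample request is truncated to 50 whole samples:
# an integer shift, bit-exact, no high-frequency loss.
y_none = np.roll(x, 50)
gain_none = np.sqrt(np.mean(y_none**2)) / np.sqrt(np.mean(x**2))
print(gain_none)  # 1.0: nothing is filtered out

# The cost: when the truncated delay time changes, samples get
# repeated or skipped, which is audible as clicks / zipper noise.
```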
Of course Volker is right. Signals should be used only when modulation at signal rate is required, and in those cases turning interpolation off would not make much sense.