How to smooth out the output when delay time of tap changes

    Apr 22 2015 | 4:08 pm
    Hey guys,
    I'm new to Gen and I'm building a simple delay. I've attached a screenshot of my gen patch. Basically, out1 sends the unmodified signal that comes in on inlet 1, while out2 delays the signal based on the delay time (in ms) provided at inlet 2. In my test Max patch I have a slider sending values to inlet 2 of the Gen patch. If I move the slider to change the delay time, I hear crackling/glitchy noise coming out of the gen patch.
    How do I go about smoothing this out? I tried interpolating the current output value with the previous one, but that's clearly not the right way to do it, as it still sounds glitchy.
    Thanks for the help, Niko
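    For reference, the behaviour described above can be sketched in plain C (the function names, the fixed 44100 Hz rate, and the one-second buffer are my assumptions, not taken from the screenshot):

```c
#include <stdio.h>

/* Minimal sketch of the patch as described: out1 passes the input
   through, out2 reads a delay line at a time given in milliseconds.
   A fixed 44100 Hz sample rate is assumed here. */

#define SR 44100
#define BUF_LEN SR              /* one second maximum delay */

static double buf[BUF_LEN];
static int write_pos = 0;

/* Process one sample; delay_ms comes from inlet 2 (the slider). */
static void process(double in1, double delay_ms,
                    double *out1, double *out2) {
    buf[write_pos] = in1;
    /* ms -> samples, i.e. what the mstosamps operator does */
    int delay_samps = (int)(delay_ms * SR / 1000.0);
    if (delay_samps >= BUF_LEN) delay_samps = BUF_LEN - 1;
    int read_pos = (write_pos - delay_samps + BUF_LEN) % BUF_LEN;
    *out1 = in1;                /* dry pass-through */
    *out2 = buf[read_pos];      /* delayed tap */
    write_pos = (write_pos + 1) % BUF_LEN;
}
```

    The clicks come from exactly this structure: when the slider moves, delay_samps jumps by many samples at once, so the read position skips discontinuously across the waveform.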

    • Apr 22 2015 | 5:52 pm
      Hi, no need to explain why this is happening, as you will already know. While I am not a gen guru, the options I see are as follows:
      1. smooth out the changing delay-time values
      2. only pass the destination value (ignore the intermediate changes)
      The first introduces pitch shifting due to the Doppler effect, and the second relies on more mathematical extraction of velocity (than I can muster), but here is my humble example of both approaches:
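      The first approach can be sketched in plain C as a one-pole smoother feeding a fractional delay read with linear interpolation (the coefficient value and names are illustrative, not from the posted patch):

```c
#include <math.h>

#define BUF_LEN 44100           /* one second at an assumed 44100 Hz */

static double buf[BUF_LEN];
static int write_pos = 0;

/* One-pole smoother: each sample, move a fraction of the way toward
   the target delay time. A small coeff gives a slow glide, which is
   heard as the tape-style pitch bend (the Doppler effect above). */
static double smooth(double current, double target, double coeff) {
    return current + coeff * (target - current);
}

/* Fractional delay read with linear interpolation between adjacent
   samples, so the gliding (non-integer) delay time reads smoothly. */
static double delay_read(double delay_samps) {
    int whole = (int)delay_samps;
    double frac = delay_samps - whole;
    int r0 = (write_pos - whole + BUF_LEN) % BUF_LEN;
    int r1 = (r0 - 1 + BUF_LEN) % BUF_LEN;
    return (1.0 - frac) * buf[r0] + frac * buf[r1];
}
```

      Per sample you would write the input into buf, update the time with something like smoothed = smooth(smoothed, target, 0.0005), read delay_read(smoothed), then advance write_pos. In gen~ the same idea is a [mix] or [slide]-style smoother driving [delay] with interpolation enabled.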
    • Apr 23 2015 | 6:35 am
      Hi Brendan,
      Thanks for providing me with the sample patch. It's much appreciated.
      I think I will stick with the first approach, as I'm trying to design a tape-delay-type effect. Also, mstosamps is a useful operator; I was doing the conversion by hand.
      Quick question: I want to generate C code from the patch. Say I'm designing a delay with a maximum of 1 second of delay, settable by the user, but I don't know the sample rate until the code runs. Is it possible to specify the delay time dynamically, as opposed to hardcoding 44100 or 48000 inside the gen patch?
      Cheers Niko
    • Apr 23 2015 | 3:18 pm
      In gen~ there is a macro called SAMPLERATE.
      Regarding your original question, I think maybe what you want is a quick amplitude crossfade between the old delayed signal (with the old delay time) and the new delayed signal (with the new delay time). You'd have two delayed signals, and fade the old one out as you fade the new one in. Here's a non-gen~ version of what I mean.
    • Apr 25 2015 | 4:13 am
      Hi Christopher,
      I didn't realise you could just type SAMPLERATE as one of the arguments. That worked perfectly.
      And thanks for the link! It has a lot of interesting patches. I'll look into it.
      Cheers Niko
    • Feb 02 2017 | 2:45 pm
      Hey everyone,
      I'm starting to learn Max for Live and I'm in the middle of making a simple delay, and I have the same problem with the delay time. Can someone explain exactly how I can use this SAMPLERATE inside gen~?