Dynamic Delay/Gate?
Hi.
I'm having a problem which I'm sure has an easy solution, but it's completely eluding me.
I am receiving input from a switch (connected via an Arduino etc... this aspect works well and is definitely not at fault). This feeds a counter, which drives an animation in Jitter (basically a nice graphical version of the count output).
I need to design some sort of process whereby, when I get several inputs within a short space of time, it spaces them out so they are 2000ms apart. An example of what I am aiming for: 4 bangs go into the counter within a period of 1500ms (the choice of 1500ms is not particularly significant, other than that it exposes the flaws of my system). The first bang would be output as '1' immediately, the second ('2') after 2000ms, the third ('3') after 4000ms, the fourth ('4') after 6000ms, and so on.
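Just to pin the timing rule down, here is the logic I'm after, sketched in Python rather than Max (the function name and the example times are only for illustration): each incoming event is passed on either immediately or 2000ms after the previously released one, whichever is later.

SPACING_MS = 2000  # minimum gap between successive outputs

def schedule(arrival_times_ms):
    """Return the time at which each incoming event should be let through."""
    release_times = []
    last = float("-inf")
    for t in arrival_times_ms:
        # Release the event immediately, or SPACING_MS after the previous
        # release, whichever comes later.
        last = max(t, last + SPACING_MS)
        release_times.append(last)
    return release_times

# Four bangs arriving within 1500ms come out 2000ms apart:
print(schedule([0, 500, 1000, 1500]))  # [0, 2000, 4000, 6000]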
I've tried a number of solutions, and they turn out very complicated (in a badly written sort of way) and don't work - or only work when certain conditions are met (e.g. if I have 3 inputs within the time frame it works, but if I have 2 or 4 it doesn't).
This must be quite a common problem, so I was wondering if anybody had encountered anything like this before?
Thanks!
Jamie
How's this?:
Alex
That's really excellent.
There's one thing, though: the output I'm looking for is numerical - i.e. it is being triggered by the output of a counter, so it would follow the principle of what you have there, but output numbers. I thought an int object would work, but it updates before all the bangs have been output... Do you have any idea how this would work for delaying a numerical input?
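Roughly, what I imagine is needed (again just a Python sketch of the logic, not a Max patch, and the names are made up) is for each number to carry its own delay, so that a later input can't overwrite it the way a single shared int would:

import threading
import time

SPACING = 2.0                    # seconds between successive outputs
_last_release = float("-inf")    # when the most recent output was scheduled

def push_number(n):
    """Output n no sooner than SPACING seconds after the previous output."""
    global _last_release
    now = time.monotonic()
    _last_release = max(now, _last_release + SPACING)
    # The value travels with its own timer, so later inputs can't
    # overwrite it before it is printed.
    threading.Timer(_last_release - now, print, args=(n,)).start()

# Counter values 1-4 arriving 400ms apart come out 2 seconds apart:
for n in range(1, 5):
    push_number(n)
    time.sleep(0.4)

time.sleep(8)  # keep the script alive until the pending timers fire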
Can't you use this on the bangs going into the counter? Or will that not be satisfactory in this instance?
A.
Oh well - whatever, it's not too hard:
A.
Thanks! That's perfect. It's an elegant solution. I can understand how it works but I would never have come up with that.
Thank you for your help.