jit.catch framesize to BPM sync?
Hi Guys,
Is it possible to match the framesize of jit.catch to the BPM of the current patch?
I'd like to use it for beat-synced monitoring.
(Let's say 1 bar of the loop is shown in the jit.pwindow.)
Another question... funny, there is no mention of this anywhere.
Is it possible to reverse the "catch" direction?
Apart from mode 0, jit.catch shows the signal from left to right. Is it possible to reverse this?
Yes, just drive the jit.catch bangs from the tempo object instead of the qmetro or jit.world framerate.
To invert the result of jit.catch, just use a regular jit.dimmap object, like you would to obtain a mirror effect in a normal image.
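For the framesize itself, the arithmetic is simple. Here it is as a Python sketch rather than a patch (4/4 and 44.1 kHz are assumptions here; use your own transport values):

def bar_length(bpm, beats_per_bar=4, sample_rate=44100):
    beat_ms = 60000.0 / bpm              # one beat in milliseconds
    bar_ms = beat_ms * beats_per_bar     # one bar in milliseconds
    bar_samples = round(sample_rate * bar_ms / 1000.0)
    return bar_ms, bar_samples

# At 120 BPM, one 4/4 bar is 2000 ms = 88200 samples, so you would send
# "framesize 88200" to jit.catch and bang it every 2000 ms; halve or
# double both numbers for 2-bar or half-bar views.
print(bar_length(120.0))   # (2000.0, 88200)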
Hi there,
Thank you very much for the help.
Yes, these are helpful tips. The problem with the dimmap is that it literally reverses your waveform. What I'm trying to achieve is to show my waveform in realtime, tempo-synced, but drawn from the opposite direction :D
Hi again.
I don't understand the distinction you're making between "reversing the waveform" and "opposite direction". If you reverse the waveform horizontally, the apparent movement changes its heading...
I also don't understand what you mean by showing the waveform in "realtime" and "tempo synced". It's either realtime or tempo-synced. Do you mean updating the waveform only at certain musical beats?
Hi Pedro,
Yeah, I know, sometimes, I'm a confusing ba*tard :D
I have a commercial product called Scopium, which is a real-time waveform display for Ableton Live where the display is tempo-synced, so you can see a 1-bar loop, a 2-bar loop, etc. in the waveform window.
I'd like to update that with Jitter and GL, because right now I'm using a lot of waveform objects with hide/show scripts there.
So, with jit.catch, if you check the waveform, it looks like it starts drawing from the left; what I'm trying to achieve is for it to draw from the right. Nothing special about this, it just looks more "normal" to my eyes.
About the tempo sync: yes, it should show a loop 1 or 2 bars long.
I know, maybe I could get better results with jit.buffer and jit.poke, yet I have zero experience with them...
(I mean with how to record into a jit.buffer, then update the gl.graph / gl.mesh window every X time...)
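If I understand it right, the idea is just a circular buffer. Sketching what I mean in Python (the names and the reverse flag are made up, and BAR_SAMPLES reuses the arithmetic above):

import numpy as np

SR = 44100
BPM = 120.0
BAR_SAMPLES = int(SR * 4 * 60.0 / BPM)         # one 4/4 bar at 120 BPM

buf = np.zeros(BAR_SAMPLES, dtype=np.float32)  # stands in for the jit.matrix
write_pos = 0

def record_block(block, reverse=False):
    # Conceptually what jit.poke does: write each incoming sample at a
    # wrapping index. reverse=True decrements the index instead, which
    # would give the right-to-left drawing described above.
    global write_pos
    for s in block:
        buf[write_pos] = s
        step = -1 if reverse else 1
        write_pos = (write_pos + step) % BAR_SAMPLES

def snapshot():
    # Every X ms (the metro), copy the matrix and hand it to the
    # display - jit.gl.graph / jit.gl.mesh in the real patch.
    return buf.copy()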
One other thing, which I just realised today:
What is the reason that jit.catch seems "inaccurate" when turning the audio into Jitter data?
Am I missing something?
Here is a screenshot of a drumloop:

And here is the one from jit.catch:

Where are the hihats?
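A guess at my own question: if the samples get squeezed into far fewer matrix cells or pixels by simply keeping every Nth sample, a short hihat transient can fall between the kept samples entirely. A toy Python test of that idea (the 30-sample "hihat" is made up):

import numpy as np

sr, n_pixels = 44100, 500
audio = np.zeros(sr)                                    # 1 s of silence...
audio[22050:22080] = np.random.uniform(-0.9, 0.9, 30)   # ...plus a 30-sample "hihat"

step = len(audio) // n_pixels                           # 88 samples per pixel

naive = audio[::step][:n_pixels]                        # plain decimation
peaks = [np.abs(audio[i*step:(i+1)*step]).max() for i in range(n_pixels)]

print(np.abs(naive).max())   # 0.0 here - the burst falls between kept samples
print(max(peaks))            # ~0.9 - per-pixel peak picking keeps it

So a min/max (peak) reduction per pixel column, instead of plain decimation, should keep the hihats visible.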