I'm tweaking some of the example patches that ship with Jitter, and working on one that uses jit.gl.mesh (a sort of oscilloscope whose graphics respond to audio).
But I cannot figure out how to record the video output…
So I'd be grateful for any useful hints.
Hi, there are a lot of methods for recording the output of a jit.gl.render object (that's what you want to do, right?). For example:
Look at jit.gl.asyncread.
If you really want to record the output of the mesh object itself, try @matrixoutput 1
together with jit.matrixset, or jit.qt.record (would that work?).
jit.gl.mesh does not support matrixoutput.
jit.gl.asyncread is what you want.
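For reference, the patch chain suggested above looks roughly like this. This is a sketch of the signal flow, not a pasteable patch; the context name `myctx` and the metro interval are placeholders, and exact arguments/attributes may vary by Jitter version:

```
qmetro 33                    <- drive rendering at roughly 30 fps
  |
t b erase                    <- erase the context, then bang the renderer
  |
jit.gl.render myctx          <- renders the GL scene (jit.gl.mesh draws into "myctx")
  |
jit.gl.asyncread myctx       <- reads the rendered frame back into a jit_matrix
  |
jit.qt.record 320 240        <- writes incoming matrices to a movie file
                                (send "write" to start, "stop" to finish)
```

The key point is that jit.gl.asyncread sits on the same named drawing context as jit.gl.render and outputs an ordinary matrix, which jit.qt.record can then write to disk.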
Another option is screen-capture software like iShowU or Fraps.
Hello! Thanks to both woyteg and Robert Ramirez for your answers.
Following your suggestions, I managed to get the generated video into jit.qt.record, BUT:
the resulting clip is longer than expected.
What I'm doing: I feed the patch an audio clip, which generates the video (my aim is to make a "videoclip" for an audio track), and I tweak the Jitter patch in real time while recording the video output.
But, as I said, the recorded video ends up longer than my audio track… wtf???
So it's impossible for me to lay the videoclip over the audio track; the lengths don't match.
I tried the jit.vcr object, but it uses too much CPU, and real-time operation isn't possible (stuttering and slowness).
C74 RSS Feed | © Copyright Cycling '74