jit.qt.record non-realtime issue
Hi,
I'm trying to record a very time-sensitive project using jit.lcd as the main object going to jit.qt.record (I created a piece that is exactly 676400 ms, or approximately 11 min. 16 sec., long). I would love to apply something similar to the framedump method explained in Jitter Tutorial 19, but that message applies to jit.qt.movie, not jit.lcd.
I've included the patch if you'd like to take a look at it as it currently stands. When I record using my current method, the movie ends up about a minute and a half short. Any ideas on how I might get a high-quality recording without dropped frames?
Many thanks.
I haven't looked at your patch (yet), but let's assume that you're creating a movie at 25 fps. That's 40 ms per frame (1000 ms / 25 frames), so 676400 ms / 40 ms = 16910 frames. This means that when you send a "write 25." message to jit.qt.record, you need to send exactly 16910 frames from jit.lcd to jit.qt.record. How you do this is up to you.
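As a rough sketch of that arithmetic in Max's [js] object (the helper and its inlet layout are hypothetical, not part of any patch in this thread):

// framecount.js -- hypothetical helper that does the arithmetic above:
// frames = duration_ms / (1000 / fps)
inlets = 2;   // left inlet: bang to compute, or a new duration (ms); right inlet: fps
outlets = 1;  // outputs the total frame count to send to jit.qt.record

var duration_ms = 676400.; // length of the piece, from the original post
var fps = 25.;             // the example rate above; the poster is actually recording at 24

function msg_float(v)
{
    if (inlet == 0)
        duration_ms = v;   // new duration in ms
    else
        fps = v;           // new frame rate
}

function msg_int(v)
{
    msg_float(v);
}

function bang()
{
    var ms_per_frame = 1000. / fps;                       // 40 ms per frame at 25 fps
    var frames = Math.round(duration_ms / ms_per_frame);  // 676400 / 40 = 16910
    outlet(0, frames);
}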
jb
jb: Thanks for your quick response. I'm recording at 24 fps, with the metro at 41.67 ms. You might want to take a look at the patch (which reminds me that I forgot to include the colls; see the new attachment). The jit.lcd object receives its qmetro bangs at a ramped interval, but then I'm gathering its output at a constant rate from a jit.matrix @thru 0. Maybe I'm doing something wrong with this setup, though.
Thanks.
Would it be best to redesign the patch entirely and get rid of my ramps and clocker? I'm not such a fan of starting over from the ground up, but I can't think of another way to do this. The processing starts to slow the frame rate coming through the jit.matrix once the qmetro interval ramps down to 10 ms.
Any suggestions?
Check out Randy Jones' render_node.
Cheers
Gary Lee Nelson
Oberlin College
www.timara.oberlin.edu/GaryLeeNelson
Thanks, Gary. I'm looking it over, and it seems way over my head, but conceptually it makes enough sense. It might take a while for me to figure out how to integrate my patch into this, but I'll give it a good try!
Let me see if I understand render_node. You pump your Jitter output into a texture that is sent to a window, and that window is the source that gets captured and recorded into a .mov file (and synced with audio if you want)? I'm not as familiar with js, so I'm a little lost looking under the hood. It seems to me that rendering is a particularly important feature of this recording process.
What I'm wondering about is whether my jit.lcd-centric video is meant for this type of recording situation (rendering), or whether this is geared specifically toward dealing with recording OpenGL.
I might have a lot of fundamental misunderstandings about how this particular recording method works. Any explanation or guidance would be very helpful.
I made Render_node mainly for recording OpenGL, because there was no easy way to do that. You could use it to record your jit.lcd patch by sending jit.lcd's output to a texture, or, now that I think about it, sending it right to the window should work too.
Render_node takes care of the timing issue you are having in a very general way by using the non-realtime DSP driver to implement offline recording. It should work with your patch.
Or, if it's too daunting, just use a non-realtime rendering loop instead. Replace your clocker with an accum or something, and advance your time on your own after each frame. It looks like you just have a couple of line objects, so you can replace these with exprs or something to calculate the value you want your ramp to have at time t.
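A small sketch of that kind of offline loop, written for Max's [js] object; the frame rate, ramp range, and duration below are placeholders drawn from earlier in the thread, not values from the actual patch:

// offline_ramp.js -- hypothetical non-realtime loop: a frame counter stands in
// for clocker/accum, and the ramp value is computed directly at time t.
inlets = 1;   // bang once per rendered frame
outlets = 2;  // left: ramp value at time t, right: current time t (ms)

var fps = 24;          // the poster's frame rate
var frame = 0;         // takes the place of clocker/accum; advanced by hand
var ramp_start = 0.;   // ramp value at t = 0 (placeholder)
var ramp_end = 1.;     // ramp value at the end of the piece (placeholder)
var duration = 676400; // total length in ms, from the original post

function bang()
{
    var t = frame * 1000. / fps;   // advance time yourself after each frame
    // what a [line] would have output at time t, computed directly (expr-style)
    var v = ramp_start + (ramp_end - ramp_start) * Math.min(t / duration, 1.);
    outlet(1, t);
    outlet(0, v);
    frame++;
}

function reset()
{
    frame = 0;
}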
best,
Randy
Randy,
Thanks for your reply. I was literally just seconds away from sending you an email asking you the questions you just answered. I appreciate the response. I'll look into replacing the lines with expr and the clocker with accum. Those were my main hangups.
If I were to plug my patch into the rn window, would it be best to drop it into the test_pattern patch or into the render_node patch? I guess it doesn't matter, as long as it goes to the right place.
Thanks again. I'll post what method ended up working best.
You want to open your own patch instead of test_pattern. Then change your patch to draw to the rn window. That should do it. If you also change your master clocker guy to use the grn_worldtime variable, you should be able to do cool stuff like scrub around in your patch using render_node's GUI.
best,
Randy
In the meantime, I ended up using Evan Raskob's recording loop, replacing clocker with a counter and the line objects with scale objects. It's probably a little convoluted (and I'm sure there's a simpler way), but it gave me the results I hoped for. Randy, I'll also try render_node soon and let you know how it goes.
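A rough sketch, in Max [js] terms, of the counter-plus-scale arithmetic described above; the ranges and frame count are placeholders, since the actual patch isn't reproduced here:

// counter_scale.js -- hypothetical stand-in for one [counter] -> [scale] pair.
// A [scale in_lo in_hi out_lo out_hi] object performs the same linear mapping.
inlets = 1;   // frame count from [counter]
outlets = 1;  // mapped parameter value

var in_lo = 0;       // counter range (placeholder)
var in_hi = 16234;   // approx. 676400 ms at 24 fps
var out_lo = 0.;     // parameter range the old [line] ramped over (placeholder)
var out_hi = 127.;

function msg_int(count)
{
    // linear map from the counter range to the parameter range
    var v = out_lo + (count - in_lo) * (out_hi - out_lo) / (in_hi - in_lo);
    outlet(0, v);
}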
Here's the patch, and it works like a charm (the colls are attached earlier in the thread; also, do people prefer to have the patch pasted in the text or attached?):