I am using Randy Jones's render_node patch to render some videos. It
switches Max to non-realtime mode so it can render every frame. The problem
is that Max/Jitter takes far longer to render things than it does in
realtime. I have a Jitter patch running at about 60% CPU usage in realtime,
but when I start recording it, it records at about 1 fps. The example patch
that comes with render_node shows the same behavior.
The weird thing is that Max itself shows 100% CPU usage, while my Task
Manager shows it's doing almost nothing. The Max interface is jammed as if
it were at 100%, but I can keep using other programs without problems. My
computer even turns on its screensaver after a while, and it doesn't
interfere with anything.
I stripped the patch down until it was basically just rendercontext ->
to_texture -> jit.qt.record, bypassing the frame accumulations and plugins,
but no luck. The movie size and codec don't make much of a difference
either: 320x240 raw versus 800x600 JPEG both render at about 1 fps.
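For reference, the stripped-down test chain is roughly this (from memory;
the qmetro interval, context name and matrix dims are placeholders, and
@realtime 0 on jit.qt.record is what makes it write every incoming frame
instead of dropping them):

    [qmetro 2]
       |
    [jit.gl.render mycontext]       <- render_node drives this context
       |  (to_texture, then texture readback into a matrix)
    [jit.matrix 4 char 320 240]
       |
    [jit.qt.record 320 240 @realtime 0 @codec raw]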
I also tried changing the DSP settings. As far as I know, Randy doesn't
know what's up either, and I gather it's the same on Mac as it is on my
Windows system. Can someone tell me something relevant about this
non-realtime mode? Am I missing something? Does anyone have similar
experience with non-realtime rendering?
I recently had to wait three hours to render a 4.5-minute movie :-/
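(That works out, assuming 25 fps: a 4.5-minute movie is about 6750 frames,
and 6750 frames in 3 hours is roughly 0.6 fps, which matches the ~1 fps I'm
seeing.)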