
getduration -> wrong time

July 28, 2009 | 11:45 am

Hi,

The movie I want to play has a duration of 1h 42m 9s = 6129s (according to QuickTime 7.6.2 on OS X 10.5.7).

But when I send a getduration message to jit.qt.movie it tells me 3677s (in Max 5.07).

Actually, with a rate of 1 the movie plays at the same speed as in QuickTime, but the time information (via getduration or gettime, in ms) from jit.qt.movie is wrong.

Is there a way to fix this? Am I overlooking something?

Link to the movie: http://www.podcast.ethz.ch/episodes/?id=1474 (the 326MB one). H.264 encoded with an average FPS of 2. (It's a lecture.)

A very simple patch to try it yourself:

[pasted Max patch omitted]

Best regards,
Samuel



Eli
July 28, 2009 | 6:11 pm

The ratio between 3677s and 6129s is approximately 3:5 (60%). QuickTime doesn't count time in milliseconds or frames; every movie carries its own timescale, and the common default is 600 units per second. Someone else can explain what's happening in more detail, but I think that's why the "getduration" message is eliciting a different time than you expect.
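
For example, assuming this movie uses the default timescale of 600 (an assumption; the thread never queries it), reading the raw getduration value as if it were milliseconds shrinks it by exactly 600/1000 = 60%, which matches the numbers above:

    # Sanity check (assumption: timescale = 600 units/second,
    # QuickTime's common default; this movie's actual value isn't queried here).
    true_duration_s = 6129        # 1h 42m 9s, per QuickTime Player
    timescale = 600               # assumed default timescale

    duration_units = true_duration_s * timescale  # what getduration reports
    misread_as_ms = duration_units / 1000         # treating units as milliseconds

    print(misread_as_ms)  # 3677.4 -> matches the ~3677 s from jit.qt.movie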

Eli


July 28, 2009 | 6:21 pm

The important thing to remember is that getduration doesn't return milliseconds, but QuickTime time units (as defined by the movie's timescale).

See if studying this patch helps clarify things, and search the documentation for "timescale":

[pasted Max patch omitted]

July 28, 2009 | 7:04 pm

Thank you both for your answers!

The value one gets by sending a "gettime" message to jit.qt.movie has to be divided by the timescale value (obtained by sending a "gettimescale" message to jit.qt.movie) to get a value in seconds.
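
As a minimal sketch of that conversion (not Max syntax; the actual numbers come from jit.qt.movie's gettime and gettimescale replies, and the example values assume a timescale of 600 as discussed above):

    def qt_units_to_seconds(time_units, timescale):
        """Convert a jit.qt.movie time value (QuickTime units) to seconds."""
        return time_units / timescale

    # With the numbers from this thread, assuming timescale = 600:
    print(qt_units_to_seconds(3677400, 600))  # -> 6129.0 s = 1h 42m 9s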

