smpte~ is an object for Max/MSP that generates a SMPTE time code audio signal. It is based on the open-source libltc library by Robin Gareus.
Requires Max 6 or higher, currently Mac OS X only. Edit: updated, now supports 64 bit.
That is what I need!! Thank you. Is there any documentation?
There is a help file in the zip that should tell you all you need to use the external, but if you have additional questions, feel free to ask.
I needed to set LTC time to a certain value without breaking the signal, and this tool accomplishes that. It turns out there is a great help file when I right-click on the object (sorry, I’m new to this :) ). My LTC generator box was for some reason breaking the signal and causing my test to fail. This tool saved my day, thank you very much…
Hi Erkan, thanks, happy to hear that!
Very cool & lots of promise! Does it do Drop Frame timecode as well? (Television post-production is usually 29.97DF) Or 23.976 for HD sound / video sync projects? It seems like 23.976 should be achievable by setting it to 24fps and then manually running the phasor at 0.99 – essentially achieving the pulldown. Once again: Very Cool!
Hi Sam, thanks, yup it does 29.97 DF but not 23.976. Actually I didn’t know that was a standard, but let me know if you need it for a project, it shouldn’t be too hard to add.
I’m also interested in getting SMPTE Timecode to work in Max for use with Video and TV production audio.
Here is a good primer on Timecode (not necessarily for anyone in particular, but if you don’t know how TC works, it’s a good start).
Video is shot in the camera at the drop-frame video framerate, which is slightly slower than the nominal film rate of 30 or 24 fps.
Since Timecode is an integer counting method that correlates ONE timecode value with ONE frame of video, this fractional framerate must be resolved by dropping whole timecode markers at some point.
Because the video runs at 29.97 fps while the timecode counts 30 frames per second, by the time 30 frames have incremented in the timecode, slightly more than one second of real time has passed. So, every now and then, frame numbers are dropped from the timecode count to speed up the TC, keeping it very close to actual elapsed time.
The standard framerates shot in today’s broadcast and movie production are:
24p — 24 full frame images recorded at the rate of 23.976 fps.
30p — 30 full frame images recorded at the rate of 29.97 fps.
60i — 60 fields (half-frame interlaced images) recorded at the rate of 59.94 fps.
60p — 60 full frame images recorded at the rate of 59.94 fps. This is a specialty framerate for slow motion. No consumer playback of this standard exists widely.
Field sync for 60i is not timed off of timecode as far as I know, as there are 30 unique frame points available, not 60.
True film playback on a projector can take place at a true 24 frames per second. This will never be the case for most of what we are doing in digital land.
23.976 is popular for the “film look” and most cinematic-looking styles, a lot of music videos, movies and DSLR type footage uses the slower framerate with more of a flickering look.
29.97 is popular for the smooth “video look” that gives very good resolution of motion for sports, news broadcast, etc. It is most common for broadcast TV and is the current standard playback rate for over-the-air, cable and satellite broadcast. Since 29.97 is the standard delivery platform, most 24p content can only be delivered by adapting it to fit into the 29.97 format and then resolving it at the other end. It cannot actually be delivered as it is shot.
Over the web and bluray, 23.976 content can be delivered in its native format and will appear as intended and is increasing in popularity.
I just wrapped up production of a major reality show for Discovery Channel and the entire show was shot at 23.976 fps. Even though it will probably enter the editing stage at 29.97 or at very least end up at 29.97 for delivery, it was still shot to have the more cinematic look to the framerate.
I hope that helps. I’m an audio guy not really a video guy, but that is my best understanding of how all this stuff works as it greatly affects the technicalities of how we sort out audio for video. I hope this is all accurate. It’s a very confusing topic and any search will reveal the video guys are just as baffled about their own formats as us audio guys.
Hey Scott, thanks for the info!
Looks like you just added 23.976 and 59.94 fps formats to my todo list :)
Very nice object, but I experience a problem converting ms to tc. I set autoincrease to zero and feed the object a float number. Everything works fine except that the object adds a whole hour every six minutes. This must be a bug, right?
Hi snuef, sorry for the late reply.
Thanks for getting in touch, I didn’t experience this issue so far. Could you email me with steps to reproduce (mattijs -at- arttech -dot- nl)?
Thank You ! Very Useful ! any way to add 60fps to the “to do” list ?
It’s on the todo list now. Although I’m kind of busy with other stuff I’ll let you know if it made it to the external.
Btw, if anyone is interested, there is now a Max For Live device incorporating this external to align timecode to the playhead of a clip in Ableton Live: http://showsync.info/clipsmpte/
I relayed the idea of adding frame rates to Robin, the creator of the library this object is based on. In short he says that LTC is not built for frame rates higher than 30 fps, so it looks like I was too early with my comment that it should be easy to add.
Here’s his reply:
The SMPTE 12M-1999 standard (latest revision) defines LTC only for 24,
25, 30000/1001 and 30 fps.
Furthermore, the LTC signal is binary-coded decimal (BCD):
4 bits for the frame-number units and 2 bits for the tens.
While it is theoretically possible to go up to 2^6 = 64 frames, the LTC frame
is crafted such that no valid data can result in a “syncword” showing up
accidentally in the data stream.
Long story short: the max fps that can be transmitted via LTC is 30fps.
Professional Cameras – such as the Arri-Alexa – transmit LTC at half the
speed for framerates >= 30 fps (actually the Alexa has flexible TC
generator. It can translate any shutter fps to one of 24, 25, 29.97, 30).
To make things worse, the LTC-frame boundary is not identical to the
video-frame boundary (that is what the ltc_frame_alignment() function
is for). The offset depends on the fps and is only defined for
the four available fps. Any equipment that sends LTC at other framerates
is not according to any official spec and won’t be able to inter-operate.
I’ve tested with an Arri Alexa and an Arri Alexa Plus camera. They
implement that correctly.
23.976 and 24.975 fps are _not_ using drop-frame counting and are simply
pull-up/down variants. They are counted as 24 and 25 fps respectively.
Hi Mattijs, thank you so much for this external, it could be a great tool for my patches in the coming days!!!! I have only one question: is there a way to build an external that reads an incoming audio SMPTE LTC stream and converts it into the 80-bit frame data, so I could convert it to MIDI time code? …or simply display the time onscreen? Here’s my patch for the encoding (incomplete, but with your external just a step forward)
Thanks for your support!
Basically you’re looking for the ‘other way around’ functionality, i.e. converting SMPTE audio to a time message in Max. This is certainly something I want to add to the external as soon as I have some spare time. You can always send me an email if you have a specific project lined up that you intend to use this for.
Hi Mattijs, what I need to do is simple: I have to take an SMPTE LTC audio stream and convert it to MTC. The SMPTE comes from an Alesis HD24. I cannot use its MTC function because the locate times for the songs inside it are limited to 1 hour, so if I have more than 6 or 7 songs of 5 minutes each, I exceed the allowed time… the only way is to use the audio SMPTE.

Right now I convert the SMPTE with an old M-Audio 8×8 MIDI interface, but its input is unbalanced, so I have to pass the SMPTE through the monitor mixing desk and then feed it to the MIDI interface from an aux out; from there it goes over the MIDI bus into the Max application. I’ve also used a free SMPTE reader application with an audio input from the interface, but that app doesn’t save the last setup used (jam mode is set to 5 seconds by default every time you launch it, and the MIDI ports have to be set up again too…).

The best solution would be an external to add to my Max app (it sends program changes and does some text synchronization). And your SMPTE generator could be useful for other functions as well (I have to think about it…). I’ve also downloaded your source code and tried to recompile the mxo in Xcode… but I get some errors. It’s the first time I’m using it, so I don’t have experience with it…
Cheers to you! and thanks a lot!