get_current_smpte_song_time
Hi,
I've encountered a problem with using the 'get_current_smpte_song_time' m4l function in combination with an automated tempo change in Ableton.
It seems to me that Ableton doesn't take the previous tempo(s) into account when it outputs the next SMPTE value. For instance, if the tempo increases, the time goes in reverse. Does this mean that the pointer doesn't refer to an actual SMPTE time stamp but instead is derived from the beat counter:
(60000 / tempo) * the current beat position
Am I missing something here? I really need this for synchronising my arrangement with a singer, video artist, light technician, etc.
Thank you for your comments.
Fedde
Confirmed. I suppose Ableton Live doesn't have an internal SMPTE counter, probably for the same reason that output of MIDI Time Code (MTC) is not supported.
If your arrangement is played non-stop from start to end, you could use a simple workaround: build your own SMPTE counter with [clocker] and [translate].
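For what it's worth, the idea boils down to counting real elapsed time yourself instead of asking Live. A minimal Python sketch of the same kind of counter (just to illustrate the principle, not the Max patch itself; the 25 fps frame rate is an assumption, pick whatever your collaborators expect):

import time

FPS = 25  # assumed frame rate; adjust to whatever your video/light crew uses

def format_smpte(elapsed_seconds, fps=FPS):
    # Convert accumulated real time into HH:MM:SS:FF
    total_frames = int(round(elapsed_seconds * fps))
    frames = total_frames % fps
    seconds = (total_frames // fps) % 60
    minutes = (total_frames // (fps * 60)) % 60
    hours = total_frames // (fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

start = time.monotonic()   # the moment playback starts
# ... later, whenever a time stamp is needed:
print(format_smpte(time.monotonic() - start))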
Thanks for your reply. I've thought of building something that indexes the tempo structure of the arrangement. I do find it a bit annoying to have to go back to the start every time I'm rehearsing a single part, though.
I've been running into the same problem. I've built a device that exports the time of every locator that starts with a hashtag to a JSON file for use in After Effects, only to find out that get_current_smpte_song_time() doesn't take any tempo changes into account.
So, has anybody found a workaround, or at least a theoretical concept for getting an accurate reading of the actual song time with regard to tempo changes?
Alright, I've figured out how to solve it, in theory. Warning: this is going to be a long read, but for those of you willing to go to great lengths for your project I'll try to be as detailed as possible. If you're looking for an easy fix, I'm afraid there is none. Within Max/Ableton there simply isn't a function that will let you grab the time visible in Arrangement View, as far as I can find in the official/unofficial docs, JS or otherwise.
Why / what I'm trying to accomplish
(Optional read, just for context)
I'm using Live (amongst a lot of other things) for my podcast. I've already created a device that reads out the position of every locator with a name starting with a hashtag and dumps it as a comma-separated values (CSV) file. I need 3 versions of every episode and I'm looking for a way to do this with only one export, letting my computer do all the grunt work. Since I'm using a metric ton of plugins and tracks, the export process takes up quite a lot of time and I'm trying to avoid repeating it.
01. Mix, with intro and a bit of talk. For my podcast feed.
02. Syndication mix, without intro or talk. For radio shows that just want the mix.
03. Preview mix, 4 beats of every drop. For a preview movie to announce the episode on socials.
With the locators in a CSV file and only one export, I've got ffmpeg in a separate command-line script neatly chopping that one export into 3 versions. Works like a charm until... the master tempo changes. Hence this whole ordeal.
Tools
(Optional read, just for context)
Ableton Live 10 Suite, Max for Live 8, JS in Max, JSON, CSV, XML, PHP, bash/shell scripting (macOS / POSIX), Homebrew, ffmpeg and jq.
What is and isn't possible with Max for Live
get_current_smpte_song_time 0 will give the time as HHH:MM:SS:[msec], but does not take changes in song tempo into account. In fact, it can even go backwards once you automate the tempo property. Pretty much the root of the problem.
get_current_beats_song_time will give you the bars/beats etc., but that doesn't tell you anything about tempo either, so it's also pretty useless here.
The Max device "Max Api Song.amxd" will give you a hint about the inner workings of Ableton in the last row of the first column of the "Properties1" popup. "live.property current_song_time @observe 1" wrongly shows the value's unit as "ms". In fact, this value represents the number of beats since the start of the Live set. This value is very interesting; I'll call it "BEATS" from now on.
Commandline voodoo
An Ableton Live Set file (.als) is in fact nothing more than a gzipped XML file. So, after a rename and a gunzip, you'll have a file without an extension that's just plain ol' human-readable XML. Walking down the XML tree you'll find "Ableton > LiveSet > MasterTrack > AutomationEnvelopes > Envelopes > AutomationEnvelope > EnvelopeTarget > PointeeId" with a value of 8, which currently seems to represent the tempo, but this could be variable; not sure at this point. Either way, that envelope contains the automation points, each with a Value (= BPM) and a Time, which holds that BEATS value we've seen before.
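For what it's worth, here is a minimal Python sketch of that extraction (my own illustration; the file name is made up, and the assumption that PointeeId 8 is the tempo envelope is exactly the uncertainty mentioned above):

import gzip
import xml.etree.ElementTree as ET

# Un-gzip the .als and parse the XML inside it
with gzip.open("MySet.als", "rb") as f:
    root = ET.fromstring(f.read())

# Find the automation envelope whose PointeeId is 8 (assumed to be the tempo)
for env in root.iter("AutomationEnvelope"):
    pointee = env.find("EnvelopeTarget/PointeeId")
    if pointee is not None and pointee.get("Value") == "8":
        for ev in env.iter("FloatEvent"):
            beats = float(ev.get("Time"))   # the BEATS value mentioned above
            bpm = float(ev.get("Value"))    # tempo at that point
            print(beats, bpm)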
Math magic
Knowing where in the project the tempo changes, I should be able to make some sort of map file that correlates every beat with the tempo at that time, incrementing on every previous value by calculating how long each beat takes; given that I only use linear changes, that is. I'm still figuring out the best way to do this, but I should end up with an accurate JSON file that tells me the exact start time of every beat in the song. From that point on it's just a matter of extracting the BEATS value of every locator with a hashtag from the XML, and Bob's my uncle.
Update: Some pretty weird stuff is going on. It seems the BPM value in the user interface is not telling you the actual BPM during a tempo transition. It simply shows what it theoretically should be, but the actual timing as shown in the timeline bar in Arrangement View is quite different.
For instance, when automating the tempo from 60 BPM to 120 BPM over 24 beats, it increases the tempo linearly by 2.5 BPM per beat. So far, everything is as linear as you'd expect. Yet, if you calculate every beat by taking the previous total time and adding 60/[current BPM] seconds, you end up with something that roughly follows reality, but Ableton makes smaller changes at the beginning of the curve and larger ones towards the end of the transition, as opposed to the linear way the BPM itself moves.
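To make that concrete, here's the naive per-beat calculation for that 60 to 120 BPM ramp as a quick Python sketch (my own illustration of the calculation described above, not Ableton's actual math):

# Tempo ramps linearly from 60 to 120 BPM over 24 beats (2.5 BPM per beat);
# each beat is naively assumed to last 60 / (BPM at the start of that beat) seconds.
start_bpm, end_bpm, ramp_beats = 60.0, 120.0, 24

total = 0.0
for beat in range(ramp_beats + 1):
    bpm = start_bpm + (end_bpm - start_bpm) * beat / ramp_beats
    print(f"beat {beat:2d}  bpm {bpm:6.2f}  time {total:8.3f} s")
    total += 60.0 / bpm  # add the naive duration of this beat

This roughly follows the timeline, but as said, the real values drift away from it during the transition.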
I've tested the timeline bar's values against an export of the project with super-short notes at every kick; the timeline values match what a final render gives perfectly, so the time is correct, but the displayed BPM value is not.
Still trying to figure out the underlying math, whether it's some polynomial or inverse-logarithmic stuff going on.
Last update: After digging through an endless pile of Python, using Bit Slicer to pillage Live's memory banks and checking out pretty much every file in Live's entire scope with hex editors, it pretty much ends in mission impossible to get the absolute values right, due to the faulty implementation of Live's get_current_smpte_song_time; even the Python version for MIDI remote scripting is borked.
Internally Live is using a C++ facility called chrono to keep things in check with regard to latency, buffers, the OS and many other things. The only possibility I haven't tried, mainly because I suck at C++, is using Ableton's own Link libraries on GitHub, as chrono is used there as well.
But I've found a solution. Instead of using locators, I'm now using a MIDI channel with an Operator that outputs an extremely short (1 ms) sound on all four oscillators at +6 dB, with pretty much every volume option on full blast, so as to be sure the audio clips. The audio is routed to an unused output of my soundcard, so I won't go deaf during production. This has the advantage of not having the locators' lines all over the project, and it's a whole lot easier to copy and paste. Instead of rendering only the master, I render the master and this channel in one go. After this, it's just a matter of using Apple's afclip CLI tool to get a reading of the exact time to 6 decimal places. Soooo... Bob's sort of my uncle after all.
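If you'd rather not depend on afclip's exact output, a rough Python equivalent of the same trick is to scan the rendered click channel yourself for samples at (or very near) full scale. A sketch, assuming a 16-bit WAV and a made-up file name:

import wave

THRESHOLD = 32700   # close to full scale for 16-bit audio; tweak as needed

with wave.open("clicks.wav", "rb") as w:
    assert w.getsampwidth() == 2, "sketch assumes 16-bit samples"
    rate, channels = w.getframerate(), w.getnchannels()
    data = w.readframes(w.getnframes())

samples = memoryview(data).cast("h")   # interleaved 16-bit samples
last_hit = -rate                       # debounce: at most one hit per half second
for i in range(0, len(samples), channels):   # look at the first channel only
    frame = i // channels
    if abs(samples[i]) >= THRESHOLD and frame - last_hit > rate // 2:
        last_hit = frame
        print(f"{frame / rate:.6f}")   # marker time in seconds, 6 decimal places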
Continuing with this, if you get the XML like this:
<AutomationEnvelope Id="1">
<EnvelopeTarget>
<PointeeId Value="8" />
</EnvelopeTarget>
<Automation>
<Events>
<FloatEvent Id="82134" Time="-63072000" Value="126" />
<FloatEvent Id="82135" Time="0" Value="126" />
<FloatEvent Id="82136" Time="4" Value="169" />
<FloatEvent Id="82137" Time="6.5" Value="110" />
<FloatEvent Id="82138" Time="10.1" Value="148" />
<FloatEvent Id="82139" Time="12" Value="95.9" />
<FloatEvent Id="82140" Time="14.72" Value="149" />
<FloatEvent Id="82141" Time="16" Value="178" />
...
</Events>
</Automation>
</AutomationEnvelope>
And get the Time and Value attributes and put them all together in arrays like this (ignoring the first event, with its negative Time, for obvious reasons):
X=[0,4,6.5,10.1,12,14.72,16]
Y=[126,169,110,148,95.9,149,178]
You only need to invert the Y values, interpolate the points and integrate to get the time series in minutes. It can be done easily with Python:
import datetime

from scipy.interpolate import InterpolatedUnivariateSpline
import numpy as np

X = [0, 4, 6.5, 10.1, 12, 14.72, 16]
Y = [126, 169, 110, 148, 95.9, 149, 178]
X = np.array(X)
Y = np.array(Y)

# Y comes in beats per minute (BPM). Invert it to get minutes per beat.
Y = 1 / Y

# Here we interpolate the points linearly (k=1)
f = InterpolatedUnivariateSpline(X, Y, k=1)

for a in range(0, 16, 1):
    # Integrating minutes-per-beat from beat 0 to beat a gives the elapsed minutes
    minutes = f.integral(0, a)
    seconds = minutes * 60
    time = str(datetime.timedelta(seconds=seconds))
    print("Beat ", a, " => seconds ", seconds, " => time ", time)
And this is the output:
Beat 0 => seconds 0.0 => time 0:00:00
Beat 1 => seconds 0.4610453648915187 => time 0:00:00.461045
Beat 2 => seconds 0.8918005071851225 => time 0:00:00.891801
Beat 3 => seconds 1.2922654268808114 => time 0:00:01.292265
Beat 4 => seconds 1.6624401239785853 => time 0:00:01.662440
Beat 5 => seconds 2.055554701708548 => time 0:00:02.055555
Beat 6 => seconds 2.524839263300801 => time 0:00:02.524839
Beat 7 => seconds 3.0559097434097433 => time 0:00:03.055910
Beat 8 => seconds 3.56246174996175 => time 0:00:03.562462
Beat 9 => seconds 4.030111217611218 => time 0:00:04.030111
Beat 10 => seconds 4.458858146358145 => time 0:00:04.458858
Beat 11 => seconds 4.911405305316177 => time 0:00:04.911405
Beat 12 => seconds 5.479097469243455 => time 0:00:05.479097
Beat 13 => seconds 6.063762580082774 => time 0:00:06.063763
Beat 14 => seconds 6.56645447151627 => time 0:00:06.566454
Beat 15 => seconds 6.988377312687413 => time 0:00:06.988377
The problem is that for very long pieces of music (more than 30 minutes), a small error may accumulate in the time. I guess this happens because Live does some rounding in its BPM calculations.
After having done some tests, I realise that it is impossible to get perfect precision with this method, because from what I have seen, the tempo changes in Ableton are not continuous.
The best solution is to put a sampler with a click on every beat, or every half or quarter beat depending on the precision you need, and export the audio. With this, you can synchronise anything without problems.
Just to make sure this is registered correctly, here is the open feature request that I think we have.
When the tempo changes halfway through the set, Live shows the accumulated song time in its timeline, but get_current_smpte_song_time doesn't match this.
What we need is a new API entry, say get_accumulated_song_time, which returns the number of real-life seconds it takes to play the set from the start until the current playback position, including all tempo changes, as indicated by the timeline in Live.
This is different from get_current_smpte_song_time, which simply returns the current time in SMPTE format, calculated as current_number_of_beats * 60 / current_tempo.
Hope this helps.
I had the exact same problem: I wanted to calculate the accumulated time at any given point. But I believe I found the solution.
With some reverse engineering it seems that Ableton does not linearly interpolate the BPM automation, but does it in steps of 16th notes (1/4 of a beat).
See my post in this forum for details.
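Roughly, the idea looks like this in Python (my own sketch, reusing the X/Y automation points from the earlier post; whether Live samples the ramp at the start, middle or end of each step I don't know):

import numpy as np

# Same automation points as in the earlier post: beats and BPM
X = np.array([0, 4, 6.5, 10.1, 12, 14.72, 16], dtype=float)
Y = np.array([126, 169, 110, 148, 95.9, 149, 178], dtype=float)

STEP = 0.25  # tempo is assumed to update in 16th-note (1/4 beat) steps

def seconds_at_beat(beat, step=STEP):
    # Hold the tempo constant within each step, sampled at the step's start
    edges = np.arange(0.0, beat + 1e-9, step)
    bpm = np.interp(edges[:-1], X, Y)
    return float(np.sum(step * 60.0 / bpm))

for b in range(17):
    print(f"Beat {b:2d} => {seconds_at_beat(b):.6f} s")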
The problem in Ableton is that it considers the current tempo to be constant and global. This means that wherever you call get_current_smpte_song_time, it checks the BPM (tempo) at the current pointer in the sequencer and derives the SMPTE from there, as if it were a fixed value for all positions in the timeline. This is of course technically wrong, and in the case of tempo automation it will yield the wrong results.
For this reason I have built an M4L tool that can calculate the correct SMPTE regardless of tempo changes, to render cue lists of video frames at a given FPS for visual collaborators. However, it requires inserting markers with "@bpm nnn" at the tempo changes, and gradual tempo changes (linear or curved) are not supported, as they're next to impossible to read and calculate reliably through the Live API; only direct changes are supported. This has worked for me for the last 5+ years or so.
I can share further details about this device and parts of the code, but the device itself is very much designed around my own use cases. Besides, it was done years ago and is pretty messy; today I'd find a more suitable and cleaner approach, I'd wager.
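That said, for the simple case of direct (step) tempo changes, the core calculation is short enough to sketch here in Python (a hypothetical example, not the actual device code; the tempo map values are made up and would in practice come from the @bpm markers):

# Step tempo changes only: (beat, bpm) pairs, each tempo holds until the next change
tempo_map = [(0.0, 120.0), (64.0, 90.0), (128.0, 140.0)]

def seconds_at(beat, tempo_map=tempo_map):
    # Sum the constant-tempo segments up to the requested beat
    total, (prev_beat, prev_bpm) = 0.0, tempo_map[0]
    for change_beat, bpm in tempo_map[1:]:
        if beat <= change_beat:
            break
        total += (change_beat - prev_beat) * 60.0 / prev_bpm
        prev_beat, prev_bpm = change_beat, bpm
    return total + (beat - prev_beat) * 60.0 / prev_bpm

def smpte(beat, fps=25):
    # Format the accumulated seconds as HH:MM:SS:FF at the given frame rate
    frames = int(round(seconds_at(beat) * fps))
    h, rem = divmod(frames, fps * 3600)
    m, rem = divmod(rem, fps * 60)
    s, f = divmod(rem, fps)
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

print(smpte(200))   # e.g. the SMPTE time of a marker sitting at beat 200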