Variable tempo sync metro

andrewnixon0's icon

I'm writing a Max For Live patch and using a metro which is tempo-synced to my Ableton set using 'metro 8n @active 1'. I'd like to vary the metro to fire on 4ths, 16ths, etc., but can't figure out how to do it. Can anyone help me figure out what I'm sure is a really simple problem?

Evan's icon
[Max patch attached: copy it and select New From Clipboard in Max.]

I'd use a phasor~

It's better to sync to audio-rate signals; they tend to be more accurate. And [metro] is kind of a pain to mess with sometimes, as you discovered; I messed around trying to get the note-value syntax to work as I expected. If you wanted to stick with [metro], you could just use a [translate notevalues ms] object.

andrewnixon0's icon

Thanks very much for the help Evan - that's great!

jonbenderr's icon
[Max patch attached: copy it and select New From Clipboard in Max.]

I use this setup for metro pretty much all the time and never have problems. I've never quite understood the usage of phasor~. I've tested pretty rigorously, and any accuracy you might gain (which I'm unable to see) doesn't seem worth the CPU a phasor~ and snapshot~ would chew up.

Does anyone have a good example of where the alleged accuracy of a phasor~ in place of metro might actually be useful? Curious if I could improve things I've already done.

Evan's icon

An audio rate signal will generally be more stable and consistent than a control rate signal. A phasor~ is probably one of the least CPU intensive audio objects, and there is no need for a snapshot~ object, just use [edge~] or [change~]. Just to test, I created about 50 phasor~ and snapshot~ combinations and my cpu was at like 7%.

If you start doing things in gen~, the usefulness of a phasor~, a value that moves from 0 to 1 in a specific amount of time, will become pretty clear.

I don't have any specific examples about accuracy improvements, it's just kind of been one of those things I've accepted as truth. Mostly because signals are running at 441 samples per ms (assuming a 44100 sampling rate), the max scheduler runs at 1 sample per ms (I recall hearing that before, someone correct me if I am wrong). Doesn't seem to be too far of a stretch to say that audio rate is more accurate.

All that being said, use what works for you, and if you're happy with it, then keep doing it.

jonbenderr's icon

Thanks for the explanation. Might have to do some more experimenting with this.

broc's icon

> ... because signals are running at 441 samples per ms (assuming a 44100 sampling rate) ...

Actually it's 44.1 samples per ms. And as I understand it, signals are processed in blocks/vectors with the default size of 64. So the accuracy is about 1.5 ms (64/44.1), not better than metro unless you choose a smaller vector size (which would increase the CPU load).
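broc's numbers can be checked with a quick back-of-the-envelope calculation. This is plain Python, not Max code, just to make the arithmetic explicit:

```python
# At a 44.1 kHz sampling rate:
samples_per_ms = 44100 / 1000          # 44.1 samples per millisecond

# With the default signal vector size of 64 samples, a control-rate
# report (e.g. from [edge~]) can only happen once per vector, so the
# worst-case timing granularity is:
vector_size = 64
resolution_ms = vector_size / samples_per_ms   # about 1.45 ms

print(f"{samples_per_ms} samples/ms, resolution {resolution_ms:.2f} ms")
```

So "about 1.5 ms" is the granularity at which a 64-sample vector boundary can fall, which is the figure debated in the rest of the thread.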

Evan's icon

Whoops, yeah, forgot the decimal.
So you're saying that a phasor~ (or other signal-rate timing mechanism) is actually less reliable than a metro?
Audio events are continuous, though; they don't just get reported at the start or end of a vector. Are you saying that because audio is processed in vectors, the functional resolution of audio-derived events drops to about 1.5 ms, and that a metro is a better option because of vector sizes?
I find it hard to swallow that a phasor~ (or other audio-rate timing solution) is less accurate than a metro, after hearing otherwise for years.

mvf's icon

Evan, as long as you stay in the audio domain that's true; it's sample-accurate. (For example, you could scale the 0.-1. output from phasor~ directly to 0.-1000. ms to feed a play~ object for sample playback.) But if you "change worlds" to control rate with edge~, you will be delayed by one vector (in Live, always 64 samples).
Or is there someone who can convince me of the opposite?
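The phasor~-to-play~ trick mentioned above is just a linear mapping of the 0.-1. ramp onto a time range, the same thing a [*~ 1000.] would do in the patch. A minimal sketch in Python (the function name and the 1000 ms loop length are my own, purely illustrative):

```python
def phase_to_ms(phase: float, loop_ms: float = 1000.0) -> float:
    """Map a phasor~-style ramp (0.0-1.0) to a playback position in ms,
    analogous to scaling phasor~'s output before feeding [play~]."""
    return phase * loop_ms

# A ramp halfway through its cycle points at the middle of the loop:
print(phase_to_ms(0.5))   # 500.0
```

Because this mapping happens per sample inside the audio chain, there is no vector-boundary quantization; the delay only appears once the signal is converted to a control-rate event.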

Evan's icon

Yes, delayed is one thing, but quantized to the edge of a vector, as broc was implying, is another. Is the resolution really 1.5 ms? That can't be right.

broc's icon

If the signal is reported at intervals of 64 samples, the resolution is 64 samples (1.5 ms).

jonbenderr's icon

Lots of good info here. I've seen the topic come up briefly in other threads, but it was nice to have things sort of spelled out like this.

Out of a bit of curiosity and even more boredom, I decided to run a test. Tried to be as "out of the box" as possible.

Created devices out of metro and phasor~. Each was set to trigger C1 notes, 120 ticks (1/16 note) in length, every 480 ticks (1/4 note). (The phasor~ object's frequency attribute was frozen to 480 ticks in the inspector.)
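For reference, those tick values convert to clock time as follows. Live uses 480 ticks per quarter note; the 120 BPM tempo here is my assumption of a default set, not something stated in the post:

```python
TICKS_PER_QUARTER = 480   # Live's tick resolution per quarter note

def ticks_to_ms(ticks: int, bpm: float) -> float:
    """Convert a duration in ticks to milliseconds at a given tempo."""
    quarter_ms = 60_000.0 / bpm   # one quarter note in ms
    return ticks * quarter_ms / TICKS_PER_QUARTER

# At 120 BPM: 480 ticks (1/4 note) is 500 ms, 120 ticks (1/16) is 125 ms.
print(ticks_to_ms(480, 120))   # 500.0
print(ticks_to_ms(120, 120))   # 125.0
```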

I recorded output of each device to a midi clip on another track for 16 bars.

First image is the very start of each recording. Top is metro and bottom is phasor~.

You can see metro is almost exactly on point. The small bit of lag is actually representative of what happens when you record ANY MIDI from another source in Live. Phasor~, on the other hand, is behind by almost exactly the amount broc calculated.

When I zoomed in on the start of the 16th bar (bottom image) I found something even more interesting. Metro was still on point. The delay in phasor~, however, had shrunk quite a bit. To me this means there is potential drift with phasor~, and that's only over the course of 30 seconds. Imagine what that drift might amount to over the course of an hour-long set.

The testing method might not be the best, but I think it proves a thing or two. There are probably tons of things you can do to improve using phasor~ as a timing source.

If you think about it on a very basic level, though, it kind of makes sense. Metro is doing nothing but outputting a bang in time. Phasor~ is trying to draw a precise ramp through every value from 0. to 1. in time. The more complicated the process, the more room there is for error. I guess the KISS principle wins.

[Attachments: start.jpg, end.jpg]
greg robinson's icon

Hi guys

Wondered if anyone can help me. I am new to Max and want to know how to sync audio. Do any of you have any examples you could share with me, please, as I am stuck?

cheers
Greg

Venetian's icon

have you checked the [plugphasor~] object?

it's "sample synchronised to Ableton Live"

greg robinson's icon

No. Do you have a patch I can look at, please?

Evan's icon

This thread isn't about syncing audio. Start a new thread, or at least ask something more specific; 'syncing audio' could mean a million different things.

BTW, I ran some tests of my own with these different timing objects and found (not surprisingly) that if you stay in the signal domain, the timing of plugphasor~ is far tighter than driving a click~ from metro. If you're triggering MIDI (or other non-signal-rate messages), keeping things out of the signal domain seems to be the right choice. I guess I'll have to go back to some of my older patches and reexamine the timing structures.