I just ran into something I'd never noticed before, and I'm now annoyed and confused.
I’m using Live and Max6.
I'm routing audio between applications with my blue Fireface UC.
No problem with that, TotalMix works like a charm.
The sound is mixed in Live and goes out directly through the Fireface. It is also routed into Max6, where it is analyzed for visual generation purposes.
I'm also sending MIDI notes from Live to Max6.
I noticed a timing problem. A big one, and it seems I never had it before:
MIDI notes arrive with a large delay.
A kick is played. The sound is analyzed perfectly in Max6, with no apparent delay. At least none noticeable…
But the MIDI note that triggered that sound in Live is received by Max6 about 100 ms later…
So I checked the audio status in Max6 and changed the signal vector size from 512 down to 32, to match Live. After that, no noticeable delay.
It can work fine like that.
But I'm not sure about running two different buffer sizes in Live and Max, considering I'm doing the audio analysis there. And above all: why would this audio parameter change the MIDI timing/latency…? (Overdrive on or off, Scheduler in Audio Interrupt, etc.)
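One way to reason about it: when Max's scheduler runs in the audio interrupt, scheduled events (including incoming MIDI) are only serviced once per signal vector, so the vector size puts a floor on MIDI timing granularity. A rough back-of-the-envelope sketch (assuming a 44.1 kHz sample rate; your actual rate may differ):

```python
# Rough estimate of the event-timing granularity imposed by the signal
# vector size when the scheduler runs in the audio interrupt.
# Assumes 44.1 kHz -- adjust SAMPLE_RATE to your setup.
SAMPLE_RATE = 44100.0

def vector_latency_ms(vector_size, sample_rate=SAMPLE_RATE):
    """Worst-case delay (ms) before an event can land on a vector boundary."""
    return vector_size / sample_rate * 1000.0

for size in (32, 64, 128, 256, 512):
    print(f"vector size {size:>3}: up to {vector_latency_ms(size):.2f} ms")
```

Note that even 512 samples is only about 11.6 ms at 44.1 kHz, so a full 100 ms would mean several buffers stacking up somewhere (I/O vector size, driver latency, etc.) rather than a single vector of delay.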
Does that make sense, or am I completely messing things up and just need to sleep right now…?