Say I’ve got a 3000 ms-long sample. I know that playing it back in 1500 ms (double speed) makes it an octave higher, and in 6000 ms (half speed) an octave lower.

What I’m trying to work out is a method of calculating the playback speed for each semitone. My brain keeps getting close to the answer, then going way off track.

This has to be an easy one for maths lovers?
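For what it’s worth, in equal temperament each semitone multiplies the playback speed by the twelfth root of two, i.e. a shift of n semitones gives a speed factor of 2^(n/12). A minimal sketch of that relationship (function names are just illustrative):

```python
def semitone_speed(n):
    """Speed multiplier for a shift of n semitones (n may be negative)."""
    return 2 ** (n / 12)

def playback_ms(original_ms, n):
    """Duration in ms when a sample is shifted by n semitones."""
    return original_ms / semitone_speed(n)

# 3000 ms sample: +12 semitones (one octave up) plays in 1500 ms,
# -12 semitones (one octave down) plays in 6000 ms.
print(playback_ms(3000, 12))   # 1500.0
print(playback_ms(3000, -12))  # 6000.0
print(semitone_speed(1))       # one semitone up, ~1.0595
```

So one semitone up means playing the sample roughly 5.95% faster, and each further semitone multiplies the speed by the same ratio again.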
