Problems with the rhythm section of a Markov algorithmic composition

    Mar 22 2013 | 7:12 pm
    Hi all, I'm currently working on a patch inspired by this great website.
    It's working fine with the pitch and velocity, outputting the values stored in the coll object, but I'm having trouble with the duration and delta sections: they don't seem to be passing numbers through to makenote, so at the moment I have some nice melodies but no rhythm.
    I'm kinda stumped as to why and would love to have a second (or third!) pair of eyes look at it. It works by analysing a MIDI format 0 file and outputting the note pitch, velocity, duration and delta probability values into a coll object in the form of a Markov chain, so anyone who would like to test it may have to use a MIDI file of their own; alternatively, I found this website quite good for them.
    Any help would be massively appreciated! Thanks!
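    Outside Max, the coll-style analysis described above can be sketched in Python. This is a minimal first-order illustration only (the actual patch uses separate tables for pitch, velocity, duration and delta); the function names and the example pitch list are hypothetical:

```python
import random
from collections import defaultdict

def build_markov(values):
    """First-order transition table: value -> list of observed successors.
    Duplicates in a successor list encode the transition probabilities,
    much like repeated entries stored in a coll."""
    table = defaultdict(list)
    for a, b in zip(values, values[1:]):
        table[a].append(b)
    return table

def generate(table, start, length):
    """Random walk over the table; falls back to a random state's list at a dead end."""
    out = [start]
    for _ in range(length - 1):
        choices = table.get(out[-1]) or random.choice(list(table.values()))
        out.append(random.choice(choices))
    return out

pitches = [60, 62, 64, 62, 60, 62, 64, 65, 64, 62, 60]  # hypothetical analysed pitches
melody = generate(build_markov(pitches), 60, 8)
```

    The same table-building step would be repeated per parameter (pitch, velocity, duration, delta) to mirror the four sections of the patch.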

    • Mar 23 2013 | 2:11 pm
      Your patch is pretty slick. I used that same algorithmic composer patch to analyze what I'm playing live, but I didn't think to do velocity and duration. What I did for rhythm was have the patch analyze for either a rest or a note; it then decides whether to play or not. The only issue with my solution is that the result is quantized to a metronome value, like 16th notes or 32nd notes, but it sounds fine for what I'm doing.
      I don't know if that helps, but I'd love to see what else you come up with on this patch.
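      A rough Python sketch of this rest-or-note idea, as I read the description above (the one-bar grid of 16th-note slots is hypothetical; 1 = attack, 0 = rest):

```python
import random

def rhythm_chain(grid):
    """First-order Markov table over a note/rest grid (one slot per 16th note)."""
    table = {0: [], 1: []}
    for a, b in zip(grid, grid[1:]):
        table[a].append(b)
    return table

# hypothetical analysed bar: attacks on beats 1, 3 and the 'and' of 4
grid = [1,0,0,0, 0,0,0,0, 1,0,0,0, 0,0,1,0]
table = rhythm_chain(grid)

slot, pattern = 1, []
for _ in range(16):                  # one generated bar, quantized to 16ths
    choices = table[slot] or [0, 1]  # coin flip if a state was never observed
    slot = random.choice(choices)
    pattern.append(slot)             # on each metro tick: play if 1, rest if 0
```

      Each metro tick consults one slot, which is why the output stays locked to whatever grid value (16ths, 32nds) the metro is set to.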
    • Mar 24 2013 | 7:55 pm
      Thanks very much, although it's heavily inspired by the original Algorithmic Composer patch, so those guys should definitely get the credit. Ah OK, I see. So what kind of method do you use to analyse for rests? And does that mean your patch will output a constant string of 16th or 32nd notes, depending on what it's set to, until you manually change it to quarter notes, for example? Or does each note change throughout the piece to give you a varying rhythm?
    • Mar 25 2013 | 3:39 am
      I posted a subpatch of what I'm working on. It might be really confusing and not help, but maybe not. The info in the coll "pitchMatrix2" is actually whether to play a note or a rest. When a note is played I assign it a 1, 2, 3 or 4 randomly, but a 0 if it's a rest. I did this so that the 2nd-order Markov analysis would have more numbers to work with, and so that I could just copy the same patch as the pitch-analysis part. Again, this is for a live performance, so it might not translate to a MIDI analysis. If I wanted to play a quarter note I could do it by attacking on the quarter note and resting for the other 16th notes, so I can still get longer rhythms, just nothing faster than a 16th.
      I hope this might help.
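      In Python terms, my understanding of that encoding might look like this sketch (rests become 0, attacks get a random label 1-4 so the 2nd-order analysis has more states to work with; the function names and event list are made up):

```python
import random

def encode(events):
    """0 for a rest, a random 1-4 for an attack; the extra labels just give
    the 2nd-order analysis more states, as described above."""
    return [0 if e == 'rest' else random.randint(1, 4) for e in events]

def second_order(seq):
    """2nd-order table: (two previous values) -> list of observed successors."""
    table = {}
    for a, b, c in zip(seq, seq[1:], seq[2:]):
        table.setdefault((a, b), []).append(c)
    return table

events = ['note', 'rest', 'rest', 'note', 'note', 'rest', 'note', 'rest']
seq = encode(events)
table = second_order(seq)
# when generating, any nonzero value means "attack", 0 means "rest"
```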
    • Jul 16 2013 | 4:55 pm
      Hello, a few months on... I happen to be playing around with the same patch from Algorithmic Composer. One thing I noticed is that the duration output of borax depends on the seq playing speed. If you set this to 'start 1024' then the durations are correct. To speed things up a bit, you could also use a multiple of 1024, provided you afterwards multiply the durations by the same factor.
    • Jul 17 2013 | 4:30 pm
      So that would limit analysis to 'realtime' speed, correct?
      Maybe the problem with the first patch was that you were feeding it a very (too) short MIDI file that doesn't even register a blip of delta time.
    • Jul 17 2013 | 4:43 pm
      It can be faster than realtime; the point is to multiply borax's duration and delta-time outputs by the same factor used to speed up seq. In fact, I tested with an 8-note file and the borax output seems correct.
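      The correction is simple arithmetic. As a sketch (the function name is hypothetical): scanning the file at speedfactor times realtime makes borax measure times that are speedfactor times too short, so multiplying restores the real values:

```python
def restore_times(measured_ms, speedfactor):
    """Undo the seq speed-up: with 'start 1024*speedfactor', borax measures
    wall-clock times that are speedfactor times too short."""
    return [t * speedfactor for t in measured_ms]

# e.g. scanning 4x realtime ('start 4096'): a true 500 ms note measures as 125 ms
restored = restore_times([125.0, 62.5, 250.0], 4)   # -> [500.0, 250.0, 1000.0]
```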
    • Jul 19 2013 | 5:04 pm
      Here's a fixed version of the patch (I think). I didn't know how to reconnect the patch back together.
      There's a speedfactor, which is how many times faster than realtime it scans through the MIDI file.
      I just noticed the duration values aren't working right, but I think that's because of how things are reconnected, since the delta values appear to be working fine.
    • Jul 19 2013 | 5:06 pm
      It's mainly the right inlet of [p MarkovDelta] that I wasn't sure what to connect to.
    • Jul 19 2013 | 10:15 pm
      I use the right inlet as a kind of reset, like with the others. Our patches are quite similar, except I use 2nd order for pitch only, and have a different reconnection for the delta (= 'pauses') generator. Handling of pauses is my main area of interest at this point; I can't seem to do in Max what I'm looking for (getting at gestures).
    • Jul 19 2013 | 11:24 pm
      So there is no rhythm whatsoever in your patch?
    • Jul 20 2013 | 9:35 am
      Well, some. The pause-between-notes chain is used to modulate note generation speed. Not unpleasant, but not quite what I'm looking for, either.
    • Jul 20 2013 | 10:05 am
      But apart from that it's just a stream of evenly spaced notes?
      The version I posted is kind of what I want, but with working duration (which doesn't appear to be working in what I posted).
      My intended usage for this is to improvise/generate an assortment of sections/material, then have several 2nd-order Markov chains generating self-similar versions, with structure/form handled by some other process.
      Essentially I'll use separate Markov chains to handle the A section, B section and C section, and then actually order those sections based on some other principle.
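      As a sketch of that sectional plan (all the material, section names and the ordering are hypothetical placeholders; it simply builds one independent order-2 table per section and concatenates generated fragments):

```python
import random

def build_markov(values, order=2):
    """Order-n transition table: tuple of previous values -> successors."""
    table = {}
    for i in range(len(values) - order):
        table.setdefault(tuple(values[i:i + order]), []).append(values[i + order])
    return table

def generate(table, seed, length, order=2):
    """Extend the seed by random walk; random fallback at unseen contexts."""
    out = list(seed)
    while len(out) < length:
        choices = table.get(tuple(out[-order:])) or random.choice(list(table.values()))
        out.append(random.choice(choices))
    return out

# one independent 2nd-order chain per section of material
sections = {
    'A': [60, 62, 64, 62, 60, 64, 62, 60],
    'B': [67, 69, 67, 65, 67, 69, 71, 69],
    'C': [72, 71, 69, 71, 72, 74, 72, 71],
}
chains = {name: build_markov(mat) for name, mat in sections.items()}

form = ['A', 'A', 'B', 'A', 'C', 'A']  # ordering decided by some other process
piece = []
for name in form:
    piece.extend(generate(chains[name], sections[name][:2], 8))
```

      Because each chain only ever sees its own section's material, each generated fragment stays self-similar to that section, while the form list stays free to be driven by any other principle.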
    • Oct 12 2013 | 2:42 pm
      Hi there,
      Any news on this?
      Live input possibilities etc.