Algorithmic Composition: An Introduction for the Curious, Terrified, or Perplexed Beginner
"(The Difference Engine's) operating mechanism might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should be also susceptible of adaptations to the action of the operating notation and mechanism of the engine. Supposing, for instance, that the fundamental relations of pitched sound in the signs of harmony and of musical compositions were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent."
(L.F. Menabrea, Sketch of the Analytical Engine (translation/annotation by Ada Lovelace), London, R. and J.E. Taylor, 1843)
Algorithmic composition is a rich and deep and subtle endeavor, and one of those things that probably every Max audio person out there has opinions about. If you're curious or a beginner, it's hard to know how to translate your curiosity or interest into action.
The multiplicity of opinions isn't so much a comment on the ability of people to disagree on a big idea; rather, it suggests that there are a whole lot of different approaches to using computers to generate music or audio, thought up by a whole lot of people over a very long time. (The bad news? There's no single best answer about how to do things. The good news? There's no single best answer about how to do things, so you might as well take a crack at it yourself and join the party!)
There are great collections of Max externals available through the Package Manager to assist you in exploring algorithmic composition - The Algorithmic Composer's Toolbox, the amazing Bach, Cage, and Dada collections of Max externals, the ml.star and ml.lib packages for exploring machine learning, and the recent Upshot release of generative musical modules.
But there are other, quieter and humbler places where you can start exploring your own solutions. That’s what I’d like to introduce in this tutorial: some places to start from the ground up.
In this tutorial, I’d like to encourage you to take those first steps and investigate how you can use algorithms to generate and organize musical variety. We’re going to be using Vizzie modules to do that.
Wait. What? Vizzie modules?
Yeah. You read that right. I’ve got two reasons for doing this:
- First, Max does everything it does using numbers. My friend and colleague David Z. famously once described Max as “making music using meaningless numbers.” One of Max’s great strengths when it comes to thinking about generating and organizing variety is that the numbers that Max passes around don’t have any intrinsic meaning – they mean what we want them to mean, and we can repurpose them to our heart’s content (which is really what transcoding means, when you think about it). So I thought I’d choose an unlikely source for making sounds using your computer’s AU DLS synth (Mac) or Microsoft GS Wavetable Synth (Windows) as a kind of Zen slap.
- The other reason for using Vizzie is a more humble one. In the interest of being approachable to beginning visualists, I decided to set a fixed range for all data passed between Vizzie modules – a floating-point range of 0. – 1.0. That range is used in all Vizzie modules, even when converting audio or video information to data. This means I’ve got a lot of Vizzie modules I can use as sources for generating data, and for processing it as well - the only thing I’ll need to do is have some tools that help me visualize how 0. – 1.0 floating-point values and the 0 - 127 integer data range that MIDI uses are related. For that, I’ll do what all Max programmers do: I’ll create a couple of little tools to smooth the way.
Note: There's an update to Vizzie available using the Package Manager. You should install it.
Making MIDI (A Brief Overview/Reminder)
One of the ways that Max uses “meaningless numbers” involves the formatting of MIDI messages that generate note events. That activity has its own rules. To play a MIDI note in Max, you need three values between 0 and 127 (as specified by the MIDI standard):
- A MIDI note number
- A velocity value (which roughly corresponds to how hard you hit the key)
- A duration value (the time between pressing down the key and picking it up)
Although you probably usually think of MIDI events as being the result of your action, there’s no reason that that list of three numbers need come from you (or your DAW); they’re just three values. Anything that can generate numbers in the range of 0 - 127 will work just fine.
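Stripped of patching, a note event really is just three numbers. Here's a minimal Python sketch (the function name and message layout are my own illustration, not anything from a Max patch) that pairs a note-on with the velocity-0 note-off that ends it - the bookkeeping that makenote handles for you:

```python
import random

def make_note(pitch, velocity, duration_ms):
    """Bundle one note event: a note-on, a duration, and the
    matching note-off (velocity 0), the way makenote pairs them."""
    note_on = (0x90, pitch, velocity)   # status byte 0x90 = note-on, channel 1
    note_off = (0x90, pitch, 0)         # velocity 0 is treated as note-off
    return note_on, duration_ms, note_off

# Anything that makes numbers in 0-127 will do -- here, a random source:
on, dur, off = make_note(random.randint(0, 127), 100, 250)
```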
For this tutorial, we’re going to generate MIDI notes, and use the default AU DLS synth that all of you have as output.
Riding Tandem With the Random
The first_patch file contains a recreation of the first and second Max patches I ever saw (in a lecture at the Academy of the Arts in Utrecht in 1989. Max was called “Patcher” back then). It’s a little hard to express just how revolutionary these two little bits of patching seemed to me back in the day.
While you may not be as excited as I was by these two patches, you can try them yourself by clicking on the title bars of the FADR or the RANDOMIZR modules to turn them on. In the first case, the immediacy of grabbing a fader and generating a stream of notes by wiggling it was astounding.
In this patch, the pair of Max makenote and noteout objects does the heavy lifting when it comes to turning that fader output into MIDI notes on the synth. With each change of a fader value, the patch sends a MIDI note message (all with the same velocity and duration) to the makenote object, which adds the matching note-off (velocity 0) message after the time specified by the duration in milliseconds and passes the results along to a noteout object.
The second patch was my first look at “algorithmic composition in Max.” A simple random number generator (in this case, the Vizzie RANDOMIZR module, which can be found in the Vizzie Generate browser tab) kicks out numbers in the range of standard MIDI note numbers. It did exactly the same thing with one crucial variation: a random number generator was doing what my fader used to do. It feels a little funny to write this now after so many years, but I was absolutely transfixed by this… for the first minute or two, anyway. While it was pretty cool to listen to the random number generator tick away, it sounded, well… random. And I wasn’t random when I played. So I wondered what I could do to change that. I tried two simple things to start with - I took two random sources and averaged their outputs, and also tried multiplying two random numbers together. At first I thought that I was getting the same output until I started really listening to each one (in these patches, I’ve added a multislider object that lets you visualize the random numbers as well as listen to them).
They were different.
I learned my very first lesson in algorithmic composition that day: listening. There are people out there who create arbitrary formal systems and then just turn them on and go with whatever comes out. What I learned is that - deep down - I wasn’t one of those people. I’ve spent the intervening years finding my way by listening and deciding at every point along the way whether what I’ve got is what I’d like to listen to. So I started thinking about what other sources for generating variety were out there for me to explore and listen to.
By the way - here’s a Max patch that helps you visualize the frequency of the output of note values over a long period of time for each of the three ways of working with randomness. With each button press, you generate a thousand random outputs, and then create a histogram that shows you the frequency of output for each value. The interesting results will confirm what you might have heard: adding and averaging the outputs tends to skew the results toward the center of the total output range, and multiplying them means that the single most commonly occurring result is zero (that higher number clinging to the left side of the rightmost multislider display).
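If you'd like to verify those shapes away from Max, here's a quick Python sketch of the same experiment (the histogram helper is my own stand-in for the multislider display). The exact counts change on every run, but averaging always bulges in the middle, and multiplying always piles up near zero:

```python
import random

N = 10000
averaged = [(random.random() + random.random()) / 2 for _ in range(N)]
product  = [random.random() * random.random() for _ in range(N)]

def histogram(values, bins=8):
    """Count how many values fall into each of `bins` equal slices of 0.-1."""
    counts = [0] * bins
    for v in values:
        counts[min(int(v * bins), bins - 1)] += 1
    return counts

print(histogram(averaged))  # the center bins dominate
print(histogram(product))   # the first (near-zero) bin dominates
```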
Note: Before I move on, I want to make something clear: My move away from randomness in terms of algorithmic composition was a strictly personal one (and I have sometimes gone back and thought again about using randomly created raw materials to achieve results I found pleasing). The history of algorithmic composition is full of extraordinary examples of stochastic processes used to create wonderful music, from Lejaren Hiller’s groundbreaking Illiac Suite to Iannis Xenakis’s stochastic synthesis, Barry Truax’s POD composition system, and so on. They’re well worth your attention, and great stochastic work is still being done.
The naïve explanation of my turn away from strict randomness in the early days was simple: Since I'm not random myself, the second I handed some part of my compositional process over to a random number generator, it was completely obvious that I’d done so. I found myself wondering about ways that I could live somewhere between the wide jumps of randomness and something a little, well… smoother. In doing so, I found myself venturing into the second part of what I think of when I think of algorithmic work - organizing the material I’d generated. My patient and long-suffering teachers aimed me at one of my life’s happier algorithmic discoveries: the drunkard’s walk.
Simply put, the drunkard’s walk is also a random process, but one with a twist. At each point, a random number is generated within a set range, and the result is then added to or subtracted from the current value. If the range is very small, the result is a kind of wandering path which, though randomly generated, looks like it has some kind of “goal.” If the range is large, the results look (and sound) random. The ACpatch_2 file shows an example of the drunkard’s walk in action. Turn on the WANDR Vizzie module (which can be found in the Vizzie Generate browser tab) by clicking on its title bar, and watch/listen as it produces output data using a drunkard’s walk algorithm. If you turn the Step Size dial, you’ll see the output change dramatically. I was particularly taken with the ability to change the “feel” of the output by altering the step size alone.
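In Python, the whole algorithm fits in a few lines. This is a sketch of the idea, not WANDR's actual implementation; the clipping to 0.-1. is my assumption, made to match the Vizzie data range:

```python
import random

def drunkards_walk(steps, step_size, start=0.5):
    """Yield a wandering sequence in the 0.-1. range.

    At each step, a random offset in [-step_size, step_size] is added
    to the current value, which is then clipped to stay inside 0.-1.
    """
    value = start
    for _ in range(steps):
        value += random.uniform(-step_size, step_size)
        value = min(1.0, max(0.0, value))  # keep within the Vizzie range
        yield value

smooth = list(drunkards_walk(16, 0.05))  # small steps: a goal-seeking wander
jagged = list(drunkards_walk(16, 0.5))   # large steps: sounds random again
```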
You’ll notice that the ACpatch_2 file contains a tool I built to simplify the process of working between floating-point values and MIDI values. The Data2MIDInote bpatcher handles the input data scaling and formatting to pass along to the Max noteout object. The Data2MIDInote bpatcher will display its floating-point input properly scaled and labeled, and you can also type in values to test your MIDI synth. Its insides are simple, but tools like this make it a lot easier to work quickly.
The other technique for managing random output I explored was good old averaging, using the AVERAGR module (which can be found in the Vizzie Control browser tab). Like the drunkard’s walk, it offered a similar ability to switch between the rough and the smooth. The results look and sound quite different, as you’ll probably notice by experimentation, but both techniques share that ability to move from randomness to something smooth, and to destinations in between. Which method do you prefer? Try them and find out.
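A running mean over the last few values is one plausible sketch of this kind of smoothing (AVERAGR's actual internals may differ, and the window size here is an arbitrary choice of mine):

```python
def running_average(stream, window=4):
    """Smooth a jumpy data stream by averaging its most recent values."""
    history = []
    for v in stream:
        history.append(v)
        if len(history) > window:
            history.pop(0)          # forget the oldest value
        yield sum(history) / len(history)

# A small window barely smooths; a large one flattens everything out.
smoothed = list(running_average([0.9, 0.1, 0.8, 0.2], window=2))
```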
As I kept on patiently trying new sources and listening to the results, I slowly began to produce some little tools that more closely resembled how I worked. One of those tools is memorialized as a Vizzie module: The JITTR module (which can be found in the Vizzie Generate browser tab).
ACpatch_3 provides a simple example of the JITTR module’s ability to introduce a controllable amount of randomness to each step in an input. This patch uses a new Vizzie module - the OSCIL8R, which functions as a floating-point data output source whose outputs will be recognizable as the kinds of waveforms you use all the time.
Here’s a simple version of that patch that uses the Vizzie JITTR module to introduce subtle to unsubtle degrees of departure from a triangle waveform output:
The right half of ACpatch_3 may look a little more complicated, but don’t be fooled: I’ve replaced the single OSCIL8R Vizzie generator module with its big sister, the 4OSCIL8R, which combines four synchronized output waveforms as data output sources. In the name of averaging, I’m using a pair of + 0. objects (we need the 0. to indicate we’re working with floating-point numbers) and a / 3. Max object to sum and average the first/second/third outputs of the 4OSCIL8R module, and similarly summing and averaging the fourth/fifth outputs, to create more exotic output shapes. Those are fed to a Vizzie controller module I’ve not used yet in this tutorial - the RANGR. The RANGR module (which can be found in the Vizzie Control browser tab) lets you set an upper and lower bound for data output, and then scales any input you give it in the 0. - 1.0 range to the new range.
Since I want to use this module to set the MIDI note number output, that means I can scale the notes I output to a specific pitch range. Of course, I need to know which floating-point value corresponds to a given MIDI key. While you and I are completely comfortable with the idea of adding or subtracting numbers to transpose MIDI notes (12 for octaves, 7 for fifths, etc.), doing that with floating-point numbers is, um… not at all easy or obvious. So I built myself a little tool to make the task easier.
The Data2Note bpatcher in the patches folder does two things:
- When you send it a floating-point number, it’ll show you the MIDI note that corresponds to it, along with its numerical value. It’s an easy way to inspect the flow of numbers through your patch.
- You can connect the bpatcher to the data inlet of any Vizzie module and type a MIDI note number or 0-127 value into the number boxes. Those values will be converted to the 0. - 1.0 data range and output.
The insides of the patch are very simple - see? You don’t need a degree in rocket surgery to do this stuff!
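The arithmetic behind a tool like this really is simple. Here's a Python sketch of the two-way conversion (the note-naming octave offset follows Max's convention of displaying MIDI 60 as C3; how the bpatcher labels things internally is my assumption):

```python
def data_to_midi(value):
    """Map a Vizzie 0.-1. value to a MIDI note number (0-127)."""
    return round(value * 127)

def midi_to_data(note):
    """Map a MIDI note number back into the Vizzie 0.-1. data range."""
    return note / 127

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(note):
    """Human-readable name (Max displays middle C, MIDI 60, as C3)."""
    return NOTE_NAMES[note % 12] + str(note // 12 - 2)
```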
I can use my Data2Note bpatcher attached to the low and high output range inputs to handle that. Despite the apparent simplicity of the patch, what’s coming out is starting to sound somewhat less random. Maybe even interesting. Try setting the values for the 4OSCIL8R module’s parameters and listen to the results!
Here are a few other Vizzie modules that you can use to control and organize your algorithm's outputs.
The DATAMANGLR module (which can be found in the Vizzie Control browser tab) is a cousin to the RANGR, with a crucial difference - you can not only set the high and low ranges of the output data, but also select the behavior for any input data that falls outside of the input range. You can fold, wrap, or clip the results, and there's the additional ability to take that output and rescale it to the full 0. - 1.0 data output range (shown in the center example).
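For the curious, here's what fold, wrap, and clip boil down to arithmetically - a sketch of the standard definitions, not DATAMANGLR's literal internals:

```python
def clip(v, lo, hi):
    """Pin out-of-range values to the nearest bound."""
    return min(hi, max(lo, v))

def wrap(v, lo, hi):
    """Wrap out-of-range values around to the other side, modulo-style."""
    return lo + (v - lo) % (hi - lo)

def fold(v, lo, hi):
    """Reflect out-of-range values back into the range, like a mirror."""
    span = hi - lo
    v = (v - lo) % (2 * span)
    return lo + (span - abs(v - span))
```

So an input of 1.25 against a 0.-1. range clips to 1.0, wraps to 0.25, or folds back down to 0.75.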
The DATASPLITTR module (another Vizzie Generate browser tab resident) lets you divide your output into specific regions and route any data falling in each region to one of four outputs - think of it as four RANGR modules hooked together!
The Vizzie MEDIATR control module, though somewhat exotic, provides an interesting way to deal with two simultaneous streams of output data. While you can use the module to perform standard averaging (in compromise mode), the MEDIATR also lets you alternately select from one of the two inlets (in debate mode) or randomly select an inlet at every step along the way (in quarrel mode). This module is perfect for more fragmented data sources (we'll look at another Vizzie module that you can use to sum and average multiple outputs next).
The Vizzie 4OSCIL8R module is a great source of linked and synched waveforms. While it's a great source of raw output all by itself, there are a few other neat tricks you can make use of when it comes to providing new sources for your algorithms.
The Vizzie 4DATAROUTR module (which can be found in the Vizzie Control browser tab) provides a simple and elegant way to sum and average a number of data inputs to produce new forms and shapes. When combined with the synched output of 4OSCIL8R module, you can create some interesting outputs which keep time with each other while producing more exotic output that you can further process.
Like all Vizzie modules, the 4OSCIL8R provides a way to set the front panel parameter values on the fly by sending data in the 0. – 1.0 range into their inlets. One really interesting patch you might want to investigate involves what happens when you use the output of one of the 4OSCIL8R module’s data to modify the Multiplier value of another oscillator. It can do some really amazing things (you can find this technique described in more detail in an overview/tutorial piece I did on the rate~ objects).
More Generating, More Organizing
Let's add another way to generate variety, and a Max tool created to further organize floating-point to MIDI note conversions.
Anyone who's ever met me knows only too well that I am a champion of chaotic attractor equations. For me, they come as close as any easily available tool in my Max toolbox to producing output that feels "intentional," and can also be broken down into subunits that produce periodic outputs that, when summed, give you the attractor output. I even wrote a tutorial on creating a Pluggo plug-in based on the Navier-Stokes equation whose output could be used to control and modulate other Pluggo plug-ins (Yeah - my enthusiasm goes waaay back).
ACpatch_4 prominently features the Vizzie ATTRACTR module. It's got a Navier-Stokes equation inside, and gives you scaled Vizzie-style data outputs for each of the 5 subunits of the equation along with a summed and averaged output of all 5 outlets. This patch also includes a new Vizzie trick for beginners and a new bpatcher to make your algorithmic composition explorations a little easier.
In this patch, the outputs of the ATTRACTR module are summed and averaged in pairs, with each pair driving one of the MIDI note parameters. In the case of the MIDI note parameter (ATTRACTR outputs 1 and 2), you'll notice an object called vz.rangr in the patch. This is exactly equivalent to the RANGR module we used in the last patch - every Vizzie module has an equivalent you can create by typing vz followed by a period and the name of the Vizzie module in lowercase letters. The object that appears has the same number of inlets and outlets, and you can easily view the module's front panel by double-clicking on the object. Using the abstraction is a great way to save patching space when you don't need to see the module's controls. I'm using a pair of Note2Data bpatchers to set the output range of the RANGR module. But what's that long, narrow collection of 12 boxes whose output is passed from the vz.rangr abstraction and on to the left inlet of the Note2MIDI? The pattern of white and blue boxes might provide a clue, along with the contents of those boxes. It's the third tool I've created to make working with floating-point values in MIDIland a little easier - the Note2Transpose bpatcher, a pitch-by-pitch transposition tool.
When you add a Note2Transpose bpatcher to your patch, it initially appears with a standard octave of keyboard pitches. Typing a new note value (within an octave of the original pitch) into the box associated with a key will remap any keyboard input associated with the original key position to the new pitch. Here's an example of an ordinary Note2Transpose bpatcher and one set to play a version of a pentatonic scale based on G:
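Conceptually, the bpatcher is a 12-entry lookup table indexed by pitch class. Here's a Python sketch with a hypothetical table that pushes every input note onto a pentatonic collection built on G (the specific mapping is my own example, not the one in the patch):

```python
# One remapped pitch per chromatic position (0 = C ... 11 = B).
# This example table forces every note onto D, E, G, A, or B --
# the pitch classes of a pentatonic collection built on G.
REMAP = {0: 2, 1: 2, 2: 2, 3: 4, 4: 4, 5: 7,
         6: 7, 7: 7, 8: 9, 9: 9, 10: 11, 11: 11}

def transpose(note, table=REMAP):
    """Remap a MIDI note's pitch class while keeping its octave."""
    octave, pitch_class = divmod(note, 12)
    return octave * 12 + table[pitch_class]
```

Middle C (60) becomes D (62), G (67) stays put, and the same mapping repeats in every octave.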
Finally, the ACpatch_4 patcher also showcases the ability to "turn off" repeated notes when the range specified for the vz.rangr abstraction is small. Try connecting or summing and averaging different groupings from the six outputs of the ATTRACTR module and add key mappings of your own....
Image To Sequence
I'm going to close this tutorial/list of things to explore by revealing my until-now secret agenda: There's another reason I chose Vizzie for this investigation of generative/algorithmic possibilities - it's really easy to experiment with translating visual output to audible results.
ACpatch_5 introduces a humble approach to translating visuals that works quite well with the Vizzie modules that generate RGB patterns (3PATTERNMAPPR and 3EASEMAPPR). The ANALYZR module (which can be found in the Vizzie Generate browser tab) will average the amount of red, green, and blue in the incoming texture and output those values individually. I'm using them to generate whole-tone scale arpeggios.
ACpatch_6 introduces another pair of new modules: The BFGENER8R, which uses standard basis functions to create video textures for display and mangling in Vizzie, and the Vizzie SCANLINR module, which lets you grab a horizontal or vertical scanline of an image and transform it into a sequence of data (that we can use as MIDI notes).
Once you're done having fun with these patches, you really should open up the jit.gl.bfg object's helpfile and explore the object. It's an astounding source of amazing visuals. Trust me - it's worth every second of it.
The SCANLINR module takes the visual output of the BFGENER8R module and lets you choose a horizontal or vertical scanline from the input, which is then mapped to a sequence of values (2 - 64 steps) that you can "step through" using the Speed control. You can also select random stepthrough of the sequence, or set an increment to "count by" in traversing the sequence. It's a perfect way to send a sequence of pitches on for more algorithmic adventure. There are thousands of possibilities. This patch has one other subtle trick - the images output by the BFGENER8R can be animated using the Speed control on the BFGENER8R's front panel. That value is set to 0 in this example, so the output to the SCANLINR is static. Once you change that Speed value, the sequence starts to dance! There's a ton to explore here, and you should keep in mind that you can use the SCANLINR with regular Vizzie video input as well.
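The core idea - one row of pixels becomes a steppable sequence - is easy to sketch in Python. Here, brightness values stand in for pixels, and the stepping logic mirrors the module's increment control (SCANLINR's actual internals surely differ):

```python
def scanline_to_sequence(image, row):
    """Grab one horizontal scanline of a grayscale image (a list of rows
    of 0.-1. brightness values) and return it as a data sequence."""
    return list(image[row])

def step_through(sequence, steps, increment=1):
    """Cycle through the sequence, counting by `increment` per step."""
    index = 0
    for _ in range(steps):
        yield sequence[index]
        index = (index + increment) % len(sequence)
```

Feed each yielded value to something like the Data2MIDInote scaling described earlier, and a line of pixels becomes a melody.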
John Cage tells us that not knowing where to begin is a common form of paralysis. His advice: Begin anywhere. - Bruce Mau
I hope you'll find this modest collection of modules and tools to be a good place to start exploring generating and organizing variety using Max. There are myriad other ways to do this, of course. I wish you a fantastic journey.
Further Exploration - Places to start
- Although it's a little dated (it was written in 1988), Gareth Loy’s “Composing with Computers – a Survey of Some Compositional Formalisms and Music Programming Languages” is a motherlode of wonderful information, and guaranteed to set you straight when it comes to knowing a little bit about algorithmic composition in the days before we had computers (or an Ada Lovelace to speculate about how such imagined machines might be used). The pre-computational historical survey is worth the price of admission alone, and it’s also a good look at the state of the field in the late 1980s. It's as beautiful as a prehistoric dragonfly trapped in amber.
- Along the way, Cycling '74 people have occasionally reviewed books that we think deserve pride of place on your Max/MSP bookshelf. Some of them provide the grounding for those of you setting out (such as The Computer Music Tutorial, Gareth Loy’s Musimathics I & II, or Curtis Roads’ Microsound and Composing Electronic Music). We've also recommended sources which are more straight-ahead studies or discussions of algorithmic composition, such as V. J. Manzo's Interactive Composition or - more recently - the magisterial Oxford Handbook of Algorithmic Music.
- Some of the finest minds of the Max community have shared the fruits of their hard work on occasion, and given you the opportunity to sit at the feet of the masters. Run, don't walk and have a look at Chris Dobrian's Algorithmic Composition Blog and William Kleinsasser's Shared Software Project. You can thank me later.
- You're interested in algorithmic composition, so it shouldn't surprise you that you're not alone. The Cycling '74 website's Max Projects area is a rich source of ideas, works, and inspiration. All it takes is a quick search for Max Projects related to algorithmic composition on the Cycling '74 website, and you'll be on your way.
by Gregory Taylor on Jun 9, 2020