Need a timeline that automates your Max patchers, but can also connect to other things? Do you want it to communicate over Open Sound Control? What if you could have two-way communication, so a standalone Open Sound Control sequencer could query the state of your Max patcher?
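Under the hood, an OSC message is a small binary packet: a null-padded address string, a type-tag string, and big-endian arguments. As a rough illustration (plain Python, standard library only; the /tempo address is just a made-up example, and inside Max you would normally rely on the udpsend and udpreceive objects rather than encoding packets yourself), a query message might be assembled like this:

```python
import struct

def osc_pad(s: bytes) -> bytes:
    # OSC strings are null-terminated, then padded to a 4-byte boundary.
    return s + b"\x00" * (4 - len(s) % 4)

def osc_message(address: str, *args) -> bytes:
    # Build the type-tag string (",i" for an int, ",f" for a float)
    # and the big-endian argument payload.
    typetags, payload = ",", b""
    for a in args:
        if isinstance(a, bool) or not isinstance(a, (int, float)):
            raise TypeError("only int and float args in this sketch")
        if isinstance(a, int):
            typetags += "i"
            payload += struct.pack(">i", a)
        else:
            typetags += "f"
            payload += struct.pack(">f", a)
    return osc_pad(address.encode()) + osc_pad(typetags.encode()) + payload

# A query a standalone sequencer might send to a patcher listening
# on a udpreceive object (hypothetical address):
packet = osc_message("/tempo", 120)
```

Sent over UDP, a packet like this is exactly what a udpreceive object in a Max patcher unpacks on the other end.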
Francisco Colasanto — an Expo ’74 exhibitor and skilled Max user, teacher, and technical coordinator at CMMAS — has released the second volume of his “Programming Guide for Artists” books. His first volume was the first Spanish-language book devoted to Max. The second volume is an eBook and takes advantage of video and sound available for the medium. Available in English and Spanish, the second volume focuses on the Max 6 interface. Both volumes appeal to people who want to learn about Max, and are also resources for people who want to teach Max.
While strolling through the internet, you sometimes stumble upon stuff which you like, and sometimes stuff which you really like. This video, brought to my attention by one of my colleagues, is something which I really, really like.
Even before I got it translated by a Japanese-speaking friend, the radiant smiles of the children and the cool setup were fascinating. OK, it seems to be sponsored by a big car company, but the initiators of this project appear to be asking more than just what the car of the future will look like. They want to provoke the children’s creativity and to show them that it is their ideas that can transform anything into anything. And I think they demonstrate this nicely by transforming a car into an instrument, using self-recorded sounds and the help of Max, Ableton Live and some sensors – and it seems to have been a lot of fun too :)
Now these children are telling us that in the future, cars might be their friends or climb trees, or many other things – there are no limits to your imagination. I think that this is a great message to teach.
I am constantly surprised by the work that Max programmers accomplish. Whether creating video projection effects, sensor-based installations, generative compositions or immersive 3-D worlds, the breadth of work produced is amazing. But there is more to the Max community than that – it is also a group of people who are eager to share the things they learn.
As a company, we’ve worked hard to create the documentation necessary to take advantage of new technologies and new features. But there are limits to the amount of content we can develop, and difficulties in meeting the diverse needs of our users’ interests. As a result, we’ve decided to open up a new avenue for information to be created and disseminated by our employees – and also by our users.
The mechanism that we are using is a Wiki system (MediaWiki, to be precise). This is the backbone behind the Wikipedia online encyclopedia, and represents a flexible and robust way to store information. Everything about a Wiki is flexible; you can modify the organization, contents and presentation of a Wiki without going through any sort of vetting procedures. Thus, a Wiki is perfect for a community that wants to share and is willing to have the content management process be open and transparent.
What makes sense to place within a Wiki? We think it is a great place for all sorts of information: extended reference material, tutorials, discussions of technique and methods for working with hardware. We’ve started the system with a few top-level categories:
- All About Max: This is the location for information about Max itself. It is a great place for tutorials and curriculum on Max training, but it is also where we’ve placed the extended reference material. Right now, most of these reference pages are placeholders; if you have information about how you use a particular object, this is a great place to share your notes.
- Max Interfaces: This is where we talk about technologies that exist outside of Max. Hardware information lives here (and we have some placeholders for the Kinect, Lemur and Monome), but this is also a good place for virtual technology interfaces such as DMX and OSC.
- Topics and Techniques: Have you been using Max long enough to remember the old Topic and Tutorials manual? That manual was a useful repository for discussions of special features and functions used in Max programming, and this section is a good location for Max-wide discoveries. If you want to talk (or read) about concepts larger than a single object, or about operating the Max application, this is your home!
- People and Places: Are you interested in Max workshops, schools that teach Max or people that can do consulting? Or do you give workshops or teach classes? Look here for information, and feel free to add your own information in this area.
It is the final section that is probably the most important part of the Max Wiki. Labeled “Do you want to write an article?”, this section provides the information you need to edit articles, include web and media links in your articles, and fit your work into the site’s existing categories. Entering information into a Wiki is pretty straightforward, but you may want to look here to make sure you get the best formatting available for your content.
The Wiki couldn’t have come to life without the work of a lot of people. Gregory Taylor, the Wiki Gardener, did a lot of research before selecting the MediaWiki tool, and has led the charge to make it happen. The entire Cycling ’74 web team helped integrate the Wiki into the rest of the site (have you noticed the Wiki tab now available in searches?), and is helping provide support and maintenance for the future.
But most of all, we want to thank you, the users of Max, for your willingness to share. It is this openness that has created an incredible community of media artists and developers.
Note: In order to help get the Max Wiki started – and to celebrate the sharing – we are offering a bonus for the first 50 major content contributors: a “Give Max to a Friend” coupon, which provides a free 12-month Max license to anyone that has not previously owned Max. Once you’ve updated the Wiki with something that you feel is “significant” content, drop me (ddg @ cycling74 . com) an email and I will send you a serial number to pass along to someone you know.
Holland Hopson’s Post and Beam was released last year, but I stupidly didn’t fall in love with it until recently. I guarantee you’ve never heard anything like it — beautifully performed original and traditional folk songs set against an electronic dreamworld. I can’t think of a recording that provides a more powerful study in contrasts — heartfelt and alienating most of all. Check it out and see if you don’t agree that the Maxified banjo is the up-and-coming instrument of the decade!
Moritz Simon Geist has built a (giant) robotic version of the classic Roland 808 drum machine using Max for Live, Arduino, and a heap of solenoids.
From his website: “I see hacking – in this case music hacking – as a form of anti-passiveness, through which I think the individual can have an impact on his or her environment, status and state of mind…Within the last five years the rise of Arduino, Ableton Live and Max/MSP has made this much easier.”
Deep in Cycling ’74’s backyard, Eric Maundu is doing wonderful things with an Arduino-controlled hydroponic plant-growing system. The possibilities for our tools are endless!
More info here.
I recently encountered an interesting group of people doing what they call “sketching in hardware”. Look at last year’s conference to get an idea of the diversity of this concept. Underlying all this diversity is a breadboarding approach that feels very much like Max: connecting modules as a form of experimentation, trying to simplify the transition from idea to hardware.
In this kind of environment, it’s tempting to start tinkering right away. Sometimes that works, but it’s easy to get sidetracked. So I resist the temptation and instead try to spend a lot of time exploring ideas beforehand. I go to art museums and performances, explore the Cycling ’74 projects page, and talk with friends.
Once I have a clear idea of the project I want to try, I create a new patcher – and then leave the computer at home and go for a long walk. There are so many ways to realize an idea, and implementing an idea awkwardly takes just as much time as doing it right, so I take time to sort it out and focus on the essentials.
Then it’s time for pencil and paper. I start by writing down the key organizational and/or esthetic requirements for the patch: What would Version 1 look like? How will I know when I’m finished? Staying with pencil and paper, I sketch out the overall plan for the project, decide which modules I need and what they need to do, and imagine what it will be like to play with the system. Finally, I put together a little to-do list and head back to the computer.
As I build the patch, I try to keep it in a functioning state: “always up and running”. That reminds me of one of my favorite orchestral conductors, who would begin each rehearsal with a run-through. If the patch is always ready to go, then making the final version reliable is easy!
As the functional aspects of the patch come together, it might be a good point to look at ways of refining the overall idea. This would be a great time to step away from the computer once again and pick up a good book. Gregory Taylor turned me on to this gem, Universal Principles of Design:
Looking through this book and thinking about my project, I sometimes get insights into how to refine it to achieve a kind of elegance. Just because my patch works doesn’t mean its function is optimized. It’s satisfying to finish with a Presentation Mode full of grace and functionality.
Keeping the Big Picture clear from the beginning through the end results in a Max project that is fun to use, works well, and opens up options for the future.
Eric Lyon, developer of numerous cool MSP objects including the FFTease series, has written an amazingly comprehensive new book on writing audio externals in C for Max and Pd. Eric takes you through a series of twelve examples to illustrate how to implement audio and DSP concepts in external objects. Learn more about Designing Audio Objects for Max/MSP and Pd on the publisher’s web site or on Amazon.
Cycling ’74 will be closed October 17th, 18th, and 19th for company-wide meetings. We will re-open October 22nd. Orders, authorization, and support inquiries will be delayed until then. Thanks in advance for your patience!
This weekend, I’m heading to Pittsburgh, PA to perform as part of the annual VIA Festival. VIA is an all-volunteer-run festival that pairs hot musical acts with current visual artists to create a unique audiovisual experience. I’m excited to be part of the show, working with the original moombahton group Nadastrom and checking out some really amazing artists along the way.
Memorable Expo ’74 Artist Jeremy Bailey will also be joining the festival along with some other really fantastic people. For my set on Saturday night, I will be trying an experiment with iPhone cameras in the crowd streaming through a complex video performance patch that has been in constant evolution since 2009. To help make this happen, I’ll be leaning on Airbeam Pro and Syphon to route the iPhone streams into Max, along with a small army of volunteer phone-camera people. If you are near Pittsburgh this weekend, come check it out.
Saturday, October 6, 2012. 8pm-1am, 6000 Penn Ave, Pittsburgh, PA.
I’ve spent the last week or so with Brian Eno and Peter Chilvers’ new iPad app Scape, and I have to say that I’m impressed – both by the application itself, and also by the experience of using it.
The video explains what’s going on quite well, and there are a couple of good pieces about the app and its creators here and here. But I’ve found myself thinking a lot about the interface and what is not explained – about the relationship between the iconic representation of elements in Scape, their function, and the process by which I’ve come to some understanding of what’s going on. (My thinking here probably has some of its origins in our focus on the idea of discoverability during the luge ride that was Max 6 development, but I’ve been fascinated by how trying to make sense of the program has slowed me down in a rewarding way.)
On one level, the app certainly is discoverable in the sense that there are intuitive models from other software threaded through the design that make it “easy” to use. But I’m interested in the parts that aren’t efficient in the “Let’s get this interface stuff out of the way so that we can get down to producing things” sense.
I think that this application shares some features with slow food in that there’s an admirable sense of reward that originates in process – interacting with the application over time, and learning by going where to go.
In the case of Scape, that winds up being about looking and listening in real time. There are no pop-up quickie hints, no chatty paperclips. You will only figure out how things work by working with them – watching and listening. The elements themselves invite scrutiny about representation – both in how they look and in the way that they’re animated onscreen while they’re “running.”
From the very first, you start wondering about the relationship between the little animations you see and what you hear, and about what the graphic space in which you’re working represents. If you skootch similar elements so that they’re in close proximity, their size sometimes alters. What does that mean? When elements approach the edges of the screen, they shrink or vanish. Huh?
All that might lead you to believe that there’s a simple one-to-one relationship between what you see and what you hear. But that’s not quite true. Some elements, when placed, appear to “stay out of the way” of similar elements. As you watch and listen, it becomes clear that there are other, more subtle rules that govern the interaction between different elements – but it’s something you’re going to hear rather than see – you need to trust your ears. It’s inefficient, but I’ve really enjoyed the process.
The pleasure of the relationship continues as you use the application: the next time you launch the app to create a scape, the interface may open to show you that a new icon has been added to your palette. What does it do? Only one way to find out – drop it into a blank scape and listen. Add another element and see if things change. Choose a new palette on the right and see how that affects things. You can leverage what you already know for this exploration, but it’s the same iterative slow-time activity. And the new element is always a small surprise to open and explore.
I expect that there are some users who’ll be driven to howlin’, fist-shaking fury at the way that Scape doesn’t explain itself (or the way it isn’t documented). I’m sure they’re dedicated and clever individuals for whom the time necessary to live with the app and to develop some kind of personal and inner map of what they think it does (acts which I’d say live at the corner of Idiosyncrasy and Virtuosity) might seem a waste of valuable time if they’re focused on doing the pragmatic thing and just wanting to “get on with making Enoesque audio.” (The good news for them is that I expect the “random” feature of the app will be as good as they’d be without the investment of time and attention and the effort of listening. Maybe better. Or it’ll at least save a lot of time – simply punch the button and listen for 20 seconds, then maybe toggle the moods on the right, and stardom will certainly follow.)
But I’m seriously entertaining the thought that this application is interesting and reasonably unique precisely because of the way it’s set up to encourage developing a relationship with its interface that’s reinforced by the temporally bound experience of listening, and because of the way the application unpacks itself over time in a way that encourages the continuation of that relationship. I’ve never run into anything quite like it.
By now, my version of Scape has what I think might be a full palette (although I’d love to be surprised again), but the thing still engages me – for example, are my intuitions about how the elements interact anything more than personal ones? I expect that the answer is either “Not at this point,” or that I’ll eventually develop enough facility that my idiosyncratic readings will constitute my “style” when working with Scape. Maybe that personal or internal map of things is somehow the point – instead of something defined for me, I’m writing my own internal manual for the interface, and reworking the documentation with each new piece I get as I work with it. (Some things have escaped me, and will probably continue to – I note that Eno says that some elements play more sparsely based on time of day, which I would probably have been the last person in the world to figure out; so far, I have spent my time with it in the evenings only.)
And, as an Oblique Strategies aficionado from waaaaay back, how can I not like the Scape Strategies?
To finish on a note more directly related to Max, I don’t think it’d be particularly difficult to add some of what I think I’m hearing in action to the Max patching I already do, honestly [another reason I’ve found working with it to be a salutary experience – the ideas aren’t that difficult or subtle; their implementation is]. Obviously, as a programmer I am going to be less successful at surprising myself – but I do think that considering probabilistic interactions outside of the boundaries of an individual bpatcher – between what I think of as the pieces of what I’m using to create the larger whole – may yield some interesting results.
And, since it might be that I’ve learned an interesting lesson, that process may… um… take a while. A good while.
Don’t worry, fans of creativity: SUE-C is still creating thoughtful teases of imagery and transforming spaces to exercise and delight your imagination muscle. She uses Max for both her live performances and recorded work to animate hundreds of different objects (paper, photos, small models, shiny things, etc.). You can see pictures of her set-up here.
Her newest work, Infinite Jest, lives both as an installation and as a live handmade film inspired by the complex and remarkable novel of the same name by the late author David Foster Wallace. When the piece lives as an installation, the audience experiences the space as an environment composed of projected videos, a text-based soundtrack, a gaming console and a mini tennis court, intended for the audience to walk through and play with. During the performance, the film is brought to life through the lens of live cameras that follow the manipulation of photographs, drawings, scale models and various three-dimensional objects by visual artist and performer SUE-C, along with a lush live electronic soundtrack and vocals by AGF. Set in a slightly futuristic world, the film is an attempt to create and re-create what the character James Orin Incandenza, optics expert and filmmaker, considered to be his life’s major works. After many unseen failures, his ‘film’, an entertainment, is eventually released – and it proves to be fatally seductive.
With this as a jumping off point, long time audio-visual collaborators SUE-C and AGF explore the expression of seduction in sound and image in collaboration with Kevin Slagle. You can see photos of the installation here.
The installation opens on October 5th at LaBoral Art Center in Gijon, Spain, and stays until February 25th. The live show is October 6th at 8pm.
SUE-C will also be accompanying Morton Subotnick at SFMOMA on November 15th for a concert featuring his most famous piece, “Silver Apples of the Moon,” with her live animation.
More about SUE-C and friends here:
While I generally leave the business of standing alone on stage and fingersquiggling on a smartphone touchscreen to others as a performance modality, I remain an unabashed fan of the idea of being able to construct elegant multitouch module interface/performance setups like this one:
The video crossed my transom almost in tandem with the announcement of the release of Brian Eno and Peter Chilvers’ most recent iPad app Scape, and together they strike me as a useful object of contemplation: thinking about the exercise of a “reverse engineering” approach when it comes to making new things in Max, and how (or whether) it interacts with the idea of encoding opportunities for virtuosity.
Even on those occasions where I have tried to duplicate something that moved me as exactly as I could, the result was never a precise copy. Despite that, it seems as though nearly every important thing I learned about Max programming winds up being traceable to what happens when you break down a problem into its component parts and then try to imagine what the “Max version” of those parts would be. A really interesting give and take usually occurs while the process is going on – one that, in retrospect, might be more interesting than the initial goal.
In the case of the touchpad/Eurorack example, you can actually puzzle it out – a combination of the text itself and a quick look at the front panels and jacks for the Cyclebox, ES-3, Maths, Doepfer A-132-3, and Tom Erbe’s Echophon* modules in the rack will more or less tell you what you need. So, reverse engineering this wouldn’t be too awful a proposition.
Beyond that, it’s just a question of downloading a fingerpinger external for Max and engaging in a little tweakery.
That’s the way it is when someone else breaks the conceptual ground for you; you’re freed up to find the parts you’re less comfortable with and tweak the patch, or use what you find for the next thing.
* Mr. Erbe, for those of you who may not recognize his name, is the author of the wonderful suite of Soundhack MSP external objects.
- Connect with Codebar:
Gen patchers now have a sidebar that shows you the text code generated by the patcher in the GenExpr language. This will give you a better idea of what’s going on behind the scenes in Gen. Select objects in the patch to highlight the corresponding code segment, or copy the code from the codebar to use inside of a codebox. It’s a great way to learn about the connection between visual and textual programming in Gen.
- Organize Subpatchers:
Now you can use subpatchers and abstractions to make your Gen patches easier to organize visually and create reusable elements, just like you do in traditional Max patches. Use the gen object, with an optional filename to nest a Gen patcher or abstraction.
- Name Your Send/Receive Objects:
We’ve added the ability to clean up some patch-cord clutter in Gen patches with the introduction of named send/receive objects. They are limited to use within a single Gen patcher, but they can make patchers easier to read and maintain when you have many objects and connections between different logical sections of your patch.
- Save Time with Loops and Branching:
Now GenExpr supports looping and branching constructs like for, while, and if/then. These make it cleaner to write code which does similar things many times, and can save CPU by only evaluating code under certain conditions.
For more information on Gen, check out our Gen Patch-a-day series.
- Grow with Abstractions:
All VIZZIE modules can now be used like any other Max abstraction. (For example, the BRCOSR module’s abstraction is called vz.brcosr.) You can now integrate VIZZIE modules into your regular patching and use them the way you’d use any Jitter object, and you can mix and match VIZZIE bpatcher modules with VIZZIE abstractions when you don’t need a module’s user interface (If you do need to see the UI, just double-click on the abstraction).
- Hit the “Stomp Box Switch”:
VIZZIE module displays still function as on/off “Stomp Box Switches”, but you can also use Max toggle objects or 0/1 messages to turn VIZZIE modules/abstractions on and off.
- Discover Presets:
You can use Max messages to VIZZIE modules and abstractions to choose presets or interpolate between any two stored presets on the fly.
- Try the New Modules:
VIZZIE now includes two new modules and abstractions: use CROPPR/vz.croppr to sample a portion of your video and move it about, or expand it to full frame size. And WYPR/vz.wypr will come in handy for those classic screen-wipe effects (try ‘em with a little CHROMAKEYR and some vz.rotatr for some serious fun).
In Max 6, Jitter Physics is a new system for programming physical simulations in any Max patcher, opening up traditional game engine features and realtime motion graphics techniques. Since it is integrated with Max, you can also use physical systems for more subversive tasks like controlling audio or video in ways that have nothing to do with physics or animation as we know it. Based on user feedback, we’ve added many useful features to Jitter Physics in Max 6.0.7.
- More Useful Collisions:
We’ve made major changes to how collisions are detected and used, including collision force and normal information, and the ability to query collisions from an individual object as they happen, rather than relying on the world’s report of all collisions.
- Come Together with UI handling:
We’ve improved how objects can be selected and operated on with mouse and other UI events, and how the physics system and the OpenGL system can work together.
- Physics Multiplied:
Using jit.phys.multiple you can easily create many instances of the same physics body by passing a Jitter Matrix. In this update, jit.phys.multiple gets a set of really exciting new features like the ability to add constraints (joints, motors, etc.) to these instances for more complex simulations.
It’s not always easy to describe what Max does or how it works to your relatives, but now I can say “look at the Bay Bridge” thanks to artist Leo Villareal and his crew.
Read more about the project:
So happy to see Alex Harker’s and Pierre Alexandre Tremblay’s amazing work with the convolution reverb (and many other things) packaged up and announced. If you haven’t downloaded and played with it yet, please do so!
Here are some places to find more impulse responses:
And remember that convolution isn’t just for reverbs…
Loud Objects do live improvisational electronics, soldering complex noise circuits from scratch. The performance is as visual as it is sonic – they do all the soldering on the surface of an overhead projector, and you can see the smoke and spatter as their hands manipulate the silhouettes of microchips and wire. It’s more like jazz than engineering – watching them perform you get the sense that they’re more wrangling chaotic forces than fulfilling a schematic. As a topography of wires and chips emerges, the sound becomes richer, more complex, more convoluted. At one point, they step away from the circuit, which is now generating an evolving pattern. The pattern cycles faster and faster, and at its peak, they cut the power.
A Loud Objects performance is an audio/visual event, a generative system, and a physical instrument all in one. And it is, in fact, quite loud.
See and hear Loud Objects at SFEMF.
Pomona College in Southern California has been a Cycling ’74 customer for over a decade. Beyond our appreciation for Pomona’s use of the software in several different departments, we can empathize with the school’s nerdy obsession with a certain number.
You can probably imagine that we too ascribe magical powers to a certain number similar to the one revered at Pomona, a power which often seems even greater when we are confronted with numbers on either side. I’ve pleaded with my friend Andreas Killen to write a sequel to his wonderful book 1973 Nervous Breakdown with a book about, well, you know… (as a programmer I would think of this as an off-by-one error). Whenever I go to a bakery or a meat counter that uses a take-a-number system, I always hope I will be able to tear off the number 74. In fact if the currently available number is, say, somewhere in the high sixties I might just hang out for a little while. The best possible result would be at a place like Gayle’s where they use take-a-numbers that are of the form
Have you been seeing 74 everywhere lately?
Stanford University has begun an important project to preserve the works of the great Max Mathews. Geoffrey C. Willard at Stanford University Libraries posts about it on their Digital Library Blog.
Read the complete post here, where you can hear a sample of Mathews’ work and find compilations by other early electronic music gurus for sale (album cover art at left).
I’ve always been fascinated by the ability of sensors to pick up on our brain waves and react with a physical outcome – like in cool old ’70s movies where the girl moves the train using alpha waves, or the new Star Wars-based toy that lets kids move the trainer remote using The Force.
Now thanks to Max and the work of artist Andrey Smirnov and his assistants, we can hear the brain and all its chattering feedback.
Way to think outside the box!
Since Labor Day is coming, you might have some spare time this weekend and might be wondering what you could do. How about a little Max project?
Here’s one simple but fun idea, based on an optical experiment I found on Brusspup’s Youtube Channel. How about making a patch that would let you do this?
No doubt you are familiar with the wagon-wheel effect, which is what’s happening in the video above. The interaction between the frequency of movement and the frame rate of the camera produces this cool gravity-defying effect. You might be able to do something low-tech for the waterfall, but for the Max patch, you should go nerdy!
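The trick behind the effect is aliasing: the camera samples the motion at its frame rate, so any motion frequency gets folded back into the range the camera can represent. A minimal numeric sketch (assuming an idealized camera with an instantaneous shutter):

```python
def perceived_frequency(f_motion: float, fps: float) -> float:
    # The camera samples the motion fps times per second, so the
    # apparent frequency is folded (aliased) into [-fps/2, fps/2].
    alias = f_motion % fps
    if alias > fps / 2:
        alias -= fps
    return alias

print(perceived_frequency(24.0, 24.0))  # 0.0  -> the motion looks frozen
print(perceived_frequency(23.0, 24.0))  # -1.0 -> it appears to drift backwards
print(perceived_frequency(25.0, 24.0))  # 1.0  -> it creeps slowly forwards
```

Drive the water with a frequency a hair below the frame rate and the stream appears to crawl slowly upward on camera.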
For instance, instead of using cycle~ to generate a low frequency, why not make your own using a fast algorithm (just like this one) in gen~?
To capture the video in Jitter, jit.grab will do the work, but you might also want to record the result using jit.qt.record and upload it to YouTube! Don’t forget to share it.
Happy Labor Day!
Recently, when I was hearing about a very exciting forthcoming stage production (you’ll just have to watch this space!) that uses Max, I was reminded of one of the last theater productions I saw that used the software: Schick Machine by the Paul Dresher Ensemble. This amazing piece continues to tour around the world (it was in Hong Kong last month and will be in Illinois early next year).
We did a series of interviews with Paul Dresher and Alex Stahl about the software behind the show when it premiered a few years ago. If you never saw these interviews at the time, they’re a fascinating look at how software development interacts with virtuosity, danger, and the demands of the stage.