A while back, I wrote a series of four tutorials on generating and organizing variety in Max patches.
In this article, Jim Aikin reviews the new add-on product to Live, developed by Ableton and Cycling '74, with a detailed account of his experience.
Matthew Davidson, aka Stretta, is a talented guy. He’s an accomplished graphic artist and video producer/editor, but we talked to him about his music. Stretta’s music is lush, modest, and dreamy in the tradition of Brian Eno, but it definitely has a character of its own. Stretta comes from a tradition of modular synthesis that led him to discover Max/MSP.
Cycling '74 and 85,000 of its closest friends will spend the weekend at the NAMM Show in Anaheim.
We are sharing booth 6314 with Ableton and offering personal demos of our recently-released product Max for Live.
While many people are looking at Max for Live as a great way to integrate their favorite hardware controllers, build really unique effects, and add variety to their productions, I was eager to explore what could be done with video inside of Max for Live.
Even before the Max for Live beta was opened up to the public, a community of testers was hard at work putting Max for Live through its paces.
Robert Henke is a brilliant electronic musician who records and performs under his own name and also as Monolake. His music has been described as minimalist yet complex techno with an architectural sound. For me, his music is very spatial and multi-dimensional. I find it takes me on an extraordinary journey through space and time, similar to a great work of fiction. Henke recently said, "The last century was about the creation of electronic music. This century is about performance."
Coming up with ways to get information about the physical world into Max is one of the most fun aspects of working with the software. Whether it is for video processing, sound creation, or any other type of output, physical interactions provide a space for much more interesting relationships to develop. Unfortunately, many ways to get this information into Max require the user to get comfortable with connecting wires to circuit boards and understanding basic (and sometimes not-so-basic) electronics. For this reason, camera-based interactivity can be pretty enticing. There is also a reasonably low startup cost, and plugging in a camera is usually a pretty user-friendly process. In this article, I will share a couple of basic techniques for using affordable webcams to gather data in MaxMSP/Jitter.
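To give a flavor of the kind of data a webcam can yield, here is a minimal sketch of frame differencing, one of the most common camera-based tracking techniques. This is written in Python with NumPy rather than as a Jitter patch, and the frames, the `motion_amount` function name, and the threshold value are all illustrative assumptions, not anything from the article itself.

```python
import numpy as np

def motion_amount(prev_frame, curr_frame, threshold=30):
    """Return the fraction of pixels whose brightness changed by more
    than `threshold` between two grayscale frames (uint8 arrays).
    The threshold of 30 is an arbitrary choice for this sketch."""
    # Widen to int16 so the subtraction cannot wrap around at 0 or 255.
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > threshold
    return moving.mean()

# Synthetic 8x8 grayscale frames: a bright 2x2 "object" moves one pixel right.
prev = np.zeros((8, 8), dtype=np.uint8)
prev[3:5, 2:4] = 200
curr = np.zeros((8, 8), dtype=np.uint8)
curr[3:5, 3:5] = 200

print(motion_amount(prev, curr))  # 4 changed pixels out of 64 -> 0.0625
```

In a real setup the frames would come from a capture device, and the resulting motion value could drive sound or video parameters, much as a Jitter matrix analysis would.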
In her first years of using Max, Pamela Z was able to shed the hardware weight by building her own instrument in Max. Take a look inside her Max instrument and her custom controllers during this presentation from Expo '74.
In this installment of the Video Processing System, we're going to tackle two big hurdles that Jitter users often find themselves coming up against. The first thing we will add is an improved, high performance video player module based around the poly~ object. This will allow us to load a folder full of videos and switch between them quickly and efficiently. The other module we will add is a simple recording module to capture our experiments. Since we are using OpenGL texture processing to manipulate the video, it is a little bit more complicated than just using jit.qt.record, but not by much.
Lately, I've been working on some "classic" OpenGL programming within Jitter, and I've been using jit.gl.sketch to do that work; it is very close to the OpenGL syntax that you find in most books, and is fairly forgiving in terms of incoming data type. However, I got very tired of editing message boxes once the programs got a little bigger, but I wanted replaceable parameters like you get with a message box.
Alex Stahl is a veteran collaborator, and this has never been more evident than in his work with composer Paul Dresher on the opera Schick Machine. As Robert Henke pointed out at the recent Max/MSP/Jitter conference, Expo '74, many of us spend years working on the same Max patch. Alex Stahl has spent years developing the Max/MSP patches that are at the core of Schick Machine. Along the way, he's developed skills that landed him a fascinating job at Pixar Studios. Collaboration can be quite useful in this world. Read more...
Eowave has introduced another product in their line of sensor-to-MIDI interfaces: the Eobody2 HF, a wireless sensor-to-USB-MIDI device. Building on the user-friendly and rock-solid USB MIDI technology used in other recent Eobody boards, the HF allows you to place interactive sensor electronics on dancers, small objects, or anything else where cables would get in the way. Now that we have some of these in stock at the Cycling '74 office, I sat down to give them a thorough run-through to see how it all works.
Noriko Matsumoto is an amazing artist with an amazing range of work. Read Greg Taylor's interview with her.
So far we have talked about how Max for Live will allow you to create your own custom Max devices that run inside of Ableton Live. Most of the examples you've seen so far have been pretty similar to your average plugin, with the fundamental difference of being able to edit the device in place. That in itself is pretty spectacular, and probably enough to please a lot of people and keep everyone busy. Now I'd like to talk about a couple of features that really make Max for Live unique and pretty exciting: namely, the Live API objects.
I will try to summarize here what I thought were some of the highlights of NIME 2009...
I'd like to share some really simple things that have worked for me that I hope you'll find useful, or that may provide a starting point for your own investigations.
Many of us are invited to perform in unique circumstances – it’s a part of the digital media life. Recently, we’ve been featuring some interesting examples of Max-based work, including Andrew Benson’s work with M.I.A. and Dana Karwas’ installations. So when I was asked to play with an electronic music All-Star Band, I couldn’t help but document the experience.
On the afternoon of April 3rd, I received an email from M.I.A.'s manager asking if I'd be interested in working with them on a one-off show on the mainstage at Coachella that would feature live video processing... Upon reflection, I would also like to share a couple of valuable lessons I learned working on this production...
Last week, we put on our first conference. Now that Expo '74 is history, I've been asked to share my thoughts about the experience...
In this installment, we'll be working on some more advanced ninja tricks - creating the beginnings of a control/preset structure with assignable LFOs, and building a GPU-based video delay effect. These two parts will bring our system to a much more usable level, and allow for much more complex and interesting results. Ironically, most of what we are really doing in this installment is just an extension of bread-and-butter Max message passing stuff.
Keith McMillen Instruments recently impressed all of us at NAMM with demonstrations of a new pair of string performance devices, the K-Bow and StringPort, both of which include some very rich software applications written in MaxMSP. The K-Bow, a Bluetooth-based wireless gestural controller integrated into a violin bow, has just started shipping, so we thought it would be a good time to catch up with Keith and find out more about the project. I met Keith at his studio...