Articles

  • EM Reviews Max for Live

    In this article, Jim Aikin reviews the new add-on product for Live, developed by Ableton and Cycling '74, with a detailed account of his experience.


  • An Interview with Stretta

    Matthew Davidson, aka Stretta, is a talented guy. He’s an accomplished graphic artist and video producer/editor, but we talked to him about his music. Stretta’s music is lush, modest, and dreamy in the tradition of Brian Eno, but it definitely has a character of its own. Stretta comes from a tradition of modular synthesis that led him to discover Max/MSP.


  • Cycling ’74 at the 2010 NAMM Show

    Cycling '74 and 85,000 of its closest friends will spend the weekend at the NAMM Show in Anaheim.

    We are sharing booth 6314 with Ableton and offering personal demos of our recently-released product Max for Live.


  • A Video Processing Device for Max for Live

    While many people are looking at Max for Live as a great way to integrate their favorite hardware controllers, build really unique effects, and add variety to their productions, I was eager to explore what could be done with video inside of Max for Live.


  • The Edit Button Has Been Pressed

    Even before the Max for Live beta was opened up to the public, a community of testers was hard at work putting Max for Live through its paces.


  • An Interview with Robert Henke

    Robert Henke is a brilliant electronic musician who records and performs under his own name and also as Monolake. His music has been described as minimalist yet complex techno with an architectural sound. For me, his music is very spatial and multi-dimensional. I find it takes me on an extraordinary journey through space and time, similar to a great work of fiction. Henke recently said, "The last century was about the creation of electronic music. This century is about performance."


  • Making Connections: Camera Data

    Coming up with ways to get information about the physical world into Max is one of the most fun aspects of working with the software. Whether it is for video processing, sound creation, or any other type of output, physical interactions provide a space for much more interesting relationships to develop. Unfortunately, many ways to get this information into Max require the user to get comfortable with connecting wires to circuit boards and understanding basic (and sometimes not-so-basic) electronics. For this reason, camera-based interactivity can be pretty enticing. There is also a reasonably low startup cost and plugging a camera in is usually a pretty user-friendly process. In this article, I will share a couple of basic techniques for using affordable webcams to gather data in MaxMSP/Jitter.


  • Pamela Z’s Presentation from Expo ’74

    In her first years of using Max, Pamela Z was able to shed the hardware weight by building her own instrument in Max. Take a look inside her Max instrument and her custom controllers during this presentation from Expo '74.


  • The Video Processing System, Part 4

    In this installment of the Video Processing System, we're going to tackle two big hurdles that Jitter users often find themselves coming up against. The first thing we will add is an improved, high-performance video player module based around the poly~ object. This will allow us to load a folder full of videos and switch between them quickly and efficiently. The other module we will add is a simple recording module to capture our experiments. Since we are using OpenGL texture processing to manipulate the video, it is a little bit more complicated than just using jit.qt.record, but not by much.


  • Creating a “Sketchpad” for jit.gl.sketch

    Lately, I've been working on some "classic" OpenGL programming within Jitter, and I've been using jit.gl.sketch to do that work; it is very close to the OpenGL syntax that you find in most books, and is fairly forgiving in terms of incoming data type. However, I got very tired of editing message boxes once the programs got a little bigger, but I wanted replaceable parameters like you get with a message box.


  • Max for Live Presentation at Expo ’74

    In this presentation, I spoke directly to people that were already familiar with Max, explained some of the details of working within the Live environment, and provided some tips about how to design an effective Live device. Hopefully this will whet your appetite for working with the Max/Live combo!


  • A Video and Text Interview with Alex Stahl

    Alex Stahl is a veteran collaborator, and this has never been more evident than in his collaboration with composer Paul Dresher for the opera Schick Machine. As Robert Henke pointed out at the recent Max/MSP/Jitter conference, Expo '74, many of us spend years working on the same Max patch. Alex Stahl has spent years developing the Max/MSP patches that are at the core of Schick Machine. Along the way he's developed skills that landed him a fascinating job at Pixar Studios. Collaboration can be quite useful in this world.


  • Eowave has introduced another product in their line of sensor-to-MIDI interfaces: the Eobody2 HF, a wireless sensor-to-USB-MIDI device. Building on the user-friendly and rock-solid USB MIDI technology used in other recent Eobody boards, the HF allows you to place interactive sensor electronics on dancers, small objects, or anything else where cables would get in the way. Now that we have some of these in stock at the Cycling '74 office, I sat down to give them a thorough run-through to see how it all works.


  • An Interview with Noriko Matsumoto

    Noriko Matsumoto is an amazing artist with an amazing range of work. Read Greg Taylor's interview with her.


  • So far we have talked about how Max for Live will allow you to create your own custom Max devices that run inside of Ableton Live. Most of the examples you've seen so far have been pretty similar to your average plugin, with the fundamental difference of being able to edit the device in place. That in itself is pretty spectacular, and probably enough to please a lot of people and keep everyone busy. Now I'd like to talk about a couple of features that really make Max for Live unique and pretty exciting: namely, the Live API objects.


  • A Look Back at NIME 2009

    I will try to summarize here what I thought were some of the highlights of NIME 2009...


  • LFO Tutorial 4: Building Complexity

    I'd like to share some really simple things that have worked for me that I hope you'll find useful, or that may provide a starting point for your own investigations.


  • Experiences from “Welcome Sound”

    Many of us are invited to perform in unique circumstances – it’s a part of the Digital Media life. Recently, we’ve been featuring some interesting examples of Max-based work, including Andrew Benson’s work with M.I.A. and Dana Karwas’ installations. So when I was asked to play with an electronic music All-Star Band, I couldn’t help but document the experience.


  • Pluggo Technology Moves to Max for Live

    Effective immediately, Cycling ’74 will discontinue sales of prebuilt Max-based audio plug-in packages. This includes Pluggo, Mode, Hipno, and UpMix. We will still continue to support current users as best we can, but there will be no further development on either the plug-in packages or their supporting technology.


  • Jitter on the Mainstage at Coachella

    On the afternoon of April 3rd, I received an email from M.I.A.'s manager asking if I'd be interested in working with them on a one-off show on the mainstage at Coachella that would feature live video processing... Upon reflection, I would also like to share a couple of valuable lessons I learned working on this production...


  • A Look Back at Expo ’74

    Last week, we put on our first conference. Now that Expo '74 is history, I've been asked to share my thoughts about the experience...


  • The Video Processing System, Part 3

    In this installment, we'll be working on some more advanced ninja tricks - creating the beginnings of a control/preset structure with assignable LFOs, and building a GPU-based video delay effect. These two parts will bring our system to a much more usable level, and allow for much more complex and interesting results. Ironically, most of what we are really doing in this installment is just an extension of bread-and-butter Max message passing stuff.


  • An Interview with Keith McMillen

    Keith McMillen Instruments recently impressed all of us at NAMM with demonstrations of a new pair of string performance devices, the K-Bow and StringPort, both of which include some very rich software applications written in MaxMSP. The K-Bow, a Bluetooth-based wireless gestural controller integrated into a violin bow, has just started shipping, so we thought it would be a good time to catch up with Keith and find out more about the project. I met Keith at his studio...


  • The Video Processing System, Part 2

    In our last article, we began to create our processing system by putting the essential structure in place and adding our input handling stage. In this installment, we are going to add a Gaussian blur and color-tweaking controls to our patch.

