Use DSP code inside your Max patch.
In the third installment of the Jitter Recipe Collection, the “AnaglyphRender” recipe builds on the previously posted “RenderMaster” recipe to create a real-time 3-D anaglyph image.
Density is a new interactive real-time program for asynchronous granular synthesis and sound file granulation, generating a wide range of effects: time/pitch shifting, pitch disintegration, time jittering, sound pulverizing, scrub-pad exploration, dynamic envelope drawing, and more. Interpolation transitions and Hyper Vectorial pads provide a powerful way to generate incredible sound objects.
A live performance ensemble using Max/MSP.
Authors: Andrew Shoben, Neil Gavin, Christopher Lackey
"Words" is an installation in which participants enter a soundscape with three layers: Environment, Words, and User.
Alex Stahl is a veteran collaborator, and this has never been more evident than in his work with composer Paul Dresher on the opera Schick Machine. As Robert Henke pointed out at the recent Max/MSP/Jitter conference, Expo '74, many of us spend years working on the same Max patch. Alex Stahl has spent years developing the Max/MSP patches that are at the core of Schick Machine. Along the way he's developed skills that landed him a fascinating job at Pixar Studios. Collaboration can be quite useful in this world. Read more...
An amazing artist with an amazing range of work: read Greg Taylor's interview with Noriko Matsumoto.
So far we have talked about how Max for Live will allow you to create your own custom Max devices that run inside of Ableton Live. Most of the examples you've seen so far have been pretty similar to your average plug-in, with the fundamental difference of being able to edit the device in place. That in itself is pretty spectacular, and probably enough to please a lot of people and keep everyone busy. Now I'd like to talk about a couple of features that really make Max for Live unique and pretty exciting: namely, the Live API objects.
I will try to summarize here what I thought were some of the highlights of NIME 2009...
I'd like to share some really simple things that have worked for me that I hope you'll find useful, or that may provide a starting point for your own investigations.
Many of us are invited to perform in unique circumstances – it’s a part of the Digital Media life. Recently, we’ve been featuring some interesting examples of Max-based work, including Andrew Benson’s work with M.I.A. and Dana Karwas’ installations. So when I was asked to play with an electronic music All-Star Band, I couldn’t help but document the experience.
Last week, we put on our first conference. Now that Expo '74 is history, I've been asked to share my thoughts about the experience...
In this installment, we'll be working on some more advanced ninja tricks - creating the beginnings of a control/preset structure with assignable LFOs, and building a GPU-based video delay effect. These two parts will bring our system to a much more usable level, and allow for much more complex and interesting results. Ironically, most of what we are really doing in this installment is just an extension of bread-and-butter Max message passing stuff.
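The assignable-LFO work itself is done with Max objects, but the underlying idea — a free-running oscillator whose 0-to-1 output is scaled onto whichever parameter it is assigned to — can be sketched in a few lines of Python (all names here are hypothetical illustrations, not taken from the article):

```python
import math

def lfo(rate_hz, shape="sine"):
    """Return a function mapping time (seconds) to a control value in [0, 1]."""
    def value(t):
        phase = (t * rate_hz) % 1.0
        if shape == "sine":
            return 0.5 + 0.5 * math.sin(2 * math.pi * phase)
        if shape == "saw":
            return phase
        if shape == "square":
            return 1.0 if phase < 0.5 else 0.0
        raise ValueError(shape)
    return value

def assign(lfo_fn, lo, hi):
    """Scale a 0..1 LFO output onto a destination parameter's range."""
    return lambda t: lo + (hi - lo) * lfo_fn(t)

# e.g. sweep a (hypothetical) blur amount between 0 and 10 every 4 seconds
blur_amount = assign(lfo(0.25, "sine"), 0.0, 10.0)
```

The point of the `assign` step is that the LFO itself stays generic; only the scaling layer knows anything about the destination parameter, which is what makes the LFO freely reassignable.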
Keith McMillen Instruments recently impressed all of us at NAMM with demonstrations of a new pair of string performance devices, the K-Bow and StringPort, both of which include some very rich software applications written in Max/MSP. The K-Bow, a Bluetooth-based wireless gestural controller integrated into a violin bow, has just started shipping, so we thought it would be a good time to catch up with Keith and find out more about the project. I met Keith at his studio...
In our last article, we began to create our processing system by putting the essential structure in place and adding our input-handling stage. In this installment, we are going to add a Gaussian blur and color-tweaking controls to our patch.
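In the patch this runs on the GPU, but the math of a separable Gaussian blur — build a normalized 1-D kernel, then convolve each row (and then each column) with it — can be illustrated with a short Python sketch (function names are my own, not from the article):

```python
import math

def gaussian_kernel(radius, sigma):
    """Normalized 1-D Gaussian kernel of length 2*radius + 1."""
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur_row(row, kernel):
    """Convolve one row of pixel values with the kernel, clamping at the edges."""
    r = len(kernel) // 2
    out = []
    for i in range(len(row)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(row) - 1)  # clamp index at borders
            acc += w * row[idx]
        out.append(acc)
    return out
```

Because the kernel is normalized, a flat region of the image passes through unchanged; running the same pass over rows and then columns gives the full 2-D blur at far less cost than a 2-D kernel.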
In this, the final episode of our guitar processing extravaganza, we are going to step away from making effects and focus on performance support. For a system as complicated as this, performance support means two things: patch storage and realtime control. Thus, we will learn to create a preset system and manipulate the various on-screen controls with an inexpensive MIDI footpedal system.
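In Max, patch storage of this kind is typically handled with objects like preset or pattr; as a plain illustration of the concept — storing and recalling named snapshots of control values — here is a small Python sketch (all names hypothetical):

```python
presets = {}

def store(slot, controls):
    """Snapshot the current control values into a numbered preset slot."""
    presets[slot] = dict(controls)

def recall(slot, controls):
    """Overwrite the live controls with a stored snapshot, if one exists."""
    if slot in presets:
        controls.update(presets[slot])

# A MIDI footpedal would simply send a slot number to recall().
controls = {"drive": 0.7, "delay_mix": 0.3}
store(1, controls)
controls["drive"] = 0.2   # performer tweaks a knob...
recall(1, controls)       # ...then stomps the pedal to restore the preset
```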
At this point, we have a pretty useful guitar processing "rack", but it could use a little spice. This spice will come from two additional processors: a looping delay unit, and a basic reverb system. Also, to help keep the output useful, we will drop a limiter on the back end of the entire rig.
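The delay and reverb live in the MSP patch, but the limiter on the back end is simple enough to sketch: track a gain value, clamp it instantly whenever a sample would exceed the threshold, and let it recover slowly afterwards. A rough Python illustration (parameter names are assumptions, not from the article):

```python
def limiter(samples, threshold=0.8, release=0.999):
    """Peak limiter sketch: instant attack, slow exponential gain recovery."""
    gain = 1.0
    out = []
    for s in samples:
        gain = min(1.0, gain / release)       # recover toward unity gain
        if abs(s) * gain > threshold:
            gain = threshold / abs(s)         # instant attack: pin the peak
        out.append(s * gain)
    return out
```

Signals below the threshold pass through untouched, while anything hotter is held at the ceiling — which is exactly the "keep the output useful" job the limiter does at the end of the rig.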
This article provides a brief tour of the features we've added to Max for creating Live devices.
Between the tutorials, Jitter Recipes, and all of the example content, there are many Jitter patches floating around that each do one thing pretty well, but very few of them give a sense of how to scale up into a more complex system. Inspired by a recent patching project and Darwin Grosse's guitar processing articles, this series of tutorials will present a Jitter-based live video processing system using simple reusable modules, a consistent control interface, and optimized GPU-based processes wherever possible. The purpose of these articles is to provide an over-the-shoulder view of my creative process in building more complex Jitter patches for video processing.