Laplace Tiger (2009) 12".
A large video performance patch incorporating over 30 video filters, quad-warping, and other goodies (a major overhaul was inspired by Andrew Benson's Video Processing System articles).
One of the most feared and respected objects in the Jitter collection, jit.expr arrived on the scene as part of Jitter 1.5.
Authors: Gloria Gorchs, David Morella, Marco Domenichetti
Concept and production: Gloria Gorchs, David Morella, Marco Domenichetti. ...is a return trip into an illustrated album.
While many people are looking at Max for Live as a great way to integrate their favorite hardware controllers, build really unique effects, and add variety to their productions, I was eager to explore what could be done with video inside of Max for Live.
The Méta-Mallette is a platform into which you can load audio/graphics virtual instruments (made in Max) or VST plugins.
Author: Livid Instruments
MIDI controllers supported with a variety of Max/MSP/Jitter patches for audio and video.
VJ application for video playback and mixing.
This project uses ReWire and a series of pseudo-random MIDI triggers and locators.
Authors: Micheal Miller, Brad Baumgardner, Dhivya Ketharnath, Sriram Pavan Kumar Tankasala, and Eric Souther
The Life of the Techno Buddha was constructed from 108 YouTube videos found using the search term “Buddha”.
Coming up with ways to get information about the physical world into Max is one of the most fun aspects of working with the software. Whether it is for video processing, sound creation, or any other type of output, physical interactions provide a space for much more interesting relationships to develop. Unfortunately, many ways to get this information into Max require the user to get comfortable with connecting wires to circuit boards and understanding basic (and sometimes not-so-basic) electronics. For this reason, camera-based interactivity can be pretty enticing: the startup cost is reasonably low, and plugging in a camera is usually a user-friendly process. In this article, I will share a couple of basic techniques for using affordable webcams to gather data in Max/MSP/Jitter.
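Outside of Max, the core camera-tracking idea, comparing successive frames and measuring what changed, can be sketched in a few lines. This is a minimal NumPy illustration of frame differencing, not the article's actual Jitter patch; the function names and the synthetic test frames are my own.

```python
import numpy as np

def motion_amount(prev_frame, curr_frame, threshold=30):
    """Fraction of pixels whose brightness changed by more than
    `threshold` between two grayscale frames (values 0-255)."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > threshold).mean()

def motion_centroid(prev_frame, curr_frame, threshold=30):
    """(row, col) center of the changed pixels, or None if nothing
    moved: a crude single-point 'tracker'."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    ys, xs = np.nonzero(diff > threshold)
    if len(ys) == 0:
        return None
    return (ys.mean(), xs.mean())

# Synthetic 8x8 frames: a bright 2x2 block "moves" between corners.
a = np.zeros((8, 8), dtype=np.uint8)
b = np.zeros((8, 8), dtype=np.uint8)
a[0:2, 0:2] = 255
b[6:8, 6:8] = 255
print(motion_amount(a, b))    # 8 changed pixels out of 64 -> 0.125
print(motion_centroid(a, b))
```

In a real setup, the frames would come from a webcam and the resulting numbers would drive sound or video parameters, which is essentially what the Jitter-based techniques in the article do with matrices.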
In this installment of the Video Processing System, we're going to tackle two big hurdles that Jitter users often find themselves coming up against. The first thing we will add is an improved, high performance video player module based around the poly~ object. This will allow us to load a folder full of videos and switch between them quickly and efficiently. The other module we will add is a simple recording module to capture our experiments. Since we are using OpenGL texture processing to manipulate the video, it is a little bit more complicated than just using jit.qt.record, but not by much.
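The reason a poly~-based player can switch clips quickly is that every clip is opened ahead of time, so switching only changes which already-loaded instance is routed to the output. Here is a rough sketch of that pooling pattern in Python; the `Player` and `PlayerPool` classes are illustrative stand-ins, not part of the actual patch.

```python
class Player:
    """Stand-in for one preloaded movie player (one poly~ voice)."""
    def __init__(self, path):
        self.path = path              # in Max this would be an opened movie
    def frame(self):
        return f"frame from {self.path}"

class PlayerPool:
    """Open every clip up front; switching is just an index change."""
    def __init__(self, paths):
        self.players = [Player(p) for p in paths]  # preload everything
        self.active = 0
    def switch(self, index):
        self.active = index % len(self.players)    # instant: no file I/O here
    def frame(self):
        return self.players[self.active].frame()

pool = PlayerPool(["a.mov", "b.mov", "c.mov"])
pool.switch(2)
print(pool.frame())   # frame from c.mov
```

The cost is memory (every clip stays open), which is the usual trade-off this kind of high-performance player accepts in exchange for seamless switching.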
Alex Stahl is a veteran collaborator, and this has never been more evident than in his work with composer Paul Dresher on the opera Schick Machine. As Robert Henke pointed out at the recent Max/MSP/Jitter conference, Expo '74, many of us spend years working on the same Max patch. Alex Stahl has spent years developing the Max/MSP patches at the core of Schick Machine, and along the way he has developed skills that landed him a fascinating job at Pixar Studios. Collaboration can be quite useful in this world. Read more...
Noriko Matsumoto is an amazing artist with an amazing range of work. Read Greg Taylor's interview with her.
I will try to summarize here what I thought were some of the highlights of NIME 2009...
On the afternoon of April 3rd, I received an email from M.I.A.'s manager asking if I'd be interested in working with them on a one-off show on the mainstage at Coachella that would feature live video processing... Upon reflection, I would also like to share a couple of valuable lessons I learned working on this production...
Last week, we put on our first conference. Now that Expo '74 is history, I've been asked to share my thoughts about the experience...
In this installment, we'll be working on some more advanced ninja tricks: creating the beginnings of a control/preset structure with assignable LFOs, and building a GPU-based video delay effect. These two additions will bring our system to a much more usable level and allow for much more complex and interesting results. Ironically, most of what we are really doing in this installment is just an extension of bread-and-butter Max message-passing techniques.
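An "assignable LFO" boils down to a periodic function whose 0-to-1 output is mapped into whatever range the target parameter expects. The sketch below shows that idea in Python; the class, its shapes, and the hypothetical "blur amount" parameter are illustrative assumptions, not the article's actual control structure.

```python
import math

class LFO:
    """Minimal low-frequency oscillator: given a time in seconds, return a
    control value mapped into a target parameter's range."""
    def __init__(self, rate_hz, lo=0.0, hi=1.0, shape="sine"):
        self.rate_hz, self.lo, self.hi, self.shape = rate_hz, lo, hi, shape

    def value(self, t):
        phase = (t * self.rate_hz) % 1.0           # position in cycle, 0..1
        if self.shape == "sine":
            raw = 0.5 + 0.5 * math.sin(2 * math.pi * phase)
        elif self.shape == "ramp":
            raw = phase
        else:                                      # triangle
            raw = 1.0 - abs(2.0 * phase - 1.0)
        return self.lo + raw * (self.hi - self.lo) # map 0..1 into lo..hi

# Assign one LFO to a hypothetical "blur amount" parameter (range 0..10):
blur_lfo = LFO(rate_hz=0.5, lo=0.0, hi=10.0, shape="triangle")
print(blur_lfo.value(0.0))   # start of cycle -> 0.0
print(blur_lfo.value(1.0))   # half a cycle at 0.5 Hz -> peak, 10.0
```

Making the LFO "assignable" is then just a routing question: any parameter that accepts a number in a known range can be driven by an instance like this, which in Max is plain message passing.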
In our last article, we began to create our processing system by putting the essential structure in place and adding our input-handling stage. In this installment, we are going to add a Gaussian blur and color-tweaking controls to our patch.
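A Gaussian blur is a weighted average of each pixel with its neighbors, and because the 2-D Gaussian is separable, it can be done as a horizontal pass followed by a vertical pass, which is also why GPU shader blurs typically run in two passes. This is a minimal one-row sketch in Python, not the patch's actual shader; the function names and edge-clamping choice are my own.

```python
import math

def gaussian_kernel(radius, sigma):
    """1-D Gaussian weights, normalized so they sum to 1."""
    weights = [math.exp(-(i * i) / (2 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def blur_row(row, kernel):
    """Convolve one row of pixel values with the kernel, clamping at the
    edges. A full 2-D blur runs this horizontally, then vertically."""
    radius = len(kernel) // 2
    out = []
    for x in range(len(row)):
        acc = 0.0
        for k, w in enumerate(kernel):
            xx = min(max(x + k - radius, 0), len(row) - 1)  # clamp to edges
            acc += row[xx] * w
        out.append(acc)
    return out

kernel = gaussian_kernel(radius=2, sigma=1.0)
print(round(sum(kernel), 6))                # normalized weights -> 1.0
print(blur_row([0, 0, 255, 0, 0], kernel))  # impulse spreads into a bell
```

Larger `sigma` (with a matching `radius`) gives a softer blur; because the weights sum to 1, the overall brightness of the image is preserved.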