Part 3 of the tutorial series on Livid Instruments' Code.
Thomas Resch talks about his Expo '74 workshop.
Converts incoming MIDI notes into 4-note chords and sends them to the MIDI output in real time.
Elise Baldwin is an intermedia artist who works with music and projections.
Author: Livid Instruments
MIDI controllers supported by a variety of Max/MSP/Jitter patches for audio and video.
Comprehensive step sequencing device for Max For Live.
Audio looping environment for live and studio use.
This project uses ReWire and a series of pseudo-random MIDI triggers and locators.
The Software Klee Step Sequencer is a MIDI-enabled recreation of the electro-music.com hardware Klee step sequencer - developed with the input of the sequencer's creators! It reproduces almost all of the features of the hardware version, and adds many new features as well.
OMM is a robotic orchestra led by a human performer's gestures.
Eowave has introduced another product in their line of sensor-to-MIDI interfaces: the Eobody2 HF, a wireless sensor-to-USB-MIDI device. Building on the user-friendly and rock-solid USB MIDI technology used in other recent Eobody boards, the HF allows you to place interactive sensor electronics on dancers, small objects, or anything else where cables would get in the way. Now that we have some of these in stock at the Cycling '74 office, I sat down to give them a thorough run-through and see how it all works.
So far we have talked about how Max for Live will allow you to create your own custom Max devices that run inside of Ableton Live. Most of the examples you've seen so far have been pretty similar to your average plugin, with the fundamental difference being that you can edit the device in place. That in itself is pretty spectacular, and probably enough to please a lot of people and keep everyone busy. Well, now I'd like to talk about a couple of features that really make Max for Live unique and pretty exciting: namely, the Live API objects.
I will try to summarize here what I thought were some of the highlights of NIME 2009...
I'd like to share some really simple things that have worked for me that I hope you'll find useful, or that may provide a starting point for your own investigations.
On the afternoon of April 3rd, I received an email from M.I.A.'s manager asking if I'd be interested in working with them on a one-off show on the mainstage at Coachella that would feature live video processing... Upon reflection, I would also like to share a couple of valuable lessons I learned working on this production...
Keith McMillen Instruments recently impressed all of us at NAMM with demonstrations of a new pair of string performance devices, the K-Bow and StringPort, both of which include some very rich software applications written in Max/MSP. The K-Bow, a Bluetooth-based wireless gestural controller integrated into a violin bow, has just started shipping, so we thought it would be a good time to catch up with Keith and find out more about the project. I met Keith at his studio...
In this, the final episode of our guitar processing extravaganza, we are going to step away from making effects and focus on performance support. For a system as complicated as this, performance support means two things: patch storage and realtime control. Thus, we will learn to create a preset system and manipulate the various on-screen controls with an inexpensive MIDI footpedal system.
At this point, we have a pretty useful guitar processing "rack", but it could use a little spice. This spice will come from two additional processors: a looping delay unit, and a basic reverb system. Also, to help keep the output useful, we will drop a limiter on the back end of the entire rig.
This article provides a brief tour of the features we've added to Max for creating Live devices.
Between the tutorials, Jitter Recipes, and all of the example content, there are many Jitter patches floating around that each do one thing pretty well, but very few of them give a sense of how to scale up into a more complex system. Inspired by a recent patching project and Darwin Grosse's guitar processing articles, this series of tutorials will present a Jitter-based live video processing system using simple reusable modules, a consistent control interface, and optimized GPU-based processes wherever possible. The purpose of these articles is to provide an over-the-shoulder view of my creative process in building more complex Jitter patches for video processing.
This week the new Eowave OEM USB boards arrived at Cycling '74 HQ, and I was all too happy to give them a test drive. After reading the impressive spec sheets, I was eager to see if the boards' performance lived up to all the promise. I quickly set to work putting them through their paces.