Recently, I went to the NIME (New Interfaces for Musical Expression) conference at KAIST in Daejeon, South Korea. Over the course of five days, I attended workshops on Web Audio, absorbed paper presentations on digital laughter, and watched what could only be described as a pneumatic zombie duet. I also attended not one but three banquets. For those interested in the gaps between banquets, I offer this story.
I step off the plane. Location: Incheon. Body: Exhausted. Mind: Blank. Between the 12-hour flight, the 15-hour time difference, and repeated exposure to the in-flight movie, A Werewolf Boy, I can already feel my grip on reality starting to slip away. I make my way through the airport, down to baggage claim, and onto the express train for Seoul. As far as I can tell, the train was constructed in the year 2040 and brought back in time to the present day. The oleophobic seats conform exactly to every contour of my exhausted body. A flatscreen television unfolds from the ceiling above, presenting a promotional ad for a nearby civic development project. BUILDING, it promises, in blaring, positivist capitals. CIVIL. PLANT. HOUSING. Depictions of enormous glass and steel buildings, assembled by swarms of tiny robots, rise before me. Outside my window, we pass row upon row of small-scale farms, sometimes running all the way up to the train tracks. Eventually the train comes to a small bridge connecting Incheon to the mainland. Rising up out of the water I can see huge mounds of dirt and grass, looking like the backs of giant turtles lumbering towards Seoul. I am very sleepy. I decide that they probably are turtles, and I write the following poem:
POEM FOR THE TRAIN TO SEOUL
The fog helps me see the tortoises
Grinding out low channels
And the speculative egrets on long stalks
The tortoises are my cold cows
Ruminating on the countryside
And other fictions
They roar silently
Like old men, or magma
Train tracks are humming
The sound of soft gray wool
And my eyes are as heavy as the tortoises
I decide that this poem is very good, then I fall asleep. When I wake up, we’ve arrived in Seoul, where I must have boarded another train for Daejeon, though I honestly can’t remember. Neither do I remember arriving in Daejeon, finding my hotel, or making my way up to my room. Probably all these things happened, but whether they happened to me or to someone who looks a lot like me I will never know. In the morning a straight line connects my backpack to my suitcase, to a pair of shoes, to where I fell asleep, face down on a still-made bed.
A few months ago we made the decision to trim down our office space and send some physical merchandise, like shirts, audio libraries, and music releases, to Amazon for fulfillment. We plan to add some fun new products in the future, too.
For those of you who love [have] Amazon Prime, you know what this means. There is nothing stopping you… sort of.
YouTube user Naoto Fushimi has been steadily posting some great videos demonstrating advanced, audio-reactive Jitter/GL techniques.
Follow here if you like seeing pixels move!
The good folks over at VDMX just posted an excellent video tutorial series, detailing the steps necessary to create communication links between Max and VDMX. In the video, a texture generated in VDMX is sent to Jitter, via the Syphon plugin, analyzed with jit.3m, processed with jit.gl.pix, and sent back to VDMX.
Very cool to see these two apps playing so nicely together!
Using only a single stereo S/PDIF output from your audio interface, you can access up to five ES-4 gate expander modules – each of which supports eight gate outputs. That’s 40 outputs! All this flexibility is easily accessed with Expert Sleepers’ new native Max es4encoder~ object. It couldn’t be simpler.
The five eight-bit outputs can be used in a number of different ways. Instead of eight gates, an output can carry a single 8-bit value such as pitch CV or velocity.
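The idea of eight gates sharing one eight-bit output can be pictured with a short sketch. This is purely illustrative Python showing the general bit-packing concept; it is not the actual ES-4 or es4encoder~ encoding, whose wire format belongs to Expert Sleepers:

```python
# Illustrative only: eight on/off gate states share one 8-bit value,
# and the same 8 bits could instead carry a single CV-style value
# (pitch, velocity, etc.). Not the real ES-4 protocol.

def pack_gates(gates):
    """Pack eight booleans (gate 0 = least-significant bit) into one byte."""
    if len(gates) != 8:
        raise ValueError("expected exactly eight gate states")
    value = 0
    for bit, on in enumerate(gates):
        if on:
            value |= 1 << bit
    return value

def unpack_gates(value):
    """Recover the eight gate states from a packed byte."""
    return [bool(value & (1 << bit)) for bit in range(8)]

gates = [True, False, False, True, False, False, False, False]
packed = pack_gates(gates)
print(packed)                         # 9 (bits 0 and 3 set)
print(unpack_gates(packed) == gates)  # True
```

Five such bytes, one per module, account for all 40 gate outputs.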
The new (beta) Code Export feature of Gen has only been around for about a month, and is still sparsely documented, but that didn’t stop Varun Nair at the Designing Sound blog from digging in and trying it out. The tutorial goes through the process of creating and exporting a tremolo effect with Gen and then building the code into an Audio Unit plugin. It’s great to see such a clear and well-written tutorial.
Varun also gives a nice and simple overview of getting started with the Gen environment in Max. We look forward to seeing more experiments in this area, and are really excited about what people will do with Code Export. Have any experiences to share? Let us know in the comments.
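For a sense of what a tremolo effect like the one in the tutorial computes, here is a minimal amplitude-modulation sketch in plain Python. The function name and parameters are illustrative, not taken from the tutorial or from Gen's exported C++:

```python
import math

def tremolo(samples, rate_hz=5.0, depth=0.5, sr=44100):
    """Amplitude-modulate a signal with a sine LFO.

    depth=0 leaves the signal untouched; depth=1 swings the
    gain all the way between 0 and 1 at rate_hz.
    """
    out = []
    for n, x in enumerate(samples):
        lfo = math.sin(2 * math.pi * rate_hz * n / sr)  # -1..1
        gain = 1.0 - depth * 0.5 * (1.0 + lfo)          # (1-depth)..1
        out.append(x * gain)
    return out
```

A per-sample loop like this maps naturally onto a gen~ patcher, which is recomputed once per sample; the exported C++ wraps the same kind of inner loop in a perform routine.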
Next week, a special event will be happening in Brooklyn at Roulette. Toni Dove’s “Lucid Possession” premieres April 25, 26, and 27. Those lucky enough to be in the vicinity will have the opportunity to experience this unique stage production — a “contemporary ghost story” featuring robotics, gorgeous costumes, and stunning voices and music. There are many talented Max users involved, including Todd Reynolds, Luke DuBois, and Elliott Sharp. They, Toni, and all the other artists and crew will make it a memorable experience. Don’t miss it!
At the Code Control Festival in Leicester, England this past weekend we gave attendees an advance peek at some of our mobile projects. Sam Tarakajian, our principal mobile developer, showed a new iPad app, the Mira controller for Max, that makes it possible, with as close to zero configuration as possible, to make your patch “touchable.” Mira presents a large set of Max user interface elements on the iPad exactly as they appear in your patch. It also provides access to multitouch and accelerometer data. We’ll be revealing more of this powerful addition to the Max universe as we prepare it for release in the App Store later this spring.
As a possible companion to Mira, I revealed a new “hardware” project dubbed the MiraBox — in reality, nothing more than an 8 x 10 wooden picture frame stuffed with foam — that helps capture accelerometer and gyro data from the iPad. The software component of the project was prototyped entirely with Mira and Max 6. Like many others we’re interested in extracting higher-level gestures from accelerometer sensors, but in particular, we’re interested in tracking data when you touch your patch.
Matthew Davidson, the developer of the new Mono Sequencer device, gives us a quickstart primer on using this creative MIDI effect. Watch for new videos over the coming weeks!
Today we’re excited to release Max 6.1.
You can download Max 6.1 now to check out these new features:
64-bit Application
- Use more than 4GB of RAM
- Use high-precision 64-bit numbers in Max messages
- Load 64-bit Audio Unit and VST plugins
Live 9 Support
- New devices
- New Live API features
- Performance and stability improvements
New Gen Features
- Integrated operator reference
- New operators and expression features
- (Beta) Export Gen code to C++ (gen~) or GLSL (jit.gl.pix)
Performance Improvements
- Faster application launch
- Faster patcher load time
- General optimizations
Complete Max 6.1.0 release notes are available here, and more discussion about what these features represent follows.
64-bit application support is a big deal, and given how long Max has been developed in a 32-bit world, it was no easy feat. Thank you all for your patience as we’ve worked to make this happen. A 64-bit application can take advantage of a much larger memory space, beyond the ~4GB of RAM we are limited to under 32-bit. We’ve also made infrastructural changes to support 64-bit numbers passed via Max messages for higher-precision calculations. These are features you have been requesting for years, and they are finally here.
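To see why the extra precision matters, here is a quick, generic illustration (not Max-specific) of what is lost when a number is squeezed through a 32-bit float. Python's own floats are 64-bit, so we round-trip through `struct` to emulate 32-bit storage:

```python
import struct

def to_float32(x):
    """Round a Python (64-bit) float to the nearest 32-bit float."""
    return struct.unpack('f', struct.pack('f', x))[0]

x = 0.1
print(repr(x))              # 0.1 (the 64-bit representation)
print(repr(to_float32(x)))  # 0.10000000149011612

# 32-bit floats also lose integer precision above 2**24:
print(to_float32(16777217.0))  # 16777216.0
```

With 64-bit numbers in Max messages, values like sample positions in long buffers or high-resolution timing data survive the trip through a patch without this kind of rounding.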
However, we’d like to manage expectations here. Since this is our first 64-bit release, it will not have all of the features of the 32-bit version, especially regarding Jitter and QuickTime support. QuickTime is simply not available on Windows under 64-bit, where we will rely on DirectShow for movie playback (to play back QuickTime files you will need a third-party plugin for DirectShow). Apple’s QTKit API on 64-bit Macintosh has fewer features than the 32-bit version of QuickTime and requires a dramatic rewrite of our code base. We’ve implemented only the most basic movie playback functionality on both platforms at this time. We will continue to work on Jitter video playback and other QuickTime features in the 64-bit version, but many features are not present and may never make it to the 64-bit version.
Max and MSP should have nearly all the same features, except where they rely on QuickTime (e.g., PICT files are not currently supported under 64-bit; we recommend converting to PNG or JPG instead). However, third-party developers will need to port their objects to 64-bit before they can run inside the 64-bit version of Max. The 64-bit version of Max cannot load 32-bit externals.
We will be providing an SDK for third-party developers in the coming days, but it will likely take some time before any particular third-party external becomes available. If you want to use the 64-bit version and your patches have third-party dependencies, we recommend replacing those dependencies with core objects or abstractions until your favorite third-party object is available.
On Macintosh, the application comes as a single FAT bundle, set by default to run in 32-bit mode. To run in 64-bit mode, select the application and choose “Get Info” in the Finder. In the “General” section there is a checkbox labeled “Open in 32-bit mode”; turn this off to run in 64-bit. If you want to keep separate 32-bit and 64-bit versions, you can duplicate your Max folder, then set one of the applications to run in 64-bit as described. Externals are also FAT bundles, i.e., they contain both 32-bit and 64-bit code.
On Windows, there are separate 32-bit and 64-bit installers and applications, and externals come in separate .mxe (32-bit) and .mxe64 (64-bit) files.
Live 9 Support
Max for Live users will need Live 9 in conjunction with Max 6.1. Live 9 will be released on March 5th, and as you may have heard, Max for Live is now included in Live 9 Suite. The factory content will look a little different than in previous versions, and you will need to download and install the appropriate Live Packs for the content that was previously installed by default. In addition to the exciting features of Live 9, there are some great new devices in Max for Live, especially the drum synths and convolution reverb; we recommend you visit Ableton.com for more information about Live 9 and Max for Live.
New Gen Features
Gen has some significant additions and improvements in this release. Gen now has an integrated operator reference in the sidebar to make learning and discovery easier than in previous releases. The operator set has grown, and the GenExpr language now supports recursive functions (for CPU, not GPU, targets), calling Gen patchers as functions, and defining functions with named parameters. Most exciting in this release is a beta version of code export: you can export your gen~ patchers to C++ code and your jit.gl.pix objects to GLSL code. This feature has only limited support in our initial Max 6.1 release, but over the coming months we will be working to improve the generated code, template examples, and documentation to make it useful for those of you who have been waiting for this capability. Note that code export assumes you are familiar with C++ and with a development IDE like Xcode or Visual Studio. We will be adding more code export examples and documentation in the wiki.
Thank you for continuing to inspire us with your creativity.
If you follow the Max Gen forum, you might be forgiven sometimes for thinking that the only people using gen~ are command-line-codeophiles busy downloading stuff from DSP archives and dropping them into a codebox object. While that’s awesome, I have a particular “burden on my heart” – as we say in the part of the U.S. my family hails from – for those who love them some graphic patching. The ever-delightful Johan van Kreij may have excited you at some near-future point by showing you the process whereby he uses connect-the-box programming to make something amazing. The only way to tell whether or not you’ll be excited and grateful is to have a look at it for yourself, of course.
Max-enthusiast and Expo ’74 presenter, Jeremy Bailey, has a message for people who contribute to Kickstarter campaigns (for his own campaign).
You’re the best, Jeremy!
Code Control Festival is Europe’s biggest Max meetup. Phoenix Cinema and Arts Centre in Leicester will be hosting its 3rd international conference for artists, musicians, students and teachers to explore Cycling ’74’s Max software, a toolbox for developing unique sounds, stunning visuals and engaging interactive media.
This year Phoenix, in association with Cycling ’74, invites applications to its Code Catalyst Award fund. The Catalyst Award represents an excellent opportunity for artists and practitioners to design and create new work or get free tickets to the events. The deadline for submissions is Friday 8th February.
Guest speakers will include Cycling ’74 CEO and founder David Zicarelli, Cycling ’74 developers Sam Tarakajian and Jeremy Bernstein, and Eric Lyon, with more to be announced.
Festival dates: 22nd – 24th March 2013
I got to spend a day at NAMM, and it was a great chance to spend some time with our friends. Here’s a little picture of the folks at the Livid booth, showing great excitement over their new Base product. This is a really nice controller – no moving parts, realtime positional feedback and just the right size for backpacking. They were also spotlighting the Alias 8 controller, which seems purpose-built for making Live sing.
Spending time with Livid also reminded me how much I love the OhmRGB Slim, which seems to be the perfect combination of over-the-top features with grab-it-and-go size. This was a great opportunity to talk smart, have fun and feed the GAS (Gear Acquisition Syndrome)!
Contrary to what you may have heard, the 2013 NAMM show wasn’t entirely about the rise of beautifully dirtied analog in the form of the Moog Sub Phatty, Dave Smith’s marvelous Prophet 12 (which you should imagine as a hybrid cross of parts of the Tempest, the Poly Evolver, and the redesigned Prophet), the shrinking (in terms of size and price) of the Korg MS-20, or the return of the Buchla Music Easel (yes, really).
I’m a Max guy, so I prowled the trade floor looking for controllers I could repurpose and come to love. This year, it was sufficiently rewarding to actually lure me away from hardware synthesis fun, gawking at analog video modular systems and nifty modestly sized modeling amps. I have escaped from the Trade Show Floor to tell thee, if somewhat idiosyncratically.
In particular, the Controller Pilgrimage means that I did several things in no particular order: I nerded out about matters such as the ability to fluidly play two-finger trills and mordents on the Ableton Push grid pads – which feel really, really good. I enjoyed the near-perfect size and feel of the Livid Instruments Base. And I think that the QuNexus from Keith McMillen Instruments may offer the first good solution to something I’ve messed with for years in my Indonesian-influenced work – the ability to have a physical interface allow for different playing techniques related to metallophones (alternating striking and damping, grabbing the bottom of a saron key to “silence” it, etc.).
There was one “outside of the box” encounter I wanted to mention, since it might not get quite as much mention as the above – I guess that it really was an “outside of the box” encounter quite literally, since I ran across the object in the laser and LED-stuffed Arena area of the Convention center (Yes, I went to walk on the video floor, too). There, amid the fog and laser-drawn vector graphic squiggles, I met the Alphasphere.
We all love it that the internets bring us images of things we might desire, but there’s no substitute for the real experience of the real thing. Okay, maybe there is a sort of substitute: here’s a great video our pals at Sound on Sound shot at the last Musik Messe that ought to give you a sense of it.
While a quick walk-by on the way to the video floor struck my inattentive eye as something like a scaled-down ball sensor with hard transducer pads, the real item was far more compelling. The biggest surprise was that the surface of the circles covering the sphere, rather than being some kind of dark hard surface as I might have expected, was a lovely, soft stretched membrane that felt as much like a slightly loosened drumhead as anything else. The sense of feel and control when this surface was stroked, hit, or pressed was a great experience.
In addition to the layout of the circle/drumheads as a series of eight differently sized pads wrapped horizontally around the spherical surface (meditate on that as a topology rather than a grid for a few moments and see if you don’t get some interesting ideas), I found myself using the feel of the “spaces” between the pads as a way to traverse the surface.
While the triggering demos in the booth, used to demonstrate the software that comes with the unit, were a lot of fun, I was struck by the notion that the configuration of controllers on the unit, stripped of the intention of its creators and laid open as a collection of MIDI-producing outputs, has some really compelling physical and tactile features that I’ve never encountered elsewhere. (Oh yeah, the hardware design includes internal LEDs that you can control and turn on and off for visual feedback.)
If you’re one of those people who worries that the dominance of the iPad in the control surface world de-emphasizes the aspect of real touch sensitivity as a part of instrument or controller design, I think you’ll be really intrigued.
P.S. You’ll never guess what software was used during the prototyping phase of their design….
One of my favorite places at NAMM is the arena. In the arena, you will find all kinds of lasers, smoke machines, light-up microphone stands, and other visual products to make a musician’s stage act more fun and maybe memorable. This year, I was drawn to a booth where you could walk over a sheet of thousands of tiny LEDs. It felt like walking on hard bubble wrap, making the walk through fire well worth it.
Congratulations to Andrea and Daniele on the release of Bach beta 0.7, which is a massive update with tons of new features and performance improvements.
For those of you who have not heard of the Bach project, it is a huge, fully featured traditional music notation system for Max/MSP.
Last year, Peter Burr approached me with an intriguing Kickstarter campaign he was starting for a travelling show, and wanted to know if I would contribute a video. The concept was that a group of video artists would each make 30-second videos about “the Zone” from Tarkovsky’s film Stalker, and/or the book it was based on, Roadside Picnic.
I immediately jumped in and started reading the old Russian science fiction novel. Before I even sat down to watch Stalker, I had a clear view of the visual style and process I’d use, and in fact never bothered to watch the movie (which I highly recommend) until after my video was finished. I’m excited to be included in a really fantastic group of artists and what promises to be a pretty great show. For my part, I created a 3D animation with Maya that was exported as a Collada file and then brought into Jitter’s jit.gl.model object. In Max, I created particle effects, applied generative and animated textures, and created a number of other special effects using OpenGL geometry.
This Friday, the touring Max-driven live cinema show will be making its first US stop at the Museum of the Moving Image in New York. For more info, check out this interview with Peter Burr at the Creators Project.
The guys at Lanbox have announced some new hardware.
The LanBox LCXi is a rackmount update of the LCX DMX controller. There’s also a remote interface for it, the LanBox interface. Both look really nice.
Eric from Lanbox also told me about a large Max 6 based DMX showcase for them.
“Yutaka (from Japan) made a large setup for Sony last december in Tokyo. He used 44x LanBox-LCX and MAX6 to create a light fountain with *a lot* of lights, controlled the LanBoxes and the MAX patch!”
The long-awaited Arduino Due is now available. This device, featuring an Atmel SAM3X8E ARM Cortex-M3 CPU, is a huge generational leap for the Arduino platform. Using the same footprint as the Arduino Mega, the Due is faster and has more memory and more I/O than any previous Arduino, putting it on a par with the Maple, Netduino and other similar platforms.
You can check out the details here.
I was recently able to purchase one through the Maker’s Shed, although supplies seem to be somewhat limited. This board, combined with the new Arduino 1.5.1 beta, led to my first Due-based Blinkie Lights sketch today!
Need a timeline that automates your Max patchers but can also connect to other things? Do you want it to use Open Sound Control to communicate? What if you could have two-way communication, letting the standalone Open Sound Control sequencer query the state of the Max patcher?