We’d like to congratulate Owen Pallett on his Academy Award nomination for his work with Arcade Fire on the Her score. We did an in-depth interview with Owen in 2010, and were glad to see him get a nod for his recent work.
Cycling ’74 may not have made it to the 2014 Most Admired Companies list, but maybe sharing our obsession with coffee — since you’ve already heard about the one with cats — will get us a few more admirers. Don’t worry, we love tea, juice, and other tasty liquids, too.
We agree: It’s all about the beans. Second to that, we use different kinds of gear. Here are some of our favorites plus a few tips:
- Tim uses a French press with these instructions:
1. Use a super coarse grind.
2. Don’t skimp! Two tablespoons per cup. If you use too little coffee, it ends up getting kinda acidic and gnarly.
3. Don’t use boiling water; it will ruin the coffee (especially if you have a light roast). Water should be around 195-200°F.
4. Because the water is cooler than boiling, I recommend pre-heating your cup (I usually just pour some extra hot water into it while the press steeps).
5. Let it steep 4-5 minutes, stirring occasionally. Usually this doesn’t need to be precise, but with some coffees it seems to make a difference.
- Joshua prefers the simplicity of a slow pour.
- AndrewB’s brother-in-law swears by the AeroPress. Yes, we get our families involved in the obsession, too.
- Wes is a fan of his Bialetti Christmas gift.
- Jill uses the Clever Coffee Dripper every day, a gift from her brother. [Note that we are giving gift advice here, too.]
- Florian may get the gold medal for his use of the ROK Presso at home, and specifies that not all beans work well with it.
- And just to prove that we’re not all coffee snobs, BenN admitted he had a delicious cup at McDonald’s this morning.
My question is: How can Max become a part of our coffee-making ritual?
The Creators Project blog recently featured work by artist and maxer Ian Brill. I was fortunate enough to collaborate with Ian on an earlier version of this project, and it was a total blast.
Ian was kind enough to share some details of the project.
I use Max to generate a series of gestural, generative motifs, composited across a Jitter matrix. These gestures rely on geometric shapes, real-time analysis of audio, a particle system, and several stages of attenuated feedback to create a frenetic environment for active contemplation. By wrapping the matrix around the installation’s implicit center, the illusion of kaleidoscopic events emerging around, toward and away from the “oculus” (top-most center) is created. Parametric data and spatial information are shared analogously between the Jitter-based visual content and a custom-built 8.1 surround sound engine built using Max for Live. Data is sent over serial to an Arduino Due, which in turn parses the pixel values and distributes them across a daisy chain of 560 LED clusters, zigzagging all around the back of the installation.
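The last step of Ian's pipeline, walking pixel values down a zigzagging LED chain, can be sketched in a few lines of Python. This is purely an illustration: the serpentine wiring order, the 3-bytes-per-cluster format, and the function names here are all assumptions, not Ian's actual code.

```python
def zigzag_bytes(frame):
    """Flatten a 2-D grid of (r, g, b) pixels into a byte string,
    reversing every other row to match a daisy-chained, zigzagging
    LED layout (hypothetical wiring; a real installation may differ)."""
    out = bytearray()
    for y, row in enumerate(frame):
        ordered = row if y % 2 == 0 else list(reversed(row))
        for r, g, b in ordered:
            out.extend((r, g, b))
    return bytes(out)

# A 2x2 test frame: top row stays left-to-right, bottom row is reversed.
frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
payload = zigzag_bytes(frame)
# The microcontroller side would read 3 bytes per cluster and push
# them down the chain, e.g. ser.write(payload) with pyserial.
```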
Last May I was invited by Trond Lossius to give an advanced Jitter workshop at BEK. I hadn’t given a workshop since Max 6 was released, so I was super excited for the opportunity to dive into all of the fantastic new Jitter features, particularly the Gen objects. For me, the Gen objects are like a secret ingredient that can be used to spice up a patcher in interesting ways. They’re extremely versatile objects, useful in adding a unique touch.
The Jitter Gen objects are a portal to a vast world of creative possibilities, but with that vastness one can easily feel overwhelmed. Gen patchers come with a new set of objects and operate according to a different logic than normal Max patchers. To really understand Gen requires shedding preconceptions and rewiring how you think about composing a patcher. As an entry point into the Gen world, the workshop curriculum focused on only a handful of Gen objects at a time, using them to explore a single visual technique:
- Color Manipulation: understanding vectors and colors
Key Objects: vec, swiz, concat
- Mixing Video Streams: recreate jit.xfade and jit.alphablend, mixing at the image level vs. mixing per-pixel
Key Objects: mix, switch
- Coordinates: generating spatial patterns using distance fields (distance from a point, from a line, from a circle)
Key Objects: norm, snorm, cell, dim, length
- Sampling: image warping using sample, applies knowledge about coordinate manipulation to image processing
Key Objects: sample
- Isosurface (3D distance fields): extends the previous work with coordinates to 3D, uses jit.gl.isosurface for visualization and rendering
Key Objects: norm, snorm, cell, dim, length, jit.gl.isosurface
- Particles: moving points through space using vector fields, vector math and quaternions
Key Objects: dot, length, qrot, noise, jit.gl.mesh
- Materials: generating textures for use with jit.gl.material
Key Objects: jit.gl.material
- Surfaces: surface generation and manipulation
Key Objects: sphere, torus, etc., jit.gl.mesh
- Rendering: post-processing captured 3D rendering using jit.gl.node and jit.gl.pix
Key Objects: sample, jit.gl.node
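To give a flavor of the "Coordinates" topic above, here is a rough Python analogue of building a distance field from normalized coordinates. In a Gen patcher this would be a handful of norm and length operators; the function names and the ring pattern below are just for illustration, not workshop code.

```python
import math

# Normalized coordinates, like Gen's norm operator: each cell of an
# 8x8 grid gets a coordinate in [0, 1] along both axes.
W = H = 8

def norm_coord(x, y):
    return (x / (W - 1), y / (H - 1))

def dist_from_point(x, y, cx=0.5, cy=0.5):
    """Distance field: how far each normalized cell is from a point
    (the Gen equivalent of length(norm - center))."""
    nx, ny = norm_coord(x, y)
    return math.hypot(nx - cx, ny - cy)

def ring(x, y, radius=0.35, sharp=8.0):
    """Turn the field into a pattern: bright near a chosen radius."""
    return max(0.0, 1.0 - abs(dist_from_point(x, y) - radius) * sharp)

field = [[ring(x, y) for x in range(W)] for y in range(H)]
```

The same move (build a coordinate field, measure distance, shape the result) carries straight through to the Sampling and Isosurface topics, just in higher dimensions.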
I often have the impression that the Gen objects are underutilized, particularly in Jitter. Only one or two people who attended the workshop had ever used Gen before. After spending some quality time with Gen and going through the curriculum, pretty much everyone had become an enthusiastic Gen user. There was just too much goodness to be had.
To spread the wealth, the patches developed for the workshop are being made available here along with the patches I made throughout the five days of the workshop itself. Have a look at the README in the download for info on the different folders. Feel free to borrow, steal, fork, and extend the ideas presented!
Download: BEK Workshop Files
Special thanks to BEK and all of the workshop participants for an amazing week. Credit for the images used in this post goes to:
Bruno Zamborlin has developed a new product that will allow you to turn everyday objects into musical instruments. The iOS software shown on the Kickstarter page is based on the Max patch that Bruno uses for his live performances and installations.
Mogees consists of a mobile app and a small sensor that detects and analyzes the vibrations we make when we interact with the objects around us, using sound processing techniques to alter their acoustic properties and make them musical.
Two remarkable new books featuring Max have just been published that should be a part of any serious student’s collection.
First, volume 2 of Electronic Music and Sound Design by Alessandro Cipriani and Maurizio Giri continues the first volume’s exploration of audio applications of Max/MSP through hundreds of interactive examples. In this volume, the focus shifts to DSP effects and a unique treatment of what the authors call “motion” — descriptions of how sound can evolve through time. There is also a section on MIDI for control applications, and perhaps the first coverage of the use of Max for Live in book form.
Volume 2 was translated by our former colleague Richard Dudas, now imparting his vast wisdom to students at Hanyang University School of Music in Seoul.
The book just arrived today and I am excited to try out the hundreds of amazing examples that accompany the book. Here’s a link to Electronic Music and Sound Design on Amazon.
Next, I just learned today of the publication of Peter Elsea’s new book The Art and Technique of Electroacoustic Music that includes an extensive chapter on Max. The book is structured as a course on the fundamentals you need to compose with acoustic and electronic sounds, so it doesn’t just cover the use of software, but also recording, editing, and studio techniques.
Peter is one of the true heroes of the Max community, generously sharing his collections of objects and tutorials that remain essential resources for anyone working with the software. He recently retired after more than three decades teaching at UC Santa Cruz, inspiring countless students, a number of whom I’ve had the privilege of working with over the years. It’s also fun for me as a Santa Cruz resident that there is so much Max literacy in this town — something which is entirely Peter’s doing. (OK, it’s not like I’m going to be able to walk into the Red Room and strike up a conversation with a random person about the finer points of funbuff, but you know what I mean.)
Check out The Art and Technique of Electroacoustic Music on Amazon.
Are you patching on location in an exotic place, setting up a show, or just connecting objects in the corner of your local cafe? Is Max a part of your studio or office?
We’re excited to launch a new feature of Cycling74.com called Max Workspaces. We know that Max is used all over the world to do a big variety of things. We are hoping this new section of the website will offer a glimpse into the various studios, offices, theaters, coffeeshops, and miscellaneous spaces that the Max community touches. We also think it’s really fun to share photos of where you are, and offer peeks of work in progress. Visit the Max Workspaces page to check out other people’s photos and upload one of your own.
Our friend and co-worker, Rob Ramirez, shared some recent work with us and provided details on how he used Max. The best part is that you don’t have to be a fan of Star Trek… not that I’ve ever met someone like that.
The show was built entirely with Max from start to finish. We began by collecting clips of Shatner as Kirk from the three seasons of Star Trek, and building a database using Max’s sqlite implementation. From this, we created a dictionary of possible words to hand off to our writer. However, we quickly realized that any word could be created by combining syllables from other words (e.g. interchangeable, created from intercraft + change + considerable). After receiving the completed text, I created a sequencing patch that took the script as input and gave me all the possible variations for each of the words in the text. I could then adjust the starting and ending positions, and the overlap between words, to fine-tune the rhythm and tone of his speech. – Rob
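For readers curious about the database step, a toy version of the word lookup is easy to sketch with Python's built-in sqlite3 module. The schema, sample rows, and function name here are invented for illustration; Rob's actual patch used Max's own sqlite implementation.

```python
import sqlite3

# Hypothetical schema: one row per source clip, with the spoken word
# and its in/out times in the episode audio.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE clips (id INTEGER PRIMARY KEY, word TEXT, "
            "start REAL, end_ REAL)")
con.executemany("INSERT INTO clips (word, start, end_) VALUES (?, ?, ?)",
                [("intercraft", 0.0, 0.8),
                 ("change", 1.2, 1.7),
                 ("considerable", 2.0, 3.1)])

def sources_for(fragment):
    """Find clips whose word contains the wanted syllable fragment."""
    rows = con.execute("SELECT word FROM clips WHERE word LIKE ?",
                       (f"%{fragment}%",)).fetchall()
    return [r[0] for r in rows]

# Assembling "interchangeable" from fragments of other words:
parts = ["inter", "change", "able"]
found = [sources_for(p) for p in parts]
```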
Cycling ’74 is a completely distributed company, theoretically located in San Francisco. Technically, it’s not located in San Francisco anymore, since it no longer has an office. Its corporate records are stored in my house in Santa Cruz, and we pay city business license tax there as well. The only actual office where people go to work every day outside of their places of residence is in Berlin, where we (indirectly) rent a small space for three people.
So people are always asking us, how does it work out if everyone works from home?
I know this sounds like some sort of internet meme, but it’s literally true: one of the major hazards of working from home is…cats.
Earlier this week, the high-powered Cycling ’74 executive team was having its high-powered weekly conference call when one member of the team, who wishes to remain anonymous, suddenly interrupted me (I tend to talk way too much) and said, “Uh, my cat has just exploded all over me and I need to go…now.”
So, naturally, because of our cat-friendly corporate policies, we suspended the meeting until our co-worker could take a shower and remove the charming scent of feline spray.
The meeting ended without further incident an hour later. I also need to point out that I spent the entire time in bed, since I woke up right when the meeting was supposed to start. After the meeting concluded, I remained in bed to write up the action items. I had almost finished my summary when I detected telltale cat scratching a few inches away from me. Further unpleasant investigation revealed that yes, due to the dog blocking the path to the litter box, my cat had just peed all over the bed.
I subsequently sent an e-mail to the rest of the meeting participants describing the incident; subject line: “universal resonance.” It turned out that two other people at the meeting were cleaning up after their cats.
Finally, Darwin responded that he had just banished his cat to the outdoors. “I’ve seen the future.”
By way of the very talented Daito Manabe it’s come to our attention that Mira has been spotted in the wild. It has made its way across the Pacific and into a nonspeaking role in a teaser video released to promote a single by the Japanese pop supergroup Perfume.
Cycling ’74 did not pay for this product placement, but we are thrilled to be associated in some small way with this production. As you can see over Daito’s shoulder, Mira was used to control lighting on the set during rehearsal and filming.
The final music video, available here, does not feature Mira, but it does have some touchscreen-inspired decor that I suppose reflects back on whatever Mira’s imagined role in the production might have been. Oh, and there’s also an extremely catchy song, if we are to judge from the 1.6 million views as of this writing.
Whew! Boy am I not exhausted from not attending CES. Via my patented investigative technique of reading other news sources, I am pleased to bring you this secondhand report of Things You Just Might Want to Consider Connecting to Max, If You’re So Inclined in that Direction. Up first: smart lighting.
So here’s the problem. You’re sitting on the floor, the way you normally do, and you have a cat on your lap and the cat just won’t get up. Normally this isn’t a problem, but the sun has gone down and you can’t see anything. Wouldn’t it be great to control the lighting in your room from your iPhone? For that cat’s sake. That’s worth $200, right? These are the difficult life decisions the Philips Hue system asks of me.
Using Philips’ excellent REST API and some clever hacking, Cycling’s David Zicarelli successfully linked Max to the Philips Hue system with disco-licious results. The system isn’t designed for low-latency performance, so don’t expect beat-synced ramping of hues, but using Max to control lighting opens new and innovative ways to annoy the people you live with.
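For the curious, the Hue bridge's REST API boils down to an HTTP PUT of a small JSON body. Here is a minimal Python sketch of that call; the bridge address and username are placeholders, and David's actual hack lives in Max, not Python.

```python
import json
import urllib.request

BRIDGE_IP = "192.168.1.10"   # placeholder: your bridge's address
USERNAME = "newdeveloper"    # placeholder: an authorized API key

def set_light(light_id, hue, bri=200, sat=254):
    """Build a PUT request setting one bulb's hue (0-65535),
    brightness, and saturation via the bridge's REST API."""
    url = (f"http://{BRIDGE_IP}/api/{USERNAME}"
           f"/lights/{light_id}/state")
    body = json.dumps({"on": True, "hue": hue,
                       "bri": bri, "sat": sat}).encode()
    req = urllib.request.Request(url, data=body, method="PUT")
    return req  # urllib.request.urlopen(req) would actually send it

req = set_light(1, hue=46920)  # 46920 lands in the blue range
```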
Belkin takes aim at Philips by expanding their WeMo home automation system to include smart LED light bulbs. While the WeMo light bulbs are cheaper than Hue, they don’t offer changeable colors. Belkin’s LED light bulbs join WeMo’s expanding line of home automation products, which includes switch and motion sensors.
Also announced for the WeMo platform is the WeMo Maker. This device allows you to take readings from 5 volt analog sensors and switch up to 36 volts DC. Check out the WeMo local SDK for iOS and Android for developer information.
Belkin’s WeMo system has its own set of modules at IFTTT, a popular service that allows you to create connections between dozens of services like Twitter, Instagram and your phone with a simple IF (this) THEN (that) statement. This allows you to use a WeMo sensor to phone you up if it detects motion in your house, or use a light switch to publish a blog post.
Sure, sending tweets when your dog wakes up is all jetpacky and nineteensixtyfourworldsfairy, but when is somebody going to step up to accommodate people who need to control their crock pot from Starbucks? It has been 25 years since the first internet toaster was demonstrated, surely CES 2014 will show some progress on this front?
Well, put away the pitchfork and get out your actual fork, because the future is now with the Belkin Crock-Pot WeMo Slow Cooker.
Belkin this, Belkin that. You’d think Belkin invented smart things, but put down that burrito, because I’m going to tell you something that will shatter your brain’s mind. SmartThings invented smart things. SmartThings is an open platform, meaning multiple companies like Honeywell, GE and Aeon Labs make SmartThings-compatible products. The product lineup for SmartThings is a bit more extensive than Belkin’s. There are options for moisture sensors, pressure sensors, keychain ‘presence’ fobs and more.
SmartThings, WeMo and Hue each communicate wirelessly with a dedicated hub that you connect to your home network. That’s why ‘starter packs’ of any of these products fall into the $200 range. Of course, they’re not compatible with each other, meaning you need a separate hub for each system. You can cross-integrate, however, and the easiest way to accomplish that is via IFTTT. This allows you to, for example, use a SmartThings motion sensor to turn on a WeMo switch.
Sphero 2B and Sphero Robotic Ball
I have no idea how my son found out about the Sphero robotic ball. In the days leading up to Christmas, we were besieged by unusually persistent requests. It went something like this: “Can I have a Sphero robotic ball?” My wife’s position had the solidity of granite. “Have you seen this Sphero ball thing? It’s another stupid remote control toy, right? AND you need an iPhone, right? Dumb, right? He’s not getting one. Right?” Remembering the remote-controlled spider my well-intentioned parents provided last year, I grunted agreement and didn’t research any further. Spousal dissent trumps son’s disappointment (and teaches a valuable lesson about life).
Our cheapness turned to triumph as Orbotix (Orbotix!) announced the Sphero 2B just two weeks after Christmas. The 2B is twice as fast as the original robotic ball! And cheaper too! And won’t be available for nine months. I bet all those suckers who bought the original Sphero are drowning in pools of early-adopter tears.
As my son and I reviewed the specifications for the 2B, it seemed to fit the description my wife supplied: a remote control vehicle that uses an iPhone for a remote. Except for… an SDK. You can develop for it!
Worse, I realized the original Sphero robotic ball was much more than a remote controlled vehicle. It has an accelerometer, compass, and gyroscope. You can use it as a controller. It can provide haptic feedback. Crap. Now I want a Sphero.
Shure SE846 quad driver earphones
There was a time when one driver was enough. A simpler time. A time when frequencies under 100Hz were just as welcome as a fixed DC offset. Sure, “Time in a Bottle” sounded just fine on an AM radio, but this is 2014 and we can’t be expected to fully appreciate Dad Metal without shoving eight drivers into our skulls.
The SE846 uses a three-way crossover, with two drivers dedicated to the low end. The SE846’s secret sauce is a physical maze of stainless steel plates that act as a “groundbreaking low pass filter for a true subwoofer experience”. This channel adds about four inches of distance between the driver and the output canal.
If you’re ready to pay $1000, and enjoy using words like “soundstage”, “shimmer” and “space” with your friends, be sure to get a pair of these earphones. They won’t fully obliterate the shame and embarrassment of your teenage years, but they’ll help. Bonus: you can’t hear your kids crying.
Anything we missed? What was your favorite product announcement at CES?
Sam and I had the pleasure of meeting Masato Tsutsui in Japan in December and discussing his interesting Max-based projects.
Here’s one for the holidays:
It’s getting frosty in San Francisco these past couple of weeks, but here at sea level, we rarely get to see much snow. The California Academy of Sciences has a new exhibit that lets us get a little taste of snow in Golden Gate Park. The other day, while reading the Chronicle, Lilli spotted a story about Toshiro Chiang’s computer-controlled artificial snow machines at the California Academy of Sciences. Having originally met Tosh at the first Expo ’74, and having seen some of his Max-driven exhibition controls before, I got in touch with him to see what was behind it all. Max, of course!
Here are the details we got from him:
It’s Max/MSP and some Unix, all running on a Mac mini. The original program was written four years ago. Back then, we needed to tie together a show controller and a DMX light board ASAP. This economical solution enabled us to quickly and cleanly solve every problem with tools that we already had on site.
The patch supports DMX cues and low-voltage digital I/O. There are diagnostic indicators for system health as well as feedback. The computer itself lives in a 1RU rack space, alongside a 1RU KVM, an iPad-driven mixing board and some Crown amplifiers. There is even a large industrial push-button which, when enabled with an inline switch-guarded toggle (imagine the red hooded toggle fighter pilots flip to arm weapons in the movies), allows people to have it snow on demand. This feature is no longer used, but can still be supported.
There was originally a stage as well, with some par can lights and Martin MiniMAC controllable spotlights (with full color + gobos + pan/tilt/zoom motion!!). This is why the patch needed to support live DMX mixing and cues. There were daytime cues, night cues, special event cues, etc. This feature is also no longer utilized.
Lastly, the computer is the same one that drove our former Piazza Dancing Fountain, in which it tied a schedule to MIDI triggers, synchronizing the solenoid valves of 16 laminar jets and 6 leap jets to things like Electrelane, Peggy Lee & Ratatat. When we want to switch the CPU between fountain and snow configurations, we just open a different patch!
During the month of November, I took a little journey into a new programming area: creating content specifically for the Ableton Push control device. This hardware has a unique place within the Max community due to its tight integration with Ableton Live (and therefore Max for Live), but it is also a powerful control surface in its own right.
With help from Mark Egloff of Ableton, I started with a goal: to create a device that would be a usable performance tool, but would “take over” the button grid on the Push to make it easy to manipulate in real time. I chose an 8-band EQ-like device that I called the Frequency Mixer, and created the code necessary to run it solely from the Push. See the result (along with some video).
Next up was to work directly with the Push in Max – completely outside the Live environment. Based on some information that Mark (Egloff) provided, I was able to determine the values needed to update the Push button matrix RGB values, and created an interesting, if rather useless, 8×8 image display. I can imagine using this to modify a program based on the display values, but have left this as an exercise for the willing Push student!
Finally, based on feedback received on YouTube, I modified the first (Frequency Mixer) project to act on other tracks in a Live set. This way, you could either mix multiple channels, or (by inverting the values) crossfade multiple tracks from a single instance of the Frequency Mixer. This is based on the use of send and receive objects that share a specific name, which is propagated through the entire Live set. See the result — a fun extension to the original device.
While I created some specific devices and projects, the implication is much greater: the Push, like many other controller devices, is an interesting playground for the creative coder. Hopefully you will find tips and techniques here that help you get more extensive use out of your Push!
“One of my early desires as a musician was to sculpt and organize directly the sound material, so as to extend compositional control to the sonic level – to compose the sound itself, instead of merely composing with sounds.”
A strange loop arises when, by moving only upwards or downwards, one finds oneself back where one started. The concept of a strange loop was proposed and extensively discussed by Douglas Hofstadter in Gödel, Escher, Bach. In it, he describes a beautifully-structured framework for exploring the question of how a sense of self arises out of something that has no self; to go from a state of meaninglessness to something that can refer to itself.
One kind of audible strange loop is called a Shepard tone. The illusion was invented by the psychologist Roger Shepard in 1964, who used a computer to create a series of tones that seems to rise forever. Jean-Claude Risset later created a version of the scale in which the tones glide continuously: the tone appears to rise (or descend) continuously in pitch, yet returns to its starting note.
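A minimal Shepard-Risset generator is easy to sketch in code: sum a handful of octave-spaced partials whose loudness follows a raised-cosine window over their position in the pitch space, so each partial fades in at the bottom and out at the top, hiding the wrap. The constants and windowing choice below are one possible rendering of the idea, not Risset's original Music V score.

```python
import math

SR = 8000          # low sample rate keeps the sketch cheap
DUR = 2.0          # seconds per loop of the illusion
N_PART = 6         # octave-spaced partials
F_LOW = 40.0       # bottom of the pitch space

n = int(SR * DUR)
sig = [0.0] * n
for k in range(N_PART):
    phase = 0.0
    for i in range(n):
        loop = i / n                      # 0..1 over one cycle
        octave = (k + loop) % N_PART      # each partial climbs and wraps
        freq = F_LOW * 2.0 ** octave      # exponential (musical) glide
        # Raised-cosine loudness over the octave position: partials
        # are silent at the bottom and top of the pitch space.
        amp = 0.5 - 0.5 * math.cos(2 * math.pi * octave / N_PART)
        phase += 2 * math.pi * freq / SR  # integrate frequency to phase
        sig[i] += amp * math.sin(phase) / N_PART
```

Looping `sig` end to end plays a glide that seems to rise forever while going nowhere.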
These pieces were the result of several years of collaboration with Max Matthews at Bell Labs, realized with Music V. Music V was an extension of Music III (including Music III’s innovative concept of unit generators, which pre-dated voltage control as a formal protocol), rewritten in Fortran, with added support for analog-to-digital conversion so one could manipulate digital audio directly. Music V was also distributed free of charge, at the request of Max Matthews, to stimulate research and the production of computer music. During his time at Bell Labs, Risset also compiled a catalog of computer synthesized sounds, including FM and additive examples, for a synthesis course he gave in 1969 with John Chowning at Stanford University.
I set about executing a Risset glissando in Max without referencing existing implementations. I deduced that one would need a master phasor, subdivided into 90-degree phase offsets, to act as the control system for the effect. Each subdivided output controls the pitch of an oscillator that moves over four octaves; thus the distance between the quadrature outputs is always an octave.
The second part of the effect is controlling the output level of each oscillator. The most logical way to do this is to use the existing phase-offset phasor output. Half of a cosine function, from 270 to 90 degrees, produces the correct shape for our purposes. So, if we reduce the magnitude of our phasor output by half and shift its phase by three quarters, we’ve achieved our goal. Now, when the ramp controlling the output pitch is at its most extreme (at either edge), the oscillator output is inaudible.
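Putting the two parts together, the control math can be sketched in Python. This is a schematic of the idea rather than the actual BEAP module, and the amplitude window is written here as an equivalent raised cosine that reaches zero at both edges of the pitch ramp.

```python
import math

N_VOICES = 4        # quadrature outputs, 90 degrees apart
OCTAVE_SPAN = 4.0   # each voice glides over four octaves

def risset_voices(master_phase, base_freq=55.0):
    """Given a master phasor value in [0, 1), return a (freq, amp)
    pair for each quadrature voice. A voice is silent exactly when
    its ramp wraps, so the ensemble glides forever."""
    voices = []
    for k in range(N_VOICES):
        ramp = (master_phase + k / N_VOICES) % 1.0
        # Pitch: exponential sweep over four octaves, so adjacent
        # quadrature outputs always sit one octave apart.
        freq = base_freq * 2.0 ** (ramp * OCTAVE_SPAN)
        # Level: raised-cosine window, zero at either ramp edge.
        amp = 0.5 - 0.5 * math.cos(2 * math.pi * ramp)
        voices.append((freq, amp))
    return voices

# At phase 0, one voice sits silent at the ramp edge while the voice
# half a cycle away is at full amplitude.
v = risset_voices(0.0)
```

Feeding these pairs to four oscillators, updated continuously from the master phasor, reproduces the endless glissando.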
My first implementation of the Risset glissando resulted in a new Beap oscillator type. These oscillators are designed to accept a 1v/oct input, so, as long as all oscillators are connected to the same master phasor output, they can be stacked and played like normal oscillators. In other words, one could think about musical structures using the normal rules of harmony.
In addition, I produced a Quadrature Risset Generator module, which, when given a 0-5v phasor input, will produce eight control voltages corresponding to a pitch and amplitude pair for each quadrature output. This module can be used with any Beap oscillator to produce Risset glissandos, or used in conjunction with quantizers to produce chromatic or diatonic Risset figures that endlessly rise or fall.
At this point, I migrated to Max for Live to produce a couple of polyphonic Risset synthesizers, one based on subtractive synthesis and one based on a simple two-operator FM group.
Here is a generative Risset Ableton Live set. This uses one Aleatoric module to generate the notes, followed by a mutating MIDI delay, followed by the FM Risset Synthesizer set to a period of 32 measures.
Some example output:
These devices have been added to my Live 9 Inspiration Suite, along with some related devices like an aleatoric generator and a couple of new delay effects (click the Download .zip button). As always, the latest version of Beap, including the new Risset modules, can be found at the Beap site.
Computer music is still in its infancy and there is so much area to explore. Risset’s work has inspired me to challenge some of my assumptions about music that I thought were fixed.
This all began as a joke within the “Material Team” – since we do a visual programming language, we should have an audio podcast! We could do virtual patching with phrases like “You really need to connect the second outlet of the umenu, because that’ll give you what you really want for the midiout object.” Yuck, yuck, yuck.
But, just for fun, I decided to give it a try anyway. You know what? It turns out to be really fascinating. The reason is that you don’t talk programming; rather, you talk about inspiration, creativity, hard work, personal backgrounds, and future visions. In fact, behind every stack of code is a fascinating person, and chatting with them for a podcast is a genuine pleasure.
I’ve put up the first three episodes (since that’s what is required to begin the iTunes process), and will be doing one new podcast each week until I exhaust myself. If you are interested in being interviewed, or if you know someone who should be interviewed, drop me a line and let me know.
The web page that libsyn maintains for the podcast is here: The ArtMusicTech Podcast
Many of us at Cycling ’74 are gardening enthusiasts. When this stunning project was brought to our attention, two of our favorite worlds came together. Wish we could be there to see and hear it in person. Thanks for sharing, OFL Architecture!
Francisco Colasanto, who works as assistant director at Centro Mexicano para la Música y las Artes Sonoras (CMMAS) in Morelia, Mexico, has released the first two modules of a new ebook about Max/MSP on the center’s web site CMMAS.org. The first module is free (after registering with the site) and covers the Max interface. The second module covers additive synthesis and is available for $5.00. The book will soon be available for iPad and Android.
This publication is unique in that the text is illustrated with short videos that demonstrate the concepts being described. When talking about something visual and dynamic such as a Max patcher, this makes perfect sense. I could use words to describe right-to-left ordering until I’m blue in the face but once you see it in action, you’ll understand it immediately.
Max/MSP: A programming guide for artists is off to a great start and I highly recommend you check it out and continue to watch as Francisco adds new modules.
Looking to brush up on Ableton Live skills and learn more about Max for Live? Coming up this Friday, September 27th, Seattle’s Decibel Festival will be hosting Ableton Day at the Broadway Performance Hall (on the SCCC campus). Max in the Morning will kick off Ableton Day at 11am. Clint Sand will join Ableton Certified Trainers James Patrick and Chris Schlyer in presenting a whole range of material, from entry-level to advanced.
Chris Petti of Dubspot will give a workshop from 12:45-1:30. If you’re in Seattle, this will be a great chance to pick up some new tricks and get immersed.
I just received an email from our friend Lippold Haken. Here’s a video of his amazingly expressive musical controller, with software written entirely in Max.
You can find out more about this incredible musical instrument on Lippold’s website.
Two months ago, Mira was born. Of course, that doesn’t mean that development has stopped–far from it. Since the moment it came into the world, Mira has continued growing steadily. At this point you might well be wondering just how little Mira is coming along. Well, according to babycenter.com, at two months of development “your baby will begin to move beyond his early preferences for bright or two-toned objects toward more detailed and complicated designs, colors, and shapes. Show your baby — and let him touch — a wider variety of objects.” How’s that for good news? Even better, it turns out that occasional vomiting is quite common for babies at two months old. So if Mira has been throwing up on you, that’s apparently nothing to worry about.
As for me, as a developer and new dad I’m feeling somewhat sentimental. So last week I decided to go back and take a look through the old family photo album that is the Internet. Much to my surprise, instead of cats playing the piano and women falling out of grape barrels, I actually found a slew of quite impressive videos. Turns out people have been using Mira to make some rather interesting content.
HeRunsHundreds = MIRA test drive
First, a little amuse-bouche. MrNedRush aka HeRunsHundreds offers a 4×4 drum pad built into a Max for Live device. He’s added some higher-level controls for subdividing into 1, 2, 4 or 1/2 bars (making maximally interesting patterns with minimal effort), as well as a timer bar above the buttons. He’s also added an orphaned dial off to the right, apparently connected to absolutely nothing, as a silent ode to French minimalism.
scratching in maxmsp and mira (featuring laser sounds)
Now MrNedRush gets serious. Forget all that warmup and drum pad nonsense; it’s time for some real music. It’s time, in other words, for laser sounds. There’s an awful lot of expressivity to be had here, for nothing more than a button and a slider. If there were some kind of award for most sound with the fewest objects, this man would be the clear winner. There is, of course, no such award.
HeRunsHundreds = The Knobulator in Mira
Don’t try to understand this interface. There are two giant knobs, that much is clear, but beyond that I’m at a loss. From what I can gather based on the accompanying text, the knob on the right is more of a meta-control than a control proper. Tweaking the rightmost knob rapidly jumps between different ways of shaping an audio effect. As for the knob on the left, the most we can say is that it’s labeled knobulator. So it controls knobulation, obviously, whatever the hell that is. In summation, as a logical exercise, this patch is absolutely impossible to understand. As a tactile exploration, however, it’s a glitch-groovy road trip and an absolute blast to play.
He Runs Hundreds = skinny hands wrists arms (live jam)
See, this is what I’m talking about. So often the debate around the iPad as an interface devolves into nothing more than touchscreen-bashing bloodsport. “Oh no no no,” the hardware elitists say, one hand on an APC 40, the other clutching (with extended pinkie finger) a champagne glass filled with Monster energy drink, “an iPad simply won’t do. A man must feel the knobs, he must enjoy the physicality of the slider.” And that’s fine, I can respect that. But no one ever said the iPad had to replace the hardware. Ebony and ivory, baby, why can’t we all work together? The knobs are good at being knobs, the iPad is good at being a display. As this excellent video demonstrates, the two complement and ennoble each other.
Reflections. The performer’s hand reflected on the immaculate surface of the iPad. The audio interface reflected on the desk’s polished surface. And, if you’ll excuse the painfully stretched metaphor, a certain reflection across time as well. SugarSynth, an updated version of Nobuyasu Sakonda’s original MSP granular synthesis patch, powers the audio. The original patch is, by technological standards, ancient, dating all the way back to the year 2000. Forget the iPad; this predates even the iPod, so seeing Mira drive the new and improved patch seems like a fitting way to celebrate SugarSynth and to tie a neat ribbon around a little chunk of Max history.
MIRAnome64, a virtual monome for MIRA/iPad and Max6
Is anyone really surprised to see Julien Bayle’s name here? Outside of actual Cycling ’74 employees, the man may be the single most prolific Max contributor of all time. His work includes externals, Max for Live devices, articles, workshops and, just to cement his total dominion, a three-hundred page book. MIRAnome64, a virtual but fully functional Monome64, is his first project using Mira. The video is more of a demo than a performance but he’s nice enough to show us a bit of how it works. A very clever trick makes the magic possible: by using a mira.multitouch object in conjunction with an array of toggles, he’s able to track touches from toggle to toggle, allowing for sweeping gestures across the whole array. Nice.
Rungler—a chaotic approach to step sequencing
Thirty-seven seconds. Right when the overtones start to kick in, that’s when I know that I’m going to spend the remaining six minutes of this video in a state of ear-drugged catatonic ecstasy. The Rungler, as this video is called, is based on something called the Blippoo Box, which you can think of as similar to an analog step sequencer. There is one small difference: a step sequencer is something that you can understand and control, whereas the Blippoo Box is a living animus of fire and whim that inhabits the very bounds of human comprehension. The result, as I’m sure you will appreciate, is complex, chaotic and highly listenable.
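For the curious, the shift-register-feedback idea behind instruments like this can be caricatured in a few lines of code. This is only a toy sketch, not the Blippoo Box’s actual circuit; the oscillator rates, register length, and feedback depth below are arbitrary assumptions chosen to make the behavior visible:

```python
def rungler_steps(n, f1=0.013, f2=0.005, depth=0.2):
    """Toy sketch of a shift-register chaotic sequencer.

    Oscillator A clocks an 8-bit shift register; the register's input
    bit is the sign of oscillator B. The top three bits form a crude
    DAC whose value feeds back into both oscillator rates, so the
    'sequence' modulates the very clocks that generate it.
    """
    phase_a = phase_b = 0.0
    reg = [0] * 8
    prev_clock = 0
    out = []
    for _ in range(n):
        dac = (reg[5] * 4 + reg[6] * 2 + reg[7]) / 7.0  # 3-bit DAC, 0..1
        phase_a = (phase_a + f1 * (1 + depth * dac)) % 1.0
        phase_b = (phase_b + f2 * (1 + depth * dac)) % 1.0
        clock = 1 if phase_a < 0.5 else 0
        if clock and not prev_clock:  # rising edge: shift in a new bit
            reg = [1 if phase_b < 0.5 else 0] + reg[:-1]
        prev_clock = clock
        out.append(dac)
    return out

steps = rungler_steps(2000)
```

Because the DAC output feeds back into the clocks, the pattern never quite settles into a loop: that is the “living animus of fire and whim” part, rendered in dry Python.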
MIRA controls x3 Machines (including Windows)
Come on, that’s pretty cool. One iPad, three machines?
Performing with Max6 and Mira on iPad
Finally, for a little dessert, Yasuhiro Otani demonstrates his own Mira patch. My Japanese is more than a little rusty, but from the website at eleclab.tumblr.com it looks like this patch was made as part of a workshop called the U::Gen Laboratorium. Said workshop has a mascot (apparently workshops need mascots) and that mascot is a girl holding a knife and fork. This, presumably, makes sense. Again, my Japanese is more than a little rusty. My Max, on the other hand, is quite strong, so instead of trying to figure out why this patch got made I’ll focus on how cool it sounds.
Check out this research project and exhibition that features the sonification of electrical activity from a colony of microbial fuel cells.
This past Friday Google released the source for two of its Chrome Web Lab projects that have been running at the Science Museum in London for the past year.
One of the projects, the Orchestra, makes use of Max along with a host of web technologies. For those interested in techniques for controlling Max patches via web sites, Google and user experience developers Tellart are generously providing a valuable resource. This is a great opportunity to peek behind the curtains of a Max project designed to run both online and in a high-traffic environment.
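The Orchestra project’s source is the place to go for the full picture, but the basic plumbing of talking to Max from other software is simple: Max’s udpreceive object accepts OSC messages over UDP. Here is a minimal, hand-rolled sketch of that idea; the port number 7400 and the /gain address are my assumptions, not anything from Google’s or Tellart’s code:

```python
import socket
import struct

def osc_message(address, value):
    """Encode a minimal OSC message carrying one float argument.

    OSC strings are null-terminated and padded to a 4-byte boundary;
    float arguments are big-endian 32-bit, flagged by the ",f" type tag.
    """
    def pad(s):
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    return pad(address) + pad(",f") + struct.pack(">f", value)

# On the Max side, a patch containing [udpreceive 7400] -> [route /gain]
# would receive this value (port and address are hypothetical).
msg = osc_message("/gain", 0.5)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 7400))
```

In practice a web project would wrap this behind a server (Node, Python, whatever) that relays browser events to the Max machine, which is roughly the architecture the Orchestra source demonstrates at scale.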
Thursday, August 29, 2013, 7-9PM at 450 Bryant, Suite 100, San Francisco, I’ll be presenting an introduction to programming in Max for Live for the Ableton User Group Meeting.
I’ll have about an hour to explain what Max is, show how it works in Live, and offer some tips on how to start building your own devices. Should be fun!