A realtime sculpture in spatial light and sound.

Integration.04 is a live performance by Dieter Vandoren. His instrument projects light and sound structures in fog-filled space, immersing both him and the audience in it. He manipulates the ephemeral audiovisual shapes as if they were tangible materials, taking cues from the interaction between the human body and acoustical instruments. The abstract digital processes are thus projected into physical, experiential space and the emergent play becomes a strong embodied experience – for both performer and audience.

Max, MSP and Jitter comprise 95% of the software system. The remaining 5% is Python and micro-controller code.

The Max system consists of a suite of modular components/patches:

  • 2 jit.openni processes performing skeleton tracking, on 2 separate machines, for 360° coverage
  • skeleton merger, combines the 2 tracked skeletons
  • motion abstracter, extracts control data from the merged skeleton, all data stored in matrices for efficiency and processed with jit.gen
  • sensor gloves abstracter, handles the accelerometer and bend sensor input streams
  • main patch, system clock, settings manager, sequence control
  • multiple generators, one or more for each scene, generate geometry and audio, some using gen~
  • geometry renderer, OpenGL output through a TripleHead2Go to 3 or 4 projectors, which makes multi-projector setups a whole lot easier than before
  • audio mix hub, 4.1 output
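The post doesn't detail how the skeleton merger combines the two tracked skeletons. A minimal sketch of one plausible approach, confidence-weighted averaging of corresponding joints from the two trackers, might look like this in Python (all names and the data layout are assumptions, not the actual patch logic):

```python
# Hypothetical sketch: merge two skeletons tracked from opposite sides.
# Each skeleton maps joint name -> (x, y, z, confidence), confidence in 0..1.

def merge_skeletons(skel_a, skel_b):
    """Confidence-weighted average of corresponding joints."""
    merged = {}
    for joint in set(skel_a) | set(skel_b):
        a = skel_a.get(joint)
        b = skel_b.get(joint)
        if a and b:
            wa, wb = a[3], b[3]
            total = (wa + wb) or 1.0
            merged[joint] = tuple(
                (a[i] * wa + b[i] * wb) / total for i in range(3)
            ) + (max(wa, wb),)
        else:
            # Joint seen by only one tracker: pass it through unchanged.
            merged[joint] = a or b
    return merged
```

In the actual system this per-joint data lives in Jitter matrices and is processed with jit.gen, so the arithmetic would run per-cell on the matrix rather than over a Python dict.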

The sensor gloves are built around SenseStage MiniBee wireless (XBee) micro-controllers. They come with their own firmware/software suite, the latter in Python.
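The post doesn't show how the glove streams are conditioned before use as control data. A small sketch of the general idea, clamping and rescaling raw accelerometer and bend-sensor readings into 0..1 control signals; the input ranges here are illustrative assumptions, not the MiniBee's actual specifications:

```python
# Hypothetical sketch: normalize raw glove sensor readings into 0..1
# control signals. The raw ranges below are assumptions for illustration.

def scale(value, in_lo, in_hi):
    """Clamp and rescale a raw reading into the 0..1 range."""
    x = (value - in_lo) / (in_hi - in_lo)
    return min(1.0, max(0.0, x))

def abstract_glove(raw):
    """raw: dict with 'accel' (three axis readings) and 'bend' (per-finger)."""
    return {
        "accel": [scale(v, 0, 1023) for v in raw["accel"]],
        "bend": [scale(v, 200, 800) for v in raw["bend"]],
    }
```

Normalizing at this stage keeps the downstream generators independent of any one sensor's electrical range, which is what makes the modular patch layout above practical.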

Nov 02 2013 | 7:18 pm

Dieter! Nice integration of technologies, combined with a coherent, powerful aesthetic.
