Capillaries Capillaries

    Electric Spring festival 2017

    Capillaries Capillaries - Excerpts

    Capillaries Capillaries is an audiovisual composition/performance that extends the idea of an audiovisual object to an audio-visual-time object - that is, a tangle of interactions between sound, geometry and time. The audio-visual-time object is treated as a (meta)physical material inside a high-voltage continuum. The tension of the continuum arises from the imbalance between various forces that pull the material in different directions. Since the material is a complex audio-visual-time object, the result is often impossible to predict. Exploring the emerging structures and their behaviours is the main focus of the piece.
    Capillaries Capillaries is the result of bi-directional communication between sound and image. It was made with Max and Live.

    • andrej
      Apr 19 2017 | 6:50 am
      Great work, I really admire your spectral processing work and of course the 3d visualisation stuff. 
    • Graham Wakefield
      Apr 20 2017 | 2:00 am
      Would love to hear any comments about how you created this!
    • t
      Apr 20 2017 | 9:49 am
      Thank you very much for your comments!
      The whole piece is 23 minutes long and there are various techniques involved, from sonification of geometry and creating rhythms and phrases with geometric deformations, to more conventional approaches such as precomposing music and visualizing it. My general rule is to constantly switch between the visual and musical paradigms when I create AV material, phrases etc., so overall it's all about AV. Much more about that, including patches, will be part of my PhD at some point:)
      Therefore I will focus here only on the "Electric Spring festival 2017" clip...
      From AV material that was mainly created with generative techniques, I created (in Live) a section that is based on metric time - some sort of metric beat music. Then I moved all the MIDI data from Live into Jitter matrices in order to distort time. I did that with a tool I call Stringvencer, which is basically a sequencer with an elastic/deformable timeline (much more than that actually, but that was the focus here). Using Stringvencer I gradually deformed the time and recorded the MIDI notes back into Live. What happens in the piece, over a span of around 7 minutes, is that the regular metric beat becomes more and more "swingy"/offset from the metric grid, and then it goes into total chaos as the initial sequence is temporally restructured/twisted. And that is also reflected in the visuals in the posted video clip - rectangles and planes start bending and twisting.
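      A minimal sketch of that elastic-timeline idea (in Python; all names here are hypothetical - the actual Stringvencer is a Max/Jitter tool): start from a strictly metric grid of note onsets and progressively push each onset off the grid, so small amounts feel "swingy" and large amounts dissolve the meter entirely.

```python
# Hypothetical illustration of a deformable timeline: offset metric
# note onsets by a growing random amount, from grid-locked to chaotic.
import random

def deform_onsets(onsets, amount, rng):
    """Offset each onset by up to +/- `amount` beats; 0 keeps the grid."""
    return sorted(max(0.0, t + rng.uniform(-amount, amount)) for t in onsets)

rng = random.Random(1)
metric = [i * 0.25 for i in range(16)]   # straight 16th-note grid, in beats
for amount in (0.0, 0.05, 0.5):          # metric -> swingy -> chaos
    print([round(t, 2) for t in deform_onsets(metric, amount, rng)])
```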
      Visuals. The core is a sampled 3D object that is processed in different ways (parallel and serial processing), where parallel processing results in a few different layers - each one ending in a separate object. So essentially the source is one 3D object. In this section all visual deformations occur due to the presence of the "black hole", which I am sure you will recognise:) The movement of the BH is controlled with automations in Live. Similarly, what appear as "edits" in the video are actually only camera position automations. So nothing in this part was edited in post-production.
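      An illustrative sketch (not the actual Jitter patch) of a "black hole"-style deformation: every vertex is pulled toward a moving attractor point, with a strength that falls off with squared distance, so geometry near the hole bends strongly while distant geometry stays intact.

```python
# Hypothetical vertex deformation toward an attractor ("black hole").
import math

def pull_toward_hole(vertex, hole, strength=0.5):
    """Displace one 3D vertex toward `hole`, weighted by distance falloff."""
    d = math.dist(vertex, hole)
    w = strength / (1.0 + d * d)          # 0 < w <= strength
    return tuple(v + w * (h - v) for v, h in zip(vertex, hole))

# A small flat grid of vertices, deformed by a hole hovering above it.
plane = [(x * 0.5, y * 0.5, 0.0) for x in range(3) for y in range(3)]
hole = (0.0, 0.0, 1.0)
deformed = [pull_toward_hole(v, hole) for v in plane]
print(deformed[0])   # the vertex nearest the hole is displaced the most
```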
      All that is/was nicely running in real time, but in the end I decided to do an off-line render. For the off-line render I had to do a lot of "speed corrections" due to the physics simulation algorithm and the fluctuating frame rate when composing with a real-time system. In other words, when I rendered the thing off-line for the first time, all these spirally curves that appear towards the end of the clip were gone from the camera view, or just lost due to too many iterations.
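      One way to picture the kind of correction involved (a sketch under my own assumptions, not the author's patch): in a real-time system the physics is stepped once per fluctuating frame, so simulated speed depends on frame rate; an off-line render at a fixed frame rate can instead step the simulation in fixed sub-steps, so the iteration count - and thus the motion - is identical from run to run.

```python
# Hypothetical frame-rate-independent stepping for an off-line render.
def euler_step(position, velocity, dt):
    """One explicit Euler step of a trivial 1D system."""
    return position + velocity * dt

def render_offline(n_frames, fps, substeps=4):
    """Advance the simulation by exactly n_frames / fps seconds."""
    pos, vel = 0.0, 1.0
    dt = (1.0 / fps) / substeps
    for _ in range(n_frames * substeps):
        pos = euler_step(pos, vel, dt)
    return pos

print(render_offline(n_frames=60, fps=30))   # 2 seconds of simulated motion
```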
      Another problem was how to render off-line something that sits between Max and Live and communicates in both directions. The way I do it is the following:
      - Move all automations and MIDI notes from Live into Jitter matrices.
      - Render the Jitter side off-line (using the automation and MIDI matrices), filling coll objects with all the MIDI data that Jitter creates.
      - Convert the coll data into Live MIDI clips.
      - Render the Live side off-line.
      - Put the two together in some video editing software.
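      The coll-to-clip conversion step can be sketched roughly like this (the field layout and tick resolution are my assumptions for illustration, not the author's actual format): parse coll-style lines of the form "index, pitch velocity duration;" into note records that could be written into a Live MIDI clip.

```python
# Hypothetical parser from coll-style lines to per-note dicts,
# with start and duration converted from ticks to beats.
def coll_to_notes(coll_lines, ticks_per_beat=480):
    """Parse 'index, pitch velocity duration;' lines into note dicts."""
    notes = []
    for line in coll_lines:
        index, body = line.rstrip(";").split(",")
        pitch, velocity, duration = (int(x) for x in body.split())
        notes.append({
            "start": int(index) / ticks_per_beat,     # in beats
            "pitch": pitch,
            "velocity": velocity,
            "duration": duration / ticks_per_beat,    # in beats
        })
    return notes

coll = ["0, 60 100 480;", "480, 64 90 240;"]
for note in coll_to_notes(coll):
    print(note)
```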
      Hope that answers your question:)
    • Graham Wakefield
      Apr 20 2017 | 1:33 pm
      Fascinating -- thank you for sharing such a detailed explanation. It can be surprising to realize the density of work underlying a project, but it really makes a difference to the results. Good luck with the PhD too -- I'm sure it will be very interesting!