Thank you very much for your comments!
The whole piece is 23 minutes long and there are various techniques involved, from sonification of geometry and creating rhythms and phrases with geometric deformations to more conventional approaches such as precomposing music and visualizing it. My general rule is to constantly switch between the visual and musical paradigm when I create AV material, phrases etc., so overall it's all about AV. Much more about that, including patches, will be part of my PhD submission... at some point :)
Therefore I will focus here only on the "Electric Spring festival 2017" clip...
From AV material that was mainly created with generative techniques I created (in Live) a section based on metric time - some sort of metric beat music. Then I moved all the MIDI data from Live into Jitter matrices in order to distort time. I did that with a tool I call Stringvencer ( https://www.youtube.com/watch?v=JpdGQbkCV_o or https://www.youtube.com/watch?v=svrW3IYNkn8&t=128s ), which is basically a sequencer with an elastic/deformable timeline (it does much more than that actually, but that was the focus here). Using Stringvencer I gradually deformed the time and recorded the MIDI notes back into Live. What happens in the piece over a span of around 7 minutes is that the regular metric beat becomes more and more "swingy"/offset from the metric grid and then descends into total chaos as the initial sequence is temporally restructured/twisted. That is also reflected in the visuals in the posted video clip - rectangles and planes start bending and twisting.
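Just to illustrate the principle of that time deformation (this is not the actual Stringvencer patch, only a rough hypothetical Python sketch): every note onset that originally sits on the metric grid gets pushed through a warp function, and as the warp amount grows the notes drift further and further off the grid.

```python
# Hypothetical sketch (not the actual patch): warping grid-aligned note
# onsets with an "elastic" timeline. The real tool deforms a physical/drawn
# string; here a simple sinusoidal stretch stands in for that.
import math

def warp_time(t_beats, amount, period=4.0):
    """amount = 0.0 -> straight metric time; larger values -> more 'swingy'/chaotic."""
    return t_beats + amount * math.sin(2.0 * math.pi * t_beats / period)

# notes as (onset_in_beats, pitch, velocity, duration)
notes = [(i * 0.5, 60 + (i % 4), 100, 0.25) for i in range(16)]

for deform in (0.0, 0.1, 0.6):   # gradually increasing deformation over the section
    warped = [(warp_time(t, deform), p, v, d) for (t, p, v, d) in notes]
    print(deform, [round(t, 2) for (t, _, _, _) in warped[:8]])
```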
Visuals. The core is a sampled 3D object that is processed in different ways (parallel and serial processing), where the parallel processing results in a few different layers - each one ending in a separate jit.gl.mesh object. So essentially the source is one 3D object. In this section all visual deformations occur due to the presence of the "black hole", which I am sure you will recognise :) The movement of the BH is controlled with automation in Live. Similarly, what appears as "edits" in the video are actually only camera position automations. So nothing in this part was edited in post production.
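Conceptually the "black hole" behaves like an attractor acting on the mesh vertices. The real thing lives in Jitter matrices feeding jit.gl.mesh, but the maths of the deformation is roughly this (a hypothetical sketch, assuming a simple inverse-square pull):

```python
# Hypothetical sketch of the "black hole" idea: an attractor point pulling
# vertices towards itself, with the pull falling off with distance.
import numpy as np

def attract(vertices, bh_pos, strength):
    """vertices: (N, 3) array, bh_pos: (3,) attractor position (from Live automation)."""
    delta = bh_pos - vertices                                 # vector from each vertex to the BH
    dist = np.linalg.norm(delta, axis=1, keepdims=True) + 1e-6
    pull = strength / (dist ** 2)                             # inverse-square falloff (assumed)
    return vertices + delta / dist * np.minimum(pull, dist)   # clamp so vertices don't overshoot

# a flat plane of vertices, bent by the BH passing near it
grid = np.stack(np.meshgrid(np.linspace(-1, 1, 20), np.linspace(-1, 1, 20)), -1)
plane = np.concatenate([grid.reshape(-1, 2), np.zeros((400, 1))], axis=1)
bent = attract(plane, np.array([0.3, 0.0, 0.5]), strength=0.2)
print(bent[:3])
```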
All that is/was running nicely in real time, but in the end I decided to do an off-line render. For the off-line render I had to do a lot of "speed corrections" due to the physics simulation algorithm and the fluctuating frame rate of the real-time system I was composing with. In other words, when I rendered the thing off-line for the first time, all those spirally curves that appear towards the end of the clip were gone from the camera view or simply lost due to too many iterations.
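This is the kind of correction I mean: a frame-rate-dependent simulation stepped with jittery real-time frame times ends up somewhere different from the same simulation stepped with a rock-solid off-line frame time, so speeds/iteration counts have to be rescaled. A tiny hypothetical sketch of the principle, not my actual setup:

```python
# Sketch of why the off-line render needed "speed corrections": the same
# naive physics update driven by a fluctuating real-time dt vs. a fixed
# off-line dt lands in different places unless the speed is rescaled.
import random

def integrate(dts, speed):
    pos = 0.0
    for dt in dts:
        pos += speed * dt            # frame-rate-dependent update
    return pos

random.seed(1)
realtime_dts = [1/60 + random.uniform(-0.004, 0.008) for _ in range(600)]  # jittery FPS
offline_dts  = [1/60] * 600                                                # fixed render FPS

speed = 1.0
print("real-time:", integrate(realtime_dts, speed))
print("off-line :", integrate(offline_dts, speed))
# correction factor so the off-line version lands where the real-time one did
correction = sum(realtime_dts) / sum(offline_dts)
print("corrected:", integrate(offline_dts, speed * correction))
```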
Another problem was how to render off-line something that lives between Max and Live and communicates in both directions. The way I do it is the following:
- Move all automations and MIDI notes from Live into Jitter matrices
- Render the Jitter stuff off-line (using the automation and MIDI matrices) and fill coll objects with all the MIDI data that Jitter generates
- Convert the coll data into Live MIDI clips (roughly as in the sketch after this list)
- Render the Live stuff off-line
- Put the two together in some video editing software
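For the coll-to-Live step, the idea is simply to turn each stored coll entry back into a note event that Live can play. My actual route goes through Max for Live, but the same idea in a hypothetical Python sketch (assuming a coll dump where each line is "index, onset_beats pitch velocity duration_beats;" and using the mido library to write a MIDI file you can drop into Live):

```python
# Hypothetical sketch: parse a coll text dump from Jitter and write a
# standard MIDI file that can be imported into Live as a clip.
import mido

PPQ = 480  # ticks per quarter note

def coll_to_midi(coll_path, midi_path):
    events = []
    with open(coll_path) as f:
        for line in f:
            line = line.strip().rstrip(";")
            if not line:
                continue
            _, data = line.split(",", 1)                     # drop the coll index
            onset, pitch, vel, dur = [float(x) for x in data.split()]
            events.append(("note_on",  int(round(onset * PPQ)),         int(pitch), int(vel)))
            events.append(("note_off", int(round((onset + dur) * PPQ)), int(pitch), 0))

    events.sort(key=lambda e: e[1])                          # sort by absolute tick
    track = mido.MidiTrack()
    now = 0
    for kind, tick, pitch, vel in events:
        track.append(mido.Message(kind, note=pitch, velocity=vel, time=tick - now))
        now = tick
    mid = mido.MidiFile(ticks_per_beat=PPQ)
    mid.tracks.append(track)
    mid.save(midi_path)

# coll_to_midi("notes_from_jitter.txt", "section.mid")   # hypothetical file names
```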
Hope that answers your question:)