Can someone suggest the best way to create a modular system for live Jitter performances? I create, find, and modify all kinds of small patches that either generate visualizations or apply effects to a video feed. So far, for live performances, I've had to keep these in separate files and essentially build a single patch per song. That has worked until now, but it's become tedious and limiting. What I'd really like is a system where I could drag and drop visualizers, chain effects together, and mix video in real time, similar to VDMX, etc. Any thoughts/suggestions are appreciated.