I'm Frederik and I'm doing research on collaborative music improvisation.
I've already built a prototype table for some testing (see picture). Simply put, the table is a big MIDI table for 4 people.
Every button toggles a predefined loop… because the table is for 4 people, I have 4 sections: drums, basslines, melodies and FX/vocals (each person has one section).
It's quite a simple setup (Arduinos, via Pure Data, to Ableton as a MIDI controller).
My field of interest is the communication between the users of the table (not the interaction between a user and the table).
After observation and some semi-structured interviews, I can conclude that most of the communication concerns the structure of the song
("more drums", "now comes the climax", "build up", …).
Therefore I want to develop a sort of application that visualizes the song's structure and shares it among its users (the 4 players).
Every person should be able to draw/select/point out their view of the song's structure.
I made a simple picture that illustrates the idea of the application…
Imagine an iPad, or another touchscreen device:
The app is synced with the metronome/BPM of Ableton.
Every black grid block is 16 counts.
The light grey blocks underneath are 4 counts.
The whole view slides to the left, so you can always see 32 counts ahead…
You can touch where you do (or don't) want drums, bass or synths…
It shares the structure among its users…
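To make the idea concrete, here is a minimal sketch (in Python, purely illustrative — the class and method names are my own, not from any existing library) of the shared data model behind such a view: a grid of cells, one per section and count, that any of the 4 players can toggle, with a 32-count window sliding over it.

```python
# Hypothetical sketch of the shared structure grid: 4 sections
# (one per player) along one axis, counts along the other, each
# cell toggled on/off by touch. All names are illustrative.

SECTIONS = ["drums", "basslines", "melodies", "fx_voc"]


class StructureGrid:
    """Shared view of the song structure: which section plays on which count."""

    def __init__(self, horizon=32):
        self.horizon = horizon   # counts visible ahead of "now"
        self.cells = set()       # (section, count) pairs that are switched on

    def toggle(self, section, count):
        """A player touches a cell to mark where they want (or don't want) a part."""
        cell = (section, count)
        if cell in self.cells:
            self.cells.remove(cell)
        else:
            self.cells.add(cell)

    def visible(self, now):
        """Cells in the next `horizon` counts — what every screen would draw."""
        return {(s, c) for (s, c) in self.cells if now <= c < now + self.horizon}


grid = StructureGrid()
grid.toggle("drums", 16)    # "more drums" at count 16
grid.toggle("fx_voc", 40)   # planned, but outside the first 32-count window
print(grid.visible(0))      # only the drums cell is in view yet
```

Because the state is just a set of cells, syncing it between the 4 devices reduces to broadcasting each toggle — which is why I'm asking about environments below.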
My question is…
Does anyone have an idea of which programming environment I could build such an application in?
Could Max help me with this (Ableton BPM sync)?
Do you have other ideas?
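As far as the sync part goes, my current understanding is that Ableton can send standard MIDI clock (24 pulses per quarter note) when sync is enabled on a MIDI output, so whatever environment runs the app only needs to count those pulses. A rough, purely illustrative sketch of that counting logic (Python; the names are mine, and actual MIDI input would come from a library such as mido):

```python
# Hypothetical sketch: turn incoming MIDI clock ticks into the
# count and 16-count block positions the scrolling view needs.
# Standard MIDI clock runs at 24 pulses per quarter note.

PPQN = 24  # MIDI clock pulses per quarter note


class ClockFollower:
    """Counts MIDI clock ticks and reports the current count and block."""

    def __init__(self, counts_per_block=16):
        self.counts_per_block = counts_per_block
        self.ticks = 0

    def on_clock(self):
        """Call once for every incoming MIDI 'clock' message."""
        self.ticks += 1

    def count(self):
        """Current count (quarter note), starting at 0."""
        return self.ticks // PPQN

    def block(self):
        """Current 16-count block (a black grid block in the picture)."""
        return self.count() // self.counts_per_block


follower = ClockFollower()
for _ in range(24 * 20):   # simulate 20 counts' worth of clock ticks
    follower.on_clock()
print(follower.count(), follower.block())   # -> 20 1 (second block)
```

If Max turns out to be the right environment, I assume the same counting could be done inside a patch instead, driven by Ableton's transport.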
Please let me know.
Thank you for your attention, and thanks in advance.