I’m working towards a performance piece which involves live electronic music and video in Max/MSP/Jitter. I have some time-lapse footage of a cityscape over 24 hours, which I want to be affected by elements of the music in the performance. Specifically, I’m looking for a way to store data about the brightness of each frame in the movie file; in the performance, Max would then use that data to select a frame to display based on a musical parameter such as volume, so the louder the signal, the brighter the image.
The first step is straightforward enough: I can use jit.3m to get the values for the frames in succession, but I don’t know how to go about storing and ordering the data. Perhaps it’s possible to export the value pairs into a data-handling program to reorder them by brightness, then somehow reimport the data into a Max table to read from? It would be better, of course, if I could find a way without having to leave Max.
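To make the reordering I have in mind concrete, here’s a rough sketch in Python (not Max; the brightness values are placeholders standing in for jit.3m’s per-frame mean output). The idea is to pair each frame index with its measured brightness, sort the frames dimmest to brightest, and then map a normalized volume onto that sorted list:

```python
# Placeholder brightness values; index = frame number.
# In practice these would come from running jit.3m over the movie.
brightness = [0.42, 0.05, 0.91, 0.33, 0.77]

# Frame indices ordered dimmest to brightest.
frames_by_brightness = sorted(range(len(brightness)),
                              key=lambda i: brightness[i])

def frame_for_volume(volume):
    """Map a normalized volume (0.0 quiet .. 1.0 loud) to a frame index."""
    pos = min(int(volume * len(frames_by_brightness)),
              len(frames_by_brightness) - 1)
    return frames_by_brightness[pos]
```

In Max terms, I imagine the sorted list could live in a coll or table indexed 0 to (frame count − 1), with the volume scaled to that range to pick a frame, though I’m not sure of the cleanest way to do the sort itself inside Max.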
Does anyone have any ideas about how to approach this problem? Your help would be greatly appreciated!