For those of you in the SF Bay Area, please come check out these performances of Flock next week; we’ve made extensive use of Max/MSP for the audio synthesis and of Jitter for the computer vision and video animation in this piece. (In fact, we recently published some of our computer vision objects, for particle filtering, lens and skew correction, and image stitching, at http://www.jasonfreeman.net/flock/.) Hope to see you there!
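For the curious, a particle filter is essentially a cloud of position hypotheses that gets jittered, re-weighted against each new video frame, and resampled. Here’s a minimal, generic Python/NumPy sketch of that idea for a single tracked blob; it illustrates the technique only, and is not the code inside our Jitter objects:

```python
import numpy as np

# Generic particle filter for tracking one 2D position (illustrative only).
N = 500                      # number of particles
W, H = 640, 480              # assumed frame size in pixels

rng = np.random.default_rng(0)
particles = rng.uniform([0, 0], [W, H], size=(N, 2))  # initial guesses
weights = np.full(N, 1.0 / N)

def step(measurement, motion_noise=8.0, sensor_noise=15.0):
    """Advance the filter by one frame, given a detected blob centroid."""
    global particles, weights
    # Predict: assume a simple random-walk motion model.
    particles += rng.normal(0.0, motion_noise, size=particles.shape)
    # Update: weight each particle by how well it explains the measurement.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = np.exp(-d2 / (2 * sensor_noise ** 2))
    weights /= weights.sum()
    # Resample: draw a fresh particle set in proportion to the weights.
    idx = rng.choice(N, size=N, p=weights)
    particles = particles[idx]
    weights = np.full(N, 1.0 / N)
    # Estimate: the mean of the particle cloud is the tracked position.
    return particles.mean(axis=0)

# Example: feed in one noisy centroid per video frame.
print(step(np.array([320.0, 240.0])))
```

The appeal of this approach for stage tracking is robustness: even when a performer is briefly occluded or a frame is noisy, the particle cloud degrades gracefully rather than losing the target outright.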
Flock @ 01SJ
Jason Freeman, music and concept
Mark Godfrey, software development
Rova Saxophone Quartet
dancers from Santa Clara University
Thursday, June 5th @ 8 pm
Friday, June 6th @ 8 pm
Saturday, June 7th @ 4 pm and 6 pm
In Flock, music notation, electronic sound, and video animation are all generated in real time from the locations of musicians, dancers, and audience members as they move and interact with each other. Computer vision software analyzes video from an overhead camera to determine each participant’s location, and that data is used to create music notation for four jazz saxophonists, to render a video animation, and to generate an electronic soundtrack. By inviting the audience to help create the unique music and visuals of each performance, Flock seeks to reconcile concert performance with the dynamics of collaborative creation, multi-player games, and social networks, creating an engaging live event.
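To give a concrete (and purely hypothetical) flavor of the mapping step, here’s a small Python sketch that turns one tracked (x, y) position into a pitch and dynamic level. The scale, ranges, and frame size are invented for illustration; Flock’s actual notation engine is considerably richer:

```python
# Illustrative mapping from a tracked stage position to musical parameters.
# The pentatonic scale and pixel ranges here are assumptions, not Flock's.
C_MINOR_PENTATONIC = [60, 63, 65, 67, 70]  # MIDI pitch classes

def position_to_note(x, y, width=640, height=480):
    """Map a performer's (x, y) pixel position to (MIDI pitch, velocity)."""
    # Horizontal position selects a scale degree across three octaves.
    degree = int(x / width * len(C_MINOR_PENTATONIC) * 3)
    octave, step = divmod(degree, len(C_MINOR_PENTATONIC))
    pitch = C_MINOR_PENTATONIC[step] + 12 * octave
    # Vertical position (downstage = louder) controls dynamics.
    velocity = int(30 + (y / height) * 90)
    return pitch, min(velocity, 127)

print(position_to_note(320, 400))  # a performer at center stage, downstage
```

In a live setting, output like this would be regenerated every frame and quantized before being rendered as notation or routed to the synthesis engine.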
This hour-long work was commissioned by the Adrienne Arsht Center for the Performing Arts in Miami.
Jason Freeman
Assistant Professor, Music Department
College of Architecture
Georgia Institute of Technology