Echolocation (Dynamic Aural Fragmentation)

Duffield
2011
Toronto

Presented at the Ontario College of Art and Design 2011 Thesis Show. A modified version was presented at Videodrome 2011 at the Museum of Contemporary Canadian Art in Toronto, Canada.

Description:
Echolocation (Dynamic Aural Fragmentation) is an interactive installation that creates an immersive sonar environment, allowing viewers to spatially explore and engage with aural/kinetic events of their own creation. This audio/visual installation exaggerates fundamental acoustic principles, evoking spatial perception through the observation of sound in a quadraphonic, panoramic setup. Taking the experience and perception of a bat as its foundation, the installation examines how different species share the same physical laws but make use of them in different ways, in this case through the sense of hearing. Echolocation (Dynamic Aural Fragmentation) is not meant to be a direct simulation, but rather a stylized, abstracted representation that translates the dominant perceptual system of a bat (hearing) into the dominant perceptual system of most humans (sight).

Setup:
The lights are turned off, creating a near-dark environment. The space is set up with four microphones and four speakers to create a surround-sound (quadraphonic) environment and allow for real-time playback of sounds emitted by the user. All audio captured by the microphones is routed into Max 5, where it is processed. It is then output through the speakers with a delay/reverb effect that reflects the participant's spatial orientation. As a visual component, each wall of the structure (each comprising a video camera and a projection screen) displays the captured event of the user emitting sound and plays the footage in sync with the audio playback. Both the footage and the audio are stylistically delayed with feedback to emulate the reverberation experienced by bats. Only the event of the participant emitting sound is displayed, but from multiple perspectives played back at distinct times, one for each camera.
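
The original Max patch is not included here, so the following is only a rough, minimal sketch (in Python/NumPy rather than Max) of the kind of feedback delay described above. The delay times, feedback amount, and per-speaker mapping are illustrative assumptions, not the installation's actual settings.

import numpy as np

def feedback_delay(x, sr, delay_s, feedback=0.5, mix=0.6):
    # Simple feedback delay: each echo is written back into the delay
    # line, producing decaying repeats in the spirit of the piece's
    # stylized, bat-like reverberation. x is a 1-D float array, sr in Hz.
    d = max(1, int(sr * delay_s))           # delay length in samples
    buf = np.zeros(d)                       # circular delay buffer
    y = np.empty(len(x), dtype=float)
    w = 0                                   # read/write index into buf
    for n in range(len(x)):
        delayed = buf[w]                    # sample delayed by d samples
        y[n] = x[n] + mix * delayed         # dry signal plus wet echo
        buf[w] = x[n] + feedback * delayed  # feed the echo back in
        w = (w + 1) % d
    return y

# Hypothetical quadraphonic use: give each of the four speaker channels
# its own delay time to suggest the participant's spatial orientation
# (the values here are invented for illustration only).
sr = 44100
t = np.arange(sr) / sr
blip = np.sin(2 * np.pi * 440 * t) * (t < 0.1)   # short test sound
delays = [0.12, 0.21, 0.33, 0.47]                # seconds, one per speaker
quad = np.stack([feedback_delay(blip, sr, d) for d in delays])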

How was MAX used?

The piece uses Max 5 to run the entire installation: it captures the four microphone feeds, applies the delay/reverb processing, and drives the synchronized audio and video playback through the speakers and projection screens.
