This project explores the low signal-to-noise ratio of social networks through a poetic audio-visual piece using Twitter.
Description / Synopsis
Saturation is an immersive, minimalist audio-visual digital piece.
From the deepest, darkest, most oppressive silence to the brightest light and grainiest noise, Saturation accumulates bits of information from social networks and displays them progressively, making their aggregation and overlapping more visible as time passes. The sound becomes louder, grainier and noisier from one second to the next.
Saturation is a statement about how our world is oversaturated by data flows, and about the resulting loss of meaning. Progressively, the amount of data grows so large that it overlaps itself. Meaning and informational content begin to dissolve, finally reaching total destruction, provoked by the very system that produced them.
Saturation is an introspection about our being and our potential to lose the meaning and sense of the world that surrounds us, as we are subjected to huge amounts of simultaneous data feeds, pushed at us every minute by notifications, data-push systems and other device-based distractions.
Saturation also addresses the notion of time. By evolving slowly over 20 minutes, it drives the audience to patience, forcing them to question what they expect from computer systems. It slowly takes them to a state of paroxysmal chaos.
The piece then suddenly and brutally starts again, in the darkness and silence of its beginning, representing the inevitable and irreversible direction of ever-growing data systems that will never go back, building more flows and more data each and every day; sometimes for good reasons, sometimes for darker and unknown ones.
It represents the destruction of meaning when the noise of social networks becomes too high.
Here on the right is a short extract of the piece running. The whole process, from pure darkness/silence to pure chaos/noise, lasts around 30 minutes.
Saturation is based on two software components:
- Processing framework (data capture over the Internet + visuals + interface with Max6)
- Max6 framework
These modules communicate with each other using an OSC-like protocol inside the computer itself.
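To make the OSC-like exchange concrete, here is a minimal sketch (in Python rather than Processing/Max, purely for illustration) of how an OSC message is laid out on the wire: a NUL-padded address pattern, a type-tag string, then big-endian arguments. The `/saturation/tweet` address and its arguments are hypothetical, not taken from the actual patch.

```python
import struct

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message: address pattern, type tags, arguments.
    Supports only int32 ('i'), float32 ('f') and string ('s') arguments --
    enough to sketch what the capture module could send to the Max patch
    over localhost UDP."""
    def pad(b: bytes) -> bytes:
        # OSC strings are NUL-terminated and padded to a 4-byte boundary.
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)

    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, bool):
            raise TypeError("booleans are not handled in this sketch")
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        elif isinstance(a, str):
            tags += "s"
            payload += pad(a.encode())
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return pad(address.encode()) + pad(tags.encode()) + payload

# A hypothetical per-tweet message: a sequence number and the tweet text.
msg = osc_message("/saturation/tweet", 42, "hello")
```

In the actual piece this role is played by an OSC library on the Processing side and by Max's own UDP/OSC objects; the sketch only shows the framing the two ends agree on.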
They don't require a specific application license.
Basically, the capture module grabs data from the Twitter social network.
Each tweet is displayed on the screen as a very transparent layer. A single tweet on its own would barely be visible.
The more letters overlap, the more visible their overlapping parts become.
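This accumulation behaviour can be sketched with the standard "over" compositing formula: stacking n copies of a layer with opacity a yields an accumulated opacity of 1 - (1 - a)^n. The 2% per-tweet alpha below is an assumed value for illustration, not the one used in the actual sketch.

```python
def accumulated_opacity(alpha: float, layers: int) -> float:
    """Opacity after stacking `layers` copies of the same translucent
    layer with standard "over" compositing: 1 - (1 - alpha)^layers."""
    return 1.0 - (1.0 - alpha) ** layers

# With a hypothetical 2% alpha per tweet, one tweet is nearly invisible,
# but a pixel where a hundred glyphs overlap is strongly lit:
single = accumulated_opacity(0.02, 1)     # ~0.02
stacked = accumulated_opacity(0.02, 100)  # ~0.87
```

This is why one tweet alone stays near-black while dense overlaps brighten quickly: the accumulated opacity grows exponentially toward 1 with each extra layer.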
The sound is based on additive synthesis. A bank of 1024 oscillators is set up. At the launch of the piece, each oscillator's amplitude is zero: no sound is produced.
Progressively, as tweets are captured and displayed on the screen, each oscillator's amplitude is incremented, very slowly, one at a time.
Following the additive synthesis process, the result is a multitude of overlapping sine frequencies, the final result being noise (a noise can be represented by an infinite sum of pure sine tones).
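The fade from silence to noise can be sketched offline (in Python, standing in for the real-time Max6 bank): a bank of oscillators of which only the first `num_active` sound. The random frequency choices are a stand-in assumption; the actual tunings of the oscbank~ are not documented here.

```python
import math
import random

SR = 44100  # sample rate, arbitrary for this offline sketch

def render(num_active: int, num_osc: int = 1024, n_samples: int = 512):
    """Additive synthesis sketch: a bank of `num_osc` sine oscillators,
    of which only the first `num_active` have non-zero amplitude.
    Frequencies are random stand-ins for the real bank's tuning."""
    rng = random.Random(0)  # fixed seed so the sketch is reproducible
    freqs = [rng.uniform(40.0, 8000.0) for _ in range(num_osc)]
    out = []
    for n in range(n_samples):
        t = n / SR
        # Each active oscillator contributes one pure sine; dividing by
        # the bank size keeps the sum bounded as oscillators fade in.
        s = sum(math.sin(2 * math.pi * freqs[i] * t)
                for i in range(num_active))
        out.append(s / num_osc)
    return out

silence = render(0)    # start of the piece: every amplitude is zero
tone = render(1)       # a single oscillator: one pure sine
chaos = render(1024)   # full bank: the sum approaches broadband noise
```

With zero active oscillators the output is exact silence; with one, a pure tone; with all 1024, the uncorrelated sines sum to the dense, noise-like texture the piece ends on.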
The process itself illustrates the loss of meaning:
Indeed, each frequency listened to on its own sounds like a beautiful sine, a pure frequency; but the whole results in noise, pure chaos, having lost its initial meaning. By the same logic, while each tweet contains information, the accumulation of all tweets no longer means anything, and becomes a big grainy artwork.
How was MAX used?
Saturation's sounds are produced exclusively in real time by Max6 and an oscbank~ object. The Max patch receives OSC data from the Processing application and handles all sound generation.