WaveForm is a sculptural audio/visual environment created as my master’s thesis at USC, Columbia. It reacts to the high-frequency electromagnetic radiation emitted by wireless technologies such as cell phones, WiFi routers, and AM/FM radio. Its purpose is to encourage awareness of the immersive and material nature of the electromagnetic weather created by our wireless communication devices.
How was MAX used?
The first iteration of WaveForm uses Max/MSP 5 to perform FFT analysis and scaling on the incoming signals from six high-frequency radio receivers. The scaled and analyzed signals are then fed to the modulation frequencies of six small FM synth modules. The resulting FM tones are output through a six-channel interface to an array of six speakers, on top of which rests a tray of water. The tones from the Max patch form cymatic patterns in the water, which are projected as a shadowgraph on the gallery wall.

The second iteration of WaveForm uses Max/MSP as well as Jitter to perform the same scaling and FFT analysis on the signals, but uses the data to control sample playback, video feedback, and the movement of an OpenGL model. Max/MSP/Jitter was the sole language used to program each iteration of WaveForm.
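The analysis-to-synthesis chain described above (FFT analysis of a receiver signal, linear scaling of the result, and use of that value as an FM modulator frequency) can be sketched outside of Max as follows. This is a minimal illustration in Python, not the actual patch: the function names, scaling ranges, carrier frequency, and the synthetic "receiver" signal are all assumptions for demonstration.

```python
import numpy as np

SR = 44100  # sample rate (assumed)

def dominant_frequency(signal, sr=SR):
    # FFT analysis: return the frequency of the strongest bin in the input
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / sr)
    return freqs[np.argmax(spectrum)]

def scale(value, in_lo, in_hi, out_lo, out_hi):
    # Linear range mapping, analogous to Max's [scale] object
    value = min(max(value, in_lo), in_hi)
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def fm_tone(carrier_hz, mod_hz, index=2.0, dur=1.0, sr=SR):
    # Two-operator FM: carrier phase modulated by a sine at mod_hz
    t = np.arange(int(dur * sr)) / sr
    return np.sin(2 * np.pi * carrier_hz * t
                  + index * np.sin(2 * np.pi * mod_hz * t))

# Stand-in for one receiver channel: a 1200 Hz component plus noise
rng = np.random.default_rng(0)
n = 4096
receiver = (np.sin(2 * np.pi * 1200 * np.arange(n) / SR)
            + 0.1 * rng.standard_normal(n))

peak = dominant_frequency(receiver)                 # analyze
mod_freq = scale(peak, 0, SR / 2, 20, 400)          # map into a modulator range
tone = fm_tone(carrier_hz=110.0, mod_hz=mod_freq)   # synthesize one channel
```

In the installation this mapping runs continuously on six channels at once; the sketch shows a single block of one channel.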