For my final project, I want to develop an interactive environment built around the concept of particle systems. Ideally, I would leverage people's phones to generate sounds, using each device's accelerometer data to control pitch and volume. Each device (or tone) would be represented visually as a particle, either on a large screen or projected in the room. Beyond driving its own pitch and volume, each particle would experience gravity relative to the other particles. The result would be an audiovisual system that tends toward an average equilibrium, but with enough noisy input to be constantly evolving.
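To make the "gravity plus noisy input" idea concrete, here is a minimal, framework-agnostic sketch of the simulation core. This is a sketch under assumptions, not a committed design: the gravitational constant `G`, the softening term, and the `noise` parameter (a stand-in for jittery accelerometer input) are all hypothetical tuning choices, and the rendering/sound layers (p5.js, Web Audio, phone I/O) are left out entirely.

```javascript
// Hypothetical core loop: every particle attracts every other particle,
// and a noisy input term keeps the system from settling completely.
const G = 0.1; // gravitational constant (arbitrary tuning value)

function step(particles, dt, noise = 0) {
  // Reset and accumulate pairwise gravitational accelerations.
  for (const p of particles) { p.ax = 0; p.ay = 0; }
  for (let i = 0; i < particles.length; i++) {
    for (let j = i + 1; j < particles.length; j++) {
      const a = particles[i], b = particles[j];
      const dx = b.x - a.x, dy = b.y - a.y;
      const d2 = dx * dx + dy * dy + 1e-6; // softening avoids divide-by-zero
      const d = Math.sqrt(d2);
      const f = G / d2; // inverse-square attraction
      a.ax += f * dx / d; a.ay += f * dy / d;
      b.ax -= f * dx / d; b.ay -= f * dy / d;
    }
  }
  // Integrate velocities and positions, injecting random "accelerometer" noise.
  for (const p of particles) {
    p.vx += (p.ax + (Math.random() - 0.5) * noise) * dt;
    p.vy += (p.ay + (Math.random() - 0.5) * noise) * dt;
    p.x += p.vx * dt;
    p.y += p.vy * dt;
  }
}

// With noise = 0, two particles simply drift toward each other.
const ps = [
  { x: -1, y: 0, vx: 0, vy: 0, ax: 0, ay: 0 },
  { x:  1, y: 0, vx: 0, vy: 0, ax: 0, ay: 0 },
];
for (let t = 0; t < 100; t++) step(ps, 0.05);
```

In a p5.js version, `step()` would run once per `draw()` call, each particle's position would set its on-screen location, and its velocity or acceleration could modulate the pitch and volume of the tone assigned to that phone.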
I’m not sure which environment I would build this in, but most likely either p5.js with Node, or Max/MSP. I was inspired by the work of Tristan Perich and Ryoji Ikeda, in that complexity can be derived from the simplest of elements by creating systems.