an interactive media experiment exploring computer vision + auditory embodiment
An interactive media project investigating the influence of sound and visuals upon the ways we navigate and relate to space.
The software maps each participant’s position in a room to the pitch and modulation of synthesizers, which in turn drive visuals generated in TouchDesigner.
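The core position-to-sound mapping could be sketched roughly like this (a minimal Python sketch; the scale choice, function name, and normalized-coordinate input are assumptions for illustration, not the project's actual code):

```python
# Hypothetical sketch of the position-to-sound mapping described above.
# Assumes a normalized (x, y) position in [0, 1] coming from a computer-
# vision tracker; the real tracker and synth interface are not shown.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI notes, C4..C5 (assumed scale)

def position_to_params(x: float, y: float) -> tuple[int, float]:
    """Map a participant's room position to (MIDI pitch, modulation 0-1).

    x (left-right) picks a pitch from the scale; y (front-back) sets
    modulation depth. Both inputs are clamped to [0, 1].
    """
    x = min(max(x, 0.0), 1.0)
    y = min(max(y, 0.0), 1.0)
    idx = min(int(x * len(C_MAJOR)), len(C_MAJOR) - 1)
    return C_MAJOR[idx], y
```

For example, a participant at the far left of the room (`x = 0.0`) would land on the lowest note of the scale, and walking toward the back wall would deepen the modulation.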
The project grew out of curiosity about how audio might shape the way people interact with their surroundings and with fellow participants: do we tend toward harmony? How might this effect vary with volume, or with permanent objects establishing a “base tone”?
Do people tend to arrange themselves into chords? And at what point does this influence become conscious rather than unconscious?
Features that are currently working:
Features needing work:
There is another branch of this project, named 444, dedicated to a more critical exploration of computer vision through experimental art; feel free to give it a look!