An interactive video/sound installation, initially created in summer 2021 during the "Materialize!" course (https://www.hgb-leipzig.de/en/institution/eveningacademy) by Lea Petermann. It uses a webcam as a motion sensor: computer vision algorithms analyze movement in the camera image and mix it with the camera's amplified internal sensor noise. The resulting motion patterns are converted into audio signals whose rhythm, timbre, and frequency are directly shaped by the visual movements.
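The installation code itself is not reproduced here; the following is a minimal sketch of the analysis step described above, assuming Python with OpenCV (`cv2`) and NumPy. The split threshold, noise gain, and blend weight are illustrative values, not parameters from the piece.

```python
import cv2
import numpy as np

NOISE_GAIN = 8.0        # hypothetical gain for the sensor-noise branch
NOISE_THRESHOLD = 4.0   # hypothetical split between noise and real motion

cap = cv2.VideoCapture(0)                    # webcam as motion sensor
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY).astype(np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    diff = np.abs(gray - prev_gray)
    prev_gray = gray

    # Small inter-frame fluctuations are treated as sensor noise and
    # amplified; larger differences count as real-world movement.
    noise = np.where(diff < NOISE_THRESHOLD, diff * NOISE_GAIN, 0.0)
    movement = np.where(diff >= NOISE_THRESHOLD, diff, 0.0)
    mixed = np.clip(0.5 * movement + 0.5 * noise, 0, 255)

    # A single scalar that can drive rhythm, timbre, and frequency.
    activity = float(mixed.mean())
    print(f"activity: {activity:.2f}")

    cv2.imshow("mixed motion", mixed.astype(np.uint8))
    if cv2.waitKey(1) & 0xFF == 27:          # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```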
The video stream from the camera is duplicated. One part captures real-world movements, while the other captures random pixel fluctuations caused by digital noise within the camera sensor. The video algorithm merges these streams, reducing frame rate and resolution and blending both types of motion. This combined video stream is analyzed for movement patterns, and the resulting data are made audible. As in my project "W_Ferro_1", sound pulses are generated from the video data and directly reflect its rhythm. Viewers can simultaneously look out the window at the live scene and follow the generated video on the screen, with the sound played through headphones.
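As a rough sketch of how frame-rate and resolution reduction can feed pulse generation (under the same Python/OpenCV assumption as above, plus the `sounddevice` package for playback): the frame-skip factor, scale, base frequency, and trigger threshold below are illustrative choices, not values from the installation.

```python
import cv2
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44100
FRAME_SKIP = 4        # hypothetical frame-rate reduction factor
SCALE = 0.25          # hypothetical resolution reduction factor
BASE_FREQ = 110.0     # Hz, illustrative base pitch

def pulse(activity: float) -> np.ndarray:
    """Short percussive sine burst; pitch and loudness follow the motion level."""
    freq = BASE_FREQ * (1.0 + activity / 32.0)
    t = np.linspace(0, 0.08, int(SAMPLE_RATE * 0.08), endpoint=False)
    env = np.exp(-40.0 * t)                            # fast decay envelope
    amp = min(activity / 64.0, 1.0)
    return (amp * env * np.sin(2 * np.pi * freq * t)).astype(np.float32)

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev_small = cv2.cvtColor(cv2.resize(prev, None, fx=SCALE, fy=SCALE),
                          cv2.COLOR_BGR2GRAY).astype(np.float32)
i = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    i += 1
    if i % FRAME_SKIP:                                 # drop frames
        continue
    small = cv2.cvtColor(cv2.resize(frame, None, fx=SCALE, fy=SCALE),
                         cv2.COLOR_BGR2GRAY).astype(np.float32)
    activity = float(cv2.absdiff(small, prev_small).mean())
    prev_small = small
    if activity > 1.0:                                 # movement triggers a pulse
        sd.play(pulse(activity), SAMPLE_RATE)
```

Because each new pulse interrupts the previous one, the rhythm of the audio follows the rhythm of the reduced video stream directly, which is the coupling the description points at.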