Dancing Notes is a reactive installation in and on the bridge between the two Bakery Square buildings. The project uses two separate Google datasets and their associated technologies: the NSynth dataset and the Depth Image Encoding project. The installation spans three levels, each with its own mode of reaction.

On the underside of the bridge, a depth camera tracks the flow of traffic and transmits the data to a computer inside the building. The Google depth-image-encoding software processes the data into a depth-mapped, color-coded image, which is broken up and displayed as a pattern across the LED light tubes suspended from the underside of the bridge.

A very similar process occurs on the two interior floors, mapping the movement of people through the spaces to the light tubes strung above, but the depth cameras serve an additional function here. The carpet has been replaced with a custom design whose bands can be read as a musical staff. These staffs intersect and shear so that even a simple straight walk across the space produces a variety of notes and tones. When a worker walks through the space, each step within a staff registers as a note chosen from the NSynth dataset and is played through speakers set around the bridge. The result can be a relaxing, passive interaction, a few calming tones bouncing around the room as someone walks through, or an individual or a group attempting to recreate a symphony by dancing around the bridge.
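The step-to-note idea above can be sketched in a few lines. This is a hypothetical illustration only: the staff geometry (`line_spacing_m`, `staff_origin_m`), the pitch table, and the function names are assumptions, not the installation's actual implementation.

```python
# Hypothetical sketch: map a footstep's position across a floor staff
# to a pitch that could then be looked up in a note dataset such as
# NSynth. Geometry and pitch mapping are assumed for illustration.

STAFF_PITCHES = ["E4", "G4", "B4", "D5", "F5"]  # five lines of a treble staff


def step_to_pitch(y_m, staff_origin_m=0.0, line_spacing_m=0.3):
    """Return the pitch for a footstep y_m metres across the staff,
    or None if the step lands outside the staff's five lines."""
    idx = round((y_m - staff_origin_m) / line_spacing_m)
    if 0 <= idx < len(STAFF_PITCHES):
        return STAFF_PITCHES[idx]
    return None
```

A step landing near the third staff line, e.g. `step_to_pitch(0.61)`, snaps to the nearest line and returns `"B4"`; steps off the staff return `None`, so walking between staffs stays silent.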
https://magenta.tensorflow.org/datasets/nsynth
https://sites.google.com/site/brainrobotdata/home/depth-image-encoding
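The depth-image encoding referenced above can be illustrated with a generic packing scheme: storing a 16-bit depth value (in millimetres) across two 8-bit color channels. This is an assumption chosen for illustration, not necessarily the exact encoding the linked project uses.

```python
# Generic depth-to-RGB packing sketch (an assumption, not Google's
# documented scheme): 16-bit depth in millimetres, high byte in R,
# low byte in G, B unused.


def encode_depth_rgb(depth_mm):
    """Clamp depth to 16 bits and pack it into an (R, G, B) triple."""
    d = max(0, min(int(depth_mm), 0xFFFF))
    return (d >> 8, d & 0xFF, 0)


def decode_depth_rgb(rgb):
    """Recover the depth in millimetres from a packed (R, G, B) triple."""
    r, g, _ = rgb
    return (r << 8) | g
```

Because the packing is lossless within the 16-bit range, a round trip such as `decode_depth_rgb(encode_depth_rgb(1234))` returns the original `1234`.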