Process
The project was modeled after Blinking from Module 4. The challenge was to take that idea and put our own spin on it, improving on what the original group had built. For one, we decided to use automatic blink detection instead of having participants manually press something when they blinked, to streamline the experience. This way, participants can focus more on the eye contact they are making and less on when their counterpart is blinking.

We struggled to decide what sound to play (or whether to play a sound at all) when someone blinked. At first we considered reusing our idea from the wire tree installation, in which a new sound was layered on every time something happened (for the tree, the event was plugging in a phone; here, the event is someone blinking). That wouldn't have worked the same way, though, because blinks happen far more often, and layering a new sound on each one would quickly create a cluttered sound space. We also played with the idea of creating a visual effect on each blink, but decided that would be distracting and might encourage participants to break eye contact. Other details took time to settle as well, such as how far apart the two people would sit and how to mount the webcams.
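This writeup doesn't spell out how the automatic blink detection works, so the sketch below is only one way it could be done, not necessarily what we built: it uses OpenCV's bundled Haar cascades from a webcam feed and counts a blink whenever the eyes briefly disappear from the detected face region and then reappear.

```python
# Minimal blink-detection sketch (an assumed approach, not the project's actual code).
# A blink is counted when the eyes vanish from the face region for a few frames.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)     # the participant's webcam
eyes_missing_frames = 0       # consecutive frames with a face but no visible eyes
blink_count = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)

    if len(faces) > 0:
        x, y, w, h = faces[0]
        eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
        if len(eyes) == 0:
            eyes_missing_frames += 1
        else:
            # Eyes reappeared after a short gap: treat it as one blink.
            if 1 <= eyes_missing_frames <= 5:
                blink_count += 1
                print("blink", blink_count)  # this is where a sound or other response would trigger
            eyes_missing_frames = 0

    cv2.imshow("webcam", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The frame-count threshold (1 to 5 missing frames) is a guess at what separates a blink from someone looking away; in practice it would need tuning to the camera's frame rate and the participants' distance from the lens.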