Emotive Space

Made by Avanti Dabholkar

Created: December 10th, 2015



This project serves as the fourth prototype for an undergraduate architecture thesis titled “Emotive Space: Connecting Biometric Data to Dynamic Spatial Response.” The thesis pushes back against the tradition of responsive environments optimized for environmental efficiency or utilitarian user needs; instead, Emotive Space positions itself within the counter-tradition of installation work focused on human experience. This prototype uses real-time biometric data, in the form of a pulse sensor, to manipulate the animation of three light-reflecting mirrors: the pulse rate is mapped to the speed and the rotation distance of each mirror. The purpose of this prototype was to test the effectiveness of using biometric feedback to create a dynamic spatial representation of mood. Ultimately, this system is imagined at an architectural scale.



Since this was primarily a conceptual prototype built with a specific goal, the system was built at a reduced scale with three mirrors, each 3” in diameter. Each animated mirror uses a pan-tilt system to achieve two axes of rotation, chosen so that the light reflections could move around a space organically. The pan-tilt system for each mirror uses two micro servos, for a total of six. Although these servos proved to be both small and strong, they produce a noisy byproduct that does not align well with the desired experience. A pulse sensor was chosen because it provides an intuitive correlation between biometric data and emotion.

The software uses two thresholds to break the data into low, medium, and high pulse ranges. Low pulse rates are accompanied by a slow animation; medium pulse rates are associated with slow, synchronized movement across the mirrors; and high pulse rates are correlated with rapid, individual movements. The animation speed is designed to visually represent the emotional state of the occupant. The medium heart rate range was hard to trigger during testing, which is why the video only demonstrates two states. Further iterations would consider implementing multiple biometric sensors in order to validate the emotional state of occupants before generating a spatial response.



The prototype successfully converts pulse data (BPM) into a spatial light animation. However, using biometric data to drive the animation required a lot of debugging and refinement. Because the pulse sensor's logic required 3.3V while the PWM driver that controlled the micro servos required 5V, I needed to add a level shifter; without it, the pulse sensor returned nonsensical data (2000-3000 BPM). I also ran into obstacles when trying to smooth the pulse data, and ended up using the smoothing algorithm from the Arduino website. In terms of the mirror animation, the prototype has some limitations. With more time, the animations would have been refined to more accurately convey the emotion implied by a high, medium, or low pulse rate. Another shortcoming is that the prototype does not allow multiple pulses to affect the system. Adding more pulse sensors would prompt me to also design how local and global space would be shaped by real-time biometric data.


Photo Documentation