The prototype successfully converts pulse data (BPM) into a spatial light animation. However, using biometric data to drive the animation required a lot of debugging and refinement. Because the pulse sensor's logic runs at 3.3V while the PWM driver controlling the micro servos runs at 5V, I needed to add a level shifter; without it, the pulse sensor returned nonsensical readings (2000-3000 BPM). I also ran into obstacles when trying to smooth the pulse data, and ended up adapting the smoothing algorithm from the Arduino website.

In terms of the mirror animation, the prototype has some limitations. If time had permitted, I would have refined the animations to more accurately convey the emotion implied by a high, medium, or low pulse rate. Another drawback is that the prototype does not allow multiple pulses to affect the system. Adding more pulse sensors would also prompt me to design how local and global space are shaped by real-time biometric data.
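For reference, the smoothing approach from the Arduino tutorial is a simple moving average over a ring buffer of recent readings. The sketch below is a plain C++ version of that idea, not the project's actual firmware; the buffer size, the BPM thresholds, and the three mood labels are illustrative assumptions standing in for the high/medium/low animation states described above.

```cpp
#include <array>
#include <cstddef>
#include <string>

// Ring-buffer moving average, the same idea as the Arduino "Smoothing"
// tutorial: keep the last N readings and report their mean. A spike like
// a single 2000-3000 BPM glitch gets averaged down instead of jerking
// the servos.
template <std::size_t N>
class Smoother {
public:
    int add(int reading) {
        total -= readings[index];       // drop the oldest reading
        readings[index] = reading;      // store the new one
        total += reading;
        index = (index + 1) % N;        // advance around the ring
        if (count < N) ++count;         // buffer fills up on early reads
        return total / static_cast<int>(count);  // current average
    }

private:
    std::array<int, N> readings{};  // zero-initialised history buffer
    int total = 0;                  // running sum of buffered readings
    std::size_t index = 0;
    std::size_t count = 0;
};

// Illustrative thresholds (not taken from the project) mapping a
// smoothed BPM onto the three animation moods.
std::string mood(int bpm) {
    if (bpm < 70)  return "low";
    if (bpm < 100) return "medium";
    return "high";
}
```

On an Arduino, `add()` would be called once per sample inside `loop()`, and the returned average fed to whatever maps BPM to servo positions.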