How They Perceive Us
When it comes to the internet and our smart home devices, “consent” is a vague umbrella term, one that is often compromised for the sake of convenience and intelligence. Our everyday interactions with our devices seem harmless until the devices have learned so much about us that they can infer, or even steer, our next move. Algorithms are designed to incentivize this interaction through targeted ads and suggestions that personalize one’s experience. This data collection is the spooky fact behind our devices that everyone knows, but is knowledge the same as consent? So, how are our devices collecting our personal data, what exactly can they find out about us, and ultimately, how do our devices perceive us?
With the pervasiveness of technology and our increasing dependence on our devices, people are losing the distinction between the digital realm and the physical world, especially when the digital realm is personalized and smart. The first tactic our devices use to keep us engaged is psychological stimulus. Algorithms exploit this psychological leverage to encourage interaction with our devices in order to collect, store, and manipulate our personal data. The separation anxiety many adults feel toward their devices is the product of dopamine-driven reward circuitry, which companies use to keep them online. “Adults in the US spend an average of 2-4 hours per day tapping, typing, and swiping on their devices—that adds up to over 2,600 daily touches.” Within these interactions, personal data is collected through features such as autofill, cookies, and search history. Large platforms such as Facebook and Google may even know users better than they know themselves.
When buying, interacting with, or neglecting our devices, we have the right to understand the extent to which our information is being collected and used. A great deal of technology’s spookiness comes from its abstractness and intangibility. Yet as new devices and software enter the market at alarming rates, have we lost the chance to step back and analyze them? New technology is not only an opportunity for the future, but an opportunity to reflect on the past and present. As technology continues to advance, it has become ever more important to understand the ethics of machine learning and its applications in consumer devices.
Title: How They Perceive Us
Credits: Kimberlyn Cho (B.Arch ‘22), Rio Pacheco (BSECE ‘24)
Much of what makes technology spooky lies in its abstractness and intangibility. How do our devices perceive us through our day-to-day interactions? Furthermore, do our smart home devices know us better than we know ourselves?
Data analytics is a powerful tool that collects a user’s personal data to infer their preferences and needs, providing each user with a highly personalized experience. From Spotify’s ‘discover’ feature to Netflix’s ‘recommended for you’ options, technology may know us better than we know ourselves. These online interactions carry over into the offline world, a shift most evident in today’s generation.
How They Perceive Us is a commentary on how our devices learn, infer, and affect us through everyday human-machine interactions. This exhibit uses the discretion and vulnerability we’ve internalized with our devices to materialize a receipt of those interactions. By printing a physical representation of one’s digital profile, the experience aims to provoke the discomfort of coming face to face with the AI version of you.
By drawing on the vulnerability of a bathroom and the intimacy of looking at one’s reflection, this exhibition is meant to catch visitors in their most unassuming and candid state. The discomfort is meant to raise suspicion about the accuracy of one’s digital profile, and to invite speculation about the ways we passively consent to extraction, perception, and persuasion by our devices. Furthermore, we hope the printout serves as a reflection on how our digital profile is unconsciously embodied by us physically, mentally, and emotionally in real time.
Initially, we worked on getting the Python script to print to the thermal printer. Since thermal printers are not natively supported in Python on the Raspberry Pi, we used a library called escpos (python-escpos). This library allowed us to communicate with the thermal printer directly from Python code and to print both images and text.
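Under the hood, the escpos library wraps the ESC/POS command protocol that these receipt printers speak. The sketch below builds a minimal raw ESC/POS byte stream by hand to show what the library is doing; the printer model, and the vendor/product IDs in the commented library call, are assumptions for illustration.

```python
# Rough sketch of the ESC/POS commands that the escpos library wraps.
# Byte values follow the public ESC/POS command reference.

ESC = b"\x1b"
GS = b"\x1d"

def build_receipt(text: str) -> bytes:
    """Build a minimal raw ESC/POS byte stream for a text-only receipt."""
    data = ESC + b"@"                          # ESC @ : initialize printer
    data += text.encode("ascii", errors="replace")
    data += b"\n\n\n"                          # feed paper past the tear bar
    data += GS + b"V\x00"                      # GS V 0 : full paper cut
    return data

# With python-escpos, the equivalent is roughly:
#   from escpos.printer import Usb
#   p = Usb(0x0416, 0x5011)   # hypothetical USB vendor/product IDs
#   p.text("hello\n")
#   p.image("face.jpg")
#   p.cut()
```

In practice the library handles image dithering and width conversion as well, which is why we used it rather than raw bytes.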
Detecting faces and printing images
The next step was to get our Raspberry Pi to interact with the webcam we were using to capture images. Since Raspbian does not ship with a convenient capture tool for USB webcams, we used a command-line utility called fswebcam, invoked from our Python script. This not only allowed us to capture images from the webcam but also to set the resolution so that images would print nicely on the thermal printer.
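A minimal sketch of how the script can shell out to fswebcam, assuming a 384-dot-wide thermal printer (a common 58 mm model width; the exact printer is an assumption):

```python
import subprocess

def build_capture_cmd(path="frame.jpg", width=384, height=288):
    """Assemble the fswebcam command line. 384 px wide is chosen to
    match a typical 58 mm thermal printer's dot width (an assumption)."""
    return [
        "fswebcam",
        "-r", f"{width}x{height}",  # capture resolution
        "--no-banner",              # drop fswebcam's timestamp banner
        path,
    ]

def capture(path="frame.jpg"):
    # Runs on the Pi; requires fswebcam (sudo apt install fswebcam).
    subprocess.run(build_capture_cmd(path), check=True)
```

The `--no-banner` flag matters for this use: fswebcam otherwise stamps a timestamp bar onto every frame, which would appear on the printed receipt.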
To detect faces we used the Python library OpenCV. The script was set up to take images continuously; each image is scanned with OpenCV to check whether a face is present, and if so, the image is printed from the thermal printer.
Data collection and analysis
Once we could reliably detect faces and print the captured images, we looked for ways to extract data from them. A public API called Hydra AI was one of the more affordable options we found. This API allowed us to detect age, gender, mask presence, and emotions. After implementing this we ended up with the result shown above.
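The shape of the API response below is a hypothetical example built only from the fields the text names (age, gender, mask presence, emotions); Hydra AI's actual JSON layout may differ. What matters for the exhibit is flattening that analysis into printable receipt lines:

```python
# Hypothetical response shape, not Hydra AI's documented format.
sample_response = {
    "age": 27,
    "gender": "female",
    "mask": False,
    "emotions": {"happy": 0.62, "neutral": 0.25, "surprised": 0.09, "sad": 0.04},
}

def summarize(resp):
    """Flatten the face analysis into printable receipt lines,
    emotions sorted from strongest to weakest."""
    lines = [
        f"age: {resp['age']}",
        f"gender: {resp['gender']}",
        f"mask: {'yes' if resp['mask'] else 'no'}",
    ]
    for name, score in sorted(resp["emotions"].items(), key=lambda kv: -kv[1]):
        lines.append(f"{name}: {score:.0%}")
    return lines
```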
Creating a story
Our final revision of the printout included comments based on the person's emotions as well as a randomly chosen feedback sentence. If any of the person's emotions scored above 20%, the printout added a small comment below the portion of the receipt showing that emotion. We also wrote five possible responses, one of which is printed at the bottom of the receipt. This made the receipt feel more personal, as if the machine were talking to the person rather than just handing them information.
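The comment logic above can be sketched directly. The threshold and the 20%-rule come from the text; the specific comment and feedback strings are placeholders, since the exhibit's actual copy isn't reproduced here:

```python
import random

# Placeholder wording; the exhibit's real comments are not quoted in the text.
EMOTION_COMMENTS = {
    "happy": "you seem pleased with what you see.",
    "sad": "rough day? your face gives it away.",
    "surprised": "didn't expect to meet yourself here?",
}

FEEDBACK = [
    "see you tomorrow.",
    "i'll remember this.",
    "you looked different yesterday.",
    "thanks for the data.",
    "come back when you're ready.",
]

def receipt_comments(emotions, threshold=0.20, rng=random):
    """Comment on every emotion above the 20% threshold, then append
    one randomly chosen closing line from the five responses."""
    lines = [EMOTION_COMMENTS[name]
             for name, score in emotions.items()
             if score > threshold and name in EMOTION_COMMENTS]
    lines.append(rng.choice(FEEDBACK))
    return lines
```

Keeping the random choice injectable via `rng` makes the behavior easy to test while still feeling unpredictable to visitors.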
Hiding the camera