
Outcome


[Image: IMG_9610]

Credits

This is a project built by Angie Wang and Anya Singhal.

Thanks to:

Daragh Byrne, our instructor, for introducing the theme of spooky technologies and the 'haunted house' and for leading us to think about its deeper meanings. We also appreciate him inviting many professors and professionals to our critique and the exhibition, which gave us a great deal of valuable feedback.

Zhenfang Chen, our TA, for providing technical support and useful feedback. 

The lab assistants in TechSpark, for providing technical help.

We also appreciate all professors and professionals who discussed our project with us during the critique and exhibition. 

Description


Intentions

In the information age, it is common knowledge that smart devices often collect and even steal personal information. However, the use of various AI models in everyday software and devices is not always apparent to users, who may therefore not realize they are being manipulated by AI. For instance, some software only presents content it predicts will interest the user. Prolonged human-machine interaction can make individuals narrow-minded and increasingly dependent on machines because of this influence. The project aims to highlight the risk of manipulation by AI and the narrow-mindedness and dependence on devices that can result.

The big idea behind the project is to raise awareness about the potential negative impacts of prolonged human-machine interaction, particularly with the increasing use of AI models in everyday software and devices. The motivation behind the project is to address the growing concern about the use of AI in everyday life and its potential impact on society. With the increasing ubiquity of smart devices and the exponential growth of AI, it is crucial to raise awareness about the potential risks and encourage individuals to take proactive steps to protect themselves.

Ultimately, the project is driven by a desire to empower individuals to make informed decisions about their use of technology and ensure that they are not unwittingly subject to manipulation by AI. The project aims to foster a more responsible and mindful approach to technology that benefits individuals and society by promoting greater understanding and awareness.

[Image: IMG_9379]

At the Exhibition...

This project comprises three distinct parts that work together to create an interactive and thought-provoking experience for the audience.

I. Stay There: Phone

"Oh, hello there!

Make sure to grab your phone! Can’t leave without it..."

The first part involves a phone provided to each visitor upon entering the space. The phone screen displays a message stating that "the connection between the mobile device and the smart home server has failed," raising questions about the reliability and security of smart home technology.

[Image: IMG_9995]

II. Stay There: Eye

"I am watching you. Stay there.

If you leave, I will warn you: dangerous things will happen. Things that will make you scared. Uncomfortable.

You know I exist. I have your focus. I watch everything. Just focus on me and don’t stray."

In the second part of the project, visitors encounter a large mechanical eyeball that tracks their movements as they walk around the room. When visitors stand in the location designated by the machine, the eyeball turns blue, indicating normal operation. However, if visitors attempt to leave the room or move towards the exit or entrance, the eye turns red, warning them not to proceed.
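
To make this behaviour concrete, here is a minimal sketch of the colour logic, assuming the visitor's position is checked against a rectangular 'stay' zone and the glow comes from the NeoPixel ring listed in our Bill of Materials; the pin number, LED count, and zone boundaries are made-up values, not our exact firmware.

```cpp
// Illustrative colour logic only, not our exact firmware.
#include <Adafruit_NeoPixel.h>

const int RING_PIN   = 6;    // assumed data pin for the NeoPixel ring
const int RING_COUNT = 24;   // assumed LED count
Adafruit_NeoPixel ring(RING_COUNT, RING_PIN, NEO_GRB + NEO_KHZ800);

// Designated "stay" zone in room coordinates (centimetres, assumed values).
const float ZONE_X_MIN = 150, ZONE_X_MAX = 350;
const float ZONE_Y_MIN = 100, ZONE_Y_MAX = 300;

void setRingColor(uint8_t r, uint8_t g, uint8_t b) {
  for (int i = 0; i < RING_COUNT; i++) {
    ring.setPixelColor(i, ring.Color(r, g, b));
  }
  ring.show();
}

// Called whenever a new (x, y) position arrives from the UWB tag.
void updateEyeColor(float x, float y) {
  bool inZone = (x >= ZONE_X_MIN && x <= ZONE_X_MAX &&
                 y >= ZONE_Y_MIN && y <= ZONE_Y_MAX);
  if (inZone) setRingColor(0, 0, 255);   // blue: normal operation, stay there
  else        setRingColor(255, 0, 0);   // red: warning, visitor heading for the exit
}

void setup() {
  ring.begin();
  ring.show();   // start with the ring off
}

void loop() {
  // In the installation, updateEyeColor() runs on every position update.
}
```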

[Image: IMG_1084]

III. Stay There: Screen

"Oh whoops! Ignore us, just turn back around and go about your life. 

  We know everything, but you don’t need to know that."

Finally, the third part of the project consists of an electronic screen displaying an array of eyeballs that follow visitors' movements as they move around the room. This creates a sense of being watched and monitored by smart devices that people are normally unaware of, raising questions about privacy and surveillance in modern society.

[Image: IMG_1092]
[Image: Screen Shot 2023-05-09 at 3.00.46 PM]

Process


Throughout the development of this project, we went through multiple iterations and refined our approach to improve the final outcome. The project worked by creating a grid with four UWB (Ultra-Wideband) chips placed in the corners of the room, which exchange signals with a handheld chip. The handheld chip sends its location within the grid to the Arduino Nano, and the motor and the eye drawn in P5, both controlled by the Arduino Nano, move according to the received location data. Our process can be broken down into three key parts.
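
As a rough sketch of this data flow (not our exact firmware), the loop below assumes the handheld tag delivers its position to the Nano as a newline-terminated "x,y" string over a serial link, and maps the x coordinate onto the servo that turns the mechanical eye; the pin, baud rate, and room dimensions are made-up values.

```cpp
// Illustrative sketch, not our exact firmware.
// Assumes the handheld UWB tag delivers its position to the Nano as a
// newline-terminated "x,y" string (centimetres) over the serial port.
#include <Servo.h>

Servo eyeServo;                      // servo that turns the mechanical eye
const int   SERVO_PIN     = 9;       // assumed wiring
const float ROOM_WIDTH_CM = 500.0;   // assumed room width

void setup() {
  Serial.begin(115200);              // assumed baud rate of the UWB link
  eyeServo.attach(SERVO_PIN);
}

void loop() {
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');   // e.g. "230,145"
    int comma = line.indexOf(',');
    if (comma > 0) {
      float x = line.substring(0, comma).toFloat();
      float y = line.substring(comma + 1).toFloat();   // used for the colour zones
      // Sweep the eye across the room: map x onto the servo's 0-180 degree range.
      long angle = constrain(map((long)x, 0, (long)ROOM_WIDTH_CM, 0, 180), 0, 180);
      eyeServo.write((int)angle);
      (void)y;
    }
  }
}
```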

I. Stay There: Phone

The first part involved producing the 'cell phone' device. We laser-cut and 3D-printed the phone shell and installed the UWB receiver chip inside it for positioning. During the exhibition, however, we encountered issues with UWB signal reception, which forced us to repeatedly disassemble the phone and reset the chip to reconnect the signal. Despite this setback, we resolved the issue and ensured the device functioned as intended.

Additionally, we designed the screen cover with a message reading "Welcome Home! The connection between the mobile device and the smart home server has failed," implying that the phone is not communicating with anything in the room. However, the phone is actually communicating with our 'smart home'. We wanted audiences to feel a sense of insecurity even while being told that they are safe.

[Image: IMG_9990]

II. Stay There: Eye


We went through a thorough process to create the mechanical eye for our project.

Our first step was to 3D print the shell of the eye, but we quickly realized it wasn't visible enough in the room. To address this, we scaled the model up and printed it in two parts, which took a total of three days (34 hours of printing, plus two failed attempts that required troubleshooting) to get the desired result.

Next, we experimented with different materials and spheres for the eye until we discovered a heat sink in the IDeATe lab, which we decided to use as the pupil. However, the heat sink's weight posed a challenge, as the servo motor we used to control the rotation was relatively light and small. We sought assistance from the lab assistant at TechSpark, and after some punching, sanding, and gluing, we finally managed to assemble the eye.

The final step was assembling the eye at the exhibition. We faced some difficulty attaching the eyes to the motor on the exhibition table because of their weight, so we decided to glue them to a black acrylic plate to keep them upright. During the process, we came up with the idea of using hot glue to make it look as if the eyes had legs and were trying to climb off the table, giving them a more "animistic" appearance.

[Image: IMG_1114]

III. Stay There: Screen

Our project's third and final part involved creating a screen of 20 eyeballs using P5.js. We received feedback during the critique phase suggesting we make the eyeballs more realistic, but we ultimately decided against it. We intended to represent intelligent devices that go unnoticed daily, always watching us but not necessarily as obvious as a mechanical eye. Therefore, we wanted to distinguish them from the realistic mechanical eye we had created earlier.

One of the challenges in this part was making the pupils of the eyes follow the audience's movements and turn accordingly. This involved connecting P5 and the Arduino Nano through serial communication and ensuring that the Arduino Nano received data from the UWB sender, which it used both to drive the motor that rotates the mechanical eyeball and to steer the P5 eyeballs. However, the connection was unstable throughout most of the exhibition, requiring us to restart the chip constantly.
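
For reference, here is a minimal sketch of the Arduino side of that serial link, assuming a simple newline-delimited "x,y" format that the P5 sketch reads (for example via the p5.serialport library or the Web Serial API) and converts into pupil angles with atan2; the baud rate and update rate are assumptions, not our exact code.

```cpp
// Illustrative Arduino-side bridge to the P5.js sketch, not our exact code.
// visitorX / visitorY are assumed to be updated elsewhere from the UWB data.
float visitorX = 0.0;
float visitorY = 0.0;

void setup() {
  Serial.begin(9600);   // assumed baud rate of the serial link to P5
}

void loop() {
  // Send "x,y\n" about 20 times per second. On the P5 side, each drawn
  // eyeball can turn its pupil toward the point using
  //   angle = atan2(y - eyeY, x - eyeX).
  Serial.print(visitorX);
  Serial.print(',');
  Serial.println(visitorY);
  delay(50);
}
```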

[Image: IMG_1117]

Build

Bill of Materials

  • Arduino Nano
  • UWB Devices (Sender and Receiver)
  • P5.js + Display Monitor
  • Servo Motor
  • Heat Sink
  • NeoPixel Ring
  • 3D Printing + Laser Cutting
[Image: IMG_1116]
[Image: IMG_1090]

Diagram

As mentioned, the project worked by creating a grid with four UWB (Ultra-Wideband) chips placed in the corners of the room, which exchange signals with the handheld chip. The handheld chip sends its location within the grid to the Arduino Nano, and the motor and the eye drawn in P5, both controlled by the Arduino Nano, move according to the received location data. Here is a diagram of how the UWB locators, the Arduino Nano board, the 'phone' (the handheld UWB receiver), and the P5 screen communicate with each other while the user holds the 'phone' and walks around the room.
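
Although our handheld chip reported its location directly, it may help to see the geometry behind such a grid: the sketch below shows one standard way a 2D position can be recovered from ranges to four fixed corner anchors using a linear least-squares trilateration. The anchor coordinates and range values are made-up numbers for illustration only.

```cpp
// Illustrative 2D trilateration (standard least-squares linearisation).
// Assumes four corner anchors at known positions and a measured range to
// each; all coordinates and ranges below are made-up example values.
#include <cstdio>

struct Point { double x, y; };

// Estimate the tag position from four anchors and four ranges.
Point trilaterate(const Point a[4], const double d[4]) {
  // Subtracting the first range equation from the others gives a linear
  // system A * [x y]^T = b; solve its 2x2 normal equations directly.
  double ata[2][2] = {{0, 0}, {0, 0}};
  double atb[2] = {0, 0};
  for (int i = 1; i < 4; ++i) {
    double Ax = 2.0 * (a[i].x - a[0].x);
    double Ay = 2.0 * (a[i].y - a[0].y);
    double b  = d[0] * d[0] - d[i] * d[i]
              + a[i].x * a[i].x - a[0].x * a[0].x
              + a[i].y * a[i].y - a[0].y * a[0].y;
    ata[0][0] += Ax * Ax;  ata[0][1] += Ax * Ay;
    ata[1][0] += Ay * Ax;  ata[1][1] += Ay * Ay;
    atb[0]    += Ax * b;   atb[1]    += Ay * b;
  }
  double det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0];
  return { (atb[0] * ata[1][1] - atb[1] * ata[0][1]) / det,
           (atb[1] * ata[0][0] - atb[0] * ata[1][0]) / det };
}

int main() {
  // Anchors in the corners of a 5 m x 4 m room; tag standing near (2, 1).
  Point  anchors[4] = {{0, 0}, {5, 0}, {5, 4}, {0, 4}};
  double ranges[4]  = {2.24, 3.16, 4.24, 3.61};
  Point p = trilaterate(anchors, ranges);
  std::printf("estimated position: (%.2f, %.2f)\n", p.x, p.y);
  return 0;
}
```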

[Image: System communication diagram (Screen Shot 2023-05-09 at 4.58.24 PM)]

The UWB transmitter operates by emitting numerous pulses across a broad range of frequencies. A corresponding receiver picks up the signal and decodes the data by detecting a known sequence of pulses from the transmitter. These pulses are sent approximately once every two nanoseconds. Below is a diagram of how UWB works, from how2electronics.com.
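
For a sense of the arithmetic, a range measurement comes from pulse timing: in a two-way ranging scheme the distance is roughly d = c * (t_roundtrip - t_reply) / 2. A toy calculation with made-up timing values:

```cpp
// Toy two-way-ranging arithmetic with made-up timing values.
#include <cstdio>

int main() {
  const double c = 299792458.0;      // speed of light, m/s
  double roundTripNs  = 46.7;        // measured round-trip time, ns (made up)
  double replyDelayNs = 20.0;        // responder's fixed reply delay, ns (made up)
  double tofSeconds = (roundTripNs - replyDelayNs) * 1e-9 / 2.0;
  double distanceM  = c * tofSeconds;
  std::printf("estimated distance: %.2f m\n", distanceM);   // about 4 m
  return 0;
}
```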

[Image: UWB operation diagram from how2electronics.com (Screen Shot 2023-05-09 at 5.16.04 PM)]