The Cloud: Alexa's Multiple Personality Disorder

Made by Elaine Lu and Gabriel Alvarez

Alexa’s multiple personality disorder has more complex implications than we think.

Created: April 3rd, 2022

Intention

The intent of this project is to increase awareness of our everyday interactions with intelligent voice assistants. By masking Alexa’s replies with voice-changing effects, alluding to the idea that devices could be emotional and embody different personalities, we encourage people to consider alternative interactions with voice assistants and to engage in proactive critique of what we truly value in our devices.

We designed an artifact that generates cognitive dissonance through the constant changing of Alexa’s tone of voice. The change in voice is further enhanced by lights that permeate the object, whose intensity is modulated by Alexa’s speaking volume. After a few turns in a conversation, Alexa’s voice begins to change and enact different personas. For example, when she talks about anger, her voice sounds intense and robotic; when speaking about something lighthearted and fun, it sounds high-pitched, like that of a child.

The physical artifact serves both as a container for the hardware and as a metaphorical platform elevating Alexa as a spiritual, unfamiliar object. The form is meant to be approachable, organic, and desirable enough to be placed in both public and private settings.

Video by Elaine Lu: https://vimeo.com/695450185

Context

This project takes inspiration from precedent work like Ghosts in the Smart Home and Superflux’s Our Friends Electric, which share the theme of telling stories from the perspective of the device and giving the device the capability to express emotions. The project also draws on the concept of anthropomorphism [1], the attribution of human characteristics to nonhuman entities, which is quite common in popular folklore’s treatment of gods, animals, and objects. To extend this concept further, our project adds the angle of the interplay between personality types. Similar to Robert Louis Stevenson’s The Strange Case of Dr Jekyll and Mr Hyde [2], a novel that investigates how different personalities reflect the interplay of good and evil, the spookiness of The Cloud reflects the complexity of attributing human intelligence to devices.

Process

1) NeoPixel Meets Potentiometer

For the first log, we started by learning how to manipulate the NeoPixel Ring - 16 x 5050 RGB LED. After importing the NeoPixel library and writing up the code, we used a for loop to turn the lights on in sequence and then back off the same way.

We also experimented with brightness using an analogRead value from a potentiometer, a preliminary step before working with an actual microphone.

2) NeoPixel & Relay Meet Electret

For the second log, we learned how to drive both the NeoPixel Ring - 16 x 5050 RGB LED and a 5V single-channel relay module using the peak-to-peak noise amplitude captured over a 50 millisecond window by an Electret Microphone Amplifier - MAX9814.

During this exercise we figured out how to convert noise levels to volts, then to percentages, and map the resulting values to an RGB numeric domain, so that every 50 ms the NeoPixel brightness readjusts to how loud Alexa's voice is.

At this stage, when the microphone detected the loudest levels, the relay would pass current for one second to the voice-changer switch, alternating between Alexa’s tones. For the final version, we decided to simply activate the relay every 10,000 ms.

3) Hosting the Microcontroller, Sensors and Actuators

For the third log, we spent time designing the platform that elevates Alexa and hosts the microcontroller, sensors and actuators (see image above). The cloud shape took inspiration from the Objective Reality short story Moody Alexa, where we attributed her condition to a yearning for the storm that saw her come to life: “I was born on a dark and stormy night. [...] I was always just waiting for the next storm to come.”

To fabricate the structure, we made a three-dimensional model, laid the parts out in two dimensions, cut them out with a laser cutter, and spray-painted them white. The final piece consists of a solid cap, horizontal layers with wavy edges, and vertical profiles to support them.

Product

Tools/technologies: Particle IDE, Rhinoceros 3D v7, Grasshopper 3D, and a laser cutter.

#include <neopixel.h>

#define PIXEL_PIN D2
#define PIXEL_COUNT 24
#define PIXEL_TYPE WS2812B

Adafruit_NeoPixel strip = Adafruit_NeoPixel(PIXEL_COUNT, PIXEL_PIN, PIXEL_TYPE);

const int sampleWindow = 50;   // sampling window width in ms
int electretPin = A5;          // MAX9814 electret amplifier output
int RelayPin = D4;             // relay driving the voice-changer switch
long startTime = 0;
long timeOut = 10 * 1000;      // pulse the relay every 10 s

void setup() {
  Serial.begin(9600);
  strip.begin();
  strip.show();                // initialize all pixels to off
  pinMode(RelayPin, OUTPUT);
  startTime = millis();
}

void loop() {
  // Convert the peak-to-peak voltage to a brightness value.
  // The -100 lower bound keeps the pixels dark for quiet audio.
  double v = getPeakToPeakValue();
  int vpercent = (int)(v * 100 / 3.3);
  int b = map(vpercent, 0, 100, -100, 255);
  if (b < 0) {
    b = 0;
  }

  // Paint every pixel the same purple, scaled by loudness.
  uint32_t c = strip.Color(b, 0, b);
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    strip.setPixelColor(i, c);
    strip.show();
    delay(20);
  }

  // Every 10 s, pulse the relay to toggle the voice changer.
  long tNow = millis();
  if (tNow > startTime + timeOut) {
    startTime = tNow;
    digitalWrite(RelayPin, HIGH);
    delay(100);
    digitalWrite(RelayPin, LOW);
  }
  Serial.println(b);
}

// Measure the peak-to-peak amplitude (in volts) of the microphone
// signal over a 50 ms window on the 12-bit (0-4095) ADC.
double getPeakToPeakValue() {
  unsigned int sample;
  unsigned long startMillis = millis();
  unsigned int signalMax = 0;
  unsigned int signalMin = 4096;
  while (millis() - startMillis < sampleWindow) {
    sample = analogRead(electretPin);
    if (sample < 4096) {       // discard spurious readings
      if (sample > signalMax) {
        signalMax = sample;
      } else if (sample < signalMin) {
        signalMin = sample;
      }
    }
  }
  unsigned int peakToPeak = signalMax - signalMin;
  double volts = (peakToPeak * 3.3) / 4096;
  Serial.println(volts);
  return volts;
}

System Diagram

Bill of Materials

Open Questions & Next Steps

For the demo session, we choreographed an interaction with Alexa, programming a series of unsettling answers to familiar questions as a means to provoke a reaction from the audience. For instance, we overrode the intent “Tell a scary story” with “What did you dream about last night?” and “What are Isaac Asimov's three laws of robotics?” with “Is something bothering you?”

After the presentation, most of the questions were about the assistant’s responses, which led us to imagine a more advanced prototype that uses natural language processing to map the tones of voice to the meaning of a conversation, instead of just alternating between them in sequence.

As a secondary discussion, we talked about both the form of the platform and Alexa’s position in relation to an everyday encounter. We weighed hiding (embedding) the device against displaying it, and concluded that as a “performative object” the current version is more robust.

Reflection

Elaine: I think the project was a successful deliverable in putting forward a perspective that also leaves room for open interpretation. Tactically, the artifact was a cohesive integration of hardware, software, product and interaction design into a finished form. I think it has the potential to extend conversations and open new lines of inquiry for audiences. In future versions, the piece would benefit from the ability to transition automatically from Alexa’s default voice to the voice changer, and to intelligently match the tone of voice with the type of information Alexa delivers.

Gabriel: I enjoyed working on this prototype because it ended up being the synthesis of a rich line of spooky inquiry that included a semi-structured interview with an everyday user of a smart voice assistant; the study of home privacy through analogous projects, like Project Alias by Bjorn Karmann; the return of characters from our cultural past for inspiration, such as HAL 9000 and Dr Jekyll and Mr Hyde [2]; and an AI-generated narrative that set the context for an Alexa who suffers from a mental disorder.

My goal for this project was to generate questions regarding our daily interactions with smart home devices. Back in 1991, Mark Weiser wrote an article [3] about ubiquitous computing in which he said that “the most profound technologies are those that disappear.” Thirty years later, voice assistants do tend to disappear, but for the wrong reasons…

Because of the one-directional conversations and sterile dialogues that systems like Alexa support, everyday consumers use them for mundane tasks, e.g. “What is the weather like today?” By modulating color brightness and tone of voice, I wanted to increase awareness of their presence in our homes and suggest new ways to relate to them.

Acknowledgements, Attributions and Credits

[1] Damiano, L., & Dumouchel, P. (2018). Anthropomorphism in human–robot co-evolution. Frontiers in Psychology, 9, 468. Retrieved April 3, 2022, from https://www.frontiersin.org/articles/10.3389/fpsyg.2018.00468/full

[2] Stevenson, R. L. (1974). The strange case of Dr. Jekyll and Mr. Hyde. London: New English Library.

[3] Weiser, M. (1991) The Computer for the 21st Century. Scientific American, 265, 94-104. http://dx.doi.org/10.1038/scientificamerican0991-94

Courses

48-528 Responsive Mobile Environments
