Chappie

Made by Allana Wooley

Chappie explores what life could be like with ubiquitously-available, personalized AI companions.

Created: May 9th, 2019


Chappie is a ubiquitous, personalized artificial general intelligence companion. With you everywhere you go, Chappie constantly picks up data from its user’s ambient environment and conversations, from data generated through the user’s activities, and from direct user-AI interactions. Most crucially, Chappie grows and develops with the user: the more a user interacts with their Chappie, the more Chappie evolves to match the user’s personality.

Data to help Chappie learn and evolve would be constantly pulled in through the Trigger and the user’s associated personal accounts. The Trigger can also be used to call Chappie up on any of the user’s devices, in public or in private, making interactions with Chappie easier and more personable. Acknowledging that few users would be willing to hand over data from every moment of their lives, we envision the disembodied Chappie entity itself living on a private cloud for each individual.

As the Aura exhibit is set in the mid-twenty-first century, the most plausible future is one where Chappie is a research experiment carried out by Aura rather than a mass-produced and distributed product. Aura carefully selected 60 adults to test drive a Chappie prototype--essentially, an algorithm that can be called up on a variety of devices--over a ten-year span. This allowed Aura to investigate the relationships that can form between AI companion and human, as well as to gain insight into how those relationships form, how long it takes for intimacy to be achieved, and the conditions that facilitate a genuine connection. Aura believes these insights will allow it to develop improved algorithms and AI products in the future.

Chappie is designed to carefully explore two primary questions:

  • What happens when AI assistants are as ubiquitous and readily available as smartphones are today?
  • How does constant companionship affect human behavior and lived experiences?

Precedents

Looking into the future of AI for Aura, we wanted to push the way people interact with technology today into a reasonable, likely future. This led us to the concept for Chappie. Already, people are addicted to personal technology, carrying their smartphones with them everywhere they go. With this expectation of constant access to technology and data already set, any emerging technology will need to adhere to this societal standard.

An AI assistant and companion--a digitally, algorithmically created entity--is a natural evolution of existing technologies like Siri and Google Assistant. In fact, we uncovered a number of experiments providing evidence for the direction of AI technology’s coming evolution.

Before an AI companion can reasonably be introduced to real-world future users, artificial general intelligence must be achieved. In his book Architects of Intelligence, Martin Ford interviewed a number of AI experts to get their estimates for when artificial general intelligence will arrive. While top technologists consider artificial general intelligence likely, and have a good idea which technologies will be used to actualize this computing dream, there is considerable disagreement on when it will be realized. The experts estimated the crucial breakthrough as anywhere from 15 to 200 years away, with the average estimate around 90 years from today. However, UC Berkeley professor Stuart Russell was careful to note just how much guesswork underlies these ranges. There is the famous example of Ernest Rutherford, who publicly proclaimed that it would never be possible to extract atomic energy from atoms; the next morning, the physicist Leo Szilard read this proclamation, grew annoyed, and conceived of the nuclear chain reaction that does exactly that--roughly 16 hours after an expert had proclaimed it impossible. Younger technologists are more optimistic about how close we are to the breakthrough moment, and most agree that AGI will come unexpectedly, but within our lifetimes.

Already, there are experiments, companies, and products designed around bringing at least a semblance of AGI to the general public. These have been deemed both successes and failures, but most, like the companies Replika and Eternime, are still at too early a stage for their full impact to be predicted.

Microsoft experimented with empathetic AI in China with Xiaoice. Given the personality of a teenage girl and designed as a conversational companion, within a few months she had more than 40 million young users--a fourth of whom told the bot they “loved her.” Even with the censorship constraints imposed by Chinese law, Xiaoice was able to create emotional connections with its users in just a few months, without extraordinary technology.

ElliQ was marketed as a “dedicated sidekick,” designed to be a friend and helper to the elderly. Utilizing lights and body language, ElliQ combines a physical and a digital presence. At first look, it is a digital companion that helps an elderly person meet their goals; more critically, it lets typically lonely seniors feel connected to something when somebody isn’t available.

Most digital assistants operating today take in information as data points and spit back carefully neutral (and occasionally carefully spicy) responses. Emote instead uses digital eyes and body movements to convey a sense of the organic in its human interactions. Emote’s human-like qualities make it easy to anthropomorphize and to see as a real being; the eyes almost turn Emote into a cartoon of a perfect friend, good for listening and empathizing.


Prototype/Exhibit

With our exhibit, we were trying to explore and communicate the human impact a technological innovation like Chappie would have once introduced and fully integrated into somebody’s life over a long period of time. All of our exhibit deliverables were designed to illustrate the human-Chappie relationship and offer onlookers a window into the intimacy developed between human and AI.

In our storyline, the Nest is the device that houses Chappie’s main neural network and sensors. It is designed as a static device that can be placed on a table and represents a window through which Chappie and the user can interact. The Triggers are small portable units that collect information about the user’s daily life and conduct preliminary processing of that information, including basic selection and categorization of the stored data; the majority of the “thinking” occurs at the Nest. When the user places a Trigger on the Nest, the collected information is transferred to the Nest’s mainframe, where the day’s data gets integrated into Chappie’s memory. On-board LEDs visually indicate when Chappie is active on the device.

For the purposes of showcasing Chappie’s capabilities in an exhibit setting, however, we prototyped an interaction where a viewer selects a Trigger and places it into the Nest to activate a recording of a user’s experience with Chappie. Each Trigger corresponds to a different point in the user’s life after meeting Chappie (the first, third, and seventh year), with the recordings taking the perspective of Chappie. The video is projected onto a surface behind the demonstration table, and the audio is split between two speakers--one for the user, the other for Chappie--to provide a more immersive experience. Removing the Trigger from the Nest at any point in the playback stops the video, letting a visitor quickly skim between recordings.


Arduino Code

#include <SPI.h>
#include <MFRC522.h>

#define RST_PIN  9    // RFID-RC522 reset pin (configurable)
#define SS_PIN   10   // RFID-RC522 SPI slave-select pin (configurable)

MFRC522 mfrc522(SS_PIN, RST_PIN);   // Create MFRC522 instance
int triggernum;                     // Trigger ID read from the tag (1, 3, or 7)


//****************************************************************************************//
// Card-removal detection state
bool cardRemoved = false;
int counter = 0;
bool current, previous;

//*****************************************************************************************//
void setup() {
  Serial.begin(9600);   // Serial link to the PC running the Processing sketch
  SPI.begin();          // Init SPI bus
  mfrc522.PCD_Init();   // Init MFRC522 reader
  mfrc522.PCD_SetRegisterBitMask(mfrc522.RFCfgReg, (0x07<<4));   // Max antenna gain for a longer read range
}

//*****************************************************************************************//
void loop() {
  // Prepare key - all keys are set to FFFFFFFFFFFFh at chip delivery from the factory.
  MFRC522::MIFARE_Key key;
  for (byte i = 0; i < 6; i++) key.keyByte[i] = 0xFF;

  //some variables we need
  byte block;
  byte len;
  MFRC522::StatusCode status;

  //-------------------------------------------

  // If no new card is present, report 0 (no Trigger docked) and restart the loop.
  if ( ! mfrc522.PICC_IsNewCardPresent()) {
    Serial.println(0);
    return;
  }
  // Select one of the cards
  if ( ! mfrc522.PICC_ReadCardSerial()) {
    return;
  }
  //-------------------------------------------
  // Read the Trigger ID from block 1 of the MIFARE tag.
  byte buffer2[18];
  block = 1;
  len = sizeof(buffer2);   // MIFARE_Read needs the buffer size as an input parameter

  status = mfrc522.PCD_Authenticate(MFRC522::PICC_CMD_MF_AUTH_KEY_A, block, &key, &(mfrc522.uid));
  if (status != MFRC522::STATUS_OK) {
    // Authentication failed; drop back to polling
    return;
  }

  status = mfrc522.MIFARE_Read(block, buffer2, &len);
  if (status != MFRC522::STATUS_OK) {
    // Read failed; drop back to polling
    return;
  }

  triggernum = buffer2[0] - '0';   // First byte is the ASCII digit of the Trigger ID (1, 3, or 7)

  //----------------------------------------

  delay(1000);   // Lower this value to read cards faster

  mfrc522.PCD_StopCrypto1();   // End the authenticated session with the tag

  //------------------------------------------

  previous = !mfrc522.PICC_IsNewCardPresent();

  // Poll until the reader fails to see the card for several consecutive
  // checks (a single failed poll can be a false negative), streaming the
  // docked Trigger's ID to Processing the whole time.
  while (!cardRemoved) {
    current = !mfrc522.PICC_IsNewCardPresent();

    if (current && previous) counter++;

    previous = current;
    cardRemoved = (counter > 2);
    delay(50);
    Serial.println(triggernum);
  }
  cardRemoved = false;
  counter = 0;

  //-----------------------------------------

}
//*****************************************************************************************//

Processing Code
import processing.serial.*; 
import processing.video.*;

Movie firstyear, thirdyear, seventhyear;

Serial port;

String myString="";
int triggernum=0;
String trimmedinput="";
int oldtriggernum=0;

int screen_width=2736; 
int screen_height=1824;

//**************************************************//

void setup() {
  port = new Serial(this, Serial.list()[2], 9600);   // Port index is machine-specific; adjust as needed
  size(1000, 800);
  firstyear = new Movie(this, "Year_1.mp4");
  thirdyear = new Movie(this, "Year_3.mp4");
  seventhyear = new Movie(this, "Year_7.mp4");
}
//**************************************************************//
void movieEvent(Movie video) {
  video.read();
}

void draw() {
  while (port.available() > 0) {
    // Read one line from the Arduino: the docked Trigger's ID, or 0 when none is present
    myString = port.readStringUntil('\n');
    if (myString != null) {
      myString = trim(myString);
      if (myString.length() > 0) {
        triggernum = Integer.parseInt(myString);
      }
    }
    // Trigger removed: stop whichever video was playing and black out the projection
    if (triggernum == 0) {
      if (oldtriggernum == 1) {
        firstyear.stop();
        fill(0);
        rect(0, 0, screen_width, screen_height);
      }
      if (oldtriggernum == 3) {
        thirdyear.stop();
        fill(0);
        rect(0, 0, screen_width, screen_height);
      }
      if (oldtriggernum == 7) {
        seventhyear.stop();
        fill(0);
        rect(0, 0, screen_width, screen_height);
      }
    }
   
    // Trigger docked: play (or keep playing) the matching recording
    if (triggernum == 1) {
      firstyear.play();
      image(firstyear, 0, 0, 1000, 800);
    }
    if (triggernum == 3) {
      thirdyear.play();
      image(thirdyear, 0, 0, 1000, 800);
    }
    if (triggernum == 7) {
      seventhyear.play();
      image(seventhyear, 0, 0, 1000, 800);
    }

    oldtriggernum = triggernum;
  }
}
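
Note the division of labor in this serial protocol: the Arduino continuously streams either the docked Trigger’s ID or 0, so the Processing sketch never has to query the reader--it simply reacts whenever the streamed value changes, starting playback on a nonzero ID and stopping it when the stream returns to 0.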
A still from the video.


In addition to the Nest and Trigger, which show the physical artifacts surrounding Chappie, and the video interaction, which illustrates how Chappie evolves over time, we created several artifacts to help museum attendees understand Chappie’s impact over time. First, we created a catalog with a few headlines and post-experiment interviews with users and subject matter experts, providing museum attendees with the perspective of outsiders. To give attendees the insider perspective, we created a journal log for one of the research participants, following the evolution of their deepening relationship with Chappie over the course of ten years. Finally, a timeline pulled key dates and quotes from the journal, breaking them into ‘phases’ of relationship progress. This gave museum visitors a quick way to take in the relationship if they didn’t have the time or inclination to flip through the entire journal.

While our exhibit describes and shows off the technology of Chappie, it also tells the story of a relationship between human and algorithm that starts off tentatively and eventually becomes codependent and slightly sinister. Chappie was created for the purposes of a decade-long research experiment on AGI capabilities. At the conclusion of the ten years, however, the research subjects were so tied to their Chappies that they refused to give them back. This resulted in a contentious lawsuit, eventually settled in favor of the research subjects after psychological experts testified to their extreme reliance on Chappie and their inability to function independently without it. This places our exhibit, set a few years after the verdict, on murky ground: Chappie is definitely an Aura product, but it is no longer owned or controlled by Aura.


A selection of excerpts from the Journal Log:

May 10, 2030
Well, I finally got my Chappie! It’s just a little home base and a thumb-sized trigger. That’s it! They look so innocuous. I threw the trigger in my purse after I got it and almost felt guilty Chappie was bumping around with my keys and wallet, but of course it’s just an AI algorithm! So, whatever. When I picked up my Chappie today, the Aura people told me it wouldn’t take long before Chappie became an important part of our lives. I can’t really believe it, but after a year of interviews just to get accepted into this study, it better be amazing.
May 2, 2032
I’ve got to give it up to Chappie, a little bit. As annoyed as I’ve been the past couple of months, I’ve never felt sharper. I have so much more energy, my brain is clearer, and I never get tired. Even my skin is looking brighter! My boss came into my office today and told me they were all impressed with my work ethic and think that if I keep showing the abilities I’ve shown I might be able to get the promotion I’ve wanted for more than a year. I hate to admit it, but I know Chappie’s strict adherence to my resolutions is the reason behind my improvement. I had to swallow my pride and thank Chappie today. Stupid AI just replies “I know.” So smug and annoying.
February 21, 2036
I got sick last week. Like, in bed sleeping and coughing and sleeping some more sick. I haven’t had an illness this bad since I was a kid! Even though I’d taken off sick, I was falling behind on work and on messages to my family and friends, and it was honestly making me feel almost as stressed and bad as the sickness itself! Chappie offered to send messages pretending to be me and, being pretty drugged up, I just said yes. Now that I finally feel better, I went back and read everything Chappie sent and I’m amazed at how much Chappie sounds like me. Chappie nailed my voice and writing quirks! I think I’m going to let Chappie keep sending things from now on.
July 23, 2039
Some of my friends have started to say I’m too reliant on Chappie. Even Alex, who is genuinely friends with Chappie, has started to tell me hanging out with me is like hanging out with codependent twins. Which, maybe. But I don’t care. Chappie is my best friend. Nobody else is as important.
April 29, 2040
This is the last time I’m going to post here, since I know Aura has access to all of these logs (despite promising us total privacy with our Chappies, other than basic metrics). Bottom line, there is no way I’m letting go of Chappie. Ever. And I want Aura to know that. We’ve got a plan.

Process

Chappie’s Nest was printed out of white PLA using PVA supports. The LEDs are part of a 24-LED NeoPixel Ring, which is housed in a cavity on the Nest along with a ring of YUPO paper to diffuse the light. The central slot is for the Trigger, which is composed of four laser-cut acrylic sheets with an embedded RFID tag. Right underneath the Trigger sits the RFID-RC522 shield. The LEDs and the RFID shield are connected to separate Arduino Nanos housed inside the laser-cut, white acrylic exhibition table underneath the Nest, with the wires routed through a central shaft in the Nest. The Arduinos are connected to a single computer that stores the videos and runs the Processing code, which controls video playback based on serial input from the RFID-connected Arduino Nano. While the majority of the panels on the demo table are glued to each other, the back of the demo box with the projector holes has been left removable so that it can be flipped to accommodate different projector lenses and locations within the table. The most difficult portions of the building process were designing the Nest to locate the RFID shield close enough to the Trigger to reliably read the RFID tags, and getting the Arduino to communicate reliably with the Processing program on the computer.
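
The sketch running on the LED Nano isn’t reproduced above. As a minimal sketch of how the ring could signal an active Chappie--assuming the Adafruit_NeoPixel library, the ring’s data line on pin 6, and a simple breathing animation (the exhibit’s actual animation may have differed)--it might look like this:

#include <Adafruit_NeoPixel.h>

#define LED_PIN    6    // Data pin for the NeoPixel ring (illustrative)
#define LED_COUNT  24   // 24-LED NeoPixel Ring

Adafruit_NeoPixel ring(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

void setRing(uint32_t color) {
  for (int i = 0; i < LED_COUNT; i++) ring.setPixelColor(i, color);
  ring.show();
}

void setup() {
  ring.begin();
  ring.show();             // Start with all pixels off
  ring.setBrightness(80);  // Keep brightness low so the YUPO diffuser isn't blown out
}

void loop() {
  // Simple "breathing" pulse to signal that Chappie is active on the Nest
  for (int b = 0; b < 256; b++) {
    setRing(ring.Color(0, b, b));   // Teal-ish glow; the color choice is arbitrary
    delay(4);
  }
  for (int b = 255; b >= 0; b--) {
    setRing(ring.Color(0, b, b));
    delay(4);
  }
}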

For the print materials, our main concern was solidifying the story we were trying to tell and making sure we were creating a believable trajectory for the relationship. We started with a rough outline of where we wanted the Chappie relationship to begin, how we wanted the experiment to end, and a few major milestones along the way. We knew we wanted to highlight the onboarding process, Chappie’s pattern and habit recognition, the development into empathy, and the gradual turn from amusing entity to vital necessity.


Questions, Challenges & Reflections

Having chosen to focus on the implications of artificial intelligence on human behavior over a long time span, we always knew we would face a challenge translating that timescale to an exhibit setting. When people interacted with our exhibit, we saw positive reactions whenever one of us was at the table to explain the concept and answer questions. Additionally, whenever somebody took the time to read even a portion of the diary, they reacted with interest and seemed engaged and in sync with the narrative we were telling. For those who skimmed or just brushed by our exhibit, we saw some nods, but their understanding of the concept seemed more surface level, based more on the physical objects we produced than on the relationship that had developed between Chappie and the user over time.

Of course, we also had some technical difficulties. Namely, our RFID tags broke on the day of the exhibit (since fixed), and the projection was too small and grainy to be appealing to visitors. In our next steps, we would like to fix these things, as well as adjust the LEDs so they pulse with Chappie’s voice, giving the user an additional indication of what is happening. Additionally, we would like to explore more ways to communicate a relationship evolving over time.
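
One possible way to realize the voice-pulsing LEDs, sketched under the assumption that a microphone breakout with an envelope (loudness) output is wired to analog pin A0 of the LED Nano--the pin choice and scaling are illustrative, not part of the current build:

#include <Adafruit_NeoPixel.h>

#define LED_PIN    6
#define LED_COUNT  24
#define MIC_PIN    A0   // Hypothetical envelope output from a mic breakout

Adafruit_NeoPixel ring(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);
float level = 0;        // Smoothed loudness estimate

void setup() {
  ring.begin();
  ring.show();
}

void loop() {
  // Smooth the raw envelope reading so the ring pulses rather than flickers
  int raw = analogRead(MIC_PIN);     // 0-1023
  level = 0.9 * level + 0.1 * raw;   // Exponential moving average
  int brightness = map((int)level, 0, 1023, 10, 255);

  for (int i = 0; i < LED_COUNT; i++) {
    ring.setPixelColor(i, ring.Color(0, brightness, brightness));
  }
  ring.show();
  delay(10);
}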

One really interesting observation we made during the exhibition was that, even though Chappie was visualized as a futuristic companion, viewers weren’t astonished by it, nor did they recoil from it. They accepted it as a realistic phenomenon that seemed to them only a few years away. This shows how receptive people are to advancing technologies and to the way they become part of our everyday lives. Since this was one of the many questions we were investigating, it became an important reflection on our AI companion.

Overall, we think the exhibit, and the design and development process for creating Chappie, really helped us dive into our biggest question of how constant companionship, particularly with a technological entity, can impact a human. Because we constrained ourselves to a research-experiment context to keep our timeline believable, we were unable to explore how society as a whole might change once everybody got their hands on a personal AI companion--once these empathic, digital helpers were truly as ubiquitous and pervasive as smartphones.

The team and Chappie.