Aura fIdO
Made by Justin Kufro, Kthies and Chileshe Otieno
To evoke thought on the potential effects artificial intelligence can have on pet-owner relationships in the future.
Created: May 3rd, 2019
Our project's main goal is to evoke thought on the potential effects artificial intelligence can have on pet-owner relationships in the future. We chose to accomplish this by telling the story of an Aura AI product targeted at dogs called Aura fIdO, which struggled during its several iterations. We think that automating aspects of the pet-owner relationship can go too far, in that it will further disconnect the two. The automation areas that the fIdO product explores are dog training and basic analytics (e.g. hunger, thirst, health, last owner interaction). The product would be able to autonomously train a dog, with the tradeoff that dogs tend to become loyal only to the collar's commands, and sometimes become aggressive when the training intensity was set to a high level. We sought to explore the various use cases of such a product, both good and bad. We reasoned that including the good could be the difference between a believable launched product that drew public controversy and a totally failed product that never had a chance to make it to market.
Our research involved examining our project and finding items, events, or mannerisms present today that could be indicators of our possible future. In other words, we knew where we wanted to be in the future, so we asked what exists today that lies on the path to that future. We looked through the lenses of politics, economics, society, and technology. Since our project was primarily technological, we were able to find plenty of precedents among smarter pet products, such as the Link Smart Collar, an IoT training device, and the iFetch system, which takes the human aspect out of playing with your dog. Additionally, several societal shifts serve as strong indicators that our product could become a reality. The pet market has been growing steadily, and people are starting to treat their pets more like family or children, leading to an increased willingness to spend money on them.
There were several goals for our project in terms of deliverables. First and foremost was a collar with attached electronics capable of playing sounds, giving haptic feedback, and lighting up. We planned on putting only essential electronics in the collar - to reduce its perceived size and weight - and hiding all other electronics in a small pedestal that the collar would rest on. The collar would then be controlled by a static web page branded as an Aura product, which would help tell our narrative. Other objects supporting our narrative are our curator's statement, placards, user startup guide, and a slideshow of fIdO user testimonials.
Our zeroth iteration was a mock-up prototype of what a collar might look like. This was extremely basic and was done in just 45 minutes during class time. It helped us describe our narrative within the context of a physical object.
The media from our zeroth iteration can be found under this section.
The second iteration of our project improved upon the areas noted from our first iteration. The major change to the web page was that it now included a 'last owner interaction' statistic with a value upwards of 25 days. The goal behind this was to make the potential diminishing of the pet-owner relationship clearly visible in the app. We overcame our pedestal manufacturing difficulties from the first iteration: we were able to re-design the pedestal, laser cut it, and paint it white so that it blended with the exhibit's aesthetic. Because of this we were then able to fully assemble our collar and successfully hide the bulky electronics in the pedestal. We also designed the pedestal so that it could hide a large, short-throw projector inside as well. This was used to project the auto-looping testimonial slideshow (https://drive.google.com/open?id=158c2J_SW7hTUxl8W1cV8xixo4UIOTsbNzR6LJ-kFJ1Q).
The site can be found here (https://drive.google.com/drive/folders/1Scognxhrx70kZhXGG8GVHtNeDZq5R5Wi?usp=sharing). Its content is generated with embedded Ruby, and for the show it was hosted from a laptop with Python's SimpleHTTPServer. It has some static content, an embedded map, and a gif created from a YouTube video (https://www.youtube.com/watch?v=vL3BMuv82CE, referenced in code as well). The interactive part of the app is the set of buttons on the tricks page. These post events to the Particle Cloud, which the dog collar's Particle device subscribes to in order to play various pre-recorded voice snippets. There are three available tricks, each with three voice tones.
The exhibit printouts (posters & placards) can be found here (https://drive.google.com/open?id=1wAMzkt6kTNyznIfbSoDp9EsqKEpfBX8J).
The media from our second iteration can be found under this section.
The exhibit was overall a success in terms of provoking conversation around the ethics of such a product and what it could mean for the pet-owner relationship. It did well at bringing several artifacts together (start-up guide, collar, app, testimonials, placards, curator's statement) to tell our story.
One conversation we had with a visitor was about the effects of technology on creating distance between you and your pets, even down to things like manually giving a dog or cat food and water every day. This turned into a discussion of care ethics, where doing even these small, trivial things by hand (i.e. without supporting technology) for your pet could be seen as a sort of ethical duty to them and to your relationship.
Another notable conversation was about how this technology seems great at first glance; however, as one digs deeper and investigates the narrative, it comes across as undesirable and fundamentally flawed in nature. This is exactly the kind of effect that we were going for with this exhibit.
Some sound bites were overheard as people viewed and interacted with the exhibit, such as "Last owner interaction 27 days ago! No wonder its health is low!" and "I wouldn't feel like a real pet owner if I had this for my dog".
One limitation of the exhibit space was the amount of background noise present. This made the collar sounds very difficult to hear, which diminished some of the experience.
#define Start_Byte 0x7E
#define Version_Byte 0xFF
#define Command_Length 0x06
#define End_Byte 0xEF
#define Acknowledge 0x00 // Returns info with command 0x41 [0x01: info, 0x00: no info]

int volume;
String toneVoice = "";
// Handler for the "tricks_roll_over" Particle Cloud event; plays one of
// three "roll over" voice tracks (1-3) depending on the tone in `data`.
void tricksRoll(const char *event, const char *data) {
    Serial.println(data);
    String sdata = data;
    execute_CMD(0x16, 0, 0); // Stop any currently playing track
    delay(1000);
    if (sdata.equals("0")) {
        execute_CMD(0x03, 0, 1); // Play track 1
    }
    else if (sdata.equals("1")) {
        execute_CMD(0x03, 0, 2); // Play track 2
    }
    else if (sdata.equals("2")) {
        execute_CMD(0x03, 0, 3); // Play track 3
    }
    delay(2000);
}
// Handler for the "tricks_stay" Particle Cloud event; plays one of
// three "stay" voice tracks (4-6) depending on the tone in `data`.
void tricksStay(const char *event, const char *data) {
    String sdata = data;
    execute_CMD(0x16, 0, 0); // Stop any currently playing track
    delay(1000);
    if (sdata.equals("0")) {
        execute_CMD(0x03, 0, 4); // Play track 4
    }
    else if (sdata.equals("1")) {
        execute_CMD(0x03, 0, 5); // Play track 5
    }
    else if (sdata.equals("2")) {
        execute_CMD(0x03, 0, 6); // Play track 6
    }
    delay(2000);
}
// Handler for the "tricks_sit" Particle Cloud event; plays one of
// three "sit" voice tracks (7-9) depending on the tone in `data`.
void tricksSit(const char *event, const char *data) {
    String sdata = data;
    execute_CMD(0x16, 0, 0); // Stop any currently playing track
    delay(1000);
    if (sdata.equals("0")) {
        execute_CMD(0x03, 0, 7); // Play track 7
    }
    else if (sdata.equals("1")) {
        execute_CMD(0x03, 0, 8); // Play track 8
    }
    else if (sdata.equals("2")) {
        execute_CMD(0x03, 0, 9); // Play track 9
    }
    delay(2000);
}
void setup() {
    Particle.subscribe("tricks_roll_over", tricksRoll, ALL_DEVICES);
    Particle.subscribe("tricks_stay", tricksStay, ALL_DEVICES);
    Particle.subscribe("tricks_sit", tricksSit, ALL_DEVICES);
    Serial.begin(9600);
    Serial1.begin(9600);
    execute_CMD(0x3F, 0, 0); // Send request for initialization parameters
    // while (Serial1.available() < 10) // Wait until initialization parameters are received (10 bytes)
    delay(1000); // Fairly long delays between successive commands are needed (not always the same)
    // Initialize sound volume. Adapt to the speaker used and the desired loudness
    execute_CMD(0x06, 0, 0x30); // Set the volume (0x00~0x30)
    delay(1000);
    // execute_CMD(0x0D, 0, 1);
    // execute_CMD(0x11, 0, 0x01);
}
void loop() {
    // Nothing to do here: all behavior is event-driven via the Particle Cloud subscriptions
}
// By Ype Brada 2015-04-06
// https://community.particle.io/t/a-great-very-cheap-mp3-sound-module-without-need-for-a-library/20111
void execute_CMD(byte CMD, byte Par1, byte Par2) // Execute the command and parameters
{
    // Calculate the checksum (2 bytes)
    int16_t checksum = -(Version_Byte + Command_Length + CMD + Acknowledge + Par1 + Par2);
    // Build the 10-byte command line
    byte Command_line[10] = { Start_Byte, Version_Byte, Command_Length, CMD, Acknowledge,
                              Par1, Par2, (byte)(checksum >> 8), (byte)(checksum & 0xFF), End_Byte };
    // Send the command line to the module
    for (byte k = 0; k < 10; k++)
    {
        Serial1.write(Command_line[k]);
    }
}