Soft Communication "Lucky Cat"

Made by Tian Li and Xinye Wang

In this project, we design a connection between kids and parents. Our concept is to use actions, images, and other forms of soft communication instead of sound. We also want this subtle communication to feel a little spooky while delivering care to people.


Proposal

Communication between people and the family members they care for (e.g. children, the elderly) is tricky, since it can easily slip into heavy-handed supervision or apparent neglect. Ordinary methods of interaction like phone calls or FaceTime can ease the strain on family members who cannot be physically together, but what we call soft communication can be created as a supplement to them. It is something more casual and subtle than communicating through voice and video.

We proposed two geometries for the design. The first is a regular tetrahedron, and the second is a triangular prism. Both geometries are simple and pure. We would like to fillet the corners to make them more natural and softer, giving users a closer feeling. For the first option, we plan to use wood and add small folding pieces that react to the received signal. The second option is also wood, with a screen that people can touch.


Intention - "Love Story"

Our story takes place in a family consisting of a little boy, a pair of migrant-worker parents, and a smart device that monitors indoor conditions and conveys care by enabling soundless communication among family members.

Hello everyone! I am a little machine in this family. My task is to support the "soft communication" among family members - especially between the parents, who had to leave their hometown for a big city to make a better living, and their son, who could not go with them.

Oh, it's time for me to check the condition of the house. It seems the boy played basketball just before coming home, since he is wiping the sweat from his forehead - though I am not entirely sure. Let me cool him down by speeding up the electric fan near him. Maybe I will turn it up quietly, without letting him notice.

(2 hours later) The boy's mother just sent me a request: she is thinking of the boy, and showed it with a gesture in front of my terminal where she lives. Let me pass this message to the boy by waving the little flag beside him.

(On a Monday) Today is a special day for the boy, who attends primary school: he needs to take part in the flag-raising ceremony, which requires him to wear his red scarf. Let me remind him by stretching out the hidden red scarf inside me, driven by a small servo!

(On a Tuesday) Today's weather is super hot, about 37 degrees Celsius. Let me remind the little boy how hot it is outside, so he wears a T-shirt rather than a jacket, by quickly running the servo back and forth. Maybe he has already received my signal...


How to Realize

We first used Teachable Machine to train the model, with three classes: "Pass by", "Close to Cam", and "No Actions". After training, we used a webcam to capture users' actions and feed them into the model.
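Once the model reports a class, the sketch only needs to map that label to the matching Particle cloud function. A minimal, hypothetical sketch of that dispatch (the function names "hand" and "tail" match those registered in our firmware):

```javascript
// Map a predicted class label to the Particle cloud function to call.
// "hand" waves the lucky cat's hand; "tail" rotates its tail.
function actionForClass(label) {
  switch (label) {
    case "Pass by":      return "hand"; // someone walked past the camera
    case "Close to Cam": return "tail"; // someone approached the camera
    default:             return null;   // "No Actions": stay idle
  }
}
```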

For the "Pass by" class, we mapped one action: the lucky cat waves its hand. To realize this, we used the model to recognize people's shoulders and installed one servo in the device. We then linked the Particle script to the model and set the servo to loop through a specific range of degrees.
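Conceptually, the pass-by detection is a small state machine over three screen zones (one side, the middle band, the opposite side), driven by a keypoint's x position. A simplified, hypothetical sketch of the idea, with zone boundaries roughly matching the 300/350-pixel thresholds in our p5.js code:

```javascript
// A person "passes by" when a tracked keypoint is seen on one side of the
// frame, then in the middle band, then on the opposite side.
function makePassDetector() {
  let stage = 0;        // 0 = waiting, 1 = entered one side, 2 = reached middle
  let fromLeft = false; // which side the crossing started from
  return function update(x) {
    const zone = x <= 300 ? "left" : x < 350 ? "mid" : "right";
    if (stage === 0 && zone !== "mid") {
      stage = 1;
      fromLeft = zone === "left";
    } else if (stage === 1 && zone === "mid") {
      stage = 2;
    } else if (stage === 2 && zone !== "mid") {
      stage = 0;
      // a full crossing only counts if we exited on the opposite side
      return (zone === "right") === fromLeft;
    }
    return false;
  };
}
```

The real sketch additionally resets the state machine if the crossing takes longer than a few seconds; this sketch omits the timeout for clarity.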

For the "Close to Cam" class, we also set one action: rotating the lucky cat's tail. We used a similar method to the first class, but with different servo rotation degrees. We also set a default ear distance as a threshold: the detected ears appear farther apart as the user approaches, and once the measured distance crosses the threshold, the code triggers the action.
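The closeness check itself reduces to a threshold on the apparent distance between the two detected ears. A minimal sketch, where the 215-pixel threshold mirrors the `earWidth >= 215` check in our p5.js code:

```javascript
// Decide whether the "Close to Cam" action should fire, based on the
// horizontal distance (in pixels) between the left and right ear keypoints.
function isCloseToCam(leftEarX, rightEarX, threshold = 215) {
  const earWidth = leftEarX - rightEarX;
  return earWidth >= threshold;
}
```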

For the "No Actions" class, we did not set a servo action, but as a next step we would like to add a screen to the device to support more interactions.

As a next step, we would like to embed a screen and a Raspberry Pi into the device. The screen can show more detailed reactions and allow softer communication between users.

// Team Project
// Tian Li & Xinye Wang
//
// Two servos animate the lucky cat. Each is exposed as a Particle cloud
// function ("tail" and "hand") so the p5.js sketch can trigger it over HTTP.

int servoPin_01 = A4; // tail servo
int servoPin_02 = A5; // hand servo

Servo myServo_01;
Servo myServo_02;

void setup() {
    myServo_01.attach(servoPin_01);
    Particle.function("tail", handleMoveServo_01);

    myServo_02.attach(servoPin_02);
    Particle.function("hand", handleMoveServo_02);
}

// "tail": the sketch sends the measured ear width in pixels;
// anything above 215 px means the user is close to the camera.
int handleMoveServo_01(String cmd){
    if (cmd.toInt() > 215) {
        myServo_01.write(180);
        delay(1000);
        myServo_01.write(90);
        delay(1000);
        return 1;
    }
    return 0;
}

// "hand": the sketch sends a nonzero flag once a pass-by is detected.
int handleMoveServo_02(String cmd){
    if (cmd.toInt()) {
        myServo_02.write(180);
        delay(2000);
        myServo_02.write(90);
        delay(2000);
        return 1;
    }
    return 0;
}

void loop() {
    // Nothing to do here; all motion is driven by the cloud functions.
}
// ml5.js: Pose Estimation with PoseNet
// The Coding Train / Daniel Shiffman
// https://thecodingtrain.com/learning/ml5/7.1-posenet.html
// https://youtu.be/OIo-DIOkNVg
// https://editor.p5js.org/codingtrain/sketches/ULA97pJXR

let particle_access_token = "b2bc473484175c1f4f1f3bb864d9a633c9c05c45"; // ADD your API key here 
let particle_function_url = "https://api.particle.io/v1/devices/e00fce686aab9c87e750ab2c/tail?access_token=" + particle_access_token;

let particle_function_url2 = "https://api.particle.io/v1/devices/e00fce686aab9c87e750ab2c/hand?access_token=" + particle_access_token;

let video;
let poseNet;
let pose;
let skeleton;
let earWidth;
let passLeft = 0;
let passMid = 0;
let passRight = 0;
let timer = 0;
let enterTime = 0;
let passed = 0;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video, modelLoaded);
  poseNet.on('pose', gotPoses);
  setInterval( sendToParticleFunction , 2000 );
}

function gotPoses(poses) {
  //console.log(poses);
  if (poses.length > 0) {
    pose = poses[0].pose;
    skeleton = poses[0].skeleton;
  }
}

function modelLoaded() {
  console.log('poseNet ready');
}

// Track a full left-to-right (or right-to-left) crossing of the frame using
// the elbow keypoints. passLeft/passRight/passMid record which zones have
// been visited; `passed` is set once a crossing completes within the window.
function countPass(pose, timer) {
  if (passLeft == 0 && passRight == 0 && passMid == 0) {
    if (pose.leftElbow.x <= 300 || pose.rightElbow.x <= 300) {
      passLeft = 1;  // entered from one side
    }
    else if (pose.leftElbow.x >= 400 || pose.rightElbow.x >= 400) {
      passRight = 1; // entered from the other side
    }
    enterTime = timer;
  }
  if (passLeft == 1 && passRight == 0 && passMid == 0) {
    if (timer - enterTime > 3) {
      passLeft = 0;
      passed = 0;
    } else if ((pose.leftElbow.x < 350 && pose.leftElbow.x > 300 ) || (pose.rightElbow.x > 300 && pose.rightElbow.x < 350)) {
      passMid = 1;
      enterTime = timer;
    }
  }
  if (passLeft == 0 && passRight == 1 && passMid == 0) {
    if (timer - enterTime > 3) {
      passRight = 0;
      passed = 0;
    } else if ((pose.leftElbow.x < 350 && pose.leftElbow.x > 300 ) || (pose.rightElbow.x > 300 && pose.rightElbow.x < 350)) {
      passMid = 1;
      enterTime = timer;
    }
  }
  if (passLeft == 1 && passRight == 0 && passMid == 1) {
    if (timer - enterTime > 3) {
      passLeft = 0;
      passMid = 0;
      passed = 0;
    } else if ((pose.leftElbow.x >= 350 ) || (pose.rightElbow.x >= 350)) {
      passed = 1;
      passLeft = 0;
      passMid = 0;
    }
  }
  if (passLeft == 0 && passRight == 1 && passMid == 1) {
    if (timer - enterTime > 3) {
      passLeft = 0;
      passMid = 0;
      passed = 0;
    } else if ((pose.leftElbow.x <= 300 ) || (pose.rightElbow.x <= 300)) {
      passed = 1;
      passRight = 0;
      passMid = 0;
    }
  }
}

function draw() {
  image(video, 0, 0);

  if (pose) {
    // console.log(pose)
    earWidth = pose.leftEar.x - pose.rightEar.x;

    // advance the seconds timer roughly once per second (every 60 frames)
    if (frameCount % 60 == 0) {
        timer ++;
    }
    countPass(pose, timer);
      
    let eyeR = pose.rightEye;
    let eyeL = pose.leftEye;
    let d = dist(eyeR.x, eyeR.y, eyeL.x, eyeL.y);
    fill(255, 0, 0);
    ellipse(pose.nose.x, pose.nose.y, d);
    fill(0, 0, 255);
    ellipse(pose.rightWrist.x, pose.rightWrist.y, 32);
    ellipse(pose.leftWrist.x, pose.leftWrist.y, 32);

    for (let i = 0; i < pose.keypoints.length; i++) {
      let x = pose.keypoints[i].position.x;
      let y = pose.keypoints[i].position.y;
      fill(0, 255, 0);
      ellipse(x, y, 16, 16);
    }

    for (let i = 0; i < skeleton.length; i++) {
      let a = skeleton[i][0];
      let b = skeleton[i][1];
      strokeWeight(2);
      stroke(255);
      line(a.position.x, a.position.y, b.position.x, b.position.y);
    }
  }
}

function sendToParticleFunction( )
{  
  if (earWidth >= 215) {
    console.log("You are close!!!");
    var particle_data = { arg: earWidth };
    console.log(particle_data)
    // httpPost(particle_function_url , 'text', particle_data,  function(result) {
    //   console.log( result );
    // });
  }
  
  
  if (passed == 1) {
    var particle_data2 = { arg: passed };
  //   httpPost(particle_function_url2 , 'text', particle_data2,  function(result) {
  //   console.log( result );
  // });
    console.log("Passed!!!!!!");

    passed = 0;
    passLeft = 0;
    passMid = 0;
    passRight = 0;
  }
}

Courses

48-528 Responsive Mobile Environments


