The Threatened Owl

Made by dlengenf

Created: December 16th, 2015


Objectives

The Threatened Owl began as an offshoot of a project that attempted to provide more engaging interaction with an audience; I found that exploring how an animal fends off engagement was far more interesting. The goal of the project then became to build a robot that could perform, at least in part, like a threatened bird. Size and performance were therefore the qualities evaluated, and visual tracking and wing actuation were the necessary features; the size of the bird would add to its performance in a subtle fashion. Completely modeling the wing of a bird is outside the scope of this project due to time constraints and because my background is not in biology or osteology. However, I do have an interest in the intersection of geometry and behavior and how they affect each other, so the world of animals is a familiar and convenient place to begin.


Implementation

In the animal world, when seeking to avoid conflict, animals make themselves as large as they can and sometimes become very grand in gesture in order to express energy and control over that energy. In this way they communicate that they are not worth the effort to attack. Therefore, I sought to make the bird large and to actuate its wings upon detecting teeth. In addition, I implemented “eye contact” to the best of my ability. Eye contact in the animal world is very aggressive; especially when an animal is threatened, it stays focused on the potential attacker. For this reason, the owl’s gaze follows the individual closest to it. With more time, the owl would be constructed more convincingly as an owl, with a larger diversity of movements when threatened, and would therefore more closely approximate reality.
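The control logic can be summarized as a simple sense-and-respond loop: find a face in the camera frame, step the neck until that face is roughly centered, and extend the wings when teeth (a “smile”) are detected inside the face region. The following is a simplified, hypothetical sketch of that loop; the cascade file names, serial port, and tuning constants are placeholders, and it is written against current OpenCV and pyserial APIs rather than the OpenCV 2 interfaces used in the actual scripts under Technical Documentation.

# Hypothetical, simplified sketch of the owl's control loop.
import cv2
import serial

ser = serial.Serial('/dev/cu.usbmodemfa131', 9600)        # link to the Arduino
face_cascade  = cv2.CascadeClassifier('haarcascade_frontalface_alt.xml')
teeth_cascade = cv2.CascadeClassifier('haarcascade_smile.xml')
cam = cv2.VideoCapture(0)

ANGLE = 15       # neck adjustment per frame, in stepper pulses
X_MARGIN = 100   # horizontal dead zone, in pixels

while True:
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.equalizeHist(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=4)
    if len(faces) == 0:
        continue                                          # nobody in view: hold still
    x, y, w, h = faces[0]                                 # track the first detected face
    cx = x + w / 2.0
    screen_cx = cam.get(cv2.CAP_PROP_FRAME_WIDTH) / 2.0

    # "Eye contact": step the neck until the face sits near the frame center.
    if cx < screen_cx - X_MARGIN / 2.0:
        ser.write(('%d\r\n' % ANGLE).encode())
    elif cx > screen_cx + X_MARGIN / 2.0:
        ser.write(('%d\r\n' % -ANGLE).encode())

    # Teeth (a "smile" detected inside the face region) trigger the wing display.
    if len(teeth_cascade.detectMultiScale(gray[y:y + h, x:x + w], 1.3, 4)) > 0:
        ser.write(b'threatened\r\n')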


Outcomes

The owl performed well. Its wings actuated to the degree expected and on cue. When the owl was no longer able to view the human, it paused in caution, and when it could “see” the human again it resumed its threatened dance. The size of the bird made it far fiercer than a smaller one would have been, and the noise of the stepper motor made it sound additionally aggressive, though that was not an intended feature. The owl also managed to keep the aggressor in its gaze within the range of motion acceptable for an owl’s neck. The exposed actuators did not add to the robot’s convincing power, which made clear the necessity of creating skin or a shell for a robot: the more closely the skin approximates the reality of the animal being modeled, the more real the model seems. This can be seen in the design of stuffed animals.


Photo Documentation


Figure 1: Full Image of the Owl



Figure 2: Right Wing Construction, Depicting the Mechanics



Figure 3: Owl's Head and Neck, Depicting the Mechanics



Figure 4: Internal Circuitry of the Owl


Video Documentation

Video


Technical Documentation


Figure 5: Brainstorm Sketch



Figure 6: The Electrical Schematic

#include <Servo.h> 

// For the stepper motor
#define DIR_PIN  2    // The direction pin controls the direction of stepper motor rotation.
#define STEP_PIN 3    // Each pulse on the STEP pin moves the stepper motor one angular unit.
#define STEPS    20   // Number of steps per revolution (alternate value: 800); unused below

// For the servos
#define LEFT_WING_SERVO  9
#define RIGHT_WING_SERVO 10
#define DELAY            1000 // milliseconds

Servo leftWing;
Servo rightWing;  

// Servo Positions
int leftWingPos = 0;
int rightWingPos = 0;

// Min/Max Angles
const int leftMinAngle = 13;
const int leftMaxAngle = 135;
//const int rightMinAngle = 10;
//const int rightMaxAngle = rightMinAngle + (leftMaxAngle - leftMinAngle);

// Delays
const int msInterDelay = 500; // unused below
const int msSpinDelay = 50;   // ms pause between servo increments (previously 15)

int pos = 0;

// For the stepper
// Step pulse width in microseconds; delayMicroseconds() is only accurate
// up to 16383 us, so this value should stay small.
const unsigned int usDelay = 1000;
long dir = HIGH;

/* For Serial Comms */
String inputString = "";         // a string to hold incoming data
boolean stringComplete = false;  // whether the string is complete

boolean threatened = false;
unsigned long prev = 0;

// Initialize and put everything in the base configuration
void setup() {
  /* STEPPER */
  pinMode(DIR_PIN, OUTPUT); 
  pinMode(STEP_PIN, OUTPUT);
  digitalWrite(DIR_PIN, dir);

  /* SERVO */
  leftWing.attach(LEFT_WING_SERVO);
  rightWing.attach(RIGHT_WING_SERVO);

  // Put wings in folded configuration
  leftWing.write(leftMaxAngle);
  rightWing.write(0);

  /* Serial Comms */
  Serial.begin(9600);
  inputString.reserve(200);
}

// Actuate the wings
void wingControl(char state) {
  const int ps = 3;
  switch (state) {
    case 'e':
      Serial.print("EXTEND\n");
      for (pos = leftMinAngle; pos <= leftMaxAngle; pos += ps) {
        leftWing.write(pos);
        rightWing.write(pos);
        delay(msSpinDelay);
      }
      break;
    case 'r':
      for (pos = leftMaxAngle; pos >= leftMinAngle; pos -= ps) {                                
        leftWing.write(pos);
        rightWing.write(pos);
        delay(msSpinDelay);
      }
      break;
    default:
      break;
  }
}

// Actuate the neck
void neckControl(int angle) {

  // Set up direction of motion
  int default_dir = LOW;
  if (angle < 0) {
    digitalWrite(DIR_PIN, default_dir);
  } else {
    digitalWrite(DIR_PIN, !default_dir);
  }
  
  for (int i = 0; i < abs(angle); i++) {
    digitalWrite(STEP_PIN, HIGH);
    delayMicroseconds(usDelay);
    
    digitalWrite(STEP_PIN, LOW);
    delayMicroseconds(usDelay);
  }
}

void test_stepper() {

  digitalWrite(DIR_PIN, HIGH);
  
  for (int i = 0; i < 100; i++) {
    //Serial.print("HERE");
    digitalWrite(STEP_PIN, HIGH);
    delayMicroseconds(usDelay);
    
    digitalWrite(STEP_PIN, LOW);
    delayMicroseconds(usDelay);
  }
}

void serialEvent() {
  while (Serial.available()) {
    // get the new byte:
    char inChar = (char)Serial.read();
    // add it to the inputString:
    inputString += inChar;
    // if the incoming character is a newline, set a flag
    // so the main loop can do something about it:
    if (inChar == '\n') {
      stringComplete = true;
    }
  }
}

void loop() {
  //test_stepper();
  serialEvent();
  if (stringComplete) {
    if (inputString.equals("threatened\r\n") || inputString.equals("threatened")) {
      if (!threatened) {
          Serial.print("THREATENED MODE\n");
          threatened = true;
          wingControl('e'); // EXTEND WINGS
          prev = millis();
      } else {
        unsigned long curr = millis();
        if (curr - prev > DELAY)  {
          Serial.print("RESETTING\n");
          threatened = false;
          wingControl('r'); // RETRACT WINGS
          //prev = curr;
        }
      }
    }
    Serial.print(inputString.toInt());
    Serial.print("\n");
    neckControl(inputString.toInt());
    inputString = "";       // clear the string
    stringComplete = false; // reset the conditional
  }
}
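The sketch above listens for newline-terminated commands on the 9600-baud serial link: the string “threatened” triggers the wing display (a repeat arriving more than DELAY milliseconds later folds the wings again), and a signed integer steps the neck by that many pulses, with the sign choosing the direction. A minimal, hypothetical host-side test of this protocol is sketched below; the port name matches the tracking script that follows, and the framing is left at pyserial's 8N1 default.

# Hypothetical smoke test for the serial protocol expected by the sketch above.
import time
import serial

ser = serial.Serial('/dev/cu.usbmodemfa131', 9600, timeout=1)
time.sleep(2)                      # give the Arduino time to reset after the port opens

ser.write(b'threatened\r\n')       # first "threatened": extend the wings
time.sleep(2)                      # wait longer than DELAY (1000 ms)
ser.write(b'threatened\r\n')       # second "threatened": fold the wings again

ser.write(b'15\r\n')               # positive angle: step the neck one way
ser.write(b'-15\r\n')              # negative angle: step it back

print(ser.readline())              # status lines such as "THREATENED MODE" echo back
ser.close()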
#!/usr/bin/env python

# For Serial
import serial

# For Audio
import time # audio and serial
#from kivy.core.audio import SoundLoader

# For face tracking
import numpy as np
import cv2
import cv2.cv as cv
from video import create_capture
from common import clock, draw_str


# Configure the serial connection (the parameters differ depending on the device you are connecting to)
ser = serial.Serial(
    port = '/dev/cu.usbmodemfa131',
    baudrate = 9600,
    parity = serial.PARITY_ODD,
    stopbits = serial.STOPBITS_TWO,
    bytesize = serial.SEVENBITS
)

ser.isOpen()

help_message = '''
USAGE: facedetect.py [--cascade <cascade_fn>] [--nested-cascade <cascade_fn>] [<video_source>]
'''

def detect(img, cascade):
    rects = cascade.detectMultiScale(img, scaleFactor=1.3, minNeighbors=4, minSize=(30, 30), flags = cv.CV_HAAR_SCALE_IMAGE)
    if len(rects) == 0:
        return []
    rects[:,2:] += rects[:,:2]
    return rects

def get_center_rect(x1, y1, x2, y2):
    cx = int(round((abs(x2 - x1) / 2.0) + x1))
    cy = int(round((abs(y2 - y1) / 2.0) + y1))
    return (cx, cy)

def draw_rects(img, rects, color):
    center = ()
    for x1, y1, x2, y2 in rects:
        cv2.rectangle(img, (x1, y1), (x2, y2), color, 2)
        
        # Circle Data
        center = (cx, cy) = get_center_rect(x1, y1, x2, y2)
        radius = 5
        circle_color = (0, 0, 255)
        #print 'CIRCLE:', center
        cv2.circle(img, center, radius, circle_color, thickness=3, lineType=8, shift=0)
        break  # just draw the first one
    return center
    

def draw_sub_rects(rects):#, prev_time):
    dt = 10
    
    for x1, y1, x2, y2 in rects:
        roi = gray[y1:y2, x1:x2]
        vis_roi = vis[y1:y2, x1:x2]
        subrects = detect(roi.copy(), nested)
        if (len(subrects) > 0):
            print "THREATENED"
            ser.write("threatened" + '\r\n') #send a "threatened" signal
        '''
        if (clock() - prev_time > dt):
            if (len(subrects) > 0):
                if sound:
                    #print("Sound found at %s" % sound.source)
                    #print("Sound is %.3f seconds" % sound.length)
                    sound.play()
                    #time.sleep(5) # delays for 5 seconds
                    sound.stop()
            prev_time = clock();
        '''
        draw_rects(vis_roi, subrects, (255, 0, 0))

if __name__ == '__main__':
    import sys, getopt
    print help_message
    
    #sound = SoundLoader.load('angry_owls.mp3')
    #prev_time = 0

    args, video_src = getopt.getopt(sys.argv[1:], '', ['cascade=', 'nested-cascade='])
    try: video_src = video_src[0]
    except: video_src = 0
    args = dict(args)
    cascade_fn = args.get('--cascade', "../../data/haarcascades/haarcascade_frontalface_alt.xml")
    nested_fn  = args.get('--nested-cascade', "../../data/haarcascades/haarcascade_smile.xml")

    cascade = cv2.CascadeClassifier(cascade_fn)
    nested = cv2.CascadeClassifier(nested_fn)

    cam = create_capture(video_src, fallback = 'synth:bg=../cpp/lena.jpg:noise=0.05')

    while True:
        ret, img = cam.read()
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        gray = cv2.equalizeHist(gray)
        t = clock()
        rects = detect(gray, cascade)
        #print rects
        vis = img.copy()
        if (len(rects) > 0):
            cx, cy = draw_rects(vis, rects, (0, 255, 0))
            draw_sub_rects(rects)
            #draw_sub_rects(rects, prev_time)
            dt = clock() - t
            
            # Allow the owl to track the face
            ANGLE = 15 # must be an integer
            X_MARGIN = 100 # the margin of error to which the owl should try to center
            screen_cx = cam.get(3) / 2.0 # screen's center x value (property 3 is the frame width)
            if ((screen_cx - X_MARGIN/2.0) >= cx):
                print "ROTATE camera LEFT"
                ser.write(str(ANGLE) + '\r\n')
            elif (cx >= (screen_cx + X_MARGIN/2.0)):
                print "ROTATE camera RIGHT"
                ser.write("-" + str(ANGLE) + '\r\n')
            
            '''
            To interpret the angle on the Arduino's
            side, the direction (DIR) is set by the sign and
            the quantity of rotation is set by the angle.

            The Arduino takes care of the bounds of the
            owl neck's rotation, so this code only tells
            the owl's neck by how much it should turn
            and in what direction.
            '''
        
            draw_str(vis, (20, 20), 'time: %.1f ms' % (dt*1000))
        
        cv2.imshow('facedetect', vis)

        if (cv2.waitKey(5) & 0xFF) == 27: # 27 => ESCAPE key
            break
    cv2.destroyAllWindows() # clean up the display
    ser.close() # cleanup the serial operation
    exit()