Outcome

Objectives

This project was about creating a learning-by-doing environment between the robot and the user. The intention was to build a robot that responded to a user's actions with actions of its own. Over the course of a few-minute experience, the user could gain an understanding of how to control the robot with their own movements. Alternatively, given the variety of potential responses, users might instead find themselves trying to teach the robot how they wanted it to respond.

Outcomes

Building a life-size robot was integral to the project's goals: keeping the robot large meant that interactions with it could be more easily related to human-scale movement. However, that size also meant we were constantly challenged to distribute weight so that the smaller-scale components available to us could drive it. We spent a lot of time designing and developing a bearing that could carry the heavier arm loads while being driven by a NEMA-17 stepper motor. That time, along with the time spent developing a way to support each drive plate, cost us progress on the behavior end. Overall, the size and scale of the robot, and the movement we were able to achieve, definitely aided the project's objectives. We were also able to get gestures on the Kinect to move the robot as we intended, but because we did not have time to really experiment with those behaviors, it was still difficult for a user to see what impact they were having on the robot.
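
As a rough illustration of the drive approach, the sketch below shows how a NEMA-17 stepper behind a step/dir driver could position one arm's drive plate with a smooth acceleration ramp. It uses the common AccelStepper Arduino library; the pin numbers, speeds, and gear ratio are assumptions for illustration, not values from our build.

// Hypothetical sketch: positioning one arm's drive plate with a NEMA-17
// stepper through a step/dir driver. Pins, speeds, and gearing are
// illustrative assumptions, not the values used in the actual robot.
#include <AccelStepper.h>

const int STEP_PIN = 2;   // assumed wiring
const int DIR_PIN  = 3;   // assumed wiring

// 200 full steps/rev for a typical NEMA-17, with an assumed 5:1
// reduction through the custom bearing and drive plate.
const float STEPS_PER_REV = 200.0;
const float GEAR_RATIO    = 5.0;

AccelStepper arm(AccelStepper::DRIVER, STEP_PIN, DIR_PIN);

void setup() {
  arm.setMaxSpeed(800.0);      // steps/sec; kept low for the heavy arm
  arm.setAcceleration(400.0);  // gentle ramp to avoid skipped steps
}

// Convert a desired arm angle (degrees) into motor steps.
long angleToSteps(float degrees) {
  return (long)(degrees / 360.0 * STEPS_PER_REV * GEAR_RATIO);
}

void loop() {
  arm.moveTo(angleToSteps(90.0));  // swing the arm to 90 degrees
  arm.run();                       // non-blocking step generation
}

Ramping the speed rather than stepping at a constant rate matters here: a heavy arm on a small motor will skip steps if accelerated too abruptly.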

Implementation

One of the largest visual choices we made was the scale of the robot. Keeping it large let users visualize the robot's movements at human scale, which made it easier to connect their own movements to the robot's. The arms were designed to be abstract: by giving them an organic shape, we hoped to add a layer of mystery to the robot that would intrigue users into playing with it. Lastly, using a Kinect, we were able to capture real human movements, which we analyzed in order to move the robot accordingly (a minimal sketch of this kind of mapping follows).
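
To make the capture-to-motion link concrete, here is a minimal, self-contained sketch of the kind of mapping involved: it derives an arm-raise angle from Kinect-style shoulder and wrist joint positions and clamps it into a target angle for the robot's arm. The Joint struct, the joint values, and the mechanical range are hypothetical stand-ins; the actual pipeline lives in the linked repository.

// Hypothetical mapping from Kinect-style joint positions to a robot arm
// angle. The Joint struct and the clamping range are illustrative; real
// Kinect SDKs expose skeleton joints in a similar camera-space form.
#include <cmath>
#include <cstdio>
#include <algorithm>

struct Joint { float x, y, z; };  // camera-space position, meters

// Angle of the arm relative to hanging straight down (degrees).
float armRaiseDegrees(const Joint& shoulder, const Joint& wrist) {
  float dx = wrist.x - shoulder.x;
  float dy = wrist.y - shoulder.y;
  // atan2 of horizontal reach vs. downward drop: 0 = arm down, 90 = level.
  float radians = std::atan2(std::fabs(dx), -dy);
  return radians * 180.0f / 3.14159265f;
}

int main() {
  Joint shoulder{0.00f, 1.40f, 2.0f};
  Joint wrist   {0.45f, 1.30f, 2.0f};  // arm raised nearly level

  float user = armRaiseDegrees(shoulder, wrist);
  // Clamp to the robot's (assumed) mechanical range before commanding it.
  float target = std::clamp(user, 0.0f, 120.0f);
  std::printf("user arm at %.1f deg -> robot target %.1f deg\n",
              user, target);
  return 0;
}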

Technical Documentation

Code available here: https://github.com/PseudoSky/el-matterdoor.git
