
Process

Initial Problem Statement: "How can the peripherals of human-human interaction be replicated in human-computer interaction through mechanical responsive UI?"

The process began with the development of the flower form for my mechanical display. This started as a form with a square base and four identical, symmetrical petals that opened and closed using micro servo motors. Once this form worked reliably, I started work on the system of servos running from my Particle Argons.

Each Argon powered four servos, and the servos were hardcoded to move in sync, one degree at a time, between 0 and 180 degrees. One end position of the servo (0 degrees) was the open state of the flower; at the other end position (180 degrees) the servo pulled the strings attached to the petals, producing the closed state.
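The sweep described above can be sketched as a pure function so the logic is checkable off-device. On the actual hardware this would drive the micro servos through the Particle firmware's Servo library; the function name `sweep` and the structure below are illustrative assumptions, not the project's actual firmware.

```cpp
#include <vector>

// Sketch of the hardcoded sweep: step one degree at a time between
// the open position (0) and the closed position (180). On hardware,
// each angle in the sequence would be written to all four servos in
// sync, with a short delay between steps.
std::vector<int> sweep(int from, int to) {
    std::vector<int> angles;
    int step = (to > from) ? 1 : -1;
    for (int a = from; a != to; a += step) {
        angles.push_back(a);
    }
    angles.push_back(to);  // include the final resting position
    return angles;
}
```

A full open-to-closed sweep therefore visits 181 positions (0 through 180 inclusive), and closing is simply the same sweep run in reverse.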

Once the servos were functional on their own, I fabricated the final flower forms and secured them to the servos. The final flower forms were created using laser-cut Yupo paper (white exterior) and laser-cut cardstock (orange interior). Each flower was glued closed and attached to the servos using fishing line and super glue.

The initial intention was to control the patterns of the servos using an Amazon Alexa, to increase the presence of the entity within the device. I ultimately ran into issues triggering the device's functions from specific Alexa voice commands, so the servos were hardcoded for the experience.
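For context, the abandoned voice-control path would have mapped a spoken command to a target servo state. The sketch below models that dispatch as a plain function so it can be checked without hardware; the command strings and the `handleCommand` name are hypothetical, not from the project's actual code (on a Particle device, a handler like this would typically be exposed as a cloud function for a voice service to call).

```cpp
#include <string>

// Hypothetical command dispatch for the unrealized Alexa integration:
// map a voice-command string to the servo angle for the corresponding
// flower state. Returns -1 for an unrecognized command.
int handleCommand(const std::string& cmd) {
    if (cmd == "open")  return 0;    // open state of the flower
    if (cmd == "close") return 180;  // closed state
    return -1;                       // unknown command
}
```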

