Remaking Shaker (1996)

Made by Leslie Liu and Ziru Wei

Remaking a past IoT case study to explore nonverbal, expressive ambient communication.

Created: December 8th, 2024


💡 Intention


We planned to try a new framework for paired devices, mimicking the motion of two objects under physical laws to make events on the other end more evident. Everyone has an intuitive understanding of gravity: when using a scale, we easily understand why one side sinks and the other rises; when pulling a blinds cord, the cord descends while the blinds rise. These movements indicate that a force is at work within the physical mechanical system, which we can use to reflect the presence of the other device in our paired-device system.

We started by studying historical cases of remote gestures and motions, from Shaker, which translates movement on either side into shakes, to the broader collection of paired devices in the Yo—Yo Machines project gallery. Then we moved on to making our own paired device, putting our ideas into practice.


[HISTORICAL CASE]


Our precedent was Rob Strong and William Gaver's Shaker from their 1996 paper, “Feather, Scent, and Shaker: Supporting Simple Intimacy.” Developed at the Royal College of Art's Computer Related Design department, this device aims to facilitate “nonverbal, emotional communication between separated lovers... for less intimate friendships and more symmetrical communication.” By shifting the context of peripheral, expressive awareness devices from the professional and public realm to the domestic and personal, these standalone devices allow for more ambiguous, impressionistic, and emotive communication.


[PHASE I: REMAKING]


Approach & BOM


The original Shaker setup includes a sender and a receiver device for each person in the pair. When person A shakes their sender device, the mechanical movement of a metal rod inside it induces an electric current in a coil; that signal travels to person B’s receiver device and activates its solenoid, producing movement proportional to A’s gesture.

In our remake, we retained the use of the ball tilt sensor, focusing on ensuring that the original shake input would be reliably registered.

Our bill of materials includes the following (each x2): a Particle Photon 2 microcontroller, an INA219 for current monitoring, an L298N H-bridge for motor driving, and a complete solenoid assembly that we would determine as we remade the original.


Process


While our original BOM called for solenoids to register transmitted signals, we opted for a micro servo to keep the setup minimalist and focused, since we felt our remake only needed to reference Gaver and Strong’s original loosely.

We considered using a three-axis accelerometer to detect changes in orientation before settling on the ball tilt sensor, which the PhysComp Lab graciously had in stock.


Prototype

The ball tilt sensor, connected to a breadboard that also carries a micro servo, registers shakes within a set time window. The Photon 2 reads this input and publishes it to the Particle cloud, which would then drive the micro servo in response to the most recent registered shake. In this prototype, the servo’s movement is manually controlled by a team member to simulate the networking.


const int tiltPin = D2;                        // ball tilt sensor input
unsigned long lastPublish = 0;
const unsigned long PUBLISH_INTERVAL = 2000;   // minimum ms between cloud publishes
const unsigned long SHAKE_WINDOW = 2000;       // ms window for counting tilt transitions
const int SHAKE_THRESHOLD = 10;                // state changes per window that count as a shake

int lastState = 0;
int stateChanges = 0;
unsigned long windowStart = 0;

void setup() {
    Serial.begin(115200);
    pinMode(tiltPin, INPUT_PULLUP);
    windowStart = millis();
}

void loop() {
    int currentState = digitalRead(tiltPin);
    unsigned long currentTime = millis();
    
    if (currentState != lastState) {
        stateChanges++;
        lastState = currentState;
        Serial.print("State change: ");
        Serial.println(stateChanges);
    }
    
    if (currentTime - windowStart >= SHAKE_WINDOW) {
        bool isShaking = (stateChanges >= SHAKE_THRESHOLD);
        
        if (currentTime - lastPublish >= PUBLISH_INTERVAL) {
            String data = String(isShaking ? "1" : "0");
            Particle.publish("shake-detect", data, PRIVATE);
            Serial.print("Shake detected: ");
            Serial.println(isShaking ? "YES" : "NO");
            lastPublish = currentTime;
        }
        
        stateChanges = 0;
        windowStart = currentTime;
    }
    
    delay(10);  // brief pause to pace the loop
}

Reflection


During our research we found Yo—Yo Machines (https://www.yoyomachines.io/), a 2020 website that serves as an extension and continuation of the case study. As part of the Yo—Yo Machines team, Gaver notes the ways in which contemporary flows of information have shifted social norms and dynamics around ambient, networked communication. Reflecting on this remake, we discussed a shared interest in expressing mixed or ambiguous feelings through signals that range in clarity, from the vague and indecipherable to the clear and readily indexable.





----------------------------------------------




[PHASE II: REINTERPRETING]




Approach


We all know that a seesaw is a lever; when one side is heavier, that end will sink, and the opposite side will rise. If neither side is occupied or if the weights are equal, both ends will remain level.

But what if two people are not in the same location? Can they still play on a seesaw together? How can they interact with a seesaw to feel each other's presence?

Here is our strategy: we use a distance sensor to detect the presence of objects, transmit variables through the Particle platform, and adjust the movement of the linear actuators on both sides based on the status of objects on each side. In this system the seesaw’s movement is no longer driven by gravity, yet you can still play on it with a friend who is miles away.


Process


The process began with learning from precedent projects like Shaker and the Yo—Yo Machines, followed by a brainstorming phase in which we selected the see-saw concept over the carrot design. The development phase then split into technical implementation (coding and electronics) and physical crafting (mechanical design and electronics packaging), ultimately culminating in final integration and testing.


Conceptual Designs


During our study of precedents we learned a lot, but we remained puzzled by the obscure encoding and decoding process: what happens on the receiver’s end when the shaker shakes, and what is the intent? After playing with our remake prototype, we found that such ambiguous signals reduced the enjoyment of the product and introduced unnecessary guesswork.

Therefore, we decided to try a new framework for our paired device, mimicking the motion of two objects under physical laws to make events on the other end more evident. Gravity is intuitive to everyone: on a scale, we easily understand why one side sinks while the other rises; pulling a blinds cord makes the cord descend while the blinds rise. These movements indicate that a force is at work within a physical mechanical system, and that same cue can reflect the presence of the other device in our paired system.

The concept of "remote carrot" is based on a legend that all the carrot roots on Earth are connected, so pulling one carrot causes another to sink slightly.

The "remote see-saw" concept may evoke memories of your childhood, playing on the seesaws with friends at the playground. This up-and-down interaction is quite fun.

In the end, we chose the see-saw concept after assessing its feasibility, fun, and intuitiveness.


Prototype


We have open-sourced everything about our project on GitHub, except for our Particle tokens. We also use this repo for documenting our technical trials. Feel free to visit!

There you can view all of the sketches, our web-hook setup, and our legacy tests, including trials with various sensors (from ball tilt sensors to distance sensors), the transition from button-controlled to sensor-triggered linear actuators, and our exploration of Particle cloud web-hooks.


BOM table

This BOM table lists all the required components for each device.

Wiring Diagram

  • Pin Setup
    • LED indicators:
      • Red (D2)
      • Yellow (D0)
      • Blue (D1)
    • QTR-1RC sensor on D3
    • L298N motor controller:
      • Enable (A2)
      • Direction controls (D6, D7)
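
For reference, here is a minimal pin-definition sketch matching the wiring above; the constant names are our own shorthand rather than the exact identifiers in the repo, but the pin numbers come straight from the list.

#include "Particle.h"

// Pin assignments per the wiring diagram (Particle Photon 2)
const int RED_LED    = D2;   // actuator position state
const int YELLOW_LED = D0;   // local object detection
const int BLUE_LED   = D1;   // remote device's object detection
const int QTR_PIN    = D3;   // QTR-1RC reflectance sensor
const int MOTOR_EN   = A2;   // L298N enable
const int MOTOR_IN1  = D6;   // L298N direction control
const int MOTOR_IN2  = D7;   // L298N direction control

void setup() {
    pinMode(RED_LED, OUTPUT);
    pinMode(YELLOW_LED, OUTPUT);
    pinMode(BLUE_LED, OUTPUT);
    pinMode(MOTOR_EN, OUTPUT);
    pinMode(MOTOR_IN1, OUTPUT);
    pinMode(MOTOR_IN2, OUTPUT);
    // The QTR-1RC is read by RC-decay timing, so its pin mode is toggled
    // inside the sensor-reading routine rather than fixed here.
}

void loop() {
}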
Sketches and Particle web-hook


https://github.com/zuriniw/RemoteSeesaw


Workflow Diagram


Component Functions
- 🔴 Red LED: actuator position state
- 🟡 Yellow LED: local object detection
- 🔵 Blue LED: remote device's object detection

Movement Logic
This logic uses the linear actuators to simulate how a seesaw responds to objects under gravity; a minimal sketch follows the list below.
- ● ● / ○ ○ —> Both sides same state: Move to middle position (totalLength/2)
- ● ○ —> Object in Device 1 only: Device 1 extends, Device 2 retracts
- ○ ● —> Object in Device 2 only: Device 1 retracts, Device 2 extends
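
As a rough sketch of this decision logic (our simplification, not the exact repo code), the local device can compute its actuator target as follows. The helper name seesawTarget is hypothetical; totalLength follows the name used in this write-up, and in the actual firmware the returned target would be passed to moveToPosition(), sketched under Actuator Control below.

#include "Particle.h"

const int totalLength = 3000;  // assumed full actuator stroke, expressed in ms of travel time

// Returns where the local actuator should sit, given both detection states.
int seesawTarget(bool localDetected, bool remoteDetected) {
    if (localDetected == remoteDetected) return totalLength / 2;  // both sides same state: middle
    if (localDetected)                   return totalLength;      // object on this side only: extend
    return 0;                                                     // object on remote side only: retract
}

void setup() {
    Serial.begin(115200);
}

void loop() {
    // Example: object detected locally, nothing on the remote side
    Serial.println(seesawTarget(true, false));  // prints 3000 (this side extends)
    delay(2000);
}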

Communication System
- Communication via Particle.publish() and Particle.subscribe() (see the sketch after this list)
- Device 1 broadcasts "doBlue_1" events
- Device 2 broadcasts "doBlue_2" events
- Each device updates state based on partner events
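
A minimal publish/subscribe sketch for this scheme, written from Device 1's point of view, might look like the following; the event names match those above, while the handler name and the "0"/"1" payload format are our assumptions.

#include "Particle.h"

bool localDetected  = false;  // updated from the local QTR-1RC reading
bool remoteDetected = false;  // updated from the partner device's events

// Runs whenever the partner device publishes its detection state
void onRemoteEvent(const char *event, const char *data) {
    remoteDetected = (strcmp(data, "1") == 0);
    digitalWrite(D1, remoteDetected ? HIGH : LOW);  // blue LED mirrors the remote state
}

void setup() {
    pinMode(D1, OUTPUT);
    // Device 1 listens for Device 2's broadcasts; Device 2 would subscribe to "doBlue_1"
    Particle.subscribe("doBlue_2", onRemoteEvent);
}

void loop() {
    // Broadcast the local detection state as a "doBlue_1" event
    Particle.publish("doBlue_1", localDetected ? "1" : "0", PRIVATE);
    delay(2000);  // stay within Particle's publish rate limit
}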

Actuator Control
Linear actuators let us control direction, speed, and timing, but we aim to use them to reach different positions in three modes to simulate seesaw movement. Here are our strategies (a rough sketch follows the list):
- Position tracking using time-based calculations and movement calculations based on speed
- Movement functions: extend, retract, stop, and moveToPosition(int targetPosition)
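
Because the actuators provide no position feedback, position has to be dead-reckoned from run time. The sketch below is a rough illustration of that approach; the millisecond-based position unit, the pin constants, and the blocking delay() are simplifications of ours rather than the exact repo code.

#include "Particle.h"

// L298N pins, per the wiring diagram
const int MOTOR_EN  = A2;
const int MOTOR_IN1 = D6;
const int MOTOR_IN2 = D7;

const int totalLength = 3000;  // assumed full stroke, in ms of travel at full speed
int currentPosition = 0;       // estimated extension, tracked in the same ms unit

void extend()       { digitalWrite(MOTOR_IN1, HIGH); digitalWrite(MOTOR_IN2, LOW);  digitalWrite(MOTOR_EN, HIGH); }
void retract()      { digitalWrite(MOTOR_IN1, LOW);  digitalWrite(MOTOR_IN2, HIGH); digitalWrite(MOTOR_EN, HIGH); }
void stopActuator() { digitalWrite(MOTOR_EN, LOW); }

// Run the actuator for as long as the distance should take at full speed,
// then stop and update the tracked position. Blocking for simplicity; the
// same idea can be implemented non-blockingly with millis().
void moveToPosition(int targetPosition) {
    int delta = targetPosition - currentPosition;
    if (delta == 0) return;

    if (delta > 0) extend();
    else           retract();

    delay(abs(delta));
    stopActuator();
    currentPosition = targetPosition;
}

void setup() {
    pinMode(MOTOR_EN, OUTPUT);
    pinMode(MOTOR_IN1, OUTPUT);
    pinMode(MOTOR_IN2, OUTPUT);
}

void loop() {
    moveToPosition(totalLength / 2);  // e.g., balance the seesaw at the middle
    delay(5000);
}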

[WORKING WITH AI COPILOTS]

Throughout our process we integrated ChatGPT 4o and Perplexity as AI copilots. The code block below includes three instances of how we approached using these models as ideation, execution, and troubleshooting aids.

While we found Perplexity and ChatGPT helpful, we noted during setup that a solid grasp of IoT fundamentals, developed in the earlier Skills Dev labs, was integral to using these tools effectively; it was easy to misdiagnose an issue or bug if we did not fully understand what the code we wrote actually did.

You are a DIYer remaking the Shaker devices as conceptualized 
in Rob Strong and William Gaver's 1996 paper "Feather, Scent, 
and Shaker." Give examples of common gestures that involve 
shaking, as well as interaction metaphors to consider.

--> we used this prompt during ideation phases as we were 
considering different scenarios in which shaking is socially 
acceptable / common / expected.

---

you are moving your tilt switch demo of Shaker from a breadboard 
to a more permanent, soldered protoboard. does the protoboard 
require power rails? list pros and cons within 150 words. 

--> in the absence of instructors, we consulted an LLM about the 
feasibility of moving our hardware from a breadboard to a 
protoboard, understanding that these models largely reflect a 
statistical average, to get a gist of how one might approach 
building this remake.

---

integrate the 2 code boxes,do not change the features, syntax, 
parameters,just merge them.

--> We used this prompt because we had first tested two pieces of 
code, each driving one component, to make sure they worked separately.
--> Once we knew they worked on their own, we asked the LLM to merge 
the code and tested whether they worked together.
--> This helped us move forward step by step, making it easier 
to debug both software and hardware.

[REFLECTION AND CRITIQUE]

It would be even more interesting if we could implement tangible gravitational feedback! For instance, if a feather is placed on one side and an egg on the other, the side with the egg would descend further. This could subtly initiate a competition between both sides about what to place. However, our current prototype reflects that the intention and presence are what matter most, not the objects placed.
