Group 4: LumiTouch Frames to How's the Weather
Made by Freya Yang and Isabel Fleck
Created: December 11th, 2024
This project draws inspiration from the LumiTouch frames, a historical emotional-communication device designed and developed at the MIT Media Lab, which uses haptic and visual elements to convey feelings. Drawing from theories of affective computing and emotional design (Norman, 2004), this project emphasizes subtlety and intimacy in interaction. The bird design was inspired by symbolic connections to nature and harmony, aligning with LumiTouch's objective of gentle, non-intrusive communication. We reference prior work on embodied interaction (Dourish, 2001) to underline the importance of physical touch in human connection.
How’s the Weather aims to reimagine personal emotional communication in a digitally connected world. This interactive IoT device enables users to check in with a friend’s mood and express their own feelings through tactile interaction and synchronized light patterns. The design’s driving intent is to foster emotional closeness despite physical distance, using the metaphor of weather to represent moods. By connecting two people in real-time with a tangible, engaging medium, Lumi Touch offers a solution for subtle, meaningful communication in today’s fragmented digital landscape.
The key goal was to rebuild the Lumi Touch frame as an IoT-enabled emotional interface. We focused on recreating the essence of mood communication through touch and light. Prioritizing simplicity and emotional resonance, we emphasized creating a tactile, interactive object with seamless integration of IoT components.
First, we started with initial sketches, brainstorming metaphors of touch and weather early on. Our material choices included NeoPixel LEDs for dynamic lighting and a Particle Photon for connectivity. We then developed the interaction flow: squeezing the bird sends a signal that triggers corresponding lights on a paired device.
Next, we moved on to prototyping. We started by connecting a basic pressure sensor to an LED light to test input-output functionality. Once we achieved control of the LED light through pressure, we integrated a NeoPixel LED strip to produce more complex lighting patterns. We designed lighting responses that varied based on both the intensity of pressure and the duration of the touch.
After achieving reliable lighting control, we connected two devices via Particle Photon microcontrollers. When one user pressed the sensor on their device, the paired device lit up in real time, successfully achieving synchronized emotional communication. Finally, we built the physical prototype using foam board and paper, crafting the window frame shape to reinforce the metaphor of looking out at the weather.
The main challenge in this process was ensuring real-time responsiveness and crafting a bird that is both ergonomic and durable, which we would need to address to push our prototypes further toward the design concept.
Input: User squeezes the bird
Processing: The signal is sent via the Particle Photon
Flow: Squeeze -> IoT signal transmission -> Light activation on paired device.
[Figure 1: First attempt, using a single LED as the first prototype demo]
[Figure 2: Second Attempt, using the NeoPixel LED Strip]
#include "Particle.h"
#include "neopixel.h"

SYSTEM_MODE(AUTOMATIC);

// Define pin and NeoPixel configuration for Photon 2
#if (PLATFORM_ID == 32)
  #define NEOPIXEL_PIN SPI  // Use MOSI for Photon 2
#else
  #define NEOPIXEL_PIN D3   // Use D3 for other Particle devices
#endif

#define NUM_PIXELS 8        // Number of LEDs in the NeoPixel strip
#define PIXEL_TYPE WS2812B  // Specify the NeoPixel type

// Pressure sensor pin
#define PRESSURE_PIN A0

// Create a NeoPixel object
Adafruit_NeoPixel strip(NUM_PIXELS, NEOPIXEL_PIN, PIXEL_TYPE);

// Function prototypes for weather effects and the event handler
void thunderEffect();
void rainEffect();
void cloudyEffect();
void sunnyEffect();
void weatherEventHandler(const char* event, const char* data);

void setup() {
    strip.begin();      // Initialize NeoPixel strip
    strip.show();       // Turn off all LEDs initially
    Serial.begin(9600); // Debugging

    // Subscribe to events from the Particle Cloud
    Particle.subscribe("weatherEvent", weatherEventHandler);
}

// Function to publish weather events
void publishWeatherEvent(const char* eventName) {
    String event = String("weatherEvent/") + eventName;
    Particle.publish(event);                // Publish the event with the prefix
    weatherEventHandler(event.c_str(), ""); // Call event handler immediately
}

// Particle event handler
void weatherEventHandler(const char* event, const char* data) {
    String eventString = String(event);
    if (eventString == "weatherEvent/sunny") {
        sunnyEffect();   // Call sunny effect function
    } else if (eventString == "weatherEvent/cloudy") {
        cloudyEffect();  // Call cloudy effect function
    } else if (eventString == "weatherEvent/rainy") {
        rainEffect();    // Call rainy effect function
    } else if (eventString == "weatherEvent/thunder") {
        thunderEffect(); // Call thunder effect function
    }
}

void loop() {
    // Read the pressure sensor value
    int pressure = analogRead(PRESSURE_PIN);
    Serial.println(pressure); // Debugging the sensor value

    // Check how long the pressure is held
    static unsigned long pressStartTime = 0;
    static bool isPressed = false;
    if (pressure > 1000) { // Threshold for detecting a press
        if (!isPressed) {
            pressStartTime = millis(); // Start timing
            isPressed = true;
        }
    } else {
        isPressed = false;
    }

    if (isPressed) {
        unsigned long pressDuration = millis() - pressStartTime;
        if (pressDuration > 5000) {
            publishWeatherEvent("thunder");
        } else if (pressDuration > 3000) {
            publishWeatherEvent("rainy");
        } else if (pressDuration > 2000) {
            publishWeatherEvent("cloudy");
        } else if (pressDuration > 1000) {
            publishWeatherEvent("sunny");
        }
    } else {
        strip.clear(); // Turn off LEDs when not pressed
        strip.show();
    }
}

// Thunder: Blue and yellow flash
void thunderEffect() {
    for (int i = 0; i < NUM_PIXELS; i++) {
        strip.clear();
        if (i % 2 == 0) {
            strip.setPixelColor(i, strip.Color(0, 0, 128));   // Dim blue
        } else {
            strip.setPixelColor(i, strip.Color(128, 128, 0)); // Dim yellow
        }
        strip.show();
        delay(50);
    }
    strip.clear();
    strip.show();
}

// Rain: Light dim blue light
void rainEffect() {
    for (int i = 0; i < NUM_PIXELS; i++) {
        strip.clear();
        strip.setPixelColor(i, strip.Color(0, 0, 128)); // Dim blue
        strip.show();
        delay(100);
    }
}

// Cloudy: Dim violet/lavender light
void cloudyEffect() {
    static int brightness = 0;
    static int direction = 5;
    brightness += direction;
    if (brightness >= 100 || brightness <= 0) {
        direction = -direction;
    }
    uint32_t color = strip.Color(75 + brightness, 50 + brightness, 150 + brightness); // Dim lavender/violet
    for (int i = 0; i < NUM_PIXELS; i++) {
        strip.setPixelColor(i, color);
    }
    strip.show();
    delay(50);
}

// Sunny: Warm orange light
void sunnyEffect() {
    static int brightness = 0;
    static int direction = 5;
    brightness += direction;
    if (brightness >= 255 || brightness <= 50) {
        direction = -direction;
    }
    uint32_t color = strip.Color(255, 140, 0); // Warm orange
    for (int i = 0; i < NUM_PIXELS; i++) {
        strip.setPixelColor(i, color);
    }
    strip.show();
    delay(50);
}
When checking in on a friend, we often ask how they are feeling. Taking the idea of an ambient device that lets another person know you are thinking of them, we reinterpreted it in the form of a window and a bird. When we think about someone, we often look out the window; drawing on this symbolic gesture, we combined the form of a window with our shared connotations around weather and emotion as a way to communicate how we are feeling to a friend. With a bird perched on a windowsill, the way you treat the bird mirrors your emotional state: the happier you are, the more gently you handle the bird, while if you are upset, you might squeeze it harder. Based on how lightly or harshly you squeeze, the weather display reflects these feelings: a good day is sunny, a bad day is stormy, and so forth. We chose weather as our way to communicate emotion because we often associate weather with our moods; looking out on a rainy day, we feel sad. The metaphor also supports the gesture of checking in on someone: asking how someone is feeling, like a rain check, becomes an invitation to reconnect with a friend.
We envisioned Lumi Touch as a system that could scale to additional contexts, such as team collaboration or mental health monitoring. Exploring weather as a metaphor allowed for universal emotional representation.
Iteration 1: Explored additional moods represented by weather states (e.g., thunderstorms for distress).
Iteration 2: Developed multiple prototypes to simulate group interactions (e.g., syncing two devices).
Reflection
Looking back on our design, the biggest thing I would want to change is giving the device a default setting in which one person poses the initial question, instead of simply broadcasting an emotion out of the blue. Adding this initial asking state, with its own distinct light and pattern, would make the feeling of being thought of more evident. The interaction would also feel less cold and self-centered, and would really emphasize the connection of thinking of someone. This change would also require the responding light setting to last through the day, so that the device does not offer only live feedback.
How's the Weather:
An interactive IoT device for meaningful connections. Check in with a friend's mood and share yours through touch and light.
Design concept: An Interactive IoT device that allows you to check in with a friend about their current mood and how they are feeling, and tell them how you are doing.
[Figure 3: How's the Weather Design Concept Storyboard]
#include "Particle.h"
#include "neopixel.h"

SYSTEM_MODE(AUTOMATIC);

// Define pin and NeoPixel configuration for Photon 2
#if (PLATFORM_ID == 32)
  #define NEOPIXEL_PIN SPI  // Use MOSI for Photon 2
#else
  #define NEOPIXEL_PIN D3   // Use D3 for other Particle devices
#endif

#define NUM_PIXELS 8        // Number of LEDs in the NeoPixel strip
#define PIXEL_TYPE WS2812B  // Specify the NeoPixel type

// Pressure sensor pin
#define PRESSURE_PIN A0

// Create a NeoPixel object
Adafruit_NeoPixel strip(NUM_PIXELS, NEOPIXEL_PIN, PIXEL_TYPE);

// Function prototypes for weather effects and the event handler
void thunderEffect();
void rainEffect();
void cloudyEffect();
void sunnyEffect();
void weatherEventHandler(const char* event, const char* data);

void setup() {
    strip.begin();      // Initialize NeoPixel strip
    strip.show();       // Turn off all LEDs initially
    Serial.begin(9600); // Debugging

    // Subscribe to events from the Particle Cloud
    Particle.subscribe("weatherEvent", weatherEventHandler);
}

// Function to publish weather events
void publishWeatherEvent(const char* eventName) {
    String event = String("weatherEvent/") + eventName;
    Particle.publish(event);                // Publish the event with the prefix
    weatherEventHandler(event.c_str(), ""); // Call event handler immediately
}

// Particle event handler
void weatherEventHandler(const char* event, const char* data) {
    String eventString = String(event);
    if (eventString == "weatherEvent/sunny") {
        sunnyEffect();   // Call sunny effect function
    } else if (eventString == "weatherEvent/cloudy") {
        cloudyEffect();  // Call cloudy effect function
    } else if (eventString == "weatherEvent/rainy") {
        rainEffect();    // Call rainy effect function
    } else if (eventString == "weatherEvent/thunder") {
        thunderEffect(); // Call thunder effect function
    }
}

void loop() {
    // Read the pressure sensor value
    int pressure = analogRead(PRESSURE_PIN);
    Serial.println(pressure); // Debugging the sensor value

    // Determine weather event based on pressure level
    if (pressure > 3600) {
        publishWeatherEvent("thunder");
    } else if (pressure > 2700) {
        publishWeatherEvent("rainy");
    } else if (pressure > 1800) {
        publishWeatherEvent("cloudy");
    } else if (pressure > 900) {
        publishWeatherEvent("sunny");
    } else {
        strip.clear(); // Turn off LEDs when pressure is too low
        strip.show();
    }
}

// Thunder: Blue and yellow flash
void thunderEffect() {
    for (int i = 0; i < NUM_PIXELS; i++) {
        strip.clear();
        if (i % 2 == 0) {
            strip.setPixelColor(i, strip.Color(0, 0, 128));   // Dim blue
        } else {
            strip.setPixelColor(i, strip.Color(128, 128, 0)); // Dim yellow
        }
        strip.show();
        delay(50);
    }
    strip.clear();
    strip.show();
}

// Rain: Light dim blue light
void rainEffect() {
    for (int i = 0; i < NUM_PIXELS; i++) {
        strip.clear();
        strip.setPixelColor(i, strip.Color(0, 0, 128)); // Dim blue
        strip.show();
        delay(100);
    }
}

// Cloudy: Dim violet/lavender light
void cloudyEffect() {
    static int brightness = 0;
    static int direction = 5;
    brightness += direction;
    if (brightness >= 100 || brightness <= 0) {
        direction = -direction;
    }
    uint32_t color = strip.Color(75 + brightness, 50 + brightness, 150 + brightness); // Dim lavender/violet
    for (int i = 0; i < NUM_PIXELS; i++) {
        strip.setPixelColor(i, color);
    }
    strip.show();
    delay(50);
}

// Sunny: Warm orange light
void sunnyEffect() {
    static int brightness = 0;
    static int direction = 5;
    brightness += direction;
    if (brightness >= 255 || brightness <= 50) {
        direction = -direction;
    }
    uint32_t color = strip.Color(255, 140, 0); // Warm orange
    for (int i = 0; i < NUM_PIXELS; i++) {
        strip.setPixelColor(i, color);
    }
    strip.show();
    delay(50);
}
The feedback we received during the demo showcase day revealed several opportunities for improvement:
Miscommunication and Ambiguity: Reviewers pointed out potential confusion in associating specific weather patterns with moods. For instance, the meaning of a “sunny” or “rainy” light could be interpreted differently across users.
Self-Centered Messaging: Concerns were raised about whether users are always sending messages focused on their own mood. Suggestions included adding a mode for users to signal “I am thinking of you” without directly expressing their mood.
Customizability: Reviewers asked whether the mood state could be modified after sending and if the lighting could persist for a longer duration to reflect ongoing emotional states.
Affordances of the Bird: Questions arose about the bird-shaped device’s interaction design. Is squeezing the bird the ideal action, or could it be reimagined to feel more natural or inviting to touch?
Instructions and Usability: How can we better communicate the intended use of the device to new users? Clearer guidance or onboarding could enhance the experience.
Taking the reviewers' critique from the demo showcase day into consideration for future improvement, the next design iteration could introduce a secondary mode for users to communicate "thinking of you" without mood-specific patterns. We could also allow users to modify or sustain mood states over extended periods. We need to revisit the physical affordances of the bird to make interactions more intuitive and comfortable. Finally, developing visual or interactive instructions to ensure clearer usability for first-time users is also worth considering.