Made by Harley Guo

Created: April 5th, 2022



As we struggle through our daily chores, we can only focus on a few things at a time. Whether working, studying, making or learning, we almost never pay enough attention to our immediate environment and surroundings. This motivated us to design and develop a solution that keeps track of what is happening around us as we go about our daily routines. It should give us a chance to self-reflect and encourage us to be mindful of what often gets neglected in life.



The concept of self-tracking is nothing new. There are ample examples affiliated with the Quantified Self initiative* that log, study and visualize personal data for purposes such as health monitoring, self-expression and more. We want to build upon this concept of self-quantification and make informed decisions about both tracking and interpreting personal data to suit our somewhat less tangible goal: being mindful of the world we live in.



We were quick to decide that we wanted some form of wearable device that logs environmental data according to the wearer's movement. It was also clear that we needed a method to translate those data into a compelling visual representation, which could in turn be presented and archived in a format resembling a timeline.

We would like the experience of wearing and using this device to be personal, but also shareable. It could potentially be a conversation starter or a social topic for groups of people.



Our final product is a wearable device that clips onto a lanyard, keychain or pocket. It is based on a Particle Argon microcontroller with an onboard accelerometer, thermometer and ambient light sensor. The accelerometer detects motion and triggers the device to send temperature and light readings to the cloud whenever the wearer moves.


Additional software runs on a server to log the data sent by the wearable. At the end of each day, it translates a day's worth of data into a pixelated image. Specifically, each temperature and light reading is mapped to an HSL color, with temperature mapped to hue, light mapped to lightness, and saturation held constant. The HSL color is then easily converted to an RGB color. With hundreds to thousands of readings a day, we arrange those pixels into a 2D array to form a pixelated image representation of the wearer's day.
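As a minimal sketch of this mapping (the sensor ranges used here are illustrative placeholders for the per-day minimum and maximum values the real script computes from the logged CSV):

```python
import colorsys

SAT = 50 / 255  # fixed saturation, matching the generator script below

def reading_to_rgb(temp, light, temp_range=(0, 40), light_range=(0, 4095)):
    """Map one (temperature, light) reading to an RGB pixel.

    Temperature is remapped to hue and light to lightness; the
    ranges here are illustrative assumptions, not the real per-day
    min/max values.
    """
    h = (temp - temp_range[0]) / (temp_range[1] - temp_range[0])
    l = (light - light_range[0]) / (light_range[1] - light_range[0])
    r, g, b = colorsys.hls_to_rgb(h, l, SAT)
    return int(r * 255), int(g * 255), int(b * 255)

# a warm, bright reading produces a light pixel
print(reading_to_rgb(30, 3500))
```

Note that `colorsys` orders its arguments as HLS rather than HSL, so lightness is passed second.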

The image is automatically uploaded to an Instagram account associated with the wearer, allowing them to check back and reflect upon their day.



This has been a very pleasant project to work on. At the start, we were not sure whether our method of translation would produce visually substantial images, but the images generated from our sample data are reassuring.

If we were given more time, we would improve the physical housing of the wearable and shrink unnecessary components such as the breadboard and wiring to make a more compact and desirable device. We would also experiment with different methods of translating sensor inputs to colored pixels, as our current method has its limitations (namely, it tends to produce too much green and purple) and the current mapping is not the most intuitive for users to understand. The publishing platform is also worth reconsidering, as Instagram content can only be laid out as an image feed. Arranging images in a monthly or even yearly overview format could lead to even more subtle and valuable observations.


Schematics and code samples

/*  retrospectre particle argon iot board source code
 *  accelerometer triggers cloud upload of light and temp sensor inputs
 *  Harley Guo
 *  March 2022
 *  Carnegie Mellon University
 */

#include <string.h>

const int xPin = A1;
const int yPin = A2;
const int zPin = A3;
const int lightPin = A5;
const int tempPin = A4;
const double THRESHOLD = 0.05;

bool threshold(double x) {
    return x > THRESHOLD;
}

void setup() {
    pinMode(lightPin, INPUT);
    pinMode(tempPin, INPUT);
}

void loop() {
    int x = 0;
    int y = 0;
    int z = 0;
    int light, temp;
    double xG, yG, zG;

    char outbuf[50];
    char *outp = outbuf;

    // avg 10 accelerometer reads
    for (int i = 0; i < 10; i++) {
        x += analogRead(xPin);
        y += analogRead(yPin);
        z += analogRead(zPin);
    }
    x /= 10;
    y /= 10;
    z /= 10;

    // calculate g from voltage: 12-bit ADC at 3.3 V, 1.65 V zero-g bias,
    // 330 mV per g (the z axis carries its own offsets)
    xG = (((double)x * 3.3 / 4095) - 1.65) / 0.330;
    yG = (((double)y * 3.3 / 4095) - 1.65) / 0.330;
    zG = ((((double)z * 3.3 / 4095) - 1.80) / 0.330) - 0.60;

    // accelerometer triggers a cloud upload of "light|temp"
    if (threshold(xG) || threshold(yG) || threshold(zG)) {
        light = analogRead(lightPin);
        temp = (analogRead(tempPin) - 500) / 10;
        // write buffer
        outp += sprintf(outp, "%d", light);
        outp += sprintf(outp, "%s", "|");
        outp += sprintf(outp, "%d", temp);
        if (Particle.connected()) {
            Particle.publish("info", outbuf);
        }
    }
}
[Figure: Argon wiring schematic (an Arduino Micro stands in for the Particle Argon)]
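The voltage-to-g arithmetic in the firmware can be sanity-checked off-device. This small sketch mirrors that conversion under the same assumed constants (12-bit ADC at 3.3 V reference, 1.65 V zero-g bias, 330 mV per g):

```python
def adc_to_g(raw, bias_v=1.65, sensitivity_v_per_g=0.330,
             vref=3.3, adc_max=4095):
    """Mirror the firmware's ADC-count-to-acceleration conversion."""
    volts = raw * vref / adc_max          # ADC count -> voltage
    return (volts - bias_v) / sensitivity_v_per_g

mid = round(1.65 / 3.3 * 4095)            # ADC count at the zero-g bias
print(round(adc_to_g(mid), 2))            # a mid-scale reading is ~0 g
```

A reading 330 mV above the bias voltage comes out as roughly 1 g, which matches the trigger threshold logic operating on small fractions of g.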
# Listens to the particle cloud event stream and logs data into a csv
# Harley Guo
# Carnegie Mellon University
# March 2022

from sseclient import SSEClient
import json, csv

# TOKEN holds the Particle cloud event-stream URL (including the access token)
MSG = SSEClient(f'{TOKEN}')

print("file name: ", end='')
FILE = input()

def main():
    for msg in MSG:
        data = str(msg)
        if not data.strip():
            continue  # skip keep-alive events
        j = json.loads(data)
        s = j["data"].split("|")   # payload format: "light|temp"
        s = list(map(lambda x: int(x), s))
        with open(FILE, 'a', newline='') as f:
            writer = csv.writer(f)
            writer.writerow(s)

if __name__ == "__main__":
    main()
# Autogenerating pixel images from csv file
# Harley Guo
# Carnegie Mellon University
# March 2022

from PIL import Image
import numpy as np
import csv, math, colorsys

print("csv file name: ", end='')
FILE = input()
print("save file name: ", end='')
SAVEFILE = input()
SAT = 50
SIZE = 1080

def getInRange(x, l, r):
    # clamp x to [l, r]
    if x < l:
        return l
    elif x > r:
        return r
    return x

def main():
    # first pass: find the day's min/max temperature and light readings
    with open(FILE, 'r') as tf:
        treader = csv.reader(tf)
        tempMin, tempMax, lightMin, lightMax = 4095, 0, 4095, 0
        rowCount = 0
        for row in treader:
            temp = float(row[1])
            light = int(row[0])
            if temp < tempMin:
                tempMin = temp
            if temp > tempMax:
                tempMax = temp
            if light < lightMin:
                lightMin = light
            if light > lightMax:
                lightMax = light
            rowCount += 1

    # second pass: map each reading to an RGB pixel
    with open(FILE, 'r') as f:
        reader = csv.reader(f)
        size = math.floor(math.sqrt(rowCount))
        pixels = [[] for _ in range(size)]
        res = size ** 2

        counter = 0

        for row in reader:
            if counter >= res:
                break

            imageRow = math.floor(counter / size)

            # temp as hue, light as lightness, fixed saturation
            h = float(row[1])
            s = round(SAT / 255, 3)
            l = int(row[0])

            # remap readings into [0, 1]
            h = round(np.interp(h, [tempMin, tempMax], [0, 1]), 4)
            l = round(np.interp(l, [lightMin, lightMax], [0, 1]), 4)

            r, g, b = colorsys.hls_to_rgb(h, l, s)
            r, g, b = int(r*255), int(g*255), int(b*255)
            pixels[imageRow].append([r, g, b])
            counter += 1

    # nearest-neighbour upscale of the size x size grid toward SIZE x SIZE
    png = []
    for i in range(size):
        temp = []
        for j in range(size):
            for k in range(int(SIZE/size)):
                temp.append(pixels[i][j])
        for k in range(int(SIZE/size)):
            png.append(temp)

    arr = np.array(png, dtype=np.uint8)

    new_image = Image.fromarray(arr)
    new_image.save(f'{SAVEFILE}.png')
    print(f'saved as {SAVEFILE}.png...')

if __name__ == "__main__":
    main()
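The nested upscaling loops at the end of the script could also be expressed with NumPy's repeat; a minimal sketch with an illustrative 2x2 grid:

```python
import numpy as np

# a 2x2 grid of RGB pixels (illustrative values)
grid = np.array([[[255, 0, 0], [0, 255, 0]],
                 [[0, 0, 255], [255, 255, 255]]], dtype=np.uint8)

# nearest-neighbour upscale: repeat each pixel along both axes
scale = 2
big = np.repeat(np.repeat(grid, scale, axis=0), scale, axis=1)
print(big.shape)  # (4, 4, 3)
```

This keeps the whole image as one array and avoids building intermediate Python lists.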


48-528 Responsive Mobile Environments

As part of this project-based course, we'll get hands-on with emerging technologies, concepts and applications in the internet of things through a critical socio-technical lens. Over its 15 weeks, ...