Scent Memo
Made by Shengzhi Wu
A digital device that can store and diffuse scent for reminiscing the past.
Created: February 12th, 2019
Scent Memo is a digital device that can store and diffuse various odors tied to one's memories. Researchers have found that olfaction is strongly wired to memory and emotion, which is why smelling a certain odor can instantly transport us back to an emotionally charged moment from the past. That makes scent a powerful channel for reminiscing and recalling memories. With that in mind, I began to wonder: what if every physical object could store the odor information related to an individual's memory? What if a device could even diffuse that smell and help a person reminisce?
Inspired by previous work on olfactory displays by Joseph Kaye and by David Dobbelstein's project inScent, an olfactory display for mobile notifications, I started exploring the concept of using smell as a design intervention to help people recall memories. My initial idea was to shoot a concept video in which an olfactory display facilitates people's reminiscing while also causing serious dissonance, eventually becoming annoying and disruptive to daily life.
But Daragh commented that this concept might depend too heavily on future technology, and that it imposes a strongly negative opinion of the technology without leaving enough room for discourse.
Therefore, I pivoted toward the nearer future and tried to withhold my own judgment, instead presenting a possible future technology and discussing the consequences and potential scenarios that might come with it.
I borrowed a wooden souvenir from my friend as a pathway to trigger memory. He originally bought the souvenir in Japan, where he spent almost a whole year, so it helps him reminisce about that period of his life and he always keeps it on his desk. From our conversation about which scent he most strongly associates with that memory, I chose an incense mixed with cherry blossom scent. I then decided to pair it with an AR app that recognizes the souvenir and triggers the device to diffuse that smell.
I made a functional prototype out of several pieces of technology. An Android AR app first recognizes the wooden souvenir, then sends a command to a Particle Photon circuit over the Internet; three LEDs light up in sequence to give a sense of progress, and a small fan spins up to diffuse the smell.
For the physical computing, I used a Particle Photon, three LEDs, and a small fan to diffuse the smell. It was my first time using the Particle Photon, and it took time to figure out the Internet communication. I also wrote a bit of code to make the three LEDs light up at different paces, so it feels like the light is running from left to right. The fading uses analogWrite, so it is smoother.
The interesting part of the Particle code is that it constantly listens for a command from the web: once it receives an "on" command it starts working, otherwise it stops.
#include "math.h"
int ledPin1 = D2;
int ledPin2 = D1;
int ledPin3 = D0;
bool isEmitting;
int brightness1 = 0;
int fadeAmount1 = 5;
int brightness2 = 85;
int fadeAmount2 = 5;
int brightness3 = 170;
int fadeAmount3 = 5;
int myValue;
int fan = A0;
void setup ()
{
isEmitting = false;
pinMode(fan, OUTPUT);
pinMode(ledPin1, OUTPUT);
pinMode(ledPin2, OUTPUT);
pinMode(ledPin3, OUTPUT);
digitalWrite(fan,LOW);
Particle.function("led",ledToggle);
}
void loop ()
{
if(isEmitting){
SendScent();
}else{
StopSendingScent();
}
}
void SendScent(){
myValue = random(255);
// digitalWrite(ledPin2,HIGH);
analogWrite(ledPin1, brightness1);
analogWrite(ledPin2, brightness2);
analogWrite(ledPin3, brightness3);
digitalWrite(fan, HIGH);
brightness1 = brightness1 + fadeAmount1;
brightness2 = brightness2 + fadeAmount2;
brightness3 = brightness3 + fadeAmount3;
if (brightness1 <= -50 || brightness1 >= 255) {
fadeAmount1 = -fadeAmount1;
}
if (brightness2 <= -50|| brightness2 >= 255) {
fadeAmount2 = -fadeAmount2;
}
if (brightness3 <= -50 || brightness3 >= 255) {
fadeAmount3 = -fadeAmount3;
}
delay(15);
}
void StopSendingScent(){
analogWrite(ledPin1, 0);
analogWrite(ledPin2, 0);
analogWrite(ledPin3, 0);
digitalWrite(fan, LOW);
}
int ledToggle(String command) {
if (command=="on") {
isEmitting = true;
return 1;
}
else if (command=="off") {
isEmitting = false;
return 0;
}
else {
return -1;
}
}
For the Android AR app, I used Unity and C# to program it. There weren't many resources on connecting a Unity project to a Particle Photon over the Internet, and that was the most difficult part. So I looked at Particle's example of sending a web request to a device and did something similar: the app posts a form from C# using UnityWebRequest to call the Particle cloud function.
For the image recognition and augmented reality part, I used Vuforia, a marker-based AR SDK. I already had a lot of experience with it, so it didn't take much effort.
The final app is built as an Android app and deployed on my Pixel phone.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Networking;

public class PhotonController : MonoBehaviour
{
    bool ledIsOn = false;
    public GameObject uiImages;

    void Start()
    {
        uiImages.SetActive(false);

        // Optional: poll the Particle cloud variable instead of pushing commands.
        //StartCoroutine(GetRequest("https://api.particle.io/v1/devices/3d003a000447363339343638/value?access_token=5c8b7dfdeeda7ff3f10ea3090b6546d4a471dac3"));
    }

    // Polls a Particle cloud variable once per second (not used in the final build).
    IEnumerator GetRequest(string uri)
    {
        while (true)
        {
            using (UnityWebRequest webRequest = UnityWebRequest.Get(uri))
            {
                // Request and wait for the response.
                yield return webRequest.SendWebRequest();

                string[] pages = uri.Split('/');
                int page = pages.Length - 1;

                if (webRequest.isNetworkError)
                {
                    Debug.Log(pages[page] + ": Error: " + webRequest.error);
                }
                else
                {
                    // Parse the JSON response (JSONObject plugin) and grab the "result" value.
                    JSONObject lightData = new JSONObject(webRequest.downloadHandler.text);
                    lightData = lightData["result"];
                    Debug.Log(lightData);
                }
            }
            yield return new WaitForSeconds(1);
        }
    }

    // Calls the Particle cloud function "led" with "on" or "off" as its argument.
    public IEnumerator Upload(string state)
    {
        WWWForm form = new WWWForm();
        form.AddField("arg", state);

        using (UnityWebRequest www = UnityWebRequest.Post("https://api.particle.io/v1/devices/3d003a000447363339343638/led?access_token=5c8b7dfdeeda7ff3f10ea3090b6546d4a471dac3", form))
        {
            yield return www.SendWebRequest();

            if (www.isNetworkError || www.isHttpError)
            {
                Debug.Log(www.error);
            }
            else
            {
                Debug.Log("Form upload complete!");
            }
        }
    }

    private void Update()
    {
        // Turn the scent on while Vuforia is tracking the souvenir, and off once tracking is lost.
        if (DefaultTrackableEventHandler.isTracking)
        {
            if (!ledIsOn)
            {
                StartCoroutine(Upload("on"));
                Debug.Log("uploading on");
                ledIsOn = true;
                uiImages.SetActive(true);
            }
        }
        else
        {
            if (ledIsOn)
            {
                StartCoroutine(Upload("off"));
                Debug.Log("uploading off");
                ledIsOn = false;
                uiImages.SetActive(false);
            }
        }
    }
}
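The Update() method above reads a static DefaultTrackableEventHandler.isTracking flag, which Vuforia does not provide out of the box. Below is a minimal sketch of how that flag could be exposed, assuming the stock DefaultTrackableEventHandler script is trimmed to its tracking callback and extended with the static flag; the class, interface, and status names are Vuforia's, while the static flag is the added piece.

using UnityEngine;
using Vuforia;

// Sketch: Vuforia's DefaultTrackableEventHandler, reduced to the part that matters here,
// with a static isTracking flag added so PhotonController.Update() can ask whether
// the wooden souvenir (the image target) is currently in view.
public class DefaultTrackableEventHandler : MonoBehaviour, ITrackableEventHandler
{
    // True while the image target is being tracked.
    public static bool isTracking = false;

    protected TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    // Vuforia calls this whenever the target's tracking status changes.
    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        isTracking =
            newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
    }
}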
inScent: A Wearable Olfactory Display for Mobile Notifications
This olfactory display uses various scents to deliver mobile notifications to participants. Easily distinguished scents were chosen, such as flowers, mint, lavender, and lemon. A mobile app called Ambient Smell was developed for Android, through which a user can manually trigger the scents to get familiar with them. Users can then assign scents to the given "scentifications": Scented Message (text messages), Scented Reminder (calendar events), Time Sense (a feeling for the passing of time), and Scented Event (the delivery of an important parcel). Participants wore the inScent device in a university setting, and semi-structured interviews were conducted to gather qualitative feedback on the experience.
https://docs.google.com/document/d/1Y09K4Mxb8dzg2z5rUgDgMqXmwoiG7LcHETf6GJEFAc4/edit?usp=sharing
The project still leaves many open questions and challenges.
In addition, from the demo day I learned how important a storyboard or storytelling method is for conveying a concept. Even in a demo presentation, a good story that lets the audience imagine the scenario keeps them more engaged. Describing the scenario up front, so that people enter that context, is especially critical.
Moreover, I kept thinking about the question Daragh asked in class: what is the experience I imagine users having with this device? For the purpose of reminiscing alone, it still feels thin. What if it could be used for training, to strengthen people's ability to associate a certain odor with an object? A cook, for example, might need to remember the smells of many different cheeses. Or what if it could help Alzheimer's patients remember things they otherwise cannot? I don't think my current scenario is convincing enough to raise many questions for discourse, which is something I want to explore in my next project.
[1] Psychology and Smell. Fifth Sense. http://www.fifthsense.org.uk/psychology-and-smell/
[2] Jordan Gaines Lewis. Smells Ring Bells: How Smell Triggers Memories and Emotions.
[3] Brewster, S., McGookin, D., and Miller, C. Olfoto: Designing a Smell-Based Interaction. Proc. CHI 2006, 653–662.
[4] Obrist, M., Tuch, A., and Hornbæk, K. Opportunities for Odor: Experiences with Smell and Implications for Technology. Proc. CHI 2014, ACM, 2843–2852.
[5] Dobbelstein, D., Herrdum, S., and Rukzio, E. inScent: A Wearable Olfactory Display as an Amplification for Mobile Notifications. Proc. ISWC 2017, September 11–15, 2017, Maui, Hawaii, USA. https://dl-acm-org.proxy.library.cmu.edu/citation.cfm?id=3123035