One of our goals for this project is to create a space that shows empathy towards those who interact with it. We decided that one way of achieving this goal was to use text and sensor inputs to get a better sense of how the user was feeling. We thought it was important to collect both the information the user provided about their own feelings and physiological indicators that the user cannot control, since people do not always say how they feel. From these inputs, we created an algorithm to interpret the user’s mood, and we then wanted to play music in response to it. One question we had was whether to play music that matched the mood or music meant to improve it (e.g., playing calming music to soothe someone who is stressed). The answer came from the video “Brené Brown on Empathy,” which stresses that in order to show empathy, one must “connect with something in [oneself] that knows that feeling.” Empathy implies connecting through the feeling itself, so we decided to match the music to the mood.
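This section does not spell out how the two kinds of input are combined, so the following Python is only a minimal sketch of that pipeline, assuming a text-based self-report plus heart-rate and skin-conductance sensors; the mood labels, thresholds, and track files are hypothetical placeholders, not our actual implementation.

```python
# Minimal sketch of the mood-matching pipeline described above.
# The sensor names, thresholds, and track files are assumptions for
# illustration, not the project's actual implementation.

# Playlist keyed by mood; the music matches the feeling rather than
# trying to change it.
TRACKS = {
    "calm":     "ambient_piano.mp3",
    "happy":    "upbeat_acoustic.mp3",
    "stressed": "tense_strings.mp3",
    "sad":      "slow_cello.mp3",
}


def interpret_mood(self_reported_mood: str,
                   heart_rate_bpm: float,
                   skin_conductance_us: float) -> str:
    """Combine the user's self-report with involuntary physiological signals.

    The thresholds are illustrative. When the sensors give a confident
    signal, it overrides the self-report, since people do not always
    say how they feel.
    """
    if heart_rate_bpm > 100 and skin_conductance_us > 8.0:
        return "stressed"
    if heart_rate_bpm < 70 and skin_conductance_us < 4.0:
        return "calm"
    return self_reported_mood


def choose_track(mood: str) -> str:
    """Pick music that matches the interpreted mood."""
    return TRACKS.get(mood, TRACKS["calm"])


if __name__ == "__main__":
    # Example: the user says they are happy, but the sensors suggest stress.
    mood = interpret_mood("happy", heart_rate_bpm=110, skin_conductance_us=9.5)
    print(f"Interpreted mood: {mood}; playing {choose_track(mood)}")
```

Because we chose to match the music to the feeling rather than try to change it, the final step is a simple lookup from the interpreted mood to a track with the same emotional character.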