Your heart beat, my music

Made by ialvarad

Created: September 29th, 2015


Overview: Wearables are gaining popularity and increasingly record all sorts of measurements about our physical state. I propose a design for a music recommendation engine based on the emotional/physical state of a user’s friends and family.

Brief for problem space: Behavioral psychology has settled a question philosophers have wondered about for years: does our body affect our mind? It turns out that it does, in ways both subtle and not so subtle. Frowning for even as little as a minute can lead to an increased heart rate; endorphins from exercise trigger positive feelings in the mind. Why aren’t our music recommendation systems designed to serve the body in addition to the mind? Google Music, Pandora, Spotify, Beats Music, and even Apple Music all provide “smart” playlist recommendations based on search queries. Very few let you search by a ‘mood’ or a ‘feeling’. Even fewer let you share your emotions and state of mind with friends and family, even though music is especially well suited to representing complex emotional states. Users could benefit from a system that recommends songs based on the physical state of their friends and family.

Approach: I was inspired by pplkpr, an app created by Kyle McDonald and Lauren McCarthy that tracks, analyzes, and auto-manages users’ relationships. Using data pulled from a smart wristband, pplkpr monitors users’ physical and emotional responses to the people around them.

Process: I created a very low-fidelity prototype of the front end of this system using HTML and a few API calls. I pre-populated a list of emotions and styles of music, then used a music recommendation API called The Echo Nest to get an artist suggestion based on a mood, and looked up that artist’s most recent album using Spotify’s API. I explicitly leave the user with no option to change the recommended artist/album combination: the service is meant to convey how others around them are feeling and to encourage empathy, which would not be possible if the user could simply click through to another ‘mood’ or ‘state’. I ran out of time to draw up a design for a wristband or another smart device to capture health data and feed it to the recommendation engine. In reality, this wouldn't be hard to do with something like Apple's Health app and its API. I imagine existing research could point to ways of translating raw physical health data into broad emotional states like ‘stressed’, ‘agitated’, or ‘calm’.
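
To make the flow concrete, here is a minimal sketch in TypeScript of the mood-to-artist-to-latest-album lookup described above. It is illustrative rather than the exact code behind the prototype: the Echo Nest endpoint, parameters, and response shape are reconstructed from memory (the API has since been retired and folded into Spotify), the Spotify calls assume a valid OAuth token, and the key/token values are placeholders.

// Minimal sketch of the mood -> artist -> latest-album flow.
// The Echo Nest endpoint and response shape are reconstructed from memory
// (the API has since been retired); the Spotify calls assume a valid OAuth
// token. All keys/tokens are placeholders.

const ECHONEST_KEY = "YOUR_ECHONEST_API_KEY";     // placeholder
const SPOTIFY_TOKEN = "YOUR_SPOTIFY_OAUTH_TOKEN"; // placeholder

// Ask The Echo Nest for one artist matching a mood keyword (e.g. "calm").
async function suggestArtistForMood(mood: string): Promise<string> {
  const url =
    "http://developer.echonest.com/api/v4/artist/search" +
    `?api_key=${ECHONEST_KEY}&format=json&results=1&mood=${encodeURIComponent(mood)}`;
  const res = await fetch(url);
  const data = await res.json();
  // Assumed response shape: { response: { artists: [{ name: string }] } }
  return data.response.artists[0].name;
}

// Look the artist up on Spotify and return their most recently released album.
async function latestAlbumFor(artistName: string): Promise<{ name: string; releaseDate: string }> {
  const headers = { Authorization: `Bearer ${SPOTIFY_TOKEN}` };

  // Find the artist's Spotify ID via the search endpoint.
  const searchRes = await fetch(
    `https://api.spotify.com/v1/search?q=${encodeURIComponent(artistName)}&type=artist&limit=1`,
    { headers }
  );
  const artistId = (await searchRes.json()).artists.items[0].id;

  // Fetch the artist's albums and sort by release date, newest first.
  const albumsRes = await fetch(
    `https://api.spotify.com/v1/artists/${artistId}/albums?include_groups=album&limit=50`,
    { headers }
  );
  const albums: { name: string; release_date: string }[] = (await albumsRes.json()).items;
  albums.sort((a, b) => b.release_date.localeCompare(a.release_date));

  return { name: albums[0].name, releaseDate: albums[0].release_date };
}

// Example: recommend an album for a friend who is feeling "calm".
suggestArtistForMood("calm")
  .then(latestAlbumFor)
  .then((album) => console.log(`Recommended album: ${album.name} (${album.releaseDate})`));

Sorting the album list by release_date on the client is a simple way to pick the most recent album without any additional API calls.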

Outcome: http://irealva.github.io/em2-assignment2/

Next Step Proposal: Build a more complete mockup including more detail about physical health data capture and conversion. 
