“Looks Like Music” was created in 2013 by Yuri Suzuki. The project involves a little robot car that detects and follows a black line. When the robot encounters colored lines in its path, it translates the color into various sounds. Audience members can use markers to add color to the piece in order to compose their own music. Here is a video of the project in action: https://www.youtube.com/watch?v=_jND3HV4oh4
I thought this project was particularly interesting because it incorporates both visual and audio components. The result is multidimensional: there's a physical piece of art on the canvas that viewers create with colored markers, in addition to the auditory piece of art composed by the car's translation system. I also liked how involved the audience is in creating both components of the work; audience members have complete control over how the work both looks and sounds. It also relates to our final project because it is very playful. Not only do the bright colors and markers make the process seem fun, but the robot car resembles a toy train and reminds me of playing as a child.
My only critique of the project is that it's not immediately intuitive which color corresponds to which sound, so a viewer doesn't actually have total control over the musical piece that is composed. But maybe that's part of the beauty! And, of course, a user would eventually figure out what each color sounds like and could then make deliberate choices to produce specific sounds.
I definitely want our final project to involve the audience. This project is unique in how directly audience decisions shape the finished work: Suzuki's robot car is really just a tool for viewers to create art. Our project may not give the audience quite that much independence, but I would definitely like audience participation to inform the final piece of art that is created.