Process
When we chose tea leaf reading as the basis for our investigation, we knew there were several key aspects of the ritual we wanted to leave untouched. First, we felt that the act of peering down into the cup was crucial to the embodied experience of tea leaf reading. Second, we felt that letting users interpret their own destinies was just as essential. With these guiding principles in mind, we began our explorative process.
Our initial plan was to classify certain aspects of the user's actions during the tea ritual, including tea temperature, stirring speed, and how full the cup was. However, we soon felt that this level of quantification stripped away some of the ritual's mystique. Not only that, but that much classification would have been beyond our programming capabilities!
However, this line of thinking brought us down a different path. Dongtao came up with the idea of placing a screen beneath the teacup and generating a pattern from microphone input that the user could interpret, much like tea leaves in tasseography. This idea shaped the physical design of the product: a base that engulfed the cup, expanding its form and elevating it from the table. Though we later shifted away from the tea leaf pattern idea, this form persisted to the final prototype.
Speaking of the screen, we decided instead to approach the omens more literally, using classic tasseography symbols. Michael created these screens in a pixel art style inspired by the mystery game Return of the Obra Dinn. We felt that presenting the omen in this way would be more engaging, and would give users unfamiliar with the practice the ability to simply interpret, rather than attempt to identify symbols in what would essentially be random noise.
Alongside the screens, we brought the device to life using laser-cut black acrylic. To ensure a seamless experience, the enclosure was designed around the exact measurements of the glass cup we were using, with a specialized cutout for the specific model of screen we were assigned. We also added LED strips to the device to lend a more mystical, immersive experience.
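To give a sense of the kind of ambient lighting we were going for, here is a minimal sketch assuming a WS2812-style strip driven by the Adafruit NeoPixel library; the pin number, pixel count, and color are placeholder values rather than the exact settings in our prototype.

```cpp
// A minimal ambient-glow sketch; LED_PIN and LED_COUNT are placeholders.
#include <Adafruit_NeoPixel.h>

#define LED_PIN   6
#define LED_COUNT 24

Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
    strip.begin();
    strip.show();                 // start with all pixels off
}

void loop() {
    // Slow violet "breathing" pulse for a mystical, candle-lit feel.
    for (int b = 20; b <= 120; b++) {
        strip.fill(strip.Color(b, 0, b));
        strip.show();
        delay(15);
    }
    for (int b = 120; b >= 20; b--) {
        strip.fill(strip.Color(b, 0, b));
        strip.show();
        delay(15);
    }
}
```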
The device ultimately used the microphone and Edge Impulse to classify how quickly the user was stirring the cup with the spoon, and used that classification to inform the (still mostly random) omen.
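As a rough illustration of how that pipeline fits together, here is a minimal Arduino-style sketch assuming the Edge Impulse C++ inferencing library. The generated header name, the label names ("slow", "fast", "noise"), the number of omens, and the weighting rule are all placeholders for illustration, not the exact values in our prototype.

```cpp
// Sketch of the stir-to-omen pipeline using the Edge Impulse inferencing SDK.
#include <Arduino.h>
#include <string.h>
#include <tea_stir_inferencing.h>   // hypothetical name of the exported model library

// One window of microphone samples, filled elsewhere by the audio driver.
static float audio_buffer[EI_CLASSIFIER_RAW_SAMPLE_COUNT];

// Callback the classifier uses to read slices of the audio buffer.
static int get_audio_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, audio_buffer + offset, length * sizeof(float));
    return 0;
}

// Run the impulse on the latest window and return the most confident label.
static const char *classify_stir() {
    signal_t signal;
    signal.total_length = EI_CLASSIFIER_RAW_SAMPLE_COUNT;
    signal.get_data = &get_audio_data;

    ei_impulse_result_t result;
    if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
        return "noise";
    }

    size_t best = 0;
    for (size_t i = 1; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
        if (result.classification[i].value > result.classification[best].value) {
            best = i;
        }
    }
    return result.classification[best].label;
}

// Choose which omen screen to show: mostly random, nudged by stirring speed.
static int pick_omen(const char *stir_label) {
    const int NUM_OMENS = 8;                 // placeholder count
    int omen = random(NUM_OMENS);            // Arduino random(max)
    if (strcmp(stir_label, "fast") == 0) {
        omen = (omen + 1) % NUM_OMENS;       // bias fast stirring toward later omens
    }
    return omen;
}
```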