I hacked a Kinect to help people express themselves through music without having to learn an instrument and landed on the cover of the Times.
On July 21st, 2011, my graduate thesis project, MOTIV, appeared in a full-page article on the front page of The New York Times Business section titled "With a Wave of the Hand, Improvising on Kinect."
MOTIV used computer vision to connect a musician’s movements to expressive musical parameters like note velocity and tempo during performance, unlocking a surprisingly intuitive control over the emotional qualities of a song without needing to learn an instrument.
MOTIV expression test, 2013
As I entered my second year of grad school in the School of Visual Arts MFA in Interaction Design, I was studying the neurological connections between music and emotion. Why does music make us feel anything at all? It turns out our emotions are hard-wired to our bodies. Expression in music is directly connected to our intuitive understanding of the human body behind the sound.
Musicians love digital tools because they inspire new ways of creating, but they struggle with them too, having to tweak, abuse, and outright hack them into producing more emotionally resonant outputs. MOTIV aimed to recouple digital music-making tools to movement to unlock musical expression for all.
I was lucky to be able to build upon a rich body of open source development after the release of Microsoft’s Kinect. MOTIV owes a large debt to the OpenFrameworks community.
Early MOTIV skeleton tracking using the OpenNI library.
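The article doesn't spell out MOTIV's exact mapping, but the core idea — tracked joint motion driving an expressive parameter like note velocity — can be sketched in a few lines. This is a hypothetical illustration, not MOTIV's actual code; the function names, the `max_speed` threshold, and the linear scaling are all assumptions:

```python
# Hypothetical sketch: map the speed of a tracked hand joint
# (e.g. from an OpenNI skeleton, as (x, y, z) per frame) to a
# MIDI note velocity. Faster movement -> louder note.

def hand_speed(prev, curr, dt):
    """Euclidean speed of a joint between two frames, in units/sec."""
    dist = sum((c - p) ** 2 for p, c in zip(prev, curr)) ** 0.5
    return dist / dt

def speed_to_velocity(speed, max_speed=2.0):
    """Linearly scale speed into the MIDI velocity range 0-127, clamped."""
    scaled = int(127 * min(speed / max_speed, 1.0))
    return max(0, min(127, scaled))
```

With a mapping like this, a still hand produces a velocity of 0 and anything at or above `max_speed` pins at 127, so the performer's physical energy translates directly into loudness — one plausible way movement can "unlock" expression without an instrument.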
While computer vision and digital music-making tools have come a long way since then, the underlying success of MOTIV was creating an experience anyone could use to express themselves through music. That's just as noble a pursuit today as it was then, no matter the underlying tech.