Research Project Archive

  • With Vinod Menon [Stanford], David Huron [OSU], and Daniel Abrams [Stanford]:
    in this work we are exploring amygdala activation in response to simple stimuli such as tones of rising and falling intensity and rising and falling pitch.

  • In collaboration with the Potter Group at the Laboratory for Neuroengineering at Georgia Tech, the Brain Wave project sonified signals from cultured neurons recorded by a multielectrode array; the result was presented as a musical piece titled BrainWaves.

  • One of the most active areas of MIR research is content-based recommendation, analyzing the semantic content of songs to judge their relatedness. This information is used to make recommendations of the form: "if you like A, you'll also like B".
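
A minimal sketch of how such "if you like A, you'll also like B" recommendations can be produced: rank songs by the cosine similarity of their content-based feature vectors. The song names and feature values below are invented for illustration, not drawn from any real catalog or system described here.

```python
import math

# Hypothetical feature vectors summarizing each song's audio content
# (e.g. timbre, tempo, harmony descriptors); values are illustrative.
songs = {
    "A": [0.9, 0.1, 0.4],
    "B": [0.8, 0.2, 0.5],
    "C": [0.1, 0.9, 0.2],
}

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def recommend(liked, catalog):
    """Return the other songs ranked by similarity to the liked song."""
    return sorted(
        (title for title in catalog if title != liked),
        key=lambda t: cosine(catalog[liked], catalog[t]),
        reverse=True,
    )

print(recommend("A", songs))  # → ['B', 'C']: B's features are closest to A's
```

Real systems substitute learned audio embeddings or tag vectors for these toy features, but the ranking step works the same way.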

  • Drawn Together is an artwork in which participants will interact with artificial intelligence agents to create unforeseen and original drawings and musical responses.

  • I am interested in constructing a cognitively grounded theory of raag, the fundamental melodic concept of Indian music, that explains how raags evoke emotions.

  • In Flock, a full-evening performance work commissioned by the Adrienne Arsht Center for the Performing Arts in Miami, music notation, electronic sound, and video animation are all generated in real time based on the locations of musicians, dancers, and audience members as they move and interact with each other.

  • Flou (pronounced “flew”) is not exactly a game; you do fly a ship through space, but you cannot shoot anything, score points, or win or lose. The focus, rather, is on the soundtrack: as you navigate through a 3D world and zoom through objects in space, you add loops and apply effects to an ever-evolving musical mix. You can also design your own worlds to fly through and share them with other Flou users.

  • Haile is a robotic percussionist that can listen to live players, analyze their music in real-time, and use the product of this analysis to play back in an improvisational manner. 

  • “iltur” is a series of musical compositions featuring a novel method of interaction between acoustic and electronic instruments, using new musical controllers called Beatbugs.

  • The objectives are to develop computational models of improvisation and to use them to develop new technologies that support creativity in music and education.

  • Nular represents sound objects as balls, controlling granular synthesis parameters through user interaction. These virtual objects move and collide with each other, and their behavior determines how the sounds play.
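
The idea can be sketched as a mapping from ball physics to grain parameters. This is an illustrative assumption of how such a mapping might look (the `Ball` class, the collision rule, and the specific parameter mappings are invented here, not Nular's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class Ball:
    x: float
    y: float
    vx: float
    vy: float
    radius: float

def speed(b: Ball) -> float:
    return (b.vx ** 2 + b.vy ** 2) ** 0.5

def colliding(a: Ball, b: Ball) -> bool:
    """Two balls collide when their centers are closer than their summed radii."""
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5 < a.radius + b.radius

def grain_from_collision(a: Ball, b: Ball) -> dict:
    """Hypothetical mapping: faster impact -> louder grain,
    bigger balls -> longer grain duration (in ms)."""
    impact = speed(a) + speed(b)
    return {
        "amplitude": min(1.0, impact / 10.0),
        "duration_ms": 20 + 50 * (a.radius + b.radius),
    }

a = Ball(0.0, 0.0, 3.0, 0.0, 0.5)
b = Ball(0.8, 0.0, -2.0, 0.0, 0.5)
if colliding(a, b):
    print(grain_from_collision(a, b))  # → {'amplitude': 0.5, 'duration_ms': 70.0}
```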

  • Inspired by the tradition of open-form musical scores, these four piano etudes are a collection of short musical fragments with links to connect them. In performance, the pianist must use those links to jump from fragment to fragment, creating her own unique version of the composition.

  • The goal of this NSF CAREER project is to develop machine-learning (ML) models for predicting temporally structured events in music, models that take advantage of the complex temporal correlations in musical sequences, and to use these models to help explain human musical expectation.

  • Just as key recognition is essential for understanding Western tonal music (including pop, rock, and jazz), because it is fundamental to understanding the melodic and harmonic content of a piece, raag understanding is essential for systems that interact with Indian music.
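
For readers unfamiliar with key recognition, a minimal sketch of one classic template-matching approach (in the spirit of the Krumhansl-Schmuckler key-finding algorithm): correlate a piece's pitch-class histogram against rotated major and minor key profiles. The profile values are the standard Krumhansl-Kessler probe-tone ratings; the input histogram below is made up.

```python
# Krumhansl-Kessler probe-tone profiles, indexed by pitch class relative to the tonic.
MAJOR = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def find_key(histogram):
    """Return the best-matching key for a 12-bin pitch-class histogram."""
    best = None
    for mode, profile in (("major", MAJOR), ("minor", MINOR)):
        for tonic in range(12):
            # Rotate so profile[0] lands on the candidate tonic's pitch class.
            rotated = profile[-tonic:] + profile[:-tonic]
            r = pearson(histogram, rotated)
            if best is None or r > best[0]:
                best = (r, f"{NAMES[tonic]} {mode}")
    return best[1]

# Durations of each pitch class in a hypothetical C-major passage:
hist = [4, 0, 2, 0, 3, 2, 0, 4, 0, 2, 0, 1]
print(find_key(hist))  # → C major
```

A raag-recognition system faces a harder version of this problem, since raags are distinguished by melodic movement and ornamentation, not pitch distribution alone.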

  • Tabla and Mrdangam are the main percussion instruments of North and South India, respectively. Both traditions are highly virtuosic and are organized around timbre: "melodies" are created from timbres rather than pitches and are organized in a highly structured way. The system is analogous to language in that small units are combined hierarchically to form larger expressions.
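
The language-like hierarchy can be illustrated with a toy sketch: individual timbral strokes (bols) combine into phrases, and phrases into a longer cycle. The bol syllables below are real tabla strokes and the example approximates the theka of Teental, but the two-level "grammar" here is a deliberate simplification for illustration.

```python
# Small units: individual tabla strokes (bols), each a distinct timbre.
STROKES = ["dha", "dhin", "tin", "ta", "na", "te", "ka"]

def phrase(*bols):
    """Combine strokes into a phrase (one hierarchical level up)."""
    return " ".join(bols)

def cycle(*phrases):
    """Combine phrases into a larger expression, like words into a sentence."""
    return " | ".join(phrases)

# Approximation of the 16-beat Teental theka as four 4-bol phrases.
theka = cycle(
    phrase("dha", "dhin", "dhin", "dha"),
    phrase("dha", "dhin", "dhin", "dha"),
    phrase("dha", "tin", "tin", "ta"),
    phrase("ta", "dhin", "dhin", "dha"),
)
print(theka)
```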

  • We have been examining whether certain emotional responses to raag music can be traced to pitch statistics such as frequency of usage and conditional frequency of usage (i.e., how often one note follows another). We have also begun examining the role that micro-pitch structures such as pitch glides and ornaments play in evoking emotion.
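
The two statistics mentioned above can be computed directly from a note sequence. A minimal sketch, using a made-up sequence of Indian solfège (sargam) syllables:

```python
from collections import Counter, defaultdict

# Hypothetical note sequence; the sargam names are illustrative.
notes = ["Sa", "Re", "Ga", "Re", "Sa", "Ga", "Ma", "Ga", "Re", "Sa"]

# Frequency of usage: how often each note occurs.
usage = Counter(notes)

# Conditional frequency of usage: counts of each bigram (current -> next note).
transitions = defaultdict(Counter)
for current, nxt in zip(notes, notes[1:]):
    transitions[current][nxt] += 1

def p_next(current, nxt):
    """Estimated probability that `nxt` follows `current`."""
    row = transitions[current]
    return row[nxt] / sum(row.values())

print(usage["Ga"])         # Ga occurs 3 times
print(p_next("Ga", "Re"))  # Re follows Ga in 2 of 3 transitions
```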

  • We are developing a musical swarm robot system in which small, cell-phone-based robots communicate with humans and with each other and coordinate their movement in order to explore real-time algorithmic musical composition and performance.

  • ZOOZbeat is a gesture-based musical studio, simple enough for non-musicians to immediately become musically expressive but rich enough for experienced musicians to push the envelope of mobile music creation.