Shimon is an improvising robotic marimba player designed to create meaningful and inspiring musical interactions with humans, leading to novel musical experiences and outcomes. The robot combines computational modeling of music perception, interaction, and improvisation with the capacity to produce melodic acoustic responses in physical and visual ways. Real-time collaboration between human and computer-based players can capitalize on their unique strengths to produce new and compelling music. The project therefore aims to combine human creativity, emotion, and aesthetic judgment with the algorithmic capabilities of computers, allowing human and artificial players to cooperate and build on each other's ideas.

Unlike computer- and speaker-based interactive music systems, an embodied anthropomorphic robot can create familiar, acoustically rich, and visual interactions with humans. The generated sound is acoustically rich because of the complexities of real-life physical systems, whereas in computer-generated audio such acoustic nuances require intricate design and are ultimately limited by the fidelity and placement of speakers. Moreover, unlike speaker-based systems, the visible connection between sound and motion allows humans to anticipate, coordinate, and synchronize their gestures with the robot. To create intuitive as well as inspiring social collaboration with humans, Shimon analyzes music based on computational models of human perception and generates algorithmic responses that are unlikely to be played by humans. When collaborating with human players, Shimon can therefore facilitate a musical experience that is not possible by any other means, inspiring players to interact with it in novel expressive ways, which leads to novel musical outcomes.

Shimon has performed with human musicians in dozens of concerts and festivals, from DLD in Munich, Germany, and the US Science Festival in Washington, DC, to the Bumbershoot Festival in Seattle, WA, and Google I/O in San Francisco. It has also performed over video link with conference attendees at events such as SIGGRAPH Asia in Tokyo and the Supercomputing Conference in New Orleans.





Selected Reviews

“Just as electronic brains have excelled at chess and ‘Jeopardy!,’ Shimon is advancing the more subtle elements of musicianship: the cues that musicians give one another.”

Pierre Ruhe, “Concert Review”, Arts Critic ATL, March 17, 2011


“Shimon is a real, center stage, jazz-playing musician… what makes Shimon exciting is not only that it can provide new levels of entertainment and expression, but that it is reaching new heights of human-robot interaction and artificial intelligence.”

Thomas Marsh, “Georgia Tech’s Shimon Can Jam”, Robot Magazine, January 2011


“The good people at NPR All Things Considered asked readers, and me in particular, to comment if Shimon can jam like a jazz great, or if he has the musical ability of a soulless machine. The answer, of course, is both. This robot certainly knows its John Coltrane and Thelonious Monk well enough. We call Coltrane and Monk inimitable, but their idiosyncrasies do not transcend the laws of physics. A sufficiently precise computer program could study the salient parts of their various approaches and create models to replicate them; in fact, one just did.”

Patrick Jarenwattananon, NPR Jazz Blog, December 22, 2009


“The Robotic Musicianship Group at Georgia Tech Center for Music Technology just blew our minds with some videos depicting robots playing music with real people.”

Eliot Van Buskirk, “Robots Pass Musical Turing Test”, Wired, November 21, 2008


Publications



  • Cicconet, M., Bretan, M., and Weinberg, G. (2012), “Visual Cues-based Anticipation for Percussionist-Robot Interaction”, in Proceedings of the 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2012), Boston, MA.

  • Bretan, M., Cicconet, M., Nikolaidis, R., and Weinberg, G. (2012), “Developing and Composing for a Robotic Musician Using Different Modes of Interaction”, in Proceedings of the 2012 International Computer Music Conference (ICMC 2012), Ljubljana, Slovenia.


  • Weinberg, G. (2011), “Gesture-based Human-Robot Jazz Improvisation”, extended abstract in Proceedings of the International Conference on Machine Learning (ICML 2011), Seattle, USA.

  • Hoffman, G., and Weinberg, G. (2011), “Interactive Improvisation with a Robotic Marimba Player”, Autonomous Robots, Vol. 31, Springer.


  • Hoffman, G., and Weinberg, G. (2010), “Gesture-based Human-Robot Jazz Improvisation”, in Proceedings of the 2010 IEEE International Conference on Robotics and Automation (ICRA 2010), Anchorage, AK.

  • Hoffman, G., and Weinberg, G. (2010), “Shimon: An Interactive Improvisational Robotic Marimba Player”, in Extended Abstracts of the ACM Conference on Human Factors in Computing Systems (CHI 2010), Atlanta, GA.

  • Weinberg, G., Nikolaidis, R., and Mallikarjuna, T. (2010), “A Survey of Recent Interactive Compositions for Shimon – The Perceptual and Improvisational Robotic Marimba Player”, in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2010), Taipei, Taiwan.

  • Nikolaidis, R., and Weinberg, G. (2010), “Playing with the Masters: A Model for Interaction between Robots and Music Novices”, in Proceedings of the 19th International Symposium on Robot and Human Interactive Communication (RO-MAN 2010), Viareggio, Italy.

  • Hoffman, G., and Weinberg, G. (2010), “Synchronization in Human-Robot Musicianship”, in Proceedings of the 19th International Symposium on Robot and Human Interactive Communication (RO-MAN 2010), Viareggio, Italy.


  • Weinberg, G., Mallikarjuna, T., and Raman (2009), “Interactive Jamming with Shimon: A Social Robotic Musician”, in Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2009), San Diego, CA, pp. 233-234.

  • Weinberg, G., and Blosser, B. (2009), “A Leader-Follower Turn-taking Model Incorporating Beat Detection in Musical Human-Robot Interaction”, in Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2009), San Diego, CA.

  • Weinberg, G., Blosser, B., Mallikarjuna, T., and Raman (2009), “Human-Robot Interactive Music in the Context of a Live Jam Session”, in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2009), Pittsburgh, PA, pp. 70-73.


  • Weinberg, G., and Driscoll, S. (2007), “Introducing Pitch, Melody and Harmony into Robotic Musicianship”, in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2007), New York, NY, pp. 228-233.