We are developing a musical swarm robot system in which small, cell-phone-based robots communicate with humans and with each other, coordinating their movement to explore real-time algorithmic musical composition and performance.
Currently, each individual swarm robot uses an Android cellphone as its main control system. Using a development board that implements Google's Android Open Accessory protocol (via the Accessory Development Kit, or ADK), the cellphone can control physical components such as servos for robot movement and motors or solenoids for the robot's main instrument, which is currently a pitched handbell. Each cellphone robot communicates with a central server that reports the robot's current position, tracked by an overhead camera system. Additionally, the phone's onboard sensors, such as the compass, can be used to aid the robot's motion-to-music mappings. For interaction with humans, the robots can be controlled from the server, or humans can interact with them directly by picking them up. Face detection with OpenCV runs on the Android phone: the robot detects a human's presence and responds by playing a musical pattern based on the distance of the human's face from the camera. This and other modes of human and swarm robot interaction will be explored in order to create a platform that allows for new ways of thinking about decentralized musical systems.
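The face-driven response described above can be sketched as follows. The distance estimate uses the standard pinhole-camera relation (distance = focal length × real width / pixel width); the focal length, assumed face width, function names, and the specific note patterns are illustrative assumptions, not the system's actual implementation:

```python
# Minimal sketch: map the pixel width of a face detected by OpenCV
# to an approximate distance, then to a handbell pattern.
# All constants and patterns below are assumed values for illustration.

ASSUMED_FOCAL_PX = 600.0      # camera focal length in pixels (assumption)
ASSUMED_FACE_WIDTH_CM = 14.0  # typical human face width (approximation)

def estimate_face_distance_cm(face_width_px):
    """Pinhole-camera estimate: distance = f * real_width / pixel_width."""
    return ASSUMED_FOCAL_PX * ASSUMED_FACE_WIDTH_CM / face_width_px

def pattern_for_distance(distance_cm):
    """Pick a pattern (semitone offsets): closer faces get denser patterns."""
    if distance_cm < 40:        # face is close: rapid, dense pattern
        return [0, 4, 7, 4, 0, 4, 7, 12]
    elif distance_cm < 100:     # mid-range: moderate pattern
        return [0, 4, 7, 12]
    else:                       # far away: sparse pattern
        return [0, 7]

# Example: a 210-pixel-wide bounding box from the face detector
face_px = 210
dist = estimate_face_distance_cm(face_px)
notes = pattern_for_distance(dist)
```

In practice the pixel width would come from the bounding boxes returned by OpenCV's face detector on the phone, and the chosen pattern would drive the solenoid striking the handbell.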