The Autotelematic Spider Bots, by Ken Rinaldo and Matt Howard, is an artificial-life robotic installation of ten spider-like sculptures that interact with the public in real time, self-modifying their behaviors based on their interactions with viewers, each other, their environment, and their food source. The robots were also an experiment in energy autonomy: each was designed to be able to find its own recharge station.
The spider bots see human participants in the installation with long-distance ultrasonic eyes mounted at the ends of flexible, antenna-like necks, giving them a range of 3-4 meters. The robots continually seek interaction by swinging these necks back and forth. When they find people, the interactions trigger behavioral responses that manifest both immediately and over time as the series evolves.
As real spiders have multiple eyes for both long and short distances, the spider bots also have shorter-distance infrared eyes, which allow them to see and avoid each other as they randomly forage for food. For the spider bots, food takes the form of a recharge station in the surrounding ring. The robots stay equidistant from each other and from the walls of the spider ring by emitting coded infrared pulses from small aluminum tubes at their midsections, creating a kind of infrared apron around each spider bot.
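The spacing behavior described above can be sketched as a simple steering rule: if a neighbor's coded IR reflection breaches the "apron," turn away from the stronger side. This is a minimal illustration, not the robots' actual firmware; the sensor names and the threshold value are assumptions.

```python
# Hypothetical sketch of the "infrared apron" spacing rule.
# Readings are normalized 0.0 (nothing nearby) to 1.0 (touching);
# the 0.6 threshold is an illustrative assumption.

def avoidance_turn(ir_left: float, ir_right: float, threshold: float = 0.6) -> str:
    """Return a steering command based on coded-IR reflections from neighbors."""
    if ir_left < threshold and ir_right < threshold:
        return "forward"      # apron clear: keep foraging
    if ir_left > ir_right:
        return "turn_right"   # neighbor on the left: veer away
    return "turn_left"        # neighbor on the right (or dead ahead)
```

Because each robot runs the same rule, the group settles into a roughly equidistant spread without any robot knowing the others' positions.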
When a spider’s infrared eye sees either a human or a fellow robot, super-bright LEDs illuminate the interior of its plastic structure as an indication of its presence. Together the spider bots create a constantly blinking environment of LEDs that indicates the internal processing and thinking states of the robots and lets humans understand that they are aware. The Autotelematic Spider Bots premiered in 2006 at the Sunderland Museum and Winter Gardens, commissioned by the AV Festival, England, and curated by Honor Harger.
The spiders communicate with each other over Bluetooth, which allows one central robot to coordinate their activity as they interact. The robots were designed to find their food source through random foraging, looking for a 1 Hz infrared beacon beneath a recharge rail; the beacon was deactivated for the installation, pending faster battery recharging and further research.
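One way a robot could distinguish the 1 Hz recharge beacon from the coded IR pulses of other spiders is by timing the gaps between pulses. The sketch below assumes the robot records the arrival times of IR rising edges; the timing tolerance is an illustrative assumption, since the actual coding scheme is not documented here.

```python
# Hedged sketch: recognize a 1 Hz beacon by checking that successive
# IR rising edges arrive roughly one second apart.

def is_recharge_beacon(edge_times: list, tolerance: float = 0.1) -> bool:
    """True if the recorded edge times look like a steady 1 Hz pulse."""
    if len(edge_times) < 3:
        return False  # need at least two full periods to be confident
    periods = [b - a for a, b in zip(edge_times, edge_times[1:])]
    return all(abs(p - 1.0) <= tolerance for p in periods)
```

A foraging robot would call this on a rolling window of recent edges and home in only when the test holds, ignoring the faster coded pulses of its siblings.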
The successful creation of a series of robots that emerge into energy autonomy through inter-communication will have to wait for a significant improvement in battery technology. As cells evolve and allow quicker charging, I hope future versions will exhibit this emergent behavior. In the studio, we were successful in having one robot find and attach itself to the recharge station with springy chelicerae (pointed appendages that biological spiders use to grasp and pull food into their mouths) at the front of the unit.
The robots also talk to each other and to the interacting public with audible chirping sounds from small amplified speakers attached to their frames; the twittering intensifies with human interaction. Viewers could sense the “emotional” response of the spider bots through the tone of the messages: higher tones were associated with fear and repulsion, lower tones with ordinary food (charging) foraging.
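The tone-to-state mapping described above can be summarized as a small lookup table. The specific frequencies and state names here are illustrative assumptions; the text only establishes that fear and repulsion map to higher pitches and foraging to lower ones.

```python
# Hedged sketch of mapping internal state to chirp pitch, per the text:
# higher tones for fear/repulsion, lower tones for food foraging.
# Frequency values in Hz are illustrative, not from the actual robots.

TONE_MAP = {
    "fear": 2200,       # high-pitched alarm chirp
    "repulsion": 1800,
    "foraging": 600,    # low, calm twitter while seeking the recharge rail
}

def chirp_frequency(state: str) -> int:
    """Return the pitch for a state, with a neutral mid tone as fallback."""
    return TONE_MAP.get(state, 1000)
```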
One of the robots carries a mini video camera and transmitter that project its vision onto the wall of the installation. The signal was projected onto a screen overlaid with Voronoi-like patterns, giving viewer-participants a sense of being captured in the robot’s web. The screen also showed the spiders at a larger scale than the viewers, subtly manipulating the power structure of the human/robot relationship.
These robots also defined a new morphology of robot walking. The legs are based on a tension-compression structure and pull-string mechanics: each pair of legs acts as a flexible arch held in compression by flexible plastics and monofilament (fishing line). When a servo motor pulls one length of the monofilament, the arch bends and the leg moves quite naturally at any speed. Although biological spiders have eight legs, six legs allow the robots to walk forward in a tripod gait and turn in either direction; this gait mimics that of cockroaches and other six-legged insects.
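The alternating tripod gait works by splitting the six legs into two triangles that take turns swinging forward while the other supports the body. The sketch below illustrates the sequencing; the leg labels are conventions for illustration, not identifiers from the robots' actual control code.

```python
# Sketch of the alternating tripod gait for a six-legged walker.
# Front-left (L1), middle-right (R2), and rear-left (L3) form one tripod;
# the mirror set forms the other. Labels are illustrative assumptions.

TRIPOD_A = ("L1", "R2", "L3")
TRIPOD_B = ("R1", "L2", "R3")

def gait_cycle(half_steps: int):
    """Yield the set of legs in swing phase for each half-step.

    While one tripod swings, the other three legs stay planted, so the
    body is always statically stable on a triangle of support.
    """
    for i in range(half_steps):
        yield TRIPOD_A if i % 2 == 0 else TRIPOD_B
```

Turning falls out of the same scheme: shortening the stride on one side of each tripod steers the body left or right.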
The overall inspiration for the robots’ behavior came from a lecture by Dr. Guy Theraulaz, whose research at the Centre National de la Recherche Scientifique in France reports that ants operate on rule-driven systems. With this in mind, it became apparent that computers and software, as rule-driven systems, could be structurally coupled with the robots’ organic structure, allowing them to function within, and emerge from, their environment.
The software for the robots is organized into a structure I call bio-sumption architecture, which allows individual behaviors to be subsumed for the fitness of the group. When the robots are hungry (low battery charge), food finding becomes their primary conduct, and they ignore human interaction. The programming was a variation on the subsumption architecture of Rodney Brooks at MIT.
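The core of a subsumption-style controller is a fixed priority ordering in which a higher layer suppresses the ones beneath it. The sketch below shows the arbitration rule implied by the text: hunger subsumes interaction. The behavior names and the 20% battery threshold are illustrative assumptions, not values from the actual software.

```python
# Minimal sketch of Brooks-style subsumption arbitration as described:
# food finding subsumes human interaction when battery charge is low.
# Thresholds and behavior names are illustrative assumptions.

def select_behavior(battery_pct: float, human_seen: bool, robot_seen: bool) -> str:
    """Return the single active behavior; higher layers suppress lower ones."""
    if battery_pct < 20.0:
        return "seek_recharge_station"  # hunger subsumes everything else
    if robot_seen:
        return "avoid_robot"            # keep the infrared apron clear
    if human_seen:
        return "interact_with_human"
    return "forage_randomly"            # default lowest layer
```

Run on every control tick, this keeps exactly one behavior in charge at a time while lower layers remain ready to resume the moment the higher condition clears.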
The robots were designed in the 3D program Cinema 4D, which allowed motors and parts to be fitted with absolute accuracy; rapid prototyping meant rapid evolution and construction of this complex morphology. The final spider bots were output in rapid-prototyping plastics, allowing quick testing of the stiffness, flexibility, and translucency of the material. The main bodies, cast in colored acrylic from an early prototype, held the motors and electronics. The models were cast in semi-clear polyurethane plastic impregnated with different Pantone colors to give each robot an individual quality. In essence, robots gave birth to other robots.
The hardware architecture, integrating sensors with the physical structure, proceeded by distributing as much of the robot’s intelligence as possible to smart sensors and integrated motor controllers. For example, the servo motor controller can function like an autonomic nervous system, receiving and executing walking commands without tying up the main processors. Quick processing and rapid sensor activation reduced overall processor overhead.
The brains of the robots are two embedded microcontrollers, with a left- and right-hemisphere approach to parallel processing and a four-wire corpus callosum between the two hemispheres. The Autotelematic Spider Bots installation is an artificial-life chimera: a robotic spider that walks like a cockroach, sees like a bat, and twitters with the voice of an electronic bird.
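A four-wire link between two microcontrollers typically carries short framed messages so each hemisphere can share sensor readings with the other. The framing below is a hypothetical example of how such an exchange might look, not the robots' actual protocol; the header byte and layout are assumptions.

```python
# Illustrative sketch of message passing over the "corpus callosum" link:
# each hemisphere packs a sensor reading into a tiny 3-byte frame.
# The 0xAA header and [header, id, value] layout are assumptions.

def encode_frame(sensor_id: int, value: int) -> bytes:
    """Pack a sensor reading into a 3-byte frame: [0xAA, id, value]."""
    return bytes([0xAA, sensor_id & 0xFF, value & 0xFF])

def decode_frame(frame: bytes):
    """Return (sensor_id, value), or None if the frame is malformed."""
    if len(frame) != 3 or frame[0] != 0xAA:
        return None  # wrong length or missing header: drop it
    return frame[1], frame[2]
```

Keeping the frames tiny and fixed-length lets each hemisphere poll the link in its main loop without blocking its own sensor processing.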
The spider bots are set to evolve further, as I am working on this process with students and robotics researchers at the École Polytechnique Fédérale de Lausanne in Switzerland.
Stephen Wilson, Art and Science: How Scientific Research and Technological Innovation Are Becoming Key to 21st Century Aesthetics (London: Thames and Hudson, 2010), pp. 111, 129, 162.
Do Robots Dream of Spring? – The Art of Ken Rinaldo, retrospective at the Maison d’Ailleurs, Yverdon-les-Bains, September 19, 2010 – March 20, 2011. The retrospective catalog has a foreword by science-fiction writer Bruce Sterling and curator Patrick Gyger.