Autotelematic Spider Bots

Autotelematic Spider Bots, 2006 (a collaboration with Matt Howard), is an artificial-life robotic installation consisting of ten spider-like sculptures that interact with the public in real time, self-modifying their behaviors based on their interactions with the viewer, one another, their environment, and their food source. They were experiments in allowing a robot to emerge into energy autonomy by enabling it to find its recharge stations.

The spider bots see human participants in the installation with long-distance ultrasonic eyes at the ends of springy, antenna-like necks; these ultrasonic eyes can see to a distance of 3-4 meters. The robots constantly seek interaction by swinging their necks back and forth. When they find people, those interactions trigger behavioral responses that manifest both immediately and over time as the series evolves. Just as real spiders have multiple eyes for both long and short distances, these spider bots have additional, shorter-distance infrared eyes, which allow them to see and avoid each other as they randomly forage for food. For the spider bots, food takes the form of a recharge station in the surrounding ring. The robots stay equidistant from each other and from the walls of the spider ring by emitting coded infrared pulses from small aluminum tubes at their midsections, creating a kind of infrared apron around each spider bot. When a spider's infrared eye sees either a human or a fellow robot, super-bright LEDs illuminate the interior of its plastic structure as an indication of this presence. This creates a constantly blinking environment of LEDs, which indicates the interior processing and thinking states of the robotic spiders and allows humans to understand that they are being seen.

Autotelematic Spider Bots, 2006. The Sunderland Museum and Winter Gardens. This work was commissioned by the AV Festival, England, at the invitation of curator Honor Harger.
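The sensing-to-display logic described above can be sketched in a few lines. This is a hypothetical illustration, not the artists' firmware; the function name, ranges, and LED states are assumptions, with only the 3-4 meter ultrasonic range taken from the text.

```python
ULTRASONIC_RANGE_M = 4.0   # long-distance ultrasonic eyes see roughly 3-4 meters
IR_RANGE_M = 0.5           # assumed short range for the infrared eyes

def led_state(ultrasonic_m: float, infrared_m: float) -> str:
    """Map sensor distances to an LED state signalling 'I see you'."""
    if infrared_m < IR_RANGE_M:
        return "blink_fast"   # a neighbor robot or visitor is very close
    if ultrasonic_m < ULTRASONIC_RANGE_M:
        return "blink_slow"   # a visitor is within ultrasonic view
    return "idle"             # nothing seen; keep foraging

print(led_state(2.5, 1.0))  # -> blink_slow
```

Run continuously, a rule like this produces the constantly blinking LED environment the installation uses to externalize each robot's internal state.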
The spiders communicate with each other through Bluetooth, which allows one central robot to coordinate their activity as they interact. The robots were designed to find their food source through random foraging, looking for a 1 Hz infrared beacon beneath a recharge rail; this feature has been deactivated for the installation in Switzerland while battery research advances. The successful creation of a series of robots that emerge into energy autonomy through intercommunication will have to wait for significant improvements in battery technology. As batteries evolve to allow quicker charging, I hope future versions will exhibit this emergent behavior. In the studio, we did succeed in having one robot find and attach itself to the recharge station with springy chelicerae (the pointed appendages that biological spiders use to grasp and pull food into their mouths) at the front of the unit.

The robots also talk to each other and to the interacting public with audible chirping sounds from small amplified speakers attached to their frames. The speakers amplify the twittering in relation to human interactions, so viewers can sense the "emotional" response of the spider bots through the tone of the messages being passed to them: higher tones are associated with fear and repulsion, lower tones with normal food foraging.

Autotelematic Spider Bots, 2006. The Sunderland Museum and Winter Gardens, UK. Ken Rinaldo and Matt Howard. Photo: John Marshall.

One of the robots carries a mini video camera and transmitter that project its vision onto a screen on the wall of the installation. The projection incorporates built-in, Voronoi-like patterns, giving viewer/participants a sense of being captured in the installation's web. The screen also shows the spiders at a larger scale than the viewers, subtly manipulating the power structure of the human/robot relationship.
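The tone rule described above (higher pitch for fear and repulsion, lower pitch for normal foraging) amounts to a small state-to-frequency mapping. A minimal sketch follows; the specific frequencies, state names, and the base pitch are illustrative assumptions, not values from the installation.

```python
BASE_HZ = 400.0  # assumed chirp pitch for calm food foraging

def chirp_frequency(state: str) -> float:
    """Return a chirp pitch for an internal 'emotional' state."""
    pitches = {
        "foraging": BASE_HZ,         # low tone: normal food seeking
        "interacting": BASE_HZ * 2,  # mid tone: engaged with a visitor
        "fear": BASE_HZ * 4,         # high tone: fear/repulsion
    }
    return pitches.get(state, BASE_HZ)  # unknown states fall back to the base tone
```

The important property is only the ordering: a frightened robot always chirps higher than a foraging one, which is what lets viewers read the bots' "emotions" by ear.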
These robots also define a completely new morphology of robot walking. The legs are based on a tension-compression structure and pull-string mechanics: each pair of legs acts as a kind of flexible arch, held in compression by springy plastics and monofilament fishing line. When a servomotor pulls one length of the monofilament, the arch bends and allows the leg to move quite naturally at any speed. Although biological spiders have eight legs, six legs allow the robots to walk forward in a tripod gait and turn in either direction; this gait mimics cockroaches and other six-legged insects.

The overall inspiration for the robots' behavior came from a lecture by Dr. Guy Theraulaz, whose research at the Centre National de la Recherche Scientifique in France shows that ants operate as rule-driven systems. With this in mind, it became apparent that computers and software, as rule-driven systems, could be structurally coupled with the robots' organic structure, allowing them to function within and emerge from their environment. The software for the robots is organized into a structure I call bio-sumption architecture, which allows individual behaviors to be subsumed for the fitness of the group: when the robots are hungry, food finding is their primary behavior, and they will ignore human interaction. This is a variation on the subsumption architecture of Rodney Brooks of MIT.

The robots were designed entirely in the 3D program Cinema 4D. This allowed motors and parts to be fitted to exact tolerances and enabled the rapid evolution and construction of this complex morphology. The final bots were output in rapid-prototyping plastics, allowing quick testing of the stiffness, flexibility, and translucency of the materials.
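The bio-sumption idea above can be sketched as a fixed-priority arbiter in which hunger subsumes social behavior, in the spirit of Brooks-style subsumption. This is a minimal illustration under stated assumptions: the behavior names, the battery threshold, and the priority of collision avoidance are mine, while the rule that a hungry robot ignores human interaction comes from the text.

```python
def select_behavior(battery_pct: float, human_seen: bool,
                    robot_near: bool) -> str:
    """Arbitrate behaviors in fixed priority order (highest first)."""
    if robot_near:
        return "avoid"       # assumed: collision avoidance always wins
    if battery_pct < 20.0:
        return "find_food"   # hunger subsumes social behavior (per the text)
    if human_seen:
        return "interact"    # swing the neck, chirp, light the LEDs
    return "forage"          # default: random wandering

print(select_behavior(15.0, True, False))  # -> find_food
```

Note that with the battery at 15%, the robot returns "find_food" even though a human is in view; that override is exactly what lets individual behaviors be subsumed for the fitness of the group.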
The colored portions were cast from the original rapid-prototype models in semi-clear polyurethane plastic impregnated with different Pantone colors to give each robot an individual quality. In essence, robots gave birth to other robots.

The hardware architecture distributes as much of the robot's intelligence as possible to integrated smart sensors and integrated motor controllers. For example, the servo motor controller functions like an autonomic nervous system, receiving and sending walking commands without tying up the main processors. This allows quick processing and rapid sensor activation while reducing overall processor overhead. The brains of each robot are two embedded microcontrollers, arranged in a left- and right-hemisphere approach to parallel processing with a four-wire corpus callosum between the two hemispheres.

The Autotelematic Spider Bots installation is an artificial-life chimera: a robotic spider, walking like a cockroach, seeing like a bat, and twittering with the voice of an electronic bird. The spider bots are set to evolve further, as I am working on this process with students and robotics researchers at the École Polytechnique Fédérale de Lausanne in Switzerland.

Additional video footage: AV Festival, YouTube.
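The division of labor described above, where the brain issues a high-level command and the servo controller expands it into per-leg motion on its own, can be sketched as follows. This is an illustrative model, not the actual firmware: the class, leg groupings, and gait phases are assumptions, though the alternating-tripod pattern matches the six-legged gait described earlier.

```python
class ServoController:
    """Autonomic-nervous-system role: expands gait commands into leg motion."""
    TRIPOD_A = (0, 3, 4)  # assumed leg indices moved together in one tripod
    TRIPOD_B = (1, 2, 5)  # the opposing tripod

    def walk(self, command: str) -> list:
        # The brain sends only "forward"; phase sequencing happens here,
        # so the main processors are never tied up with leg timing.
        if command == "forward":
            return [("lift", self.TRIPOD_A), ("swing", self.TRIPOD_A),
                    ("lift", self.TRIPOD_B), ("swing", self.TRIPOD_B)]
        return [("hold", (0, 1, 2, 3, 4, 5))]  # any other command: stand still

brain_command = "forward"             # decided by the two hemispheres
steps = ServoController().walk(brain_command)
print(len(steps))                     # 4 phases for one gait cycle
```

The design point is the interface: the "corpus callosum" only has to carry tiny high-level commands between hemispheres and down to the controller, which is what keeps processor overhead low.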
