DALLAS -- Raytheon Systems Co. is mapping out an ambitious program to build an artificial nervous system (ANS) for the soldier of the future.
The ANS will let autonomous robots use low-power, lightweight, nanotechnology-sized analog computers that communicate using secure digital pulsecodes. The military-funded project seeks to create robots capable of data fusion, mission planning, real-time learning and innovative responses in novel battlefield situations.
The design will be topped off with the Cog artificial head from MIT, a four-eyed sensor platform with a simultaneous wide-angle and telephoto lens system. "We've taken nature's model, which dictates that a nervous system must interact with its environment to enable learning by self-organizing internal structures," said Andrew Penz, principal researcher on the project at Raytheon.
The ANS architecture is based on human physiological studies, in particular on the notion that learning results from a proactive relationship with the environment. In other words, the ANS must act on the world with its motor systems and then observe the results with its sensors. Learning is reinforced only if the robot has correctly predicted future sensory inputs from current motor outputs -- for instance, observing its own hand picking up a cup after it has directed its motor systems to do so.
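This predict-then-verify loop can be sketched in a few lines. The function names and the error-driven update rule below are illustrative assumptions for exposition, not Raytheon's actual implementation:

```python
def predicted_outcome(motor_command, weights):
    # Predict the next sensory input from the current motor output.
    return weights.get(motor_command, 0.0)

def learn_step(motor_command, actual_sensor, weights, rate=0.1):
    # Reinforcement is proportional to how well the prediction
    # matched the observed sensory result of the motor action.
    prediction = predicted_outcome(motor_command, weights)
    error = actual_sensor - prediction
    weights[motor_command] = prediction + rate * error
    return abs(error)

weights = {}
# A toy "world": the motor command 'grasp' reliably produces
# sensor reading 1.0 (e.g., the hand seen holding the cup).
for _ in range(100):
    learn_step("grasp", 1.0, weights)

print(round(weights["grasp"], 2))  # prediction converges toward 1.0
```

The point of the sketch is the direction of causality: the system acts first, then learns from whether its sensory prediction came true.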
"Even perception itself is an active behavior for instance, consciously 'seeing' involves critical eye movements and 'touching' is synonymous with probing finger movements," said Larry Cauller, chief scientist on the project and a professor at the University of Texas at Dallas.
The core of the ANS manages the interaction between the environment and the sensory-motor subsystems. Learning, however, occurs at a higher level, in the simulated cortex of the ANS. Planning and innovative responses to novel situations emerge from the ANS' analog correlational processing and integral pulsecode-based interconnection design. The architecture's multiple layers manage the different time scales needed for multiple simultaneous motor outputs.
"Our architecture mimics real nervous systems; in particular, it is not fully connected but uses a random asymmetrical scheme with axons communicating over pulsed delay lines," Cauller said.
The architecture involves a nearly equal number of feedforward and feedback interconnections. The environmental learning capabilities of the ANS depend heavily on adaptively controlling the forward and backward signal paths simultaneously. That view of neurobiology asserts that the information passed among neurons comes as a result of the temporal synchrony among signals.
"The potential power of temporal synchrony lies in the nonlinear sensitivity of neurons to simultaneous synaptic inputs," said Cauller. That novel view called neuro-interactivism claims that neurons are coincidence detectors and that their interactions follow the dynamical mathematics of chaos theory.
The cortex, where learning is stored, uses a three-layer structure, again with an equal number of feedback and feedforward signal paths. Each layer has excitatory nodes and inhibitory nodes. The intralevel connections are uniform among excitatory nodes. The interlevel connectivity, however, has different topologies depending upon whether signals come from the top down or the bottom up. The bottom-up connections are local in the layer for both excitatory and inhibitory nodes, while the top-down connections are increasingly global.
"The new paradigm emphasizes the interaction of top-down cortical influences with bottom-up sensory feedback," said Cauller.
The top-down connections represent the system's analysis and predictions and thus embody the "conscious" perception of fused sensory objects.
According to Cauller, the convergence of top-down connections back to the site of bottom-up sensory inputs may hold the key to how the mind fuses separate visual, auditory and other sensory-input types into a perception of an "object."
"Such a convergence provides a possible spatial solution to the binding problem whereby the distinct sensory features of an object, such as its color, shape and smell, are bound together into a unified percept," said Cauller.
The ANS design is predicated on a futuristic nanotechnology device based on the resonant tunneling of electrons -- a diode that simulates the negative conductance of the ion channels that govern the behavior of neurons.
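What makes such a diode neuron-like is its negative-differential-resistance region: over part of its operating range, current *falls* as voltage rises. The piecewise curve below is a qualitative toy, with made-up breakpoints, not real device specifications:

```python
def rtd_current(v):
    # Toy I-V curve qualitatively resembling a resonant tunneling
    # diode. All voltages and slopes are illustrative only.
    if v < 0.3:
        return v / 0.3              # current rises toward the resonance peak
    if v < 0.5:
        return 1.0 - 3.0 * (v - 0.3)  # negative conductance: current falls
    return 0.4 + (v - 0.5)          # current rises again past the valley

# Between 0.3 V and 0.5 V, raising the voltage lowers the current.
print(rtd_current(0.3) > rtd_current(0.4) > rtd_current(0.49))  # True
```

That fall-then-rise shape is the analog nonlinearity the article says mimics the ion channels governing neuronal firing.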
"The group now at Raytheon showed me the specs for this nanotechnology device that didn't switch well but that did really neat analog operations. When they asked me what I would build with such a device, I said, 'a brain I'd build an artificial brain,' " recalled Cauller.
The ANS architecture merges memory and processing into a single processor capable of correlational operations on pulse-coded data from many different inputs simultaneously. Each processor learns to recognize repetitive time sequences and generate appropriate pulsecoded output signals. The analog correlation operations occur inside each processor; all external communications signals among processors use digital pulsecodes. "We are using each technology where it works best," said Raytheon's Penz.
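The internal correlational operation can be sketched as a simple match score between two binary pulsecode trains. The binary encoding and the normalization are assumptions for illustration, not the processors' actual arithmetic:

```python
def pulse_correlation(train_a, train_b):
    # Normalized correlation of two binary pulsecode trains:
    # count time slots where both trains carry a pulse, scaled
    # by the busier train so the result lies in [0, 1].
    matches = sum(a & b for a, b in zip(train_a, train_b))
    active = max(sum(train_a), sum(train_b))
    return matches / active if active else 0.0

a = [1, 0, 1, 1, 0, 0, 1, 0]
b = [1, 0, 1, 0, 0, 0, 1, 0]
print(pulse_correlation(a, b))  # 0.75
```

A processor scoring many input trains this way in parallel is doing both "memory" (the stored pattern) and "processing" (the comparison) in one step, which is the merger the paragraph describes.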
The nanotechnology processors will be so small, according to Cauller, that it will be possible to put 1,000 parallel processors inside a conventional chip package. But Raytheon plans to prove the architecture using simulations before committing to producing the nanotechnology processors. Initial implementations will use networks of conventional digital microprocessors to simulate the behavior of future, nanotechnology-sized processors.
"Our first big hardware problem will be cooling the 350 Pentium processors we intend to network together in the simulation," said Penz. The ANS models the human nervous system by using pulses modeled on the action potentials that connect neurons via active transmission lines (axons) and interconnections (synapses).
"Our first big hurdle, which we just passed, was to encode the actions of a neuron into a single circuit. We finally got the simulation down to just 50 kbytes," said Cauller.
The initial prototype will use a special graphical user interface to hide the implementation details from the design-level decisions. The simulation initially will run entirely in software on a workstation, will later be accelerated by the network of 350 Pentiums and, if all goes well, will eventually be executed by the nanotechnology devices.
The initial ANS will model 4,000 "neurons" in various configurations. The first task will be to profile the desirable network dynamics, such as where the "good" chaotic attractors lie, as a function of network size and configuration.
The initial application of the ANS will be learning English from scratch. Audio sensors will listen to spoken words while motor outputs move the "eyes" on the system's Cog head to read one line of text at a time. Autonomous interactions with the environment will train the system dynamics to learn speech automatically.
"We expect the ANS to begin babbling like a baby as it slowly learns to read in this manner," said Penz.
Raytheon's multidisciplinary research team, headed by Penz, includes experts in neuroscience, neural-network simulation, sensor and actuator hardware, and developmental psychology. Team members hail from Brown University, Georgia Tech, MIT and the University of Texas at Dallas.