Neuromorphic Electronic Systems [chapter]

2015 Encyclopedia of Computational Neuroscience  
Biological information-processing systems operate on completely different principles from those with which most engineers are familiar. For many problems, particularly those in which the input data are ill-conditioned and the computation can be specified in a relative manner, biological solutions are many orders of magnitude more effective than those we have been able to implement using digital methods. This advantage can be attributed principally to the use of elementary physical phenomena as computational primitives, and to the representation of information by the relative values of analog signals, rather than by the absolute values of digital signals. This approach requires adaptive techniques to mitigate the effects of component differences. This kind of adaptation leads naturally to systems that learn about their environment. Large-scale adaptive analog systems are more robust to component degradation and failure than are more conventional systems, and they use far less power. For this reason, adaptive analog technology can be expected to utilize the full potential of wafer-scale silicon fabrication.

TWO TECHNOLOGIES

Historically, the cost of computation has been directly related to the energy used in that computation. Today's electronic wristwatch does far more computation than the ENIAC did when it was built. It is not the computation itself that costs; it is the energy consumed, and the system overhead required to supply that energy and to get rid of the heat: the boxes, the connectors, the circuit boards, the power supply, the fans, all of the superstructure that makes the system work. As the technology has evolved, it has always moved in the direction of lower energy per unit computation. That trend took us from vacuum tubes to transistors, and from transistors to integrated circuits. It was the force behind the transition from n-MOS to CMOS technology that happened less than ten years ago. Today, it is still pushing us down to submicron sizes in semiconductor technology. So it pays to look at just how much capability the nervous system has in computation.

There is a myth that the nervous system is slow, is built out of slimy stuff, uses ions instead of electrons, and is therefore ineffective. When the Whirlwind computer was first built back at M.I.T., they made a movie about it, called "Faster than Thought." The Whirlwind did less computation than your wristwatch does.
We have evolved by a factor of about 10 million in the cost of computation since the Whirlwind. Yet we still cannot begin to do the simplest computations that can be done by the brains of insects, let alone handle the tasks routinely performed by the brains of humans. So we have finally come to the point where we can see what is difficult and what is easy. Multiplying numbers to balance a bank account is not that difficult. What is difficult is processing the poorly conditioned sensory information that comes in through the lens of an eye or through the eardrum.

A typical microprocessor does about 10 million operations/s, and uses about 1 W. In round numbers, it costs us about 10⁻⁷ J to do one operation, the way we do it today, on a single chip. If we go off the chip to the box level, a whole computer uses about 10⁻⁵ J/operation. A whole computer is thus about two orders of magnitude less efficient than is a single chip.

Back in the late 1960's we analyzed what would limit the electronic device technology as we know it; those calculations have held up quite well to the present [1]. The standard integrated-circuit fabrication processes available today allow us to build transistors that have minimum dimensions of about 1 μm. By ten years from now, we will have reduced these dimensions by another factor of 10, and we will be getting close to the fundamental physical limits: if we make the devices any smaller, they will stop working. It is conceivable that a whole new class of devices will be invented, devices that are not subject to the same limitations. But certainly the ones we have thought of up to now, including the superconducting ones, will not make our circuits more than about two orders of magnitude more dense than those we have today. The factor of 100 in density translates rather directly into a similar factor in computation efficiency.
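The energy-per-operation arithmetic above is simple enough to check directly. A minimal sketch in Python, taking the text's round figures (a 10-million-operations/s microprocessor at 1 W, and roughly a factor of 100 overhead at the box level) as given:

```python
# Round energy-per-operation figures from the text (circa 1990).
ops_per_second = 10e6      # typical microprocessor: ~10 million operations/s
power_watts = 1.0          # ~1 W dissipation

# Energy per operation is simply power divided by throughput.
joules_per_op_chip = power_watts / ops_per_second   # ~1e-7 J/operation
joules_per_op_box = joules_per_op_chip * 100        # whole computer: ~100x worse

print(f"chip level: {joules_per_op_chip:.0e} J/op")  # ~1e-07 J/op
print(f"box level:  {joules_per_op_box:.0e} J/op")   # ~1e-05 J/op
```

The two-orders-of-magnitude gap between chip and box falls out of the same superstructure costs (power supply, cooling, interconnect) discussed above.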
So the ultimate silicon technology that we can envision today will dissipate on the order of 10⁻⁹ J of energy for each operation at the single-chip level, and will consume a factor of 100-1000 more energy at the box level. We can compare these numbers to the energy requirements of computing in the brain. There are about 10¹⁵ synapses in the brain. A nerve pulse arrives at each synapse about ten times/s, on average. So in rough numbers, the brain accomplishes 10¹⁶ complex operations/s. The power dissipation of the brain is a few watts, so each operation costs only 10⁻¹⁶ J. The brain is a factor of 1 billion more efficient than our present digital technology, and a factor of 10 million more efficient than the ultimate silicon technology we can envision.
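The brain-side arithmetic can be sketched the same way. A small Python check, using the text's round numbers (on the order of 10¹⁵ synapses, ~10 pulses/s per synapse, a few watts of dissipation; 1 W is used here as a simplification to keep the round numbers round):

```python
# Round numbers for the brain, as quoted in the text.
synapses = 1e15              # ~10^15 synapses
pulses_per_second = 10       # ~10 nerve pulses/s arriving at each synapse
brain_power_watts = 1.0      # "a few watts"; 1 W as a round-number simplification

# Each arriving pulse counts as one complex operation.
brain_ops_per_second = synapses * pulses_per_second             # ~1e16 ops/s
joules_per_op_brain = brain_power_watts / brain_ops_per_second  # ~1e-16 J/op

# Efficiency ratios against the digital figures quoted in the text.
present_digital = 1e-7       # J/op, present single-chip technology
ultimate_digital = 1e-9      # J/op, the ultimate silicon envisioned above

print(joules_per_op_brain)                     # ~1e-16 J/op
print(present_digital / joules_per_op_brain)   # ~1e9: the factor of 1 billion
print(ultimate_digital / joules_per_op_brain)  # ~1e7: the factor of 10 million
```

The two ratios are exactly the "factor of 1 billion" and "factor of 10 million" claims: they follow mechanically once the per-operation energies are written down.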
doi:10.1007/978-1-4614-6675-8_100408