New challenge for bionics: brain-inspired computing

Shan Yu
2016, Science Press, Zoological Research
By definition, bionics is the application of biological mechanisms found in nature to artificial systems in order to achieve specific functional goals. Successful examples range from Velcro, the touch fastener inspired by the hooks of burrs, to self-cleaning materials inspired by the surface of the lotus leaf. Recently, a new trend in bionics, Brain-Inspired Computing (BIC), has captured increasing attention. Instead of learning from burrs and leaves, BIC aims to understand the brain and then utilize its operating principles to achieve powerful and efficient information processing.

In the past few decades, we have witnessed dramatic progress in information technology. Moore's law, which states that transistor density in processors doubles every two years, has held true for the last 50 years. As a result, we now have miniature processors in small devices (e.g., phones) that, in terms of numerical calculation and memory storage, easily dwarf the brightest human mind. Given these achievements, which aspects of the brain can still enlighten us?

First, we need more energy-efficient processors. Nowadays, supercomputers and large data centers contain thousands of cores/processors, with energy consumption at the megawatt scale. This severely limits the use of computing power in embedded (e.g., small, smart devices) and long-distance (e.g., Mars rover) applications. In addition, with further extrapolation of Moore's law, the energy density of a microprocessor will become so high that it will start to melt. In fact, this is an important reason why the trend described by Moore's law is believed to be coming to an end, and probably soon. In contrast, the brain is extremely energy-efficient: with many capabilities that are still far beyond modern computers, an adult brain consumes only about 20 watts. Therefore, learning from the brain how to be "greener" is a major goal of BIC. With the knowledge obtained in neuroscience, we now know that the secret of the brain's energy efficiency involves various factors, including the co-localization of data processing and storage, highly distributed processing, and sparse activity. Neuromorphic computing aims to implement these features in microprocessors, with electronic elements mimicking the activities of individual neurons and millions of artificial neurons interacting with each other to process information (Merolla et al., 2014).
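To make the sparse, event-driven style of processing concrete, the following is a minimal sketch (not the design of any particular chip) of a leaky integrate-and-fire neuron, one of the simplest models implemented in neuromorphic hardware. The parameter values here are illustrative assumptions; the point is that state (the membrane potential) is local to the neuron and output spikes are rare, so downstream elements need to do work only when a spike arrives.

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire neuron.

    Returns the list of time steps at which the neuron spiked.
    Parameter values are illustrative, not taken from any real chip.
    """
    v = 0.0           # membrane potential, stored locally with the "processor"
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leaky integration: the potential decays toward rest while
        # accumulating the input current.
        v += dt * (-v / tau + i_in)
        if v >= v_thresh:      # threshold crossing -> emit a spike (an event)
            spikes.append(t)
            v = v_reset        # reset after spiking
    return spikes

# Constant drive over 200 time steps: the neuron fires only a handful of
# times, so its activity is sparse relative to the number of steps.
spike_times = simulate_lif([0.06] * 200)
print(spike_times)
```

Note that between spikes nothing needs to be communicated; this event-driven behavior, multiplied across millions of neurons, is one reason neuromorphic designs can be so frugal with energy.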
In the most recent advance in this direction, IBM reported satisfactory performance in complex pattern recognition tasks with a neuromorphic chip. Compared with conventional chips, the system reduced energy consumption by many orders of magnitude (Esser et al., 2016). It is reasonable to expect that the knowledge learned from the brain will eventually enable us to combine enormous computing power with extremely low energy demand in the not-so-distant future.