Computing could benefit from mimicking the way the human brain processes information, according to presentations at recent industry events.
First to weigh in on the topic was Jeff Kodosky, National Instruments cofounder and Business and Technology Fellow. As reported by EE-Evaluation Engineering Senior Technical Editor Tom Lecklider, “Working with Big Analog Data, according to Kodosky, can be compared to the way the human brain processes vision. Some processing is done directly in the retina even before the 100 million fibers in the optic nerve transport signals to the brain. There, data analysis proceeds in stages, specialized portions of the brain handling different aspects—even with a separate area for facial recognition. The parallel with digital computation is not hard to draw. Specialized algorithms running close to the edge of the system preprocess huge amounts of raw data to reduce the amount of work required at subsequent stages. And, LabVIEW’s dataflow model continues to be the best way to describe these processes.”[1]
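That staged-preprocessing idea maps readily onto conventional code. Below is a minimal Python sketch (not LabVIEW) of the same pattern: an edge stage reduces a raw sample stream to compact features before a later stage analyzes them. The block size, feature choice, and threshold are illustrative assumptions.

```python
import statistics

def edge_preprocess(raw_samples, block_size=1000):
    """Edge stage: summarize each block of raw samples as (mean, peak),
    drastically reducing the data volume passed downstream."""
    features = []
    for i in range(0, len(raw_samples), block_size):
        block = raw_samples[i:i + block_size]
        features.append((statistics.mean(block), max(block)))
    return features

def analyze(features, peak_limit=4.5):
    """Later stage: works only on the reduced features, flagging blocks
    whose peak exceeds an illustrative limit."""
    return [i for i, (_, peak) in enumerate(features) if peak > peak_limit]

raw = [0.1] * 10_000                     # stand-in for a high-rate sensor stream
print(analyze(edge_preprocess(raw)))     # prints [] for this toy data
```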
Lecklider continued, “An example showed more directly how NI technology had been used to solve a big data problem. As Kodosky explained, engineers on London’s Victoria subway line used CompactRIO in a predictive maintenance system. The line has 385 separate segments that facilitate dense but safe train scheduling. Nevertheless, the train sensing circuitry sometimes failed, and although it always failed safe by design, the line shut down until the fault could be found and corrected. With the new system, each section is monitored for fluctuations from its previously benchmarked normal behavior. If a significant deviation occurs, maintenance can be scheduled without disrupting service.”
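To make the monitoring scheme concrete, here is a minimal Python sketch of the deviation-from-baseline idea; the baseline statistics, threshold, and readings are hypothetical, not details of the Victoria line deployment.

```python
def needs_maintenance(readings, baseline_mean, baseline_std, k=3.0):
    """Flag a track section whose average reading drifts more than k
    standard deviations from its previously benchmarked normal behavior."""
    current_mean = sum(readings) / len(readings)
    return abs(current_mean - baseline_mean) > k * baseline_std

# Example: a section benchmarked at mean 12.0 with standard deviation 0.4
if needs_maintenance([13.9, 14.1, 13.8], baseline_mean=12.0, baseline_std=0.4):
    print("Schedule maintenance before this section fails safe and stops trains.")
```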
Brain-inspired computing was also the topic of a keynote address by Karim Arabi, vice president of engineering at Qualcomm, at the 2015 International Test Conference last month. The mission, he said, is to make everything around us smarter and connected.
At one time, Arabi said, people bought phones primarily based on their audio quality, but expectations today are much more diverse. We want our devices to always be on and always aware, he said, using sensors and other mechanisms to observe activities, discover patterns, and make decisions—they must anticipate, predict, and alert.
He turned his attention to computing in the cloud and at the edge. Big data and abundant computing power are pushing computation into the cloud, he said, but with the deployment of the Internet of Things, we can’t afford to send all of that data to the cloud. Consequently, the demand for processing horsepower at the edge is increasing. And for efficiency, we need the right machine for the right task.
He then cited some key specs: the human brain can store 3.5 petabytes and operates at 20 petaFLOPS on 20 W. In contrast, the IBM Sequoia supercomputer can store 1.6 petabytes, operates at 16.3 petaFLOPS, and consumes 7.9 MW. The brain, he said, is efficient because it is a massively parallel machine with 86 billion neurons—“it’s the most interconnected machine in the world.”
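Taken at face value, those figures make the efficiency gap easy to quantify: dividing each machine's quoted throughput by its quoted power puts the brain roughly five orders of magnitude ahead.

```python
# Back-of-the-envelope comparison using the figures quoted above.
brain_flops, brain_watts = 20e15, 20.0          # 20 petaFLOPS on 20 W
sequoia_flops, sequoia_watts = 16.3e15, 7.9e6   # 16.3 petaFLOPS on 7.9 MW

brain_eff = brain_flops / brain_watts            # ~1e15 FLOPS per watt
sequoia_eff = sequoia_flops / sequoia_watts      # ~2.1e9 FLOPS per watt

print(f"Brain is roughly {brain_eff / sequoia_eff:,.0f}x more energy-efficient")
# prints roughly 485,000x by these figures
```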
He added that the brain has no system clock—it’s event-driven. There is no hardware/software distinction, and the same components handle processing and memory. And in contrast to our designs, which start out simple and become more complex, the brain continuously simplifies itself. Further, while computer multicore scaling improvements top out around 10 cores, our brain operates with millions of “cores” in parallel.
The brain is, in fact, a heterogeneous machine, he said, consisting of structures such as the frontal lobe, cerebellum, motor cortex, sensory cortex, temporal lobe, and occipital lobe; further, the left brain deals with logic, math, and science while the right brain handles feelings, art, and philosophy. Large sets of neurons, he said, are specialized to operate in each area; but within each area, operations are massively parallel.
Consequently, he said, multicore and heterogeneous computing constitute the first step toward brain-inspired computing. The second step is deep learning. The third is approximate computing, which tolerates errors and relies on fault tolerance. The fourth and final step, he said, is neuromorphic computing, which “… excels at computing complex dynamics using a small set of computational primitives.”
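Approximate computing may be the least familiar of those steps. The generic Python sketch below illustrates one common form of it, loop perforation, in which part of the work is simply skipped and the resulting small error is tolerated; it is an illustration of the concept, not a description of any Qualcomm technique.

```python
import math

samples = [math.sin(0.01 * i) for i in range(100_000)]   # stand-in signal

exact_mean = sum(samples) / len(samples)                  # full computation
subset = samples[::10]                                    # skip 90% of the work
approx_mean = sum(subset) / len(subset)                   # approximate answer

print(f"exact {exact_mean:.6f}, approximate {approx_mean:.6f}")
print(f"error tolerated: {abs(exact_mean - approx_mean):.2e}")
```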
Can brain-inspired computing ever rival the performance of the human brain? Working toward an answer, Arabi discussed concepts such as Hebbian learning (neurons that fire together wire together; neurons that fire out of sync lose their link) and spike-timing-dependent plasticity.
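For readers unfamiliar with those rules, the short sketch below shows how a Hebbian update and a spike-timing-dependent update might adjust a connection weight. The learning rates, time constant, and weight clipping are assumptions chosen for illustration.

```python
import math

def hebbian_update(w, pre_fired, post_fired, lr=0.01):
    """'Fire together, wire together': strengthen on coincident firing,
    weaken when only one side fires."""
    if pre_fired and post_fired:
        w += lr
    elif pre_fired != post_fired:
        w -= lr
    return max(0.0, min(1.0, w))          # keep the weight bounded

def stdp_update(w, dt_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
    """Spike-timing-dependent plasticity: dt_ms = post spike time minus
    pre spike time. Pre-before-post potentiates; post-before-pre depresses,
    with the change decaying exponentially as the spikes move apart."""
    if dt_ms > 0:
        w += a_plus * math.exp(-dt_ms / tau_ms)
    else:
        w -= a_minus * math.exp(dt_ms / tau_ms)
    return max(0.0, min(1.0, w))

w = 0.5
w = hebbian_update(w, pre_fired=True, post_fired=True)   # coincident firing strengthens
w = stdp_update(w, dt_ms=5.0)                            # pre led post by 5 ms: strengthens
```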
But, he concluded, “The real challenge for neuromorphic computing is not technology, but training.” It takes people 30 years to earn a Ph.D., he said, adding that you can’t put a product on the market and wait 30 years for it to learn what it needs to know.
He also conceded that the human brain is not very good at brute-force computing; as a result, he said, “The best solution is a combination of traditional computing for brute-force calculations and brain-inspired computing for complex decisions and pattern generation.” Nevertheless, he concluded, “The computer will eventually outpace our intelligence and logic in all possible ways.”
Reference
1. Lecklider, T., “Big Analog Data operations mimic human brain processing,” EE-Evaluation Engineering Online, Aug. 6, 2015.