A new paper from researchers working in the UK and Germany dives into how much power the human brain consumes when performing various tasks, and sheds light on how humans might one day build similar computer-based artificial intelligences. Mapping biological systems isn't as sexy as the giant discoveries that propel new products or capabilities, but that's because it's the final discovery, not the decades of painstaking work that lay the groundwork, that tends to receive all the media attention.
This paper, "Power Consumption During Neuronal Computation," will run in an upcoming issue of IEEE's magazine, "Engineering Intelligent Electronic Systems Based on Computational Neuroscience." Here at ET, we've discussed the brain's computational efficiency on more than one occasion. Put succinctly, the brain is orders of magnitude more power efficient than our best supercomputers, and understanding its structure and function is absolutely vital.
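To put "orders of magnitude" in perspective, here is a back-of-envelope comparison. The figures are rough, commonly cited estimates (a ~20 W brain, a ~20 MW exascale-class machine), not measurements from the paper, and estimates of the brain's effective operation rate vary widely:

```python
# Back-of-envelope compute-per-watt comparison. All numbers are rough,
# commonly cited estimates, purely for illustration.
BRAIN_POWER_W = 20.0      # the brain draws roughly 20 watts
BRAIN_OPS_PER_S = 1e17    # estimated synaptic events/s; estimates vary widely

SUPER_POWER_W = 20e6      # ~20 MW for an exascale-class supercomputer
SUPER_OPS_PER_S = 1e18    # ~1 exaflop/s

brain_eff = BRAIN_OPS_PER_S / BRAIN_POWER_W    # operations per joule
super_eff = SUPER_OPS_PER_S / SUPER_POWER_W

print(f"brain:         {brain_eff:.1e} ops/J")
print(f"supercomputer: {super_eff:.1e} ops/J")
print(f"ratio:         {brain_eff / super_eff:.0f}x")
```

Even with generous assumptions for the silicon side, the toy numbers above put the brain roughly five orders of magnitude ahead on operations per joule.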
Is the brain digital or analog? Both
When we think about compute clusters in the modern era, we think about vast arrays of homogeneous or nearly-homogeneous systems. Sure, a supercomputer might combine two different types of processors (Intel Xeon + Nvidia Tesla, for example, or Intel Xeon + Xeon Phi), but as different as CPUs and GPUs are, they're both still digital processors. The brain, it turns out, incorporates both digital and analog signaling, and the two methods are used in different ways. One potential reason why is that the power efficiency of the two methods varies dramatically depending on how much bandwidth you need and how far the signal needs to travel.
The efficiency of the two systems depends on the signal-to-noise ratio (SNR) you need to maintain within the system.
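One way to see why SNR is the deciding factor is a common rule of thumb from neuromorphic engineering: analog energy cost tends to grow roughly in proportion to the SNR itself, while digital cost grows with the number of bits, i.e. with the logarithm of the SNR. The constants below are arbitrary illustration values, not figures from the paper:

```python
import math

# Toy model: analog energy scales linearly with SNR; digital energy scales
# with bits of precision (log2 of SNR). Constants are hypothetical.
E_ANALOG_PER_SNR = 1.0    # energy units per unit of SNR (assumed)
E_DIGITAL_PER_BIT = 50.0  # energy units per bit of precision (assumed)

def analog_energy(snr):
    """Energy to carry a signal at the given SNR using analog signaling."""
    return E_ANALOG_PER_SNR * snr

def digital_energy(snr):
    """Energy to carry the same signal digitally at matching precision."""
    bits = math.log2(snr)  # bits needed to represent this SNR
    return E_DIGITAL_PER_BIT * bits

for snr in (2**4, 2**8, 2**12):
    a, d = analog_energy(snr), digital_energy(snr)
    winner = "analog" if a < d else "digital"
    print(f"SNR {snr:5d}: analog {a:7.0f}  digital {d:5.0f}  -> {winner}")
```

At low SNR the analog path is cheaper, but its cost explodes as precision requirements climb, while the digital cost rises only gently, which is consistent with a system mixing both schemes depending on the signaling job.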
One of the other differences between existing supercomputers and the brain is that neurons aren't all the same size and they don't all perform the same function. If you've done high school biology you may remember that neurons are broadly classified as motor neurons, sensory neurons, or interneurons. This type of grouping ignores the subtle differences between the various structures: the actual number of distinct neuron types in the brain is estimated at anywhere from several hundred to perhaps 10,000, depending on how you classify them.
Compare that to a modern supercomputer, which uses two or three (at the very most) CPU architectures to perform calculations, and you'll start to see the gap between our own efforts to reach exascale-level computing and simulate the brain, and the actual biological structure. If our models approximated the biological functions, you'd have clusters of ARM Cortex-M0 processors tied to banks of 15-core Xeons pushing data to Tesla GPUs, which were also tied to some Intel Quark processors, with another trunk shifting work to a group of IBM Power8 cores, all working in perfect harmony. Just as modern CPUs have vastly different energy efficiencies, die sizes, and power consumption levels, we see exactly the same trends in neurons.