A new paper from researchers working in the UK and Germany dives into how much power the human brain consumes when performing various tasks, and sheds light on how humans might one day build similarly capable computer-based artificial intelligences. Mapping biological systems isn't as sexy as the giant discoveries that propel new products or capabilities, but that's because the final discovery, not the decades of painstaking work that lay the groundwork, tends to receive all the media attention.
This paper, Power Consumption During Neuronal Computation, will run in an upcoming issue of IEEE's magazine, "Engineering Intelligent Electronic Systems Based on Computational Neuroscience." Here at ET, we've discussed the brain's computational efficiency on more than one occasion. Put succinctly, the brain is orders of magnitude more power-efficient than our best supercomputers, and understanding its structure and function is absolutely vital.
Is the brain digital or analog? Both
When we think about compute clusters in the modern era, we think about vast arrays of homogeneous or nearly homogeneous systems. Sure, a supercomputer might combine two different types of processors, Intel Xeon + Nvidia Tesla, for example, or Intel Xeon + Xeon Phi, but as different as CPUs and GPUs are, they're both still digital processors. The brain, it turns out, uses both digital and analog signaling, and the two methods serve different roles. One potential reason why: the power efficiency of the two methods varies dramatically depending on how much bandwidth you need and how far the signal needs to travel.
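To build intuition for that tradeoff, here is a deliberately simplified toy model, not taken from the paper. It assumes that an analog (graded-potential) signal decays exponentially along a passive cable, so holding the signal-to-noise ratio constant means energy per bit grows exponentially with distance, while a digital (spiking) signal pays a roughly fixed cost per spike plus a small regeneration cost that grows only linearly with distance. All constants are made-up illustrative values.

```python
import math

def analog_energy_per_bit(distance_mm):
    # Passive cable: amplitude decays as exp(-d / lam), so the sender must
    # boost power exponentially to keep SNR constant (assumed model).
    lam = 1.0   # assumed cable length constant, mm
    e0 = 1.0    # assumed baseline energy per bit, arbitrary units
    return e0 * math.exp(distance_mm / lam)

def digital_energy_per_bit(distance_mm, bits_per_spike=2.0):
    # Spiking: fixed cost per spike, plus a small cost to actively
    # regenerate the spike along the axon (assumed model).
    e_spike = 5.0   # assumed energy per spike, arbitrary units
    per_mm = 0.5    # assumed regeneration cost per mm, arbitrary units
    return (e_spike + per_mm * distance_mm) / bits_per_spike

# Compare the two schemes over a range of distances.
for d in (0.1, 1.0, 5.0, 10.0):
    a = analog_energy_per_bit(d)
    g = digital_energy_per_bit(d)
    winner = "analog" if a < g else "digital"
    print(f"{d:5.1f} mm: analog={a:9.1f}  digital={g:5.1f}  -> {winner}")
```

Under these assumptions, analog wins for very short hops (the kind of local, dendritic signaling the brain does with graded potentials) while digital spiking wins once the signal has to travel any appreciable distance, which matches the qualitative picture the article describes.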