IBM researchers have developed a new approach to building high-performance neural network hardware, drawing inspiration from the structure of the human brain. Their findings have been published in the journal Nature.
Deep neural networks have driven significant advances in generative AI. However, the hardware architectures they run on impose limits that keep them from reaching full efficiency.
In conventional hardware, memory and processing units are separate, so network weights must constantly be shuttled between them; this communication overhead reduces both speed and energy efficiency.
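To see why this data movement matters, here is a minimal back-of-the-envelope sketch (not from the paper; the per-operation energy figures and layer size are assumed, order-of-magnitude illustrations) comparing the energy of the arithmetic itself with the energy of fetching weights from off-chip memory for a single fully connected layer:

```python
# Assumed illustrative figures, not measured values from the IBM study.
N_IN, N_OUT = 1024, 1024          # example layer dimensions (assumption)
BYTES_PER_WEIGHT = 1              # 8-bit weights (assumption)
ENERGY_MAC_PJ = 0.2               # rough energy per 8-bit multiply-accumulate, in pJ
ENERGY_DRAM_BYTE_PJ = 100.0       # rough energy per byte fetched from DRAM, in pJ

macs = N_IN * N_OUT               # one MAC per weight for a dense layer
weight_bytes = macs * BYTES_PER_WEIGHT

compute_energy = macs * ENERGY_MAC_PJ
movement_energy = weight_bytes * ENERGY_DRAM_BYTE_PJ

print(f"compute energy:   {compute_energy / 1e6:.2f} uJ")
print(f"data movement:    {movement_energy / 1e6:.2f} uJ")
print(f"movement/compute: {movement_energy / compute_energy:.0f}x")
```

Under these assumed numbers, moving the weights costs hundreds of times more energy than the arithmetic, which is the overhead the new design aims to avoid.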
Taking the human brain as its model, IBM has developed a chip with 64 analog computing cores in which memory and processing are tightly coupled, much as they are in synapses. This design delivers high performance while consuming little energy.
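A minimal sketch of the general idea behind analog in-memory computing (an assumed textbook model, not IBM's actual circuit): a layer's weights are stored as device conductances, inputs are applied as voltages, and the matrix-vector product appears as summed output currents, so the weights never have to move. The shapes and noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

weights = rng.normal(size=(64, 256))      # weights "programmed" as conductances (assumed shape)
activations = rng.uniform(size=256)       # input activations applied as voltages

# Analog devices are noisy and low-precision; model that crudely with
# additive conductance noise (the 2% scale is an assumption for illustration).
noisy_weights = weights + rng.normal(scale=0.02, size=weights.shape)

ideal = weights @ activations             # what an exact digital MAC array would compute
analog = noisy_weights @ activations      # what the noisy analog read-out returns

print("mean absolute deviation from ideal:", np.abs(ideal - analog).mean())
```

The trade-off this sketch hints at is exactly the one the accuracy results address: the analog computation is approximate, so the question is whether a network run this way still matches its software counterpart.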
The chip achieves 92.81% accuracy on the CIFAR-10 dataset, comparable to software-based neural networks.
According to one of the study’s authors, the approach opens the door to running AI on mobile and embedded devices, as well as to reducing the energy consumption of cloud providers.
Future plans involve adding digital blocks to the chip so that fully pipelined neural networks can be implemented, maximizing the utilization of the analog computing cores.
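The benefit of pipelining can be illustrated with a small sketch (assumed numbers, not the chip's actual schedule): if each layer is mapped to its own core and inputs are streamed through, a new input can enter the pipeline every stage time instead of waiting for the previous input to traverse the whole network.

```python
# Assumed illustrative figures: 8 layers, one per core, 1 us per stage.
num_layers = 8
stage_time_us = 1.0
num_inputs = 1000

sequential = num_inputs * num_layers * stage_time_us        # one input at a time
pipelined = (num_layers + num_inputs - 1) * stage_time_us   # stages overlapped across inputs

print(f"sequential: {sequential:.0f} us, pipelined: {pipelined:.0f} us")
print(f"speedup: {sequential / pipelined:.1f}x (approaches the layer count for long input streams)")
```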
Overall, the new IBM chip marks a significant milestone in energy-efficient, brain-inspired AI hardware, one that could broaden the range of applications for neural networks while lowering their energy cost.