Physics Aids AI in Energy Savings: Neuromorphic Miracles

Contemporary artificial intelligence (AI) is known for its impressive performance, but it comes with a significant energy consumption issue. As AI tackles more complex tasks, its energy demands only increase.

Addressing this challenge, scientists from the Max Planck Institute for the Science of Light in Erlangen, Germany, have presented a new training method for AI that promises to be far more efficient. Their approach relies on physical processes rather than on the digital artificial neural networks used today. The findings have been published in the journal Physical Review X.

According to data from the statistics portal Statista, training the GPT-3 artificial intelligence model consumed approximately 1,000 megawatt-hours of energy, roughly the annual electricity consumption of 200 German households.
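As a rough sanity check of that comparison, the figures can be divided out directly. The household consumption implied here is an assumption for illustration, not a number from the article:

```python
# Back-of-envelope check of the comparison cited in the article.
gpt3_training_energy_mwh = 1_000   # figure attributed to Statista
households = 200                   # comparison used in the article

per_household_kwh = gpt3_training_energy_mwh * 1_000 / households
print(f"Implied consumption per household: {per_household_kwh:.0f} kWh/year")
# -> 5000 kWh/year, in the ballpark of the annual electricity use of a
#    multi-person German household, so the comparison is self-consistent.
```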

To address the energy consumption issue in AI, scientists are exploring a new computing paradigm called neuromorphic computing. Unlike artificial neural networks that run on conventional digital computers, neuromorphic computing mimics the parallel processing of the human brain, where numerous processes occur simultaneously rather than sequentially.

Florian Marquardt, one of the study’s authors, explained that the data transfer between the processor and memory in modern neural networks consumes a tremendous amount of energy.
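To see why data movement dominates, consider a rough estimate for a single dense layer. The per-operation energy values below are approximate, widely cited estimates for an older (~45 nm) process and serve only as assumptions for illustration; they are not figures from the study:

```python
# Rough illustration of the processor-memory bottleneck: for a dense layer
# whose weights must be fetched from DRAM, the energy spent on memory
# traffic dwarfs the energy spent on the arithmetic itself.
E_DRAM_READ_PJ = 640.0   # ~energy to read 32 bits from DRAM (picojoules)
E_FP_MULT_PJ   = 3.7     # ~energy of one 32-bit floating-point multiply

n_inputs, n_outputs = 1024, 1024        # one dense layer, for illustration
weights = n_inputs * n_outputs          # each weight read once from DRAM
mults = n_inputs * n_outputs            # multiplications (adds ignored)

memory_energy_uj = weights * E_DRAM_READ_PJ / 1e6
compute_energy_uj = mults * E_FP_MULT_PJ / 1e6

print(f"memory traffic : {memory_energy_uj:8.1f} microjoules")
print(f"arithmetic     : {compute_energy_uj:8.1f} microjoules")
print(f"ratio          : {memory_energy_uj / compute_energy_uj:.0f}x")
# Memory traffic costs roughly two orders of magnitude more energy than
# the arithmetic -- the overhead Marquardt refers to.
```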

In collaboration with Víctor López-Pastor, Marquardt developed an efficient training method for neuromorphic computers based on a self-learning physical machine. They expect to present the first machine of this kind within the next three years.

These advances in neuromorphic computing hold great promise for the future development of AI, offering an approach that could be both more effective and more energy-efficient than existing solutions.
