October 7, 2025

Energy-efficient thinking: What AI can learn from the brain

FIAS researchers suggest new model for energy-efficient neural information processing.

Biological brains are extremely energy-efficient. Can artificial intelligence learn a few tricks from them? Researchers at FIAS and colleagues from France describe new findings on energy-efficient information processing in the journal Nature Communications.

The human brain is a master at saving energy: it needs only about 20 watts, about as much as a modern freezer, to perform trillions of calculations per second. Can such findings contribute to a new generation of more energy-efficient artificial intelligence (AI)? Two mechanisms appear to play a decisive role in the brain's energy efficiency. First, the brain's nerve cells communicate via brief, small electrical voltage impulses called action potentials, whereas the artificial neural networks commonly used in AI operate with continuous activity levels, which consumes much more energy. Second, the brain appears to pass on from one processing stage to the next only information that could not already be predicted. This is the core idea of the theory of predictive coding, according to which the brain constantly generates predictions about the future. Omitting information that was already predictable also saves energy.
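The predictive-coding idea can be illustrated with a toy sketch (an invented example, not the authors' model): a sender predicts each sample of a signal from the previous one and transmits only the prediction error. The receiver reconstructs the signal exactly, and for a predictable signal the transmitted errors are far smaller than the raw values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy signal: a slowly rising ramp with a little noise -- highly predictable.
signal = np.linspace(0.0, 1.0, 100) + 0.01 * rng.standard_normal(100)

# The sender predicts "same as the last sample" and transmits only the
# prediction error, i.e. the part of the signal that was not predictable.
prediction = np.concatenate(([0.0], signal[:-1]))
errors = signal - prediction

# The receiver recovers the signal exactly by accumulating the errors.
reconstructed = np.cumsum(errors)
assert np.allclose(reconstructed, signal)

# The transmitted errors are much smaller than the raw signal values --
# a crude proxy for the energy saved by omitting predictable information.
ratio = np.mean(np.abs(errors)) / np.mean(np.abs(signal))
```

The smaller the prediction errors, the less needs to be transmitted; only the unpredictable part of the signal costs anything.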

However, information processing with action potentials and predictive coding do not seem to fit together well. Encoding information with action potentials makes it difficult to distinguish predicted signals from actual ones. It has therefore remained unclear whether the brain actually uses predictive coding and, if so, how it could achieve this with action potentials. Antony N'dri and colleagues now propose a new theoretical approach that could solve this problem: inhibitory synapses, which reduce the activity of nerve cells, could learn to suppress action potentials that are particularly easy to predict. This saves energy precisely where the least new information is delivered. The working group led by FIAS Senior Fellow Jochen Triesch calls this new approach Predictive Coding Light – a “light” variant of predictive coding because only signals that are particularly easy to predict are suppressed. In contrast to conventional predictive coding, not only prediction errors are forwarded to higher processing stages, but also a compressed representation of the actual input.
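A cartoon of this idea can be sketched in a few lines (a toy illustration under invented assumptions, not the network from the paper): a single inhibitory weight learns, with a simple Hebbian-style rule, to cancel spikes that a context cue predicts, while unpredicted spikes still pass through.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 2000

# Hypothetical input: a rhythmic, fully predictable spike train plus rare,
# unpredictable "surprise" spikes. Not the stimuli used in the paper.
predictable = (np.arange(T) % 4 == 0).astype(float)
surprise = (rng.random(T) < 0.05).astype(float)
drive = predictable + surprise          # total excitatory drive

context = predictable                   # cue that signals a predicted spike
w = 0.0                                 # inhibitory weight, learned online
lr = 0.05
out = np.zeros(T)
for t in range(T):
    # Inhibition cancels the part of the drive that the context predicts.
    out[t] = 1.0 if drive[t] - w * context[t] > 0.5 else 0.0
    # Hebbian-style rule: inhibition grows when cue and spike coincide.
    w = min(1.0, w + lr * context[t] * drive[t])
```

After learning, the predictable spikes are suppressed while the surprise spikes are transmitted unchanged, so the output spike count (and, in a spiking system, the energy cost) is only a fraction of the input spike count.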

In their work, the researchers simulate a network model on a computer that implements these ideas and learns to process visual information. Remarkably, the model reproduces a variety of biological observations from the primary visual cortex, the first stage of visual information processing in the cerebral cortex. In particular, it explains several neurobiological findings that are considered hallmarks of predictive coding. In addition, the team tested the network on technical tasks such as gesture recognition and handwritten digit recognition. Their approach achieves significant energy savings without severely compromising recognition performance. “The importance of inhibitory synapses may have been underestimated so far,” explains Triesch. “They were considered to be rather unspecific ‘brakes’ that prevent brain activity from getting out of control, for example during an epileptic seizure. We suspect that inhibitory synapses play a central role in how the brain learns to encode and process sensory information in an energy-efficient manner.”

However, it may be some time before these findings make their way into our smartphones. The architecture of today's AI chips differs greatly from that of the brain. So far, only a small but quickly growing community of researchers is pushing ahead with the development of neuromorphic chips, i.e., chips modeled on the brain, to make AI more energy-efficient.

Publication: Antony W. N’dri, Thomas Barbier, Céline Teulière, Jochen Triesch: Predictive Coding Light, Nature Communications (online October 6, 2025), https://doi.org/10.1038/s41467-025-64234-z




Chips modeled on the brain could make AI more energy efficient. (Cyber-brain, Kohji Asakawa / Pixabay)