New research on catastrophic forgetting of AI neural networks


As humans, we are familiar with our own sleep patterns. On average, we need at least 7 hours of sleep every day. During sleep, the body undergoes various changes, such as changes in heart rate and breathing, in metabolism, and in hormone levels.

Keywords: catastrophic forgetting, artificial neural networks, rational memory, spiking neural networks 


In new research, scientists discuss mitigating the threat of catastrophic forgetting by mimicking the sleep patterns of the human brain in artificial neural networks. In previous studies, the team has reported on the importance of sleep in building rational memory (the ability to remember arbitrary or indirect associations between objects, people, or events) and in protecting old memories from being forgotten.

Maxim Bazhenov, Ph.D., professor of medicine and a sleep researcher at the University of California San Diego School of Medicine, said, "The brain is very busy when we sleep, repeating what we have learned during the day. Sleep helps reorganize memories and presents them in the most efficient way".

Artificial neural networks take neuroscience as their foundation, with unit nodes serving as their computational modules. In applications such as intelligent diagnosis, they can monitor targets accurately through three-dimensional monitoring and calculation, providing effective information for staff decisions.

For more information on artificial neural networks, read this article: Application Analysis of Artificial Intelligent Neural Network Based on Intelligent Diagnosis

Artificial neural networks have achieved superhuman performance in areas such as computational speed. A major drawback of these networks, however, is catastrophic forgetting: when artificial neural networks learn tasks sequentially, new information overwrites previously learned information.
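A toy sketch can make this concrete. The example below is entirely hypothetical and much simpler than the models in the study: a single linear "neuron" y = w * x is trained with gradient descent on task A, then sequentially on a conflicting task B, and the second task's updates overwrite the weight that task A relied on.

```python
# Hypothetical toy example of catastrophic forgetting (not the paper's model):
# one linear "neuron" y = w * x trained sequentially on two conflicting tasks.

def mse(w, data):
    # Mean squared error of y = w * x over (input, target) pairs.
    return sum((w * x - t) ** 2 for x, t in data) / len(data)

def train(w, data, lr=0.1, steps=200):
    # Plain gradient descent on the mean squared error.
    for _ in range(steps):
        grad = sum(2 * (w * x - t) * x for x, t in data) / len(data)
        w -= lr * grad
    return w

task_a = [(x, 2.0 * x) for x in (-2, -1, 1, 2)]    # task A: y = 2x
task_b = [(x, -2.0 * x) for x in (-2, -1, 1, 2)]   # task B: y = -2x

w = train(0.0, task_a)
loss_a_before = mse(w, task_a)   # near zero after learning task A
w = train(w, task_b)             # sequential training on task B...
loss_a_after = mse(w, task_a)    # ...overwrites task A (forgetting)

print(loss_a_before, loss_a_after)
```

After the second training phase, the error on task A jumps from roughly zero to a large value, even though nothing about task A itself changed: the shared weight was simply repurposed for task B.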

Researcher Bazhenov said, "In contrast, the human brain learns continuously and incorporates new data into existing knowledge and it typically learns best when new training is interleaved with periods of sleep for memory consolidation".

In this research, the scientists used spiking neural networks, which artificially mimic natural neural systems. In these systems, information is not communicated continuously; rather, it is transmitted at certain points in time as discrete events, i.e., spikes. The scientists observed that when the spiking networks were trained on a new task with interleaved off-line periods that mimicked sleep, catastrophic forgetting was mitigated.
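As a rough illustration of what "spiking" means, the sketch below simulates a leaky integrate-and-fire neuron, a common textbook unit for spiking networks (a generic illustration, not the specific model used in the study). The membrane potential integrates its input over time, and the neuron communicates only at discrete time points, by emitting a spike whenever a threshold is crossed.

```python
# Generic leaky integrate-and-fire (LIF) neuron sketch (not the paper's model):
# input is integrated into a leaky membrane potential, and a discrete spike
# is emitted whenever the potential crosses a threshold, after which it resets.

def lif_run(inputs, leak=0.9, threshold=1.0):
    """Simulate one LIF neuron; return the time steps at which it spikes."""
    v = 0.0
    spikes = []
    for t, i_in in enumerate(inputs):
        v = leak * v + i_in      # leaky integration of the input current
        if v >= threshold:       # threshold crossing -> discrete spike event
            spikes.append(t)
            v = 0.0              # reset the membrane potential after the spike
    return spikes

# A constant drive produces a regular spike train: information is carried
# by the timing of discrete events, not by a continuous output value.
spike_times = lif_run([0.3] * 20)
print(spike_times)
```

With this constant input the neuron fires every fourth time step, so the output is a sparse sequence of event times rather than a continuous signal.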

In the human brain, memories are represented by patterns of synaptic weights. According to the researchers, as in the human brain, "sleep" allowed the networks to replay old memories without depending on or explicitly using old training data.

Researcher Bazhenov said, "When we learn new information, neurons fire in specific order and this increases synapses between them. During sleep, the spiking patterns learned during our awake state are repeated spontaneously. It's called reactivation or replay. Synaptic plasticity, the capacity to be altered or molded, is still in place during sleep and it can further enhance synaptic weight patterns that represent the memory, helping to prevent forgetting or to enable transfer of knowledge from old to new tasks".
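The effect of replay can be sketched with another toy model. Note the sketch is hypothetical and differs from the study in an important way: in the paper, replay arises spontaneously from the network's own dynamics during sleep, without access to old training data, whereas this sketch interleaves stored task-A examples explicitly. Sequential training overwrites the weight task A needs; rehearsing task A alongside task B instead steers the weights toward a joint representation that satisfies both tasks.

```python
# Hypothetical sketch of replay preventing forgetting (not the paper's
# mechanism): a two-weight linear model y = w0*x1 + w1*x2, where interleaving
# old task-A examples during task-B training preserves both tasks.

def mse(w, data):
    # Mean squared error of y = w0*x1 + w1*x2 over ((x1, x2), target) pairs.
    return sum((w[0] * x1 + w[1] * x2 - t) ** 2 for (x1, x2), t in data) / len(data)

def train(w, data, lr=0.1, steps=500):
    # Plain gradient descent on the mean squared error.
    w = list(w)
    for _ in range(steps):
        g0 = g1 = 0.0
        for (x1, x2), t in data:
            err = w[0] * x1 + w[1] * x2 - t
            g0 += 2 * err * x1 / len(data)
            g1 += 2 * err * x2 / len(data)
        w[0] -= lr * g0
        w[1] -= lr * g1
    return w

task_a = [((1.0, 1.0), 2.0)]   # task A constraint: w0 + w1 = 2
task_b = [((1.0, 0.0), 2.0)]   # task B constraint: w0 = 2

# Sequential training: task B repurposes w0 and task A is forgotten.
w_seq = train(train([0.0, 0.0], task_a), task_b)

# "Replay": task-A patterns are rehearsed alongside task B, pushing the
# weights to the joint solution (w0 = 2, w1 = 0) that satisfies both tasks.
w_rep = train(train([0.0, 0.0], task_a), task_b + task_a)

print(mse(w_seq, task_a), mse(w_rep, task_a), mse(w_rep, task_b))
```

This mirrors the paper's title at toy scale: replay nudges the network toward a joint synaptic weight representation instead of letting the newest task monopolize the shared weights.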

He added, "It meant that these networks could learn continuously, like humans or animals. Understanding how the human brain processes information during sleep can help to augment memory in human subjects. Augmenting sleep rhythms can lead to better memory. In other projects, we use computer models to develop optimal strategies to apply stimulation during sleep, such as auditory tones, that enhance sleep rhythms and improve learning. This may be particularly important when memory is non-optimal, such as when memory declines in aging or in some conditions like Alzheimer's disease".


Story Source:
Materials provided by University of California - San Diego. The original text of this story is licensed under a Creative Commons License. Note: Content may be edited for style and length.


Journal Reference:

  1. Ryan Golden, Jean Erik Delanois, Pavel Sanda, Maxim Bazhenov. Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation. PLOS Computational Biology, 2022; 18 (11): e1010628. DOI: 10.1371/journal.pcbi.1010628