
Continual Learning with Columnar Spiking Neural Networks

Continual learning is a key feature of biological neural systems, but artificial neural networks often suffer from catastrophic forgetting. Biologically plausible learning algorithms, rather than backpropagation, may enable stable continual learning. This study proposes columnar-organized spiking neural networks (SNNs) with local learning rules as a way to learn continually while mitigating catastrophic forgetting. Using CoLaNET (Columnar Layered Network), we show that its microcolumns adapt most efficiently to new tasks when they share no structure with previously learned ones. We demonstrate how CoLaNET hyperparameters govern the trade-off between retaining old knowledge (stability) and acquiring new information (plasticity). We evaluate CoLaNET on two benchmarks: Permuted MNIST (ten sequential pixel-permuted tasks) and a two-task MNIST/EMNIST setup. Our model learns ten sequential tasks effectively, maintaining 92% accuracy on each. It shows low forgetting, with only 4% performance degradation on the first task after training on nine subsequent tasks.

This repository is associated with the article available at: https://arxiv.org/abs/2506.17169

(Optional) To run CoLaNET you need the ArNI-X framework (available on request at arni-x.pro).

Set up the Python environment

Dependencies: tensorflow, keras, pandas, matplotlib

python --version
# Python 3.12.10
python -m venv venv
venv\Scripts\activate
pip install -r requirements.txt
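
Exact version pins are not listed here; a requirements.txt consistent with the dependencies above could be as simple as:

tensorflow
keras
pandas
matplotlib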

Permuted MNIST tasks

baseline model: https://www.kaggle.com/code/dlarionov/continual-learning-on-permuted-mnist/

cd mnist
python .\PermutedMNIST.py
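
For reference, the sequence of pixel-permuted tasks can be built along the following lines (a minimal sketch; the actual task construction and the baseline training loop live in PermutedMNIST.py):

# Sketch: build ten Permuted MNIST tasks by applying one fixed random
# pixel permutation per task (assumed approach; see PermutedMNIST.py).
import numpy as np
from tensorflow import keras

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

rng = np.random.default_rng(0)
tasks = []
for task_id in range(10):
    perm = rng.permutation(784)  # one fixed permutation per task
    tasks.append(((x_train[:, perm], y_train),
                  (x_test[:, perm], y_test)))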

E/MNIST tasks

baseline model: https://www.kaggle.com/code/dlarionov/continual-learning-on-emnst

cd emnist
tar -xf .\emnist-balanced.zip
python .\EMNIST.py
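
The EMNIST archive is read by EMNIST.py; as an illustration, Kaggle-style CSV files can be loaded roughly as follows (the file names inside emnist-balanced.zip are an assumption here, and EMNIST CSV images are stored transposed relative to MNIST):

# Sketch: load the EMNIST "balanced" split from CSV files
# (file names assumed; adjust to whatever emnist-balanced.zip actually contains).
import pandas as pd

def load_emnist_csv(path):
    df = pd.read_csv(path, header=None)
    y = df.iloc[:, 0].to_numpy()
    x = df.iloc[:, 1:].to_numpy(dtype="float32") / 255.0
    # EMNIST CSV images are stored transposed relative to MNIST orientation.
    x = x.reshape(-1, 28, 28).transpose(0, 2, 1).reshape(-1, 784)
    return x, y

x_train, y_train = load_emnist_csv("emnist-balanced-train.csv")
x_test, y_test = load_emnist_csv("emnist-balanced-test.csv")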

Train CoLaNET

This runs CoLaNET on 70k images, presenting each image for 20 steps, which gives 1.4M total steps. After 60k images (1.2M steps) have been processed, plasticity is disabled (controlled by the -f1200000 parameter). The -e2 parameter selects the configuration file.

cd CoLaNET
.\ArNIGPU.exe . -e2 -f1200000
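
The step counts behind these flags follow directly from the presentation schedule:

# How the -f1200000 cutoff relates to the presentation schedule.
steps_per_image = 20
total_steps = 70_000 * steps_per_image          # 1,400,000 steps for all 70k images
plasticity_off_step = 60_000 * steps_per_image  # 1,200,000 -> the -f1200000 value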

Permuted MNIST pipeline

You can run the whole pipeline with run.cmd, or execute the steps manually:

cd test1
tar -xf .\Experiments.zip  
python .\eval.py > test1.log
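
For context, the metrics discussed in the paper (final per-task accuracy and forgetting relative to the best accuracy a task ever reached) can be computed from an accuracy matrix as sketched below; this is a generic illustration, not the actual output format of eval.py:

# Sketch: metrics from an accuracy matrix acc[i][j] = accuracy on task j
# after training on tasks 0..i (hypothetical input; eval.py may differ).
import numpy as np

def continual_metrics(acc):
    acc = np.asarray(acc)        # shape: (num_tasks, num_tasks)
    final = acc[-1]              # accuracy on each task after the last task
    best = acc.max(axis=0)       # best accuracy ever reached per task
    forgetting = best[:-1] - final[:-1]
    return final.mean(), forgetting.mean()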

E/MNIST pipeline

You can run the whole pipeline with run.cmd, or execute the steps manually:

cd test2
tar -xf .\Experiments.zip  
python .\eval.py > test2.log

E/MNIST heatmap

This step generates monitoring files (enabled by -v1) and writes output every 100,000 computation steps (set by the -F100000 parameter). The monitoring filename is hardcoded in the Python script. The network state is saved 100 steps before plasticity is disabled (triggered by the -s479900 parameter).

cd test2
.\ArNIGPU.exe . -e10 -s479900 -f480000 -v1 -F100000
.\ArNIGPU.exe . -e11 -s479900 -f480000 -v1 -F100000
python .\specific.py
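
The actual parsing and plotting is done by specific.py against the hardcoded monitoring filename; purely as an illustration, a 2-D activity matrix can be rendered as a heatmap with matplotlib like this (the monitoring.csv name and layout are assumptions):

# Sketch: render a 2-D activity matrix as a heatmap
# (the real ArNI-X monitoring format is handled in specific.py).
import pandas as pd
import matplotlib.pyplot as plt

data = pd.read_csv("monitoring.csv", header=None).to_numpy()

plt.imshow(data, aspect="auto", cmap="viridis")
plt.colorbar(label="activity")
plt.xlabel("class")
plt.ylabel("microcolumn")
plt.savefig("heatmap.png", dpi=150)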

E/MNIST receptive fields
