This repository features a minimal implementation of the (Branch) Expressive Leaky Memory (ELM) neuron in PyTorch. Notebooks to train and evaluate on NeuronIO are provided, as well as pre-trained models of various sizes.
- Create the conda environment with `conda env create -f elm_env.yml`
- Once installed, activate the environment with `conda activate elm_env`
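To sanity-check the installation, you can run a quick smoke test inside the activated environment (this snippet is not part of the repository, just a convenience check):

```python
# Quick smoke test: check that PyTorch imports inside the activated elm_env
# environment and whether a GPU is visible.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```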
The models folder contains Branch-ELM neuron models of various sizes, pre-trained on NeuronIO.
Model size | 1 | 2 | 3 | 5 | 7 | 10 | 15 | 20 | 25 | 30 | 40 | 50 | 75 | 100
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
#params | 4601 | 4708 | 4823 | 5077 | 5363 | 5852 | 6827 | 8002 | 9377 | 10952 | 14702 | 19252 | 34127 | 54002
AUC | 0.9437 | 0.9582 | 0.9558 | 0.9757 | 0.9827 | 0.9878 | 0.9915 | 0.9922 | 0.9926 | 0.9929 | 0.9934 | 0.9934 | 0.9938 | 0.9935
We also include a best-effort trained ELM neuron that achieves 0.9946 AUC.
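A minimal sketch of how one of these checkpoints might be loaded is shown below; the file name and the checkpoint format (full module vs. plain state dict) are assumptions, so please refer to eval_elm_on_neuronio.ipynb for the authoritative loading code.

```python
# Sketch only: the file name below is a placeholder and the checkpoint format
# (pickled module vs. state_dict) is an assumption -- see
# eval_elm_on_neuronio.ipynb for the actual loading procedure.
import torch

ckpt_path = "models/branch_elm_neuronio.pt"  # placeholder file name
checkpoint = torch.load(ckpt_path, map_location="cpu")

if isinstance(checkpoint, torch.nn.Module):
    model = checkpoint.eval()        # checkpoint stores the full module
else:
    print(list(checkpoint.keys()))   # otherwise inspect the stored state_dict / config
```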
The repository provides the following notebooks and scripts:
- train_elm_on_shd.ipynb: train an ELM neuron on the SHD or SHD-Adding dataset.
- train_elm_on_neuronio.ipynb: train an ELM neuron on the NeuronIO dataset.
- eval_elm_on_neuronio.ipynb: evaluate the provided models on the NeuronIO dataset.
- neuronio_train_script: script to train an ELM neuron on the NeuronIO dataset.
The src folder contains the implementation and training/evaluation utilities.
- expressive_leaky_memory_neuron.py: the implementation of the ELM model (see the usage sketch after this list).
- neuronio: files related to visualising, training and evaluating on the NeuronIO dataset.
- shd: files related to downloading, training and evaluating on the Spiking Heidelberg Digits (SHD) dataset, and its SHD-Adding version.
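As a rough illustration of how the model in expressive_leaky_memory_neuron.py is meant to be used, the sketch below instantiates an ELM neuron and runs a forward pass on random input. The class name, constructor arguments, and tensor shapes are assumptions and may differ from the actual interface; the training notebooks show the real usage.

```python
# Illustrative only: the class name, constructor arguments, and tensor shapes
# are assumptions, not the repository's actual API -- consult the notebooks
# and src/expressive_leaky_memory_neuron.py for the real interface.
import torch
from src.expressive_leaky_memory_neuron import ELM  # assumed import path / class name

model = ELM(num_input=1278, num_memory=10, num_output=1)  # assumed arguments
x = torch.randn(8, 500, 1278)  # (batch, time steps, input channels), assumed layout
y = model(x)                   # assumed to return a per-time-step prediction
print(y.shape)
```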
Note: the PyTorch implementation unfortunately seems to be about 2x slower than the JAX version.
Running the NeuronIO-related code requires downloading the dataset first (~115GB).
- Download Train Data: single-neurons-as-deep-nets-nmda-train-data
- Download Test Data (Data_test): single-neurons-as-deep-nets-nmda-test-data
- For more information, please check out the following repository: neuron_as_deep_net
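Once downloaded, a quick way to confirm the data is in place is to count the files and their total size. The directory layout below is only an assumed convention; adjust the paths to wherever you stored the data.

```python
# Assumed directory layout -- adjust the paths to wherever you placed the
# downloaded NeuronIO train/test data.
from pathlib import Path

for split in ("train", "test"):
    files = [f for f in Path(f"data/neuronio/{split}").glob("*") if f.is_file()]
    total_gb = sum(f.stat().st_size for f in files) / 1e9
    print(f"{split}: {len(files)} files, {total_gb:.1f} GB")
```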
Running the SHD-related code is possible without separately downloading the dataset (~0.5GB).
- The small SHD dataset will automatically be downloaded upon running the related notebook.
- A dataloader for the introduced SHD-Adding dataset is provided in /src/shd/shd_data_loader.py (an illustrative sketch of the task follows this list).
- For more information on SHD, please check out the following website: spiking-heidelberg-datasets-shd
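For intuition on the SHD-Adding task, the sketch below builds one sample in the style we understand from the preprint: two SHD digits are presented one after the other and the target is the sum of their labels. The dense (time, channels) spike layout and the dummy spike trains are illustrative assumptions; the provided shd_data_loader.py is the reference implementation.

```python
# Illustrative SHD-Adding style sample built from dummy spike trains.
# The dense (time, channels) layout and the concatenate-then-sum recipe are
# assumptions for illustration; src/shd/shd_data_loader.py is the reference.
import torch

num_steps, num_channels = 250, 700               # SHD uses 700 input channels
digit_a = (torch.rand(num_steps, num_channels) < 0.02).float()  # dummy spikes
digit_b = (torch.rand(num_steps, num_channels) < 0.02).float()
label_a, label_b = 3, 7                          # dummy digit labels

sample = torch.cat([digit_a, digit_b], dim=0)    # present the two digits in sequence
target = label_a + label_b                       # target: the sum of the two digits
print(sample.shape, target)                      # torch.Size([500, 700]) 10
```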
Code for LRA training/evaluation is not provided at the moment.
- To download the dataset, we recommend checking out the following repository: mega
- For the input preprocessing, please refer to our preprint.
If you like what you find, and use an ELM variant or the SHD-Adding dataset, please consider citing us:
[1] Spieler, A., Rahaman, N., Martius, G., Schölkopf, B., & Levina, A. (2023). The ELM Neuron: An Efficient and Expressive Cortical Neuron Model Can Solve Long-Horizon Tasks. arXiv preprint arXiv:2306.16922.