
spyx's Introduction

⚡🧠💻 Welcome to Spyx! 💻🧠⚡



Why use Spyx?

Spyx (pronounced "spikes") is a compact spiking neural network library built on top of DeepMind's Haiku package. It offers the flexibility and extensibility of PyTorch-based frameworks while delivering the extreme performance of SNN libraries that implement custom CUDA kernels for their dynamics.

The library currently supports training SNNs via surrogate gradient descent and neuroevolution, with additional capabilities such as ANN2SNN conversion and Phasor Networks planned for the future. Spyx offers a number of predefined neuron models and is designed to make it easy to define your own and plug it into a model; the hope is to soon include definitions of SpikingRWKV and other more sophisticated model blocks in the framework.
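To give a flavor of the API, here is a minimal policy network adapted from the issue snippet further down this page (a sketch only: the layer sizes, beta value, and input shape are illustrative, and constructor signatures may differ between Spyx versions):

```python
import haiku as hk
import jax
import jax.numpy as jnp
from spyx import nn as snn

def controller(x, state):
    # Dense layers feed spiking LIF neurons; a non-spiking
    # leaky integrator (LI) reads out the network's output.
    core = hk.DeepRNN([
        hk.Linear(64, with_bias=False),
        snn.LIF(64, beta=0.8),
        hk.Linear(2, with_bias=False),
        snn.LI(2),
    ])
    return core(x, state)

policy = hk.without_apply_rng(hk.transform(controller))
init_state = (jnp.zeros(64), jnp.zeros(2))
params = policy.init(rng=jax.random.PRNGKey(0),
                     x=jnp.zeros(4), state=init_state)
```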

Installation:

As with other libraries built on top of JAX, you need to install JAX with GPU support if you want the full benefit of this library. Directions for installing JAX with GPU support can be found here: https://github.com/google/jax#installation

The best way to run Spyx is to install it into a container or environment that already has JAX and PyTorch installed.

The spyx.data submodule contains some pre-built dataloaders for use with Spyx; to install its dependencies, run pip install spyx[data].
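Before training, it is worth confirming that JAX actually sees your accelerator; this quick check is generic JAX, not Spyx-specific:

```python
import jax

# Should list a GPU device rather than only CpuDevice
# if the GPU-enabled install succeeded.
print(jax.devices())
```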

Hardware Requirements:

Spyx achieves extremely high performance by keeping the entire dataset in the GPU's vRAM; as such, a decent amount of memory on both the CPU and GPU is needed to handle dataset loading and training. For smaller networks of only several hundred thousand parameters, training can be comfortably executed even on laptop GPUs with only 6 GB of vRAM. For large SNNs or for neuroevolution, a higher-memory card is recommended.
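Keeping a dataset resident in vRAM amounts to moving it to the device once before the training loop; a minimal sketch (the array shapes are illustrative):

```python
import jax
import jax.numpy as jnp
import numpy as np

# Load/decode the dataset on the host once (illustrative shapes)...
X = np.random.rand(10000, 100, 128).astype(np.float32)
Y = np.random.randint(0, 20, size=10000)

# ...then push it into the GPU's vRAM so that every training step
# reads from device memory, with no host-to-device transfers.
X_gpu = jax.device_put(jnp.asarray(X))
Y_gpu = jax.device_put(jnp.asarray(Y))
```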

Since Spyx is developed on top of the current JAX version, it does not work on Google Colab's TPUs, which use an older version. Cloud TPU support will be tested in the near future.

Research and Projects Using Spyx:

Experiments/Benchmarks used in the Spyx Paper: Benchmark Notebooks

Master's Thesis: Neuroevolution of Spiking Neural Networks DOI

*** Your projects and research could be here! ***

Contributing:

If you'd like to contribute, head over to the issues page to find proposed enhancements and leave a comment! Questions are also welcome on the Open Neuromorphic Discord server.

Citation:

If you find Spyx useful in your work, please cite it using the following BibTeX entries:

@misc{heckel2024spyx,
    title={Spyx: A Library for Just-In-Time Compiled Optimization of Spiking Neural Networks},
    author={Kade M. Heckel and Thomas Nowotny},
    year={2024},
    eprint={2402.18994},
    archivePrefix={arXiv},
    primaryClass={cs.NE}
}
@software{kade_heckel_2024_10635178,
  author       = {Kade Heckel and
                  Steven Abreu and
                  Gregor Lenz and
                  Thomas Nowotny},
  title        = {kmheckel/spyx: v0.1.17},
  month        = feb,
  year         = 2024,
  publisher    = {Zenodo},
  version      = {camera-ready},
  doi          = {10.5281/zenodo.10635178},
  url          = {https://doi.org/10.5281/zenodo.10635178}
}

spyx's People

Contributors

biphasic, kmheckel, stevenabreu7, tnowotny


spyx's Issues

Spyx SHD paper benchmark needs review

There might be a bug with respect to time_major = False or True that could produce incorrect results, since scanning over one axis is faster than scanning over the other (time steps vs. the channel dimension).
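One way to check the performance side of this would be to time the same scan on both layouts; a generic JAX sketch (not the benchmark code itself), noting that jax.lax.scan always consumes the leading axis, so batch-major data pays for a transpose:

```python
import time
import jax
import jax.numpy as jnp

def simulate(xs):
    # Toy membrane update; jax.lax.scan iterates over axis 0,
    # so xs must be time-major: (time, batch, channels).
    def step(v, x):
        v = 0.9 * v + x
        return v, v
    return jax.lax.scan(step, jnp.zeros(xs.shape[1:]), xs)[1]

sim = jax.jit(simulate)
time_major = jnp.ones((500, 256, 128))

sim(time_major).block_until_ready()   # warm-up / compile
t0 = time.perf_counter()
sim(time_major).block_until_ready()
print("time-major scan:", time.perf_counter() - t0)

# Batch-major data (batch, time, channels) must be transposed
# before the same scan, which is where layout costs can hide:
batch_major = jnp.ones((256, 500, 128))
sim(jnp.transpose(batch_major, (1, 0, 2))).block_until_ready()
```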

Convert Regularization Utilities to Higher Order Functions

I think it's appropriate for Spyx to adopt a functional approach as much as possible, so using functions that return other functions (higher-order functions) rather than classes would be fitting, since it helps guide the user towards JIT compiling. The pattern is sketched after the checklist below.

  • Convert spyx.fn.silence_reg to a H.O.F.
  • Convert spyx.fn.sparsity_reg to a H.O.F.
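For illustration, this higher-order-function pattern might look like the following (the names and semantics here are hypothetical, not Spyx's actual API):

```python
import jax.numpy as jnp

def silence_reg(min_spikes):
    """Hypothetical HOF: build a loss term that penalizes neurons
    which emit fewer than `min_spikes` spikes over the trial."""
    def loss(spikes):
        # spikes: (time, batch, neurons) activity traces.
        counts = spikes.sum(axis=0)
        return jnp.maximum(min_spikes - counts, 0.0).mean()
    return loss

# Build once, closing over the constant; the returned `reg` is a
# pure function of the spike trains and is trivially jax.jit-able.
reg = silence_reg(min_spikes=5.0)
```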

AttributeError: module 'spyx.axn' has no attribute 'Axon'

I am experimenting with CartPole using a spiking neural network in Spyx, but I got the following error. Could you please assist?


AttributeError Traceback (most recent call last)
in <cell line: 4>()
2 init_state = (jnp.zeros(64), jnp.zeros(2))
3 policy = hk.without_apply_rng(hk.transform(controller))
----> 4 policy_params = policy.init(rng=key, x=adapter(obs), state=init_state)

2 frames
in controller(x, state)
7 core = hk.DeepRNN([
8 hk.Linear(64, with_bias=False),
----> 9 snn.LIF(64, beta=0.8, activation=spyx.axn.Axon()),
10 hk.Linear(2, with_bias=False),
11 snn.LI(2)

AttributeError: module 'spyx.axn' has no attribute 'Axon'
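One quick diagnostic here is to list what the installed spyx.axn module actually exposes, since Axon may simply not exist under that name in the installed release:

```python
import spyx.axn

# Print the public attributes of the surrogate-gradient module to
# see which activation constructors this version provides.
print([name for name in dir(spyx.axn) if not name.startswith("_")])
```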

Adjust Neuron Models in spyx.nn to store constant betas as hk.params

Currently, if the user specifies the inverse time constant/beta value, it will not be tracked in the network's PyTree, making the layer invisible when trying to export it to NIR for cross-platform use.

To fix this, each neuron model needs an "else" clause that calls hk.get_parameter(), but with the init argument set to the user-specified value.

See the fixed LI neuron as an example of what needs to be done for the other neuron models (except for IF, which will need a different solution/approach to be visible).
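A simplified sketch of this pattern, assuming Haiku's standard hk.get_parameter and hk.initializers.Constant APIs (this is a toy neuron for illustration, not Spyx's actual implementation):

```python
import haiku as hk
import jax.numpy as jnp

class ToyLIF(hk.Module):  # illustration only
    def __init__(self, size, beta=None):
        super().__init__()
        self.size = size
        self.beta = beta

    def __call__(self, x, v):
        if self.beta is None:
            # Learnable inverse time constant.
            beta = hk.get_parameter("beta", [self.size], init=jnp.ones)
        else:
            # Constant beta, but still registered via hk.get_parameter
            # so it appears in the PyTree and survives export to NIR.
            beta = hk.get_parameter(
                "beta", [self.size],
                init=hk.initializers.Constant(self.beta))
        v = beta * v + x
        return (v > 1.0).astype(jnp.float32), v
```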

Add Phasor layers/network capability

It would be awesome to implement the ability to train spiking phasor networks in Spyx. JAX supports complex-valued autodifferentiation, so this should be possible. Doing this would enable extremely fast training by eliminating recurrence during learning, before converting to a recurrent architecture for inference.

https://arxiv.org/abs/2204.00507
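The complex-valued autodiff this relies on is already available in JAX; for instance, differentiating a holomorphic function only requires setting a flag (a generic JAX example, not Spyx code):

```python
import jax
import jax.numpy as jnp

# Gradient of a holomorphic function of a complex variable.
f = lambda z: jnp.sin(z) * z**2
df = jax.grad(f, holomorphic=True)
print(df(1.0 + 0.5j))
```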

Write test cases for CI/CD

Right now there are no test cases to verify that changes to the code base still train models correctly, and no way to detect whether changes to other packages might break the library.

  • Write test functions for various neuron models on simple synthetic data traces.
  • Write test functions to compute gradients for each surrogate model to ensure no errors (a minimal example follows).
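A minimal version of such a gradient test might look like this (pytest-style; the sigmoid surrogate is a stand-in, not a specific Spyx function):

```python
import jax
import jax.numpy as jnp

def test_surrogate_gradient_is_finite():
    # Stand-in surrogate: a smooth sigmoid replacing the
    # non-differentiable Heaviside spike function.
    surrogate = lambda v: jax.nn.sigmoid(4.0 * (v - 1.0))
    grads = jax.vmap(jax.grad(surrogate))(jnp.linspace(-2.0, 4.0, 101))
    assert jnp.all(jnp.isfinite(grads))
    assert jnp.any(grads > 0)  # gradient actually flows near threshold
```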

Fix Sphinx Documentation

Right now the Sphinx documentation isn't showing the members of each submodule.

  • Get Sphinx to render member functions for each submodule

Support for latency-based spike coding

  • Add functions to convert static data to spiking data via latency coding (a minimal sketch follows this list)
  • Add loss functions that are compatible with/allow time-to-first-spike training
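Latency coding maps each input intensity to a single spike whose timing is inversely related to the intensity; a minimal sketch (the helper name and the linear mapping are illustrative choices):

```python
import jax.numpy as jnp

def latency_encode(x, num_steps):
    """Map intensities in [0, 1] to one-hot spike trains where
    stronger inputs spike earlier (hypothetical helper)."""
    spike_times = jnp.round((1.0 - x) * (num_steps - 1)).astype(jnp.int32)
    steps = jnp.arange(num_steps).reshape(-1, *([1] * x.ndim))
    return (steps == spike_times).astype(jnp.float32)  # (time, *x.shape)

# Intensity 1.0 spikes at step 0; 0.1 spikes near the end.
spikes = latency_encode(jnp.array([0.1, 0.5, 1.0]), num_steps=10)
```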

Implement interface to NIR

Create functions to load/save models to HDF5 under the Neuromorphic Intermediate Representation standard to facilitate cross-platform deployment. A hypothetical usage sketch follows the checklist below.

https://nnir.readthedocs.io/en/latest/what.html

  • Implement spyx.nir.to_nir()

  • Implement spyx.nir.from_nir()

  • Support feed-forward network import

  • Support ConvNet import

  • Support explicitly recurrent import

  • Implement FFN exporting

  • Implement CSNN exporting

  • Implement RSNN exporting
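Once implemented, usage might look roughly like this (entirely hypothetical: spyx.nir.to_nir and spyx.nir.from_nir are the proposed functions from the checklist above, while nir.read and nir.write come from the reference NIR Python package):

```python
import nir  # reference NIR Python package

# Export a trained Spyx model to a NIR graph and save it as HDF5
# (to_nir is proposed above and not yet implemented):
# nir_graph = spyx.nir.to_nir(params, sample_input_shape)
# nir.write("model.nir", nir_graph)

# Load a NIR graph produced by another framework into Spyx:
# loaded = nir.read("model.nir")
# params = spyx.nir.from_nir(loaded)
```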

Improve Documentation

The current documentation is passable but it could always be better.

  • Make spyx.nn listings cleaner by removing unwanted special-members
  • Incorporate notebooks from the examples/research folders.
  • More math equations for surrogate gradient functions in spyx.axn
  • Write intro and quickstart pages
  • Add documentation/tutorials for NIR exporting beyond the sample notebooks/demos.
  • Add better return type annotations showing the arguments for callables returned by HOFs/function builders.
