

Tags:

Pricing Type

  • Pricing Type: Free

GitHub Link

The GitHub link is https://github.com/yfguo91/mpbn

Introduction

The project “MPBN” provides the official implementation of Membrane Potential Batch Normalization for Spiking Neural Networks, introduced at ICCV2023. The approach adds another Batch Normalization (BN) layer before the firing function in spiking neural networks, re-normalizing the membrane potential after the updating operation has disturbed the data flow regulated by the preceding BN layer. The supported datasets can be downloaded automatically, instructions for starting training are given, and the citation for the method is included.

Spiking Neural Networks (SNNs) as one of the biology-inspired models have received much attention recently.

Content

Official implementation of Membrane Potential Batch Normalization for Spiking Neural Networks (ICCV2023). The spiking neuron is much more complex than its ANN counterpart because of its spatio-temporal dynamics: the data flow regulated by a BN layer is disturbed again by the membrane potential updating operation before the firing function, i.e., the nonlinear activation. Therefore, we advocate adding another BN layer before the firing function to normalize the membrane potential again, called MPBN. The datasets will be downloaded automatically.
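To illustrate the idea, the following is a minimal NumPy sketch (not the official PyTorch code from the repository) of a single LIF neuron step with MPBN: the membrane potential is batch-normalized immediately before the firing function. The function names `batch_norm` and `lif_step_mpbn`, and the parameter values (`tau`, `v_th`), are illustrative assumptions, not taken from the repo.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize over the batch dimension (axis 0), as a BN layer does during training.
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

def lif_step_mpbn(v, x, tau=2.0, v_th=1.0):
    """One LIF update with MPBN: BN is applied to the membrane
    potential *before* the firing (Heaviside) function."""
    v = v + (x - v) / tau                          # membrane potential updating operation
    v_norm = batch_norm(v)                         # MPBN: re-normalize the membrane potential
    spikes = (v_norm >= v_th).astype(np.float32)   # firing function (nonlinear activation)
    v_next = v_norm * (1.0 - spikes)               # hard reset for neurons that fired
    return v_next, spikes
```

In training, a surrogate gradient would replace the hard threshold; the sketch only shows where the extra BN sits relative to the membrane update and the firing function.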


Related

RMP-Loss: Regularizing Membrane Potential Distribution for Spiking Neural Networks