Excitatory Synapse¶
class snnpytorch.synapse.exc_synapse.SynapseLayer(num_input_neurons=10, num_output_neurons=100, synapse_time_constant=0.05, conn_prob=0.5, initial_weight_config=None)¶
Bases: torch.nn.modules.module.Module

Class of a synapse layer.
The iterative model for the synapse is inspired by the following papers:
‘Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation’ by Rathi et al., ICLR 2020, https://openreview.net/forum?id=B1xSperKvH
‘Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks’ by Wu et al., Frontiers in Neuroscience 2018, https://www.frontiersin.org/articles/10.3389/fnins.2018.00331/full
In PyTorch, the synapse state i (the total synaptic input current to a neuron) is updated as follows:
\[i^{t} = \lambda i^{t-1} + wI\]
where \(\lambda\) is the synaptic time constant, t is the current timestep/iteration, w is the synaptic weight, and I is the input spike.
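To make the update rule concrete, here is a minimal sketch of it in plain PyTorch; the tensor shapes and the variable names (weight, state, spikes) are illustrative assumptions, not the layer's actual internals:

import torch

# Illustrative sketch of i^t = lambda * i^(t-1) + w * I (names and shapes are assumptions)
num_input_neurons, num_output_neurons = 10, 100
synapse_time_constant = 0.05

weight = torch.full((num_output_neurons, num_input_neurons), 0.25)  # default constant init, value 0.25
state = torch.zeros(num_output_neurons)                             # i^(t-1), starts at rest

spikes = (torch.rand(num_input_neurons) < 0.1).float()              # binary input spikes I
state = synapse_time_constant * state + weight @ spikes             # i^t = lambda * i^(t-1) + w * I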
- Parameters
num_input_neurons – Number of neurons in input layer
num_output_neurons – Number of neurons in output layer
synapse_time_constant – Synapse time constant
conn_prob – Connection probability from input to output layer
initial_weight_config – Weight initialisation method and value.
Default value of initial_weight_config: {“method”: “constant”, “value”: 0.25}
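A hypothetical usage sketch, assuming the layer is driven one timestep at a time with a 1-D binary spike tensor; the input shape and the surrounding simulation loop are assumptions rather than documented behaviour:

import torch
from snnpytorch.synapse.exc_synapse import SynapseLayer

layer = SynapseLayer(
    num_input_neurons=10,
    num_output_neurons=100,
    synapse_time_constant=0.05,
    conn_prob=0.5,
    initial_weight_config={"method": "constant", "value": 0.25},
)
layer.initialize_states(model_device="cpu")        # set up weights, connectivity, and state

for t in range(100):                               # assumed simulation loop over timesteps
    spikes = (torch.rand(10) < 0.1).float()        # assumed binary input spike tensor
    current = layer.forward(spikes)                # synapse state / input current to the output layer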
create_connectivity() → None¶
Create random connectivity between input and output layer neurons.
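One plausible way to realise random connectivity, sketched below, is a Bernoulli mask drawn with probability conn_prob that gates the weight matrix; this is an assumption about the internals, not the library's actual code:

import torch

# Hypothetical sketch: binary connectivity mask between input and output neurons
conn_prob = 0.5
connectivity = torch.bernoulli(
    torch.full((100, 10), conn_prob)   # assumed shape (num_output_neurons, num_input_neurons)
)
masked_weight = connectivity * 0.25    # only connected synapses carry a non-zero weight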
forward(x: torch.Tensor) → torch.Tensor¶
Forward pass for this synapse layer.
- Parameters
x – Input neuron spikes
- Returns
Synapse state / Input current to the output layer
initialize_states(model_device='cuda:0') → None¶
Initialise synapse layer weights, connectivity, and state.
- Parameters
model_device – ‘cpu’ or ‘cuda:0’
update_synapse_states(x: torch.Tensor)¶
Update the synapse state.
- Parameters
x – Input neuron spikes
- Returns
Synapse state / Input current to the output layer
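Because the update multiplies the previous state by \(\lambda\) at every step, the state decays geometrically in the absence of input spikes. A small sketch of that behaviour, assuming the same update rule as above:

import torch

# With no input spikes (I = 0), each step leaves i^t = lambda * i^(t-1)
synapse_time_constant = 0.05
state = torch.tensor([1.0])
for t in range(3):
    state = synapse_time_constant * state
print(state)  # tensor([0.000125]), i.e. 0.05 ** 3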