neuroaikit.tf.layers.snu.SNU¶
- neuroaikit.tf.layers.snu.SNU(units, activation=<function step_function>, decay=0.8, g=<function identity>, recurrent=False, lateral_inhibition=False, **args)[source]¶
This is a basic Spiking Neural Unit (SNU) layer.
- Parameters
units – Number of units to create in the layer
decay – Membrane potential decay multiplier, defaults to 0.8, i.e. 0.8 of the previous membrane potential is retained
activation – Activation function, defaults to step_function. See TF_Misc.Activations.
g – Internal state activation function that optionally constrains the state, defaults to tf.identity (no constraint)
recurrent – bool, defaults to False. If True, the SNU includes recurrent connections spanning the entire layer.
lateral_inhibition – bool, defaults to False. If True, layer-wise lateral inhibition is used.
args – Additional arguments to the Keras layer constructor (e.g. name, trainable).
- Returns
A configured SNU Keras layer instance.
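To illustrate how the decay, activation, and g parameters interact, here is a minimal single-unit sketch of the SNU state update in plain Python. It is not the library's implementation: the names snu_step and step_function are hypothetical, and the reset of the membrane potential after a spike is an assumption based on the standard SNU formulation; the real layer operates on tensors via TensorFlow.

```python
def step_function(s, threshold=1.0):
    # Emit a spike (1.0) once the internal state reaches the threshold.
    return 1.0 if s >= threshold else 0.0

def snu_step(x, s_prev, y_prev, w, decay=0.8, g=lambda s: s):
    # Hypothetical sketch of one SNU time step (assumed dynamics):
    # the previous membrane potential is scaled by `decay` (0.8 of it
    # is retained, as in the parameter description) and reset where a
    # spike occurred (y_prev = 1), then the weighted input is added.
    # `g` optionally constrains the state; identity means no constraint.
    s = g(w * x + decay * s_prev * (1.0 - y_prev))
    y = step_function(s)
    return s, y

# Drive one unit with a constant input of 0.4 and weight 1.0:
s, y = 0.0, 0.0
spikes = []
for _ in range(5):
    s, y = snu_step(0.4, s, y, w=1.0)
    spikes.append(y)
# The state accumulates (0.4, 0.72, 0.976, 1.18...), fires once the
# threshold is crossed, then resets: spikes == [0.0, 0.0, 0.0, 1.0, 0.0]
```

With decay below 1.0 the membrane potential leaks between time steps, so a sub-threshold input must arrive repeatedly before the unit fires; this is the leaky-integrate-and-fire behavior the decay parameter controls.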