ExpIF#
- class brainpy.state.ExpIF(in_size, R=Quantity(1., "ohm"), tau=Quantity(10., "ms"), V_th=Quantity(-30., "mV"), V_reset=Quantity(-68., "mV"), V_rest=Quantity(-65., "mV"), V_T=Quantity(-59.9, "mV"), delta_T=Quantity(3.48, "mV"), V_initializer=Constant(value=Quantity(-65., "mV")), spk_fun=ReluGrad(alpha=0.3, width=1.0), spk_reset='soft', name=None)#
Exponential Integrate-and-Fire (ExpIF) neuron model.
This model augments the LIF neuron by adding an exponential spike-initiation term, which provides a smooth approximation of the action potential onset and improves biological plausibility for cortical pyramidal cells.
The membrane potential dynamics follow:
\[ \tau \frac{dV}{dt} = -(V - V_{rest}) + \Delta_T \exp\left(\frac{V - V_T}{\Delta_T}\right) + R \cdot I(t) \]

Spike condition: if \(V \geq V_{th}\), emit a spike and reset \(V = V_{reset}\) (hard reset) or \(V = V - (V_{th} - V_{reset})\) (soft reset).
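The dynamics above can be sketched as a single forward-Euler step in plain NumPy (an illustrative sketch using the default parameter values, not the brainpy implementation, which integrates the state internally):

```python
import numpy as np

# Default parameters from the class signature (units: mV, ms).
tau = 10.0        # membrane time constant (ms)
V_rest = -65.0    # resting potential (mV)
V_T = -59.9       # threshold potential of the exponential term (mV)
delta_T = 3.48    # spike slope factor (mV)
R = 1.0           # membrane resistance
V_th = -30.0      # numerical firing threshold (mV)
V_reset = -68.0   # reset voltage (mV)

def expif_step(V, I, dt=0.1):
    """One Euler step of tau dV/dt = -(V - V_rest)
    + delta_T * exp((V - V_T) / delta_T) + R * I, with hard reset."""
    dV = (-(V - V_rest) + delta_T * np.exp((V - V_T) / delta_T) + R * I) / tau
    V = V + dt * dV
    spike = V >= V_th
    V = np.where(spike, V_reset, V)   # hard reset on spike
    return V, spike
```

At rest with no input, the exponential term contributes only a small depolarizing drift; near threshold it dominates and produces the rapid spike upswing.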
- Parameters:
  - in_size (Size) – Size of the input to the neuron.
  - R (ArrayLike, default 1. * u.ohm) – Membrane resistance.
  - tau (ArrayLike, default 10. * u.ms) – Membrane time constant.
  - V_th (ArrayLike, default -30. * u.mV) – Numerical firing threshold voltage.
  - V_reset (ArrayLike, default -68. * u.mV) – Reset voltage after a spike.
  - V_rest (ArrayLike, default -65. * u.mV) – Resting membrane potential.
  - V_T (ArrayLike, default -59.9 * u.mV) – Threshold potential of the exponential term.
  - delta_T (ArrayLike, default 3.48 * u.mV) – Spike slope factor controlling the sharpness of spike initiation.
  - V_initializer (Callable) – Initializer for the membrane potential state.
  - spk_fun (Callable, default surrogate.ReluGrad()) – Surrogate gradient function for spike generation.
  - spk_reset (str, default 'soft') – Reset mechanism after spike generation.
  - name (str, optional) – Name of the neuron layer.
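The effect of the spk_reset option can be sketched in plain NumPy (illustrative, using the default threshold and reset values; not the library implementation):

```python
import numpy as np

V_th, V_reset = -30.0, -68.0
V = np.array([-25.0, -40.0])     # first neuron is above threshold
spike = V >= V_th

# 'hard' reset: jump straight to V_reset.
V_hard = np.where(spike, V_reset, V)
# 'soft' reset: subtract the threshold-to-reset distance,
# preserving any overshoot above V_th.
V_soft = np.where(spike, V - (V_th - V_reset), V)
```

A soft reset retains the residual depolarization beyond threshold, which can matter for gradient flow when training with surrogate gradients.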
- V#
Membrane potential.
- Type:
HiddenState
Notes
The model was first introduced by Nicolas Fourcaud-Trocmé, David Hansel, Carl van Vreeswijk and Nicolas Brunel [1]. The exponential nonlinearity was later confirmed by Badel et al. [3]. It is one of the prominent examples of a precise theoretical prediction in computational neuroscience that was later confirmed by experimental neuroscience.
The right-hand side of the above equation contains a nonlinearity that can be directly extracted from experimental data [3]. In this sense the exponential nonlinearity is not an arbitrary choice but directly supported by experimental evidence.
Even though it is a nonlinear model, it is simple enough to calculate the firing rate for constant input, and the linear response to fluctuations, even in the presence of input noise [4].
For a comprehensive treatment of this model, see [2] and [5].
References
Examples
>>> import brainpy
>>> import brainstate
>>> import saiunit as u
>>> # Create an ExpIF neuron layer with 10 neurons
>>> expif = brainpy.state.ExpIF(10, tau=10*u.ms, V_th=-30*u.mV)
>>> # Initialize the state
>>> expif.init_state(batch_size=1)
>>> # Apply an input current and update the neuron state
>>> spikes = expif.update(x=1.5*u.mA)
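A full time-stepped simulation can be sketched in plain NumPy as well (an illustrative stand-in for the update loop that brainpy runs internally; parameter values mirror the defaults, and the constant drive of 20 mV-equivalent is an arbitrary choice for this sketch):

```python
import numpy as np

# ExpIF defaults (units: mV, ms).
tau, V_rest, V_T, delta_T, R = 10.0, -65.0, -59.9, 3.48, 1.0
V_th, V_reset, dt = -30.0, -68.0, 0.1

V = np.full(10, V_rest)   # 10 neurons starting at rest
I = 20.0                  # constant suprathreshold input (arbitrary for the sketch)
n_spikes = 0
for _ in range(1000):     # 100 ms of simulated time
    dV = (-(V - V_rest) + delta_T * np.exp((V - V_T) / delta_T) + R * I) / tau
    V = V + dt * dV
    spk = V >= V_th
    n_spikes += int(spk.sum())
    V = np.where(spk, V_reset, V)   # hard reset
```

With a constant suprathreshold input, every neuron settles into a regular firing cycle: slow integration from V_reset, then a rapid exponential upswing near V_T.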
- get_spike(V=None)[source]#
Generate spikes based on the membrane potential.
The spike is computed with the surrogate gradient function self.spk_fun, which enables gradient-based learning through the otherwise non-differentiable threshold.
- Parameters:
V (ArrayLike, optional) – Membrane potential. If None, the current membrane potential state is used.
- Returns:
Binary spike tensor where 1 indicates a spike and 0 indicates no spike.
- Return type:
ArrayLike
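The idea behind get_spike can be sketched in plain NumPy: a hard threshold in the forward pass, with a smooth surrogate (here a ReluGrad-like triangular window, matching the alpha=0.3, width=1.0 defaults in the signature) standing in for its derivative during training. This is an illustrative sketch, not the brainpy implementation:

```python
import numpy as np

def relu_grad_surrogate(v_minus_th, alpha=0.3, width=1.0):
    """Surrogate derivative used in place of the Heaviside step's
    zero-almost-everywhere gradient: a triangular window of height
    alpha and half-width `width` around the threshold crossing."""
    return alpha * np.maximum(width - np.abs(v_minus_th), 0.0)

def get_spike(V, V_th=-30.0):
    """Forward pass: binary spike where V crosses the numerical threshold."""
    return (V >= V_th).astype(float)
```

In an autodiff framework the forward step function and the surrogate derivative are fused into one custom-gradient primitive, which is what self.spk_fun provides.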