stdp_synapse_hom#
- class brainpy.state.stdp_synapse_hom(weight=1.0, delay=Quantity(1., 'ms'), receptor_type=0, tau_plus=Quantity(20., 'ms'), tau_minus=Quantity(20., 'ms'), lambda_=0.01, alpha=1.0, mu_plus=1.0, mu_minus=1.0, Wmax=100.0, Kplus=0.0, post=None, name=None)#
NEST-compatible stdp_synapse_hom connection model with homogeneous plasticity parameters.
stdp_synapse_hom implements pair-based spike-timing-dependent plasticity (STDP) following Guetig et al. (2003) and the NEST reference implementation from models/stdp_synapse_hom.h. The model is identical to stdp_synapse in its plasticity dynamics but enforces that tau_plus, lambda, alpha, mu_plus, mu_minus, and Wmax are common model properties shared by all connections of this type, rather than per-connection parameters.
This design mirrors NEST's homogeneous synapse convention, where plasticity hyperparameters are set once at the model level (via CopyModel/SetDefaults) and cannot be overridden on individual connections. Per-connection state remains limited to weight and Kplus (presynaptic eligibility trace).
1. Mathematical Model
The STDP dynamics are identical to stdp_synapse. See that class for the full mathematical derivation. In brief:
State Variables (per connection):
w: Synaptic weight (plastic, bounded to \([0, W_{\max}]\) or \([W_{\max}, 0]\))
K^+: Presynaptic eligibility trace (decays with \(\tau_+\))
Shared Plasticity Parameters (model-level):
\(\tau_+\) – Presynaptic trace time constant
\(\lambda\) – Potentiation learning rate
\(\alpha\) – Depression/potentiation ratio
\(\mu_+\) – Potentiation weight-dependence exponent
\(\mu_-\) – Depression weight-dependence exponent
\(W_{\max}\) – Maximum allowed weight magnitude
Weight Updates:
Upon presynaptic spike at time \(t_{\text{pre}}\) with dendritic delay \(d\):
Facilitation from past postsynaptic spikes in \((t_{\text{last}} - d,\, t_{\text{pre}} - d]\):
\[\hat{w} \leftarrow \hat{w} + \lambda (1 - \hat{w})^{\mu_+} K^+_{\text{eff}}\]
Depression from the current postsynaptic trace \(K^-(t_{\text{pre}} - d)\):
\[\hat{w} \leftarrow \hat{w} - \alpha \lambda \hat{w}^{\mu_-} K^-_{\text{eff}}\]
Send the weighted spike event to the postsynaptic neuron.
Update presynaptic trace:
\[K^+ \leftarrow K^+ e^{(t_{\text{last}} - t_{\text{pre}}) / \tau_+} + 1\]
where \(\hat{w} = w / W_{\max}\) is the normalized weight.
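As a rough illustration only (this is not the brainpy.state implementation), the update rules above can be written in plain Python on the normalized weight \(\hat{w} = w / W_{\max}\); all helper names and default values here are illustrative:

```python
import math

# Illustrative sketch of the pair-based STDP rules above, expressed on the
# normalized weight w_hat = w / Wmax. Not the library implementation.

def facilitate(w_hat, k_plus_eff, lambda_=0.01, mu_plus=1.0):
    """Potentiation: w_hat += lambda * (1 - w_hat)**mu_plus * K+_eff."""
    return w_hat + lambda_ * (1.0 - w_hat) ** mu_plus * k_plus_eff

def depress(w_hat, k_minus_eff, lambda_=0.01, alpha=1.0, mu_minus=1.0):
    """Depression: w_hat -= alpha * lambda * w_hat**mu_minus * K-_eff."""
    return max(0.0, w_hat - alpha * lambda_ * w_hat ** mu_minus * k_minus_eff)

def update_kplus(k_plus, t_last, t_pre, tau_plus=20.0):
    """Decay K+ from t_last to t_pre (both in ms), then add 1 for the new spike."""
    return k_plus * math.exp((t_last - t_pre) / tau_plus) + 1.0
```

With mu_plus = mu_minus = 1 these reduce to the classic multiplicative soft-bounded STDP rule; mu values near 0 approach additive STDP.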
2. Homogeneous Property Semantics
In NEST, stdp_synapse_hom models enforce that plasticity hyperparameters (tau_plus, lambda, alpha, mu_plus, mu_minus, Wmax) are common properties set once at the model level, not per-connection. This implementation replicates that constraint by:
Accepting these parameters only during model construction (__init__) or global model updates (set called on the model instance).
Rejecting these parameters in connection-time synapse specifications passed to check_synapse_params() (called by NEST-style Connect APIs).
Storing a single copy of each parameter shared by all connections.
Workflow example:
>>> # Set common properties at model construction
>>> stdp_model = stdp_synapse_hom(
...     weight=1.0,
...     delay=1.0 * u.ms,
...     tau_plus=20.0 * u.ms,
...     lambda_=0.01,
...     alpha=1.05,
...     mu_plus=1.0,
...     mu_minus=1.0,
...     Wmax=100.0,
... )
>>> # OK: per-connection weight at connect time
>>> stdp_model.check_synapse_params({'weight': 2.5})
>>> # ERROR: cannot override common property at connect time
>>> stdp_model.check_synapse_params({'lambda': 0.02})  # raises ValueError
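The rejection logic in this workflow can be sketched as a standalone helper (hypothetical; the real check is a method on the model class, and the exact error wording may differ):

```python
# Hypothetical sketch of connect-time validation; not the library source.
COMMON_PROPERTIES = frozenset(
    ['tau_plus', 'lambda', 'alpha', 'mu_plus', 'mu_minus', 'Wmax']
)

def check_synapse_params(syn_spec):
    """Reject common model properties in a per-connection syn_spec dict."""
    if syn_spec is None:
        return  # nothing to validate
    for key in syn_spec:
        if key in COMMON_PROPERTIES:
            raise ValueError(
                f"{key} cannot be specified in connect-time synapse "
                f"parameters; set common properties on the model itself."
            )
```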
3. Validation Semantics
Unlike stdp_synapse, NEST stdp_synapse_hom does not enforce:
The weight/Wmax sign consistency check (mixed signs are allowed).
The Kplus >= 0 non-negativity constraint (negative traces are allowed).
This implementation replicates NEST behavior by overriding the validation methods to no-ops:
_validate_non_negative() (disables the Kplus >= 0 check)
_validate_weight_wmax_sign() (disables the sign consistency check)
4. Event Timing and Ordering
Event processing follows the same sequence as stdp_synapse:
Query postsynaptic spike history in the window \((t_{\text{last}} - d,\, t_{\text{pre}} - d]\)
Apply facilitation for each retrieved postsynaptic spike
Compute postsynaptic trace \(K^-\) at \(t_{\text{pre}} - d\)
Apply depression based on \(K^-\)
Schedule weighted spike event for delivery after delay \(d\)
Update presynaptic trace \(K^+\) and timestamp
t_lastspike
Note: Event timing uses on-grid spike stamps and ignores sub-step offsets.
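The six steps above can be sketched as a single pure function. This is a hedged illustration, not the library code: post_history and post_trace_at are hypothetical stand-ins for the postsynaptic archive, and the facilitation decay follows the NEST convention of evaluating K+ at the delay-shifted post-spike time.

```python
import math

# Hedged sketch of the event-processing order; illustrative names only.
def process_pre_spike(w_hat, k_plus, t_last, t_pre, d,
                      post_history, post_trace_at,
                      tau_plus=20.0, lambda_=0.01, alpha=1.0,
                      mu_plus=1.0, mu_minus=1.0):
    # Steps 1-2: facilitate once per post spike in (t_last - d, t_pre - d],
    # using K+ decayed from t_last to the delay-shifted post-spike time.
    for t_post in (t for t in post_history if t_last - d < t <= t_pre - d):
        k_eff = k_plus * math.exp((t_last - (t_post + d)) / tau_plus)
        w_hat = w_hat + lambda_ * (1.0 - w_hat) ** mu_plus * k_eff
    # Steps 3-4: depress using the postsynaptic trace K- at t_pre - d.
    k_minus = post_trace_at(t_pre - d)
    w_hat = max(0.0, w_hat - alpha * lambda_ * w_hat ** mu_minus * k_minus)
    # Step 5 (delivering the weighted spike after delay d) is omitted here.
    # Step 6: update the presynaptic trace and the last-spike timestamp.
    k_plus = k_plus * math.exp((t_last - t_pre) / tau_plus) + 1.0
    return w_hat, k_plus, t_pre
```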
5. Assumptions, Constraints, and Failure Modes
Constraints enforced at construction:
tau_plus > 0 (presynaptic trace time constant must be positive)
lambda >= 0 (learning rate must be non-negative)
alpha >= 0 (depression/potentiation ratio must be non-negative)
Wmax != 0 (maximum weight must be nonzero)
tau_minus > 0 (postsynaptic trace time constant must be positive, inherited)
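A minimal sketch of these construction-time checks (a hypothetical helper, not the library source):

```python
# Hypothetical validation helper mirroring the constraints listed above.
def validate_common_properties(tau_plus, tau_minus, lambda_, alpha, Wmax):
    if tau_plus <= 0:
        raise ValueError("tau_plus must be strictly positive")
    if tau_minus <= 0:
        raise ValueError("tau_minus must be strictly positive")
    if lambda_ < 0:
        raise ValueError("lambda must be non-negative")
    if alpha < 0:
        raise ValueError("alpha must be non-negative")
    if Wmax == 0:
        raise ValueError("Wmax must be nonzero")
```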
Failure modes:
Attempting to set tau_plus, lambda, alpha, mu_plus, mu_minus, or Wmax in a connection-time syn_spec → raises ValueError from check_synapse_params().
Setting Wmax = 0 → division by zero in weight normalization.
Very small tau_plus or tau_minus → numerical instability in the exponential decay.
Large mu_plus or mu_minus → explosive growth in the weight-dependent terms.
Computational complexity:
Per presynaptic spike: \(O(N_{\text{post}})\) where \(N_{\text{post}}\) is the number of postsynaptic spikes in the potentiation window.
Per postsynaptic spike: \(O(1)\) trace update.
- Parameters:
weight (float, array-like, or Quantity, optional) – Per-connection parameter. Initial synaptic weight. Scalar value, dimensionless or with units (pA for current-based, nS for conductance-based). Can be positive (excitatory) or negative (inhibitory). Default: 1.0.
delay (float, array-like, or Quantity, optional) – Per-connection parameter. Synaptic transmission delay in ms. Must be positive; will be discretized to integer time steps. Default: 1.0 * u.ms.
receptor_type (int, optional) – Per-connection parameter. Receptor port identifier on the postsynaptic neuron. Non-negative integer. Default: 0.
tau_plus (float or Quantity, optional) – Common model property. Presynaptic eligibility trace time constant \(\tau_+\) in ms. Must be strictly positive. Shared by all connections of this model type. Default: 20.0 * u.ms.
lambda_ (float, optional) – Common model property. Potentiation learning rate \(\lambda\) (dimensionless). Must be non-negative. Shared by all connections. Default: 0.01.
alpha (float, optional) – Common model property. Depression/potentiation ratio \(\alpha\) (dimensionless). Must be non-negative. Shared by all connections. Default: 1.0.
mu_plus (float, optional) – Common model property. Potentiation weight-dependence exponent \(\mu_+\) (dimensionless). Controls how potentiation saturates near \(W_{\max}\). Shared by all connections. Default: 1.0.
mu_minus (float, optional) – Common model property. Depression weight-dependence exponent \(\mu_-\) (dimensionless). Controls how depression saturates near zero. Shared by all connections. Default: 1.0.
Wmax (float, optional) – Common model property. Maximum allowed weight magnitude \(W_{\max}\) (dimensionless or with the same units as weight). Weights are clipped to \([0, W_{\max}]\) or \([W_{\max}, 0]\) depending on sign. Must be nonzero. Shared by all connections. Default: 100.0.
tau_minus (float or Quantity, optional) – Postsynaptic eligibility trace time constant \(\tau_-\) in ms. Must be strictly positive. In NEST, this belongs to the postsynaptic neuron's archiving system; here it is stored on the synapse for standalone use. Default: 20.0 * u.ms.
post (Dynamics, optional) – Default postsynaptic receiver object. Must implement spike history archiving and add_delta_input or add_current_input methods. Default: None.
event_type (str, optional) – Type of event to transmit. Typically 'spike' for STDP. Default: 'spike'.
name (str, optional) – Unique identifier for this synapse instance. Default: auto-generated.
Parameter Mapping
NEST stdp_synapse_hom parameters map to this implementation as follows:

| NEST Parameter | brainpy.state Param | Scope |
|---|---|---|
| weight | weight | Per-connection (can vary) |
| delay | delay | Per-connection (can vary) |
| receptor_type | receptor_type | Per-connection (can vary) |
| tau_plus | tau_plus | Common property (model-level) |
| lambda | lambda_ | Common property (model-level) |
| alpha | alpha | Common property (model-level) |
| mu_plus | mu_plus | Common property (model-level) |
| mu_minus | mu_minus | Common property (model-level) |
| Wmax | Wmax | Common property (model-level) |
| (postsynaptic archiving) | tau_minus | Synapse-level (NEST: neuron property) |
- tau_plus#
Presynaptic trace time constant (immutable common property).
- Type:
Quantity
- tau_minus#
Postsynaptic trace time constant (synapse-level parameter).
- Type:
Quantity
Notes
Design differences from NEST:
Parameter scope enforcement: NEST enforces homogeneous-property semantics at connect time via its C++ connection API. This implementation replicates that by validating syn_spec in check_synapse_params().
Validation relaxation: NEST stdp_synapse_hom intentionally omits the weight/Wmax sign check and the Kplus >= 0 check present in stdp_synapse. This class follows that behavior by overriding the validation methods to no-ops.
Postsynaptic archiving: In NEST, tau_minus and the spike history belong to the postsynaptic neuron. This implementation stores tau_minus on the synapse for standalone compatibility, avoiding tight coupling to neuron archiving APIs.
Event timing: NEST uses precise spike timing with sub-step offsets. This implementation uses on-grid timestamps (ignoring offsets) for simplicity.
Typical usage patterns:
Large-scale STDP networks: Define shared plasticity rules with heterogeneous initial weights and delays per connection.
Parameter space exploration: Systematically vary common properties across model instances to study learning dynamics.
Memory-efficient plasticity: Share hyperparameters across millions of connections to reduce memory footprint.
See also
stdp_synapse – Heterogeneous STDP variant (per-connection plasticity parameters)
stdp_triplet_synapse – Triplet-based STDP (better fit to experimental data)
stdp_dopamine_synapse – Reward-modulated STDP
vogels_sprekeler_synapse – Inhibitory STDP for E/I balance
References
Examples
Basic STDP learning with homogeneous plasticity parameters:
>>> import brainstate as bst
>>> import saiunit as u
>>> from brainpy_state._nest import stdp_synapse_hom
>>> # Create STDP model with common properties
>>> stdp = stdp_synapse_hom(
...     weight=5.0,
...     delay=1.0 * u.ms,
...     tau_plus=20.0 * u.ms,
...     tau_minus=20.0 * u.ms,
...     lambda_=0.01,
...     alpha=1.05,
...     mu_plus=1.0,
...     mu_minus=1.0,
...     Wmax=100.0,
... )
>>> # Inspect model properties
>>> params = stdp.get()
>>> params['synapse_model']
'stdp_synapse_hom'
>>> params['tau_plus']
20.0 * ms
Verify common property enforcement:
>>> # OK: per-connection parameters
>>> stdp.check_synapse_params({'weight': 10.0, 'delay': 2.0 * u.ms})
>>> # ERROR: common properties cannot be set per-connection
>>> try:
...     stdp.check_synapse_params({'lambda': 0.02})
... except ValueError as e:
...     print(e)
lambda cannot be specified in connect-time synapse parameters for stdp_synapse_hom; set common properties on the model itself (for example via CopyModel()/SetDefaults()).
Use in network simulation:
>>> # Define pre/post neuron populations
>>> import numpy as np
>>> from brainpy_state._nest import iaf_psc_exp
>>> pre_neurons = iaf_psc_exp(in_size=100)
>>> post_neurons = iaf_psc_exp(in_size=50)
>>> # Create STDP connection with shared plasticity rules
>>> stdp_conn = stdp_synapse_hom(
...     tau_plus=16.8 * u.ms,
...     tau_minus=33.7 * u.ms,
...     lambda_=0.005,
...     alpha=1.05,
...     Wmax=50.0,
...     post=post_neurons,
... )
>>> # Connect with heterogeneous weights/delays
>>> for i in range(100):
...     for j in range(50):
...         stdp_conn.check_synapse_params({
...             'weight': np.random.uniform(0, 10),
...             'delay': np.random.uniform(1.0, 5.0) * u.ms,
...         })  # Per-connection parameters OK
- check_synapse_params(syn_spec)[source]#
Validate connection-time synapse parameters and reject common properties.
Enforces that common model properties (tau_plus, lambda, alpha, mu_plus, mu_minus, Wmax) cannot be specified in per-connection synapse specifications. This replicates NEST stdp_synapse_hom semantics, where plasticity hyperparameters are set once at the model level and shared by all connections.
Per-connection parameters (weight, delay, receptor_type) are allowed and should be specified in syn_spec when creating individual connections.
- Parameters:
syn_spec (Mapping[str, object] or None) – Connection-time synapse specification dictionary. If None, validation is skipped (no parameters to check). Keys are parameter names; values are the requested per-connection values.
Allowed keys: 'weight', 'delay', 'receptor_type', and any other per-connection state variables.
Forbidden keys: 'tau_plus', 'lambda', 'alpha', 'mu_plus', 'mu_minus', 'Wmax' (common properties; must be set on the model).
- Raises:
ValueError – If syn_spec contains any of the forbidden common-property keys (tau_plus, lambda, alpha, mu_plus, mu_minus, Wmax). The error message identifies the disallowed key and suggests setting it via model construction or a global model update instead.
Notes
Design rationale: In NEST, stdp_synapse_hom models enforce that plasticity hyperparameters are homogeneous (shared across all connections) by preventing their specification in Connect() synapse dictionaries. This constraint is enforced at the C++ API level in NEST. This method replicates that behavior in Python by validating the syn_spec dictionary.
Create the model with common properties: model = stdp_synapse_hom(tau_plus=20*u.ms, ...)
Connect with per-connection parameters: model.check_synapse_params({'weight': 5.0})
ERROR if common properties appear: model.check_synapse_params({'lambda': 0.02})
Implementation note: The check uses the key name 'lambda' (not 'lambda_') to match NEST naming conventions. Users should use lambda_ in Python code but 'lambda' in synapse specification dictionaries.
Examples
>>> import saiunit as u
>>> from brainpy_state._nest import stdp_synapse_hom
>>> stdp = stdp_synapse_hom(
...     weight=1.0,
...     tau_plus=20.0 * u.ms,
...     lambda_=0.01,
...     Wmax=100.0,
... )
>>> # OK: per-connection parameters
>>> stdp.check_synapse_params({'weight': 5.0, 'delay': 2.0 * u.ms})
>>> # OK: None syn_spec (no parameters to validate)
>>> stdp.check_synapse_params(None)
>>> # ERROR: common property
>>> try:
...     stdp.check_synapse_params({'lambda': 0.02})
... except ValueError as e:
...     print(e)
lambda cannot be specified in connect-time synapse parameters for stdp_synapse_hom; set common properties on the model itself (for example via CopyModel()/SetDefaults()).
- get()[source]#
Return current public parameters and mutable connection state.
Retrieves all model parameters, common properties, and per-connection state variables as a dictionary. The returned dictionary includes both model-level shared parameters (tau_plus, lambda, alpha, mu_plus, mu_minus, Wmax) and per-connection mutable state (weight, Kplus, t_lastspike). The synapse_model field identifies this as a 'stdp_synapse_hom' connection.
- Returns:
Dictionary with the following structure:
'synapse_model' (str): Always 'stdp_synapse_hom'.
'weight' (float or Quantity): Current synaptic weight.
'delay' (float): Transmission delay in ms (quantized to steps).
'receptor_type' (int): Receptor port identifier.
'tau_plus' (Quantity): Presynaptic trace time constant (common property).
'lambda' (float): Potentiation learning rate (common property).
'alpha' (float): Depression/potentiation ratio (common property).
'mu_plus' (float): Potentiation exponent (common property).
'mu_minus' (float): Depression exponent (common property).
'Wmax' (float): Maximum weight magnitude (common property).
'tau_minus' (Quantity): Postsynaptic trace time constant.
'Kplus' (float): Presynaptic eligibility trace (mutable state).
't_lastspike' (float): Last presynaptic spike timestamp in ms (mutable state).
Additional fields are inherited from stdp_synapse and static_synapse.
- Return type:
dict
Notes
The returned dictionary is a snapshot of the current state. Modifying the returned dictionary does not affect the synapse's internal state. To update parameters, use set() instead.
Common property values (tau_plus, lambda, alpha, mu_plus, mu_minus, Wmax) are shared by all connections of this model type. They cannot be modified per-connection; attempting to pass them in a connection-time syn_spec will raise ValueError from check_synapse_params().
>>> import saiunit as u
>>> from brainpy_state._nest import stdp_synapse_hom
>>> stdp = stdp_synapse_hom(
...     weight=5.0,
...     tau_plus=20.0 * u.ms,
...     lambda_=0.01,
...     Wmax=100.0,
... )
>>> params = stdp.get()
>>> params['synapse_model']
'stdp_synapse_hom'
>>> params['tau_plus']
20.0 * ms
>>> params['lambda']
0.01