# Examples Gallery
Welcome to the brainpy.state examples gallery! Here you’ll find complete, runnable examples demonstrating various aspects of computational neuroscience modeling.
All examples are available in the examples/ directory of the brainpy.state repository.
## Classical Network Models
These examples reproduce influential models from the computational neuroscience literature.
Implements the classic excitatory-inhibitory balanced network showing chaotic dynamics.

- 80% excitatory, 20% inhibitory neurons
- Random sparse connectivity
- Balanced excitation and inhibition
- Asynchronous irregular firing
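The core mechanism can be illustrated without any particular library: leaky integrate-and-fire neurons, sparse random connectivity, and inhibitory weights scaled so that excitation and inhibition roughly cancel. The sketch below is a minimal NumPy version with illustrative parameters (network sizes, weights, and drive are assumptions for demonstration, not the values used in the example script):

```python
import numpy as np

rng = np.random.default_rng(0)

# 80% excitatory, 20% inhibitory (small sizes assumed, for speed)
n_exc, n_inh = 400, 100
n = n_exc + n_inh

# Sparse random connectivity (2% probability); inhibitory weights are
# stronger so excitation and inhibition roughly balance on average.
p = 0.02
w_e, w_i = 0.5, -2.0              # illustrative weights
W = (rng.random((n, n)) < p).astype(float)
W[:, :n_exc] *= w_e
W[:, n_exc:] *= w_i

# Leaky integrate-and-fire dynamics (dimensionless units)
dt, tau, v_th, v_reset = 0.1, 10.0, 1.0, 0.0
v = rng.random(n) * v_th          # random initial membrane potentials
i_ext = 1.05                      # constant drive just above threshold

spike_counts = np.zeros(n)
for _ in range(2000):             # simulate 200 time units
    spikes = v >= v_th
    spike_counts += spikes
    v[spikes] = v_reset
    i_syn = W @ spikes            # recurrent input from this step's spikes
    v += dt / tau * (-v + i_ext) + i_syn

rates = spike_counts / 200.0
print("mean rate (spikes per time unit):", rates.mean())
```

With balanced weights the net recurrent input fluctuates around zero, so firing is driven by fluctuations and becomes irregular rather than clock-like.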
Conductance-based synaptic integration in balanced networks.

- Conductance-based synapses (COBA)
- Reversal potentials
- More biologically realistic
- Stable asynchronous activity
Current-based synaptic integration (simpler, faster variant).

- Current-based synapses (CUBA)
- Faster computation
- Widely used for large-scale simulations
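The COBA/CUBA distinction comes down to one line in the synaptic-current computation. In the conductance-based form the current scales with the driving force (the distance to the reversal potential), so it shrinks as the membrane depolarizes; in the current-based form the weight is injected directly, regardless of voltage. A minimal sketch (voltages and conductance values are illustrative):

```python
import numpy as np

# Three example membrane potentials (mV) and one synaptic strength
V = np.array([-70.0, -55.0, -40.0])
g = 0.5                            # conductance (COBA) or fixed weight (CUBA)

# COBA: current depends on the driving force (E_rev - V), so it
# saturates as V approaches the reversal potential.
E_exc = 0.0                        # excitatory reversal potential (mV)
I_coba = g * (E_exc - V)

# CUBA: the current is just the weight, independent of V.
I_cuba = np.full_like(V, g)

print("COBA currents:", I_coba)    # shrink as V nears E_exc
print("CUBA currents:", I_cuba)    # constant
```

The voltage dependence is why COBA is more realistic but slightly costlier: every synaptic update must read the postsynaptic potential.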
More detailed neuron model with sodium and potassium channels.

- Hodgkin-Huxley neuron dynamics
- Action potential generation
- Biophysically detailed
- Computationally intensive
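For reference, the classic Hodgkin-Huxley equations can be integrated in a few lines of NumPy. This sketch uses the standard 1952 squid-axon parameters and simple Euler integration (the example script's numerical scheme and parameters may differ):

```python
import numpy as np

# Classic Hodgkin-Huxley (1952) parameters
C, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3       # uF/cm^2, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.387           # mV

# Voltage-dependent gating rate functions
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, T, I_ext = 0.01, 50.0, 10.0                 # ms, ms, uA/cm^2
V, m, h, n = -65.0, 0.05, 0.6, 0.32             # resting initial conditions

V_trace = []
for _ in range(int(T / dt)):                    # forward Euler integration
    I_Na = g_Na * m**3 * h * (V - E_Na)
    I_K = g_K * n**4 * (V - E_K)
    I_L = g_L * (V - E_L)
    V += dt / C * (I_ext - I_Na - I_K - I_L)
    m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
    h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
    n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
    V_trace.append(V)

print("peak V (mV):", max(V_trace))             # spikes overshoot above 0 mV
```

The four coupled ODEs per neuron (versus one for LIF) are what makes this model "computationally intensive" at network scale.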
## Oscillations and Rhythms
Interneuron network generating gamma oscillations (30-80 Hz).

- Interneuron-based gamma
- Inhibition-based synchrony
- Physiologically relevant frequency
- Network oscillations
Demonstrates reliable spike sequence propagation.

- Feedforward architecture
- Reliable spike timing
- Wave propagation
- Temporal coding
High-frequency oscillations (>100 Hz) in inhibitory networks.

- Very fast oscillations
- Gap junction coupling
- Inhibitory synchrony
- Pathological rhythms
### Gamma Oscillation Mechanisms (Susin & Destexhe 2021)
Series of models exploring different gamma generation mechanisms:

**AI state**: no oscillations, irregular firing

- Background activity state
- Asynchronous firing
- No clear rhythm
**Coherent High-frequency INhibition-based Gamma (CHING)**

- Coherent inhibition
- High-frequency gamma
- Interneuron synchrony
**Inhibition-based Gamma (ING)**

- Pure inhibitory network
- Gamma through inhibition
- Fast synaptic kinetics
**Pyramidal-Interneuron Gamma (PING)**

- E-I loop generates gamma
- Most common mechanism
- Excitatory-inhibitory interaction
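Whichever mechanism generates it, the rhythm is usually quantified from the power spectrum of the population rate. The sketch below uses a synthetic noisy 40 Hz signal in place of a real simulation's binned population spike count (all numbers are illustrative):

```python
import numpy as np

# Detect a gamma rhythm in a population-rate signal. Here the signal is
# synthetic; with a real simulation you would use the binned population
# spike count instead.
fs = 1000.0                                 # sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)             # 2 s of data
noise = np.random.default_rng(2).normal(size=t.size)
rate = 5.0 + np.sin(2 * np.pi * 40.0 * t) + 0.5 * noise

# Power spectrum of the mean-subtracted signal
spectrum = np.abs(np.fft.rfft(rate - rate.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peak = freqs[np.argmax(spectrum)]
print(f"peak frequency: {peak:.1f} Hz")     # in the gamma band
```

A peak in the 30-80 Hz band of this spectrum is the usual operational definition of gamma in these network models.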
## Spiking Neural Network Training
Trains a simple spiking network using surrogate gradients.

- Surrogate gradient method
- LIF neuron training
- Simple classification task
- Gradient-based learning
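The surrogate-gradient trick itself is simple: the forward pass keeps the non-differentiable hard threshold, while the backward pass swaps its zero-almost-everywhere derivative for a smooth surrogate. A minimal NumPy illustration using a fast-sigmoid surrogate (the threshold, slope, and surrogate shape are common choices, not necessarily what the example script uses):

```python
import numpy as np

def spike(v, v_th=1.0):
    # Forward pass: non-differentiable hard threshold
    return (v >= v_th).astype(float)

def surrogate_grad(v, v_th=1.0, alpha=10.0):
    # Backward pass: replace d spike / d v with a smooth surrogate,
    # here alpha / (1 + alpha * |v - v_th|)^2 (fast-sigmoid derivative)
    return alpha / (1.0 + alpha * np.abs(v - v_th)) ** 2

v = np.array([0.2, 0.9, 1.0, 1.4])          # membrane potentials
s = spike(v)
print("spikes:", s)

# Chain rule through the spike: given dL/ds from downstream layers,
# dL/dv = dL/ds * surrogate_grad(v), which is nonzero near threshold
# even though the true derivative is zero almost everywhere.
dL_ds = np.ones_like(v)
dL_dv = dL_ds * surrogate_grad(v)
print("surrogate dL/dv:", dL_dv)
```

Because gradients are largest near threshold, neurons that almost spiked (or barely spiked) receive the strongest learning signal.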
Trains a spiking network on the Fashion-MNIST dataset.

- Fashion-MNIST dataset
- Multi-layer SNN
- Spike-based processing
- Real-world classification
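Feeding a static image into an SNN requires encoding pixel intensities as spikes over time. One common scheme is Poisson rate coding, sketched below with a random stand-in for an image (time step, duration, and maximum rate are illustrative assumptions):

```python
import numpy as np

# Rate coding: turn pixel intensities in [0, 1] into Poisson spike
# trains. Brighter pixels spike more often; downstream SNN layers
# consume the resulting binary tensor of shape (time_steps, n_pixels).
rng = np.random.default_rng(3)
image = rng.random(784)                         # stand-in for a 28x28 image
time_steps, max_rate, dt = 100, 100.0, 1e-3     # 100 steps of 1 ms, up to 100 Hz

p_spike = image * max_rate * dt                 # per-step spike probability
spikes = rng.random((time_steps, image.size)) < p_spike
print("spike tensor:", spikes.shape, "mean rate (Hz):", spikes.mean() / dt)
```

The network then sees `time_steps` sparse binary inputs instead of one dense image, which is what makes the processing genuinely spike-based.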
Uses a readout layer for classification.

- MNIST handwritten digits
- Specialized readout layer
- Spike counting
- Classification from spike rates
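The spike-count readout reduces to counting output-layer spikes per class over the simulation window and taking the argmax. A minimal sketch with synthetic output spikes standing in for a trained network (the class bias is injected artificially here so there is something to decode):

```python
import numpy as np

# Spike-count readout: run the network for T steps, count output-layer
# spikes per class, and predict the class with the highest count.
rng = np.random.default_rng(4)
n_classes, T = 10, 100
out_spikes = rng.random((T, n_classes)) < 0.1   # stand-in background spikes
out_spikes[:, 3] |= rng.random(T) < 0.4         # make class 3 fire most

counts = out_spikes.sum(axis=0)                 # spike count per class
prediction = int(np.argmax(counts))
print("counts:", counts, "-> predicted class", prediction)
```

Training then amounts to making the correct class's output neuron fire more than the others, e.g. with a cross-entropy loss on the (normalized) spike counts.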