Neuroscience Projects

Mean-Field Modeling of Spiking Neural Network Dynamics

One of the projects I worked on during my PhD, and continue to work on, is the derivation and analysis of so-called mean-field models. Mean-field models can be derived from the mathematical equations that govern the dynamics of spiking neural networks, and they capture how average quantities such as the mean firing rate or the mean membrane potential evolve over time in these networks. These average quantities are much easier to record in living brains than single-neuron properties. Mean-field models therefore make it possible to relate recordings of neural population averages to the underlying spiking network activity. They are a great tool for bridging multiple scales of brain organization and open the door to a wide range of dynamical systems analysis methods. The relationship between mean-field model and spiking neural network dynamics is depicted in the figure below for a network of quadratic integrate-and-fire (QIF) neurons.
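To make this concrete, below is a minimal Python sketch of the kind of mean-field model meant here: the well-known two-dimensional mean-field equations for a globally coupled QIF network with Lorentzian-distributed excitabilities (Montbrió, Pazó & Roxin, 2015), integrated with a simple Euler scheme. The parameter values are illustrative and not taken from any of the papers referenced on this page.

```python
import numpy as np

# Mean-field equations for a globally coupled QIF network with
# Lorentzian-distributed excitabilities (Montbrio, Pazo & Roxin, 2015).
# r: average firing rate, v: average membrane potential.
# Parameter values are illustrative only.

tau = 1.0      # membrane time constant
delta = 1.0    # half-width of the excitability distribution (heterogeneity)
eta = -5.0     # center of the excitability distribution
J = 15.0       # recurrent coupling strength

def mean_field_rhs(r, v, inp):
    dr = (delta / (np.pi * tau) + 2.0 * r * v) / tau
    dv = (v**2 + eta + inp + J * tau * r - (np.pi * tau * r)**2) / tau
    return dr, dv

# forward-Euler integration with a transient step input
dt, T = 1e-3, 40.0
r, v = 0.0, -2.0
rates = np.zeros(int(T / dt))
for i in range(rates.size):
    inp = 3.0 if 10.0 < i * dt < 30.0 else 0.0   # input current pulse
    dr, dv = mean_field_rhs(r, v, inp)
    r, v = r + dt * dr, v + dt * dv
    rates[i] = r

print(f"firing rate at the end of the simulation: {rates[-1]:.3f}")
```

Despite having only two state variables, a model of this kind closely reproduces the macroscopic response of the full QIF network, which is what makes it useful for linking population recordings to spiking dynamics.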

My PhD work on mean-field models of spiking neural dynamics was mostly concerned with the effects that different forms of plasticity have on the network dynamics [1,2]. Plasticity here refers to properties of neurons or synapses that change depending on the activity level of the neuron or synapse. An example is spike-frequency adaptation, which reduces the firing rate response of a neuron to a given input current if the neuron has already been firing at a high rate before the input arrived. My co-authors and I derived the mean-field equations for a network of spiking neurons with spike-frequency adaptation and analyzed its role in neural synchronization in [1].
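As a rough illustration of how such a mechanism enters at the mean-field level, the sketch below extends the QIF mean-field equations from above with a slow, subtractive adaptation variable that integrates the population firing rate. The specific formulation used in [1] differs in its details, so this should be read as an illustration of the idea rather than as the published model.

```python
import numpy as np

# Hedged sketch: a slow adaptation variable `a` added to the QIF mean-field
# model from the previous sketch. `a` integrates the population firing rate
# and is subtracted from the input, mimicking spike-frequency adaptation.
# The formulation in [1] differs in detail; parameters are illustrative.

tau, delta, eta, J = 1.0, 1.0, -5.0, 15.0   # same illustrative values as above
alpha, tau_a = 0.5, 10.0                    # adaptation strength and time constant

def adaptive_mean_field_rhs(r, v, a, inp):
    dr = (delta / (np.pi * tau) + 2.0 * r * v) / tau
    dv = (v**2 + eta + inp - a + J * tau * r - (np.pi * tau * r)**2) / tau
    da = (alpha * r - a) / tau_a
    return dr, dv, da
```

Because the adaptation variable provides slow negative feedback on the population activity, sufficiently strong adaptation can destabilize a steadily active state and give rise to collective bursting, which is the type of regime analyzed in [1].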

As part of my first postdoc with Ann Kennedy, we built on this previous work to derive mean-field equations for a more versatile single-neuron model, the Izhikevich neuron model, with distributed spiking thresholds [3].
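For readers unfamiliar with it, the Izhikevich model is a two-variable extension of the QIF neuron with a recovery variable. The sketch below shows its standard dimensionless form (Izhikevich, 2003); the mean-field derivation in [3] uses a biophysically parameterized variant with spike thresholds distributed across neurons, so this is only meant to illustrate the single-cell dynamics that the population model builds on.

```python
# Standard dimensionless Izhikevich neuron (Izhikevich, 2003), integrated with
# a simple Euler scheme. The variant used in [3] is biophysically parameterized
# and has distributed spike thresholds; this sketch illustrates the single-cell
# dynamics only.

a, b, c, d = 0.02, 0.2, -65.0, 8.0   # regular-spiking parameter set
v, u = -65.0, b * -65.0              # membrane potential and recovery variable
dt, I = 0.5, 10.0                    # time step (ms) and input current
spike_times = []

for step in range(2000):             # simulate 1000 ms
    dv = 0.04 * v**2 + 5.0 * v + 140.0 - u + I
    du = a * (b * v - u)
    v, u = v + dt * dv, u + dt * du
    if v >= 30.0:                    # spike: reset potential, bump recovery variable
        spike_times.append(step * dt)
        v, u = c, u + d

print(f"spikes emitted in 1000 ms: {len(spike_times)}")
```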

[1] Gast et al. (2020) Neural Computation.

[2] Gast et al. (2021) Physical Review E.

[3] Gast et al. (2023) Physical Review E.

Mean-Field Description of Bursting Dynamics in Spiking Neural Networks with Short-Term Adaptation

Abstract. Bursting plays an important role in neural communication. At the population level, macroscopic bursting has been identified in populations of neurons that do not express intrinsic bursting mechanisms. For the analysis of phase transitions between bursting and non-bursting states, mean-field descriptions of macroscopic bursting behavior are a valuable tool. In this article, we derive mean-field descriptions of populations of spiking neurons and examine whether states of collective bursting behavior can arise from short-term adaptation mechanisms. Specifically, we consider synaptic depression and spike-frequency adaptation in networks of quadratic integrate-and-fire neurons. Analyzing the mean-field model via bifurcation analysis, we find that bursting behavior emerges for both types of short-term adaptation. This bursting behavior can coexist with steady-state behavior, providing a bistable regime that allows for transient switches between synchronized and nonsynchronized states of population dynamics. For all of these findings, we demonstrate a close correspondence between the spiking neural network and the mean-field model. Although the mean-field model has been derived under the assumptions of an infinite population size and all-to-all coupling inside the population, we show that this correspondence holds even for small, sparsely coupled networks. In summary, we provide mechanistic descriptions of phase transitions between bursting and steady-state population dynamics, which play important roles in both healthy neural communication and neurological disorders.
Mean-field approximations of networks of spiking neurons with short-term synaptic plasticity

Low-dimensional descriptions of spiking neural network dynamics are an effective tool for bridging different scales of organization of brain structure and function. Recent advances in deriving mean-field descriptions for networks of coupled oscillators have sparked the development of a new generation of neural mass models. Of notable interest are mean-field descriptions of all-to-all coupled quadratic integrate-and-fire (QIF) neurons, which have already seen numerous extensions and applications. These extensions include different forms of short-term adaptation considered to play an important role in generating and sustaining dynamic regimes of interest in the brain. It is an open question, however, whether the incorporation of presynaptic forms of synaptic plasticity driven by single neuron activity would still permit the derivation of mean-field equations using the same method. Here we discuss this problem using an established model of short-term synaptic plasticity at the single neuron level, for which we present two different approaches for the derivation of the mean-field equations. We compare these models with a recently proposed mean-field approximation that assumes stochastic spike timings. In general, the latter fails to accurately reproduce the macroscopic activity in networks of deterministic QIF neurons with distributed parameters. We show that the mean-field models we propose provide a more accurate description of the network dynamics, although they are mathematically more involved. Using bifurcation analysis, we find that QIF networks with presynaptic short-term plasticity can express regimes of periodic bursting activity as well as bistable regimes. Together, we provide novel insight into the macroscopic effects of short-term synaptic plasticity in spiking neural networks, as well as two different mean-field descriptions for future investigations of such networks.
Macroscopic dynamics of neural networks with heterogeneous spiking thresholds

Mean-field theory links the physiological properties of individual neurons to the emergent dynamics of neural population activity. These models provide an essential tool for studying brain function at different scales; however, for their application to neural populations on large scale, they need to account for differences between distinct neuron types. The Izhikevich single neuron model can account for a broad range of different neuron types and spiking patterns, thus rendering it an optimal candidate for a mean-field theoretic treatment of brain dynamics in heterogeneous networks. Here we derive the mean-field equations for networks of all-to-all coupled Izhikevich neurons with heterogeneous spiking thresholds. Using methods from bifurcation theory, we examine the conditions under which the mean-field theory accurately predicts the dynamics of the Izhikevich neuron network. To this end, we focus on three important features of the Izhikevich model that are subject here to simplifying assumptions: (i) spike-frequency adaptation, (ii) the spike reset conditions, and (iii) the distribution of single-cell spike thresholds across neurons. Our results indicate that, while the mean-field model is not an exact model of the Izhikevich network dynamics, it faithfully captures its different dynamic regimes and phase transitions. We thus present a mean-field model that can represent different neuron types and spiking dynamics. The model comprises biophysical state variables and parameters, incorporates realistic spike resetting conditions, and accounts for heterogeneity in neural spiking thresholds. These features allow for a broad applicability of the model as well as for a direct comparison to experimental data.

Neural Synchronization Properties in the Basal Ganglia

The basal ganglia are a set of interconnected neural populations, deep in the brain, that are involved in various important brain functions such as learning, habit formation, and motor control. Damage to or malfunctioning of the basal ganglia has been found in various neurological disorders such as Parkinson's disease, Tourette syndrome, and Huntington's disease. Typically, malfunctioning of the basal ganglia is associated with increased synchrony in the activity of basal ganglia neurons. For example, the decline in motor control in Parkinson's disease seems to be related to the tendency of large fractions of basal ganglia neurons to fire in concert at frequencies between 12 and 30 Hz (the beta frequency band), thus reducing their information encoding capacity.

In this project, I use mathematical modeling and computer simulations to examine the conditions under which neurons in the basal ganglia start to fire in synchrony. During my PhD, I built a mathematical model of the external pallidum, a central part of the basal ganglia, and analyzed its neural synchronization properties under normal and dopamine-depleted conditions [4]. Furthermore, my co-authors and I used this model to show how the complex features of parkinsonian neural synchronization observed in human patients could emerge at the level of the external pallidum [4, 5].
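To give an idea of the structure of such a model, below is a schematic Python sketch of two coupled QIF mean-field populations, loosely mirroring the prototypical/arkypallidal split of the external pallidum described in [4]. The coupling weights and parameters are illustrative placeholders rather than the values used in the paper; the point is only to show how two mutually inhibitory mean-field populations are wired together and driven by an oscillatory input.

```python
import numpy as np

# Schematic two-population QIF mean-field model, loosely mirroring the
# prototypical (p) / arkypallidal (a) split of the GPe discussed in [4].
# All parameter values are illustrative placeholders, not fitted values.

tau_p, tau_a = 1.0, 1.0          # membrane time constants
delta_p, delta_a = 1.0, 1.0      # within-population heterogeneity
eta_p, eta_a = 4.0, 2.0          # mean excitabilities
J_pp, J_pa = -5.0, -2.0          # inhibition onto prototypical cells (self, from arky)
J_ap, J_aa = -4.0, -1.0          # inhibition onto arkypallidal cells (from proto, self)

def gpe_rhs(state, inp_p=0.0, inp_a=0.0):
    r_p, v_p, r_a, v_a = state
    dr_p = (delta_p / (np.pi * tau_p) + 2 * r_p * v_p) / tau_p
    dv_p = (v_p**2 + eta_p + inp_p + tau_p * (J_pp * r_p + J_pa * r_a)
            - (np.pi * tau_p * r_p)**2) / tau_p
    dr_a = (delta_a / (np.pi * tau_a) + 2 * r_a * v_a) / tau_a
    dv_a = (v_a**2 + eta_a + inp_a + tau_a * (J_ap * r_p + J_aa * r_a)
            - (np.pi * tau_a * r_a)**2) / tau_a
    return np.array([dr_p, dv_p, dr_a, dv_a])

# forward-Euler integration with slow oscillatory (striatum-like) drive
dt, T = 1e-3, 50.0
state = np.array([0.02, -2.0, 0.02, -2.0])
for i in range(int(T / dt)):
    drive = 2.0 * np.sin(2 * np.pi * 0.5 * i * dt)
    state = state + dt * gpe_rhs(state, inp_p=drive, inp_a=drive)

print(f"final rates (proto, arky): {state[0]:.3f}, {state[2]:.3f}")
```

In the published model, it is the dopamine-dependent changes of exactly these intrapallidal coupling strengths that move the system between steady, oscillatory, and bistable regimes.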


[4] Gast et al. (2021) Journal of Neuroscience.

[5] Gong et al. (2021) Brain.

On the Role of Arkypallidal and Prototypical Neurons for Phase Transitions in the External Pallidum

The external pallidum (globus pallidus pars externa [GPe]) plays a central role for basal ganglia functions and dynamics and, consequently, has been included in most computational studies of the basal ganglia. These studies considered the GPe as a homogeneous neural population. However, experimental studies have shown that the GPe contains at least two distinct cell types (prototypical and arkypallidal cells). In this work, we provide in silico insight into how pallidal heterogeneity modulates dynamic regimes inside the GPe and how they affect the GPe response to oscillatory input. We derive a mean-field model of the GPe system from a microscopic spiking neural network of recurrently coupled prototypical and arkypallidal neurons. Using bifurcation analysis, we examine the influence of dopamine-dependent changes of intrapallidal connectivity on the GPe dynamics. We find that increased self-inhibition of prototypical cells can induce oscillations, whereas increased inhibition of prototypical cells by arkypallidal cells leads to the emergence of a bistable regime. Furthermore, we show that oscillatory input to the GPe, arriving from striatum, leads to characteristic patterns of cross-frequency coupling observed at the GPe. Based on these findings, we propose two different hypotheses of how dopamine depletion at the GPe may lead to phase-amplitude coupling between the parkinsonian beta rhythm and a GPe-intrinsic γ rhythm. Finally, we show that these findings generalize to realistic spiking neural networks of sparsely coupled Type I excitable GPe neurons.

Significance Statement: Our work provides (1) insight into the theoretical implications of a dichotomous globus pallidus pars externa (GPe) organization, and (2) an exact mean-field model that allows for future investigations of the relationship between GPe spiking activity and local field potential fluctuations. We identify the major phase transitions that the GPe can undergo when subject to static or periodic input and link these phase transitions to the emergence of synchronized oscillations and cross-frequency coupling in the basal ganglia. Because of the close links between our model and experimental findings on the structure and dynamics of prototypical and arkypallidal cells, our results can be used to guide both experimental and computational studies on the role of the GPe for basal ganglia dynamics in health and disease.

The Role of Neural Heterogeneity

One of the main questions in neuroscience is how brain functions such as stable perception, the control of body movements, and the integration of novel experiences with past memories arise from interacting neurons.

One important aspect of neural tissue in the brain is that neurons differ dramatically in their structural composition. These structural differences are also reflected in the wide range of activity modes that neurons can express. We call this neural heterogeneity. In this project, we studied how the level of neural heterogeneity in a neural network affects both the overall fluctuations of neural activity that the network can express and the functions that can emerge in the network. These functions can be thought of as less complex constituents of the high-level brain functions mentioned above; examples are the formation of a local memory and the ability to reliably respond to an input signal with a precise output.

In [6], we derived mean-field equations for networks of heterogeneous spiking neurons and studied how the level of heterogeneity affects networks with different interacting neuron types. We found that neural heterogeneity has neuron-type-specific effects and acts as an important "knob" for tuning the network into a regime where it can perform a specific function, such as memory formation, particularly well.
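In the QIF-based mean-field framework sketched earlier on this page, heterogeneity enters as the width of the distribution of single-cell excitabilities. The short sketch below sweeps this width in a single population and reports the resulting steady-state firing rate, just to illustrate the sense in which heterogeneity acts as a tunable knob; the model in [6] involves multiple interacting neuron types and is considerably richer than this.

```python
import numpy as np

# Illustrative sweep of the heterogeneity parameter (width of the excitability
# distribution) in a single QIF mean-field population. The model in [6] couples
# multiple neuron types; this is a deliberately simplified, single-population sketch.

tau, eta, J = 1.0, -5.0, 15.0   # illustrative parameters

def steady_rate(delta, r0=0.1, v0=-2.0, dt=1e-3, T=100.0):
    """Integrate the mean-field equations to a (quasi-)steady state."""
    r, v = r0, v0
    for _ in range(int(T / dt)):
        dr = (delta / (np.pi * tau) + 2 * r * v) / tau
        dv = (v**2 + eta + J * tau * r - (np.pi * tau * r)**2) / tau
        r, v = r + dt * dr, v + dt * dv
    return r

for delta in (0.1, 0.5, 1.0, 2.0):
    print(f"heterogeneity {delta:.1f} -> steady-state rate {steady_rate(delta):.3f}")
```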

[6] Gast et al. (2024) PNAS.