A fundamental challenge in neuroscience is to connect behavior to the underlying neural mechanisms. Networks that produce rhythmic motor behaviors, such as locomotion, provide important model systems for addressing this problem. A particularly good model for this purpose is the neural circuit that coordinates limb movements in the crayfish swimmeret system. During forward swimming, rhythmic movements of limbs on different segments of the crayfish abdomen progress from back to front with the same period, but neighboring limbs are phase-lagged by 25% of the period. This coordination of limb movements is maintained over a wide range of frequencies. We examine different biologically plausible network topologies of the underlying neural circuit and show that phase-constant rhythms with 0%, 25%, 50%, or 75% phase lags can be robustly produced. In doing so, we obtain necessary conditions on the network connectivity for the crayfish’s natural stroke pattern with 25% phase lags. We then construct a computational fluid dynamics model and show that the natural 25% back-to-front phase-constant rhythm is the most efficient stroke pattern for swimming. Our results suggest that the particular network topology in the neural circuit of the crayfish swimmeret system is likely the result of evolution in favor of more effective and efficient swimming.
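The 25% phase-constant pattern can be caricatured with a unidirectional chain of phase oscillators in which each oscillator is pulled toward a fixed lag behind its neighbor; this sketch is purely illustrative (the coupling form, gain K, and frequency are assumptions, not the circuit model studied here):

```python
import numpy as np

def simulate_chain(n=4, delta=np.pi / 2, omega=1.0, K=1.0,
                   dt=0.01, t_end=200.0, seed=0):
    """Unidirectional chain of phase oscillators.

    Oscillator 0 runs freely; oscillator i is pulled toward a phase
    lag `delta` behind oscillator i-1.  With delta = pi/2, neighbors
    settle at a lag of 25% of the period.
    """
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 2 * np.pi, n)   # random initial phases
    for _ in range(int(t_end / dt)):
        dtheta = np.full(n, omega)
        # coupling drives theta[i-1] - theta[i] toward delta
        dtheta[1:] += K * np.sin(theta[:-1] - theta[1:] - delta)
        theta += dt * dtheta
    return theta % (2 * np.pi)

theta = simulate_chain()
lags = np.mod(theta[:-1] - theta[1:], 2 * np.pi)   # settles near pi/2
```

The fixed point of the pairwise phase differences is delta, and it is linearly stable, so the chain converges to the phase-constant rhythm from generic initial conditions.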
Neural field models with transmission delay may be cast as abstract delay differential equations (DDEs). The theory of dual semigroups (also called sun-star calculus) provides a natural framework for the analysis of a broad class of delay equations, among which DDEs. In particular, it may be used advantageously for the investigation of stability and bifurcation of steady states. After introducing the neural field model in its basic functional-analytic setting and discussing its spectral properties, we work out an example in detail and derive a characteristic equation. Under certain conditions the associated equilibrium may destabilise in a Hopf bifurcation. Furthermore, two Hopf curves may intersect in a double Hopf point in a two-dimensional parameter space. We provide general formulas for the corresponding critical normal form coefficients, evaluate these numerically and interpret the results.
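As a hedged illustration of how a characteristic equation arises for a delay equation (the neural field case in the talk is more involved), consider a scalar linear DDE with one discrete delay:

```latex
% Illustrative scalar DDE, not the neural field model itself:
%   u'(t) = -\alpha\, u(t) + \beta\, u(t-\tau), \qquad \alpha,\beta\in\mathbb{R},\ \tau>0.
% The exponential ansatz u(t) = e^{\lambda t} gives the characteristic equation
\lambda + \alpha = \beta\, e^{-\lambda\tau},
% and a pair of roots \lambda = \pm i\omega crossing the imaginary axis
% signals a Hopf bifurcation of the zero steady state.
```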
Frequency selectivity in the form of mode locking has been observed in stimulated nervous systems across a variety of functional contexts. To understand the mechanisms behind these features, spiking neuron models have been used to study the precise timing of the firing events thought to underlie frequency mode locking.
A number of neuron models have been developed and shown to produce a large repertoire of neuronal behaviors, in an attempt to develop an understanding of the relevant functional features. We study a variant of the widely used adaptive exponential leaky integrate-and-fire model and explore mode-locked solutions when the neuron is periodically stimulated. We present an analysis of mode-locked solutions and their stability, and show numerical results supporting our analysis.
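A minimal sketch of the periodically stimulated setting, using the standard Brette-Gerstner AdEx parameters; the sinusoidal drive and all stimulus values are illustrative assumptions, not the exact protocol analyzed in the talk:

```python
import numpy as np

def adex_spike_times(I_dc=800.0, I_ac=100.0, f=10.0, t_end=1000.0, dt=0.01):
    """Euler-integrate a periodically forced AdEx neuron (times in ms,
    currents in pA, voltages in mV); returns spike times."""
    C, gL, EL = 281.0, 30.0, -70.6              # pF, nS, mV
    VT, DT, Vr, Vpeak = -50.4, 2.0, -70.6, 0.0
    tau_w, a, b = 144.0, 4.0, 80.5              # ms, nS, pA
    V, w = EL, 0.0
    spikes = []
    for k in range(int(t_end / dt)):
        t = k * dt
        I = I_dc + I_ac * np.sin(2 * np.pi * f * t / 1000.0)
        # cap the exponential argument to avoid numerical overflow
        exp_term = gL * DT * np.exp(min((V - VT) / DT, 20.0))
        V += dt * (-gL * (V - EL) + exp_term - w + I) / C
        w += dt * (a * (V - EL) - w) / tau_w
        if V >= Vpeak:                          # spike: reset and adapt
            spikes.append(t)
            V, w = Vr, w + b
    return spikes

spikes = adex_spike_times()   # spike times (ms) over a 1 s simulation
```

For a p:q mode-locked solution one would check that p spikes recur every q stimulus periods; the interspike-interval sequence returned here is the natural starting point for that analysis.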
The hippocampus plays an important role in representing space (for spatial navigation) and time (for episodic memory). Spatial representation of the environment is pivotal for navigation in rodents and primates. Two types of maps, topographical and topological, may be used for spatial representation. Rodent hippocampal place cells exhibit spatially selective firing patterns in an environment that can be decoded to determine the animal’s location, heading, and past and future trajectories. We recorded ensembles of hippocampal neurons as rodents freely foraged in one- and two-dimensional spatial environments, and we used a "decode-to-uncover" strategy to examine the temporally structured patterns embedded in the ensemble spiking activity in the absence of observed spatial correlates during rodent navigation. Specifically, the spatial environment was represented by a finite discrete state space.
Trajectories across spatial locations ("states") were associated with consistent hippocampal ensemble spiking patterns, which were characterized by the state transition matrix of a hidden Markov model. From this transition matrix, we inferred a topology graph that defined the connectivity of the state space. In contrast to a topographic code, our results support the efficiency of topological coding in the presence of sparse sampling and fuzzy space mapping.
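The step from transition matrix to topology graph can be sketched as follows; the ring-shaped matrix and the threshold value are illustrative assumptions (in the actual analysis the matrix would be estimated from the ensemble spiking data, e.g., via Baum-Welch):

```python
import numpy as np

def topology_from_transitions(P, threshold=0.05):
    """Binarize a state transition matrix into an undirected topology
    graph: states i and j are connected if either transition
    probability exceeds the threshold (self-transitions ignored)."""
    A = ((P > threshold) | (P.T > threshold)).astype(int)
    np.fill_diagonal(A, 0)
    return A

# Illustrative transition matrix for 5 states on a ring: each state
# mostly persists, and otherwise moves to a spatial neighbor.
n = 5
P = np.zeros((n, n))
for i in range(n):
    P[i, i] = 0.8
    P[i, (i + 1) % n] = 0.1
    P[i, (i - 1) % n] = 0.1

A = topology_from_transitions(P)
degrees = A.sum(axis=1)   # on a ring, every state has exactly two neighbors
```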
The notion of excitability was first introduced in an attempt to understand the firing properties of neurons. It was Alan Hodgkin who identified three basic types (classes) of excitable axons (integrator, resonator and differentiator), distinguished by their different responses to injected current steps of various amplitudes.
Pioneered by Rinzel and Ermentrout, bifurcation theory explains repetitive (tonic) firing patterns for adequate steady inputs in integrator (type I) and resonator (type II) neuronal models. In contrast, the dynamic behavior of differentiator (type III) neurons cannot be explained by standard dynamical systems theory. This third type of excitable neuron encodes a dynamic change in the input and leads naturally to a transient response of the neuron.
In this talk, I will show that "canards" - peculiar mathematical creatures - are well suited to explain the nature of transient responses of neurons due to dynamic (smooth) inputs. I will apply this geometric theory to a simple driven FitzHugh-Nagumo/Morris-Lecar type neural model and to a more complicated neural model that describes paradoxical excitation due to propofol anesthesia.
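As a bare-bones illustration of a transient response in the FitzHugh-Nagumo model (not the canard analysis itself), a brief current pulse elicits a single spike after which the trajectory returns to rest; the parameters are textbook values and the pulse protocol is an assumption for illustration:

```python
import numpy as np

def fhn_response(amp=1.0, t_on=10.0, t_off=15.0, t_end=200.0, dt=0.01):
    """Euler-integrate the FitzHugh-Nagumo model driven by a brief
    current pulse; returns the voltage trace v(t)."""
    a, b, eps = 0.7, 0.8, 0.08
    v, w = -1.1994, -0.6243          # resting state for I = 0
    trace = []
    for k in range(int(t_end / dt)):
        t = k * dt
        I = amp if t_on <= t < t_off else 0.0
        dv = v - v**3 / 3.0 - w + I
        dw = eps * (v + a - b * w)
        v += dt * dv
        w += dt * dw
        trace.append(v)
    return np.array(trace)

v = fhn_response()
# the pulse triggers one spike (v rises well above 1) and the
# trajectory then relaxes back to rest near v = -1.2
```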
In this talk I will present some methods for constructing coupled dynamical systems out of simple bistable units that allow one to realise arbitrary finite-state computational systems using coupled ordinary differential equations. More precisely, suppose one has an arbitrary strongly connected finite directed graph. How does one construct a system of coupled cells that realises this graph as an attracting heteroclinic network, with minimal limitations? The constructions are robust with respect to a suitably constrained set of perturbations and to the addition of noise. In the presence of noise, there may or may not be short-term memory effects associated with the previous path on the network. This is joint work with C. Postlethwaite (Auckland).
Biological neural circuits display both spontaneous asynchronous activity and complex, yet ordered, activity while actively responding to input. When can model neural networks demonstrate both regimes? Recently, researchers have demonstrated this capability in large, recurrently connected neural networks, or "liquid state machines", with chaotic activity. We study the transition to chaos in a family of such networks, and use principal orthogonal decomposition (POD) techniques to provide a lower-dimensional description of network activity.
We find that key characteristics of this transition depend critically on whether a fundamental neurobiological constraint — that most neurons are either excitatory or inhibitory — is satisfied. Specifically, we find that constrained networks exhibit the transition to chaos at much higher coupling strengths than unconstrained networks. This property is the consequence of the fact that the constrained system may be described as a perturbation from a system with non-trivial symmetries. These symmetries imply the presence of both fixed points and periodic orbits that continue to act as an organizing center for solutions, even for large perturbations. In comparison, spectral characteristics of the network coupling matrix are relatively uninformative about the behavior of the constrained system.
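The POD step can be sketched on synthetic data standing in for simulated network activity; the low-rank-plus-noise construction below is an assumption for illustration:

```python
import numpy as np

def pod_modes(X):
    """Principal orthogonal decomposition of activity X (time x units):
    returns spatial modes (columns) and the fraction of variance
    captured by each mode."""
    Xc = X - X.mean(axis=0)                  # center each unit's trace
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)
    return Vt.T, var_frac

# Synthetic activity: two dominant spatiotemporal patterns plus noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
units = 40
pattern1 = np.outer(np.sin(2 * np.pi * t), rng.standard_normal(units))
pattern2 = np.outer(np.cos(6 * np.pi * t), rng.standard_normal(units))
X = pattern1 + pattern2 + 0.1 * rng.standard_normal((t.size, units))

modes, var_frac = pod_modes(X)
# the first two POD modes capture most of the variance
```

In practice one truncates at the knee of the `var_frac` spectrum to obtain the lower-dimensional description of the network activity.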
Sensory integration and sensory binding are similar problems separated by a vast methodological gulf. The dominant paradigm of binding theory is neural synchronization, while sensory integration is built on observations of bimodal neurons. These cells show large increases in firing rates for bimodal presentation of weak stimuli, but little improvement for strong stimuli, a finding known as the Principle of Inverse Enhancement. It would be useful to link these two fields so that methods from each could be used by the other. The best case for such a bridge is the rattlesnake, which has two dissimilar visual systems, one for light and one for heat. Although this sounds like a binding problem, the rattlesnake has been studied using the methods of sensory integration. Many cells in rattlesnake optic tectum are sensitive only to light but can be strongly modulated by heat stimuli, or vice versa. I simulated these cells by assuming that they are members of synchronized pairs of excitatory-coupled rate-coded neurons. I replaced the usual weak coupling assumption with Goldilocks coupling: coupling is kept as strong as possible without distorting spike amplitudes. Both assumptions are unconventional but not unjustifiable. The same synchronized neuron model, without any parameter changes, accounts for a population of cells in cat visual cortex whose firing rates are enhanced by auditory stimuli. It also produces enhancements quite similar to those described psychophysically in humans and could be used to model some human color vision transformations; I present a model of the mysterious enhancement of "yellowness" generated from oscillatory synchronization of known neural mechanisms.
The predictability of neuronal network dynamics is a central question in neuroscience. First, we present a numerical investigation of the network dynamics of coupled Hodgkin-Huxley (HH) neurons and show that there is a chaotic dynamical regime indicated by a positive largest Lyapunov exponent. In this regime, there is no numerical convergence of the solution and only statistical quantifications are reliable. Second, we introduce an efficient library-based numerical method for simulating HH neuronal networks. Our pre-computed high resolution data library can allow us to avoid resolving the spikes in detail and to evolve the HH neuron equations using much larger time steps than the typical ones used in standard methods. Meanwhile, we can achieve comparable resolution in statistical quantifications of the network activity. Finally, we present a coarse-grained event tree analysis for effectively discriminating small differences in inputs to the network dynamics.
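A largest-Lyapunov-exponent estimate of the kind used above can be illustrated on a one-dimensional stand-in rather than the full HH network: the logistic map at r = 4, whose exponent is known to be ln 2. The choice of stand-in system is an assumption for illustration only.

```python
import math

def lyapunov_logistic(r=4.0, x0=0.3, n_transient=1000, n_iter=100_000):
    """Estimate the largest Lyapunov exponent of x -> r x (1 - x)
    as the time average of log |f'(x)| along the orbit."""
    x = x0
    for _ in range(n_transient):             # discard transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n_iter

lam = lyapunov_logistic()   # close to ln 2: a positive exponent, chaos
```

For an ODE network the same idea is implemented by evolving a nearby perturbed trajectory and renormalizing it periodically (the Benettin method), with a positive average growth rate again indicating chaos.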
When experiencing an ambiguous sensory stimulus (e.g., the vase-faces image), subjects may report haphazard alternations (on a time scale of seconds) between the possible interpretations. Various dynamical models that implement neuronal competition with reciprocal inhibition between neuronal populations show alternations, behaving as noisy oscillators or as bistable systems subject to noise-driven switching. Slow negative feedback (neuronal firing adaptation or synaptic depression) sets the basic time scale of seconds for switching. A minimal statistical model based on alternating renewal processes (with durations described by gamma distributions) captures various aspects of the percept time series.
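The renewal-process description can be sketched directly: two percepts alternate, with dominance durations drawn from a gamma distribution. The shape and scale values below are illustrative assumptions, not fitted parameters from the talk.

```python
import numpy as np

def simulate_percepts(shape=4.0, scale=0.75, n_switches=100_000, seed=1):
    """Alternating renewal process: percepts A (0) and B (1) strictly
    alternate, and each dominance duration is drawn i.i.d. from a
    gamma distribution with mean = shape * scale (seconds)."""
    rng = np.random.default_rng(seed)
    durations = rng.gamma(shape, scale, n_switches)
    percepts = np.tile([0, 1], n_switches // 2)   # A, B, A, B, ...
    return percepts, durations

percepts, durations = simulate_percepts()
mean_dur = durations.mean()        # ~3 s, the observed time scale
cv = durations.std() / mean_dur    # gamma CV = 1/sqrt(shape) = 0.5
```

The coefficient of variation of roughly 0.5-0.6 obtained with gamma shape parameters of 3-5 is the kind of statistic such minimal models are matched against.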
When observers view an ambiguous visual scene with two or more different interpretations for an extended time, they report switching between the different percepts. We focus on a classical paradigmatic stimulus, visual plaids, consisting of two superimposed drifting gratings with transparent intersections (Wallach '35, Hupe & Rubin '03). For visual plaids, tristable perception is experienced: one coherent percept (the gratings move together as a single pattern) and two transparent percepts (the gratings slide across one another) with alternating depth order. In order to decipher the complex mechanisms of tristable perception, we gathered a large amount of psychophysical data on tristable plaids and developed a neural-network firing-rate model of interaction between neural populations. The model can account for the dynamical properties (transition probabilities, distributions of percept durations, etc.) observed in the experiments and predicts that adaptation is strongly involved in perceptual switching.
The malaria parasite life cycle involves three stages: sporogony (mosquito stage), exo-erythrocytic schizogony (liver stage), and erythrocytic schizogony (human blood stage). We consider a simplified mathematical model for malaria involving two parasite life cycles within the host, namely the exo-erythrocytic and erythrocytic cycles. This study has revealed parasite replication characteristics which offer insights into the processes that allow the parasite to evade the human immune response during the red blood cell stages. First, the infection of red blood cells by extracellular parasites during the erythrocytic cycle is characterized by a reproduction number, R0p, that is less than one. Secondly, the asexual reproduction of the parasite during the red blood cell stage, characterized by a reproduction number R0m > 1, is responsible for the pathology of clinical malaria. Thirdly, we have found that the parasite depends mainly on the death of infected red blood cells to rapidly increase its population. Specifically, the number of parasites, n1, in an infected red blood cell that dies need not be high for the parasite population to grow rapidly. We have found that for 8 ≤ n1 < 16, R0 > 1 and the parasite establishes itself, while for 16 ≤ n1 ≤ 32, R0 < 1 and the parasite fails to establish itself. We are led to conclude that the parasite has a preference for infecting older red blood cells as a strategy for evading the immune system.