ISSN: 1551-0018
eISSN: 1547-1063


Mathematical Biosciences & Engineering

2016, Volume 13, Issue 3

Special issue on the Neural Coding 2014 workshop


Preface
Susanne Ditlevsen and Petr Lansky
2016, 13(3): i-i. doi: 10.3934/mbe.201600i
Abstract:
This Special Issue of Mathematical Biosciences and Engineering contains 11 selected papers presented at the Neural Coding 2014 workshop, held in the royal city of Versailles, France, October 6-10, 2014. This was the 11th in a series of international workshops on the subject, following Prague (1995), Versailles (1997), Osaka (1999), Plymouth (2001), Aulla (2003), Marburg (2005), Montevideo (2007), Tainan (2009), Limassol (2010), and again Prague (2012). Selected papers from the Prague meeting were likewise published as a special issue of Mathematical Biosciences and Engineering, and in this way a tradition was started. Like its predecessors, the workshop was a single-track multidisciplinary event bringing together experimental and computational neuroscientists. The Neural Coding workshops are traditionally biennial symposia, relatively small in size and interdisciplinary, with a major emphasis on the search for common principles in neural coding. The workshop was conceived to bring together scientists from different disciplines for an in-depth discussion of mathematical model-building and computational strategies. Further information on the meeting can be found at the NC2014 website at https://colloque6.inra.fr/neural_coding_2014. The meeting was supported by the French National Institute for Agricultural Research, the world's leading institution in this field.
    Understanding how the brain processes information is one of the most challenging subjects in neuroscience. The papers presented in this special issue show a small corner of the huge diversity of this field, and illustrate how scientists with different backgrounds approach this vast subject. The diversity of disciplines engaged in these investigations is remarkable: biologists, mathematicians, physicists, psychologists, computer scientists, and statisticians, all have original tools and ideas by which to try to elucidate the underlying mechanisms. In this issue, emphasis is put on mathematical modeling of single neurons. A variety of problems in computational neuroscience accompanied with a rich diversity of mathematical tools and approaches are presented. We hope it will inspire and challenge the readers in their own research.
    We would like to thank the authors for their valuable contributions and the referees for their invaluable effort in reviewing the manuscripts. Finally, we would like to thank Yang Kuang for supporting us and making this publication possible.
The effect of positive interspike interval correlations on neuronal information transmission
Sven Blankenburg and Benjamin Lindner
2016, 13(3): 461-481. doi: 10.3934/mbe.2016001
Abstract:
Experimentally it is known that some neurons encode preferentially information about low-frequency (slow) components of a time-dependent stimulus while others prefer intermediate or high-frequency (fast) components. Accordingly, neurons can be categorized as low-pass, band-pass or high-pass information filters. Mechanisms of information filtering at the cellular and the network levels have been suggested. Here we propose yet another mechanism, based on noise shaping due to spontaneous non-renewal spiking statistics. We compare two integrate-and-fire models with threshold noise that differ solely in their interspike interval (ISI) correlations: the renewal model generates independent ISIs, whereas the non-renewal model exhibits positive correlations between adjacent ISIs. For these simplified neuron models we analytically calculate ISI density and power spectrum of the spontaneous spike train as well as approximations for input-output cross-spectrum and spike-train power spectrum in the presence of a broad-band Gaussian stimulus. This yields the spectral coherence, an approximate frequency-resolved measure of information transmission. We demonstrate that for low spiking variability the renewal model acts as a low-pass filter of information (coherence has a global maximum at zero frequency), whereas the non-renewal model displays a pronounced maximum of the coherence at non-vanishing frequency and thus can be regarded as a band-pass filter of information.
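The renewal/non-renewal contrast at the heart of this paper is easy to reproduce in a toy setting. The sketch below is not the authors' threshold-noise model: a simple AR(1)-type dependence between adjacent intervals stands in for the correlation mechanism, and all parameters are hypothetical. It generates two ISI sequences and estimates their lag-1 serial correlation coefficient.

```python
import random
import statistics

def serial_corr(isis, lag=1):
    """Lag-k serial correlation coefficient of an ISI sequence."""
    n = len(isis) - lag
    mu = statistics.fmean(isis)
    var = statistics.pvariance(isis)
    cov = sum((isis[i] - mu) * (isis[i + lag] - mu) for i in range(n)) / n
    return cov / var

random.seed(1)

# Renewal model: independent, exponentially distributed ISIs.
renewal = [random.expovariate(1.0) for _ in range(20000)]

# Non-renewal model: AR(1)-style positive dependence between adjacent
# ISIs (a stand-in for the paper's threshold-noise mechanism).
rho = 0.5
nonrenewal = [random.expovariate(1.0)]
for _ in range(19999):
    nonrenewal.append(rho * nonrenewal[-1] + (1 - rho) * random.expovariate(1.0))

print(round(serial_corr(renewal), 2))     # near zero
print(round(serial_corr(nonrenewal), 2))  # clearly positive
```

The positive lag-1 correlation is what reshapes the spontaneous spike-train power spectrum at low frequencies and, per the paper, turns the neuron into a band-pass information filter.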
A leaky integrate-and-fire model with adaptation for the generation of a spike train
Aniello Buonocore, Luigia Caputo, Enrica Pirozzi and Maria Francesca Carfora
2016, 13(3): 483-493. doi: 10.3934/mbe.2016002
Abstract:
A model is proposed to describe the spike-frequency adaptation observed in many neuronal systems. We assume that adaptation is mainly due to a calcium-activated potassium current, and we consider two coupled stochastic differential equations for which an analytical approach, combined with simulation techniques and numerical methods, allows us to obtain both qualitative and quantitative results about the asymptotic mean firing rate, the mean calcium concentration and the firing probability density. A related algorithm, based on the Hazard Rate Method, is also devised and described.
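A minimal Euler-Maruyama sketch of this kind of coupled system is given below. It is not the authors' parametrisation: the voltage equation, the calcium-dependent adaptation current, and all constants are hypothetical stand-ins chosen only to make spike-frequency adaptation visible (later interspike intervals grow longer as calcium accumulates).

```python
import math
import random

# Hypothetical parameters, for illustration only.
TAU_M, TAU_CA = 10.0, 100.0      # membrane / calcium time constants (ms)
V_REST, V_K, V_TH, V_RESET = 0.0, -10.0, 10.0, 0.0
G_AHP, D_CA = 0.02, 1.0          # adaptation strength, Ca influx per spike
MU, SIGMA, DT = 1.5, 1.0, 0.1    # drive, noise intensity, time step

def simulate(t_max, seed=0):
    """Euler-Maruyama integration of the coupled V / Ca equations."""
    rng = random.Random(seed)
    v, ca, spikes, t = V_REST, 0.0, [], 0.0
    while t < t_max:
        # Leak + calcium-activated (K-like) adaptation current + drive + noise.
        dv = (-(v - V_REST) / TAU_M - G_AHP * ca * (v - V_K) + MU) * DT
        v += dv + SIGMA * math.sqrt(DT) * rng.gauss(0.0, 1.0)
        ca += -ca / TAU_CA * DT
        if v >= V_TH:            # spike: reset V, increment calcium
            spikes.append(t)
            v, ca = V_RESET, ca + D_CA
        t += DT
    return spikes

spikes = simulate(5000.0)
isis = [b - a for a, b in zip(spikes, spikes[1:])]
print(len(spikes))
```

With these stand-in values the first few ISIs are short (calcium is low) and the firing rate relaxes toward a slower adapted steady state, the qualitative behaviour the paper analyses quantitatively.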
Successive spike times predicted by a stochastic neuronal model with a variable input signal
Giuseppe D'Onofrio and Enrica Pirozzi
2016, 13(3): 495-507. doi: 10.3934/mbe.2016003
Abstract:
Two different stochastic processes are used to model the evolution of the membrane voltage of a neuron exposed to a time-varying input signal. The first process is an inhomogeneous Ornstein-Uhlenbeck process and its first passage time through a constant threshold is used to model the first spike time after the signal onset. The second process is a Gauss-Markov process identified by a particular mean function dependent on the first passage time of the first process. It is shown that the second process is also of a diffusion type. The probability density function of the maximum between the first passage time of the first and the second process is considered to approximate the distribution of the second spike time. Results obtained by simulations are compared with those following the numerical and asymptotic approximations. A general equation to model successive spike times is given. Finally, examples with specific input signals are provided.
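The basic ingredient of this construction, the first passage time of an Ornstein-Uhlenbeck process through a constant threshold, can be estimated by simulation. The sketch below is a simplified, time-homogeneous version (the paper's first process is inhomogeneous, with a time-varying input); all parameter values are hypothetical.

```python
import math
import random

def ou_first_passage(mu, theta, sigma, x0, threshold,
                     dt=0.01, t_max=100.0, rng=None):
    """First time an Ornstein-Uhlenbeck path dX = theta*(mu - X)dt + sigma dW,
    started at x0, crosses `threshold` (Euler scheme; None if no crossing)."""
    rng = rng or random.Random()
    x, t = x0, 0.0
    while t < t_max:
        x += theta * (mu - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        t += dt
        if x >= threshold:
            return t
    return None

rng = random.Random(42)
# Suprathreshold regime: asymptotic mean 1.2 lies above the threshold 1.0,
# so the process crosses with probability close to one.
fpts = [ou_first_passage(1.2, 1.0, 0.5, 0.0, 1.0, rng=rng) for _ in range(500)]
fpts = [t for t in fpts if t is not None]
print(round(sum(fpts) / len(fpts), 2))
```

Repeating this with a mean function that depends on the first passage time of the first process, as the paper does, yields simulated second spike times that can be checked against the analytical approximations.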
Efficient information transfer by Poisson neurons
Lubomir Kostal and Shigeru Shinomoto
2016, 13(3): 509-520. doi: 10.3934/mbe.2016004
Abstract:
Recently, it has been suggested that certain neurons with Poissonian spiking statistics may communicate by discontinuously switching between two levels of firing intensity. Such a situation resembles in many ways the optimal information transmission protocol for the continuous-time Poisson channel known from information theory. In this contribution we employ the classical information-theoretic results to analyze the efficiency of such a transmission from different perspectives, emphasising the neurobiological viewpoint. We address both the ultimate limits, in terms of the information capacity under metabolic cost constraints, and the achievable bounds on performance at rates below capacity with fixed decoding error probability. In doing so we discuss optimal values of experimentally measurable quantities that can be compared with the actual neuronal recordings in a future effort.
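A discretised flavour of this setting can be computed directly: treat the input as a binary intensity level (low/high firing) and the output as the Poisson spike count in one coding window, then evaluate the mutual information and search over the duty cycle of the high level. This is a simplified discrete-count stand-in for the continuous-time Poisson channel the paper analyses, with hypothetical intensity levels.

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def mutual_information(lam0, lam1, p, k_max=60):
    """I(X;Y) in bits between a binary intensity X (P[X=high] = p) and the
    Poisson spike count Y observed in one coding window."""
    info = 0.0
    for k in range(k_max):
        p0, p1 = poisson_pmf(k, lam0), poisson_pmf(k, lam1)
        py = (1 - p) * p0 + p * p1
        if py == 0:
            continue
        for px, pk in ((1 - p, p0), (p, p1)):
            if pk > 0:
                info += px * pk * math.log2(pk / py)
    return info

# Hypothetical firing levels: 1 vs 10 expected spikes per window.
best = max((mutual_information(1.0, 10.0, p / 100), p / 100)
           for p in range(1, 100))
print(round(best[0], 3), best[1])
```

Adding a metabolic cost term to the objective (e.g. penalising the mean spike count) turns this maximisation into a toy version of the capacity-per-unit-cost questions addressed in the paper.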
Integrator or coincidence detector --- what shapes the relation of stimulus synchrony and the operational mode of a neuron?
Achilleas Koutsou, Jacob Kanev, Maria Economidou and Chris Christodoulou
2016, 13(3): 521-535. doi: 10.3934/mbe.2016005
Abstract:
The operational mode of a neuron (i.e., whether a neuron is an integrator or a coincidence detector) is in part determined by the degree of synchrony in the firing of its pre-synaptic neural population. More specifically, it is determined by the degree of synchrony that causes the neuron to fire. In this paper, we investigate the relationship between the input and the operational mode. We compare the response-relevant input synchrony, which measures the operational mode and can be determined using a membrane potential slope-based measure [7], with the spike time distance of the spike trains driving the neuron, which measures spike train synchrony and can be determined using the multivariate SPIKE-distance metric [10]. We discover that the relationship between the two measures changes substantially based on the values of the parameters of the input (firing rate and number of spike trains) and the parameters of the post-synaptic neuron (synaptic weight, membrane leak time constant and spike threshold). More importantly, we determine how the parameters interact to shape the synchrony-operational mode relationship. Our results indicate that the amount of depolarisation caused by a highly synchronous volley of input spikes, is the most influential factor in defining the relationship between input synchrony and operational mode. This is defined by the number of input spikes and the membrane potential depolarisation caused per spike, compared to the spike threshold.
Fluctuation scaling in neural spike trains
Shinsuke Koyama and Ryota Kobayashi
2016, 13(3): 537-550. doi: 10.3934/mbe.2016006
Abstract:
Fluctuation scaling has been observed universally in a wide variety of phenomena. In time series that describe sequences of events, fluctuation scaling is expressed as power function relationships between the mean and variance of either inter-event intervals or counting statistics, depending on measurement variables. In this article, fluctuation scaling has been formulated for a series of events in which scaling laws in the inter-event intervals and counting statistics were related. We have considered the first-passage time of an Ornstein-Uhlenbeck process and used a conductance-based neuron model with excitatory and inhibitory synaptic inputs to demonstrate the emergence of fluctuation scaling with various exponents, depending on the input regimes and the ratio between excitation and inhibition. Furthermore, we have discussed the possible implication of these results in the context of neural coding.
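The scaling exponent in such a power law, variance proportional to mean raised to some exponent, can be recovered empirically by regressing log-variance on log-mean across conditions. As a sanity check of the method (not the paper's conductance-based model), the sketch below generates Poisson spike counts at several mean rates, for which the exponent is exactly 1.

```python
import math
import random
import statistics

def loglog_slope(xs, ys):
    """Least-squares slope of log(y) on log(x): the scaling exponent."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    mx, my = statistics.fmean(lx), statistics.fmean(ly)
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

def poisson_sample(lam, rng):
    # Knuth's multiplication method; adequate for these moderate rates.
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

rng = random.Random(0)
means, variances = [], []
for lam in (2, 5, 10, 20, 40):          # conditions with different mean rates
    counts = [poisson_sample(lam, rng) for _ in range(5000)]
    means.append(statistics.fmean(counts))
    variances.append(statistics.pvariance(counts))

# For Poisson counts, variance = mean, so the fitted exponent is near 1.
print(round(loglog_slope(means, variances), 2))
```

Replacing the Poisson generator with counts from a neuron model driven by different input regimes, as in the paper, produces exponents that deviate from 1 depending on the excitation-inhibition ratio.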
Effect of spontaneous activity on stimulus detection in a simple neuronal model
Marie Levakova
2016, 13(3): 551-568. doi: 10.3934/mbe.2016007
Abstract:
We study which level of a continuous-valued signal can be estimated optimally on the basis of first-spike latency neuronal data. When spontaneous neuronal activity is present, the first spike after the stimulus onset may be caused either by the stimulus itself or by the prevailing spontaneous activity. Under certain regularity conditions, the Fisher information is the inverse of the variance of the best estimator. It can be considered as a function of the signal intensity and then indicates the accuracy of estimation for each signal level. The Fisher information is normalized with respect to the time needed to obtain an observation. The accuracy of signal level estimation is investigated in basic discharge patterns modelled by Poisson and renewal processes, and the impact of the complex interaction between spontaneous activity and a delay of the response is shown.
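The core quantity is easy to illustrate for the Poisson case: for a Poisson neuron with intensity function lambda(s), the Fisher information about the signal s per unit observation time is lambda'(s)^2 / lambda(s), and the optimally estimable signal level is its argmax. The tuning curve and all constants below are hypothetical, chosen only to show the computation; note how the spontaneous rate R0 enters the denominator and shifts the optimum.

```python
R0, RMAX, K, N = 5.0, 60.0, 1.0, 2.0   # hypothetical tuning parameters

def rate(s):
    """Sigmoidal rate function with spontaneous level R0 (spikes/s)."""
    return R0 + RMAX * s**N / (s**N + K**N)

def fisher_per_time(s, h=1e-6):
    """Poisson-model Fisher information about s per unit observation time:
    J(s) = lambda'(s)**2 / lambda(s), with a central-difference derivative."""
    d = (rate(s + h) - rate(s - h)) / (2 * h)
    return d * d / rate(s)

# Grid search for the signal level that is estimated most accurately.
grid = [i / 1000 for i in range(1, 5001)]
s_opt = max(grid, key=fisher_per_time)
print(round(s_opt, 2))
```

Because the spontaneous rate R0 inflates lambda(s) without contributing slope, raising it pushes the optimum and lowers the achievable accuracy, one of the effects studied in the paper.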
A model based rule for selecting spiking thresholds in neuron models
Frederik Riis Mikkelsen
2016, 13(3): 569-578. doi: 10.3934/mbe.2016008
Abstract:
Determining excitability thresholds in neuronal models is of high interest due to its applicability in separating spiking from non-spiking phases of neuronal membrane potential processes. However, excitability thresholds are known to depend on various auxiliary variables, including conductance and gating variables. Such dependences are a double-edged sword: they are natural consequences of the model's complexity, but they prove difficult to apply in practice, since gating variables are rarely measured.
    In this paper a technique for finding excitability thresholds, based on the local behaviour of the flow in dynamical systems, is presented. The technique incorporates the dynamics of the auxiliary variables, yet only produces thresholds for the membrane potential. The method is applied to several classical neuron models and the threshold's dependence upon external parameters is studied, along with a general evaluation of the technique.
On the properties of input-to-output transformations in neuronal networks
Andrey Olypher and Jean Vaillant
2016, 13(3): 579-596. doi: 10.3934/mbe.2016009
Abstract:
Information processing in neuronal networks in certain important cases can be considered as maps of binary vectors, where ones (spikes) and zeros (no spikes) of input neurons are transformed into spikes and no spikes of output neurons. A simple but fundamental characteristic of such a map is how it transforms distances between input vectors into distances between output vectors. We advanced earlier known results by finding an exact solution to this problem for McCulloch-Pitts neurons. The obtained explicit formulas allow for detailed analysis of how the network connectivity and neuronal excitability affect the transformation of distances in neurons. As an application, we explored a simple model of information processing in the hippocampus, a brain area critically implicated in learning and memory. We found network connectivity and neuronal excitability parameter values that optimize discrimination between similar and distinct inputs. A decrease of neuronal excitability, which in biological neurons may be associated with decreased inhibition, impaired the optimality of discrimination.
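The object of study, how a layer of threshold units maps input Hamming distances to output Hamming distances, can be probed empirically. The sketch below is not the paper's exact solution but a Monte Carlo stand-in with hypothetical connectivity and excitability parameters: it flips a controlled number of input bits and measures the resulting output distance.

```python
import random

def mcculloch_pitts(x, weights, theta):
    """One layer of McCulloch-Pitts units: each fires iff its summed input
    reaches the excitability threshold theta."""
    return [1 if sum(w * xi for w, xi in zip(row, x)) >= theta else 0
            for row in weights]

def hamming(a, b):
    return sum(ai != bi for ai, bi in zip(a, b))

rng = random.Random(3)
n_in, n_out, theta = 50, 50, 5
# Sparse random connectivity: each output unit samples ~20% of inputs.
weights = [[1 if rng.random() < 0.2 else 0 for _ in range(n_in)]
           for _ in range(n_out)]

def mean_output_distance(d_in, trials=200):
    """Average output Hamming distance for inputs at distance d_in."""
    total = 0
    for _ in range(trials):
        x = [1 if rng.random() < 0.3 else 0 for _ in range(n_in)]
        y = x[:]
        for i in rng.sample(range(n_in), d_in):  # flip d_in input bits
            y[i] ^= 1
        total += hamming(mcculloch_pitts(x, weights, theta),
                         mcculloch_pitts(y, weights, theta))
    return total / trials

for d in (1, 5, 15):
    print(d, round(mean_output_distance(d), 2))
```

Sweeping the connectivity density and the threshold theta in this setup mimics the paper's question of which parameter values best separate similar from distinct inputs.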
A new firing paradigm for integrate and fire stochastic neuronal models
Roberta Sirovich and Luisa Testa
2016, 13(3): 597-611. doi: 10.3934/mbe.2016010
Abstract:
A new definition of firing time is given in the framework of Integrate and Fire neuronal models. The classical absorption condition at the threshold is relaxed and the firing time is defined as the first time the membrane potential process lies above a fixed depolarisation level for a sufficiently long time. The mathematical properties of the new firing time are investigated both for the Perfect Integrator and the Leaky Integrator. In the latter case, a simulation study is presented to complete the analysis where analytical results are not yet achieved.
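The relaxed firing condition is straightforward to simulate: instead of absorbing the path at the first crossing, one waits for the first excursion that stays above the depolarisation level for a prescribed holding time. The sketch below uses a drifted Brownian path (a Perfect Integrator stand-in) with hypothetical parameters; setting the holding time to zero recovers (approximately) the classical first passage time.

```python
import math
import random

def sustained_crossing_time(level, hold, mu=0.5, sigma=1.0,
                            dt=0.001, t_max=200.0, seed=0):
    """First time a drifted Brownian path has stayed above `level`
    continuously for `hold` time units (the relaxed firing condition)."""
    rng = random.Random(seed)
    x, t, above_since = 0.0, 0.0, None
    while t < t_max:
        x += mu * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        t += dt
        if x >= level:
            if above_since is None:
                above_since = t          # excursion above the level begins
            elif t - above_since >= hold:
                return t                 # sustained long enough: fire
        else:
            above_since = None           # dipped below: excursion resets
    return None

t_classic = sustained_crossing_time(1.0, 0.0)   # hold = 0: classical firing
t_relaxed = sustained_crossing_time(1.0, 0.5)   # must stay above for 0.5
print(round(t_classic, 2), round(t_relaxed, 2))
```

Since the relaxed firing time requires a full excursion of the prescribed length, it is always later than the classical one on the same path, which is the delay whose distribution the paper studies.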
Approximation of the first passage time density of a Wiener process to an exponentially decaying boundary by two-piecewise linear threshold. Application to neuronal spiking activity
Massimiliano Tamborrino
2016, 13(3): 613-629. doi: 10.3934/mbe.2016011
Abstract:
The first passage time density of a diffusion process to a time-varying threshold is of primary interest in different fields. Here, we consider a Brownian motion in the presence of an exponentially decaying threshold to model neuronal spiking activity. Since analytical expressions for the first passage time density are not available, we propose to approximate the curved boundary by means of a continuous two-piecewise linear threshold. Explicit expressions for the first passage time density towards the new boundary are provided. First, we introduce different approximating linear thresholds. Then, we describe how to choose the optimal one minimizing the distance to the curved boundary, and hence the error in the corresponding passage time density. Theoretical means, variances and coefficients of variation given by our method are compared with empirical quantities from simulated data. Moreover, a further comparison with firing statistics derived under the assumption of a small amplitude of the time-dependent change in the threshold is also carried out. Finally, maximum likelihood and moment estimators of the parameters of the model are derived and applied to simulated data.
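The empirical side of such a comparison is simple to set up: simulate Wiener paths against the decaying boundary and collect crossing times. The sketch below (hypothetical boundary and noise parameters, not the paper's values) produces the empirical mean and coefficient of variation that the analytical piecewise-linear approximation would be checked against.

```python
import math
import random

B0, TAU, MU, SIGMA = 2.0, 1.0, 0.0, 1.0   # boundary b(t) = B0 * exp(-t/TAU)

def boundary(t):
    return B0 * math.exp(-t / TAU)

def fpt(rng, dt=0.001, t_max=50.0):
    """First passage of a Wiener path through the decaying boundary
    (Euler scheme; truncated at t_max for the rare non-crossing paths)."""
    x, t = 0.0, 0.0
    while t < t_max:
        x += MU * dt + SIGMA * math.sqrt(dt) * rng.gauss(0, 1)
        t += dt
        if x >= boundary(t):
            return t
    return t_max

rng = random.Random(7)
samples = [fpt(rng) for _ in range(300)]
mean = sum(samples) / len(samples)
cv = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5 / mean
print(round(mean, 2), round(cv, 2))
```

With the paper's explicit expressions for the two-piecewise linear boundary in hand, these empirical moments quantify the approximation error as a function of where the linear pieces are placed.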

2016 Impact Factor: 1.035
