Special Issue "Stochastic Processes in Neuronal Modeling"

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Network Science".

Deadline for manuscript submissions: closed (31 October 2021) | Viewed by 7542

Special Issue Editors

Dr. Enrica Pirozzi
Guest Editor
Department of Mathematics and Applications "R.Caccioppoli", University of Naples Federico II, Napoli, Italy
Interests: Markov processes; semi-Markov processes; first passage problems; fractional dynamics; fractional Brownian motion; coupled dynamics; queuing theory; leaky integrate-and-fire neuronal models; computational methods for stochastic models; stochastic simulation techniques
Prof. Dr. Eva Löcherbach
Guest Editor
Université Paris 1 Panthéon Sorbonne, 90 rue de Tolbiac, 75013 Paris, France
Interests: interacting particle systems; stochastic models in neuroscience; longtime behavior of stochastic processes; coupling and perfect simulation; Hawkes processes; chains and processes with memory of variable length

Special Issue Information

Dear Colleagues,

The aim of this Special Issue is to publish original research articles covering advances in the theory of stochastic processes and in stochastic models for single neurons and networks of neurons. In particular, with the aim of improving existing neuronal models or creating new ones, topics will include continuous- and discrete-time stochastic processes, stochastic differential equations, fractional differential equations, correlated processes, first passage time problems, stochastic optimal control, mean-field limits, interacting particle systems, statistics of stochastic processes, parameter estimation, and simulation techniques.

Potential topics include, but are not limited to:

- Markov and semi-Markov processes
- Time-changed processes
- Markov chains
- Jump processes
- Coupled dynamics
- Fractional processes
- Space-time fractional equations
- Fractional Brownian motion
- Long-range dependence
- Hawkes processes
- Integrate-and-fire models
- Mean-field limits
- Stochastic optimal control
- Population models
- Computational methods for stochastic models
- Information theory and estimation theory for computational neuroscience
- Large deviations and limit theorems
- Numerical and simulation approaches

Dr. Enrica Pirozzi
Prof. Eva Löcherbach
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, authors can access the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Stochastic differential equations
  • Neuronal models
  • Fractional dynamics
  • Long-range dependence
  • Population models
  • Information measures
  • Statistics
  • Simulation

Published Papers (7 papers)


Research

Article
Convergence in Total Variation of Random Sums
Mathematics 2021, 9(2), 194; https://doi.org/10.3390/math9020194 - 19 Jan 2021
Viewed by 537
Abstract
Let (X_n) be a sequence of real random variables, (T_n) a sequence of random indices, and (τ_n) a sequence of constants such that τ_n → ∞. The asymptotic behavior of L_n = (1/τ_n) ∑_{i=1}^{T_n} X_i, as n → ∞, is investigated when (X_n) is exchangeable and independent of (T_n). We give conditions for M_n = √τ_n (L_n − L) → M in distribution, where L and M are suitable random variables. Moreover, when (X_n) is i.i.d., we find constants a_n and b_n such that sup_{A ∈ B(R)} |P(L_n ∈ A) − P(L ∈ A)| ≤ a_n and sup_{A ∈ B(R)} |P(M_n ∈ A) − P(M ∈ A)| ≤ b_n for every n. In particular, L_n → L or M_n → M in total variation distance provided a_n → 0 or b_n → 0, as it happens in some situations. Full article
(This article belongs to the Special Issue Stochastic Processes in Neuronal Modeling)
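The normalized random sum studied in this abstract can be explored with a short Monte Carlo sketch. The distributional choices below (τ_n = n, T_n ~ Poisson(n), X_i i.i.d. standard normal — a special case of exchangeability) are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Monte Carlo sketch of the random sum L_n = (1/tau_n) * sum_{i=1}^{T_n} X_i
# with i.i.d. X_i (a special case of exchangeability) independent of T_n.
# tau_n = n, T_n ~ Poisson(n), X_i ~ N(0, 1) are illustrative assumptions.
rng = np.random.default_rng(0)

def random_sum_samples(n, n_sims=20_000):
    tau_n = n                                # deterministic scaling constants
    T = rng.poisson(lam=n, size=n_sims)      # random indices T_n with mean n
    # conditionally on T_n, the sum of T_n standard normals is N(0, T_n)
    S = rng.normal(0.0, np.sqrt(T))
    return S / tau_n                         # samples of L_n

L = random_sum_samples(10_000)
print(L.mean(), L.std())                     # L_n concentrates near L = 0
```

With this scaling, Var(L_n) = E[T_n]/τ_n² = 1/n, so the samples concentrate around the degenerate limit L = 0 at rate 1/√n.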
Article
Analysis of the Past Lifetime in a Replacement Model through Stochastic Comparisons and Differential Entropy
Mathematics 2020, 8(8), 1203; https://doi.org/10.3390/math8081203 - 22 Jul 2020
Cited by 5 | Viewed by 704
Abstract
A suitable replacement model for random lifetimes is extended to the context of past lifetimes. At a fixed time u, an item is planned to be replaced by another one having the same age but a different lifetime distribution. We investigate the past lifetime of this system, given that at a larger time t the system is found to be failed. Subsequently, we perform some stochastic comparisons between the random lifetimes of the single items and the doubly truncated random variable that describes the system lifetime. Moreover, we consider the relative ratio of improvement evaluated at x ∈ (u, t), which is aimed at measuring the goodness of the replacement procedure. The characterization and the properties of the differential entropy of the system lifetime are also discussed. Finally, an example of application to the firing activity of a stochastic neuronal model is provided. Full article
(This article belongs to the Special Issue Stochastic Processes in Neuronal Modeling)
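The differential entropy of a doubly truncated lifetime, central to this abstract, can be computed numerically. The exponential law below is an illustrative example only; the paper treats general replacement models:

```python
import numpy as np

# Sketch: differential entropy h = -∫ f log f of a doubly truncated lifetime
# on (u, t), using an exponential distribution as an illustrative example
# (the paper treats general replacement models, not this specific law).
u, t, lam = 1.0, 3.0, 1.0
x = np.linspace(u, t, 200_001)
dx = x[1] - x[0]

pdf = lam * np.exp(-lam * x)
mass = pdf.sum() * dx                 # P(u < X < t), numerically
f = pdf / mass                        # doubly truncated density on (u, t)
h = -(f * np.log(f)).sum() * dx      # differential entropy of the truncated law

print(h)                              # bounded above by log(t - u), the uniform case
```

As a sanity check, the entropy of any density supported on (u, t) is at most log(t − u), attained by the uniform distribution.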

Article
Revealing Spectrum Features of Stochastic Neuron Spike Trains
Mathematics 2020, 8(6), 1011; https://doi.org/10.3390/math8061011 - 20 Jun 2020
Cited by 4 | Viewed by 906
Abstract
Power spectra of spike trains reveal important properties of neuronal behavior. They exhibit several peaks, whose shape and position depend on applied stimuli and intrinsic biophysical properties, such as input current density and channel noise. The position of the spectral peaks in the frequency domain is not straightforwardly predictable from statistical averages of the interspike intervals, especially when stochastic behavior prevails. In this work, we provide a model for the neuronal power spectrum, obtained from Discrete Fourier Transform and expressed as a series of expected value of sinusoidal terms. The first term of the series allows us to estimate the frequencies of the spectral peaks to a maximum error of a few Hz, and to interpret why they are not harmonics of the first peak frequency. Thus, the simple expression of the proposed power spectral density (PSD) model makes it a powerful interpretative tool of PSD shape, and also useful for neurophysiological studies aimed at extracting information on neuronal behavior from spike train spectra. Full article
(This article belongs to the Special Issue Stochastic Processes in Neuronal Modeling)
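The DFT-based spectral analysis described in this abstract can be sketched on a simulated spike train. The Gaussian-ISI renewal train below is an illustrative stand-in for the stochastic neuron models analyzed in the paper:

```python
import numpy as np

# Sketch: one-sided periodogram of a binary spike train via the DFT.
# A renewal train with Gaussian interspike intervals is an illustrative
# assumption, not the stochastic neuron model of the paper.
rng = np.random.default_rng(1)

fs = 1000.0                                  # sampling rate (Hz)
T = 20.0                                     # duration (s)
isi = rng.normal(0.1, 0.01, size=250)        # interspike intervals: 100 ms +/- 10 ms
t_spikes = np.cumsum(isi)
t_spikes = t_spikes[t_spikes < T]

train = np.zeros(int(T * fs))
train[(t_spikes * fs).astype(int)] = 1.0

F = np.fft.rfft(train - train.mean())        # remove the mean before transforming
psd = np.abs(F) ** 2 / len(train)            # periodogram estimate of the PSD
freqs = np.fft.rfftfreq(len(train), d=1.0 / fs)

peak = freqs[1 + np.argmax(psd[1:])]         # skip the residual DC bin
print(f"dominant spectral peak near {peak:.2f} Hz")
```

For this nearly periodic train the dominant peak sits near 1 / (mean ISI) = 10 Hz, while ISI jitter broadens and attenuates the higher-order peaks.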

Article
A Semi-Markov Leaky Integrate-and-Fire Model
Mathematics 2019, 7(11), 1022; https://doi.org/10.3390/math7111022 - 29 Oct 2019
Cited by 9 | Viewed by 847
Abstract
In this paper, a Leaky Integrate-and-Fire (LIF) model for the membrane potential of a neuron is considered, in the case where the potential process is a semi-Markov process. The semi-Markov property is obtained here by means of a time-change of a Gauss-Markov process. This model has some merits, including a heavy-tailed distribution of the waiting times between spikes. This and other properties of the process, such as the mean, variance, and autocovariance, are discussed. Full article
(This article belongs to the Special Issue Stochastic Processes in Neuronal Modeling)
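The time-change construction described in this abstract can be sketched by composing a Gauss-Markov (Ornstein-Uhlenbeck) path with the inverse of a subordinator. The gamma subordinator and all parameters below are illustrative assumptions; the paper's specific time change may differ:

```python
import numpy as np

# Sketch: a time-changed Gauss-Markov (Ornstein-Uhlenbeck) path in the spirit
# of the semi-Markov construction of the abstract. The gamma subordinator is
# an illustrative choice of time change, not the one used in the paper.
rng = np.random.default_rng(2)

theta, sigma, dt, n = 1.0, 0.5, 1e-3, 50_000
a = np.exp(-theta * dt)                       # exact one-step OU decay factor
sd = sigma * np.sqrt((1 - a ** 2) / (2 * theta))

# OU path X on the operational time grid, exact in distribution per step
Z = rng.normal(size=n)
X = np.zeros(n)
for i in range(1, n):
    X[i] = a * X[i - 1] + sd * Z[i]

# gamma subordinator G(s) and its generalized inverse E(t) = inf{s : G(s) > t}
G = np.cumsum(rng.gamma(shape=dt, scale=1.0, size=n))
t_grid = np.linspace(0.0, G[-1] * 0.9, 2_000)
E_idx = np.searchsorted(G, t_grid)            # grid index of E(t)
Y = X[E_idx]                                  # time-changed process Y(t) = X(E(t))
print(Y.shape, float(Y.std()))
```

The inverse subordinator E(t) is constant over the subordinator's jumps, which produces the random waiting times (and semi-Markov behavior) of Y.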
Article
On the Integral of the Fractional Brownian Motion and Some Pseudo-Fractional Gaussian Processes
Mathematics 2019, 7(10), 991; https://doi.org/10.3390/math7100991 - 18 Oct 2019
Cited by 8 | Viewed by 1359
Abstract
We investigate the main statistical parameters of the integral over time of the fractional Brownian motion and of a kind of pseudo-fractional Gaussian process, obtained as a classical Gauss–Markov process from Doob representation by replacing Brownian motion with fractional Brownian motion. Possible applications in the context of neuronal models are highlighted. A fractional Ornstein–Uhlenbeck process is considered and relations with the integral of the pseudo-fractional Gaussian process are provided. Full article
(This article belongs to the Special Issue Stochastic Processes in Neuronal Modeling)

Article
Poincaré-Type Inequalities for Compact Degenerate Pure Jump Markov Processes
Mathematics 2019, 7(6), 518; https://doi.org/10.3390/math7060518 - 06 Jun 2019
Cited by 3 | Viewed by 1027
Abstract
We aim to prove Poincaré inequalities for a class of pure jump Markov processes inspired by the model introduced by Galves and Löcherbach to describe the behavior of interacting brain neurons. In particular, we consider neurons with degenerate jumps, i.e., which lose their memory when they spike, while the probability of a spike depends on the actual position and thus the past of the whole neural system. The process studied by Galves and Löcherbach is a point process counting the spike events of the system and is therefore non-Markovian. In this work, we consider a process describing the membrane potential of each neuron that contains the relevant information of the past. This allows us to work in a Markovian framework. Full article
(This article belongs to the Special Issue Stochastic Processes in Neuronal Modeling)
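The degenerate-jump dynamics described in this abstract can be sketched by a discrete-time thinning simulation: each neuron spikes at a rate depending on its membrane potential, loses its memory (resets to zero) when it spikes, and kicks the others. The exponential rate function and all parameters are illustrative assumptions, not the paper's model:

```python
import numpy as np

# Sketch of a Galves-Löcherbach-type network with degenerate jumps: a spiking
# neuron resets its own potential to zero (memory loss) while the others
# receive a synaptic kick. Rate function and parameters are illustrative.
rng = np.random.default_rng(4)

N, steps, dt, w = 20, 5_000, 1e-3, 0.2
V = np.zeros(N)                              # membrane potentials (Markovian state)
spikes = 0
for _ in range(steps):
    rate = np.exp(V)                         # spike intensity depends on position
    fired = rng.random(N) < rate * dt        # thinning approximation over [t, t+dt)
    for i in np.where(fired)[0]:
        V += w                               # synaptic kick to the network
        V[i] = 0.0                           # degenerate jump: reset of the spiker
        spikes += 1
print(spikes, float(V.max()))
```

Tracking the vector V of membrane potentials, rather than the point process of spike times, is exactly what makes the framework Markovian, as the abstract explains.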
Article
Retrieving a Context Tree from EEG Data
Mathematics 2019, 7(5), 427; https://doi.org/10.3390/math7050427 - 14 May 2019
Cited by 1 | Viewed by 1469
Abstract
It has been repeatedly conjectured that the brain retrieves statistical regularities from stimuli. Here, we present a new statistical approach that allows us to address this conjecture. This approach is based on a new class of stochastic processes, namely, sequences of random objects driven by chains with memory of variable length. Full article
(This article belongs to the Special Issue Stochastic Processes in Neuronal Modeling)
