Editorial

Special Issue: Stochastic Algorithms and Their Applications

by Stéphanie Allassonnière 1,2

1 School of Medicine, Université Paris Cité, CRC UMR 1138, INSERM, Sorbonne Université, 75006 Paris, France
2 INRIA EPI HeKA, 75006 Paris, France
Algorithms 2022, 15(9), 323; https://doi.org/10.3390/a15090323
Submission received: 5 September 2022 / Accepted: 6 September 2022 / Published: 9 September 2022
(This article belongs to the Special Issue Stochastic Algorithms and Their Applications)
Stochastic algorithms are at the core of machine learning and artificial intelligence. Stochastic gradient descent and Expectation–Maximization (EM)-like algorithms, among others, offer powerful tools for calibrating statistical models and deep networks. The study of these algorithms is of particular interest for ensuring guarantees on their results, improving their convergence speed, and optimizing their use in machine learning problems.
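As a minimal illustration of the kind of algorithm discussed throughout this issue, the sketch below shows a generic mini-batch stochastic gradient descent loop. It is not taken from any of the cited papers; the objective, its gradient estimator `grad_fn`, and the constant step size are placeholders chosen here for illustration.

```python
import numpy as np

def sgd(grad_fn, theta0, data, lr=0.01, epochs=10, batch_size=32, seed=0):
    """Minimal stochastic gradient descent sketch.

    grad_fn(theta, batch) should return an unbiased estimate of the gradient
    of the objective at theta, computed on a mini-batch of the data.
    data is assumed to be a NumPy array indexed along its first axis.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    n = len(data)
    for _ in range(epochs):
        perm = rng.permutation(n)                    # reshuffle once per epoch
        for start in range(0, n, batch_size):
            batch = data[perm[start:start + batch_size]]
            theta -= lr * grad_fn(theta, batch)      # noisy gradient step
    return theta
```

With `grad_fn` returning, for example, the gradient of a least-squares loss evaluated on the mini-batch, this loop recovers plain SGD for linear regression; the variants studied in this issue refine this basic scheme (acceleration, quasi-Newton updates, tempering, etc.).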
The research fields in which these algorithms appear are extremely diverse, ranging from theoretical optimization to computer vision (CV) and natural language processing (NLP), and they target emerging applications such as transport (for example, autonomous vehicles) and medical data analysis (for example, decision support systems).
To encourage further original research on stochastic algorithms and their applications, we set up a Special Issue of the MDPI journal Algorithms devoted to this topic. The call for papers invited articles dealing with all aspects in which stochastic algorithms are used. These include the foundations of stochastic algorithms, promoting new algorithms [1,2] together with their theoretical and practical analysis; the estimation of new statistical models [3,4,5]; and the introduction of topology and geometry into models to better reflect data structures [6,7]. New applied research was also considered [8,9], as stochastic algorithms are at the core of applicative use cases. Finally, a review paper on the performance of stochastic programming software [10] concludes this Special Issue.
All of the articles submitted to the Special Issue were evaluated by invited experts. In many cases, their detailed comments improved the technical strength and the quality of presentation. After several rounds of revision and review, ten of the submitted articles were accepted for inclusion in the Special Issue.

Acknowledgments

Stéphanie Allassonnière, the Guest Editor of this Special Issue, would like to thank all researchers who submitted their work, the invited expert reviewers, MDPI, and the editorial office for their assistance.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Lartigue, T.; Durrleman, S.; Allassonnière, S. Deterministic Approximate EM Algorithm; Application to the Riemann Approximation EM and the Tempered EM. Algorithms 2022, 15, 78. [Google Scholar] [CrossRef]
  2. Indrapriyadarsini, S.; Mahboubi, S.; Ninomiya, H.; Kamio, T.; Asai, H. Accelerating Symmetric Rank-1 Quasi-Newton Method with Nesterov’s Gradient for Training Neural Networks. Algorithms 2022, 15, 6. [Google Scholar] [CrossRef]
  3. Górniak, P. Modeling of the 5G-Band Patch Antennas Using ANNs under the Uncertainty of the Geometrical Design Parameters Associated with the Manufacturing Process. Algorithms 2022, 15, 7. [Google Scholar] [CrossRef]
  4. Blueschke, D.; Blueschke-Nikolaeva, V.; Neck, R. Approximately Optimal Control of Nonlinear Dynamic Stochastic Problems with Learning: The OPTCON Algorithm. Algorithms 2021, 14, 181. [Google Scholar] [CrossRef]
  5. Amirghasemi, M. An Effective Decomposition-Based Stochastic Algorithm for Solving the Permutation Flow-Shop Scheduling Problem. Algorithms 2021, 14, 112. [Google Scholar] [CrossRef]
  6. Eigel, M.; Haase, M.; Neumann, J. Topology Optimisation under Uncertainties with Neural Networks. Algorithms 2022, 15, 241. [Google Scholar] [CrossRef]
  7. Jensen, M.H.; Sommer, S. Mean Estimation on the Diagonal of Product Manifolds. Algorithms 2022, 15, 92. [Google Scholar] [CrossRef]
  8. Pham, H.; Shehada, E.R.; Stahlheber, S.; Pandey, K.; Hayes, W.B. No Cell Left behind: Automated, Stochastic, Physics-Based Tracking of Every Cell in a Dense, Growing Colony. Algorithms 2022, 15, 51. [Google Scholar] [CrossRef]
  9. Tzougas, G.; Hong, N.; Ho, R. Mixed Poisson Regression Models with Varying Dispersion Arising from Non-Conjugate Mixing Distributions. Algorithms 2022, 15, 16. [Google Scholar] [CrossRef]
  10. Torres, J.; Li, C.; Apap, R.M.; Grossmann, I.E. A Review on the Performance of Linear and Mixed Integer Two-Stage Stochastic Programming Software. Algorithms 2022, 15, 103. [Google Scholar] [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
