

Search Results (320)

Search Parameters:
Keywords = deterministic mathematical models

33 pages, 3234 KB  
Article
Advancing Cancer Research Through Stochastic Modeling: Insights into Tumor Growth, Evolution, and Treatment Response
by Tahmineh Azizi
AppliedMath 2026, 6(3), 38; https://doi.org/10.3390/appliedmath6030038 - 3 Mar 2026
Abstract
The complex and heterogeneous nature of cancer necessitates advanced modeling techniques to better understand tumor dynamics and inform treatment strategies. This paper explores the application of stochastic modeling in cancer research, focusing on five key areas: tumor growth kinetics, evolutionary dynamics of cancer, treatment response and resistance, spatial modeling of tumor progression, and clinical applications of stochastic models. We first examine how stochastic models capture the randomness in tumor growth and proliferation, providing insights into cellular behaviors that deterministic models may overlook. Next, we investigate the evolutionary dynamics that govern tumor heterogeneity and the emergence of resistance, highlighting the role of genetic mutations and environmental pressures. The paper also discusses how stochastic modeling can improve predictions of treatment responses, elucidating mechanisms behind therapy resistance in various tumor subpopulations. Furthermore, we address the significance of spatial modeling in understanding tumor interactions within their microenvironment, shedding light on processes such as metastasis. Finally, we emphasize the translational potential of these mathematical frameworks, demonstrating how they can enhance personalized medicine approaches in oncology. By integrating stochastic modeling into cancer research, this work contributes to a deeper understanding of cancer biology and paves the way for improved patient outcomes. Full article
(This article belongs to the Section Probabilistic & Statistical Mathematics)
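The abstract contrasts stochastic and deterministic growth models. As a minimal illustration of the stochastic approach (not the paper's actual model), the sketch below simulates a tumor cell count as a birth-death process using the Gillespie direct method; the division and death rates are illustrative.

```python
import random

def gillespie_birth_death(n0, birth, death, t_end, rng):
    """One trajectory of a stochastic birth-death process for a tumor
    cell count, using the Gillespie direct method."""
    t, n, path = 0.0, n0, [(0.0, n0)]
    while t < t_end and n > 0:
        total_rate = (birth + death) * n      # propensity of any event
        t += rng.expovariate(total_rate)      # exponential waiting time
        if rng.random() < birth / (birth + death):
            n += 1                            # cell division
        else:
            n -= 1                            # cell death
        path.append((t, n))
    return path

rng = random.Random(1)
final_counts = [gillespie_birth_death(50, 0.12, 0.10, 30.0, rng)[-1][1]
                for _ in range(200)]
# The deterministic exponential model n0*exp((birth-death)*t) predicts ~91 cells;
# individual stochastic trajectories scatter widely around that value.
mean_final = sum(final_counts) / len(final_counts)
```

The per-trajectory spread (including occasional extinctions at small counts) is exactly the cell-level randomness that the abstract says deterministic models overlook.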

48 pages, 3619 KB  
Article
Comparative Assessment of the Reliability of Non-Recoverable Subsystems of Mining Electronic Equipment Using Various Computational Methods
by Nikita V. Martyushev, Boris V. Malozyomov, Anton Y. Demin, Alexander V. Pogrebnoy, Georgy E. Kurdyumov, Viktor V. Kondratiev and Antonina I. Karlina
Mathematics 2026, 14(4), 723; https://doi.org/10.3390/math14040723 - 19 Feb 2026
Viewed by 218
Abstract
The assessment of reliability in non-repairable subsystems of mining electronic equipment represents a computationally challenging problem, particularly for complex and highly connected structures. This study presents a systematic comparative analysis of several deterministic approaches for reliability estimation, focusing on their computational efficiency, accuracy, and applicability. The investigated methods include classical boundary techniques (minimal paths and cuts), analytical decomposition based on the Bayes theorem, the logic–probabilistic method (LPM) employing triangle–star transformations, and the algorithmic Structure Convolution Method (SCM), which is based on matrix reduction of the system’s connectivity graph. The reliability problem is formally represented using graph theory, where each element is modeled as a binary variable with independent failures, which is a standard and practically justified assumption for power electronic subsystems operating without common-cause coupling. Numerical experiments were carried out on canonical benchmark topologies—bridge, tree, grid, and random connected graphs—representing different levels of structural complexity. The results demonstrate that the SCM achieves exact reliability values with up to six orders of magnitude acceleration compared to the LPM for systems containing more than 20 elements, while maintaining polynomial computational complexity. Qualitatively, the compared approaches differ in the nature of the output and practical applicability: boundary methods provide fast interval estimates suitable for preliminary screening, whereas decomposition may exhibit a systematic bias for highly connected (non-series–parallel) topologies. In contrast, the SCM consistently preserves exactness while remaining computationally tractable for medium and large sparse-to-moderately dense graphs, making it preferable for repeated recalculations in design and optimization workflows. 
The methods were implemented in Python 3.7 using NumPy and NetworkX, ensuring transparency and reproducibility. The findings confirm that the SCM is an efficient, scalable, and mathematically rigorous tool for reliability assessment and structural optimization of large-scale non-repairable systems. The presented methodology provides practical guidelines for selecting appropriate reliability evaluation techniques based on system complexity and computational resource constraints. Full article
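For the canonical bridge topology mentioned above, exact reliability can be verified by brute-force enumeration of all 2^5 element states, which is precisely the exponential cost that methods like the SCM are designed to avoid. A pure-Python sketch; the element reliability p is illustrative.

```python
from itertools import product

# Bridge network: source s, sink t, intermediate nodes a and b;
# five independent binary elements (the standard benchmark topology).
EDGES = [('s', 'a'), ('s', 'b'), ('a', 'b'), ('a', 't'), ('b', 't')]

def connected(up):
    """Is t reachable from s using only the operational ('up') edges?"""
    adj = {}
    for (u, v), ok in zip(EDGES, up):
        if ok:
            adj.setdefault(u, []).append(v)
            adj.setdefault(v, []).append(u)
    seen, stack = {'s'}, ['s']
    while stack:
        for w in adj.get(stack.pop(), []):
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return 't' in seen

def bridge_reliability(p):
    """Sum P(state) over all 2^5 states in which s and t are connected."""
    r = 0.0
    for up in product([True, False], repeat=len(EDGES)):
        prob = 1.0
        for ok in up:
            prob *= p if ok else (1.0 - p)
        if connected(up):
            r += prob
    return r
```

For equal element reliability p the result matches the known closed form 2p^2 + 2p^3 - 5p^4 + 2p^5; enumeration is only feasible for small n, which motivates the polynomial-complexity SCM discussed in the paper.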

23 pages, 2771 KB  
Article
Mathematical Modeling for Contagious Dental Health Issue: An Early Study of Streptococcus mutans Transmission
by Sanubari Tansah Tresna, Nursanti Anggriani, Herlina Napitupulu, Wan Muhamad Amir W. Ahmad and Asty Samiati Setiawan
Mathematics 2026, 14(4), 704; https://doi.org/10.3390/math14040704 - 17 Feb 2026
Viewed by 157
Abstract
Dental caries is an oral infectious disease that affects many people worldwide, yet it has received little attention in deterministic mathematical modeling. We therefore study the dynamics of tooth cavity disease using a deterministic modeling approach, proposing a system of delay differential equations (DDEs) to describe the phenomenon. The novelty of the constructed model is the formulation of the recovery rate as a saturation function constrained by healthcare capacity and the possibility of caries reformation. In addition, we consider two controls, namely a health campaign and a post-treatment intervention. The mathematical analysis yields equilibrium solutions and their stability, which is determined by the basic reproduction number R0. Furthermore, backward bifurcation occurs as the medical facility’s capacity decreases, driven by an increasing infectious population. The sensitivity analysis indicates that the two considered controls are among the most influential parameters. The optimal control problem is formulated using the Pontryagin Maximum Principle to obtain an optimal strategy for suppressing the number of new caries cases. Finally, a numerical simulation shows that the interventions reduce the risk of transmission and suppress the number of infectious individuals. The model can readily be extended in future work, for example by incorporating a relapse function or additional preventive actions into the optimal control problem. Full article
(This article belongs to the Section E3: Mathematical Biology)
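The saturated recovery rate described in the abstract can be illustrated with a much simpler toy than the authors' DDE system: an SIS-type model in which limited treatment capacity caps the recovery term, so the infection can persist even when the uncapped R0 is below one. All parameters below are hypothetical.

```python
def simulate_sis_saturated(beta, c, k, i0, dt=0.01, t_end=200.0):
    """SIS model with capacity-saturated recovery:
       dI/dt = beta*I*(1-I) - c*I/(1 + k*I),  S = 1 - I.
    Larger k means recovery saturates sooner (less healthcare capacity)."""
    i = i0
    for _ in range(int(t_end / dt)):
        recovery = c * i / (1.0 + k * i)   # saturates as I grows
        i += dt * (beta * i * (1.0 - i) - recovery)
        i = min(max(i, 0.0), 1.0)          # keep the fraction in [0, 1]
    return i

# With ample capacity (k = 0) and R0 = beta/c = 0.8 < 1 the infection dies out;
# with strongly saturated treatment (k = 8) the same R0 sustains a high endemic level.
high_capacity = simulate_sis_saturated(beta=0.4, c=0.5, k=0.0, i0=0.3)
low_capacity = simulate_sis_saturated(beta=0.4, c=0.5, k=8.0, i0=0.3)
```

This coexistence of extinction and persistence at R0 < 1, depending on capacity, is the qualitative signature of the backward bifurcation the abstract reports.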

23 pages, 351 KB  
Review
Modeling COVID-19 Population Dynamics with a Viral Reservoir and Human Mobility
by Jené Mercia van Schalkwyk, Peter Joseph Witbooi, Sibaliwe Maku Vyambwera and Mozart Umba Nsuami
AppliedMath 2026, 6(2), 27; https://doi.org/10.3390/appliedmath6020027 - 10 Feb 2026
Viewed by 227
Abstract
This article introduces and thoroughly examines a novel deterministic compartmental model of COVID-19 dynamics. The model uniquely incorporates compartments for symptomatic and asymptomatic individuals alongside an environmental reservoir for the pathogen. It also accounts for a steady inflow of infected visitors and a steady outflow from the removed class. The mathematical soundness of the model is established by identifying the invariant region and ensuring positivity of solutions. Notably, during surges of infected visitors, certain classes maintain positive minimum values. We analytically determine endemic equilibrium points and prove the global stability of the disease-free equilibrium. Sensitivity analysis highlights the significant roles of transmission rates and asymptomatic individuals in disease spread. Simulation results corroborate the theoretical findings and provide additional insights into the model’s predictive capabilities. Full article
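The claim that a steady inflow of infected visitors keeps certain classes strictly positive can be seen in a one-compartment toy model (not the paper's full system): with a constant arrival term, the infected fraction settles at a positive floor even when transmission alone could not sustain the disease. Parameters are illustrative.

```python
def infected_floor(inflow, beta, gamma, i0=0.0, dt=0.01, t_end=500.0):
    """Euler integration of dI/dt = inflow + beta*I*(1-I) - gamma*I.
    A constant arrival of infected visitors keeps I strictly positive
    even when beta/gamma < 1."""
    i = i0
    for _ in range(int(t_end / dt)):
        i += dt * (inflow + beta * i * (1.0 - i) - gamma * i)
    return i

with_visitors = infected_floor(inflow=0.001, beta=0.2, gamma=0.4)
no_visitors = infected_floor(inflow=0.0, beta=0.2, gamma=0.4, i0=0.1)
```

Without the inflow the infected class decays to zero (the disease-free equilibrium); with it, the equilibrium condition forces a small but strictly positive endemic floor, matching the abstract's observation about surges of infected visitors.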

18 pages, 2627 KB  
Article
Application of Machine Learning Techniques in the Prediction of Surface Geometry
by Aneta Gądek-Moszczak, Dominik Nowakowski and Norbert Radek
Materials 2026, 19(4), 661; https://doi.org/10.3390/ma19040661 - 9 Feb 2026
Viewed by 309
Abstract
The article presents the authors’ approach to generating a digital representation of the analyzed surface layer of WC-Co-Al2O3 coating deposited by the ESD method. The WC-Co-Al2O3 surface layer is superhard and abrasion-resistant, significantly extending the service life of working elements. The authors aim to develop a method for generating series of digital surfaces with similar geometry parameters based on data collected through profilometric analysis. The work therefore relies on the advanced integration of machine learning (ML) techniques with classical statistical approaches for modeling and predicting stochastic processes. While traditional models such as ARMA/ARIMA and hidden Markov models (HMMs) offer mathematical rigor, they often impose assumptions of stationarity and linearity, which limits their application to complex, noisy data. This paper proposes a model for surface geometry generation based on experimental data that combines recurrent neural networks (RNNs) and Monte Carlo simulation. Additionally, the study reviews emerging methods, including generative adversarial networks (GANs) for stochastic simulation and expectation-maximization (EM) algorithms for parameter estimation. An empirical case study on WC-Co-Al2O3 surface geometries demonstrates the effectiveness of ML–stochastic hybrids in capturing both deterministic structures and random fluctuations. The findings underscore not only the benefits but also the limitations of such models, including high computational demands and interpretability challenges, while proposing future research directions toward physics-informed ML and explainable AI. Full article
(This article belongs to the Special Issue Advances in Surface Engineering: Functional Films and Coatings)
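As a baseline for the kind of stochastic surface generation discussed above, the classical ARMA route the abstract mentions can be sketched with an AR(1) surrogate roughness profile plus a Monte Carlo loop over the RMS roughness parameter Rq. The coefficients phi and sigma below are illustrative, not fitted to any profilometric data.

```python
import math
import random

def ar1_profile(n, phi, sigma, rng):
    """AR(1) surrogate roughness profile: z[k] = phi*z[k-1] + Gaussian noise.
    phi controls lateral correlation, sigma the innovation amplitude."""
    z, prev = [], 0.0
    for _ in range(n):
        prev = phi * prev + rng.gauss(0.0, sigma)
        z.append(prev)
    return z

def rq(profile):
    """Root-mean-square roughness of a mean-removed profile."""
    m = sum(profile) / len(profile)
    return math.sqrt(sum((v - m) ** 2 for v in profile) / len(profile))

rng = random.Random(7)
# Monte Carlo: many synthetic profiles, giving a distribution of Rq
# (stationary std of this AR(1) is sigma/sqrt(1-phi^2) ~ 1.15).
rqs = sorted(rq(ar1_profile(2000, phi=0.9, sigma=0.5, rng=rng))
             for _ in range(50))
```

An RNN-based generator, as in the paper, replaces the fixed linear recursion with a learned one, relaxing the stationarity and linearity assumptions noted in the abstract.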

32 pages, 3134 KB  
Article
Dynamics and Sensitivity of the Lifecycle of Hepatitis B Virus
by Dmitry Grebennikov, Igor Sazonov, Rostislav Savinkov, Matvey Zakharov, Mark Sorokin, Yakov Mokin, Andreas Meyerhans and Gennady Bocharov
Pathogens 2026, 15(2), 172; https://doi.org/10.3390/pathogens15020172 - 5 Feb 2026
Viewed by 348
Abstract
A detailed mathematical model has been developed for the dynamics of hepatitis B virus (HBV) infection in a single cell. It provides a platform for a better quantitative understanding of the biochemical kinetics of the HBV lifecycle. The model is used to study the sensitivity of virus growth, providing a clear ranking of intracellular virus replication processes with respect to their contribution to net viral production. The stochastic formulation of the model enables the quantification of the variability characteristics in viral production, the probability of productive infection and the secretion of protein- and genome-deficient viral particles. An essential difference in infection efficiency between deterministic and stochastic models has been revealed. For example, in the case of MOI=1, the mean value of the total number of mature virions released during the lifecycle of the infection in the stochastic model is 1.06, whereas, in the deterministic model, its value is less than one thousandth and thus close to 0. The model is also used to quantitatively predict the effect of combinations of direct-acting antivirals, such as small interfering RNAs, capsid inhibitors and nucleoside analogues. The model shows that the inhibitory effect of siRNA on viral production is approximately two orders of magnitude higher than that of nucleoside analogues and capsid inhibitors. Full article
(This article belongs to the Section Viral Pathogens)
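The gap the abstract reports between stochastic and deterministic predictions (mean of 1.06 virions versus essentially zero) can be caricatured with a toy threshold model, which is not the paper's HBV model: when establishment is a rare discrete event followed by a large burst, a mean-field description that fractions the template below a replication threshold predicts no production, while integer-valued stochastic realizations still yield roughly one virion per cell on average. The threshold rule below is a deliberate, hypothetical simplification.

```python
import random

def stochastic_virions(p_establish, burst_mean, n_cells, rng):
    """Discrete single-cell outcomes: infection establishes with small
    probability, but an established cell releases a full integer burst."""
    total = 0
    for _ in range(n_cells):
        if rng.random() < p_establish:
            # burst size uniform around burst_mean (illustrative choice)
            total += rng.randint(burst_mean // 2, 3 * burst_mean // 2)
    return total / n_cells

def deterministic_virions(p_establish, burst_mean, threshold=1.0):
    """Mean-field caricature with a replication threshold: the fractional
    'expected template count' 0 < p < 1 never reaches the threshold
    needed to start replication, so predicted production is zero."""
    templates = p_establish            # fractional template count
    return burst_mean * templates if templates >= threshold else 0.0

rng = random.Random(3)
stoch = stochastic_virions(0.01, 100, 50_000, rng)   # mean near 1 virion/cell
det = deterministic_virions(0.01, 100)               # exactly 0
```

The point of the caricature is only that nonlinear (thresholded) dynamics make the mean of stochastic outcomes differ qualitatively from the deterministic trajectory, the phenomenon the single-cell HBV model quantifies properly.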

23 pages, 968 KB  
Article
TLOA: A Power-Adaptive Algorithm Based on Air–Ground Cooperative Jamming
by Wenpeng Wu, Zhenhua Wei, Haiyang You, Zhaoguang Zhang, Chenxi Li, Jianwei Zhan and Shan Zhao
Future Internet 2026, 18(2), 81; https://doi.org/10.3390/fi18020081 - 2 Feb 2026
Viewed by 287
Abstract
Air–ground joint jamming enables three-dimensional, distributed jamming configurations, making it effective against air–ground communication networks with complex, dynamically adjustable links. Once the jamming layout is fixed, dynamic jamming power scheduling becomes essential to conserve energy and prolong jamming duration. However, existing methods suffer from poor applicability in such scenarios, primarily due to their sparse deployment and adversarial nature. To address this limitation, this paper develops a set of mathematical models and a dedicated algorithm for air–ground communication countermeasures. Specifically, we (1) randomly select communication nodes to determine the jammer operation sequence; (2) schedule the number of active jammers by sorting transmission path losses in ascending order; and (3) estimate jamming effects using electromagnetic wave propagation characteristics to adjust jamming power dynamically. This approach formally converts the original dynamic, stochastic jamming resource scheduling problem into a static, deterministic one via cognitive certainty of dynamic parameters and deterministic modeling of stochastic factors—enabling rapid adaptation to unknown, dynamic communication power strategies and resolving the coordination challenge in air–ground joint jamming. Experimental results demonstrate that the proposed Transmission Loss Ordering Algorithm (TLOA) extends the system operating duration by up to 41.6% compared to benchmark methods (e.g., genetic algorithm). Full article
(This article belongs to the Special Issue Adversarial Attacks and Cyber Security)
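Step (2) of the approach, scheduling active jammers by sorting transmission path losses in ascending order, can be sketched with the standard free-space path loss (FSPL) formula. The jammer list and carrier frequency below are hypothetical; real link budgets would add antenna gains and terrain effects.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB:
       FSPL = 20*log10(d) + 20*log10(f) - 147.55  (d in m, f in Hz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

def schedule_jammers(jammers, freq_hz, n_active):
    """Activate the n jammers with the smallest path loss to the target,
    i.e. sort losses in ascending order and take the head of the list."""
    ranked = sorted(jammers, key=lambda j: fspl_db(j['dist'], freq_hz))
    return [j['id'] for j in ranked[:n_active]]

# Hypothetical air (uav*) and ground (gnd*) jammers at various ranges.
jammers = [{'id': 'uav1', 'dist': 800.0},
           {'id': 'gnd1', 'dist': 1500.0},
           {'id': 'uav2', 'dist': 400.0}]
active = schedule_jammers(jammers, freq_hz=2.4e9, n_active=2)
```

Because FSPL is monotone in distance at fixed frequency, this ordering lets the closest effective jammers run at lower power, which is the energy-conservation lever the abstract describes.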

29 pages, 2336 KB  
Article
Analyzing the Impact of Vandalism, Hoarding, and Strikes on Fuel Distribution in Nigeria
by Adam Ajimoti Ishaq, Kazeem Babatunde Akande, Samuel T. Akinyemi, Adejimi A. Adeniji, Kekana C. Malesela and Kayode Oshinubi
Computation 2026, 14(2), 30; https://doi.org/10.3390/computation14020030 - 1 Feb 2026
Viewed by 467
Abstract
Fuel scarcity remains a recurrent challenge in Nigeria, with significant socioeconomic consequences despite the country’s status as a major crude oil producer. This study develops a novel deterministic mathematical model to examine the dynamics of petroleum product distribution in Nigeria’s downstream sector, with particular emphasis on Premium Motor Spirit (PMS). The model explicitly incorporates key disruption and behavioral mechanisms: pipeline vandalism, industrial actions, product diversion, and hoarding that collectively drive persistent fuel shortages. The model’s feasibility, positivity of solutions, and existence and uniqueness were established, ensuring consistency with real-world operational conditions. Five equilibrium points were identified, reflecting distinct operational regimes within the distribution network. A critical distribution threshold was analytically derived and numerically validated, revealing that a minimum supply of approximately 42 million liters of PMS per day is required to satisfy demand and eliminate fuel queues. Local and global stability analyses, conducted using Lyapunov functions and the Routh–Hurwitz criteria, demonstrate that stable fuel distribution is achievable under effective policy coordination and stakeholder compliance. Numerical simulations show that hoarding by private retail marketers substantially intensifies scarcity, while industrial actions by transporters exert a more severe disruption than pipeline vandalism. The results further highlight the stabilizing role of alternative transportation routes, such as rail systems, in mitigating infrastructure failures and road-based logistics risks. Although refinery sources are aggregated and rail transport is idealized, the proposed framework offers a robust and adaptable tool for policy analysis, with relevance to both oil-producing and fuel-import-dependent economies. Full article
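The critical-supply-threshold result can be illustrated with a toy daily balance (not the authors' compartmental model): once effective supply after diversion and hoarding losses exceeds demand, the fuel-queue backlog drains to zero; below that threshold it grows without bound. The figures below are hypothetical and are not the paper's 42-million-liter estimate.

```python
def queue_length(supply_mld, demand_mld=40.0, losses=0.05, days=60, q0=10.0):
    """Toy fuel-queue balance (million liters/day): effective supply after
    diversion/hoarding losses must exceed demand for the backlog to clear."""
    q = q0
    for _ in range(days):
        q = max(0.0, q + demand_mld - supply_mld * (1.0 - losses))
    return q

cleared = queue_length(supply_mld=45.0)   # 45*(0.95) = 42.75 > 40: queue clears
persists = queue_length(supply_mld=40.0)  # 40*(0.95) = 38.00 < 40: backlog grows
```

The break-even supply here is demand/(1 - losses); in the toy numbers, roughly 42.1 million liters/day, illustrating how loss mechanisms push the required gross supply above nominal demand.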

30 pages, 2039 KB  
Article
Quantifying the Trajectory Tracking Accuracy in UGVs: The Role of Traffic Scheduling in Wi-Fi-Enabled Time-Sensitive Networking
by Elena Ferrari, Alberto Morato, Federico Tramarin, Claudio Zunino and Matteo Bertocco
Sensors 2026, 26(3), 881; https://doi.org/10.3390/s26030881 - 29 Jan 2026
Viewed by 258
Abstract
Accurate trajectory tracking is a key requirement in unmanned ground vehicles (UGVs) operating in autonomous driving, mobile robotics, and industrial automation. In wireless Time-Sensitive Networking (WTSN) scenarios, trajectory accuracy strongly depends on deterministic packet delivery, precise traffic scheduling, and time synchronization among distributed devices. This paper quantifies the impact of IEEE 802.1Qbv time-aware traffic scheduling on trajectory tracking accuracy in UGVs operating over Wi-Fi-enabled TSN networks. The analysis focuses on how misconfigured real-time (RT) and best-effort (BE) transmission windows, as well as clock misalignment between devices, affect packet reception and control performance. A mathematical framework is introduced to predict the number of correctly received RT packets based on cycle time, packet periodicity, scheduling window lengths, and synchronization offsets, enabling the a priori dimensioning of RT and BE windows. The proposed model is validated through extensive simulations conducted in an ROS–Gazebo environment, utilising Linux-based traffic shaping and scheduling tools. Results show that improper traffic scheduling and synchronization offsets can significantly degrade trajectory tracking accuracy, while correctly dimensioned scheduling windows ensure reliable packet delivery and stable control, even under imperfect synchronization. The proposed approach provides practical design guidelines for configuring wireless TSN networks supporting real-time trajectory tracking in mobile robotic systems. Full article
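The core of such a framework, predicting how many periodic RT packets land inside the 802.1Qbv RT window given the cycle time, packet period, window length, and a synchronization offset, reduces to a counting exercise over one hyperperiod. The sketch below is a schematic model with hypothetical timings, not the paper's validated framework.

```python
from math import gcd

def rt_packets_received(cycle_us, period_us, window_us, offset_us):
    """Count RT packets landing inside the RT window over one hyperperiod.
    A packet sent at t = k*period is received iff (t mod cycle) falls in
    [offset, offset + window). Returns (received, sent)."""
    hyper = cycle_us * period_us // gcd(cycle_us, period_us)
    sent = hyper // period_us
    received = sum(
        1 for k in range(sent)
        if offset_us <= (k * period_us) % cycle_us < offset_us + window_us
    )
    return received, sent

# A window covering every transmission instant loses nothing...
aligned = rt_packets_received(cycle_us=1000, period_us=500,
                              window_us=600, offset_us=0)
# ...while a clock offset that shifts the window past some instants
# silently drops those packets.
shifted = rt_packets_received(cycle_us=1000, period_us=500,
                              window_us=400, offset_us=200)
```

Dimensioning the RT window so that every (k*period mod cycle) instant, shifted by the worst-case synchronization offset, still falls inside it is the a priori design rule the abstract describes.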

31 pages, 1726 KB  
Article
Entrepreneurship and Conway’s Game of Life: A Theoretical Approach from a Systemic Perspective
by Félix Oscar Socorro Márquez, Giovanni Efrain Reyes Ortiz and Harold Torrez Meruvia
Adm. Sci. 2026, 16(1), 45; https://doi.org/10.3390/admsci16010045 - 16 Jan 2026
Viewed by 562
Abstract
This study establishes a comprehensive structural isomorphism between Conway’s Game of Life and the entrepreneurial process, analysing the latter as a complex adaptive system governed by non-linear dynamics rather than linear predictability. Through a rigorous qualitative approach based on a systematic literature review and abductive inference, the research identifies and correlates four fundamental dimensions: uncertainty, adaptability, growth, and sustainability. Transcending traditional metaphorical comparisons, this paper introduces a novel mathematical model that modifies Conway’s deterministic logic by incorporating an «Agency» variable (A). This critical addition quantifies how an entrepreneur’s internal capabilities can counterbalance environmental pressures (neighbourhood density) to determine survival thresholds, effectively transforming the simulation into a «Game of Life with Agency» where participants actively influence their viability potential (Ψ). The analysis explicitly correlates specific algorithmic configurations with real-world business phenomena: high-entropy initial states («The Soup») mirror early-stage market uncertainty where outcomes are probabilistic; «gliders» represent the necessity of strategic pivoting and continuous displacement for survival; and «oscillators» symbolise dynamic sustainability through rhythmic equilibrium rather than static permanence. Furthermore, the study validates the «Gosper Glider Gun» pattern as a model for scalable, generative growth. By bridging abstract systems theory with managerial practice, the research positions these simulations as «mental laboratories» for decision-making. The findings theoretically validate iterative methodologies like the Lean Startup and conclude that successful entrepreneurship operates on the «Edge of Chaos», providing a rigorous framework for navigating high stochastic uncertainty. Full article
(This article belongs to the Section International Entrepreneurship)
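Since the abstract does not give the exact form of the Agency term, the sketch below grafts a hypothetical version onto a standard Game of Life step: a cell with agency A >= 0.5 survives with one neighbour fewer than Conway's usual 2-3 band, so "entrepreneurial" configurations persist where the deterministic rules would eliminate them.

```python
from collections import Counter

def step(live, agency):
    """One Game-of-Life step over a set of live (x, y) cells. A cell with
    agency >= 0.5 survives with one neighbour fewer than the standard
    2-3 band (a hypothetical reading of the paper's Agency variable A)."""
    counts = Counter((x + dx, y + dy) for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    nxt = set()
    for cell, n in counts.items():
        if cell in live:
            lo = 1 if agency.get(cell, 0.0) >= 0.5 else 2
            if lo <= n <= 3:
                nxt.add(cell)
        elif n == 3:
            nxt.add(cell)          # birth rule left unchanged
    return nxt

blinker = {(0, 1), (1, 1), (2, 1)}
after_two = step(step(blinker, {}), {})   # the classic period-2 oscillator
domino = {(0, 0), (1, 0)}
without_agency = step(domino, {})         # dies under Conway's rules
with_agency = step(domino, {(0, 0): 0.9, (1, 0): 0.9})  # persists
```

The domino example is the whole point: under deterministic rules its cells are too isolated to survive, while sufficient agency lowers the survival threshold, a mechanical analogue of internal capability counterbalancing environmental density.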

23 pages, 4040 KB  
Article
Energy-Efficient Train Control Based on Energy Consumption Estimation Model and Deep Reinforcement Learning
by Jia Liu, Yuemiao Wang, Yirong Liu, Xiaoyu Li, Fuwang Chen and Shaofeng Lu
Electronics 2025, 14(24), 4939; https://doi.org/10.3390/electronics14244939 - 16 Dec 2025
Viewed by 541
Abstract
An Energy-Efficient Train Control (EETC) strategy must satisfy safety, punctuality, and energy-saving requirements during train operation, and places high demands on online applicability and adaptability. To meet these requirements and reduce dependence on an accurate mathematical model of train operation, this paper proposes a train-speed trajectory-optimization method combining data-driven energy consumption estimation and deep reinforcement learning. First, using real subway operation data, the key unit basic resistance coefficients of train operation are identified by regression. Then, based on the identified model, experimental energy-consumption data for train operation are generated, into which Gaussian noise is introduced to simulate real-world sensor measurement errors and environmental uncertainties. An energy consumption estimation model based on a backpropagation (BP) neural network is constructed and trained. Finally, the estimation model serves as a component of the Deep Deterministic Policy Gradient (DDPG) algorithm environment, and the action adjustment mechanism and reward are designed by integrating expert experience to complete the optimization training of the policy network. Experimental results demonstrate that the proposed method reduces energy consumption by approximately 4.4% compared to actual manual operation data. Furthermore, it achieves a solution deviation of less than 0.3% compared to the theoretical optimal baseline (Dynamic Programming), proving its ability to approximate global optimality. In addition, the proposed algorithm can adapt to changes in train mass, initial set running time, and halfway running time while ensuring convergence performance and trajectory energy saving during online use. Full article
(This article belongs to the Special Issue Advances in Intelligent Computing and Systems Design)

22 pages, 1728 KB  
Article
Optimization of Mixed-Model Multi-Manned Assembly Lines for Fuel–Electric Vehicle Co-Production Under Workstation Sharing
by Lingling Hu and Vatcharapol Sukhotu
World Electr. Veh. J. 2025, 16(12), 666; https://doi.org/10.3390/wevj16120666 - 11 Dec 2025
Viewed by 539
Abstract
With the rapid transformation of the automotive industry towards electric vehicles, how to achieve efficient mixed-line production of electric vehicles and fuel vehicles has become a key challenge for modern assembly systems. This study investigated the balancing problem of a mixed-model multi-manned assembly line, considering workstation sharing (MMuALBP-WS), and developed a deterministic multi-objective model that integrates the heterogeneity of tasks and the coordination of shared workstations. An improved genetic algorithm was proposed, whose decoding mechanism enables different types of electric vehicle and fuel vehicle tasks to achieve dynamic collaboration within the shared workstations. A real case study from the chassis assembly line of Company W demonstrated the effectiveness of the proposed method, achieving a 25% reduction in the number of workstations, a 27% decrease in the total number of workers, and a 23.56% increase in average workstation utilization. The results confirmed that the workstation sharing mechanism significantly improved production balance, labor utilization, and flexibility, providing a practical and scalable optimization framework for the mixed-model assembly system in the era of the transition from electric vehicles to fuel vehicles. In addition to its practical significance, this study enhances the understanding of mixed-model multi-manned line balancing by incorporating workstation-sharing logic into both the mathematical modeling and optimization process, offering a theoretical basis for future extensions to more complex production environments. Full article

24 pages, 2980 KB  
Article
Monte Carlo Simulations as an Alternative for Solving Engineering Problems in Environmental Sciences: Three Case Studies
by Sergio Luis Parra-Angarita, Guillermo H. Gaviria, Juan F. Herrera-Ruiz and María del Carmen Márquez
ChemEngineering 2025, 9(6), 140; https://doi.org/10.3390/chemengineering9060140 - 9 Dec 2025
Viewed by 778
Abstract
Monte Carlo methods offer a fast, cost-effective approach for modeling environmental systems influenced by random variability. This study applied them to three abiotic cases: (I) water quality in a lentic surface water source, (II) sizing of a homogenization chamber for solid waste treatment, and (III) removal of atmospheric particulate matter by rain. Deterministic models produced wide and inconsistent estimates: BOD5 concentrations from 5.28 to 19.81 mg/L (275% relative difference), chamber volumes from 24.12 to 116.53 m3, and particulate matter reductions with up to 60 µg/m3 per month variation. Monte Carlo simulations, by contrast, captured system variability and provided more robust outputs: a design value of 94.84 m3 for the homogenization chamber, narrower ranges for BOD5, and realistic distributions of atmospheric PM concentrations. Results show that reliance on average values introduces strong biases and mathematical incompatibilities, while the Monte Carlo approach yields quantitative predictions that are both accurate and operationally useful. This confirms its relevance as a practical tool for analyzing and designing environmental systems under uncertainty. Full article
(This article belongs to the Special Issue Innovative Approaches for the Environmental Chemical Engineering)
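The contrast between a mean-value deterministic estimate and a Monte Carlo design value can be sketched for the chamber-sizing case with a toy model V = Q·t under uniformly uncertain inflow and retention time. The distributions and the 95th-percentile design rule below are illustrative assumptions, not the paper's actual inputs.

```python
import random

def chamber_volume_mc(n, rng):
    """Monte Carlo sizing: V = Q * t with uncertain inflow Q ~ U(8, 20) m3/h
    and retention time t ~ U(2, 6) h; the design value is taken as the
    95th percentile of the simulated volumes."""
    vols = sorted(rng.uniform(8.0, 20.0) * rng.uniform(2.0, 6.0)
                  for _ in range(n))
    return vols[int(0.95 * n)]

rng = random.Random(42)
design_v = chamber_volume_mc(10_000, rng)
mean_v = 14.0 * 4.0   # deterministic mean-value estimate: 56 m3
```

The deterministic product of means sits well below the percentile-based design value because the distribution of V is right-skewed, the same bias from "reliance on average values" that the abstract identifies.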

15 pages, 1007 KB  
Article
Simulated Annealing Integrated with Discrete-Event Simulation for Berth Allocation in Bulk Ports Under Demurrage Constraints
by Enrique Delahoz-Domínguez, Adel Mendoza-Mendoza and Daniel Mendoza-Casseres
Eng 2025, 6(12), 352; https://doi.org/10.3390/eng6120352 - 5 Dec 2025
Cited by 1 | Viewed by 456
Abstract
Efficient berth allocation remains a critical challenge in bulk port operations due to the stochastic nature of vessel arrivals and the complex interaction among loading resources. This study proposes an integrated optimisation–simulation framework to minimise total demurrage costs under uncertainty. The mathematical model was formulated as a mixed-integer linear program (MILP) to determine the optimal assignment and sequencing of vessels to berths and shiploaders, subject to time-window and capacity constraints. The MILP was solved using a Simulated Annealing (SA) metaheuristic to improve computational efficiency for large-scale instances. Subsequently, the optimised berth plans were evaluated in FlexSim, a discrete-event simulation environment, to assess the operational variability arising from stochastic factors, including vessel arrival times, service durations, and loader availability. System performance was measured through vessel waiting time, berth utilisation rate, and demurrage cost variability across multiple replications. Results indicate that the proposed SA–FlexSim framework reduced average demurrage costs by 28.7% compared to the deterministic MILP and by 21.3% relative to standalone SA, confirming its effectiveness and robustness under uncertain operating conditions. The hybrid methodology provides a practical decision-support tool for terminal operators seeking to enhance scheduling reliability and cost efficiency in bulk port environments. Full article
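The simulated-annealing step can be sketched on a toy instance: vessels with arrival and service times are sequenced on berths, and the objective is the total demurrage accrued beyond a free laytime. This is a minimal illustration of the metaheuristic, not the paper's MILP formulation; all instance data, rates, and cooling parameters are assumed:

```python
import math
import random

# Toy instance: (arrival_h, service_h) per vessel; values are illustrative.
vessels = [(0, 10), (2, 6), (5, 8), (6, 4), (9, 7), (11, 5)]
LAYTIME_H = 8          # free time before demurrage accrues, h
DEMURRAGE_RATE = 1000  # assumed cost per hour beyond laytime

def demurrage_cost(solution):
    """solution = one ordered vessel list per berth; returns total demurrage."""
    total = 0.0
    for berth_seq in solution:
        t = 0.0
        for v in berth_seq:
            arrival, service = vessels[v]
            finish = max(t, arrival) + service
            t = finish
            total += max(0.0, finish - arrival - LAYTIME_H) * DEMURRAGE_RATE
    return total

def neighbour(solution):
    """Move a random vessel to a random position on a random berth."""
    sol = [list(seq) for seq in solution]
    src = random.randrange(len(sol))
    while not sol[src]:
        src = random.randrange(len(sol))
    v = sol[src].pop(random.randrange(len(sol[src])))
    dst = random.randrange(len(sol))
    sol[dst].insert(random.randrange(len(sol[dst]) + 1), v)
    return sol

def anneal(iters=5000, t0=5000.0, alpha=0.999, seed=7):
    random.seed(seed)
    current = [[0, 2, 4], [1, 3, 5]]  # round-robin initial assignment
    cost = demurrage_cost(current)
    best, best_cost = current, cost
    temp = t0
    for _ in range(iters):
        cand = neighbour(current)
        c = demurrage_cost(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if c < cost or random.random() < math.exp((cost - c) / temp):
            current, cost = cand, c
            if c < best_cost:
                best, best_cost = cand, c
        temp *= alpha
    return best, best_cost

best, cost = anneal()
print("best schedule per berth:", best, "total demurrage:", cost)
```

In the full framework, each candidate plan would additionally be replayed in the discrete-event simulator under stochastic arrivals and service times, rather than scored by this deterministic cost function.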
(This article belongs to the Special Issue Supply Chain Engineering)
32 pages, 5673 KB  
Article
Modeling of Heat Treatment Processes in a Vortex Layer of Dispersed Materials
by Hanna Koshlak, Anatoliy Pavlenko, Borys Basok and Janusz Telega
Materials 2025, 18(23), 5459; https://doi.org/10.3390/ma18235459 - 3 Dec 2025
Viewed by 546
Abstract
Sustainable materials engineering necessitates the valorization of industrial by-products, such as coal fly ash, into functional, high-performance materials. This research addresses a core challenge in materials synthesis: establishing a deterministic technology for controlled porous structure formation to optimize the thermophysical properties of lightweight thermal insulation composites. The primary objective was to investigate the structural evolution kinetics during the high-intensity thermal processing of fly ash-based precursors to facilitate precise property regulation. We developed a novel, integrated process, underpinned by mathematical modeling of simultaneous bloating and non-equilibrium heat transfer, to evaluate key operational parameters within a vortex-layer reactor (VLR). This framework enables the a priori prediction of structural outcomes. The synthesized composite granules were subjected to comprehensive characterization, quantifying apparent density, total porosity, static compressive strength, and effective thermal conductivity. The developed models and VLR technology successfully identified critical thermal exposure windows and heat flux intensities of the heating medium required for the reproducible regulation of the composite's porous architecture. This precise control of the structure-formation process yielded materials exhibiting an optimal balance between low density (<400 kg/m³) and adequate mechanical integrity (>1.0 MPa). This work validates a scalable, energy-efficient production technology for fly ash-derived porous media. The established capability for predictive control over microstructural development provides a robust engineering solution for producing porous materials, significantly contributing to waste reduction and sustainable building practices. Full article
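The idea of a "critical thermal exposure window" can be sketched with a much-simplified model: a single granule heated by the vortex-layer medium (lumped capacitance, so internal gradients are ignored) with first-order pore-formation kinetics that activate only above a threshold temperature. This is an illustrative toy, not the authors' coupled bloating/heat-transfer model; every parameter value below is an assumption:

```python
# Illustrative parameters (assumed, not from the study).
T_GAS = 1150.0   # heating-medium temperature, °C
T0 = 20.0        # initial granule temperature, °C
H = 250.0        # convective heat-transfer coefficient, W/(m² K)
D = 4e-3         # granule diameter, m
RHO = 1800.0     # initial apparent density, kg/m³
CP = 900.0       # specific heat, J/(kg K)
T_BLOAT = 900.0  # temperature above which bloating proceeds, °C
K_BLOAT = 1.2    # first-order bloating rate constant, 1/s
POR_MAX = 0.8    # maximum attainable porosity

def simulate(t_end, dt=1e-3):
    """Return (temperature °C, porosity) after t_end seconds of exposure."""
    area_vol = 6.0 / D  # surface-to-volume ratio of a sphere, 1/m
    T, por, t = T0, 0.0, 0.0
    while t < t_end:
        # Lumped-capacitance heating: dT/dt = h (A/V) (T_gas - T) / (rho cp)
        T += dt * H * area_vol * (T_GAS - T) / (RHO * CP)
        # Bloating kinetics, active only above the threshold temperature:
        # dpor/dt = k (por_max - por)
        if T > T_BLOAT:
            por += dt * K_BLOAT * (POR_MAX - por)
        t += dt
    return T, por

for t_exp in (2.0, 5.0, 10.0):
    T, por = simulate(t_exp)
    print(f"t={t_exp:4.1f} s  T={T:7.1f} °C  "
          f"porosity={por:.2f}  density={RHO * (1 - por):6.0f} kg/m³")
```

Short exposures leave the granule below the activation temperature (no porosity), while longer exposures drive the apparent density down toward the target range, which is the window-type behavior the abstract describes, here reproduced only qualitatively.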