AI Applications and Modern Industry

A special issue of Algorithms (ISSN 1999-4893). This special issue belongs to the section "Algorithms for Multidisciplinary Applications".

Deadline for manuscript submissions: 31 January 2026

Special Issue Editors


Dr. Annemieke Meghoe
Guest Editor Assistant
Faculty of Engineering Technology, University of Twente, Drienerlolaan 5, 7522 NB Enschede, The Netherlands
Interests: rails; physics of failure; maintenance; remaining useful life; mechanical engineering; wear; fatigue; friction; materials science

Special Issue Information

Dear Colleagues,

Modern industries require ever more sensors, computing power, and algorithms to run smoothly and remain competitive in an increasingly global market. Artificial intelligence algorithms help to optimize processes and to improve the safety of people and assets. Hence, investment in edge AI, classical and deep machine learning systems, business intelligence, and other engineering tools is key to competitiveness and sustainability. These systems automate tasks and detect events and patterns while also offering prognostics, greatly contributing to a deeper understanding of modern complex systems and to better-informed decisions.

This Special Issue aims to cover the latest research on such algorithms, so that decisions based on them are sound and well founded and effectively support management and operational decision-making. Original contributions on the above aspects and related topics are encouraged.

Dr. Mateus Mendes
Guest Editor

Dr. Annemieke Meghoe
Guest Editor Assistant

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Algorithms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • artificial intelligence
  • asset management
  • clustering
  • data analysis
  • data mining
  • decision-support systems
  • deep learning
  • digital transition
  • edge AI
  • fault detection
  • knowledge-based systems
  • machine learning
  • object detection
  • optimization
  • predictive maintenance
  • time series

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad-scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (5 papers)


Research

26 pages, 3721 KB  
Article
Forecasting Fossil Energy Price Dynamics with Deep Learning: Implications for Global Energy Security and Financial Stability
by Bilal Ahmed Memon
Algorithms 2025, 18(12), 776; https://doi.org/10.3390/a18120776 - 9 Dec 2025
Abstract
This study investigates the application of advanced deep learning models to forecast fossil energy prices, a critical factor influencing global economic stability. Unlike previous research, this study conducts a comparative analysis of Gated Recurrent Unit (GRU), Recurrent Neural Network (RNN), Bidirectional Long Short-Term Memory (Bi-LSTM), Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and Deep Neural Network (DNN) models. The evaluation metrics employed include Root Mean Squared Error (RMSE) and Mean Absolute Percentage Error (MAPE). The results reveal that recurrent architectures, particularly GRU, LSTM, and Bi-LSTM, consistently outperform feedforward and convolutional models, demonstrating a superior ability to capture temporal dependencies and nonlinear dynamics in energy markets. In contrast, the RNN and DNN show relatively weaker generalization capabilities. Visualizations of actual versus predicted prices for each model further emphasize the superior forecasting accuracy of the recurrent models. The results highlight the potential of deep learning to enhance investment and policy decisions, and they carry significant implications for policymakers and investors by emphasizing the value of accurate energy price forecasting in mitigating market volatility, improving portfolio management, and supporting evidence-based energy policies.
(This article belongs to the Special Issue AI Applications and Modern Industry)
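For readers who want to see the model family and metrics named in this abstract in concrete form, below is a minimal sketch, assuming a synthetic univariate price series, of a GRU forecaster evaluated with RMSE and MAPE. The window length, layer sizes, and training settings are illustrative assumptions, not the author's configuration.

```python
# Minimal sketch (not the paper's code): a GRU forecaster on a univariate price
# series, evaluated with RMSE and MAPE as in the abstract. All settings are
# illustrative assumptions; the price series is synthetic.
import numpy as np
import tensorflow as tf

def make_windows(series, lookback=30):
    """Turn a 1-D series into (samples, lookback, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return np.array(X)[..., None], np.array(y)

prices = np.cumsum(np.random.randn(1000)) + 100.0   # synthetic stand-in for a fossil energy price series
X, y = make_windows(prices)
split = int(0.8 * len(X))

model = tf.keras.Sequential([
    tf.keras.Input(shape=(X.shape[1], 1)),
    tf.keras.layers.GRU(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:split], y[:split], epochs=10, batch_size=32, verbose=0)

pred = model.predict(X[split:], verbose=0).ravel()
rmse = np.sqrt(np.mean((y[split:] - pred) ** 2))
mape = np.mean(np.abs((y[split:] - pred) / y[split:])) * 100
print(f"RMSE={rmse:.3f}  MAPE={mape:.2f}%")
```

Swapping the GRU layer for LSTM or Bidirectional(LSTM(...)) reproduces the other recurrent variants compared in the abstract.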

22 pages, 4023 KB  
Article
Compressive Strength of Geopolymer Concrete Prediction Using Machine Learning Methods
by Sergey A. Stel’makh, Alexey N. Beskopylny, Evgenii M. Shcherban’, Irina Razveeva, Samson Oganesyan, Diana M. Shakhalieva, Andrei Chernil’nik and Gleb Onore
Algorithms 2025, 18(12), 744; https://doi.org/10.3390/a18120744 - 26 Nov 2025
Abstract
The implementation of machine learning methods, as one of the artificial intelligence technologies, has brought the construction process to a new qualitative level. Significant interest in these methods is observed in the predictive modeling of building materials' properties. In the field of innovative concretes, there are still limitations regarding how well intelligent algorithms can predict material properties when specific chemical constituents and process parameters are altered. This article focuses on seven machine learning techniques used to forecast geopolymer concrete's compressive strength, from the simplest, such as Linear Regression, to more complex and modern methods, including the TabPFNv2 generative transformer model. The dataset was formed on the basis of 204 datasets available in the public domain, including the authors' experimental data. The leading input features were selected: granulated blast-furnace slag (kg/m3); NaOH molarity; NaOH content in the alkaline activator (%); Na2SiO3 content in the alkaline activator (%); fiber type; fiber dosage (%); and curing temperature (°C). The MAE, RMSE, and MAPE metrics and the coefficient of determination R2 were used to evaluate the prediction quality. The kNN method (MAE = 0.37, RMSE = 0.63, MAPE = 1.62%, R2 = 0.9996) and TabPFNv2 (MAE = 0.46, RMSE = 0.64, MAPE = 1.39%, R2 = 0.9996) showed the highest accuracy in predicting compressive strength, as assessed by the chosen metrics. If computing resources are limited and interpretability is required, the CatBoost or Random Forest algorithms are recommended; if a graphics processing unit and a small dataset are available, TabPFN is advisable; if there is no need for manual parameter adjustment, H2O AutoML is suitable.
(This article belongs to the Special Issue AI Applications and Modern Industry)
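As a concrete illustration of the tabular setup described in this abstract, here is a minimal sketch of a kNN regressor scored with MAE, RMSE, MAPE, and R2. The feature list follows the abstract, but the data are synthetic placeholders and the hyperparameters are assumptions, not the authors' pipeline.

```python
# Minimal sketch (not the authors' code): k-nearest-neighbours regression on
# tabular mix-design features, scored with MAE, RMSE, MAPE, and R^2.
# The seven features mirror the abstract; the values are synthetic placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

rng = np.random.default_rng(0)
n = 204
X = rng.uniform(size=(n, 7))   # slag, NaOH molarity, NaOH %, Na2SiO3 %, fiber type, fiber dosage, curing T
y = 30 + 40 * X[:, 0] + 10 * X[:, 6] + rng.normal(scale=2, size=n)   # synthetic compressive strength (MPa)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
knn = KNeighborsRegressor(n_neighbors=5).fit(X_tr, y_tr)
pred = knn.predict(X_te)

mae = mean_absolute_error(y_te, pred)
rmse = np.sqrt(mean_squared_error(y_te, pred))
mape = np.mean(np.abs((y_te - pred) / y_te)) * 100
print(f"MAE={mae:.2f}  RMSE={rmse:.2f}  MAPE={mape:.2f}%  R2={r2_score(y_te, pred):.4f}")
```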

29 pages, 3845 KB  
Article
Modeling Approaches for Digital Plant Phenotyping Under Dynamic Conditions of Natural, Climatic and Anthropogenic Factors
by Bagdat Yagaliyeva, Olga Ivashchuk and Dmitry Goncharov
Algorithms 2025, 18(11), 720; https://doi.org/10.3390/a18110720 - 15 Nov 2025
Abstract
Methods, algorithms, and models for the creation and practical application of digital twins (3D models) of agricultural crops are presented, illustrating their condition under different levels of atmospheric CO2 concentration and different soil and meteorological conditions. An algorithm for digital phenotyping using machine learning methods with the U2-Net architecture is proposed for segmenting plants into elements and assessing their condition. To obtain a dataset and conduct verification experiments, a prototype of a software and hardware complex has been developed that implements cultivation and digital phenotyping without disturbing the microclimate inside the chamber, eliminating the subjectivity of measurements. In order to identify new data and confirm the data published in open scientific sources on the effects of CO2 on crop growth and development, plants of ten species were grown at different CO2 concentrations (0.015–0.03% and 0.07–0.09%) with ten replicates. A model has been built and trained to distinguish between cases in which plant segments need to be combined because they belong to the same leaf (p-value = 0.05) and cases in which they belong to separate leaves (p-value = 0.03). A knowledge base has been formed, including 790 3D models of plants and data on their physiological characteristics.
(This article belongs to the Special Issue AI Applications and Modern Industry)
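To illustrate the segmentation-then-measurement step described in this abstract, here is a minimal sketch of binary plant/background segmentation followed by a simple trait measurement (projected plant area in pixels). The paper uses the U2-Net architecture; the tiny encoder-decoder below is only a stand-in so the pipeline shape is visible, and the input image is a random placeholder.

```python
# Minimal sketch: binary plant/background segmentation followed by a crude
# phenotyping trait (projected plant area in pixels). TinySegNet is a stand-in
# for the U2-Net used in the paper; the image is a random placeholder.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.dec = nn.Sequential(nn.Upsample(scale_factor=2), nn.Conv2d(16, 1, 3, padding=1))

    def forward(self, x):
        return torch.sigmoid(self.dec(self.enc(x)))   # per-pixel plant probability

model = TinySegNet().eval()
image = torch.rand(1, 3, 256, 256)              # placeholder RGB image from the growth chamber
with torch.no_grad():
    mask = (model(image) > 0.5).float()         # plant = 1, background = 0
leaf_area_px = int(mask.sum().item())           # projected plant area in pixels
print(f"segmented plant area: {leaf_area_px} px")
```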

19 pages, 2117 KB  
Article
Point-Wise Full-Field Physics Neural Mapping Framework via Boundary Geometry Constrained for Large Thermoplastic Deformation
by Jue Wang, Xinyi Xu, Changxin Ye and Wei Huangfu
Algorithms 2025, 18(10), 651; https://doi.org/10.3390/a18100651 - 16 Oct 2025
Abstract
Computational modeling of the large thermoplastic deformation of plastic solids is critical for industrial applications such as the non-invasive assessment of engineering components. While deep learning-based methods have emerged as promising alternatives to traditional numerical simulations, they often suffer from systematic errors caused by geometric mismatches between predicted and ground-truth meshes. To overcome this limitation, we propose a novel boundary geometry-constrained neural framework that establishes direct point-wise mappings between spatial coordinates and full-field physical quantities within the deformed domain. The key contributions of this work are as follows: (1) a two-stage strategy that separates geometric prediction from physics-field resolution by constructing direct, point-wise mappings between coordinates and physical quantities, inherently avoiding errors from mesh misalignment; (2) a boundary-condition-aware encoding mechanism that ensures physical consistency under complex loading conditions; and (3) a fully mesh-free approach that operates on point clouds without structured discretization. Experimental results demonstrate that our method achieves a 36–98% improvement in prediction accuracy over deep learning baselines, offering an efficient alternative for high-fidelity simulation of large thermoplastic deformations.
(This article belongs to the Special Issue AI Applications and Modern Industry)
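As an illustration of the point-wise idea in this abstract, here is a minimal sketch of an MLP that maps a spatial coordinate plus a boundary-condition descriptor directly to a field value and is trained on scattered points rather than a mesh. The network size, the synthetic target field, and the 2-D setting are illustrative assumptions, not the authors' framework.

```python
# Minimal sketch: a coordinate MLP that maps (x, y) plus a boundary-condition
# code directly to a field value, trained mesh-free on scattered points.
# The target field is synthetic; sizes and settings are illustrative assumptions.
import torch
import torch.nn as nn

field_net = nn.Sequential(
    nn.Linear(2 + 1, 64), nn.Tanh(),   # (x, y) coordinate + scalar boundary-load code
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),                  # predicted field value (e.g., temperature or stress component)
)

coords = torch.rand(2048, 2)                              # mesh-free sample points in the domain
bc_code = torch.full((2048, 1), 0.5)                      # placeholder boundary-condition encoding
target = torch.sin(3 * coords[:, :1]) * coords[:, 1:]     # synthetic stand-in for a ground-truth field

opt = torch.optim.Adam(field_net.parameters(), lr=1e-3)
for _ in range(500):
    pred = field_net(torch.cat([coords, bc_code], dim=1))
    loss = nn.functional.mse_loss(pred, target)
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"final point-wise MSE: {loss.item():.4e}")
```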

29 pages, 1730 KB  
Article
Explaining Corporate Ratings Transitions and Defaults Through Machine Learning
by Nazário Augusto de Oliveira and Leonardo Fernando Cruz Basso
Algorithms 2025, 18(10), 608; https://doi.org/10.3390/a18100608 - 28 Sep 2025
Abstract
Credit rating transitions and defaults are critical indicators of corporate creditworthiness, yet their accurate modeling remains a persistent challenge in risk management. Traditional models such as logistic regression (LR) and structural approaches (e.g., Merton's model) offer transparency but often fail to capture nonlinear relationships, temporal dynamics, and firm heterogeneity. This study proposes a hybrid machine learning (ML) framework to explain and predict corporate rating transitions and defaults, addressing key limitations in the existing literature. We benchmark four classification algorithms—LR, Random Forest (RF), Extreme Gradient Boosting (XGBoost), and Support Vector Machines (SVM)—on a structured corporate credit dataset. Our approach integrates segment-specific modeling across rating bands, out-of-time validation to simulate real-world applicability, and SHapley Additive exPlanations (SHAP) values to ensure interpretability. The results demonstrate that ensemble methods, particularly XGBoost and RF, significantly outperform LR and SVM in predictive accuracy and early-warning capability. Moreover, SHAP analysis reveals differentiated drivers of rating transitions across credit quality segments, highlighting the importance of tailored monitoring strategies. This research contributes to the literature by bridging predictive performance with interpretability in credit risk modeling and offers practical implications for regulators, rating agencies, and financial institutions seeking robust, transparent, and forward-looking credit assessment tools.
(This article belongs to the Special Issue AI Applications and Modern Industry)
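As a concrete illustration of the XGBoost-plus-SHAP combination described in this abstract, here is a minimal sketch using synthetic data. The feature names (leverage, coverage, margin, size) and the binary default flag are placeholders, not the authors' dataset or pipeline.

```python
# Minimal sketch (not the authors' pipeline): a gradient-boosted classifier for a
# binary default flag, explained with SHAP values. Features and labels are synthetic.
import numpy as np
import shap
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 4))   # placeholder financial ratios: leverage, interest coverage, margin, log assets
y = (X[:, 0] - 0.8 * X[:, 1] + rng.normal(scale=0.5, size=n) > 1.0).astype(int)   # synthetic default flag

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
clf = XGBClassifier(n_estimators=200, max_depth=3, eval_metric="logloss").fit(X_tr, y_tr)

explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X_te)          # per-feature contribution to each prediction
mean_abs = np.abs(shap_values).mean(axis=0)        # simple global importance ranking
for name, imp in zip(["leverage", "coverage", "margin", "size"], mean_abs):
    print(f"{name:10s} mean |SHAP| = {imp:.3f}")
print(f"holdout accuracy: {clf.score(X_te, y_te):.3f}")
```

The same explainer can be run per rating band to mimic the segment-specific analysis mentioned in the abstract.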