Search Results (2,386)

Search Parameters:
Keywords = statistical forecast

24 pages, 3212 KB  
Article
Comparative Performance Analysis of Software-Based Restoration Techniques for NAVTEX Message
by Hoyeon Cho, Changui Lee and Seojeong Lee
J. Mar. Sci. Eng. 2025, 13(9), 1657; https://doi.org/10.3390/jmse13091657 - 29 Aug 2025
Abstract
Maritime transportation requires reliable navigational safety communications to ensure vessel safety and operational efficiency. The Maritime Single Window (MSW) enables vessels to submit all maritime data digitally without human intervention. NAVTEX (Navigational Telex) messages provide navigational warnings, meteorological warnings and forecasts, piracy, and search and rescue information that require integration into the automated MSW system. However, NAVTEX transmissions experience message corruption when Forward Error Correction (FEC) mechanisms fail, marking unrecoverable characters with asterisks. Current standards require discarding messages exceeding 4% error rates, resulting in safety information loss. Traditional human interpretation of corrupted messages creates limitations that prevent automated MSW integration. This paper presents the application of Masked Language Modeling (MLM) with Transformer encoders for automated NAVTEX message restoration. Our approach treats asterisk characters as masked tokens, enabling bidirectional context processing to reconstruct corrupted characters. We evaluated MLM against dictionary-matching and n-gram models using 69,658 NAVTEX messages with corruption ranging from 1% to 33%. MLM achieved an 85.4% restoration rate versus 44.4–64.0% for statistical methods. MLM maintained residual error rates below the 4% threshold for initial corruption up to 25%, while statistical methods exceeded this limit at 10%. This automated restoration capability supports MSW integration while preserving critical safety information during challenging transmission conditions. Full article
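As an illustrative aside: the dictionary-matching baseline the paper evaluates against can be sketched as matching each asterisk-corrupted word to same-length vocabulary entries. The mini-vocabulary and messages below are invented for illustration, not drawn from the NAVTEX corpus.

```python
def restore_word(corrupted: str, vocabulary: list[str]) -> str:
    """Fill '*' placeholders by matching against same-length vocabulary words."""
    candidates = [
        w for w in vocabulary
        if len(w) == len(corrupted)
        and all(c == "*" or c == wc for c, wc in zip(corrupted, w))
    ]
    # Return the unique match; otherwise leave the word unchanged (ambiguous/unknown).
    return candidates[0] if len(candidates) == 1 else corrupted

def restore_message(message: str, vocabulary: list[str]) -> str:
    return " ".join(
        restore_word(w, vocabulary) if "*" in w else w
        for w in message.split()
    )

# Hypothetical mini-vocabulary of NAVTEX terms
VOCAB = ["NAVAREA", "WARNING", "GALE", "STORM", "CANCEL"]
restored = restore_message("NAV*REA WARN*NG G*LE", VOCAB)
```

Unlike the MLM approach, this baseline cannot use cross-word context, which is one reason its restoration rate saturates well below the Transformer's.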

19 pages, 7434 KB  
Article
The Study on the Relation Between Rock Indentation Crater Morphology and Rock Mechanical Index Based on Indentation Experiments
by Zhenkun Wu, Hui Gao, Ying Yang, Songcheng Tan, Xiaohong Fang, Yule Hu and Longchen Duan
Appl. Sci. 2025, 15(17), 9410; https://doi.org/10.3390/app15179410 - 27 Aug 2025
Abstract
Understanding rock behavior under cutting tools is critical for enhancing cutting processes and forecasting rock behavior in engineering contexts. This study examines the link between mechanical properties and indentation crater morphology of six rocks using a conical indenter until initial fracture. Through indentation testing, mechanical properties (indentation stiffness index k and hardness index HI) were assessed, and crater morphology was analyzed using a 3D laser profilometer. The rocks were categorized into three groups based on specific energy: Class I (slate, shale), Class II (sandstone, marble), and Class III (granite, gneiss). The morphological features of their indentation craters were analyzed both quantitatively and qualitatively. The linear model was used to establish the relationship between crater morphology indices and mechanical properties, with model parameters determined by linear regression. Key findings include: (1) Fracture depth, cross-sectional area, and contour roundness are independent morphological indicators, serving as characteristic parameters for crater morphology, with qualitative and quantitative analyses showing consistency; (2) Post-classification linear fitting revealed statistically significant morphological prediction models, though patterns varied across rock categories due to inherent properties like structure and grain homogeneity; (3) Classification by specific energy revealed distinct mechanical and morphological differences, with significant linear relationships established for all three indicators in Classes II and III, but only roundness showing significance in Class I (non-significant for cross-sectional area and depth). However, all significant models exhibited limited explanatory power (R2 = 0.220–0.635), likely due to constrained sample sizes. Future studies should expand sample sizes to refine these findings. Full article
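The linear-regression step between morphology indices and mechanical properties can be sketched with a plain least-squares fit; the hardness and crater-area numbers below are hypothetical stand-ins for the paper's indices.

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a*x + b; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical data: hardness index HI vs. crater cross-sectional area
hi = [1.0, 2.0, 3.0, 4.0, 5.0]
area = [2.1, 3.9, 6.2, 7.8, 10.1]
slope, intercept, r2 = linear_fit(hi, area)
```

The paper's reported R-squared range (0.220 to 0.635) corresponds to much noisier data than this toy example, reflecting the small sample sizes it cautions about.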

22 pages, 4304 KB  
Article
Intelligent Early Warning System for Supplier Delays Using Dynamic IoT-Calibrated Probabilistic Modeling in Smart Engineer-to-Order Supply Chains
by Aicha Alaoua and Mohammed Karim
Appl. Syst. Innov. 2025, 8(5), 124; https://doi.org/10.3390/asi8050124 - 27 Aug 2025
Abstract
In increasingly complex Engineer-to-Order (EtO) supply chains, accurately predicting supplier delivery delays is essential for ensuring operational resilience. This study proposes an intelligent Internet of Things (IoT)-enhanced probabilistic framework for early warning and dynamic prediction of supplier lead times in smart manufacturing contexts. Within this framework, three novel Early Warning Systems (EWS) are introduced: the Baseline Probabilistic Alert System (BPAS) based on fixed thresholds, the Smart IoT-Calibrated Alert System (SIoT-CAS) leveraging IoT-driven calibration, and the Adaptive IoT-Driven Risk Alert System (AID-RAS) featuring real-time threshold adaptation. Supplier lead times are modeled using statistical distributions and dynamically adjusted with IoT data to capture evolving disruptions. A comprehensive Monte Carlo simulation was conducted across varying levels of lead time uncertainty (σ), alert sensitivity (Pthreshold), and delivery constraints (Lmax), generating over 1000 synthetic scenarios per configuration. The results highlight distinct trade-offs between predictive accuracy, sensitivity, and robustness: BPAS minimizes false alarms in stable environments, SIoT-CAS improves forecasting precision through IoT calibration, and AID-RAS maximizes detection capability and resilience under high-risk conditions. Overall, the findings advance theoretical understanding of adaptive, data-driven risk modeling in EtO supply chains and provide practical guidance for selecting appropriate EWS mechanisms based on operational priorities. Furthermore, they offer actionable insights for integrating predictive EWS into MES (Manufacturing Execution System) and digital control tower platforms, thereby contributing to both academic research and industrial best practices. Full article
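A minimal sketch of a fixed-threshold probabilistic alert in the spirit of the baseline BPAS system, assuming lognormally distributed lead times estimated by Monte Carlo; the log-mean, sigma, Lmax, and threshold values are illustrative, not taken from the paper.

```python
import random

def delay_probability(log_mean, sigma, l_max, n_sims=10_000, seed=42):
    """Monte Carlo estimate of P(lead time > L_max) under lognormal lead times."""
    rng = random.Random(seed)
    late = sum(rng.lognormvariate(log_mean, sigma) > l_max for _ in range(n_sims))
    return late / n_sims

def raise_alert(p_late, p_threshold):
    """Fixed-threshold alert rule: flag the supplier if the delay risk is too high."""
    return p_late > p_threshold

# Hypothetical configuration: log-mean 2.0 (about 7.4 days median), sigma 0.5, limit 10 days
p = delay_probability(2.0, 0.5, 10.0)
alert = raise_alert(p, p_threshold=0.2)
```

The IoT-calibrated variants described in the abstract would re-estimate the distribution parameters (and, for AID-RAS, the threshold itself) from streaming sensor data rather than fixing them in advance.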

18 pages, 2772 KB  
Article
Temperature Prediction Using Transformer–LSTM Deep Learning Models and Sarimax from a Signal Processing Perspective
by Celalettin Kişmiroğlu and Omer Isik
Appl. Sci. 2025, 15(17), 9372; https://doi.org/10.3390/app15179372 - 26 Aug 2025
Abstract
Recent developments in machine learning (ML), deep learning (DL), and statistical signal processing have led to substantial improvements in the accuracy of time series forecasting, particularly for environmental parameters such as temperature. The accuracy of air temperature prediction is not only vital for meteorological forecasting but also critically impacts agriculture, energy management, and environmental monitoring. In this study, a comprehensive modeling approach is proposed by incorporating both data-driven learning methods and classical signal processing techniques. Specifically, statistical models such as Seasonal AutoRegressive Integrated Moving Average with eXogenous regressors (SARIMAX) are evaluated alongside modern neural network architectures, including Long Short-Term Memory (LSTM) networks and Transformer-based attention mechanisms. The implemented models utilize key atmospheric variables—humidity, pressure, and past temperature values—to predict ambient temperature for future time horizons, such as one week and six months ahead. The SARIMAX model, which is grounded in digital signal processing theory, is particularly examined for its ability to capture seasonality and trend components in structured data. Meanwhile, deep learning models excel in learning complex, nonlinear temporal dependencies. Experimental results show that while LSTM performs well in short-term predictions (mean absolute error (MAE): 2.27, mean squared error (MSE): 6.63), the attention-based Transformer model is superior in capturing the predictions in the long term (MAE: 2.99, MSE: 14.92). SARIMAX, on the other hand, demonstrates a reliable performance in the short term compared to LSTM. These findings provide valuable insights into the strengths and limitations of each modeling approach, guiding future efforts in temperature forecasting and time series analysis. Full article
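As a point of reference for the model comparison above, a seasonal-naive forecast (repeat the value observed one season earlier) is the usual floor that SARIMAX, LSTM, and Transformer models must beat; the temperature data below are invented.

```python
def seasonal_naive_forecast(history, season_length, horizon):
    """Predict each future step with the value observed one season earlier."""
    return [history[-season_length + (h % season_length)] for h in range(horizon)]

def mean_absolute_error(actual, predicted):
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical daily temperatures with a weekly (period-7) pattern
history = [15, 16, 18, 17, 16, 14, 13] * 3      # three identical weeks
future = [15.5, 16.2, 18.1, 17.4, 15.8, 14.3, 13.2]
forecast = seasonal_naive_forecast(history, season_length=7, horizon=7)
mae = mean_absolute_error(future, forecast)
```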

22 pages, 828 KB  
Article
Stock Price Prediction Using FinBERT-Enhanced Sentiment with SHAP Explainability and Differential Privacy
by Linyan Ruan and Haiwei Jiang
Mathematics 2025, 13(17), 2747; https://doi.org/10.3390/math13172747 - 26 Aug 2025
Abstract
Stock price forecasting remains a central challenge in financial modeling due to the non-stationarity, noise, and high dimensionality of market dynamics, as well as the growing importance of unstructured textual information. In this work, we propose a multimodal prediction framework that combines FinBERT-based financial sentiment extraction with technical and statistical indicators to forecast short-term stock price movement. Contextual sentiment signals are derived from financial news headlines using FinBERT, a domain-specific transformer model fine-tuned on annotated financial text. These signals are aggregated and fused with price- and volatility-based features, forming the input to a gradient-boosted decision tree classifier (XGBoost). To ensure interpretability, we employ SHAP (SHapley Additive exPlanations), which decomposes each prediction into additive feature attributions while satisfying game-theoretic fairness axioms. In addition, we integrate differential privacy into the training pipeline to ensure robustness against membership inference attacks and protect proprietary or client-sensitive data. Empirical evaluations across multiple S&P 500 equities from 2018–2023 demonstrate that our FinBERT-enhanced model consistently outperforms both technical-only and lexicon-based sentiment baselines in terms of AUC, F1-score, and simulated trading profitability. SHAP analysis confirms that FinBERT-derived features rank among the most influential predictors. Our findings highlight the complementary value of domain-specific NLP and privacy-preserving machine learning in financial forecasting, offering a principled, interpretable, and deployable solution for real-world quantitative finance applications. Full article
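The SHAP additivity property the abstract relies on is easiest to see for a linear model, where the attributions have a closed form, phi_i = w_i * (x_i - E[x_i]); the weights and fused features below are hypothetical, and the paper's XGBoost model would use TreeSHAP rather than this closed form.

```python
def linear_shap(weights, x, background_means, bias=0.0):
    """Exact SHAP attributions for a linear model: phi_i = w_i * (x_i - E[x_i])."""
    phi = [w * (xi - mi) for w, xi, mi in zip(weights, x, background_means)]
    base_value = bias + sum(w * m for w, m in zip(weights, background_means))
    return phi, base_value

# Hypothetical fused features: [FinBERT sentiment, 5-day return, volatility]
weights = [0.8, 1.5, -0.6]
x = [0.4, 0.02, 0.25]
means = [0.0, 0.01, 0.30]
phi, base = linear_shap(weights, x, means)
prediction = base + sum(phi)   # additivity: attributions sum to the prediction
```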

29 pages, 4733 KB  
Article
Water Quality Index (WQI) Forecasting and Analysis Based on Neuro-Fuzzy and Statistical Methods
by Amar Lokman, Wan Zakiah Wan Ismail, Nor Azlina Ab Aziz and Anith Khairunnisa Ghazali
Appl. Sci. 2025, 15(17), 9364; https://doi.org/10.3390/app15179364 - 26 Aug 2025
Abstract
Water quality is crucial to the economy and ecology because a healthy aquatic ecosystem supports human survival and biodiversity. We have developed the Neuro-Adapt Fuzzy Strategist (NAFS) to improve water quality index (WQI) forecasting accuracy. The objective of the developed model is to achieve a balance by improving prediction accuracy while preserving high interpretability and computational efficiency. Neural networks and fuzzy logic improve the NAFS model’s flexibility and prediction accuracy, while its optimized backward pass improves training convergence speed and parameter update effectiveness, contributing to better learning performance. The normalized and partial derivative computations are refined to improve the model. NAFS is compared with ANN, Adaptive Neuro-Fuzzy Inference System (ANFIS), and current machine learning (ML) models such as LSTM, GRU, and Transformer based on performance evaluation metrics. NAFS outperforms ANFIS and ANN, with an MSE of 1.678 and an RMSE of 1.295. NAFS captures complicated water quality parameter interdependencies better than ANN and ANFIS using principal component analysis (PCA) and Pearson correlation. The performance comparison shows that NAFS outperforms all baseline models with the lowest MAE, MSE, RMSE and MAPE, and the highest R2, confirming its superior accuracy. PCA is employed to reduce data dimensionality and identify the most influential water quality parameters. It reveals that two principal components account for 72% of the total variance, highlighting key contributors to WQI and supporting feature prioritization in the NAFS model. The Breusch–Pagan test reveals heteroscedasticity in residuals, justifying the use of non-linear models over linear methods. The Shapiro–Wilk test indicates non-normality in residuals. This shows that the NAFS model can handle complex, non-linear environmental variables better than previous water quality prediction research. NAFS not only predicts water quality index values but also enhances WQI estimation. Full article
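The PCA result quoted above (two components explaining 72% of variance) boils down to eigenvalues of the covariance matrix. For two features the eigenvalues have a closed form, sketched below on an invented covariance matrix of standardized water-quality parameters.

```python
import math

def pca_explained_variance_2d(cov):
    """Eigenvalues of a 2x2 covariance matrix and explained-variance ratios."""
    (a, b), (_, c) = cov
    mean = (a + c) / 2
    half_gap = math.sqrt(((a - c) / 2) ** 2 + b * b)
    lam1, lam2 = mean + half_gap, mean - half_gap   # lam1 >= lam2
    total = lam1 + lam2
    return (lam1 / total, lam2 / total)

# Hypothetical covariance of two standardized water-quality parameters
cov = [[1.0, 0.7], [0.7, 1.0]]
ratios = pca_explained_variance_2d(cov)
```

With standardized features the total variance equals the number of features, so a strong off-diagonal correlation concentrates variance in the first component, exactly the effect the abstract exploits for feature prioritization.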
(This article belongs to the Special Issue AI in Wastewater Treatment)

10 pages, 304 KB  
Proceeding Paper
A Rapid, Fully Automated Denoising Method for Time Series Utilizing Wavelet Theory
by Livio Fenga
Eng. Proc. 2025, 101(1), 18; https://doi.org/10.3390/engproc2025101018 - 25 Aug 2025
Abstract
A wavelet-based noise reduction method for time series is proposed. Traditional denoising techniques often adopt a “trial-and-error” approach, which can prove inefficient and may result in suboptimal filtering outcomes. In contrast, our method systematically selects the most suitable wavelet function from a predefined set, along with its associated tuning parameters, to ensure an optimal denoising process. The denoised series produced by this approach maximizes a suitable objective function based on information-theoretic divergence. This is particularly significant in economic time series, which are frequently characterized by non-linear dynamics and erratic patterns, often influenced by measurement errors and various external disturbances. The method’s performance is evaluated using time series data derived from the Business Confidence Climate Survey, which is freely and publicly accessible via the World Wide Web through the Italian National Institute of Statistics. The results of our empirical analysis demonstrate the effectiveness of the proposed method in delivering robust filtering capabilities, adeptly distinguishing informative signals from noise, and successfully eliminating uninformative components from the time series. This capability not only enhances the clarity of the data, but also significantly improves the overall reliability of subsequent analyses, such as forecasting. Full article
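The paper selects the wavelet and tuning parameters by an information-theoretic criterion; the sketch below shows only the basic building block such a method optimizes over: a one-level Haar transform with soft thresholding of the detail coefficients.

```python
import math

def haar_forward(x):
    """One-level Haar transform: (approximation, detail) coefficients."""
    s = math.sqrt(2)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    s = math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s, (a - d) / s]
    return out

def soft_threshold(coeffs, t):
    """Shrink detail coefficients toward zero; small (noisy) ones vanish."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def denoise(x, t):
    """Threshold only the details, keeping the low-frequency approximation."""
    approx, detail = haar_forward(x)
    return haar_inverse(approx, soft_threshold(detail, t))
```

With threshold zero the transform reconstructs the input exactly; raising the threshold progressively averages out local fluctuations, which is the trade-off the paper's objective function automates.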

32 pages, 1750 KB  
Article
Study on the Evolution and Forecast of Agricultural Raw Material Exports in Emerging Economies in Central and Eastern Europe Using Statistical Methods
by Liviu Popescu, Mirela Găman, Laurențiu-Stelian Mihai, Magdalena Mihai and Cristian Ovidiu Drăgan
Agriculture 2025, 15(17), 1811; https://doi.org/10.3390/agriculture15171811 - 25 Aug 2025
Abstract
This study examines the evolution of agricultural raw material exports in seven emerging economies of Central and Eastern Europe (Romania, Poland, Slovakia, Croatia, Bulgaria, the Czech Republic, and Hungary) from 1995 to 2023 and provides forecasts for 2024–2026 using ARIMA models. The results indicate a general downward trend in the share of agricultural raw material exports within total exports, reflecting ongoing economic modernization and a structural shift toward higher value-added products and industrial sectors. Romania, Poland, and Hungary remain as significant players in the cereals market, while Slovakia and the Czech Republic show the most pronounced transitions toward non-agricultural industries. Croatia, however, follows an atypical trajectory, maintaining a relatively high share of agricultural exports. Statistical tests (Dickey–Fuller) confirm the non-stationarity of the initial series, necessitating differencing for ARIMA modeling. Correlation analyses reveal a synchronized regional dynamic, with strong links among Poland, Slovakia, the Czech Republic, and Bulgaria. Forecasts suggest continued decline or stabilization at low levels for most countries: Romania (0.45% in 2026), Poland (0.93%), Slovakia (0.62%), Bulgaria (0.51%), the Czech Republic (0.95%), and Hungary (0.53%), while Croatia is an exception, with a projected moderate increase to 4.19% in 2026. Although the share of raw agricultural exports is decreasing, the findings confirm that agriculture remains a strategic sector for food security and regional trade. The study recommends investments in processing, technological modernization, and export market diversification to strengthen the competitiveness and resilience of the agricultural sector in the context of global economic transformations. Full article
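The Dickey–Fuller result above motivates the differencing step of ARIMA: a trending series is made (approximately) stationary by modeling changes rather than levels. A minimal sketch, with invented export-share figures:

```python
def difference(series, lag=1):
    """First-order differencing, the 'I' step that makes an ARIMA series stationary."""
    return [series[i] - series[i - lag] for i in range(lag, len(series))]

# Hypothetical export shares (%) with a clear downward trend
shares = [5.0, 4.6, 4.3, 3.9, 3.6, 3.2]
diffed = difference(shares)
# The differenced series fluctuates around a constant level instead of trending.
```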
(This article belongs to the Section Agricultural Economics, Policies and Rural Management)

20 pages, 2346 KB  
Article
Synoptic-Scale Modulation of Surface O3, NO2, and SO2 by the North Atlantic Oscillation in São Miguel Island, Azores (2017–2021)
by Helena Cristina Vasconcelos, Ana Catarina Ferreira and Maria Gabriela Meirelles
Pollutants 2025, 5(3), 27; https://doi.org/10.3390/pollutants5030027 - 25 Aug 2025
Abstract
This study investigated the extent to which the North Atlantic Oscillation (NAO) modulated daily surface-level concentrations of ozone (O3), nitrogen dioxide (NO2), and sulfur dioxide (SO2) on São Miguel Island, Azores, between 2017 and 2021. Using validated data from two air quality monitoring stations, São Gonçalo (SG) (urban background) and Ribeira Grande (RG) (semi-urban), we applied descriptive statistics, seasonal Pearson correlations, and robust linear regression models to assess pollutant responses to NAO variability. The results reveal a significant and positive association between NAO phases and O3 concentrations, particularly in spring and summer. NO2 levels exhibited a strong negative correlation with NAO during summer in urban settings, indicating enhanced atmospheric dispersion. In contrast, SO2 concentrations showed weak and inconsistent relationships with the NAO index, likely reflecting the influence of local and episodic sources. These findings demonstrate that large-scale synoptic drivers such as the NAO can significantly modulate pollutant dynamics in island environments and should be integrated into air quality forecasting and environmental health planning strategies in small island territories. Full article
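The seasonal Pearson correlations reported above reduce to a single standard formula; the NAO and O3 values below are invented to illustrate the strongly positive spring/summer association the study reports.

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical daily NAO index values and O3 concentrations (ug/m3)
nao = [-1.2, -0.5, 0.1, 0.8, 1.5]
o3 = [40.0, 46.0, 52.0, 60.0, 66.0]
r = pearson_r(nao, o3)
```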

30 pages, 651 KB  
Article
A Fusion of Statistical and Machine Learning Methods: GARCH-XGBoost for Improved Volatility Modelling of the JSE Top40 Index
by Israel Maingo, Thakhani Ravele and Caston Sigauke
Int. J. Financial Stud. 2025, 13(3), 155; https://doi.org/10.3390/ijfs13030155 - 25 Aug 2025
Abstract
Volatility modelling is a key feature of financial risk management, portfolio optimisation, and forecasting, particularly for market indices such as the JSE Top40 Index, which serves as a benchmark for the South African stock market. This study investigates volatility modelling of the JSE Top40 Index log-returns from 2011 to 2025 using a hybrid approach that integrates statistical and machine learning techniques through a two-step approach. The ARMA(3,2) model was chosen as the optimal mean model, using the auto.arima() function from the forecast package in R (version 4.4.0). Several alternative variants of GARCH models, including sGARCH(1,1), GJR-GARCH(1,1), and EGARCH(1,1), were fitted under various conditional error distributions (i.e., STD, SSTD, GED, SGED, and GHD). The choice of the model was based on AIC, BIC, HQIC, and LL evaluation criteria, and ARMA(3,2)-EGARCH(1,1) was the best model according to the lowest evaluation criteria. Residual diagnostic results indicated that the model adequately captured autocorrelation, conditional heteroskedasticity, and asymmetry in JSE Top40 log-returns. Volatility persistence was also detected, confirming the persistence attributes of financial volatility. Thereafter, the ARMA(3,2)-EGARCH(1,1) model was coupled with XGBoost using standardised residuals extracted from ARMA(3,2)-EGARCH(1,1) as lagged features. The data was split into training (60%), testing (20%), and calibration (20%) sets. Based on the lowest values of forecast accuracy measures (i.e., MASE, RMSE, MAE, MAPE, and sMAPE), along with prediction intervals and their evaluation metrics (i.e., PICP, PINAW, PICAW, and PINAD), the hybrid model captured residual nonlinearities left by the standalone ARMA(3,2)-EGARCH(1,1) and demonstrated improved forecasting accuracy. The hybrid ARMA(3,2)-EGARCH(1,1)-XGBoost model outperforms the standalone ARMA(3,2)-EGARCH(1,1) model across all forecast accuracy measures. 
This highlights the robustness and suitability of the hybrid ARMA(3,2)-EGARCH(1,1)-XGBoost model for financial risk management in emerging markets and signifies the strengths of integrating statistical and machine learning methods in financial time series modelling. Full article
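The paper's selected variance model is EGARCH(1,1); for brevity, the sketch below shows the simpler standard GARCH(1,1) recursion that all the variants build on, with invented parameter values and returns. Persistence corresponds to alpha + beta being close to one.

```python
def garch_variance(returns, omega, alpha, beta):
    """GARCH(1,1) conditional variance: sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}."""
    sigma2 = [omega / (1 - alpha - beta)]        # start at the unconditional variance
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

# Hypothetical daily log-returns (%); alpha + beta = 0.98 implies high persistence
rets = [0.5, -1.8, 0.2, 0.9, -0.4]
sig2 = garch_variance(rets, omega=0.05, alpha=0.08, beta=0.90)
```

In the paper's hybrid step, the standardized residuals r_t / sqrt(sigma2_t) from the fitted model become lagged features for XGBoost.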

28 pages, 5941 KB  
Article
Assessing Climate Change Impacts on Spring Discharge in Data-Sparse Environments Using a Combined Statistical–Analytical Method: An Example from the Aggtelek Karst Area, Hungary
by Attila Kovács, Csaba Ilyés, Musab A. A. Mohammed and Péter Szűcs
Water 2025, 17(17), 2507; https://doi.org/10.3390/w17172507 - 22 Aug 2025
Abstract
This paper introduces a methodology for forecasting spring hydrographs based on projections from regional climate models. The primary study objective was to evaluate how climate change may affect spring discharge. A statistical–analytical modeling approach was developed and applied to the Jósva spring catchment in the Aggtelek Karst region of Hungary. Historical data served to establish a regression relationship between rainfall and peak discharge. This approach is particularly useful for predicting discharge in cases where only historical rainfall data are available for calibration. Baseflow recession was analyzed using a two-component exponential model, with hydrograph decomposition and parameter optimization performed on the master recession curve. Future discharge time series were generated using rainfall data from two selected regional climate model scenarios. Both scenarios suggest a decline in baseflow discharge during different periods of the 21st century. The findings indicate that climate change is likely to intensify hydrological extremes in the coming decades, irrespective of whether moderate or high CO2 emission scenarios unfold. Full article
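The two-component exponential recession model mentioned above superposes a fast (conduit) and a slow (matrix baseflow) reservoir; the parameter values below are invented for illustration, not the calibrated Jósva values.

```python
import math

def recession_discharge(t, q1, a1, q2, a2):
    """Two-component master recession: Q(t) = Q1*exp(-a1*t) + Q2*exp(-a2*t)."""
    return q1 * math.exp(-a1 * t) + q2 * math.exp(-a2 * t)

# Hypothetical parameters: fast conduit drainage plus slow matrix baseflow (t in days)
q = [recession_discharge(t, q1=80.0, a1=0.30, q2=20.0, a2=0.01)
     for t in range(0, 61, 10)]
# Discharge drops steeply while the conduit empties, then follows the slow component.
```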
(This article belongs to the Special Issue Climate Impact on Karst Water Resources)

36 pages, 1871 KB  
Article
Sentiment-Driven Statistical Modelling of Stock Returns over Weekends
by Pablo Kowalski Kutz and Roman N. Makarov
Computation 2025, 13(8), 201; https://doi.org/10.3390/computation13080201 - 21 Aug 2025
Abstract
We propose a two-stage statistical learning framework to investigate how financial news headlines posted over weekends affect stock returns. In the first stage, Natural Language Processing (NLP) techniques are used to extract sentiment features from news headlines, including FinBERT sentiment scores and Impact Probabilities derived from Logistic Regression models (Binomial, Multinomial, and Bayesian). These Impact Probabilities estimate the likelihood that a given headline influences the stock’s opening price on the following trading day. In the second stage, we predict over-weekend log returns using various sets of covariates: sentiment-based features, traditional financial indicators (e.g., trading volumes, past returns), and headline counts. We evaluate multiple statistical learning algorithms—including Linear Regression, Polynomial Regression, Random Forests, and Support Vector Machines—using cross-validation and two performance metrics. Our framework is demonstrated using financial news from MarketWatch and stock data for Apple Inc. (AAPL) from 2014 to 2023. The results show that incorporating sentiment features, particularly Impact Probabilities, improves predictive accuracy. This approach offers a robust way to quantify and model the influence of qualitative financial information on stock performance, especially in contexts where markets are closed but news continues to develop. Full article
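The two quantities at the core of the framework are the over-weekend log return and a headline-level Impact Probability from logistic regression. The sketch below uses invented prices and invented logistic coefficients; the paper fits these from labeled headline data.

```python
import math

def over_weekend_log_return(friday_close, monday_open):
    """Log return realized between the Friday close and the Monday open."""
    return math.log(monday_open / friday_close)

def impact_probability(sentiment_score, w=2.0, b=-0.5):
    """Hypothetical logistic model mapping a headline sentiment score to the
    probability that the headline moves the next trading day's open."""
    z = w * sentiment_score + b
    return 1.0 / (1.0 + math.exp(-z))

r = over_weekend_log_return(friday_close=190.0, monday_open=193.8)
p = impact_probability(0.75)
```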
(This article belongs to the Section Computational Social Science)

19 pages, 3081 KB  
Article
Temporal and Statistical Insights into Multivariate Time Series Forecasting of Corn Outlet Moisture in Industrial Continuous-Flow Drying Systems
by Marko Simonič and Simon Klančnik
Appl. Sci. 2025, 15(16), 9187; https://doi.org/10.3390/app15169187 - 21 Aug 2025
Abstract
Corn drying is a critical post-harvest process to ensure product quality and compliance with moisture standards. Traditional optimization approaches often overlook dynamic interactions between operational parameters and environmental factors in industrial continuous-flow drying systems. This study integrates statistical analysis and deep learning to predict outlet moisture content, leveraging a dataset of 3826 observations from an operational dryer. The effects of inlet moisture, target air temperature, and material discharge interval on the thermal behavior of the system were evaluated through linear regression and t-tests, which provided interpretable insights into process dependencies. Three neural network architectures (LSTM, GRU, and TCN) were benchmarked for multivariate time-series forecasting of outlet corn moisture, with hyperparameters optimized using grid search to ensure fair performance comparison. Results demonstrated GRU’s superior performance in terms of absolute deviations, achieving the lowest mean absolute error (MAE = 0.304%) and competitive mean squared error (MSE = 0.304%), compared to LSTM (MAE = 0.368%, MSE = 0.291%) and TCN (MAE = 0.397%, MSE = 0.315%). While GRU excelled in average prediction accuracy, LSTM’s lower MSE highlighted its robustness against extreme deviations. The hybrid methodology bridges statistical insights for interpretability with deep learning’s dynamic predictive capabilities, offering a scalable framework for real-time process optimization. By combining traditional analytical methods (e.g., regression and t-tests) with deep learning-driven forecasting, this work advances intelligent monitoring and control of industrial drying systems, enhancing process stability, ensuring compliance with moisture standards, and indirectly supporting energy efficiency by reducing over-drying and enabling more consistent operation. Full article
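The GRU architecture that performed best above can be illustrated by its cell update: an update gate z blends the previous state with a candidate state, and a reset gate r controls how much history feeds the candidate. The scalar weights and inputs below are invented, not the trained model.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_cell(x, h_prev, wz, uz, wr, ur, wh, uh):
    """Scalar GRU update. z: update gate, r: reset gate, h_tilde: candidate state."""
    z = sigmoid(wz * x + uz * h_prev)
    r = sigmoid(wr * x + ur * h_prev)
    h_tilde = math.tanh(wh * x + uh * (r * h_prev))
    # New state is a convex combination, so it stays bounded in (-1, 1).
    return (1 - z) * h_prev + z * h_tilde

# Hypothetical scalar weights; inputs could be normalized inlet-moisture readings
h = 0.0
for x in [0.2, 0.5, 0.1, 0.4]:
    h = gru_cell(x, h, wz=1.0, uz=0.5, wr=1.0, ur=0.5, wh=1.0, uh=0.8)
```

Having fewer gates than an LSTM cell, the GRU trains faster while retaining gated memory, which is consistent with its strong MAE result above.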

32 pages, 706 KB  
Review
Corporate Failure Prediction: A Literature Review of Altman Z-Score and Machine Learning Models Within a Technology Adoption Framework
by Christoph Braunsberger and Ewald Aschauer
J. Risk Financial Manag. 2025, 18(8), 465; https://doi.org/10.3390/jrfm18080465 - 20 Aug 2025
Abstract
Research on corporate failure prediction has focused on increasing models' statistical accuracy, most recently via the introduction of a variety of machine learning (ML)-based models, often overlooking their practical appeal and potential adoption barriers in the context of corporate management. This literature review compares ML models with the classic, widely accepted Altman Z-score through a technology adoption lens. We map how technological features, organizational readiness, environmental pressure, and user perceptions shape adoption using an integrated technology adoption framework that combines the Technology–Organization–Environment framework with the Technology Acceptance Model. The analysis shows that Z-score models offer simplicity, interpretability, and low cost, suiting firms with limited analytical resources, whereas ML models deliver superior accuracy and adaptability but require advanced data infrastructure, specialized expertise, and regulatory clarity. By linking the models' characteristics with adoption determinants, the study clarifies when each model is most appropriate and sets a research agenda for long-horizon forecasting, explainable artificial intelligence, and context-specific model design. These insights help managers choose failure prediction tools that fit their strategic objectives and implementation capacity. Full article
(This article belongs to the Section Business and Entrepreneurship)
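The simplicity that the review attributes to the Altman Z-score comes from its being a fixed linear combination of five financial ratios. The function below is a minimal sketch of the classic 1968 formulation for public manufacturing firms, with the conventional safe/grey/distress cut-offs; the input names are illustrative, and other Z-score variants (Z', Z'') use different coefficients.

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Classic Altman (1968) Z-score for public manufacturing firms."""
    x1 = working_capital / total_assets        # liquidity
    x2 = retained_earnings / total_assets      # cumulative profitability
    x3 = ebit / total_assets                   # operating efficiency
    x4 = market_value_equity / total_liabilities  # leverage
    x5 = sales / total_assets                  # asset turnover
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def z_zone(z):
    """Conventional interpretation bands for the classic Z-score."""
    if z > 2.99:
        return "safe"
    if z >= 1.81:
        return "grey"
    return "distress"
```

The contrast with ML models is visible here: every coefficient and threshold is fixed and inspectable, which is exactly the interpretability and low-cost appeal the review highlights for resource-constrained firms.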

15 pages, 1475 KB  
Article
Using Neural Networks to Predict the Frequency of Traffic Accidents by Province in Poland
by Piotr Gorzelańczyk, Jacek Zabel and Edgar Sokolovskij
Appl. Sci. 2025, 15(16), 9108; https://doi.org/10.3390/app15169108 - 19 Aug 2025
Abstract
Road traffic fatalities remain a significant global issue, despite a gradual decline in recent years. Although the number of accidents has decreased—partly due to reduced mobility during the pandemic—the figures remain alarmingly high. To further reduce these numbers, it is crucial to identify regions with the highest accident rates and predict future trends. This study aims to forecast traffic accident occurrences across Poland’s provinces. Using official police data on annual accident statistics, we analyzed historical trends and applied predictive modeling in Statistica to estimate accident rates from 2022 to 2040. Several neural network models were employed to generate these projections. The findings indicate that a significant reduction in road accidents is unlikely in the near future, with rates expected to stabilize rather than decline. The accuracy of predictions was influenced by the random sampling distribution used in model training. Specifically, a 70-15-15 split (70% training, 15% testing, and 15% validation) yielded an average error of 1.75%, and an 80-10-10 split reduced the error to 0.63%, demonstrating the impact of sample allocation on predictive performance. These results highlight the importance of dataset partitioning in accident forecasting models. Full article
(This article belongs to the Special Issue Simulations and Experiments in Design of Transport Vehicles)
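The 70-15-15 and 80-10-10 partitions whose effect on prediction error the study measures can be reproduced with a simple shuffled split. The helper below is a minimal sketch, not the authors' Statistica workflow; the fixed seed and the choice to give the validation set whatever remains after the train and test slices are assumptions for illustration.

```python
import random

def split_dataset(records, train_frac=0.70, test_frac=0.15, seed=42):
    """Shuffle records and split into (train, test, validation) lists.
    The validation set receives the remainder, so fractions need not
    divide the data exactly."""
    assert 0.0 < train_frac + test_frac < 1.0
    shuffled = records[:]                      # copy; leave input untouched
    random.Random(seed).shuffle(shuffled)      # reproducible shuffle
    n = len(shuffled)
    n_train = int(n * train_frac)
    n_test = int(n * test_frac)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_test],
            shuffled[n_train + n_test:])
```

Because the shuffle is seeded, repeated runs produce the same partition; comparing splits such as `split_dataset(data, 0.70, 0.15)` against `split_dataset(data, 0.80, 0.10)` isolates the allocation effect the study reports, rather than sampling noise.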
