Abstract
Accurate prediction of supply chain performance, particularly profitability as a key indicator of economic sustainability, is essential for data-driven decision-making in Industry 4.0-enabled sustainable supply chains. Traditional machine learning models often underperform on high-dimensional, nonlinear operational data, exhibiting instability, weak generalization, and sensitivity to suboptimal hyperparameter configurations. To address these limitations, this study introduces a novel Salp Swarm Algorithm with Local Escaping Operator (SSALEO) to optimize XGBOOST for sustainable supply chain profit prediction. The theoretical innovation lies in the integration of the LEO, which dynamically perturbs stagnant solutions to enhance convergence reliability, robustness, and interpretability compared with conventional metaheuristic–ML hybrids. This enhanced metaheuristic optimizer fine-tunes XGBOOST to deliver highly accurate predictions of supply chain profit, a critical dimension of economic sustainability. Evaluated on real-world supply chain datasets, SSALEO-XGBOOST achieves a coefficient of determination (R²) of 0.985 and significantly outperforms benchmark models on the error metrics Root Mean Squared Error (RMSE), Mean Squared Error (MSE), Maximum Error (ME), and Relative Absolute Error (RAE). By leveraging the enhanced optimizer, the proposed SSALEO-XGBOOST framework achieves superior predictive accuracy and stability, enabling more consistent profit estimation and performance forecasting. For decision-makers in industrial environments, the framework offers a practical tool to support data-driven sustainability assessment and digital transformation strategies, fostering intelligent and resilient industrial ecosystems.
1. Introduction
The combination of information technology innovations and globalization has fundamentally changed how companies approach supply chain management (SCM) [1]. Increasingly, the focus of competition has shifted from individual firms to entire supply chains. SCM refers to the strategic coordination of functions from sourcing and procurement to production and delivery in order to provide goods, services, and information that meet customers’ needs and contribute to long-term competitive advantage [2]. As firms navigate increasing complexity, cost pressures, and shifting customer expectations, the ability to adapt through innovation and data-driven decision-making becomes essential. The ongoing evolution of SCM is marked by the emergence of disruptive technologies that are transforming traditional models and redefining industry standards. These technological shifts are increasingly extending into the energy domain, where integrated energy systems enable flexible, low-carbon operations that underpin manufacturing ecosystems [3]. With the advent of big data, supply chains are generating vast amounts of transactional and operational data across different sectors. Unlike in the past, when decision-makers grappled with limited data, today’s challenge lies in organizing, analyzing, and extracting value from enormous, often unstructured, datasets [4,5]. This data represents a critical resource for improving decision-making in areas such as process design, operational control, and profit prediction. The ability to draw meaningful insights from this data has become a key differentiator for organizations. Traditional decision support systems are often inadequate for handling the scale and complexity of modern supply chain environments, and these conventional methods struggle to address the nonlinear relationships that are common in real-world supply chains [6].
Additionally, they have limitations in handling vast and often unstructured data generated across different sectors of the supply chain.
Traditional methods for supply chain demand prediction long served as foundational approaches before the advent of machine learning. Wen [7] applied both the Autoregressive Integrated Moving Average (ARIMA) model and Grey Models (GM) for shipment forecasting in collaborative transportation management. The study concluded that GM, particularly under uncertainty, offered greater robustness, although it lacked adaptability to dynamic seasonality. In response to such limitations, Tseng et al. [8] developed a hybrid GM to forecast time series data, demonstrating improved accuracy over the conventional GM; however, the model’s efficacy was constrained by its need for manual parameter tuning. Quartey-Papafio et al. [9] compared ARIMA with GM for forecasting cocoa production, observing superior trend accuracy in GM, while ARIMA’s sensitivity to structural breaks proved a significant drawback. Similarly, Jia et al. [10] integrated GM with backpropagation neural networks and ARIMA for demand forecasting in intelligent supply chains. Their hybrid approach yielded enhanced short-term predictive performance, although it remained reliant on data preprocessing. Lastly, Xia and Wong [11] applied exponential smoothing, ARIMA, and GM to fashion retailing, concluding that discrete grey forecasting models were more robust in handling the cyclical volatility characteristic of the industry, albeit at the cost of increased model complexity. Collectively, these studies highlight the strengths and persistent shortcomings of traditional statistical methods in addressing the intricacies of modern supply chain environments, underscoring the need for more adaptive, nonlinear-capable forecasting frameworks.
In contrast, machine learning techniques are specifically designed to process and analyze such complex datasets effectively [12,13,14,15,16]. Artificial Intelligence (AI), particularly Machine Learning (ML), has become a powerful solution for navigating the intricacies of modern supply chains [17]. ML approaches are prominent subdisciplines within AI that autonomously discover patterns in data using large datasets [18]. Among these, ensemble learning algorithms have proven especially effective, as they improve a model’s predictive accuracy and generalization by aggregating the outputs of multiple base learners. XGBOOST, an advanced implementation of the gradient boosting ensemble method that emphasizes speed and accuracy, has recently attracted significant interest and is widely used across various tasks [19]. The XGBOOST model is selected in this study for its exceptional efficiency and performance in predictive tasks, leveraging parallel computation for faster processing [20]. However, like other machine learning models, XGBOOST depends on the appropriate selection of hyperparameters [21]. While XGBOOST generally delivers superior performance in terms of speed and accuracy, its large number of parameters can lead to overfitting, especially when they are not optimally tuned. Moreover, relying on manual tuning, which often draws on expert intuition, is a time-consuming method and is known to increase the complexity of achieving optimal results, further complicating the modeling process [22]. Traditional grid searches systematically assess a predetermined set of parameter combinations but exhibit poor scalability with increasing dimensions and inadequately investigate interactions across settings. Random searches provide broad exploration of the search space but assess several inferior parameter settings without leveraging promising areas [23]. 
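The scalability gap between these two search strategies can be sketched in a few lines. The parameter names and ranges below are illustrative, not this study's exact search space:

```python
import itertools
import random

# Hypothetical XGBoost-style search space (names and ranges are illustrative).
grid = {
    "learning_rate": [0.01, 0.05, 0.1, 0.3],
    "max_depth": [3, 5, 7, 10],
    "n_estimators": [50, 100, 150],
    "subsample": [0.5, 0.75, 1.0],
}

# Grid search: the budget is the full Cartesian product of all settings.
grid_trials = list(itertools.product(*grid.values()))
print(len(grid_trials))  # 4 * 4 * 3 * 3 = 144 model fits

# Random search: a fixed budget of independent draws, regardless of how
# many dimensions are added to the space.
rng = random.Random(42)
random_trials = [{k: rng.choice(v) for k, v in grid.items()} for _ in range(30)]
print(len(random_trials))  # 30 model fits
```

Adding a fifth four-valued hyperparameter would multiply the grid budget to 576 fits while the random budget stays at 30; yet neither strategy reuses information from promising regions, which is precisely the gap metaheuristic optimizers address.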
Metaheuristic algorithms (MAs) have gained prominence for addressing these challenges as intelligent optimizers that efficiently navigate complex, non-convex hyperparameter landscapes [24]. Unlike exhaustive search methods, MAs use bio-inspired strategies to balance exploration and exploitation, enabling faster convergence to near-optimal solutions.
The Salp Swarm Algorithm (SSA), introduced by Mirjalili et al. [25], is a metaheuristic that simulates the swarming behavior of salps in oceanic currents to navigate complex search spaces. While SSA is valued for its global exploration capability and simplicity, it suffers from delayed convergence and a tendency to stagnate in local optima, limiting its effectiveness in high-dimensional, complex search landscapes. Moreover, the No Free Lunch (NFL) theorem [26,27] emphasizes that a single algorithm cannot be superior in all optimization tasks. This has driven researchers to consistently refine and develop algorithms tailored to different optimization challenges [28]. Despite the growing integration of digital technologies into supply chain operations, accurately evaluating sustainability performance under the dynamic and data-intensive conditions of Industry 4.0 remains a major challenge. Existing statistical and machine-learning methods, such as ARIMA, Grey Models, and conventional ensemble learners, often struggle to capture nonlinear dependencies, adapt to rapidly shifting production contexts, and maintain robustness across heterogeneous datasets. These shortcomings limit their effectiveness for sustainability assessment, where predictive stability and interpretability are essential for operational and environmental decision-making. To bridge this gap, the present study introduces an augmented optimization framework, SSALEO-XGBOOST, that enhances convergence reliability and resilience against local minima, thereby improving predictive accuracy and generalization. This approach not only strengthens the analytical foundation for profit and performance prediction but also supports progress toward Sustainable Development Goals by enabling data-driven decision-making across SCM. Although this study emphasizes the development of an augmented optimization model, its motivation lies in strengthening the economic pillar of sustainability through predictive analytics. 
Profit prediction functions as a measurable proxy for evaluating operational efficiency and as a cost-optimization factor closely tied to the financial dimensions of sustainability in Industry 4.0 contexts. By providing reliable economic forecasts, the proposed SSALEO-XGBOOST framework supports evidence-based sustainability planning and strategic resource management. In this study, sustainable performance is defined in accordance with the triple bottom line (TBL) framework [29], which conceptualizes sustainability as the integration of economic, environmental, and social objectives. Given the focus of the current analysis, we operationalize the economic dimension of sustainability represented by profit as a measurable indicator of operational efficiency and resource utilization within supply chains. This study adopts the Salp Swarm Algorithm with Local Escaping Operator (SSALEO) developed by Qaraad et al. [30], which incorporates a dynamic perturbation mechanism that improves escaping stagnation phases during the optimization process. This modification enables the swarm to escape local optima more efficiently, improving convergence reliability and solution quality. To the best of our knowledge, this study represents the first application of SSALEO to hyperparameter optimization of XGBOOST for supply chain profit prediction. This integration is especially significant because, despite XGBOOST’s proven performance in predictive modeling, its effectiveness is highly sensitive to hyperparameter selection. This study addresses that gap by leveraging SSALEO to adaptively and efficiently tune XGBOOST’s parameters, mitigating overfitting and improving the model’s generalization performance. The novelty of this research resides not only in the methodological advancement of combining a metaheuristic algorithm with a high-performance ensemble learner, but also in its practical application to a critical and underexplored area, supply chain profit prediction. 
The main contributions of this study include:
- To develop an augmented algorithm–optimized XGBOOST model that integrates the SSALEO for accurate and robust prediction of supply chain performance.
- To evaluate the predictive accuracy, convergence stability, and generalization capability of the proposed SSALEO-XGBOOST model in comparison with conventional machine-learning and metaheuristic–ML hybrid algorithms.
- To provide actionable insights for the integration of Industry 4.0 technologies, enabling data-driven decision-making that supports efficient and resilient supply chain management.
The study is structured as follows: Section 2 summarizes related literature on hybrid XGBOOST models. Section 3 then discusses eXtreme Gradient Boosting (XGBOOST) and the proposed hybrid model, SSALEO-XGBOOST, which combines SSA and the LEO to optimize XGBOOST’s hyperparameters. Section 4 discusses the experimental methodology and findings, benchmark validation, and the model’s application to real supply chain profit prediction. Section 5 concludes by summarizing key findings and suggesting directions for future research.
2. Related Works
In recent years, the integration of machine learning with metaheuristic optimization has gained significant traction in supply chain management, particularly for tasks such as prediction and decision support. Numerous studies have sought to enhance the XGBOOST algorithm, drawn by its robustness and strong performance. Specifically, Lu et al. proposed the combination of Weighted K-Nearest Neighbors (WKNN) and XGBOOST, with hyperparameters optimized using a Genetic Algorithm (GA) [31]. The method achieved improved accuracy, reducing the average positioning error to 1.22 m. Similarly, Nabavi et al. proposed a hybrid model that integrates XGBOOST with the Grey Wolf Optimizer (GWO) for back-break prediction in blasting operations [32]. Using data from 90 blasts in the Chadormalu iron ore mine, the GWO-XGBOOST model outperformed other models, achieving the highest accuracy and predictive performance. Mohiuddin et al. introduced a Modified Wrapper-based Whale Sine–Cosine Algorithm (MWWSCA) with a Weighted Extreme Gradient Boosting classifier [33]. The hybrid approach integrates feature selection via MWWSCA to enhance prediction accuracy by avoiding local optima and balancing exploration and exploitation. To address class imbalance and improve classification performance for both binary and multi-attack scenarios, a Weighted XGBOOST model with loss function regularization was applied. The proposed algorithm demonstrated superior results on several key metrics. Kazemi et al. proposed a hybrid XGBOOST model optimized with Harris Hawk Optimization (HHO), random search, grid search, and the Dragonfly Algorithm (DA) [34]. Tested on data from copper mines, the HHO-XGBOOST model achieved superior accuracy (R² = 0.993). Mosa et al. introduced a hybrid strategy combining an Enhanced Non-Dominated Sorting Genetic Algorithm II (NSGA-II) and XGBOOST to optimize text classification [35]. Using the L-HSAB dataset, their multi-objective model maximized classification accuracy while minimizing feature count.
By integrating Tabu search with NSGA-II, the approach effectively reduced feature redundancy and enhanced computational efficiency, outperforming baseline and deep learning techniques. Xi et al. introduced a machine learning model integrating XGBOOST with three metaheuristic algorithms: Whale Optimization Algorithm (WOA), Flower Pollination Algorithm (FPA), and GWO [36]. Using experimental data under various environmental exposures and tensile stress conditions, the model showed reliable accuracy across evaluation metrics. Krishna et al. introduced a machine learning model combining Ant Colony Optimization (ACO) and XGBOOST (ACXG) for early detection of two diseases [37]. The model outperformed compared methods, enhancing predictive performance and offering significant improvements in healthcare diagnostics. Ozcan et al. proposed a hybrid model combining Principal Component Analysis (PCA), XGBOOST, and Genetic Algorithm (GA) for heart disease prediction [38]. PCA is used for feature selection, XGBOOST for classification, and GA for parameter optimization. The model outperforms XGBOOST, Artificial Neural Network (ANN), and Support Vector Machine (SVM). Qiu et al. proposed a hybrid framework that leverages both XGBOOST and Sand Cat Swarm Optimization (SCSO) to predict short-term rockburst damage [39]. Using data from 254 rockburst cases, the model achieved 88.46% accuracy, outperforming other models. Key factors like Peak Particle Velocity (PPV) and Geological Structure (GS) were found to be significant predictors. The study of Zhang et al. proposes a hybrid ensemble model that incorporates Random Forest (RF), GWO, and XGBOOST, with GWO optimizing hyperparameters [40]. The RF-GWO-XGBOOST model outperformed standalone RF and XGBOOST models. Beyond algorithmic innovation, effective sustainability transformation requires integrating predictive analytics with strategic decision-making in Industry 4.0 ecosystems. 
Recent studies demonstrate that data-driven intelligence can guide sustainability transitions and implementation. Iqbal et al. [41] developed a strategic framework for circular-economy solutions supporting net-zero carbon in China’s construction sector, while Iqbal et al. [42] used ISM-MICMAC analysis to identify top-management support and international pressure as factors influencing the adoption of energy-efficient supply chain strategic policies in construction supply chains. Furthermore, Iqbal et al. [43] underscored green-supply-chain management challenges and digital adoption barriers. Together, these works reveal that predictive modeling must extend beyond accuracy to enable carbon-neutral operations, supplier sustainability, and policy alignment. The present study contributes to this strategic agenda by embedding advanced optimization within an interpretable predictive framework. While XGBOOST has demonstrated strong predictive performance across various domains, its use in profit prediction within supply chain systems remains limited. In this study, a hybrid learning framework, SSALEO-XGBOOST, is introduced to enhance prediction accuracy by combining the XGBOOST model with a modified SSA enhanced by the LEO. The proposed SSALEO-XGBOOST model aims to improve profit prediction accuracy, thereby facilitating better-informed, data-driven decision-making in supply chain management.
3. Methodology
3.1. Salp Swarm Algorithm (SSA)
The Salp Swarm Algorithm (SSA), introduced by Mirjalili et al., is a nature-inspired metaheuristic optimization method modeled on the swarming dynamics of salps in the ocean [25]. Salps form a chain-like structure to navigate and forage, with a leader at the front and followers forming a sequence. This biologically inspired mechanism is abstracted to address complex optimization problems by balancing exploration and exploitation. SSA models the optimization process by splitting the population into two parts: the leader and the followers. The leader guides the chain toward the food source, while the followers adjust their positions following the salp preceding them, emulating natural coordination and gradual convergence. Let the position of the $i$-th salp in a $d$-dimensional search space be denoted as a vector $x^i = (x_1^i, \ldots, x_d^i)$. Equation (1) specifies how the position of the leader salp, which is the first in the chain, is updated:

$$x_j^1 = \begin{cases} F_j + c_1\left[(ub_j - lb_j)\, c_2 + lb_j\right], & c_3 \ge 0.5, \\ F_j - c_1\left[(ub_j - lb_j)\, c_2 + lb_j\right], & c_3 < 0.5, \end{cases} \quad (1)$$

where $x_j^1$ is the updated position of the leader in dimension $j$, $F_j$ is the position of the food source in dimension $j$, $ub_j$ and $lb_j$ are the upper and lower limits of dimension $j$, $c_1$ controls the exploration–exploitation balance and decreases over iterations, and $c_2$ and $c_3$ are random numbers in $[0, 1]$ used to add stochasticity. The followers’ positions are determined as follows in Equation (2):

$$x_j^i = \frac{1}{2}\left(x_j^i + x_j^{i-1}\right), \quad i \ge 2. \quad (2)$$

This update ensures the smooth propagation of position information along the chain, preserving structural integrity and promoting gradual convergence. The coefficient $c_1$ is dynamically updated with Equation (3):

$$c_1 = 2 e^{-\left(4l/L\right)^{2}}, \quad (3)$$

where $l$ represents the current iteration and $L$ indicates the maximum number of iterations. This dynamic behavior of $c_1$ allows SSA to emphasize exploration during the initial iterations and gradually shift to exploitation, thereby improving convergence toward the global optimum. SSA has demonstrated effectiveness in solving high-dimensional, nonlinear, and multimodal optimization problems due to its adaptive balance between exploration and exploitation, simplicity of implementation, and low computational complexity. Its flexibility makes it a suitable candidate for hybridization with machine learning models, where it can optimize hyperparameters.
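A minimal sketch of the update rules of Equations (1)–(3), applied to a toy sphere objective (the function, bounds, and parameter choices below are illustrative, not the study's configuration):

```python
import numpy as np

def ssa_minimize(f, lb, ub, n_salps=30, n_iter=200, seed=0):
    """Minimal Salp Swarm Algorithm sketch following Equations (1)-(3)."""
    rng = np.random.default_rng(seed)
    dim = len(lb)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = lb + rng.random((n_salps, dim)) * (ub - lb)   # salp chain
    fit = np.apply_along_axis(f, 1, X)
    best = X[fit.argmin()].copy()                     # food source F
    best_fit = fit.min()
    for l in range(1, n_iter + 1):
        c1 = 2 * np.exp(-(4 * l / n_iter) ** 2)       # Equation (3)
        for i in range(n_salps):
            if i == 0:  # leader update, Equation (1)
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                X[i] = np.where(c3 >= 0.5, best + step, best - step)
            else:       # follower update, Equation (2)
                X[i] = (X[i] + X[i - 1]) / 2
            X[i] = np.clip(X[i], lb, ub)
            fx = f(X[i])
            if fx < best_fit:                         # greedy memory of the food source
                best, best_fit = X[i].copy(), fx
    return best, best_fit

best, best_fit = ssa_minimize(lambda x: np.sum(x**2), [-10] * 5, [10] * 5)
print(best_fit)  # converges toward 0 on the sphere function
```

Because $c_1$ decays rapidly with $l$, the leader takes wide steps around the food source early on and progressively smaller refining steps later, which is the exploration-to-exploitation transition described above.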
3.2. Local Escaping Operator (LEO)
To enhance the balance between exploration and exploitation and improve the local-optima avoidance capability of the proposed SSALEO algorithm [30], the LEO is incorporated into the position update step of selected agents. The LEO is designed to assist solutions in escaping local optima during stagnation or premature convergence by introducing randomized perturbations directed toward the best-known position while maintaining diversity in the search space. The LEO update is expressed below in Equation (4):

$$x_{LEO}^{i} = x^{i} + f_1 \left( u_1 x_{best} - u_2 x_k \right) + f_2 \, \rho_1 Z, \quad (4)$$

where Z is expressed in Equation (5):

$$Z = \frac{u_3 \left( x_{p1} - x_{p2} \right) + u_2 \left( x_{r1} - x_{r2} \right)}{2}, \quad (5)$$

where $x^i$ is a solution at any given iteration, $x_{best}$ denotes the current best position, $x_{r1}$ and $x_{r2}$ are random solutions, and $x_{p1}$ and $x_{p2}$ are randomly generated alternative solutions from the population. $pr$ is the escape probability (set to 0.5) to ensure a balance between exploration and exploitation, $f_1$ and $f_2$ are scaling coefficients, and the parameters $u_1$ and $u_2$ control the direction and influence of the optimal solution and the neighboring solution; they are defined in Equation (6):

$$u_1 = \begin{cases} 2 \cdot rand, & \mu_1 < 0.5, \\ 1, & \text{otherwise}, \end{cases} \qquad u_2 = \begin{cases} rand, & \mu_1 < 0.5, \\ 1, & \text{otherwise}, \end{cases} \quad (6)$$

where $\mu_1$ is a random number uniformly distributed in $[0, 1]$. The expressions above can be further simplified using a binary control variable $L_1$, equal to 1 if $\mu_1 < 0.5$ and 0 otherwise, as given in Equation (7):

$$u_1 = 2 \cdot rand \cdot L_1 + (1 - L_1), \qquad u_2 = rand \cdot L_1 + (1 - L_1). \quad (7)$$
To ensure a dynamic and effective balance between exploration and exploitation in the optimization process, LEO integrates a time-adaptive control parameter, denoted as $\rho_1$. This parameter plays an important role in adjusting the intensity and direction of the search mechanism over iterations. It is defined in Equation (8):

$$\rho_1 = 2 \cdot rand \cdot \alpha - \alpha, \quad (8)$$

where $\alpha$ is a function of a sinusoidal oscillation designed to control perturbations stochastically, as seen in Equation (9):

$$\alpha = \left| \beta \sin\left( \frac{3\pi}{2} + \sin\left( \frac{3\pi \beta}{2} \right) \right) \right|. \quad (9)$$

Here, $\beta$ is an adaptive decay coefficient defined as given in Equation (10):

$$\beta = \beta_{\min} + \left( \beta_{\max} - \beta_{\min} \right) \left( 1 - \left( \frac{l}{L} \right)^{3} \right)^{2}, \quad (10)$$

where $l$ and $L$ are the current and maximum number of iterations, respectively, and $\beta_{\min} = 0.2$ and $\beta_{\max} = 1.2$. This formulation ensures that larger perturbations are applied during the early stages of the search (favoring exploration), while smaller, refined updates dominate in the later stages (favoring exploitation), thus preventing premature convergence. Additionally, to update the position of an agent during the LEO application, the selection of the solution $x_k$ is based on a binary probabilistic scheme in Equation (11):

$$x_k = \begin{cases} x_{rand}, & \mu_2 < 0.5, \\ x_p, & \text{otherwise}, \end{cases} \quad (11)$$

where $\mu_2$ is a uniformly distributed random number, $x_p$ is a random solution from the salp population, and $x_{rand}$ is a new solution sampled from the problem space as expressed in Equation (12):

$$x_{rand} = lb + rand \cdot (ub - lb). \quad (12)$$

This selection logic can also be rewritten in a unified linear expression as expressed in Equation (13):

$$x_k = L_2 \cdot x_p + (1 - L_2) \cdot x_{rand}, \quad (13)$$

where $L_2$ is a binary variable that is set to 1 if $\mu_2 < 0.5$ and 0 otherwise. This mechanism allows the LEO strategy to adaptively alternate between exploiting known good solutions from the population and injecting diversity via randomly sampled candidates. In doing so, the LEO-enhanced SSA maintains both local intensification and global diversification throughout the search trajectory, effectively mitigating stagnation and enhancing convergence toward global optima.
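A compact sketch of the time-adaptive perturbation of Equations (8)–(10) and the candidate-selection scheme of Equations (11)–(13). The helper names (`beta`, `alpha`, `rho1`, `select_xk`) and exact functional forms mirror the standard LEO formulation and should be read as an illustrative reconstruction, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(7)

def beta(l, L, beta_min=0.2, beta_max=1.2):
    """Adaptive decay coefficient (Equation (10)): shrinks as l approaches L."""
    return beta_min + (beta_max - beta_min) * (1 - (l / L) ** 3) ** 2

def alpha(l, L):
    """Sinusoidal perturbation envelope (Equation (9))."""
    b = beta(l, L)
    return abs(b * np.sin(1.5 * np.pi + np.sin(1.5 * np.pi * b)))

def rho1(l, L):
    """Time-adaptive perturbation (Equation (8)): uniform in [-alpha, alpha]."""
    a = alpha(l, L)
    return 2 * rng.random() * a - a

def select_xk(pop, lb, ub):
    """Binary probabilistic scheme (Equations (11)-(13)): with probability 0.5
    reuse a population member, otherwise sample a fresh point in the bounds."""
    if rng.random() < 0.5:
        return pop[rng.integers(len(pop))]        # exploit a known solution
    return lb + rng.random(lb.size) * (ub - lb)   # inject diversity

# Early iterations permit much larger perturbations than late ones.
print(alpha(0, 100), alpha(100, 100))  # the envelope contracts over the run
```

With these constants the envelope contracts from roughly 1.0 at the start of a run to about 0.14 at the end, matching the exploration-to-exploitation schedule described above.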
3.3. eXtreme Gradient Boosting (XGBOOST)
XGBOOST is a scalable, efficient, and regularized machine learning algorithm based on gradient boosting decision trees (GBDT). Introduced by Chen and Guestrin [44], XGBOOST enhances the classical boosting framework through system-level and algorithmic optimizations such as regularization, weighted quantile sketch, sparsity-aware split finding, and parallelized tree construction, making it particularly suitable for structured data prediction tasks. The model builds an additive ensemble of decision trees to minimize a regularized loss function. The prediction for a sample $x_i$ at iteration $t$ is given in Equation (14):

$$\hat{y}_i^{(t)} = \sum_{k=1}^{t} f_k(x_i), \quad f_k \in \mathcal{F}, \quad (14)$$

where $\hat{y}_i^{(t)}$ is the predicted value after $t$ iterations, $f_k$ represents an individual regression tree, and $\mathcal{F}$ is the space of all possible regression trees. The objective function at iteration $t$ is defined as expressed in Equation (15):

$$\mathcal{L}^{(t)} = \sum_{i=1}^{n} l\left( y_i, \hat{y}_i^{(t-1)} + f_t(x_i) \right) + \Omega(f_t), \qquad \Omega(f_t) = \gamma T + \frac{1}{2} \lambda \lVert w \rVert^{2}, \quad (15)$$

where $l$ is a differentiable convex loss function, $\Omega$ is the regularization term that penalizes the complexity of the model, $T$ is the number of leaves in the tree, $w$ is the vector of leaf weights, and $\gamma$ and $\lambda$ are regularization hyperparameters. Using a second-order Taylor expansion, the objective function is approximated as defined in Equation (16):

$$\mathcal{L}^{(t)} \simeq \sum_{i=1}^{n} \left[ g_i f_t(x_i) + \frac{1}{2} h_i f_t^{2}(x_i) \right] + \Omega(f_t), \quad (16)$$

where $g_i$ and $h_i$ are the first and second derivatives of the loss function with respect to the previous prediction $\hat{y}_i^{(t-1)}$, respectively. The optimal tree structure and leaf weights are determined by minimizing this regularized objective, which allows XGBOOST to efficiently capture complex nonlinear relationships while controlling overfitting through the $\gamma$ and $\lambda$ regularization terms.
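For the common squared-error loss, the derivatives are $g_i = \hat{y}_i - y_i$ and $h_i = 1$, which makes the standard closed-form results for the optimal leaf weight and split gain easy to illustrate. The four-sample dataset below is hypothetical:

```python
import numpy as np

# Squared-error loss: g_i = yhat_i - y_i, h_i = 1. The optimal weight of a
# leaf holding instance set I is the standard result w* = -G / (H + lambda).
y = np.array([3.0, 3.5, 8.0, 9.0])
yhat = np.full_like(y, y.mean())       # predictions before the new tree
g, h = yhat - y, np.ones_like(y)

lam = 1.0                              # L2 regularization on leaf weights

def leaf_weight(idx):
    """Optimal weight for a leaf containing the instances in idx."""
    return -g[idx].sum() / (h[idx].sum() + lam)

def split_gain(left, right, gamma=0.0):
    """Regularized gain of splitting a node into left/right instance sets."""
    def score(idx):
        return g[idx].sum() ** 2 / (h[idx].sum() + lam)
    both = np.concatenate([left, right])
    return 0.5 * (score(left) + score(right) - score(both)) - gamma

gain = split_gain(np.array([0, 1]), np.array([2, 3]))
print(gain)  # positive: separating the low targets {3, 3.5} from {8, 9} reduces loss
```

Raising `gamma` shifts the gain downward, which is how XGBOOST prunes splits whose loss reduction does not justify an extra leaf.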
3.4. SSALEO-XGBOOST
The choice of XGBOOST as the baseline predictive model is grounded in its capacity to implement gradient-boosted decision trees within a regularized framework, thereby balancing predictive accuracy and generalization. Its theoretical foundation in predictive learning theory enables it to capture complex nonlinear interactions while penalizing model complexity through regularization, effectively reducing the risk of overfitting. However, the performance of XGBOOST is highly sensitive to hyperparameter configuration, which directly influences model bias–variance behavior. To address this challenge, we integrate an augmented optimization mechanism based on the SSALEO. Theoretically, this aligns with adaptive optimization principles, wherein search agents in SSALEO self-adjust based on the evolving fitness landscape to maintain equilibrium between global exploration and local refinement. Within the context of sustainability performance measurement, this adaptive capability ensures that predictive models remain resilient to dynamic industrial conditions and data heterogeneity, thus providing more stable, interpretable, and actionable insights for sustainable decision-making. The SSALEO-XGBOOST framework, depicted in Figure 1, is designed to enhance the hyperparameter optimization of the XGBOOST model by leveraging SSALEO (Algorithm 1). The algorithm begins with the initialization of a population of salps, where each individual agent is denoted as a vector whose length is equal to the number of hyperparameters being optimized. In this study, the search space for the XGBOOST hyperparameters was defined as follows: the learning rate was set within the range [0.01, 0.3], max_depth in the range [3, 10], n_estimators in the range [10, 150], subsample in the range [0.5, 1.0], colsample_bytree in the range [0.5, 1.0], and gamma in the range [0.0, 5.0]. The search intervals for each hyperparameter were determined based on established practice in the literature.
The learning rate exhibits the highest sensitivity, directly affecting convergence stability and prediction bias, while the max_depth parameter significantly influences model interpretability and generalization. These parameter intervals were used to guide the metaheuristic optimization process during the search for the optimal hyperparameter configuration. Each salp explores a candidate solution within this space, and the respective parameter configuration is employed to train an XGBOOST model on the training dataset. The MSE from this training process is computed and serves as the fitness value for the salp. Following this, the control coefficient $c_1$, which regulates the balance between exploration and exploitation in SSA, is updated according to Equation (3). The population is then partitioned into leaders and followers. The positions of the leader salps are updated based on Equation (1), guiding the search toward optimal regions, while the followers adjust their positions according to Equation (2), maintaining swarm dynamics and diversity. To improve local search efficiency and mitigate stagnation, the LEO is applied to the population. As defined in Equation (4), the LEO introduces adaptive perturbations that allow agents to escape local optima and refine their solutions, thereby improving both convergence speed and final accuracy. This hybrid mechanism effectively maintains a dynamic equilibrium between global exploration and local exploitation. The salp with the lowest MSE across all iterations is ultimately selected to define the final XGBOOST configuration, which is then evaluated on the test dataset. This augmented mechanism refines feature weighting within the XGBOOST structure by guiding the optimizer toward hyperparameter regions that minimize overfitting while improving generalization.
Consequently, SSALEO-XGBOOST achieves superior stability and faster convergence relative to baseline metaheuristic–ML combinations, and effectively captures the nonlinear interdependencies characteristic of sustainable supply chain systems, yielding superior predictive accuracy in supply chain profit prediction.
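The fitness-evaluation step described above can be sketched as a decoding of each salp's position vector into an XGBOOST configuration followed by a train-and-score call. The helper names (`decode`, `fitness`) are illustrative; in the actual pipeline the supplied callable would wrap an `xgboost.XGBRegressor` fit and an MSE computation on validation data, for which a stand-in scorer is used here:

```python
import numpy as np

# Search space from the text: each salp is a 6-dimensional position vector.
BOUNDS = {
    "learning_rate":    (0.01, 0.3),
    "max_depth":        (3, 10),      # rounded to an integer when decoding
    "n_estimators":     (10, 150),    # rounded to an integer when decoding
    "subsample":        (0.5, 1.0),
    "colsample_bytree": (0.5, 1.0),
    "gamma":            (0.0, 5.0),
}

def decode(position):
    """Map a salp position vector to an XGBOOST parameter dict."""
    params = {}
    for x, (name, (lo, hi)) in zip(position, BOUNDS.items()):
        v = float(np.clip(x, lo, hi))
        params[name] = int(round(v)) if name in ("max_depth", "n_estimators") else v
    return params

def fitness(position, train_and_score):
    """MSE of a model trained with the decoded configuration; the
    train_and_score callable is supplied by the caller."""
    return train_and_score(decode(position))

# Stand-in scorer: penalizes distance from an assumed 'good' learning rate.
candidate = [0.05, 6.2, 120.7, 0.8, 0.9, 1.0]
mse = fitness(candidate, lambda p: (p["learning_rate"] - 0.1) ** 2)
print(decode(candidate)["max_depth"])  # 6
```

Keeping the trainer behind a callable also makes the optimizer testable without fitting real models, a useful property when each true fitness evaluation is an expensive XGBOOST training run.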
| Algorithm 1: SSALEO for XGBOOST Hyperparameter Optimization |
Input: Hyperparameter bounds, population size $N$, maximum iterations $T$, dataset
Output: Optimal XGBOOST model
1: Initialize salp population $X$
2: Evaluate fitness using MSE on training data
3: Set best solution $x_{best}$
4: for $t = 1$ to $T$ do
5:   Update $c_1$ using Equation (3)
6:   for each salp $x^i$ do
7:     if $i = 1$ then
8:       Update leader using Equation (1)
9:     else
10:      Update follower using Equation (2)
11:    end if
12:    if $rand < pr$ then
13:      Apply LEO update (Equations (4)–(13)) to $x^i$
14:    end if
15:    Clip positions within bounds
16:    Re-evaluate fitness of $x^i$
17:    Update $x_{best}$ if a better solution is found
18:  end for
19: end for
20: Return XGBOOST model with parameters from $x_{best}$
21: Evaluate the final model with test data
Figure 1. SSALEO-XGBOOST Framework.
3.5. Computational Complexity
The computational complexity of SSALEO-XGBOOST arises primarily from the iterative nature of the SSA with LEO and the repeated training of the XGBOOST model for fitness evaluation. Let $N$ denote the population size, $T$ the number of iterations, $D$ the number of hyperparameters to optimize, and $C_{XGB}$ the cost of training and validating an XGBOOST model. Each iteration involves updating salp positions ($O(ND)$), applying the LEO operator probabilistically ($O(ND)$), and evaluating fitness via model training ($O(N \cdot C_{XGB})$). Consequently, the total complexity of the metaheuristic component over all iterations is $O\left(TN(D + C_{XGB})\right)$. The dominant cost, $C_{XGB}$, stems from the XGBOOST training process, which, for $n$ training samples and $K$ boosting rounds, has a complexity of approximately $O(Kn \log n)$. Therefore, the overall computational complexity of the SSALEO-XGBOOST algorithm is $O\left(TN(D + Kn \log n)\right)$. While computationally intensive, this design enables efficient navigation of the high-dimensional hyperparameter space and delivers superior predictive performance, justifying the additional computational cost.
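Since fitness evaluation dominates this cost, the practical budget is one XGBOOST training run per salp per iteration. Under assumed values of the population size, iteration count, and per-fit training time (all hypothetical), the budget can be estimated as:

```python
# Back-of-envelope evaluation budget implied by the O(T*N*(D + C_XGB))
# analysis: one XGBOOST training run per salp per iteration.
N, T = 30, 100                 # assumed population size and iteration count
model_fits = N * T
seconds_per_fit = 0.5          # hypothetical average cost of one training run
total_minutes = model_fits * seconds_per_fit / 60
print(model_fits, total_minutes)  # 3000 fits, 25.0 minutes
```

Doubling either $N$ or $T$ doubles the wall-clock cost linearly, so in practice the budget is tuned against the marginal accuracy gained from additional evaluations.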
4. Experiment and Discussion
4.1. Benchmark Validation
This section evaluates the novel SSALEO algorithm using a suite of standard benchmark functions prior to its application in machine learning hyperparameter optimization. To validate its optimization capabilities, experiments were conducted using the CEC2015 benchmark suite [45], which includes various function types: single-modal (F1–F2), multimodal (F3–F5), hybrid (F6–F8), and composite functions (F9–F15). Each function type serves a specific purpose in evaluating key aspects of the algorithm, such as convergence to global optima, exploration of complex landscapes, and the balance between global and local search behavior. The experiments were conducted with a maximum of 2000 iterations, 30 independent runs, a population size of 30, and 30 dimensions. Parameter settings for SSALEO and competing algorithms are listed in Table 1. The SSALEO algorithm is compared against several well-established metaheuristic optimizers, including the original SSA, GWO [46], Honey Badger Algorithm (HBA) [47], Moth Flame Optimization (MFO) [48], Sine Cosine Algorithm (SCA) [49], and Seagull Optimization Algorithm (SOA) [50]. The performance results, expressed in terms of mean and standard deviation for each test function, are summarized in Table 2. In addition, statistical significance is assessed using the Wilcoxon rank-sum test and the Friedman test, also presented in Table 2.
Table 1.
Parameter Settings.
Table 2.
Results of Optimization Algorithms on CEC2015.
The results demonstrate that SSALEO delivers improved performance, particularly on the unimodal functions (F1–F2), which are designed to evaluate an algorithm’s exploitation ability. In these cases, SSALEO outperformed SSA and other methods, achieving the best solution in F2, indicating that the integration of the LEO strategy significantly enhances the exploitation and solution refinement capabilities of the SSA. On multimodal functions (F3–F5), which contain multiple local minima and are used to evaluate exploration behavior, SSALEO again showed notable improvements. It achieved the best mean performance on F4 and F5, demonstrating a strong ability to avoid local optima and maintain exploratory diversity. Notably, GWO achieved the best result on F3, emphasizing that no single algorithm universally excels across all problem landscapes. For hybrid and composite functions (F6–F15), which test the algorithm’s capacity to balance exploration and exploitation, SSALEO exhibited superior performance on F7, F8, F10, F12, F13, and F15, confirming its robustness in navigating highly complex search spaces. This overall improvement is attributed to the LEO mechanism, which introduces adaptive perturbations to refine solutions while preserving global diversity. Furthermore, the Wilcoxon rank-sum test results show statistically significant improvement for SSALEO over the comparative algorithms, with p-values below the 0.05 threshold, indicating that the observed differences are not due to chance. The Friedman test, which ranks algorithms based on their average performance across all functions, further supports these findings, ranking SSALEO as the top-performing algorithm among all those compared. These results collectively confirm that the integration of the LEO into the SSA framework substantially improves the algorithm’s effectiveness in a wide range of optimization scenarios.
Building upon the original SSA, the proposed SSALEO demonstrates significantly improved convergence characteristics, particularly on the challenging benchmark problems within the CEC2015 test suite. The convergence behavior of SSALEO across test functions is illustrated in Figure A1, which presents the iterative progression of fitness values over 2000 iterations. Upon close examination of these convergence curves, it becomes evident that SSALEO consistently progresses toward the global optimum during the intermediate phases of optimization, outperforming the original SSA and other competing algorithms. This accelerated convergence is attributed to the integration of the LEO, which introduces adaptive local perturbations that help the algorithm escape premature convergence and effectively refine elite solutions. As iterations progress, the algorithm demonstrates a pronounced shift from exploration to exploitation. Initially, SSALEO explores a wide search space to locate promising regions near the global optimum. Subsequently, it shifts into a more exploitative mode, fine-tuning candidate solutions to minimize the objective function. This dynamic transition is particularly noticeable in functions F2, F4, F5, F7, F8, F10, F12, F13, and F15, where SSALEO achieves rapid convergence and consistently outperforms alternative methods. The steep descent of the convergence curves during early and mid-stage iterations for these functions indicates SSALEO’s sensitivity to the fitness landscape and its ability to navigate efficiently toward global optima.
Further insights into the algorithm’s performance stability are provided through boxplot analysis, presented in Figure A2. Boxplots serve as a powerful visualization tool to assess the distribution and robustness of results across multiple runs. In this study, the narrow width of SSALEO’s boxplots, compared to those of the benchmark algorithms, signifies lower variance and higher consistency in solution quality. This coherence across 30 independent runs highlights the algorithm’s reliability and robustness under stochastic variation. The compact distribution further confirms the effectiveness of the LEO component in improving both the exploration and exploitation capabilities of SSA, reducing sensitivity to initialization, and improving the overall convergence reliability.
This section provides a detailed evaluation of the SSALEO algorithm, emphasizing its ability to balance exploration and exploitation. Central to the effectiveness of any metaheuristic algorithm is its ability to effectively balance exploration (global search) and exploitation (local refinement), which are essential yet often conflicting components of the optimization process. The integration of the LEO within SSA plays a critical role in achieving this equilibrium by adaptively guiding the algorithm across different phases of the search. Figure 2 presents a visual illustration of the exploration and exploitation dynamics of SSALEO while solving various CEC2015 benchmark functions. The curves represent the relative contributions of exploration and exploitation throughout the iterations, offering insight into how the algorithm modulates its search behavior. As observed, the algorithm shows a dominance of exploration in the early stages, allowing for broad traversal of the search space to identify promising regions. This is followed by a progressive and controlled transition toward exploitation, whereby the algorithm intensifies its focus on refining elite solutions.
Figure 2.
Exploration vs. Exploitation of SSALEO.
The trends depicted in Figure 2 highlight SSALEO’s capacity to maintain a harmonious balance between these two processes. This dynamic control ensures that search agents avoid premature convergence while steadily improving their solution quality. Notably, the smooth interplay between exploration and exploitation seen across all test problems demonstrates the effectiveness of the enhancements introduced in the SSALEO framework. These findings validate that SSALEO is capable of navigating complex, high-dimensional landscapes with agility and robustness, ultimately improving convergence to globally optimal solutions. Such a balance is essential for real-world applications where optimization landscapes are often nonlinear and multimodal. By orchestrating this equilibrium through adaptive mechanisms such as LEO, SSALEO achieves superior optimization performance, establishing its effectiveness as a global optimizer across diverse and challenging problem domains.
4.2. Supply Chain Prediction
4.2.1. Data
The dataset employed in this study contains detailed transactional records related to supply chain operations, with the primary goal of predicting supply chain profit. The dataset is sourced from Kaggle [51] and pertains to data from supply chains in the United States. The dataset includes categorical and numerical features related to sales transactions, such as Sales Channel, Warehouse Code, Order Quantity, Unit Cost, Unit Price, Discount Applied, and Delivery Time. The target variable in this research is Profit. To enhance the predictive quality and ensure the reliability of the dataset, several preprocessing steps were carried out. Initially, all date fields (ProcuredDate, OrderDate, ShipDate, and DeliveryDate) were parsed, and a derived temporal feature, Delivery Time, representing the duration in days between order placement and final delivery, was extracted. Monetary values such as Unit Cost and Unit Price were converted from string to numeric format after removing currency symbols and delimiters. The Profit variable was then calculated as the product of the unit profit, order quantity, and applicable discount. The dataset was chosen for its comprehensiveness and representativeness of modern logistics environments. It captures critical financial and operational dimensions that jointly define sustainable supply chain performance. These indicators reflect key industry priorities such as operational efficiency, demand responsiveness, and resource optimization, which align with economic and environmental sustainability metrics. Derived features such as Delivery Time were engineered to quantify process efficiency. Categorical attributes (Sales Channel and Warehouse Code) were encoded numerically using label encoding. After preprocessing, the dataset was split into training and testing sets at an 80:20 ratio. Feature scaling was conducted using standardization, transforming features to have a zero mean and unit variance.
This process, represented mathematically in Equation (17), is critical to mitigate the disproportionate influence of features with larger magnitudes and to ensure uniform contribution of all predictors during training.
where x is the original value, μ is the feature mean, σ is the standard deviation, and z is the standardized value. The diverse scales and units of the dataset variables necessitate such normalization to prevent skewed model learning. This preprocessing not only mitigates the influence of feature-scale disparity but also ensures smoother convergence and higher predictive consistency during training. These preprocessing procedures also ensure the integrity, comparability, and relevance of the input data for predictive modeling of supply chain profit. Figure 3 illustrates the distribution of numerical dataset variables before scaling.
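The preprocessing pipeline described above can be sketched with pandas. The column names, the discount convention (a fractional discount applied as 1 − discount), and the exact transformation order are assumptions based on the description, not the authors' code:

```python
import pandas as pd

def preprocess(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    # Derived temporal feature: days between order placement and final delivery
    for col in ["OrderDate", "DeliveryDate"]:
        df[col] = pd.to_datetime(df[col])
    df["Delivery Time"] = (df["DeliveryDate"] - df["OrderDate"]).dt.days
    # Strip currency symbols/delimiters and cast monetary fields to float
    for col in ["Unit Cost", "Unit Price"]:
        df[col] = df[col].astype(str).str.replace(r"[$,]", "", regex=True).astype(float)
    # Target variable: unit profit scaled by quantity and the discount factor
    # (one plausible reading of the paper's description of the Profit formula)
    df["Profit"] = ((df["Unit Price"] - df["Unit Cost"])
                    * df["Order Quantity"] * (1 - df["Discount Applied"]))
    # Label-encode categorical attributes
    for col in ["Sales Channel", "Warehouse Code"]:
        df[col] = df[col].astype("category").cat.codes
    # Standardize numeric predictors to zero mean and unit variance, Equation (17)
    for col in ["Order Quantity", "Unit Cost", "Unit Price", "Delivery Time"]:
        df[col] = (df[col] - df[col].mean()) / df[col].std(ddof=0)
    return df
```

An 80:20 train/test split would then be applied to the returned frame before model fitting.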

Figure 3.
Distribution of the Dataset.
4.2.2. Evaluation Metrics
The assessment of predictive models in machine learning relies heavily on evaluation metrics, which supply a standardized and quantitative framework. These metrics are instrumental in determining how accurately a model estimates the actual data, thereby guiding the optimization, comparison, and selection of the most effective models. In this study, a range of widely accepted regression evaluation metrics has been employed to comprehensively evaluate the model’s predictive capability. These include the R2, RMSE, MSE, ME, and RAE [24,52]. The detailed mathematical formulations of these metrics are as follows:
The R2 metric in Equation (18) explains the proportion of variance in the observed data that is predictable from the independent variables. A value closer to 1 indicates a higher level of model performance.
RMSE, as given in Equation (19), provides a measure of the square root of the average error, while MSE is a widely used metric that quantifies the average of the squared differences between actual and predicted values, as given in Equation (20).
The ME metric in Equation (21) identifies the single largest error made by the model, offering insight into its worst-case performance.
RAE, as given in Equation (22), provides a normalized measure of the total absolute error relative to the error produced by a simple mean-based prediction.
In the above equations, n represents the total number of observations, yi is the observed value, ŷi represents the predicted value, and ȳ refers to the mean of the observed values. Together, these metrics offer a robust assessment of model accuracy, reliability, and generalization capability, thus facilitating an informed comparison among competing predictive approaches.
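The five metrics can be implemented directly from these definitions; a minimal NumPy sketch:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Compute R2, RMSE, MSE, ME, and RAE as defined in Equations (18)-(22)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mse = np.mean(err ** 2)
    return {
        # Proportion of variance explained; 1.0 means a perfect fit
        "R2": 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2),
        "RMSE": np.sqrt(mse),
        "MSE": mse,
        # Worst-case single prediction error
        "ME": np.max(np.abs(err)),
        # Total absolute error relative to a mean-based baseline predictor
        "RAE": np.sum(np.abs(err)) / np.sum(np.abs(y_true - y_true.mean())),
    }
```

For a perfect prediction, R2 is 1 and the four error metrics are 0; an RAE of 1 means the model is no better than always predicting the mean.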
4.2.3. Cross-Validation Experiment
In this experiment, for the optimizer-based XGBOOST approach, the optimizer parameters were kept as specified in Table 1, with 50 iterations and 30 search agents, balancing efficiency and computational cost during training. The search boundaries for the XGBOOST hyperparameters are detailed in Section 3.4. The standalone XGBOOST model was configured with the following hyperparameters: learning rate = 0.3, max_depth = 10, n_estimators = 150, subsample = 1.0, colsample_bytree = 1.0, and gamma = 5.0. The standalone KNN model was configured with 10 neighbors, and the Extreme Learning Machine (ELM) model was set with 15 hidden neurons in the single hidden layer.
The performance of the novel SSALEO-XGBOOST model was rigorously evaluated. A 5-fold cross-validation experiment comparing its results with other machine learning baselines and hybrid models is presented in Table 3 and Table 4, including SSA-XGBOOST, GWO-XGBOOST, HBA-XGBOOST, MFO-XGBOOST, SCA-XGBOOST, SOA-XGBOOST, standalone XGBOOST, ELM, and KNN. Performance was evaluated using key regression metrics: R2, RMSE, MSE, ME, and RAE. For each metric, the mean (AVG) and standard deviation (STD) across the folds are reported to capture both central tendency and variability. The evaluation results, reported in Table 3 and Table 4, demonstrate that the proposed SSALEO-XGBOOST model outperforms all baseline and competing hybrid models across all key performance metrics. On the training folds, SSALEO-XGBOOST achieved an average R2 of 0.9994, a significant improvement in accuracy over standalone XGBOOST, with a remarkably low standard deviation of 2.28 × 10−5, indicating highly stable performance across different subsets of the data. The corresponding RMSE and MSE were also the lowest among all models, further reinforcing the model’s superior ability to minimize prediction error. Moreover, the improvement is statistically significant: the Wilcoxon signed-rank test was performed to assess whether the performance differences between SSALEO-XGBOOST and each compared model across all folds were significant, and the resulting p-values below 0.05 across all comparisons confirm the performance gains. These statistical tests show that the proposed model provides a real, repeatable advantage over existing approaches.
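The fold construction and AVG/STD aggregation underlying Tables 3 and 4 can be sketched as follows. A mean-value baseline stands in for the tuned XGBOOST model, and the fold-splitting scheme is an assumption (the paper does not specify shuffling details):

```python
import numpy as np

def kfold_indices(n, k=5, seed=0):
    """Yield (train_idx, test_idx) pairs for shuffled k-fold cross-validation."""
    idx = np.random.default_rng(seed).permutation(n)
    for fold in np.array_split(idx, k):
        yield np.setdiff1d(idx, fold), fold

def cross_validate(X, y, fit_predict, k=5):
    """Return per-fold test R2 plus the AVG/STD summary style of Tables 3-4."""
    scores = []
    for tr, te in kfold_indices(len(y), k):
        y_hat = fit_predict(X[tr], y[tr], X[te])   # train on k-1 folds, predict held-out fold
        ss_res = np.sum((y[te] - y_hat) ** 2)
        ss_tot = np.sum((y[te] - y[te].mean()) ** 2)
        scores.append(1.0 - ss_res / ss_tot)       # fold-wise R2
    scores = np.asarray(scores)
    return scores, scores.mean(), scores.std()
```

In the actual experiment, `fit_predict` would run the metaheuristic-tuned XGBOOST on each training split, and the same aggregation would be repeated for RMSE, MSE, ME, and RAE.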
Table 3.
Training Results of 5-Fold Cross-Validation Experiment.
Table 4.
Test Results of 5-Fold Cross-Validation Experiment.
Beyond training performance, a model’s utility lies in its ability to generalize to unseen data. On the test folds, SSALEO-XGBOOST continued to outperform other methods, achieving a mean R2 of 0.9851, and maintaining a low RMSE of 0.1221 and MSE of 1.49 × 10−2. Notably, the model achieved a STD in R2 of 5.76 × 10−4, which was the lowest among all evaluated models. This not only highlights the model’s predictive strength but also its resilience to variance across folds, a critical quality for supply chain applications where data heterogeneity is common. In contrast, baseline models such as KNN and ELM exhibited both lower accuracy and greater instability. Even hybrid models such as GWO-XGBOOST and HBA-XGBOOST showed substantial drops in R2 (0.9624 and 0.9205, respectively) and increased error variance, suggesting limited reliability under varied data conditions due to their parameter optimization strategy. The low values for RMSE and ME of the proposed model imply that it not only fits well on the entire dataset (as shown by high R2) but also maintains low average deviation on a case-by-case basis. This is particularly important in supply chain profit prediction, where even small deviations can result in significant operational cost misestimations.
4.2.4. Performance Evaluation over 20 Independent Runs
To further ensure the robustness, reliability, and reproducibility of the proposed SSALEO-XGBOOST model, 20 independent training and testing experiments were conducted. This experimental design enables a more rigorous evaluation of generalization capability. The experiment parameters remain as given in Section 4.2.3. Across these runs, the average performance metrics, standard deviations, best outcomes, and Wilcoxon p-value are presented in Table 5 and Table 6.
Table 5.
Training Performance Across 20 Independent Runs.
Table 6.
Test Performance Across 20 Independent Runs.
The results confirm that SSALEO-XGBOOST consistently achieves exceptional predictive accuracy with very low variability. It recorded the highest mean R2 value of 0.99941 with an extremely low standard deviation, indicating high stability across randomized runs. Relative to the proposed model, SSA-XGBOOST and GWO-XGBOOST showed greater variance and lower average performance. RMSE and MSE values further supported these results: SSALEO-XGBOOST exhibited the lowest RMSE (0.02425) and MSE (5.88 × 10−4) on average. SSALEO-XGBOOST achieved the best results, with an R2 of 0.99943 and an RMSE of 0.02378, outperforming all other models under optimal conditions. Furthermore, the p-values for all comparisons were well below 0.05, strongly rejecting the null hypothesis that performance differences are due to chance. In terms of error magnitude, the ME for SSALEO-XGBOOST was the lowest (0.2315), with a narrow confidence interval, suggesting that even individual predictions were consistently accurate and close to true profit values. Competing models such as MFO-XGBOOST and SOA-XGBOOST had significantly higher ME (1.317 and 1.131, respectively) and standard deviations, implying greater instability.
Evaluation on test data over 20 runs further validated the generalization capability of SSALEO-XGBOOST. It maintained a high average R2 of 0.98455, again outperforming all others, with minimal standard deviation (5.03 × 10−4). The runner-up, SSA-XGBOOST, achieved 0.9685 R2 but with higher variance, while traditional baselines like KNN and ELM achieved 0.909 and 0.748, respectively. The average RMSE for SSALEO-XGBOOST was also the lowest (0.1215), with a tight dispersion (STD of 0.00197). In comparison, GWO-XGBOOST and MFO-XGBOOST showed substantially higher errors and variability. The best RMSE achieved by SSALEO-XGBOOST was 0.1180, again underscoring its ability to generalize under favorable conditions. ME and RAE followed the same trend. The SSALEO-XGBOOST model reported the lowest ME (1.201) and RAE (0.1242), significantly lower than all other methods. Notably, these improvements were statistically significant (p < 0.05) across all metrics.
Across both training and testing evaluations, SSALEO-XGBOOST consistently demonstrated not only the highest predictive accuracy but also the lowest error variance, fulfilling key criteria of a statistically reliable machine learning model. The inclusion of standard deviation and best result metrics provides a clear picture of performance stability, while the statistical significance testing confirms that these differences are non-random and repeatable. The practical implication is that the proposed model offers superior, dependable performance for supply chain profit forecasting, making it well-suited for high-stakes decision environments where predictive confidence and generalization are critical. By balancing accuracy with stability, SSALEO-XGBOOST can significantly reduce risks related to over- or underestimation of profit margins, inventory planning, and cost allocation in dynamic supply chain settings. Compared to the baseline XGBOOST model, the proposed SSALEO-XGBOOST achieved a 13.6% higher R2 in the test phase (0.9855 vs. 0.8673), a 65.9% lower RMSE (0.1215 vs. 0.3561), and an 88.3% reduction in MSE (0.01477 vs. 0.1268), underscoring its superior generalization capability and stability. From a sustainable supply chain perspective, these performance gains translate into tangible managerial insights. Enhanced predictive accuracy enables firms to anticipate profit margins and operational inefficiencies more precisely, thereby supporting resource-efficient production scheduling, waste and energy reduction, and data-driven logistics coordination.
To complement the quantitative analysis presented in Table 3, Table 4, Table 5 and Table 6, a visual examination of the model performance is illustrated through Figure 4. These plots depict the actual versus predicted values on all data samples (spanning training and testing phases) of the best outcome of the 20 independent runs, overlaid with absolute error magnitudes shown in dark red. The analysis centers on the fidelity of each model’s predictions and the dispersion of error across data samples. As evident in Figure 4, the SSALEO-XGBOOST model achieves highly concentrated predicted values (in blue) that align closely with the actual target values (black markers) throughout the entire dataset. Importantly, the absolute error remains minimal and stable across both the training and testing regions, with very few visible spikes. This low variance in error, especially in the test segment beyond the 6000th sample index, underscores the model’s strong generalization capacity. The pronounced separation between predicted values and the error baseline reflects a robust fit and minimal deviation, aligning with the model’s superior RMSE and RAE scores. In contrast, the SSA-XGBOOST model exhibits a significantly more scattered prediction trend, with a noticeably denser error band distributed across both the training and testing partitions. Despite SSA’s inherent exploration capabilities, the absence of enhanced exploitation mechanisms such as adaptive LEO dynamics in SSALEO leads to higher residuals and greater prediction uncertainty. Similarly, GWO-XGBOOST presents moderate predictive alignment with actual values, but with visibly more frequent and higher-magnitude error peaks, particularly in the testing region, indicating a less precise adaptation to unseen data.
Figure 4.
Absolute Error Distribution of Models.
The HBA-XGBOOST and SOA-XGBOOST models further amplify this trend, showing widespread prediction dispersion and pronounced error intensities throughout the entire dataset. The error distribution remains relatively elevated even during training, suggesting suboptimal convergence or susceptibility to local optima during parameter tuning. These visual outcomes correspond well with their poorer statistical performance. Collectively, the plot comparisons substantiate that SSALEO-XGBOOST not only achieves superior numerical performance but also delivers the most stable and reliable predictions over the entire data domain. The marked reduction in error variance, particularly in unseen data, indicates the effectiveness of combining SSA’s global search with LEO. This hybridization is crucial in fine-tuning XGBOOST’s parameters, enabling precise, consistent, and generalizable modeling of complex supply chain profit dynamics.
To further validate the predictive capabilities of the proposed SSALEO-XGBOOST model, Figure 5 presents scatter plots of actual versus predicted supply chain profit values across the training and testing phases for all comparative models for the best run of 20 independent runs. These plots offer intuitive insight into model performance by visualizing the alignment of predicted outputs with ground truth values, where the dashed red line serves as the reference for perfect predictions. As evidenced in Figure 5, the SSALEO-XGBOOST model exhibits a nearly perfect correlation between actual and predicted values, with training and testing samples densely aligned along the line. The model achieves an R2 of 0.999 on training data and 0.985 on test data, reflecting its exceptional capacity to generalize learned patterns to unseen instances. The minimal scatter and tight clustering around the ideal line indicate low variance and error, reaffirming the model’s robustness observed in earlier quantitative evaluations.

Figure 5.
Actual vs. Predicted Plots.
In contrast, the SSA-XGBOOST model, while maintaining strong performance, i.e., a training R2 of 0.987 and a testing R2 of 0.969, shows slightly more deviation from the ideal line, particularly in high-value regions of supply chain profit predictions. The increased spread suggests reduced precision, especially under conditions of higher data variability, likely due to SSA’s less adaptive exploitation mechanism in the optimization landscape compared to the LEO enhancement in SSALEO. Furthermore, it can be observed that the initial fitness values differ across the optimizer-based XGBOOST models; this is attributable to the stochastic nature of population initialization and the distinct search mechanisms inherent to each algorithm, as seen in Figure 6. Among the methods evaluated, SSALEO-XGBOOST and SSA-XGBOOST exhibited superior initial solution quality, likely due to their effective population generation strategies. The GWO-XGBOOST and SCA-XGBOOST models also demonstrate competent performance, with testing R2 values of 0.9693 and 0.9697, respectively. However, their scatter plots reveal a wider spread, especially among high supply chain profit values, indicating underfitting and a reduced ability to tightly capture the non-linear complexities of the target variable. While these models benefit from global exploration, their convergence behavior appears less refined than SSALEO’s balanced exploration–exploitation trade-off. HBA-XGBOOST displays visibly inferior alignment: with a testing R2 of 0.919, it demonstrates widespread deviation from the ideal line, indicating underperformance in accurately capturing supply chain profit variations. The prediction points cluster well below the ideal line at higher actual values. Similarly, SOA-XGBOOST, although slightly better, shows substantial prediction spread, reinforcing the limitations of these algorithms in complex regression tasks when not enhanced by adaptive local search mechanisms.
Collectively, these plots strongly support the conclusion that SSALEO-XGBOOST offers the most consistent, accurate, and generalizable performance among all tested models.
Figure 6.
Convergence Plot Analysis.
To gain further insight into the optimization efficiency and search dynamics of each metaheuristic-driven hybrid model, the convergence profiles in terms of mean MSE across 50 iterations are illustrated in Figure 6. This plot reflects how rapidly and effectively each algorithm minimizes the objective function (fitness) during hyperparameter tuning of the XGBOOST regressor. The convergence trajectory of the proposed SSALEO-XGBOOST stands out distinctly among all compared models. It achieves the lowest final MSE, converging sharply within the first 20 iterations to a stable minimum. This rapid and deep convergence demonstrates the superiority of SSALEO’s search strategy, which synergistically combines the global exploration of the SSA with the enhanced local exploitation mechanism introduced by LEO. The hybridization appears particularly effective in escaping local optima early in the search process, enabling swift progress toward a global optimum. In contrast, the SSA-XGBOOST model, while leveraging the base SSA framework, settles at a relatively higher fitness value. Its slower descent and premature convergence suggest that the lack of adaptive step-size diversification hinders its ability to refine solutions effectively in later iterations.
Similarly, GWO-XGBOOST and SCA-XGBOOST exhibit moderate convergence rates but also stagnate at higher MSE levels, indicating less efficient search balance and limited exploitation strength. Among the rest, SCA-XGBOOST and SOA-XGBOOST exhibit slightly more favorable convergence than HBA-XGBOOST but remain suboptimal compared to SSALEO. The poorest performance is observed in MFO-XGBOOST, whose curve remains the highest throughout the optimization process. This model converges slowly and to a substantially larger MSE value, pointing to weaker global exploration capabilities and possible premature convergence to suboptimal regions in the search space. The convergence analysis clearly affirms the optimization strength of SSALEO-XGBOOST, highlighting its efficiency in guiding the XGBOOST learner toward a more optimal solution configuration.
The mean runtime comparison across different optimization-enhanced XGBOOST models offers important insights into the computational efficiency of the proposed SSALEO-XGBOOST framework. As shown in Figure 7, the SSALEO-XGBOOST model required an average of 117.70 s to complete training and optimization, which is moderate relative to the alternatives. While this runtime is higher than that of simpler models such as SCA-XGBOOST, SOA-XGBOOST, and GWO-XGBOOST, it remains substantially lower than that of MFO-XGBOOST and only slightly higher than that of the traditional SSA-XGBOOST algorithm. The elevated computational time associated with SSALEO can be attributed to the integration of the LEO within the Salp Swarm framework, which introduces additional perturbation and exploration steps to escape local optima. However, this increase in time is offset by the significant improvements in prediction accuracy and robustness, as evidenced by superior R2, RMSE, and MSE performance across both training and testing phases. Compared to the baseline XGBOOST model, SSALEO-XGBOOST introduces higher runtime overhead, but this trade-off is justified in high-stakes environments. Conclusively, the computational time of SSALEO-XGBOOST is manageable and well-aligned with the demands of real-world supply chain systems, where model interpretability, accuracy, and generalizability often take precedence over minimal runtime. The model strikes a balance between convergence reliability and resource efficiency, making it a practical choice for time-sensitive yet precision-critical applications.
Figure 7.
Execution Time of the Compared Machine Learning Models.
4.2.5. Feature Importance Analysis
Understanding the relative importance of input features provides valuable insights into the underlying decision-making process of machine learning models. In this study, the feature importance ranking derived from the optimized SSALEO-XGBOOST model is presented in Figure 8, which quantifies each variable’s influence on the model’s predictive performance. The most influential predictor is Order Quantity, with a significantly dominant importance score exceeding 0.5. This result highlights the central role of order volume in determining supply chain profit, as it directly scales revenue and cost. The prominence of Order Quantity aligns well with practical supply chain economics, where profitability is highly sensitive to volume-based factors due to fixed and variable cost structures. Following this, Unit Price emerges as the second most impactful variable, with a substantial importance score of approximately 0.35. This emphasizes the direct influence of pricing strategy on profit margins. Conversely, Unit Cost holds a moderate influence, reflecting its indirect effect in shrinking or expanding the profit margin for each unit sold. Notably, features such as Discount Applied, Delivery Time, Sales Channel, and Warehouse Code exhibit minimal influence on the model’s decision structure. While discounting practices and delivery time intuitively affect customer behavior and operational efficiency, their relatively low scores suggest either limited variance in the dataset or indirect/nonlinear effects not prioritized by the gradient-boosting mechanism. Similarly, the low importance of categorical features like Sales Channel and Warehouse Code indicates that these attributes do not exhibit strong discriminatory power for profit prediction. These findings validate the model’s interpretability and emphasize the relevance of core operational variables (quantity, price, and cost) in shaping profitability.
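Extracting such a ranking is straightforward on any fitted gradient-boosting model. The sketch below uses scikit-learn's GradientBoostingRegressor as a stand-in for XGBOOST (both expose a `feature_importances_` attribute) on synthetic data whose coefficients loosely mimic the reported ranking; the feature names and data are illustrative assumptions, not the study's dataset:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 500
# Synthetic stand-in: profit driven mostly by order quantity, then unit price,
# then unit cost, echoing the ordering reported for Figure 8.
features = ["Order Quantity", "Unit Price", "Unit Cost", "Discount Applied"]
X = rng.normal(size=(n, len(features)))
y = 5.0 * X[:, 0] + 3.0 * X[:, 1] - 1.0 * X[:, 2] + 0.1 * rng.normal(size=n)

model = GradientBoostingRegressor(random_state=0).fit(X, y)
# Importance scores are normalized to sum to 1; sort descending for the ranking
ranking = sorted(zip(features, model.feature_importances_),
                 key=lambda kv: kv[1], reverse=True)
```

The same call on the tuned SSALEO-XGBOOST model yields the scores plotted in Figure 8.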
Figure 8.
Feature Importance.
4.2.6. Operational Implications of Model Performance
The SSALEO-XGBOOST model demonstrates exceptional predictive performance, as supported by its high R2 values and low error metrics, establishing it as a statistically robust tool for predicting supply chain profitability. These metrics indicate that the model captures nearly all variance in profit outcomes while maintaining consistent precision across multiple independent runs. Beyond statistical reliability, the model’s interpretability and actionable insights enhance its utility for practical supply chain management. This section elucidates the operational implications of the model’s performance, translating technical findings into strategic recommendations for supply chain practitioners. To leverage the SSALEO-XGBOOST model effectively in operational contexts, supply chain professionals should adopt the following strategies based on the model’s insights:
- Optimize Order Quantities: Given the dominant influence of Order Quantity, organizations should prioritize demand forecasting and aggregation strategies to determine optimal order sizes. This involves aligning procurement schedules with demand patterns and negotiating bulk purchase agreements with suppliers to secure volume-based discounts, thereby reducing per-unit costs and enhancing profit margins.
- Refine Pricing Strategies: The significant role of Unit Price underscores the importance of dynamic pricing models that balance competitiveness with profitability. Managers should conduct market analyses to set prices that reflect customer willingness to pay while ensuring sufficient margins to cover costs. Price elasticity studies can further inform adjustments to maximize revenue.
- Enhance Cost Control Measures: While Unit Cost has a moderate impact, it remains a critical lever for profitability. Supply chain teams should explore cost-reduction initiatives, such as process optimization, supplier diversification, or adoption of lean inventory practices, to minimize production and procurement expenses without compromising quality.
- Integrate Model Outputs into Decision-Making: The SSALEO-XGBOOST model should be embedded within decision support systems to provide real-time profit predictions. For instance, managers can simulate the profit impact of adjusting order quantities or pricing strategies under varying market conditions, enabling data-driven decision-making.
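The scenario simulation described in the last point can be sketched as follows. A trained model’s prediction function would normally supply the profit estimates; here a simple volume-times-margin proxy stands in so the sketch is self-contained, and all quantities, prices, and discounts are hypothetical.

```python
# Illustrative what-if analysis. profit_proxy() is a toy stand-in for a
# trained model's predict(); every figure below is hypothetical.

def profit_proxy(order_qty, unit_price, unit_cost, discount=0.0):
    """Toy profit estimate: volume times discounted unit margin."""
    return order_qty * (unit_price * (1.0 - discount) - unit_cost)

# Candidate scenarios a manager might compare before committing.
scenarios = [
    {"name": "baseline",      "qty": 100, "price": 12.0, "cost": 8.0, "disc": 0.00},
    {"name": "bulk+discount", "qty": 160, "price": 12.0, "cost": 7.5, "disc": 0.05},
    {"name": "premium price", "qty": 90,  "price": 13.5, "cost": 8.0, "disc": 0.00},
]

results = {
    s["name"]: profit_proxy(s["qty"], s["price"], s["cost"], s["disc"])
    for s in scenarios
}
best = max(results, key=results.get)
print(results, "->", best)
```

Embedding the actual SSALEO-XGBOOST predictor in place of the proxy would let the same loop score arbitrarily many quantity/price/discount combinations under varying market assumptions.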
The SSALEO-XGBOOST model transcends traditional predictive analytics by serving as a strategic decision support tool. Its ability to quantify the relative importance of operational variables bridges the gap between machine learning outputs and managerial action. By focusing on high-impact levers such as order quantity, pricing, and cost control, supply chain professionals can align tactical decisions with broader economic objectives, optimizing profitability while mitigating risks in volatile markets.
4.2.7. Managerial Implications, Limitations, and Future Research Directions
The proposed SSALEO-XGBOOST framework represents not only a methodological innovation in predictive modeling but also a significant contribution to supply chain analytics with tangible managerial implications. By integrating a hybrid metaheuristic optimization algorithm, namely SSALEO, into the hyperparameter tuning process of XGBOOST, the model achieves enhanced predictive accuracy and robustness. This synergistic approach advances the state of the art in machine learning applications for SCM, offering a scalable, data-driven decision-support system capable of addressing complex operational challenges. The profit indicator employed in this study represents not a narrow financial outcome but a data-driven component of the broader sustainability framework, consistent with the triple bottom line (TBL) approach [29]. In smart industrial systems, economic viability is inherently linked with environmental efficiency, where improved profit margins often reflect optimized material usage, reduced waste, and lower energy consumption [53]. Thus, enhancing the predictive precision of profit outcomes through the SSALEO-XGBOOST model directly contributes to sustainability governance by enabling organizations to design resource-efficient, cost-effective, and carbon-aware production systems. Consequently, while the methodological core centers on optimization, the overarching contribution of this research lies in advancing the analytical infrastructure for sustainability-aligned decision-making in Industry 4.0 environments.
Managerial Implications
From a managerial standpoint, the SSALEO-XGBOOST framework equips decision-makers with a powerful tool for optimizing key supply chain performance indicators, including profitability, lead time adherence, and procurement efficiency. The model leverages historical operational data such as order volumes, delivery delays, and unit pricing to generate accurate predictions of profit margins, enabling proactive strategic adjustments. These predictive insights facilitate informed decision-making in areas such as supplier negotiation, inventory replenishment, resource allocation, and demand planning. Notably, the integration of SSALEO with XGBOOST mitigates the limitations of conventional hyperparameter tuning methods, which are often computationally expensive and suboptimal. The metaheuristic-driven optimization reduces reliance on manual tuning and heuristic experimentation, thereby accelerating model deployment in real-world operational environments. This automation enhances both the reproducibility and scalability of predictive analytics in SCM, supporting agile and resilient supply chain operations.
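To make the metaheuristic-driven tuning concrete, the sketch below implements a heavily simplified SSA-style search with a local-escape step. The true objective would be XGBoost cross-validation error; a smooth surrogate with a known minimum stands in so the sketch runs on its own, and the bounds, population size, and escape rule are illustrative, not the authors’ exact SSALEO implementation.

```python
# Simplified sketch of SSALEO-style hyperparameter search (illustrative only).
import math
import random

random.seed(42)

LOW, HIGH = [0.01, 1.0], [0.5, 10.0]   # bounds: learning-rate, depth proxy

def objective(x):
    # Surrogate for XGBoost CV error, minimized at lr=0.1, depth=6.
    return (x[0] - 0.1) ** 2 + 0.01 * (x[1] - 6.0) ** 2

def clip(x):
    return [min(max(v, lo), hi) for v, lo, hi in zip(x, LOW, HIGH)]

pop = [[random.uniform(lo, hi) for lo, hi in zip(LOW, HIGH)] for _ in range(20)]
best = min(pop, key=objective)

for t in range(1, 101):
    c1 = 2 * math.exp(-((4 * t / 100) ** 2))   # SSA exploration weight
    for i, salp in enumerate(pop):
        if i == 0:  # leader: move around the best solution found so far
            new = []
            for d in range(2):
                step = c1 * ((HIGH[d] - LOW[d]) * random.random() + LOW[d])
                new.append(best[d] + step if random.random() < 0.5
                           else best[d] - step)
        else:       # follower: midpoint with its predecessor in the chain
            new = [(salp[d] + pop[i - 1][d]) / 2 for d in range(2)]
        new = clip(new)
        # Local-escape step (simplified LEO): if the move did not improve,
        # jump to a random perturbation around the best solution.
        if objective(new) >= objective(salp):
            peer = random.choice(pop)
            new = clip([best[d] + random.gauss(0, 0.1) * (best[d] - peer[d])
                        for d in range(2)])
        pop[i] = new
        if objective(new) < objective(best):
            best = list(new)

print("best hyperparameters:", best, "surrogate error:", objective(best))
```

In a full pipeline, `objective` would train XGBOOST with the candidate hyperparameters and return its cross-validated error, which is what makes population-based tuning computationally heavier than a single manual fit.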
Furthermore, by improving model generalization and reducing overfitting, the framework contributes to more reliable forecasting under uncertain or volatile market conditions. This capability is particularly valuable in dynamic supply chain ecosystems, where responsiveness and adaptability are critical for maintaining competitive advantage. As such, the model supports a shift from reactive to anticipatory management, aligning with contemporary trends in digital supply chain transformation. Beyond operational forecasting, predictive analytics such as the SSALEO-XGBOOST framework can directly inform policy and strategic interventions in sustainable industrial management. By accurately forecasting profitability and performance trends, the model enables organizations to prioritize resource-efficient suppliers, identify carbon-intensive operations, and allocate resources toward low-emission processes. This capability aligns with emerging sustainability policies that encourage carbon neutrality, responsible production, and digital circular-economy practices. From a policy perspective, the framework provides a data-driven foundation for regulators to monitor industrial sustainability performance, evaluate compliance, and incentivize eco-efficient innovation. At the enterprise level, the model can be integrated into decision-support systems (DSS), enabling automated insights for production scheduling, inventory optimization, and supplier selection aligned with environmental performance targets. Consequently, the SSALEO-XGBOOST approach serves not only as an analytical tool but also as a strategic enabler of sustainable digital transformation, bridging the gap between predictive intelligence and policy implementation in Industry 4.0 ecosystems.
Limitations of the Study
Despite its empirical performance and methodological novelty, this study is subject to several limitations that warrant acknowledgment. First, the model’s predictive efficacy is contingent upon the quality, completeness, and representativeness of the input data. In contexts characterized by sparse, noisy, or biased datasets, such as emerging markets with underdeveloped data infrastructure, the generalizability of the framework may be limited. Second, while the SSALEO algorithm enhances optimization efficiency, it introduces additional computational overhead due to its iterative, population-based search mechanism. This may affect the model’s suitability for real-time decision-making or large-scale industrial applications where latency and processing constraints are critical. Third, the current implementation is based on static, retrospective datasets and does not explicitly account for temporal dynamics or exogenous disruptions such as geopolitical instability, natural disasters, or abrupt shifts in consumer behavior. These unmodeled factors can significantly impact supply chain performance and represent a notable gap in the current analytical scope. Finally, the interpretability of the hybrid model, while improved through feature importance analysis, remains constrained by the inherent complexity of ensemble learning and metaheuristic components. This may limit stakeholder trust and adoption in settings where transparency and explainability are paramount. It is also important to acknowledge that this study’s analysis focuses exclusively on profitability, representing the economic dimension of the sustainability triad. The absence of environmental indicators (such as carbon emissions, energy consumption, or material efficiency) and of social indicators constitutes a scope limitation arising from data availability in the utilized Kaggle supply chain dataset.
Future Research Directions
To extend the theoretical and practical impact of this work, several directions merit consideration in future research. Future studies will adapt the SSALEO-XGBOOST framework to real-time or streaming data environments by incorporating hybrid deep learning architectures such as Long Short-Term Memory (LSTM) networks or Transformer models to enable continuous learning and adaptive prediction, thereby enhancing responsiveness to dynamic supply chain conditions. Furthermore, extending the model to multi-echelon supply networks would allow for the analysis of interdependencies across suppliers, manufacturers, distributors, and retailers, supporting holistic optimization and improving the understanding of cascading disruptions and systemic risks. To increase operational relevance, future research will integrate domain-specific constraints such as production capacity limits, contractual obligations, and sustainability targets directly into the optimization process, potentially through hybrid frameworks combining metaheuristics with constraint programming. Additionally, rigorous comparative benchmarking against alternative machine learning models, rule-based expert systems, and other hybrid AI-optimization approaches across diverse operational contexts would strengthen the empirical validation of the framework’s performance and robustness. Finally, evaluating the model under simulated disruption scenarios such as demand surges, supplier failures, or logistical bottlenecks would provide critical insights into its resilience and utility for risk mitigation and contingency planning. Addressing these research avenues could transform the framework from a predictive analytics tool into a comprehensive, adaptive decision-support system, thereby bridging the gap between advanced computational methods and practical supply chain execution, and contributing to the development of more agile, resilient, and sustainable supply chain ecosystems.
5. Conclusions
This study introduced a new hybrid machine learning framework, SSALEO-XGBOOST, which integrates the Salp Swarm Algorithm (SSA), enhanced with a Local Escaping Operator (LEO), into the XGBOOST model for improved supply chain profit prediction. The proposed approach addresses critical limitations associated with the traditional SSA, particularly its tendency to converge prematurely and become trapped in local optima, while simultaneously overcoming the sensitivity of XGBOOST to hyperparameter configurations. Comprehensive empirical evaluations were conducted using 5-fold cross-validation and 20 independent experimental runs to ensure robustness and statistical reliability. The results demonstrated that SSALEO-XGBOOST consistently achieved higher predictive performance than other metaheuristic-augmented XGBOOST models and baseline algorithms, with the highest R2 of 0.985 and the lowest error rates across RMSE, MSE, ME, and RAE. In addition to predictive accuracy, the model exhibited strong generalization capability and convergence stability. Feature importance analysis further revealed that Order Quantity and Unit Price were the most influential predictors of profit, enhancing the model’s interpretability and offering actionable insights for strategic decision-making in supply chain operations. From a managerial standpoint, the model provides a data-driven decision-support mechanism enabling firms to make informed strategic choices regarding resource allocation, supply chain optimization, and carbon-efficient production planning. Its predictive accuracy empowers organizations to transition toward intelligent, resilient, and sustainable operations.
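The evaluation protocol summarized above (k-fold cross-validation with R2 and RMSE per fold) can be sketched as follows. A one-variable least-squares fit on synthetic quantity/profit data stands in for the tuned XGBOOST model; the data, noise level, and coefficients are illustrative, not the study’s.

```python
# Sketch of 5-fold cross-validation with R^2 and RMSE per fold.
# Synthetic data and a simple linear fit stand in for the real model.
import math
import random

random.seed(0)
X = [random.uniform(10, 200) for _ in range(100)]      # order quantity
y = [4.0 * x + random.gauss(0, 20) for x in X]         # synthetic profit

def fit_line(xs, ys):
    """Closed-form simple linear regression: returns (slope, intercept)."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (v - my) for x, v in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def r2_rmse(y_true, y_pred):
    my = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - my) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot, math.sqrt(ss_res / len(y_true))

k = 5
fold = len(X) // k
scores = []
for i in range(k):
    test_idx = set(range(i * fold, (i + 1) * fold))
    xtr = [X[j] for j in range(len(X)) if j not in test_idx]
    ytr = [y[j] for j in range(len(X)) if j not in test_idx]
    xte = [X[j] for j in sorted(test_idx)]
    yte = [y[j] for j in sorted(test_idx)]
    slope, icept = fit_line(xtr, ytr)
    preds = [slope * x + icept for x in xte]
    scores.append(r2_rmse(yte, preds))

mean_r2 = sum(r for r, _ in scores) / k
mean_rmse = sum(e for _, e in scores) / k
print(f"mean R^2 = {mean_r2:.3f}, mean RMSE = {mean_rmse:.1f}")
```

Repeating this loop over multiple random seeds corresponds to the independent experimental runs used in the study to gauge stability across runs.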
Despite these strengths, the proposed SSALEO-XGBOOST model has several limitations that merit acknowledgment. First, the empirical analysis is based on a single-country dataset (U.S. supply chain records), which may constrain generalizability across different industrial or geopolitical contexts. Second, the sample size and sectoral scope limit the model’s validation in large-scale or cross-industry environments; future work will include multi-domain datasets encompassing manufacturing, logistics, and service sectors. Third, the study’s focus on short-term profit prediction does not yet capture broader sustainability dimensions such as environmental and social performance; future studies will extend the augmented XGBOOST framework toward circular economy evaluation, carbon-emission forecasting, and ESG analytics. Future research will also investigate novel optimization algorithms, adaptive parameter control, integration with deep learning regressors, and real-time deployment strategies to enhance the framework’s applicability in dynamic, complex supply chains.
Author Contributions
N.N.: conceptualization, methodology, validation; A.B.A.: writing, original draft preparation, supervision; O.R.A.: methodology, formal analysis, draft preparation. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
All data generated or analyzed during this study are included in this published article.
Conflicts of Interest
The authors declare no conflict of interest.
Appendix A

Figure A1.
Convergence Curves of SSALEO and other Algorithms on CEC2015.

Figure A2.
Box Plots of SSALEO and other Algorithms on CEC2015.
References
- Li, S.; Ragu-Nathan, B.; Ragu-Nathan, T.S.; Subba Rao, S. The impact of supply chain management practices on competitive advantage and organizational performance. Omega 2006, 34, 107–124. [Google Scholar] [CrossRef]
- Ning, L.; Yao, D. The Impact of Digital Transformation on Supply Chain Capabilities and Supply Chain Competitive Performance. Sustainability 2023, 15, 10107. [Google Scholar] [CrossRef]
- Meng, Q.; Zu, G.; Ge, L.; Li, S.; Xu, L.; Wang, R.; He, K.; Jin, S. Dispatching Strategy for Low-Carbon Flexible Operation of Park-Level Integrated Energy System. Appl. Sci. 2022, 12, 12309. [Google Scholar] [CrossRef]
- Maheshwari, S.; Gautam, P.; Jaggi, C.K. Role of Big Data Analytics in supply chain management: Current trends and future perspectives. Int. J. Prod. Res. 2021, 59, 1875–1900. [Google Scholar] [CrossRef]
- Jiang, S.; Yuan, X. Does green credit promote real economic development? Dual empirical evidence of scale and efficiency. PLoS ONE 2025, 20, e0326961. [Google Scholar] [CrossRef] [PubMed]
- Helo, P.; Hao, Y. Artificial intelligence in operations management and supply chain management: An exploratory case study. Prod. Plan. Control 2022, 33, 1573–1590. [Google Scholar] [CrossRef]
- Wen, Y.-H. Shipment forecasting for supply chain collaborative transportation management using grey models with grey numbers. Transp. Plan. Technol. 2011, 34, 605–624. [Google Scholar] [CrossRef]
- Tseng, F.-M.; Yu, H.-C.; Tzeng, G.-H. Applied Hybrid Grey Model to Forecast Seasonal Time Series. Technol. Forecast. Soc. Change 2001, 67, 291–302. [Google Scholar] [CrossRef]
- Quartey-Papafio, T.K.; Javed, S.A.; Liu, S. Forecasting cocoa production of six major producers through ARIMA and grey models. Grey Syst. Theory Appl. 2020, 11, 434–462. [Google Scholar] [CrossRef]
- Jia, X.; Wang, J.; Ma, T.T.; Wang, Q. Grey improvement model for intelligent supply chain demand forecasting. Int. J. Manuf. Technol. Manag. 2025, 39, 334–357. [Google Scholar] [CrossRef]
- Xia, M.; Wong, W.K. A seasonal discrete grey forecasting model for fashion retailing. Knowl.-Based Syst. 2014, 57, 119–126. [Google Scholar] [CrossRef]
- Tirkolaee, E.B.; Sadeghi, S.; Mooseloo, F.M.; Vandchali, H.R.; Aeini, S. Application of Machine Learning in Supply Chain Management: A Comprehensive Overview of the Main Areas. Math. Probl. Eng. 2021, 2021, 1476043. [Google Scholar] [CrossRef]
- Shen, X.; Li, L.; Ma, Y.; Xu, S.; Liu, J.; Yang, Z.; Shi, Y. VLCIM: A Vision-Language Cyclic Interaction Model for Industrial Defect Detection. IEEE Trans. Instrum. Meas. 2025, 74, 1–13. [Google Scholar] [CrossRef]
- Chen, S.; Long, X.; Fan, J.; Jin, G. A causal inference-based root cause analysis framework using multi-modal data in large-complex system. Reliab. Eng. Syst. Saf. 2026, 265, 111520. [Google Scholar] [CrossRef]
- Qiao, Y.; Lü, J.; Wang, T.; Liu, K.; Zhang, B.; Snoussi, H. A Multihead Attention Self-Supervised Representation Model for Industrial Sensors Anomaly Detection. IEEE Trans. Ind. Inform. 2024, 20, 2190–2199. [Google Scholar] [CrossRef]
- Zhang, B.; Cai, X.; Li, G.; Li, X.; Peng, M.; Yang, M. A modified A* algorithm for path planning in the radioactive environment of nuclear facilities. Ann. Nucl. Energy 2025, 214, 111233. [Google Scholar] [CrossRef]
- Harle, S.M.; Wankhade, R.L. Machine learning techniques for predictive modelling in geotechnical engineering: A succinct review. Discov. Civ. Eng. 2025, 2, 86. [Google Scholar] [CrossRef]
- Rezaei, A.; Abdellatif, I.; Umar, A. Towards Economic Sustainability: A Comprehensive Review of Artificial Intelligence and Machine Learning Techniques in Improving the Accuracy of Stock Market Movements. Int. J. Financ. Stud. 2025, 13, 28. [Google Scholar] [CrossRef]
- Mao, F.; Chen, M.; Zhong, K.; Zeng, J.; Liang, Z. An XGBoost-assisted evolutionary algorithm for expensive multiobjective optimization problems. Inf. Sci. 2024, 666, 120449. [Google Scholar] [CrossRef]
- Ma, M.; Zhao, G.; He, B.; Li, Q.; Dong, H.; Wang, S.; Wang, Z. XGBoost-based method for flash flood risk assessment. J. Hydrol. 2021, 598, 126382. [Google Scholar] [CrossRef]
- Amjad, M.; Ahmad, I.; Ahmad, M.; Wróblewski, P.; Kamiński, P.; Amjad, U. Prediction of Pile Bearing Capacity Using XGBoost Algorithm: Modeling and Performance Evaluation. Appl. Sci. 2022, 12, 2126. [Google Scholar] [CrossRef]
- Pan, S.; Zheng, Z.; Guo, Z.; Luo, H. An optimized XGBoost method for predicting reservoir porosity using petrophysical logs. J. Pet. Sci. Eng. 2022, 208, 109520. [Google Scholar] [CrossRef]
- Mohammed, S.I.; Hussein, N.K.; Haddani, O.; Aljohani, M.; Alkahya, M.A.; Qaraad, M. Fine-Tuned Cardiovascular Risk Assessment: Locally Weighted Salp Swarm Algorithm in Global Optimization. Mathematics 2024, 12, 243. [Google Scholar] [CrossRef]
- Algwil, A.R.A.; Khalifa, W.M.S. An enhanced moth flame optimization extreme learning machines hybrid model for predicting CO2 emissions. Sci. Rep. 2025, 15, 11948. [Google Scholar] [CrossRef] [PubMed]
- Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
- Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
- Adegboye, O.R.; Feda, A.K.; Tejani, G.G.; Smerat, A.; Kumar, P.; Agyekum, E.B. Salp Navigation and Competitive based Parrot Optimizer (SNCPO) for efficient extreme learning machine training and global numerical optimization. Sci. Rep. 2025, 15, 13704. [Google Scholar] [CrossRef]
- Li, G.; Cui, Y.; Su, J. A multi-strategy improved snake optimizer for three-dimensional UAV path planning and engineering problems. arXiv 2025, arXiv:2507.14043. [Google Scholar] [CrossRef]
- Elkington, J. Accounting for the Triple Bottom Line. Meas. Bus. Excell. 1998, 2, 18–22. [Google Scholar] [CrossRef]
- Qaraad, M.; Amjad, S.; Hussein, N.K.; Mirjalili, S.; Halima, N.B.; Elhosseini, M.A. Comparing SSALEO as a Scalable Large Scale Global Optimization Algorithm to High-Performance Algorithms for Real-World Constrained Optimization Benchmark. IEEE Access 2022, 10, 95658–95700. [Google Scholar] [CrossRef]
- Lu, H.; Zhang, L.; Chen, H.; Zhang, S.; Wang, S.; Peng, H.; Zou, J. An Indoor Fingerprint Positioning Algorithm Based on WKNN and Improved XGBoost. Sensors 2023, 23, 3952. [Google Scholar] [CrossRef] [PubMed]
- Nabavi, Z.; Mirzehi, M.; Dehghani, H.; Ashtari, P. A Hybrid Model for Back-Break Prediction using XGBoost Machine learning and Metaheuristic Algorithms in Chadormalu Iron Mine. J. Min. Environ. 2023, 14, 689–712. [Google Scholar] [CrossRef]
- Mohiuddin, G.; Lin, Z.; Zheng, J.; Wu, J.; Li, W.; Fang, Y.; Wang, S.; Chen, J.; Zeng, X. Intrusion Detection using hybridized Meta-heuristic techniques with Weighted XGBoost Classifier. Expert Syst. Appl. 2023, 232, 120596. [Google Scholar] [CrossRef]
- Kazemi, M.M.K.; Nabavi, Z.; Armaghani, D.J. A novel Hybrid XGBoost Methodology in Predicting Penetration Rate of Rotary Based on Rock-Mass and Material Properties. Arab. J. Sci. Eng. 2024, 49, 5225–5241. [Google Scholar] [CrossRef]
- Mosa, M.A. Optimizing text classification accuracy: A hybrid strategy incorporating enhanced NSGA-II and XGBoost techniques for feature selection. Prog. Artif. Intell. 2025, 14, 275–299. [Google Scholar] [CrossRef]
- Xi, B.; Huang, Z.; Al-Obaidi, S.; Ferrara, L. Predicting ultra high-performance concrete self-healing performance using hybrid models based on metaheuristic optimization techniques. Constr. Build. Mater. 2023, 381, 131261. [Google Scholar] [CrossRef]
- Krishna, A.Y.; Kiran, K.R.; Sai, N.R.; Sharma, A.; Praveen, S.P.; Pandey, J. Ant Colony Optimized XGBoost for Early Diabetes Detection: A Hybrid Approach in Machine Learning. J. Intell. Syst. Internet Things 2023, 10. Available online: https://openurl.ebsco.com/contentitem/doi:10.54216%2FJISIoT.100207?sid=ebsco:plink:crawler&id=ebsco:doi:10.54216%2FJISIoT.100207 (accessed on 11 May 2025). [CrossRef]
- Ozcan, T.; Ozmen, E.P. Prediction of Heart Disease Using a Hybrid XGBoost-GA Algorithm with Principal Component Analysis: A Real Case Study. Int. J. Artif. Intell. Tools 2023, 32, 2340009. [Google Scholar] [CrossRef]
- Qiu, Y.; Zhou, J. Short-Term Rockburst Damage Assessment in Burst-Prone Mines: An Explainable XGBOOST Hybrid Model with SCSO Algorithm. Rock Mech. Rock Eng. 2023, 56, 8745–8770. [Google Scholar] [CrossRef]
- Zhang, J.; Wang, R.; Lu, Y.; Huang, J. Prediction of Compressive Strength of Geopolymer Concrete Landscape Design: Application of the Novel Hybrid RF–GWO–XGBoost Algorithm. Buildings 2024, 14, 591. [Google Scholar] [CrossRef]
- Iqbal, M.; Fan, Y.; Ahmad, N.; Ullah, I. Circular economy solutions for net-zero carbon in China’s construction sector: A strategic evaluation. J. Clean. Prod. 2025, 504, 145398. [Google Scholar] [CrossRef]
- Iqbal, M.; Ma, J.; Ahmad, N.; Ullah, Z.; Hassan, A. Energy-Efficient supply chains in construction industry: An analysis of critical success factors using ISM-MICMAC approach. Int. J. Green Energy 2023, 20, 265–283. [Google Scholar] [CrossRef]
- Iqbal, M.; Waqas, M.; Ahmad, N.; Hussain, K.; Hussain, J. Green supply chain management as a pathway to sustainable operations in the post-COVID-19 era: Investigating challenges in the Chinese scenario. Bus. Process Manag. J. 2024, 30, 1065–1087. [Google Scholar] [CrossRef]
- Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, in KDD ’16, San Francisco, CA, USA, 13–17 August 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 785–794. [Google Scholar] [CrossRef]
- Liang, J.J.; Qu, B.Y.; Suganthan, P.N.; Chen, Q. Problem Definitions and Evaluation Criteria for the CEC 2015 Competition on Learning-Based Real-Parameter Single Objective Optimization; Technical Report 201411A; Computational Intelligence Laboratory, Zhengzhou University: Zhengzhou, China; Nanyang Technological University: Singapore, 2014; Volume 29. [Google Scholar]
- Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
- Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. Honey Badger Algorithm: New metaheuristic algorithm for solving optimization problems. Math. Comput. Simul. 2022, 192, 84–110. [Google Scholar] [CrossRef]
- Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
- Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
- Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl.-Based Syst. 2019, 165, 169–196. [Google Scholar] [CrossRef]
- SupplyChainAnalysis [US-Regional-Sales]. Available online: https://www.kaggle.com/code/omarmelzeki/supplychainanalysis-us-regional-sales (accessed on 12 May 2025).
- Khajavi, H.; Rastgoo, A.; Masoumi, F. Enhanced Streamflow Forecasting for Crisis Management Based on Hybrid Extreme Gradient Boosting Model. Iran. J. Sci. Technol. Trans. Civ. Eng. 2025, 49, 5233–5254. [Google Scholar] [CrossRef]
- Bag, S.; Yadav, G.; Dhamija, P.; Kataria, K.K. Key resources for industry 4.0 adoption and its effect on sustainable production and circular economy: An empirical study. J. Clean. Prod. 2021, 281, 125233. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).