Article

A Bio-Inspired Adaptive Probability IVYPSO Algorithm with Adaptive Strategy for Backpropagation Neural Network Optimization in Predicting High-Performance Concrete Strength

1 School of Computer Science, Hubei University of Technology, Wuhan 430068, China
2 Department of Electronic Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
3 Institute for Environmental Design and Engineering, University College London, London WC1H 0NN, UK
* Author to whom correspondence should be addressed.
Biomimetics 2025, 10(8), 515; https://doi.org/10.3390/biomimetics10080515
Submission received: 17 June 2025 / Revised: 27 July 2025 / Accepted: 5 August 2025 / Published: 6 August 2025

Abstract

Accurately predicting the compressive strength of high-performance concrete (HPC) is critical for ensuring structural integrity and promoting sustainable construction practices. However, HPC exhibits highly complex, nonlinear, and multi-factorial interactions among its constituents (such as cement, aggregates, admixtures, and curing conditions), which pose significant challenges to conventional predictive models. Traditional approaches often fail to adequately capture these intricate relationships, resulting in limited prediction accuracy and poor generalization. Moreover, the high dimensionality and noisy nature of HPC mix data increase the risk of model overfitting and convergence to local optima during optimization. To address these challenges, this study proposes a novel bio-inspired hybrid optimization model, AP-IVYPSO-BP, which is specifically designed to handle the nonlinear and complex nature of HPC strength prediction. The model integrates the ivy algorithm (IVYA) with particle swarm optimization (PSO) and incorporates an adaptive probability strategy based on fitness improvement to dynamically balance global exploration and local exploitation. This design effectively mitigates common issues such as premature convergence, slow convergence speed, and weak robustness in traditional metaheuristic algorithms when applied to complex engineering data. The AP-IVYPSO is employed to optimize the weights and biases of a backpropagation neural network (BPNN), thereby enhancing its predictive accuracy and robustness. The model was trained and validated on a dataset comprising 1030 HPC mix samples. Experimental results show that AP-IVYPSO-BP significantly outperforms traditional BPNN, PSO-BP, GA-BP, and IVY-BP models across multiple evaluation metrics. Specifically, it achieved an R² of 0.9542, MAE of 3.0404, and RMSE of 3.7991 on the test set, demonstrating its high accuracy and reliability. These results confirm the potential of the proposed bio-inspired model in the prediction and optimization of concrete strength, offering practical value in civil engineering and materials design.

1. Introduction

With the rapid pace of urbanization and ongoing infrastructure development, high-performance concrete (HPC) has found widespread application in high-rise buildings [1], bridge engineering [2], nuclear power plants [3], and other sectors due to its excellent mechanical properties, durability, and workability [4,5]. Among the various performance indicators, concrete compressive strength is a critical factor influencing the safety and durability of engineering structures [6,7,8]. Accurate prediction of the compressive strength of HPC not only ensures structural safety and durability but also optimizes material utilization, reduces waste, and advances sustainable development within the construction sector. Traditionally, compressive strength has been determined through experimental methods. However, this approach is not only time-consuming and costly but also affected by environmental factors and human error, making it challenging to meet the dual demands of both efficiency and accuracy in engineering design [9]. Consequently, the development of an efficient, stable, and highly accurate compressive strength prediction model has become a focal point of research in civil engineering and materials science [10,11,12].
In addressing the challenge of predicting concrete strength, various modeling approaches have been proposed, including multiple linear regression (MLR) [13,14], decision tree (DT) [15,16], support vector machine (SVM) [17,18], and other statistical and machine learning models [19]. However, due to the inherent nonlinearities and multivariate coupling in concrete materials, these methods often struggle to fully capture the complex nonlinear relationships between inputs and outputs, leading to limited prediction accuracy [20]. In contrast, artificial neural networks (ANN) [21], particularly backpropagation neural networks (BPNN) [22,23,24,25], have been widely used in concrete performance prediction because of their strong nonlinear modeling capabilities and adaptability.
Despite the advantages of BPNN in terms of modeling accuracy, their training process relies on the gradient descent optimization method, which is prone to local minima and sensitive to the initialization of weights and network architecture [26]. This sensitivity can adversely affect the model’s predictive accuracy, training efficiency, and its convergence speed [27]. To mitigate these limitations, researchers have explored integrating various intelligent optimization algorithms with BPNN to optimize weights and thresholds, thereby enhancing the model’s global search capability and generalization ability. FZ El-Hassani et al. [28] developed a GA-optimized BPNN model for thyroid disease diagnosis, effectively addressing local minima and slow convergence issues. F Ma et al. [29] applied a GA-optimized BPNN to forecast regional logistics demand, improving prediction accuracy and reducing iteration times. A Abdurrakhman et al. [30] proposed a PSO-optimized adaptive BPNN model to predict and optimize the output power of biogas-fueled generators, achieving high accuracy and effective parameter tuning. Z Wang et al. [31] proposed an XGBoost-assisted OTDBO-BPNN for predicting HPC compressive strength, which achieved superior accuracy by enhancing DBO’s global search and convergence performance through four strategic improvements. In addition, recent research has demonstrated the effectiveness of combining deep learning and 3D point cloud technologies to enhance the performance of intelligent optimization models in human–machine interaction and rehabilitation engineering [32,33], providing further insight for the development of hybrid intelligent frameworks in civil engineering contexts. Meanwhile, recent studies have also emphasized the significance of co-optimizing neural networks using adaptive evolutionary algorithms [34] and the increasing industrial relevance of hybrid nature-inspired population-based optimization methods [35], further supporting the rationale of our proposed hybrid framework.
Inspired by biomimetic principles, where natural systems often balance global exploration with local exploitation to adapt efficiently to complex environments [36,37], this paper introduces a novel BPNN model optimized using an adaptive probability hybrid ivy algorithm (IVYA) [38] and particle swarm optimization (PSO) [39] based on fitness improvement (AP-IVYPSO). The IVYA, drawing inspiration from the natural growth process of ivy plants, mimics biological behaviors to perform efficient local search, while PSO simulates the social behavior of bird flocking for global search. By combining these biomimetic algorithms, the proposed model effectively balances exploration and exploitation, addressing challenges faced by traditional optimization techniques.
The proposed model leverages the global search capabilities of PSO and the local search characteristics of the IVYA. By employing a fitness-based adaptive probability strategy, the model dynamically adjusts the update rules of PSO and IVYA, improving the accuracy, stability, and generalization ability of concrete strength predictions. To evaluate the model’s effectiveness, experiments were conducted using a publicly available concrete dataset and compared with traditional models such as BPNN, PSO-BP, GA-BP, and IVY-BP. The results show that the AP-IVYPSO-BP model outperforms these models across various evaluation metrics, particularly in enhancing the robustness and prediction accuracy of the model. The main contributions of this paper can be summarized as follows:
  • We introduce a new hybrid algorithm, AP-IVYPSO, which combines the IVYA and PSO, and incorporates an adaptive probability strategy based on fitness improvement. This biomimetic-inspired approach strikes a balance between global search capability and local search efficiency, effectively addressing the challenges faced by single optimization algorithms—such as getting stuck in local optima, slow convergence, and instability—when dealing with complex nonlinear problems.
  • Through comparison experiments with 10 widely recognized optimization algorithms (PSO, IVYA, HFPSO, HJSPSO, BOA, WOA, GOOSE, PSOBOA, NSM-BO, and FDB-AGSK) on 26 benchmark test functions, AP-IVYPSO demonstrates exceptional optimization capability and high stability.
  • When applied to optimize a BP neural network, AP-IVYPSO effectively overcomes the local optima issue typically faced by traditional gradient descent methods, significantly improving the stability of the model. In comparison to existing prediction models, the AP-IVYPSO-BP model performs best on multiple evaluation metrics (R² = 0.9542, MAE = 3.0404, and RMSE = 3.7991), further validating the superior performance of the proposed approach.
The remainder of this paper is organized as follows: Section 2 introduces the fundamentals of BPNN, PSO, and the IVYA. Section 3 presents the construction and parameter optimization mechanism of the AP-IVYPSO-BP model; Section 4 details the experimental setup, performance evaluation, and comparison with baseline models. Section 5 presents the concluding remarks of this study and delineates prospective directions for subsequent research endeavors.

2. Materials and Methods

2.1. BP Neural Network

In recent years, the BPNN, as a classical multilayer feedforward neural network, has been widely applied in nonlinear modeling, regression prediction, and classification tasks [40,41]. Its primary strengths are its ability to capture intricate nonlinear relationships and its strong generalization capability, achieved with a relatively interpretable model structure.
A BPNN is generally composed of three fundamental parts: an input layer, one or more hidden layers, and an output layer, with interconnections defined by adjustable weights. During training, the network processes input data through forward propagation to produce outputs and then applies error backpropagation to iteratively adjust the weights and biases based on the difference between predicted and expected values. This learning mechanism enables the model to progressively minimize prediction errors and extract deep correlations among input features.
Although the use of gradient descent during training may lead to local optima, BPNNs remain popular due to their structural flexibility and adaptability. To further improve convergence speed and prediction accuracy, recent studies have integrated BPNNs with intelligent optimization algorithms, resulting in hybrid models with enhanced robustness and global search capabilities.
In this study, the BP neural network is utilized as the foundational model for predicting concrete strength. Instead of relying on the traditional BP backpropagation training method, its weights and biases are optimized through an external global search using a swarm intelligence algorithm.
Each candidate solution $X_i$ represents an initial set of network parameters. The swarm intelligence algorithm iteratively adjusts these parameters to minimize the prediction error on the validation dataset. This approach mitigates common issues associated with conventional training techniques, such as local minima and vanishing gradients.
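The following minimal sketch illustrates this encoding on a toy network; the layer sizes and helper names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

# A minimal sketch of how a candidate solution X_i can encode all weights and
# biases of a small feedforward network as one flat vector, with validation
# RMSE as the fitness to minimize. Layer sizes are illustrative assumptions.

def make_shapes(n_in, n_hidden, n_out):
    """(weight, bias) shapes for a single-hidden-layer network."""
    return [(n_hidden, n_in), (n_hidden,), (n_out, n_hidden), (n_out,)]

def unpack(x, shapes):
    """Slice the flat parameter vector x back into weight/bias arrays."""
    arrays, offset = [], 0
    for shape in shapes:
        size = int(np.prod(shape))
        arrays.append(x[offset:offset + size].reshape(shape))
        offset += size
    return arrays

def predict(x, shapes, inputs):
    """Forward pass: tanh hidden layer, linear output."""
    W1, b1, W2, b2 = unpack(x, shapes)
    hidden = np.tanh(inputs @ W1.T + b1)
    return hidden @ W2.T + b2

def fitness(x, shapes, inputs, targets):
    """Validation RMSE that the swarm optimizer minimizes."""
    preds = predict(x, shapes, inputs).ravel()
    return float(np.sqrt(np.mean((preds - targets) ** 2)))

shapes = make_shapes(n_in=8, n_hidden=16, n_out=1)
dim = sum(int(np.prod(s)) for s in shapes)    # dimensionality U of a candidate
x0 = np.random.uniform(-1, 1, dim)            # one candidate solution X_i
print(fitness(x0, shapes, np.random.rand(20, 8), np.random.rand(20)))
```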

2.2. PSO Algorithm

PSO, introduced by Kennedy and Eberhart in 1995, is a population-based metaheuristic inspired by the collective behavior observed in bird flocks and fish schools during foraging. By exchanging information among individuals within a population, the algorithm efficiently balances exploration of the global search space and exploitation of promising local regions to address complex optimization problems [42,43].
In PSO, each individual—referred to as a particle—represents a candidate solution and is characterized by its position vector $X_i = [x_{i1}, x_{i2}, x_{i3}, \ldots, x_{iq}, \ldots, x_{iU}]$ and velocity vector $V_i = [v_{i1}, v_{i2}, v_{i3}, \ldots, v_{iq}, \ldots, v_{iU}]$, where $i = 1, 2, \ldots, N$, $N$ denotes the population size, $q = 1, 2, \ldots, U$, and $U$ denotes the problem dimensionality. At each iteration, a particle's trajectory is updated based on three fundamental influences:
(1)
Inertia term: This retains a particle’s previous velocity, aiding in the continuation of its current search direction and contributing to global exploration.
(2)
Cognitive term: This component reflects the particle's own experience, guiding its movement toward its personal best position $P_{best,i}$.
(3)
Social term: Representing collective intelligence, this steers particles toward the globally best-known position $G_{best}$ discovered by the swarm.
During the search, each particle's position is therefore influenced jointly by its personal best position $P_{best,i}$ and the global best position $G_{best}$ found by the entire swarm.
The formulas for updating the particle’s position and velocity are presented in Equation (1) and Equation (2), respectively.
$$X_i^{k+1} = X_i^k + V_i^{k+1}, \tag{1}$$
$$V_i^{k+1} = \omega V_i^k + c_1 \xi_1 \left( P_{best,i} - X_i^k \right) + c_2 \xi_2 \left( G_{best} - X_i^k \right), \tag{2}$$
where $\omega$ denotes the inertia weight that regulates the trade-off between exploration and exploitation, and $k = 1, 2, \ldots, T$ is the current iteration index, with $T$ the maximum number of iterations. The parameters $c_1$ and $c_2$ are the cognitive and social acceleration coefficients, while $\xi_1$ and $\xi_2$ are uniformly distributed random numbers in the range [0, 1], introducing stochasticity into the search process. The termination condition is defined as reaching the maximum number of iterations, ensuring a balance between convergence quality and computational efficiency.
In the task of predicting concrete compressive strength, PSO is employed to optimize the parameters of the BP neural network. Each particle represents a set of initial parameters for the BP network, with the particle’s trajectory in the search space reflecting the path taken to find the optimal network weights and biases. Each particle’s position vector encodes a set of weights and biases for the BP neural network, and the population consists of several particles that co-evolve within the search space. PSO dynamically adjusts each particle’s velocity and position based on both its individual best position and the global best position, guiding the entire population towards a more optimal solution. The search space is defined by the complex, non-convex space of network parameters, and the fitness function is evaluated using the RMSE on the validation set. Through the iterative particle search process, an optimal set of network parameters can be identified more rapidly, improving prediction accuracy and enhancing the model’s robustness.
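As an illustration of Equations (1) and (2), the sketch below runs standard PSO on a toy sphere objective; substituting the BPNN validation RMSE for the sphere function yields the training scheme described above. All hyperparameter values are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
N, U, T = 30, 10, 500                       # population size, dimension, iterations
w, c1, c2 = 0.7, 1.5, 1.5                   # inertia and acceleration coefficients

def sphere(x):                              # stand-in objective to be minimized
    return float(np.sum(x ** 2))

X = rng.uniform(-5, 5, (N, U))              # particle positions
V = np.zeros((N, U))                        # particle velocities
P_best = X.copy()                           # personal best positions
p_fit = np.array([sphere(x) for x in X])
G_best = P_best[p_fit.argmin()].copy()      # global best position

for k in range(T):
    xi1, xi2 = rng.random((N, U)), rng.random((N, U))
    V = w * V + c1 * xi1 * (P_best - X) + c2 * xi2 * (G_best - X)   # Eq. (2)
    X = X + V                                                       # Eq. (1)
    fit = np.array([sphere(x) for x in X])
    better = fit < p_fit                    # update personal and global memories
    P_best[better], p_fit[better] = X[better], fit[better]
    G_best = P_best[p_fit.argmin()].copy()

print(f"best fitness after {T} iterations: {p_fit.min():.3e}")
```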
Figure 1 illustrates the operational workflow of the PSO algorithm.

2.3. Ivy Algorithm

The IVYA is an optimization method based on swarm intelligence, drawing inspiration from the adaptive and exploratory nature of vine plants in the natural world. Vines exhibit dynamic behaviors such as climbing, stretching, and expanding as they seek vital resources like sunlight and nutrients. This biological strategy provides a conceptual foundation for tackling global optimization problems. The IVYA emulates several phases of ivy development, including propagation, vertical climbing, and lateral spreading [40]. The algorithm consists of the following four primary stages:
(1)
Population Initialization. At the outset, a population of candidate solutions is generated. Let $N$ denote the number of individuals and $U$ the dimensionality of the optimization problem. Each individual is represented as a $U$-dimensional vector $I_i = (I_{i1}, \ldots, I_{iU})$, where $i \in \{1, 2, \ldots, N\}$, so that the entire ivy population is given by $I = (I_1, \ldots, I_i, \ldots, I_N)$. The initial positions of the individuals are randomly determined within the defined search boundaries using Equation (3):
$$I_i = I_{min} + U(0,1) \odot (I_{max} - I_{min}), \quad i = 1, \ldots, N, \tag{3}$$
where $U(0,1)$ denotes a $U$-dimensional vector of random numbers uniformly distributed between 0 and 1, and $\odot$ represents the Hadamard (element-wise) product between two vectors. The search boundaries are defined by $I_{min}$ and $I_{max}$, which represent the lower and upper bounds of the decision space, respectively.
(2)
Controlled Growth Dynamics. The population evolves in a structured manner that mimics ivy growth. The growth rate $GV$ is assumed to vary over time, as described by the differential Equation (4):
$$\frac{d\,GV(t)}{dt} = \psi\, GV(t) - \varphi\left( GV(t) \right), \tag{4}$$
where $\psi$ is a velocity factor, $\varphi$ is a nonlinear correction term, and $GV$ is the current growth rate. Individual updates are defined using Equation (5):
$$\Delta GV_i(t+1) = U(0,1)^2 \left( N(0,1) \odot \Delta GV_i(t) \right), \tag{5}$$
where $\Delta GV_i(t+1)$ and $\Delta GV_i(t)$ are the growth variations at successive time steps and $N(0,1)$ is a Gaussian-distributed random vector.
(3)
Sunlight-Driven Adaptation. In nature, vines grow directionally towards light sources, often attaching to structures that support upward movement. This phototropic tendency is captured in the IVYA by encouraging individuals to improve based on their best-performing neighbor. The optimal peer $I_{ii}$ for individual $I_i$ is chosen using Equation (6):
$$I_{ii} = \begin{cases} I_{j-1}^{s}, & \text{if } I_i = I_j^{s} \text{ and } j > 1, \\ I_i, & \text{if } I_i = I_{best}, \end{cases} \tag{6}$$
where the variable $j$ represents the index of individual $I_i$ in the population sorted by fitness from best to worst, with $j \in \{1, 2, \ldots, N\}$. The selection procedure is as follows:
  • Sort the population by fitness from best to worst, producing the sorted sequence $I_1^s, I_2^s, \ldots, I_N^s$.
  • Find the position $j$ of individual $I_i$ in this sorted list.
  • If $j > 1$, the optimal peer $I_{ii}$ is the immediately better-ranked neighbor $I_{j-1}^s$.
  • If $j = 1$, meaning $I_i$ is the current best individual $I_{best}$, then $I_{ii} = I_{best}$ itself.
This selection ensures that each individual learns from the nearest superior peer in the fitness ranking, mimicking the natural tendency of vines to grow towards more favorable structures.
The new state of individual $I_i$ is calculated with Equations (7) and (8):
$$I_i^{new1} = K(1,U) \odot (I_{ii} - I_i) + I_i + K(1,U) \odot \Delta GV_i, \quad i = 1, 2, \ldots, N, \tag{7}$$
$$\Delta GV_i = \begin{cases} I_i / (I_{max} - I_{min}), & \text{Iter} = 1, \\ U(0,1)^2 \left( N(0,1) \odot \Delta GV_i \right), & \text{Iter} > 1, \end{cases} \tag{8}$$
where $K(1,U)$ represents a $1 \times U$ vector whose components are the absolute values of standard normal samples, enhancing diversity and exploration in the search space, and the division in Equation (8) is taken element-wise.
To replicate this behavior, the algorithm allows each individual $I_i$ to identify and refer to the nearest neighbor $I_{ii}$ with superior fitness as a guide for its own evolution process. This mechanism, which mimics the natural tendency of vines to grow toward favorable conditions, is illustrated in Figure 2.
The last primary stage is as follows:
(4)
Growth behavior and evolutionary adjustment. Once an individual $I_i$ has explored the global space and located its closest high-quality neighbor $I_{ii}$, it proceeds to align its search direction toward the current global best solution $I_{best}$. This phase emphasizes exploiting the local region around $I_{best}$ to refine the solution, as formulated in Equations (9) and (10):
$$I_i^{new} = I_{best} \odot \left( U(0,1) + N(0,1) \odot \Delta GV_i \right), \tag{9}$$
$$\Delta GV_i^{new} = I_i^{new} / (I_{max} - I_{min}), \tag{10}$$
In this study, the IVYA is employed to optimize the initial weights and biases of the BP neural network, thereby enhancing the accuracy of the network’s prediction of concrete compressive strength. In practice, each IVYA individual is represented as a vector, which encodes a complete set of neural network weights and bias parameters. The population consists of multiple such individuals, and the search space is defined by the high-dimensional, non-convex error function space associated with the BP network parameters. IVYA efficiently explores this complex space through mechanisms such as simulating vine extension, selecting growth nodes, and perturbing local solutions. In each iteration of the algorithm, a new growth node is generated, corresponding to a new configuration of neural network parameters, which is then used to construct the corresponding BP model and assess its prediction performance on both the training and validation datasets. The fitness function is defined as the root mean square error (RMSE) on the validation set, with the goal of minimizing this error to improve the network’s generalization ability. This approach effectively mitigates common challenges in neural network training, such as vanishing gradients and local optima.
The search process is terminated when the maximum number of iterations is reached. Figure 3 illustrates the procedural flow of the ivy algorithm.
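The sketch below assembles Equations (3) and (5)-(10) into one illustrative IVYA generation. Two interpretation assumptions are made explicit in the comments: $K(1,U)$ is treated as a vector of $|N(0,1)|$ samples, and the growth variation in Equations (8) and (10) is an element-wise division by the search range.

```python
import numpy as np

rng = np.random.default_rng(1)
N, U = 30, 10
I_min, I_max = -5.0, 5.0

def f(x):                                    # stand-in fitness to be minimized
    return float(np.sum(x ** 2))

I = I_min + rng.random((N, U)) * (I_max - I_min)   # Eq. (3)
dGV = I / (I_max - I_min)                          # Eq. (8), Iter = 1 (assumed element-wise)

order = np.argsort([f(x) for x in I])              # sort from best to worst
I, dGV = I[order], dGV[order]
I_best = I[0].copy()

for j in range(N):
    I_ii = I[j - 1] if j > 0 else I_best           # Eq. (6): better-ranked peer
    K = np.abs(rng.standard_normal(U))             # assumed meaning of K(1,U)
    new1 = K * (I_ii - I[j]) + I[j] + K * dGV[j]                       # Eq. (7)
    new2 = I_best * (rng.random(U) + rng.standard_normal(U) * dGV[j])  # Eq. (9)
    cand = np.clip(min((new1, new2), key=f), I_min, I_max)
    if f(cand) < f(I[j]):                          # greedy choice (Section 3.1.4)
        I[j], dGV[j] = cand, cand / (I_max - I_min)                    # Eq. (10)
    else:                                          # Eq. (5)/(8), Iter > 1
        dGV[j] = rng.random(U) ** 2 * (rng.standard_normal(U) * dGV[j])
```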

3. Construction and Parameter Optimization Mechanism of the AP-IVYPSO-BP Model

This section presents a detailed description of the construction and parameter optimization mechanism of the AP-IVYPSO-BP model. To address the challenges faced by traditional BPNNs, including local optima and low convergence efficiency during training, this paper introduces AP-IVYPSO to optimize the BPNN. The proposed model integrates the global search capabilities of PSO with the local search features of the IVYA. Unlike simple hybrid strategies that alternate update steps or linearly weight two algorithms, the AP-IVYPSO model adaptively switches between PSO and IVYA based on a fitness-driven probability function. This mechanism ensures seamless cooperation between the fast exploration of PSO and the refined exploitation of IVYA. Through an adaptive probability mechanism based on fitness improvement, the model dynamically adjusts the update strategies of PSO and IVYA. The strategy mimics the natural adaptation behavior of vine plants exposed to sunlight, which selectively grow towards better environmental conditions, an inspiration that guides this adaptive optimization framework. This combination effectively enhances the accuracy, stability, and generalization ability of concrete compressive strength prediction.
Specifically, in addressing the typical regression problem of predicting concrete strength, the BP network’s training process is redefined as a parameter optimization problem, where the AP-IVYPSO algorithm directly updates the parameters within the network architecture through iterative adjustments.

3.1. Implementation Mechanism of AP-IVYPSO

To improve the dynamic balance between global search capability and local refinement ability in complex engineering problems, this paper introduces an intelligent optimization algorithm based on an adaptive probability-guided mechanism, named AP-IVYPSO. The method incorporates an adaptive probability control mechanism into the core iteration process, dynamically adjusting the tendency of individuals to select search strategies at various stages. This facilitates the complementary coordination between the global exploration ability of PSO and the local fine search capability of the IVYA. By guiding the switching of search strategies at different stages of the evolutionary process, AP-IVYPSO effectively enhances both the global convergence performance and the local convergence accuracy of the algorithm. As a result, it achieves adaptive adjustment of the search direction and a seamless integration of multi-stage search behaviors. The method demonstrates excellent adaptability and robustness, making it particularly suitable for solving complex, nonlinear, and multimodal engineering optimization problems.

3.1.1. Adaptive Probability-Guided Mechanism

In the AP-IVYPSO algorithm, the core idea is to dynamically determine whether the current individual will use the PSO update strategy or the IVYA update strategy in each iteration, based on an adaptive probability control mechanism. This mechanism is calculated using the Equation (11):
$$P_{adapt} = \exp\left( -\frac{5t}{T} \right), \tag{11}$$
where $t$ is the current iteration number and $T$ is the maximum number of iterations; $\exp(\cdot)$ denotes the natural exponential function with base $e \approx 2.71828$. The ratio $t/T$ measures the progress of the iteration, and the factor 5 controls the rate of decay, i.e., how quickly the probability of selecting the PSO strategy diminishes as the search proceeds.
Each individual generates a random number $U(0,1)$ in each iteration and selects the search strategy based on the following rule:
  • If $U(0,1) < P_{adapt}$, the PSO update strategy is selected;
  • Otherwise, the IVYA update strategy is chosen.
The function of this mechanism is as follows:
  • Early iterations ($t \ll T$): at this stage, $P_{adapt} \approx 1$, indicating that individuals are more likely to adopt the PSO strategy, which enhances global search capabilities by exploring a wider solution space.
  • Later iterations ($t \to T$): at this stage, $P_{adapt} \approx 0$, and the algorithm shifts to the IVYA strategy, emphasizing local refinement and fine-tuning of solutions.
The core innovation of this mechanism is that it quantifies the trade-off between global and local search through a decaying probability function. Early in the optimization, a high $P_{adapt}$ favors PSO, enabling the swarm to explore broadly and avoid premature convergence. As the search progresses, $P_{adapt}$ decreases, making IVYA more likely to dominate, which improves fine-tuning and convergence precision.
This approach is grounded in adaptive optimization theory, where dynamically adjusting exploration and exploitation according to convergence state is a proven strategy for avoiding local optima in multimodal problems.
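A minimal sketch of this gate, assuming nothing beyond Equation (11), is given below; it prints how the strategy choice shifts from PSO to IVYA as $t$ approaches $T$.

```python
import numpy as np

# Each individual draws U(0,1) and takes the PSO update if the draw falls
# below P_adapt, which decays from 1 at t = 0 toward exp(-5) ~ 0.0067 at t = T.

def p_adapt(t, T):
    return np.exp(-5.0 * t / T)          # Eq. (11)

rng = np.random.default_rng(2)
T = 500
for t in (1, 125, 250, 375, 500):
    strategy = "PSO" if rng.random() < p_adapt(t, T) else "IVYA"
    print(f"t={t:3d}  P_adapt={p_adapt(t, T):.4f}  strategy={strategy}")
```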

3.1.2. Global Search Strategy with PSO

When the condition $U(0,1) < P_{adapt}$ is met, individual $i$ adopts the standard PSO strategy to update its position and velocity, as specified in Equations (1) and (2). This strategy is guided by the individual's best historical experience and the global best information from the entire population, offering strong global search capabilities and parallel information sharing.
During the early iterations, with a high value of P a d a p t , individuals are more likely to adopt the PSO strategy. This encourages the population to quickly expand the search space, avoid local optima, and enhance both the global exploration ability and search diversity of the algorithm. At this stage, the algorithm can gather richer search information on a global scale, which provides a solid foundation for the later local search phase. This leads to improved overall optimization efficiency, allowing the model to refine solutions more effectively in subsequent stages.

3.1.3. Local or Global Search Strategy with IVYA

When $U(0,1) \geq P_{adapt}$, the individual adopts the IVYA strategy, with its search behavior guided by the “vine disturbance mechanism” to achieve either local exploitation or global exploration.
First, an adaptive disturbance threshold is generated as Equation (12):
$$\beta_1 = 1 + \frac{U(0,1)}{2}, \tag{12}$$
where $U(0,1)$ refers to a random number uniformly distributed in the interval [0, 1]. If the current individual's fitness $f_i$ is less than $\beta_1 G_{best}$ (with $G_{best}$ interpreted here as the fitness of the global best solution), the individual is considered to be in a potentially optimal region, and a local search is performed as in Equation (13):
$$X_i^{new} = X_i + |N(0,1)| \odot (X_j - X_i) + N(0,1) \odot GV_i, \tag{13}$$
where $X_j$ refers to another individual randomly selected from the population and $N(0,1)$ represents a standard normal random variable.
Otherwise, a global search is performed as in Equation (14):
$$X_i^{new} = G_{best} \odot \left( U(0,1) + N(0,1) \odot GV_i \right), \tag{14}$$
This strategy effectively balances exploration and exploitation. Under the control of the adaptive probability mechanism, the algorithm adjusts the search behavior at different stages of the optimization process, using more local search behavior in promising areas and broader global search behavior when exploring new regions. This adaptive mechanism ensures that the algorithm can efficiently explore the solution space while avoiding local optima.
The IVYA’s role here is crucial: its vine disturbance mechanism simulates a localized perturbation process where solutions ‘grow’ along promising paths but with stochastic fine-scale adjustments. This is particularly important to compensate for PSO’s tendency to converge prematurely in high-dimensional, separable search spaces.
Moreover, the IVYA search contributes both local and global search modes. If the solution is near a potential optimum, the local mode is triggered to exploit it further. If the solution is suboptimal, the global search provides a chance to escape poor regions. This behavior, governed by the disturbance threshold $\beta_1$, ensures search robustness across various fitness landscapes. A sketch of this branch is given below.
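The following minimal sketch implements the branch in Equations (12)-(14); the fitness values and vectors are placeholders, and interpreting $G_{best}$ in the threshold test as the best fitness value is an assumption noted in the comments.

```python
import numpy as np

rng = np.random.default_rng(3)
U = 10
X_i = rng.uniform(-5, 5, U)                 # current individual
X_j = rng.uniform(-5, 5, U)                 # randomly selected peer
G_best = rng.uniform(-1, 1, U)              # best position found so far
GV_i = rng.random(U)                        # vine disturbance variable
f_i, G_best_fit = 4.2, 3.0                  # placeholder fitness values (assumed meaning)

beta1 = 1.0 + rng.random() / 2.0            # Eq. (12): adaptive threshold
if f_i < beta1 * G_best_fit:                # promising region -> local search
    X_new = (X_i + np.abs(rng.standard_normal(U)) * (X_j - X_i)
             + rng.standard_normal(U) * GV_i)                         # Eq. (13)
else:                                       # otherwise -> global search
    X_new = G_best * (rng.random(U) + rng.standard_normal(U) * GV_i)  # Eq. (14)
```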

3.1.4. GV Update and Greedy Selection Mechanism

The vine disturbance variable $GV$ controls the search disturbance amplitude in the IVYA strategy, and its update mechanism is given by Equation (15):
$$GV = GV \odot \left( U(0,1)^2 \odot N(0,1) \right) \tag{15}$$
This update rule mimics the natural mutation and contraction behavior observed in vine growth, enabling the search process to be dynamically adjusted. By introducing controlled randomness, this mechanism enhances the algorithm’s capability to escape local optima. The update procedure involves checking and correcting the new position to ensure it stays within predefined boundaries, evaluating its fitness based on the objective function, and then applying a greedy selection mechanism. If the new position’s fitness is better than the current one, it replaces the old solution. Furthermore, if it outperforms the individual’s historical best or the global best solution, the respective records are updated accordingly. This approach guarantees effective evolution of the population each generation while maintaining diversity, which is essential for preventing premature convergence and improving overall search performance.
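A minimal sketch of this update-and-select step, with an illustrative objective and bounds, is given below.

```python
import numpy as np

# The GV perturbation of Eq. (15), followed by boundary correction and greedy
# replacement as described above; objective and bounds are placeholders.

rng = np.random.default_rng(4)
U, lb, ub = 10, -5.0, 5.0

def f(x):                                   # stand-in objective to be minimized
    return float(np.sum(x ** 2))

X = rng.uniform(lb, ub, U)
GV = rng.random(U)

X_new = X + rng.standard_normal(U) * GV     # candidate from an IVYA move
GV = GV * (rng.random(U) ** 2 * rng.standard_normal(U))    # Eq. (15)

X_new = np.clip(X_new, lb, ub)              # keep the candidate within bounds
if f(X_new) < f(X):                         # greedy selection: keep improvements
    X = X_new
```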

3.1.5. Time Complexity Analysis

The overall time complexity of the AP-IVYPSO algorithm proposed in this paper can be estimated from its iterative structure. Let $N$ represent the population size, $U$ the problem dimension, and $T$ the maximum number of iterations. In each iteration, every individual performs either a PSO or an IVYA update operation, determined by the adaptive probability strategy based on fitness improvement. Both update operations involve vector calculations of dimension $U$ and a single evaluation of the objective function.
The primary computational costs per generation include updating the position and velocity of each individual, perturbing the vine variable $GV$, and calculating the fitness of all individuals. Assuming that the complexity of evaluating the objective function is $O(U)$, the computational complexity of each iteration is $O(N \cdot U)$. Therefore, the total time complexity of the entire algorithm is given by Equation (16):
$$O(T \cdot N \cdot U) \tag{16}$$
This complexity increases linearly with the number of iterations, population size, and problem dimension, which ensures good scalability for practical engineering optimization problems. Notably, this paper balances algorithm performance with computational efficiency. In the experiments, the population size N is consistently set to 30, which helps ensure the algorithm achieves high computational efficiency while maintaining its optimization capabilities.

3.2. Performance Testing of AP-IVYPSO

To validate the performance and effectiveness of AP-IVYPSO, this study selected 26 widely used benchmark test functions [44,45,46,47], which include 15 single-peak test functions (f1–f15) and 11 multi-peak test functions (f16–f26). Single-peak test functions have a single global optimum and relatively simple search spaces. The detailed information about the test functions can be found in Table 1. The optimization process mainly focuses on evaluating the algorithm’s convergence speed and accuracy. With no local optima to interfere with the search, single-peak functions are ideal for testing the algorithm’s local search capability and convergence stability.
In contrast, multi-peak test functions feature multiple local optima and one or more global optima, creating a more complex search space structure. These functions test the algorithm’s ability to escape from local optima and assess its global search performance, making them useful for evaluating the algorithm’s robustness and exploration capabilities in complex environments.
By testing both single-peak and multi-peak functions, we can comprehensively measure the algorithm's performance across different problem types. The results of 30 independent runs of AP-IVYPSO were compared with those of ten other widely recognized, high-performance optimization algorithms: PSO, IVYA, HFPSO [48], HJSPSO [49], BOA [50], WOA [51], GOOSE [52], PSOBOA [53], NSM-BO [54], and FDB-AGSK [55]. The parameter configurations for each algorithm can be found in Table 2. Three numerical evaluation metrics were used: the best fitness value, the average value, and the standard deviation, with the formulas given in Equations (17)-(19):
$$\text{Best} = \min_{1 \le i \le R} f_i, \tag{17}$$
$$\text{Avg} = \frac{1}{R} \sum_{i=1}^{R} f_i, \tag{18}$$
$$\text{Std} = \sqrt{\frac{1}{R-1} \sum_{i=1}^{R} \left( f_i - \text{Avg} \right)^2}, \tag{19}$$
where R represents the number of runs, set to 30 in this case. The maximum number of iterations for the algorithms is set to 500.
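The three statistics are straightforward to compute; the sketch below uses random placeholder fitness values for the $R = 30$ runs.

```python
import numpy as np

# Run statistics of Equations (17)-(19); final fitness values are placeholders.

rng = np.random.default_rng(5)
R = 30
f = rng.random(R)                 # final fitness value f_i of each run

best = f.min()                    # Eq. (17): best fitness over all runs
avg = f.mean()                    # Eq. (18): average fitness
std = f.std(ddof=1)               # Eq. (19): sample standard deviation, 1/(R-1)
print(f"Best={best:.4e}  Avg={avg:.4e}  Std={std:.4e}")
```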
All experiments were performed on a Windows 10 operating system with 32 GB of RAM and an Intel(R) Core(TM) i9-14900KF processor (3.10 GHz), using the MATLAB R2023a environment.

3.2.1. Numerical Results Analysis

Table 3 presents the best fitness values and rankings achieved by AP-IVYPSO and the ten comparison algorithms across the 26 test functions. AP-IVYPSO achieved the best fitness value on 21 test functions (f1–f4, f6, f8–f20, f24–f26), demonstrating its strong accuracy and local search capability. However, on functions f22 and f23 it ranked 10th and 4th, respectively, indicating average performance.
Table 4 shows the average fitness values, standard deviations, and rankings of AP-IVYPSO and the ten comparison algorithms across the 26 test functions. AP-IVYPSO achieved the best average fitness value on 23 test functions (f1–f6, f8–f21, f24–f26), highlighting its excellent global search capability and stability. On f7, f22, and f23, however, it ranked 8th, 8th, and 6th, respectively, reflecting average performance compared to the best-performing algorithms on these functions.

3.2.2. Convergence Curve Analysis

Appendix A.1 presents the convergence curves of the proposed AP-IVYPSO algorithm and the comparison algorithms, showing how the average fitness values on the benchmark test functions change with the number of objective function evaluations. This provides a more comprehensive representation of the optimization process and overall performance of each algorithm. The x-axis indicates the number of objective function evaluations (with a maximum of 15,000), while the y-axis displays the average fitness values obtained from 30 independent runs, thereby minimizing the random fluctuations that could arise from a single trial.
The curves reveal that the AP-IVYPSO algorithm exhibits faster convergence and superior average performance on 18 test functions (f1–f4, f8–f9, f11, f13–f20, and f24–f26), highlighting its robust global optimization capability. However, on functions f5, f7, f22, and f23, certain comparison algorithms achieved better optimization results, suggesting that these algorithms also demonstrate strong adaptability to specific problem instances.

3.2.3. Friedman Ranking and Wilcoxon Signed-Rank Test

To comprehensively evaluate the performance of the proposed AP-IVYPSO algorithm on the 26 test functions, the Friedman test was employed to rank the eleven algorithms. Based on the Friedman test scores, AP-IVYPSO achieved the lowest average rank (1.8587), securing first place and demonstrating the best overall performance. IVYA and FDB-AGSK followed in second and third place, respectively. The traditional PSO ranked 10th, while GOOSE ranked lowest, in 11th place. These ranking results further validate the significant advantage of the AP-IVYPSO algorithm in terms of optimization quality and stability. The Friedman ranking of each algorithm is shown in Table 5.
To further validate the significant advantage of the proposed AP-IVYPSO algorithm on multiple benchmark test functions, the Wilcoxon signed-rank test was performed for paired comparisons between AP-IVYPSO and the ten comparison algorithms. The significance level for the test was set to α = 0.05. The results show that the p-values between AP-IVYPSO and all other algorithms were smaller than 0.05, indicating statistically significant differences between AP-IVYPSO and all comparison algorithms and further confirming the superior optimization performance of the proposed algorithm. The Wilcoxon signed-rank test results for AP-IVYPSO and the ten comparison algorithms are displayed in Table 6.
In conclusion, AP-IVYPSO has shown outstanding performance across all comprehensive tests, proving itself to be a powerful algorithm.

3.3. BPNN Model Parameter Optimization

The AP-IVYPSO-BP model utilizes a feedforward neural network for predicting concrete compressive strength. The network consists of an input layer, two hidden layers, and an output layer. The basic structural diagram of the AP-IVYPSO-BP model is shown in Figure 4. The input layer includes features related to the concrete mix, while the output layer predicts the compressive strength of the concrete. The network’s weights and biases are optimized using the AP-IVYPSO algorithm to enhance the accuracy of the predictions.
The network is trained by minimizing the mean squared error (MSE) between the predicted values and the actual values. The AP-IVYPSO algorithm is used iteratively to optimize the weights and biases of the network, with each particle’s position representing a specific set of neural network parameters.
The algorithm begins by initializing the positions and velocities of the particle swarm and evaluating each particle’s fitness based on the performance of the neural network on the training set. The AP-IVYPSO algorithm then optimizes the model through an adaptive probability mechanism based on fitness improvement. In each optimization iteration, the particle positions are updated according to the probabilities of the PSO or IVYA strategies. The optimization process continues until the maximum number of iterations is reached or a convergence criterion is satisfied. Once the optimization is complete, the optimal parameters identified are used to initialize the BPNN. The network is then trained on the training set using the backpropagation algorithm, which aims to minimize the prediction error.
The reason for choosing AP-IVYPSO to optimize the BPNN lies in the complementary nature of the two algorithms: PSO enables efficient global exploration of the neural network’s high-dimensional weight space, while IVYA introduces precise local search adjustments through its disturbance mechanism. This helps fine-tune the network parameters for better predictive accuracy.
Moreover, the AP-IVYPSO’s adaptive switch strategy ensures that early optimization focuses on avoiding poor minima, while later optimization emphasizes convergence around strong solutions. This is highly suitable for training BPNNs, which are known to suffer from poor initialization and gradient-based convergence issues.

3.4. Summary

The AP-IVYPSO-BP model integrates the global search ability of PSO and the local optimization capabilities of IVYA. By utilizing the adaptive probability mechanism, the model dynamically selects the most suitable optimization strategy, effectively balancing global and local search efforts. This balance significantly enhances the accuracy and stability of concrete compressive strength predictions. The next section will discuss the experimental setup, performance evaluation, and comparisons with benchmark models.

4. Experimental Evaluation and Result Interpretation

4.1. Dataset Overview for Experimentation

This study employs a dataset derived from the high-performance concrete compressive strength experimental dataset in the UCI Machine Learning Repository [56]. This dataset is widely utilized in research on the prediction of HPC properties and holds significant representativeness and practical value [57]. The dataset comprises 1030 samples, each with eight input variables and one output variable. The input variables are cement, fly ash, blast furnace slag, water, superplasticizer, age, fine aggregate, and coarse aggregate, while concrete compressive strength serves as the output variable. All input variables, with the exception of age, are quantified in kilograms per cubic meter (kg/m³); age is measured in days, while compressive strength is expressed in megapascals (MPa).
The data were collected from controlled laboratory experiments simulating realistic HPC mix designs, ensuring high data reliability and consistency. The samples cover a broad range of mixture proportions and curing ages, which effectively represent the variability encountered in practical engineering scenarios. This diversity in the dataset allows for robust modeling of the nonlinear and complex relationships between mixture components and compressive strength.
In this study, each sample represents a unique high-performance concrete mix design, and the nonlinear mapping between the eight-dimensional input variables and compressive strength output defines a high-dimensional, non-convex search space. This space lacks gradient information and contains numerous local optima, often metaphorically referred to as an “inhospitable environment.” Both IVYA and PSO are applied to this space to search for the optimal parameters of the neural network.
Specifically, each individual in the population represents a set of initial weights and biases for the BP neural network, which are encoded as real-valued vectors. The optimization algorithm’s goal is to minimize the model’s prediction error on the training or validation set. By initializing the BP neural network with the weights corresponding to each individual, the network is trained on the normalized high-performance concrete dataset, and its performance on the test set is evaluated to assess the fitness of each individual. In this manner, the dynamic search process of swarm intelligence is directly applied to the optimization of the concrete strength prediction parameters.
The compressive strength of HPC exhibits a distinct nonlinear relationship with the composition of the mixture, and this complexity is visually illustrated in Figure 5 and Figure 6. Detailed information about the input features is presented in Table 7, facilitating a thorough understanding of their characteristics. By using this data for neural network training and swarm intelligence-based optimization, the algorithmic strategy is not generic or abstract but is specifically customized to model the highly nonlinear strength behavior of concrete.
To better reveal the interactions between variables, this paper employs the Pearson correlation coefficient to analyze the correlations among the input variables and between the input variables and the output variable. The coefficient ranges from −1 to 1, where values near 0 suggest a weak correlation, values approaching 1 indicate a strong positive correlation, and values close to −1 represent a strong negative correlation. The resulting Pearson correlation matrix is displayed in Figure 7, from which the correlation patterns between variables can be discerned. The results demonstrate that the influence of various factors on strength follows distinct and significant correlation patterns.
Furthermore, to quantitatively evaluate the influence of each input variable on the compressive strength of HPC, the XGBoost algorithm was employed within a Python 3.8 computational environment. The resulting feature importance rankings, as illustrated in Figure 8, reveal that curing age is the most critical predictor, with cement and water content following closely behind. In contrast, fly ash and coarse aggregate have relatively smaller impacts.
To ensure stable convergence during the model training process and eliminate disparities in the magnitudes of different feature variables, the input features were normalized to the [0, 1] range using min-max normalization, while the output targets were standardized using the Z-score method. This preprocessing step enhances the training efficiency and prediction accuracy of the neural network, while mitigating potential convergence issues or instability arising from significant differences in data scales.
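A minimal sketch of this preprocessing, with placeholder arrays and statistics fitted on the training split only (an assumption consistent with standard practice), is given below.

```python
import numpy as np

# Min-max normalization of the input features to [0, 1] and Z-score
# standardization of the strength target, as described above.

rng = np.random.default_rng(7)
X_train, X_test = rng.random((1000, 8)) * 100, rng.random((30, 8)) * 100
y_train = rng.random(1000) * 80

lo, hi = X_train.min(axis=0), X_train.max(axis=0)
Xn_train = (X_train - lo) / (hi - lo)       # min-max to [0, 1]
Xn_test = (X_test - lo) / (hi - lo)         # same statistics reused on test data

mu, sigma = y_train.mean(), y_train.std()
yn_train = (y_train - mu) / sigma           # Z-score standardized target

# Predictions made in the standardized space map back via y = yn * sigma + mu.
```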
In this study, the primary research goal is to accurately predict the compressive strength of high-performance concrete based on its mix design parameters. To achieve this, a hybrid prediction model named AP-IVYPSO-BP is proposed, in which the AP-IVYPSO is employed to optimize the initial weights and biases of a BPNN. By combining global search capability with nonlinear learning, this model aims to address the complex mapping relationship between concrete components and compressive strength, thereby improving prediction precision.

4.2. Performance Evaluation Metrics

To assess the predictive performance of the proposed model, this study utilizes three widely used evaluation metrics: the coefficient of determination (R²), mean absolute error (MAE), and root mean squared error (RMSE). RMSE and MAE measure the deviation between actual and predicted values, with lower values signifying better prediction accuracy, while R² quantifies the degree of correlation between the model's predictions and the observed outcomes. Its value ranges from 0 to 1, with values approaching 1 indicating superior model performance and stronger predictive accuracy. The detailed computational expressions are given in Equations (20)-(22):
$$R^2 = 1 - \frac{\sum_{i=1}^{N} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{N} (y_i - \bar{y})^2}, \tag{20}$$
$$MAE = \frac{1}{N} \sum_{i=1}^{N} |\hat{y}_i - y_i|, \tag{21}$$
$$RMSE = \sqrt{\frac{1}{N} \sum_{i=1}^{N} (y_i - \hat{y}_i)^2}, \tag{22}$$
where $y_i$ is the actual output value of the $i$-th sample, $\bar{y}$ is the mean of the actual values, $\hat{y}_i$ is the predicted output value, and $N$ is the total number of samples.
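The metrics translate directly into code; the sketch below evaluates Equations (20)-(22) on placeholder values.

```python
import numpy as np

def r2(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot                     # Eq. (20)

def mae(y_true, y_pred):
    return np.mean(np.abs(y_pred - y_true))          # Eq. (21)

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))  # Eq. (22)

y_true = np.array([35.2, 48.1, 27.6, 61.3])          # placeholder strengths (MPa)
y_pred = np.array([33.9, 49.5, 29.0, 58.8])
print(r2(y_true, y_pred), mae(y_true, y_pred), rmse(y_true, y_pred))
```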

4.3. Overview of Experimental Procedures

In this experiment, 1000 samples are used for training, while an additional 30 samples are set aside for testing. The model is configured to undergo 100 training iterations, and the optimization algorithm parameters are provided in Table 8. To balance computational efficiency and algorithm performance, the population size for all algorithms in this study was uniformly set to 30.
The flowchart of the experimental procedure is shown in Figure 9. The specific procedures for conducting this experiment are detailed as follows:
(1)
Data Preprocessing and Dataset Division: The original HPC dataset is divided into a training set and a test set. The input features are normalized using the mapminmax function, while the output values are standardized with Z-score normalization. These preprocessing steps help to enhance the model’s stability and speed up convergence during training.
(2)
Neural Network Architecture Configuration: A BPNN is constructed, with two hidden layers containing 16 and 8 neurons. The architecture is initialized based on the input features’ dimensionality and the number of output variables.
(3)
AP-IVYPSO Initialization: The maximum number of iterations and population size for the AP-IVYPSO algorithm are set. Each “vine” individual represents a potential combination of neural network weights and thresholds, which are the optimization targets. The individual particles dynamically select between the PSO or IVYA update strategies, using an adaptive probability mechanism based on fitness improvements.
(4)
Fitness Evaluation: The fitness function is defined as the RMSE between predicted and actual values. This guides the “vine” individuals toward the optimal solution, ensuring that the network’s predictive performance is maximized.
(5)
Position Update and Local Search: Each “vine” individual updates its position either by applying a local disturbance strategy or by randomly selecting a leader’s direction. In PSO updates, particle velocity and position are adjusted using the velocity update mechanism. In IVYA updates, the position is modified using vine heuristic growth dynamics. A mutation mechanism with a certain probability is incorporated to increase population diversity and help avoid local optima.
(6)
Optimal Weight Selection: After all iterations, the “vine” individual with the best fitness is selected, and its corresponding neural network weights and thresholds are used to update the BPNN.
(7)
Model Training and Prediction: The BPNN is trained using the optimal weights obtained from the AP-IVYPSO algorithm. Predictions are made for both the training and test sets, and the model's performance is evaluated using the R², MAE, and RMSE metrics. The results are visualized to assess the model's accuracy. A compact end-to-end sketch of steps (1)–(7) is given below.
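The steps above map onto a compact optimization loop. The following self-contained sketch ties steps (1)-(7) together on synthetic data; the network sizes, bounds, and hyperparameters are illustrative assumptions, and the actual pipeline uses the normalized HPC dataset with the full AP-IVYPSO operators.

```python
import numpy as np

rng = np.random.default_rng(6)
X_tr, y_tr = rng.random((200, 8)), rng.random(200)          # step (1): toy data
n_in, n_h, n_out = 8, 6, 1                                  # step (2): small BPNN
shapes = [(n_h, n_in), (n_h,), (n_out, n_h), (n_out,)]
U = sum(int(np.prod(s)) for s in shapes)

def unpack(x):
    out, o = [], 0
    for s in shapes:
        n = int(np.prod(s)); out.append(x[o:o + n].reshape(s)); o += n
    return out

def rmse_fitness(x):                                        # step (4): RMSE fitness
    W1, b1, W2, b2 = unpack(x)
    pred = np.tanh(X_tr @ W1.T + b1) @ W2.T + b2
    return float(np.sqrt(np.mean((pred.ravel() - y_tr) ** 2)))

N, T, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5                    # step (3): init swarm
X = rng.uniform(-1, 1, (N, U)); V = np.zeros((N, U))
fit = np.array([rmse_fitness(x) for x in X])
P, p_fit = X.copy(), fit.copy()
g = int(p_fit.argmin()); G, g_fit = P[g].copy(), p_fit[g]
GV = rng.random((N, U))

for t in range(1, T + 1):                                   # step (5): updates
    p_ad = np.exp(-5.0 * t / T)                             # Eq. (11)
    for i in range(N):
        if rng.random() < p_ad:                             # PSO move
            V[i] = (w * V[i] + c1 * rng.random(U) * (P[i] - X[i])
                    + c2 * rng.random(U) * (G - X[i]))
            cand = X[i] + V[i]
        else:                                               # IVYA move
            beta1 = 1.0 + rng.random() / 2.0                # Eq. (12)
            j = rng.integers(N)
            if fit[i] < beta1 * g_fit:                      # Eq. (13): local
                cand = (X[i] + np.abs(rng.standard_normal(U)) * (X[j] - X[i])
                        + rng.standard_normal(U) * GV[i])
            else:                                           # Eq. (14): global
                cand = G * (rng.random(U) + rng.standard_normal(U) * GV[i])
            GV[i] = GV[i] * (rng.random(U) ** 2 * rng.standard_normal(U))  # Eq. (15)
        cand = np.clip(cand, -1, 1)
        c_fit = rmse_fitness(cand)
        if c_fit < fit[i]:                                  # greedy selection
            X[i], fit[i] = cand, c_fit
            if c_fit < p_fit[i]:
                P[i], p_fit[i] = cand, c_fit
            if c_fit < g_fit:
                G, g_fit = cand.copy(), c_fit

print(f"best training RMSE after optimization: {g_fit:.4f}")  # steps (6)-(7)
```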

4.4. Analysis of Compressive Strength Prediction for High-Performance Concrete

To assess the practicality and advantages of the AP-IVYPSO-BP model in predicting the compressive strength of high-performance concrete, we systematically compared it with four benchmark models: the unoptimized standard BPNN, the PSO-optimized PSO-BP model, the GA-optimized GA-BP model, and the IVYA-optimized IVY-BP model. By keeping the input features consistent, different optimization strategies were applied to adjust the BPNN parameters, ensuring a comprehensive and fair comparison of model performance under the same dataset and experimental conditions.
To facilitate a more intuitive comparison of the prediction performance, we present scatter plots (Figure 10) and prediction curves (Figure 11) showing actual values against predicted values. These visualizations demonstrate the performance of the BPNN, PSO-BP, GA-BP, IVY-BP, and AP-IVYPSO-BP models on the test set.
As shown in Figure 10, the scatter plot clearly indicates that the predicted points of the AP-IVYPSO-BP model are the most tightly clustered and almost uniformly distributed along the ideal diagonal (where predicted values equal actual values), suggesting minimal deviation between the actual and predicted values. In contrast, the scatter plots for the BPNN, PSO-BP, GA-BP, and IVY-BP models show greater dispersion, with the BPNN model exhibiting more pronounced prediction errors.
The prediction curve in Figure 11 highlights how well the AP-IVYPSO-BP model fits the entire sample range, with the predicted curve closely matching the actual curve and demonstrating consistent fluctuations and trends. In comparison, other models show clear deviations at certain sample points, particularly at extreme values or inflection points, indicating poor fitting performance. Furthermore, the AP-IVYPSO-BP model excels in tracking regions with large fluctuations in the data, effectively capturing complex nonlinear relationships. This underscores the superior fitting ability and stability of the AP-IVYPSO-BP model when dealing with complex data structures. Table 9 provides the prediction results and the runtime (TIME) of each model, from which we can draw the following conclusions.
The AP-IVYPSO-BP model outperforms all comparison models, especially in terms of prediction performance on the test set. Specifically, it surpasses the traditional BPNN, PSO-BP, GA-BP, and IVY-BP models in key evaluation metrics such as R², MAE, and RMSE. Notably, it achieves R² = 0.9542, MAE = 3.0404, and RMSE = 3.7991, demonstrating minimal deviation between the predicted and actual values. These results show that the AP-IVYPSO-BP model delivers optimal prediction accuracy for high-performance concrete compressive strength.
Compared to the traditional BPNN model, the PSO and GA algorithms already enhance its performance, and the IVYA further improves the predictive capability of the BPNN. The AP-IVYPSO-BP model combines the global search ability of PSO with the local search characteristics of IVYA, dynamically adjusting the PSO and IVYA update mechanisms through an adaptive probability strategy based on fitness improvement. This dynamic balance between global and local searches significantly enhances both prediction accuracy and model stability. Compared to the IVY-BP model, the AP-IVYPSO-BP model improves R² from 0.9485 to 0.9542, reduces MAE by 0.3155, and lowers RMSE by 0.3757. These improvements clearly demonstrate the superior prediction accuracy of the AP-IVYPSO-BP model over other optimization techniques.
In conclusion, these results suggest that integrating bio-inspired optimization algorithms with neural networks is an effective approach for improving regression prediction accuracy in complex nonlinear problems. By leveraging both global and local search capabilities, the AP-IVYPSO-BP model demonstrates remarkable generalization ability and robustness, achieving notable success in predicting the compressive strength of high-performance concrete.

5. Conclusions

This paper introduces a novel hybrid prediction model, AP-IVYPSO-BP, that combines the bio-inspired IVYA with PSO to optimize a BPNN for accurately predicting the compressive strength of HPC. The AP-IVYPSO-BP model leverages the global search capability of PSO and the local search characteristics of IVYA, while dynamically adjusting their update mechanisms through an adaptive probability strategy based on fitness improvement. This dynamic adjustment optimizes the balance between global and local searches, significantly enhancing prediction accuracy, model stability, and robustness.
To validate the proposed model's effectiveness, experiments were conducted on a publicly available dataset containing 1030 high-performance concrete mix samples. The AP-IVYPSO-BP model was compared with traditional BPNN, PSO-BP, GA-BP, and IVY-BP models. The experimental results demonstrate that the AP-IVYPSO-BP model outperforms the other models across various evaluation metrics, particularly excelling in R², MAE, and RMSE. Specifically, the AP-IVYPSO-BP model achieved an R² of 0.9542, reflecting its excellent fitting ability and prediction accuracy. Moreover, MAE and RMSE showed substantial improvements compared to the baseline models, further highlighting the model's superior performance in predicting concrete compressive strength.
The AP-IVYPSO-BP model provides an effective tool for accurately predicting concrete strength, contributing to better material utilization, reduced resource waste, and minimized environmental impact, thereby supporting the sustainable development of the construction industry. Future research could explore applying this model to the prediction of other engineering materials’ strength, incorporating additional optimization algorithms and deep learning techniques to further enhance the model’s performance. Furthermore, investigating the model’s practical applications in engineering management will help unlock its full potential for sustainable development.

Author Contributions

Conceptualization, K.Z. and S.Z. (Shuo Zhang); methodology, K.Z. and X.L.; software, K.Z. and S.Z. (Songsong Zhang); validation, S.Z. (Shuo Zhang) and S.Z. (Songsong Zhang); formal analysis, X.L.; investigation, K.Z. and S.Z. (Songsong Zhang); data curation, X.L.; writing—original draft preparation, K.Z.; writing—review and editing, X.L. and S.Z. (Shuo Zhang); supervision, S.Z. (Shuo Zhang); project administration, S.Z. (Shuo Zhang). All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

All data used and/or analyzed during this research are openly available and can be accessed freely. If needed, they can be requested from the corresponding author.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Appendix A

Appendix A.1. Convergence Curves of AP-IVYPSO and 10 Other Algorithms

[Convergence curve panels for AP-IVYPSO and the 10 comparison algorithms on the 26 benchmark functions.]

References

  1. Kulkarni, V.R. High performance concrete for high-rise buildings: Some crucial issues. Int. J. Res. Eng. Technol. 2016, 26–33. Available online: https://ijret.org/volumes/2016v05/i32/IJRET20160532004.pdf (accessed on 4 August 2025).
  2. Zhou, M.; Lu, W.; Song, J.; Lee, G.C. Application of ultra-high performance concrete in bridge engineering. Constr. Build. Mater. 2018, 186, 1256–1267. [Google Scholar] [CrossRef]
  3. Costaz, J.L. High Performance Concrete in Nuclear Power Plants. In High Performance Concrete; CRC Press: Boca Raton, FL, USA, 2018; pp. 365–378. [Google Scholar]
  4. Magureanu, C.; Sosa, I.; Negrutiu, C.; Heghes, B. Mechanical Properties and Durability of Ultra-High-Performance Concrete. ACI Mater. J. 2012, 109, 177–184. [Google Scholar]
  5. Li, J.; Wu, Z.; Shi, C.; Zhang, Z. Durability of ultra-high performance concrete—A review. Constr. Build. Mater. 2020, 255, 119296. [Google Scholar] [CrossRef]
  6. Chang, P.K.; Peng, Y.N.; Hwang, C.L. A design consideration for durability of high-performance concrete. Cem. Concr. Compos. 2001, 23, 375–380. [Google Scholar] [CrossRef]
  7. Jiang, C.; Li, T.; Yang, A.; Liu, H. Impact of Graphite-Associated Waste as a Sustainable Aggregate on UHPC Performance. Sustainability 2024, 16, 10912. [Google Scholar] [CrossRef]
  8. Alkharisi, M.K.; Dahish, H.A. The Application of Response Surface Methodology and Machine Learning for Predicting the Compressive Strength of Recycled Aggregate Concrete Containing Polypropylene Fibers and Supplementary Cementitious Materials. Sustainability 2025, 17, 2913. [Google Scholar] [CrossRef]
  9. Zou, Z.; Jiang, N.; Yu, Z.; Li, Z.; Zhang, J.; Wang, Y. MBSE Net: Addressing Data Scarcity and Model Interpretability in Predicting UHPC Performance for Bridge Infrastructure. In Proceedings of the 2024 IEEE International Conference on Smart City (SmartCity), Wuhan, China, 13–15 December 2024; IEEE: New York, NY, USA; pp. 1–8. [Google Scholar]
  10. Kasperkiewicz, J.; Racz, J.; Dubrawski, A. HPC strength prediction using artificial neural network. J. Comput. Civ. Eng. 1995, 9, 279–284. [Google Scholar] [CrossRef]
  11. Shariati, M.; Armaghani, D.J.; Khandelwal, M.; Zhou, J.; Eyvaziyan, A.; Khorami, M. Assessment of longstanding effects of fly ash and silica fume on the compressive strength of concrete using extreme learning machine and artificial neural network. J. Adv. Eng. Comput. 2021, 5, 50–74. [Google Scholar] [CrossRef]
  12. Barkhordari, M.S.; Ghavaminejad, S.; Tehranizadeh, M. Predicting autogenous shrinkage of concrete including superabsorbent polymers and other cementitious ingredients using convolution-based algorithms. Period. Polytech. Civ. Eng. 2024, 68, 1098–1121. [Google Scholar] [CrossRef]
  13. Jibril, M.M.; Zayyan, M.A.; Malami, S.I.; Usman, A.; Salami, B.A.; Rotimi, A.; Abba, S. Implementation of nonlinear computing models and classical regression for predicting compressive strength of high-performance concrete. Appl. Eng. Sci. 2023, 15, 100133. [Google Scholar] [CrossRef]
  14. Hameed, M.M.; AlOmar, M.K. Prediction of Compressive Strength of High-Performance Concrete: Hybrid Artificial Intelligence Technique. In Proceedings of the International Conference on Applied Computing to Support Industry: Innovation and Technology, Ramadi, Iraq, 15–16 September 2019; Springer International Publishing: Cham, Switzerland; pp. 323–335. [Google Scholar]
  15. Zhou, R.; Tang, Y.; Li, H.; Liu, Z. Predicting the compressive strength of ultra-high-performance concrete using a decision tree machine learning model enhanced by the integration of two optimization meta-heuristic algorithms. J. Eng. Appl. Sci. 2024, 71, 43. [Google Scholar] [CrossRef]
  16. Qi, R.; Wu, H.; Qi, Y.; Tang, H. Investigation of mechanical properties of high-performance concrete via multi-method of regression tree approach. Mater. Today Commun. 2024, 40, 109922. [Google Scholar] [CrossRef]
  17. Wang, L. Estimating high-performance concrete compressive strength with support vector regression in hybrid method. Multiscale Multidiscip. Model. Exp. Des. 2024, 7, 477–490. [Google Scholar] [CrossRef]
  18. Sun, Q.; Yu, G. Predicting slump for high-performance concrete using decision tree and support vector regression approaches coupled with phasor particle swarm optimization algorithm. Struct. Concr. 2024, 25, 4103–4118. [Google Scholar] [CrossRef]
  19. Murthy, M.N.; Amruth, S.K.; Marulasiddappa, S.B.; Naganna, S.R. Modeling the compressive strength of binary and ternary blended high-performance concrete mixtures using ensemble machine learning models. Soft Comput. 2024, 28, 6683–6693. [Google Scholar] [CrossRef]
  20. Shi, X.; Chen, S.; Wang, Q.; Lu, Y.; Ren, S.; Huang, J. Mechanical framework for geopolymer gels construction: An optimized LSTM technique to predict compressive strength of fly ash-based geopolymer gels concrete. Gels 2024, 10, 148. [Google Scholar] [CrossRef]
  21. Li, L.; Gao, Y.; Dong, X.; Han, Y. Artificial Neural Network Model for Predicting Mechanical Strengths of Economical Ultra-High-Performance Concrete Containing Coarse Aggregates: Development and Parametric Analysis. Materials 2024, 17, 3908. [Google Scholar] [CrossRef]
  22. Li, Q.; Su, R.; Qiao, H.; Su, L.; Wang, P.; Gong, L. Prediction of compressive strength and porosity of vegetated concrete based on hybrid BPNN. Mater. Today Commun. 2025, 44, 112080. [Google Scholar] [CrossRef]
  23. Kumar, R.; Rai, B.; Samui, P. Prediction of mechanical properties of high-performance concrete and ultrahigh-performance concrete using soft computing techniques: A critical review. Struct. Concr. 2024, 26, 1309–1337. [Google Scholar] [CrossRef]
  24. Jing, W.; Suyuan, Y.; Jingtao, L. Using a machine learning method with novel optimization algorithms in estimating compressive strength rates in ultra-high performance concrete. Int. J. Knowl.-Based Intell. Eng. Syst. 2024, 29, 171–187. [Google Scholar] [CrossRef]
  25. Tang, X.; Xu, B.; Xu, Z. Reactor temperature prediction method based on CPSO-RBF-BPNN. Appl. Sci. 2023, 13, 3230. [Google Scholar] [CrossRef]
  26. Anasseriyil Viswambaran, R. Evolutionary Computation for Designing Deep Recurrent Neural Networks. Master’s Thesis, Te Herenga Waka-Victoria University of Wellington, Wellington, New Zealand, 2025. [Google Scholar]
  27. Yao, Z.; Zhu, Q.; Zhang, Y.; Huang, H.; Luo, M. Minimizing Long-Term Energy Consumption in RIS-Assisted UAV-Enabled MEC Network. IEEE Internet Things J. 2025, 12, 20942–20958. [Google Scholar] [CrossRef]
  28. El-Hassani, F.Z.; Fatih, F.; Joudar, N.E.; Haddouch, K. Deep multilayer neural network with weights optimization-based genetic algorithm for predicting hypothyroid disease. Arab. J. Sci. Eng. 2024, 49, 11967–11990. [Google Scholar] [CrossRef]
  29. Ma, F.; Wang, S.; Xie, T.; Sun, C. Regional Logistics Express Demand Forecasting Based on Improved GA-BPNN with Indicator Data Characteristics. Appl. Sci. 2024, 14, 6766. [Google Scholar] [CrossRef]
  30. Abdurrakhman, A.; Sutiarso, L.; Ainuri, M.; Ushada, M.; Islam, P. Adaptive Back-Propagation Neural Network and Particle Swarm Optimization based Approach for Optimizing the Output Power Biogas Fueled Electric Generator. IEEE Access 2024, 12, 132303–132316. [Google Scholar] [CrossRef]
  31. Wang, Z.; Cai, J.; Liu, X.; Zou, Z. Optimized BPNN Based on Improved Dung Beetle Optimization Algorithm to Predict High-Performance Concrete Compressive Strength. Buildings 2024, 14, 3465. [Google Scholar] [CrossRef]
  32. Xing, Z.; Ma, G.; Wang, L.; Yang, L.; Guo, X.; Chen, S. Towards visual interaction: Hand segmentation by combining 3D graph deep learning and laser point cloud for intelligent rehabilitation. IEEE Internet Things J. 2025, 12, 21328–21338. [Google Scholar] [CrossRef]
  33. Xing, Z.; Meng, Z.; Zheng, G.; Ma, G.; Yang, L.; Guo, X.; Tan, L.; Jiang, Y.; Wu, H. Intelligent rehabilitation in an aging population: Empowering human-machine interaction for hand function rehabilitation through 3D deep learning and point cloud. Front. Comput. Neurosci. 2025, 19, 1543643. [Google Scholar] [CrossRef]
  34. Kılıç, F.; Yılmaz, İ.H.; Kaya, Ö. Adaptive co-optimization of artificial neural networks using evolutionary algorithm for global radiation forecasting. Renew. Energy 2021, 171, 176–190. [Google Scholar] [CrossRef]
  35. Słowik, A.; Cpałka, K. Hybrid approaches to nature-inspired population-based intelligent optimization for industrial applications. IEEE Trans. Ind. Inform. 2021, 18, 546–558. [Google Scholar] [CrossRef]
  36. Jia, H.; Peng, X.; Lang, C. Remora optimization algorithm. Expert Syst. Appl. 2021, 185, 115665. [Google Scholar] [CrossRef]
  37. Jia, H.; Rao, H.; Wen, C.; Mirjalili, S. Crayfish optimization algorithm. Artif. Intell. Rev. 2023, 56 (Suppl. 2), 1919–1979. [Google Scholar] [CrossRef]
  38. Ghasemi, M.; Zare, M.; Trojovský, P.; Rao, R.V.; Trojovská, E.; Kandasamy, V. Optimization based on the smart behavior of plants with its engineering applications: IVYA. Knowl.-Based Syst. 2024, 295, 111850. [Google Scholar] [CrossRef]
  39. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; IEEE: New York, NY, USA. [Google Scholar]
  40. Fei, J.; Wu, Z.; Sun, X.; Su, D.; Bao, X. Research on tunnel engineering monitoring technology based on BPNN neural network and MARS machine learning regression algorithm. Neural Comput. Appl. 2021, 33, 239–255. [Google Scholar] [CrossRef]
  41. Zhou, W.; Yan, Z.; Zhang, L. A comparative study of 11 non-linear regression models highlighting autoencoder, DBN, and SVR, enhanced by SHAP importance analysis in soybean branching prediction. Sci. Rep. 2024, 14, 5905. [Google Scholar] [CrossRef]
  42. He, Z.; Hou, Z.; Xu, N.; Liu, D.; Zhou, M. A Distributed Model Predictive Control Approach for Virtually Coupled Train Set with Adaptive Mechanism and Particle Swarm Optimization. Mathematics 2025, 13, 1641. [Google Scholar] [CrossRef]
  43. Lin, S.; Yang, Y.; Nagata, Y.; Yang, H. Elite Evolutionary Discrete Particle Swarm Optimization for Recommendation Systems. Mathematics 2025, 13, 1398. [Google Scholar] [CrossRef]
  44. Jamil, M.; Yang, X.S. A literature survey of benchmark functions for global optimisation problems. Int. J. Math. Model. Numer. Optim. 2013, 4, 150–194. [Google Scholar] [CrossRef]
  45. Zhan, Z.H.; Shi, L.; Tan, K.C.; Zhang, J. A survey on evolutionary computation for complex continuous optimization. Artif. Intell. Rev. 2022, 55, 59–110. [Google Scholar] [CrossRef]
  46. Cai, T.; Zhang, S.; Ye, Z.; Zhou, W.; Wang, M.; He, Q.; Chen, Z.; Bai, W. Cooperative metaheuristic algorithm for global optimization and engineering problems inspired by heterosis theory. Sci. Rep. 2024, 14, 28876. [Google Scholar] [CrossRef]
  47. Zhang, K.; Yuan, F.; Jiang, Y.; Mao, Z.; Zuo, Z.; Peng, Y. A Particle Swarm Optimization-Guided Ivy Algorithm for Global Optimization Problems. Biomimetics 2025, 10, 342. [Google Scholar] [CrossRef]
  48. Aydilek, I.B. A hybrid firefly and particle swarm optimization algorithm for computationally expensive numerical problems. Appl. Soft Comput. 2018, 66, 232–249. [Google Scholar] [CrossRef]
  49. Nayyef, H.M.; Ibrahim, A.A.; Mohd Zainuri, M.A.A.; Zulkifley, M.A.; Shareef, H. A novel hybrid algorithm based on jellyfish search and particle swarm optimization. Mathematics 2023, 11, 3210. [Google Scholar] [CrossRef]
  50. Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734. [Google Scholar] [CrossRef]
  51. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  52. Hamad, R.K.; Rashid, T.A. GOOSE algorithm: A powerful optimization tool for real-world engineering challenges and beyond. Evol. Syst. 2024, 15, 1249–1274. [Google Scholar] [CrossRef]
  53. Khosla, T.; Verma, O.P. An Adaptive Hybrid Particle Swarm Optimizer for Constrained Optimization Problem. In Proceedings of the 2021 International Conference in Advances in Power, Signal, and Information Technology (APSIT), Bhubaneswar, India, 8–10 October 2021; IEEE: New York, NY, USA; pp. 1–7. [Google Scholar]
  54. Öztürk, H.T.; Kahraman, H.T. Metaheuristic search algorithms in frequency constrained truss problems: Four improved evolutionary algorithms, optimal solutions and stability analysis. Appl. Soft Comput. 2025, 171, 112854. [Google Scholar] [CrossRef]
  55. Bakır, H.; Duman, S.; Guvenc, U.; Kahraman, H.T. Improved adaptive gaining-sharing knowledge algorithm with FDB-based guiding mechanism for optimization of optimal reactive power flow problem. Electr. Eng. 2023, 105, 3121–3160. [Google Scholar] [CrossRef]
  56. Yeh, I.C. Concrete Compressive Strength. UCI Machine Learning Repository, 2007. Available online: https://archive.ics.uci.edu/dataset/165/concrete+compressive+strength (accessed on 4 August 2025).
  57. Wang, K.; Ren, J.; Yan, J.; Wu, X.; Dang, F. Research on a concrete compressive strength prediction method based on the random forest and LCSSA-improved BPNN. J. Build. Eng. 2023, 76, 107150. [Google Scholar] [CrossRef]
Figure 1. Schematic representation of the PSO algorithm’s workflow.
Figure 2. The individual $I_i$ selects the nearest and most influential neighbor $I_{ii}$ within the population.
Figure 3. Schematic representation of the ivy algorithm’s workflow.
Figure 4. Basic structure diagram of the AP-IVYPSO-BP model.
Figure 5. The relationship graph between concrete strength and the initial 4 input features: cement, water, fly ash, and blast furnace slag.
Figure 6. The relationship graph between concrete strength and the final 4 input features: high-efficiency water reducer, age, fine aggregate, and coarse aggregate.
Figure 7. The Pearson correlation matrix for each variable.
Figure 8. Ranking of feature importance.
Figure 9. The flowchart of the experimental procedure.
Figure 10. Scatter diagrams comparing predicted and actual test set values across models.
Figure 11. Comparison of predicted and actual values on the test set for different models.
Table 1. Details of the 26 test functions.
s/n | Category | Function Name | Formula | Range | f_min*
f1 | Unimodal | Sphere | $f_1(x)=\sum_{i=1}^{dim} x_i^2$ | [−100, 100] | 0
f2 | Unimodal | Schwefel 2.22 | $f_2(x)=\sum_{i=1}^{dim}|x_i|+\prod_{i=1}^{dim}|x_i|$ | [−10, 10] | 0
f3 | Unimodal | Schwefel 1.2 | $f_3(x)=\sum_{i=1}^{dim}\left(\sum_{j=1}^{i}x_j\right)^2$ | [−100, 100] | 0
f4 | Unimodal | Schwefel 2.21 | $f_4(x)=\max_i\{|x_i|\},\ 1\le i\le dim$ | [−100, 100] | 0
f5 | Unimodal | Step | $f_5(x)=\sum_{i=1}^{dim}\left(\lfloor x_i+0.5\rfloor\right)^2$ | [−100, 100] | 0
f6 | Unimodal | Quartic | $f_6(x)=\sum_{i=1}^{dim} i\,x_i^4+\mathrm{rand}$ | [−1.28, 1.28] | 0
f7 | Unimodal | Exponential | $f_7(x)=\sum_{i=1}^{dim}\left(e^{x_i}\,x_i\right)$ | [−10, 10] | 0
f8 | Unimodal | Sum power | $f_8(x)=\sum_{i=1}^{dim}|x_i|^{i+1}$ | [−1, 1] | 0
f9 | Unimodal | Sum square | $f_9(x)=\sum_{i=1}^{dim} i\,x_i^2$ | [−10, 10] | 0
f10 | Unimodal | Rosenbrock | $f_{10}(x)=\sum_{i=1}^{dim-1}\left[100\left(x_{i+1}-x_i^2\right)^2+\left(x_i-1\right)^2\right]$ | [−5, 10] | 0
f11 | Unimodal | Zakharov | $f_{11}(x)=\sum_{i=1}^{dim}x_i^2+\left(\sum_{i=1}^{dim}0.5\,i\,x_i\right)^2+\left(\sum_{i=1}^{dim}0.5\,i\,x_i\right)^4$ | [−5, 10] | 0
f12 | Unimodal | Trid | $f_{12}(x)=\sum_{i=1}^{dim}\left(x_i-1\right)^2-\sum_{i=2}^{dim}x_i\,x_{i-1}$ | [−5, 10] | 0
f13 | Unimodal | Elliptic | $f_{13}(x)=\sum_{i=1}^{dim}\left(10^6\right)^{(i-1)/(dim-1)}x_i^2$ | [−100, 100] | 0
f14 | Unimodal | Cigar | $f_{14}(x)=x_1^2+10^6\sum_{i=2}^{dim}x_i^2$ | [−100, 100] | 0
f15 | Fixed | Rastrigin | $f_{15}(x)=\sum_{i=1}^{dim}\left[x_i^2-10\cos(2\pi x_i)+10\right]$ | [−5.12, 5.12] | 0
f16 | Multimodal | NCRastrigin | $f_{16}(x)=\sum_{i=1}^{dim}\left[y_i^2-10\cos(2\pi y_i)+10\right]$, with $y_i=x_i$ if $|x_i|<0.5$ and $y_i=\mathrm{round}(2x_i)/2$ otherwise | [−5.12, 5.12] | 0
f17 | Multimodal | Ackley | $f_{17}(x)=-20\exp\left(-0.2\sqrt{\tfrac{1}{dim}\sum_{i=1}^{dim}x_i^2}\right)-\exp\left(\tfrac{1}{dim}\sum_{i=1}^{dim}\cos(2\pi x_i)\right)+20+e$ | [−50, 50] | 0
f18 | Multimodal | Griewank | $f_{18}(x)=\tfrac{1}{4000}\sum_{i=1}^{dim}x_i^2-\prod_{i=1}^{dim}\cos\left(\tfrac{x_i}{\sqrt{i}}\right)+1$ | [−600, 600] | 0
f19 | Fixed | Alpine | $f_{19}(x)=\sum_{i=1}^{dim}\left|x_i\sin(x_i)+0.1\,x_i\right|$ | [−10, 10] | 0
f20 | Multimodal | Penalized 1 | $f_{20}(x)=\tfrac{\pi}{dim}\left\{10\sin^2(\pi y_1)+\sum_{i=1}^{dim-1}(y_i-1)^2\left[1+10\sin^2(\pi y_{i+1})\right]+(y_{dim}-1)^2\right\}+\sum_{i=1}^{dim}u(x_i,10,100,4)$, where $y_i=1+\tfrac{x_i+1}{4}$ and $u(x_i,a,k,m)=k(x_i-a)^m$ if $x_i>a$; $0$ if $-a\le x_i\le a$; $k(-x_i-a)^m$ if $x_i<-a$ | [−100, 100] | 0
f21 | Multimodal | Penalized 2 | $f_{21}(x)=0.1\left\{\sin^2(3\pi x_1)+\sum_{i=1}^{dim-1}(x_i-1)^2\left[1+\sin^2(3\pi x_{i+1})\right]+(x_{dim}-1)^2\left[1+\sin^2(2\pi x_{dim})\right]\right\}+\sum_{i=1}^{dim}u(x_i,5,100,4)$ | [−100, 100] | 0
f22 | Fixed | Schwefel | $f_{22}(x)=-\sum_{i=1}^{dim}x_i\sin\left(\sqrt{|x_i|}\right)$ | [−100, 100] | 0
f23 | Multimodal | Lévy | $f_{23}(x)=\sin^2(3\pi x_1)+\sum_{i=1}^{dim-1}(x_i-1)^2\left[1+\sin^2(3\pi x_{i+1})\right]+(x_{dim}-1)^2\left[1+\sin^2(2\pi x_{dim})\right]$ | [−10, 10] | 0
f24 | Multimodal | Weierstrass | $f_{24}(x)=\sum_{i=1}^{dim}\sum_{k=0}^{k_{max}}a^k\cos\left(2\pi b^k(x_i+0.5)\right)-dim\sum_{k=0}^{k_{max}}a^k\cos\left(\pi b^k\right)$, $a=0.5$, $b=3$, $k_{max}=20$ | [−0.5, 0.5] | 0
f25 | Fixed | Salomon | $f_{25}(x)=1-\cos\left(2\pi\sqrt{\sum_{i=1}^{dim}x_i^2}\right)+0.1\sqrt{\sum_{i=1}^{dim}x_i^2}$ | [−100, 100] | 0
f26 | Fixed | Bohachevsky | $f_{26}(x)=\sum_{i=1}^{dim}\left[3x_i^2-0.3\cos(3\pi x_i)\right]$ | [−10, 10] | 0
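For illustration, two of the benchmarks in Table 1 can be written out directly (a minimal sketch; dim = 30 is an assumed dimensionality, which the table itself does not fix):

```python
import numpy as np

def sphere(x):        # f1: unimodal, global minimum 0 at x = 0
    return float(np.sum(x ** 2))

def rastrigin(x):     # f15: highly multimodal, global minimum 0 at x = 0
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

x0 = np.zeros(30)     # dim = 30 assumed
assert sphere(x0) == 0.0 and rastrigin(x0) == 0.0
```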
Table 2. Parameter settings of 11 algorithms.
Algorithm | Parameter Setting
ALL | Max iteration = 500; Agents = 30; Runs = 30
AP-IVYPSO | C1 = C2 = 2.0; Vmax = 0.1; alpha = 0.2; ωmax = 0.9; ωmin = 0.4
PSO | C1 = C2 = 2; V ∈ [−6, 6]; w ∈ (0.2, 0.9)
IVY | beta1 ∈ [1, 1.5]; GV ∈ [0, 1]
HFPSO | C1 = C2 = 1.49445; Vmax coef = 0.1; alpha = 0.2; beta0 = 2; gamma = 1; m = 2
HJSPSO | Cmin = 0.5; Cmax = 2.0; Wmin = 0.4; Wmax = 0.9; Bta = 0.1
BOA | a = 0.1; p = 0.6; c0 = 0.01
WOA | a: linear decrease from 2 to 0; C ∈ [0, 2]; a2: linear decrease from −1 to −2
GOOSE | SWmin = 5; SWmax = 25; coe_min = 0.17
PSOBOA | p = 0.6; power exponent = 0.1; sensory modality = 0.01
NSM-BO | pxgm_initial = 0.03; scab = 1.25; scsb = 1.3; rcpp = 0.0035; tsgs_factor_max = 0.05
FDB-AGSK | KF_pool = [0.1, 1.0, 0.5, 1.0]; KR_pool = [0.2, 0.1, 0.9, 0.9]
Table 3. Best fitness and ranking of AP-IVYPSO and other algorithms.
Func. | Metric | AP-IVYPSO | PSO | IVY | HFPSO | HJSPSO | BOA | WOA | GOOSE | PSOBOA | NSM-BO | FDB-AGSK
f1 | Best | 0 | 2.143 | 0 | 1.6538 × 10−5 | 6.3891 × 10−45 | 7.5933 × 10−11 | 5.9355 × 10−88 | 0.0109 | 1.9073 × 10−61 | 0.0004 | 7.4370 × 10−94
f1 | Rank | 1 | 11 | 1 | 8 | 6 | 7 | 4 | 10 | 5 | 9 | 3
f2 | Best | 0 | 6.213 | 0 | 0.006 | 1.7830 × 10−24 | 2.8236 × 10−8 | 5.7347 × 10−55 | 110.4415 | 2.7886 × 10−29 | 0.0011 | 2.0686 × 10−63
f2 | Rank | 1 | 10 | 1 | 9 | 6 | 7 | 4 | 11 | 5 | 8 | 3
f3 | Best | 0 | 71.8221 | 0 | 2.5028 | 1.7817 × 10−16 | 5.2076 × 10−11 | 277.7414 | 1.9139 | 2.2011 × 10−59 | 0.0589 | 281.0804
f3 | Rank | 1 | 9 | 1 | 8 | 4 | 5 | 10 | 7 | 3 | 6 | 11
f4 | Best | 0 | 2.1369 | 0 | 0.0966 | 1.8662 × 10−19 | 2.6513 × 10−8 | 5.8994 | 0.1737 | 2.9378 × 10−31 | 3.3914 | 2.0974 × 10−6
f4 | Rank | 1 | 9 | 1 | 7 | 4 | 5 | 11 | 8 | 3 | 10 | 6
f5 | Best | 0.0037 | 0.9778 | 2.2433 | 3.8506 × 10−6 | 0.5192 | 6.5074 | 0.0809 | 0.0122 | 7.0291 | 0.0009 | 1.5393
f5 | Rank | 3 | 7 | 9 | 1 | 6 | 10 | 5 | 4 | 11 | 2 | 8
f6 | Best | 3.4874 × 10−5 | 40.0583 | 9.3677 × 10−5 | 0.0301 | 0.0008 | 0.0017 | 0.0018 | 0.2424 | 0.0005 | 0.1606 | 7.2595 × 10−5
f6 | Rank | 1 | 11 | 3 | 8 | 5 | 6 | 7 | 10 | 4 | 9 | 2
f7 | Best | 7.1751 × 10−66 | 0 | 3.3691 × 10−33 | 7.1751 × 10−66 | 1.0082 × 10−64 | 8.4738 × 10−13 | 7.1751 × 10−66 | 8.2315 × 10−66 | 2.9324 × 10−13 | 7.1751 × 10−66 | 7.1751 × 10−66
f7 | Rank | 2 | 1 | 9 | 2 | 8 | 11 | 2 | 7 | 10 | 2 | 2
f8 | Best | 0 | 0.0086 | 0 | 6.3555 × 10−14 | 8.6067 × 10−113 | 1.1827 × 10−13 | 4.2612 × 10−116 | 2.5647 × 10−5 | 7.7022 × 10−72 | 5.4193 × 10−24 | 1.8839 × 10−149
f8 | Rank | 1 | 11 | 1 | 8 | 5 | 9 | 4 | 10 | 6 | 7 | 3
f9 | Best | 0 | 20.5078 | 0 | 1.7088 × 10−5 | 2.6389 × 10−48 | 7.5159 × 10−11 | 1.4417 × 10−78 | 0.1856 | 1.1040 × 10−60 | 1.3930 × 10−5 | 1.9045 × 10−105
f9 | Rank | 1 | 11 | 1 | 9 | 6 | 7 | 4 | 10 | 5 | 8 | 3
f10 | Best | 25.6917 | 1256.7495 | 26.9908 | 26.6159 | 25.3748 | 28.8438 | 27.4877 | 134.8309 | 28.9915 | 38.9212 | 28.7173
f10 | Rank | 2 | 11 | 4 | 3 | 1 | 7 | 5 | 10 | 8 | 9 | 6
f11 | Best | 0 | 194.2087 | 0 | 2.5069 × 10−5 | 1.6487 × 10−47 | 7.6444 × 10−11 | 4.9707 × 10−78 | 0.1711 | 3.0846 × 10−59 | 3.6457 | 1.8917 × 10−103
f11 | Rank | 1 | 11 | 1 | 8 | 6 | 7 | 4 | 9 | 5 | 10 | 3
f12 | Best | 0.6667 | 90.0749 | 0.6667 | 0.6667 | 0.6667 | 0.9718 | 0.6669 | 1.0268 | 0.9985 | 0.6905 | 0.6686
f12 | Rank | 2 | 11 | 1 | 4 | 3 | 8 | 5 | 10 | 9 | 7 | 6
f13 | Best | 0 | 7.3733 × 10−31 | 0 | 3.7490 × 10−43 | 5.6779 × 10−177 | 4.7441 × 10−22 | 0 | 1.7220 × 10−7 | 1.0862 × 10−62 | 0 | 0
f13 | Rank | 1 | 9 | 1 | 8 | 6 | 10 | 1 | 11 | 7 | 1 | 1
f14 | Best | 0 | 3.1468 × 10−19 | 0 | 5.8483 × 10−28 | 8.0021 × 10−118 | 1.0286 × 10−16 | 2.9238 × 10−121 | 4147.5965 | 2.9735 × 10−58 | 0 | 1.4151 × 10−164
f14 | Rank | 1 | 9 | 1 | 8 | 6 | 10 | 5 | 11 | 7 | 1 | 4
f15 | Best | 0 | 1.4621 × 10−26 | 0 | 2.4789 × 10−28 | 3.9173 × 10−162 | 4.6394 × 10−22 | 2.4224 × 10−151 | 0.0013 | 7.8454 × 10−63 | 0 | 3.9790 × 10−207
f15 | Rank | 1 | 9 | 1 | 8 | 5 | 10 | 6 | 11 | 7 | 1 | 4
f16 | Best | 0 | 116.392 | 0 | 56.7134 | 0 | 1.4211 × 10−12 | 0 | 151.8816 | 1.7053 × 10−13 | 17.9185 | 0
f16 | Rank | 1 | 10 | 1 | 9 | 1 | 7 | 1 | 11 | 6 | 8 | 1
f17 | Best | 0 | 174.1291 | 0 | 56 | 36.9823 | 126.8702 | 0 | 259.005 | 3.0749 × 10−12 | 9.0496 | 0
f17 | Rank | 1 | 10 | 1 | 8 | 7 | 9 | 1 | 11 | 5 | 6 | 1
f18 | Best | 4.4409 × 10−16 | 2.7332 | 4.4409 × 10−16 | 0.001 | 3.9968 × 10−15 | 2.8218 × 10−8 | 4.4409 × 10−16 | 17.4731 | 4.4409 × 10−16 | 1.6501 | 3.9968 × 10−15
f18 | Rank | 1 | 10 | 1 | 8 | 5 | 7 | 1 | 11 | 1 | 9 | 5
f19 | Best | 0 | 0.1477 | 0 | 1.1175 × 10−5 | 0 | 8.7103 × 10−12 | 0 | 0.045 | 0 | 0.1929 | 0
f19 | Rank | 1 | 10 | 1 | 8 | 1 | 7 | 1 | 9 | 1 | 11 | 1
f20 | Best | 0 | 4.7182 | 0 | 0.0081 | 2.4683 × 10−26 | 9.4059 × 10−11 | 2.6009 × 10−55 | 7.2385 | 6.1770 × 10−22 | 0.0003 | 9.3253 × 10−66
f20 | Rank | 1 | 10 | 1 | 9 | 5 | 7 | 4 | 11 | 6 | 8 | 3
f21 | Best | 0.0003 | 0.0524 | 0.0031 | 1.9709 × 10−8 | 0.008 | 0.8402 | 0.005 | 3.1567 | 0.9718 | 1.0872 × 10−7 | 0.0196
f21 | Rank | 3 | 8 | 4 | 1 | 6 | 9 | 5 | 11 | 10 | 2 | 7
f22 | Best | 2.5661 | 1.05 | 2.6862 | 2.9258 × 10−7 | 0.1515 | 2.5055 | 0.1836 | 0.011 | 2.4301 | 0.011 | 0.1103
f22 | Rank | 10 | 7 | 11 | 1 | 5 | 9 | 6 | 2 | 8 | 3 | 4
f23 | Best | 0.24 | 5.9907 | 1.0393 | 0.2439 | 0.2986 | 12.2412 | 0.1273 | 0.6291 | 15.6423 | 0.0465 | 0.0027
f23 | Rank | 4 | 9 | 8 | 5 | 6 | 10 | 3 | 7 | 11 | 2 | 1
f24 | Best | 0 | 1.5995 | 0 | 2.2204 × 10−16 | 0 | 4.2802 | 0 | 14.0717 | 0.0005 | 0 | 0
f24 | Rank | 1 | 9 | 1 | 7 | 1 | 10 | 1 | 11 | 8 | 1 | 1
f25 | Best | 0 | 1.5919 | 0 | 0.8955 | 0.0995 | 0.8955 | 0.0995 | 1.5919 | 0.0995 | 4.8752 | 0.0995
f25 | Rank | 1 | 9 | 1 | 7 | 5 | 8 | 3 | 10 | 6 | 11 | 4
f26 | Best | 0 | 20.5778 | 0 | 1.0702 × 10−5 | 0 | 7.0600 × 10−11 | 0 | 5.7812 | 0 | 0.0002 | 0
f26 | Rank | 1 | 11 | 1 | 8 | 1 | 7 | 1 | 10 | 1 | 9 | 1
— | Paired rank (+/=/−) | - | 24/0/2 | 7/18/1 | 22/1/3 | 20/4/2 | 25/0/1 | 16/8/2 | 25/0/1 | 22/3/1 | 17/5/4 | 17/7/2
— | Avg. rank | 1.73 | 9.35 | 2.58 | 8.69 | 4.62 | 8.04 | 4.00 | 9.35 | 6.08 | 6.15 | 3.62
— | Overall rank | 1 | 10 | 2 | 9 | 5 | 8 | 4 | 10 | 6 | 7 | 3
Table 4. Average fitness, standard deviation, and ranking of AP-IVYPSO and other algorithms.
Func. | Metric | AP-IVYPSO | PSO | IVY | HFPSO | HJSPSO | BOA | WOA | GOOSE | PSOBOA | NSM-BO | FDB-AGSK
f1 | Avg. | 0 | 2.2242 | 0 | 2.7201 × 10−5 | 1.6815 × 10−45 | 2.4795 × 10−73 | 2.1691 × 10−94 | 24.3628 | 7.6307 × 10−11 | 8.1448 × 10−2 | 1.8042 × 10−60
f1 | Std. | 0 | 1.1192 | 0 | 1.9166 × 10−5 | 5.4431 × 10−45 | 1.0646 × 10−72 | 1.1247 × 10−93 | 72.7317 | 5.9056 × 10−12 | 0.1366 | 3.8616 × 10−60
f1 | Rank | 1 | 10 | 1 | 8 | 6 | 4 | 3 | 11 | 7 | 9 | 5
f2 | Avg. | 0 | 4.3059 | 0 | 5.0487 × 10−3 | 3.6512 × 10−24 | 1.6646 × 10−50 | 1.6909 × 10−61 | 311755.5169 | 2.2675 × 10−08 | 3.3658 × 10−3 | 3.2835 × 10−29
f2 | Std. | 0 | 1.0453 | 0 | 3.2921 × 10−3 | 1.2900 × 10−23 | 5.2406 × 10−50 | 4.1066 × 10−61 | 1700375.092 | 7.0180 × 10−09 | 4.1626 × 10−3 | 3.8575 × 10−29
f2 | Rank | 1 | 10 | 1 | 9 | 6 | 4 | 3 | 11 | 7 | 8 | 5
f3 | Avg. | 0 | 84.3157 | 0 | 1.084 | 9.3012 × 10−13 | 436.7661 | 398.6451 | 2.392 | 5.3594 × 10−11 | 3.685 | 7.4448 × 10−60
f3 | Std. | 0 | 25.1033 | 0 | 0.7504 | 2.6931 × 10−12 | 150.0621 | 192.5733 | 0.875 | 7.2579 × 10−12 | 3.4865 | 3.1965 × 10−59
f3 | Rank | 1 | 9 | 1 | 6 | 4 | 11 | 10 | 7 | 5 | 8 | 3
f4 | Avg. | 0 | 1.9148 | 0 | 0.1389 | 2.4045 × 10−20 | 4.7568 | 4.051 | 0.2214 | 2.7187 × 10−08 | 2.5511 | 7.9000 × 10−31
f4 | Std. | 0 | 0.2904 | 0 | 5.9634 × 10−2 | 5.9800 × 10−20 | 2.984 | 3.7145 | 9.8125 × 10−2 | 2.6012 × 10−09 | 0.6179 | 8.1858 × 10−31
f4 | Rank | 1 | 8 | 1 | 6 | 4 | 11 | 10 | 7 | 5 | 9 | 3
f5 | Avg. | 3.2141 × 10−3 | 2.1385 | 0.4415 | 2.6084 × 10−6 | 0.1802 | 8.7297 × 10−2 | 0.5135 | 1.0139 × 10−2 | 5.3699 | 5.0195 × 10−3 | 6.291
f5 | Std. | 1.0477 × 10−3 | 0.8369 | 0.4196 | 1.3344 × 10−6 | 0.1579 | 5.1211 × 10−2 | 0.3403 | 3.3982 × 10−3 | 0.6426 | 1.5448 × 10−2 | 0.6757
f5 | Rank | 2 | 9 | 7 | 1 | 6 | 5 | 8 | 4 | 10 | 3 | 11
f6 | Avg. | 8.3694 × 10−5 | 13.6561 | 7.6814 × 10−5 | 1.8698 × 10−2 | 1.5350 × 10−03 | 2.7534 × 10−3 | 1.0183 × 10−3 | 0.1333 | 2.0667 × 10−03 | 0.1027 | 2.0759 × 10−4
f6 | Std. | 7.0251 × 10−5 | 11.6369 | 9.4172 × 10−5 | 8.0492 × 10−3 | 6.1941 × 10−04 | 2.9686 × 10−3 | 2.0364 × 10−3 | 6.3191 × 10−2 | 6.6445 × 10−04 | 3.7157 × 10−2 | 1.2431 × 10−4
f6 | Rank | 2 | 11 | 1 | 8 | 5 | 7 | 4 | 10 | 6 | 9 | 3
f7 | Avg. | 1.5811 × 10−62 | 0 | 3.6996 × 10−32 | 7.1751 × 10−66 | 2.1876 × 10−64 | 7.1751 × 10−66 | 7.1751 × 10−66 | 1.2778 × 10−65 | 5.7664 × 10−10 | 7.1751 × 10−66 | 3.7060 × 10−10
f7 | Std. | 4.8221 × 10−62 | 0 | 1.3245 × 10−31 | 3.2167 × 10−81 | 4.4300 × 10−64 | 3.2167 × 10−81 | 3.2167 × 10−81 | 2.1948 × 10−65 | 2.3165 × 10−9 | 3.2167 × 10−81 | 1.8506 × 10−9
f7 | Rank | 8 | 1 | 9 | 2 | 7 | 3 | 4 | 6 | 11 | 5 | 10
f8 | Avg. | 0 | 0.1845 | 0 | 1.3148 × 10−13 | 1.0941 × 10−115 | 3.9508 × 10−112 | 2.1483 × 10−148 | 1.4983 × 10−5 | 8.2880 × 10−14 | 4.9084 × 10−20 | 7.1843 × 10−71
f8 | Std. | 0 | 0.1404 | 0 | 2.5209 × 10−13 | 4.1900 × 10−115 | 1.3697 × 10−111 | 1.1765 × 10−147 | 1.0298 × 10−5 | 5.2208 × 10−14 | 2.5907 × 10−19 | 1.9871 × 10−70
f8 | Rank | 1 | 11 | 1 | 9 | 4 | 5 | 3 | 10 | 8 | 7 | 6
f9 | Avg. | 0 | 24.6964 | 0 | 5.8998 × 10−5 | 8.5050 × 10−46 | 3.1248 × 10−75 | 2.8005 × 10−96 | 1.1455 | 7.2583 × 10−11 | 0.1602 | 3.3952 × 10−59
f9 | Std. | 0 | 10.5308 | 0 | 4.1394 × 10−5 | 2.8022 × 10−45 | 1.5279 × 10−74 | 1.4385 × 10−95 | 0.839 | 7.1158 × 10−12 | 0.617 | 7.8186 × 10−59
f9 | Rank | 1 | 11 | 1 | 8 | 6 | 4 | 3 | 10 | 7 | 9 | 5
f10 | Avg. | 25.7396 | 968.8295 | 26.7298 | 34.2495 | 26.657 | 27.9368 | 28.7227 | 52.7595 | 28.9095 | 74.3919 | 28.9655
f10 | Std. | 0.2954 | 30.3186 | 0.7416 | 20.3032 | 0.6231 | 0.4449 | 5.4142 × 10−2 | 50.1432 | 2.5496 × 10−2 | 61.2205 | 2.0377 × 10−2
f10 | Rank | 1 | 11 | 3 | 8 | 2 | 4 | 5 | 9 | 6 | 10 | 7
f11 | Avg. | 0 | 104.1111 | 0 | 2.8299 × 10−5 | 3.8794 × 10−46 | 7.1179 × 10−75 | 4.5743 × 10−99 | 0.1626 | 6.7352 × 10−11 | 4.3200 × 10−2 | 2.7784 × 10−59
f11 | Std. | 0 | 47.2141 | 0 | 1.9534 × 10−5 | 1.4895 × 10−45 | 2.8523 × 10−74 | 1.5075 × 10−98 | 6.0467 × 10−2 | 5.9932 × 10−12 | 0.1679 | 4.4222 × 10−59
f11 | Rank | 1 | 11 | 1 | 8 | 6 | 4 | 3 | 10 | 7 | 9 | 5
f12 | Avg. | 0.6667 | 188.7117 | 0.6667 | 0.7436 | 0.6667 | 0.6669 | 0.7692 | 2.6967 | 0.9714 | 4.0598 | 0.9945
f12 | Std. | 7.9805 × 10−8 | 111.3023 | 6.4798 × 10−8 | 0.1496 | 5.7002 × 10−7 | 1.5067 × 10−4 | 0.117 | 2.3149 | 7.2435 × 10−3 | 4.2141 | 4.7855 × 10−3
f12 | Rank | 1 | 11 | 2 | 5 | 3 | 4 | 6 | 9 | 7 | 10 | 8
f13 | Avg. | 0 | 7.4583 × 10−26 | 0 | 6.9712 × 10−35 | 2.0885 × 10−175 | 0 | 0 | 3.8853 × 10−4 | 7.8267 × 10−22 | 0 | 1.8237 × 10−60
f13 | Std. | 0 | 2.0498 × 10−25 | 0 | 3.1468 × 10−34 | 0 | 0 | 0 | 5.6794 × 10−4 | 3.9371 × 10−21 | 0 | 6.9177 × 10−60
f13 | Rank | 1 | 9 | 1 | 8 | 6 | 1 | 1 | 11 | 10 | 1 | 7
f14 | Avg. | 0 | 6.2909 × 10−18 | 0 | 9.9752 × 10−25 | 3.3824 × 10−121 | 4.7568 × 10−108 | 3.0927 × 10−148 | 1424.1393 | 3.8701 × 10−17 | 0 | 9.4288 × 10−56
f14 | Std. | 0 | 1.9893 × 10−17 | 0 | 3.2933 × 10−24 | 1.2320 × 10−120 | 1.8377 × 10−107 | 1.6939 × 10−147 | 1800.3152 | 4.8754 × 10−17 | 0 | 3.6693 × 10−55
f14 | Rank | 1 | 9 | 1 | 8 | 5 | 6 | 4 | 11 | 10 | 1 | 7
f15 | Avg. | 0 | 1.5805 × 10−23 | 0 | 2.4574 × 10−23 | 1.6178 × 10−160 | 9.1804 × 10−129 | 4.5201 × 10−182 | 1.3062 × 10−2 | 8.1037 × 10−19 | 0 | 4.5225 × 10−62
f15 | Std. | 0 | 5.2171 × 10−23 | 0 | 1.2322 × 10−22 | 5.4692 × 10−160 | 4.0309 × 10−128 | 0 | 3.7231 × 10−2 | 3.2740 × 10−18 | 0 | 1.3610 × 10−61
f15 | Rank | 1 | 8 | 1 | 9 | 5 | 6 | 4 | 11 | 10 | 1 | 7
f16 | Avg. | 0 | 160.4583 | 0 | 55.2544 | 0.758 | 0 | 1.8948 × 10−15 | 160.4137 | 19.6471 | 12.1124 | 1.5460 × 10−10
f16 | Std. | 0 | 29.6627 | 0 | 23.9875 | 4.1516 | 0 | 1.0378 × 10−14 | 41.0324 | 59.2622 | 5.3807 | 7.9237 × 10−10
f16 | Rank | 1 | 11 | 1 | 9 | 6 | 1 | 4 | 10 | 8 | 7 | 5
f17 | Avg. | 0 | 173.302 | 0 | 60.7 | 20.4006 | 0 | 0 | 180.996 | 99.7438 | 8.5838 | 1.0908 × 10−7
f17 | Std. | 0 | 34.9327 | 0 | 18.9303 | 10.6438 | 0 | 0 | 44.8631 | 83.3625 | 2.7231 | 4.2793 × 10−7
f17 | Rank | 1 | 10 | 1 | 8 | 7 | 1 | 1 | 11 | 9 | 6 | 5
f18 | Avg. | 4.4409 × 10−16 | 2.6323 | 4.4409 × 10−16 | 0.1419 | 3.9968 × 10−15 | 3.5231 × 10−15 | 2.8126 × 10−15 | 8.8537 | 2.7313 × 10−8 | 0.8815 | 4.4409 × 10−16
f18 | Std. | 0 | 0.3714 | 0 | 0.4423 | 0 | 2.4210 × 10−15 | 2.1546 × 10−15 | 7.4732 | 2.1311 × 10−9 | 0.6703 | 0
f18 | Rank | 1 | 10 | 1 | 8 | 6 | 5 | 4 | 11 | 7 | 9 | 1
f19 | Avg. | 0 | 0.1393 | 0 | 1.4138 × 10−2 | 0 | 2.1449 × 10−2 | 0 | 254.9518 | 1.0409 × 10−11 | 0.2475 | 0
f19 | Std. | 0 | 5.2798 × 10−02 | 0 | 1.8405 × 10−2 | 0 | 5.6326 × 10−2 | 0 | 215.8963 | 8.5551 × 10−12 | 0.3513 | 0
f19 | Rank | 1 | 9 | 1 | 7 | 1 | 8 | 1 | 11 | 6 | 10 | 1
f20 | Avg. | 0 | 5.0555 | 0 | 1.3143 × 10−2 | 3.1090 × 10−25 | 7.7033 × 10−39 | 2.4542 × 10−61 | 6.5153 | 9.4890 × 10−9 | 8.4556 × 10−4 | 3.4163 × 10−19
f20 | Std. | 0 | 2.4515 | 0 | 1.1284 × 10−2 | 3.7341 × 10−25 | 4.2193 × 10−38 | 1.1818 × 10−60 | 2.5439 | 1.7541 × 10−8 | 7.7548 × 10−4 | 1.8447 × 10−18
f20 | Rank | 1 | 10 | 1 | 9 | 5 | 4 | 3 | 11 | 7 | 8 | 6
f21 | Avg. | 2.6129 × 10−4 | 6.4649 × 10−2 | 1.8404 × 10−2 | 0.0276 | 3.4536 × 10−3 | 2.4664 × 10−2 | 1.6021 × 10−2 | 3.5507 | 0.5319 | 1.7249 × 10−2 | 0.8967
f21 | Std. | 9.6524 × 10−5 | 9.1396 × 10−2 | 1.5129 × 10−2 | 6.0465 × 10−2 | 4.3030 × 10−3 | 7.8161 × 10−2 | 1.3305 × 10−2 | 1.0116 | 0.1661 | 3.9212 × 10−2 | 0.1935
f21 | Rank | 1 | 8 | 5 | 7 | 2 | 6 | 3 | 11 | 9 | 4 | 10
f22 | Avg. | 2.4622 | 0.5032 | 2.8295 | 4.0291 × 10−3 | 0.3516 | 0.1426 | 0.1419 | 1.0274 × 10−2 | 2.841 | 7.7291 × 10−3 | 2.8331
f22 | Std. | 0.9841 | 0.2057 | 0.4535 | 5.3853 × 10−3 | 0.258 | 7.9898 × 10−2 | 0.1008 | 7.5354 × 10−3 | 0.2182 | 1.1225 × 10−2 | 0.2424
f22 | Rank | 8 | 7 | 9 | 1 | 6 | 5 | 4 | 3 | 11 | 2 | 10
f23 | Avg. | 0.6783 | 6.2706 | 1.2216 | 0.3475 | 0.3146 | 0.5131 | 0.2782 | 0.7417 | 11.7537 | 4.5073 × 10−2 | 16.2517
f23 | Std. | 0.6903 | 3.2296 | 1.1141 | 1.05 | 0.3326 | 0.9707 | 0.3892 | 0.5245 | 2.3392 | 6.6888 × 10−2 | 2.4806
f23 | Rank | 6 | 9 | 8 | 4 | 3 | 5 | 2 | 7 | 10 | 1 | 11
f24 | Avg. | 0 | 4.9842 | 2.4961 × 10−6 | 0.1789 | 0 | 0 | 0 | 10.3472 | 0.874 | 3.1999 × 10−3 | 5.4261 × 10−5
f24 | Std. | 0 | 4.2589 | 1.3672 × 10−5 | 0.6958 | 0 | 0 | 0 | 5.5363 | 2.2389 | 1.1460 × 10−2 | 1.1400 × 10−4
f24 | Rank | 1 | 10 | 5 | 8 | 1 | 1 | 1 | 11 | 9 | 7 | 6
f25 | Avg. | 0 | 1.879 | 0 | 0.9452 | 9.9496 × 10−2 | 0.1824 | 9.2863 × 10−2 | 1.6019 | 0.8031 | 5.9099 | 0.0995
f25 | Std. | 0 | 0.5026 | 0 | 0.3394 | 1.9179 × 10−8 | 0.1457 | 0.1137 | 0.4884 | 0.1906 | 2.0133 | 1.9888 × 10−6
f25 | Rank | 1 | 10 | 1 | 8 | 4 | 6 | 3 | 9 | 7 | 11 | 5
f26 | Avg. | 0 | 22.0237 | 0 | 0.3632 | 0 | 0 | 0 | 5.3291 | 7.5784 × 10−11 | 0.4015 | 0
f26 | Std. | 0 | 5.5998 | 0 | 0.6823 | 0 | 0 | 0 | 2.1236 | 6.8529 × 10−12 | 0.912 | 0
f26 | Rank | 1 | 11 | 1 | 8 | 1 | 1 | 1 | 10 | 7 | 9 | 1
— | Paired rank (+/=/−) | - | 24/0/2 | 8/19/1 | 22/0/4 | 20/3/3 | 18/5/3 | 18/5/3 | 24/0/2 | 26/0/0 | 20/3/3 | 23/0/3
— | Avg. rank | 1.81 | 9.38 | 2.54 | 6.92 | 4.50 | 4.69 | 3.77 | 9.31 | 7.92 | 6.65 | 5.88
— | Overall rank | 1 | 11 | 2 | 8 | 4 | 5 | 3 | 10 | 9 | 7 | 6
Table 5. Friedman ranking of each algorithm.
Algorithm | Score | Rank
AP-IVYPSO | 1.8587 | 1
PSO | 9.2135 | 10
IVY | 2.6871 | 2
HFPSO | 6.7202 | 8
HJSPSO | 4.4745 | 4
BOA | 4.6521 | 5
WOA | 5.8786 | 6
GOOSE | 9.236 | 11
PSOBOA | 7.931 | 9
NSM-BO | 6.6456 | 7
FDB-AGSK | 3.7476 | 3
Table 6. Wilcoxon signed-rank test results for AP-IVYPSO and the other ten algorithms with α = 0.05.
Algorithm pair | p-Value | Significant
AP-IVYPSO vs. PSO | 0.00005 | Yes
AP-IVYPSO vs. IVY | 0.01088 | Yes
AP-IVYPSO vs. HFPSO | 0.00187 | Yes
AP-IVYPSO vs. HJSPSO | 0.0105 | Yes
AP-IVYPSO vs. BOA | 0.00117 | Yes
AP-IVYPSO vs. WOA | 0.01614 | Yes
AP-IVYPSO vs. GOOSE | 0.00003 | Yes
AP-IVYPSO vs. PSOBOA | 0.00679 | Yes
AP-IVYPSO vs. NSM-BO | 0.00051 | Yes
AP-IVYPSO vs. FDB-AGSK | 0.02367 | Yes
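Both tests are standard and available in SciPy; the sketch below illustrates how such a comparison can be run, with placeholder scores rather than the paper's per-function results:

```python
import numpy as np
from scipy import stats

# Placeholder per-function scores for three algorithms (not the paper's data).
scores = {
    "AP-IVYPSO": np.array([0.00, 0.01, 0.24, 0.0037, 0.0003]),
    "PSO":       np.array([2.14, 6.21, 5.99, 0.9778, 0.0524]),
    "IVY":       np.array([0.00, 0.02, 1.04, 2.2433, 0.0031]),
}

# Friedman test: do the algorithms' rankings differ across the same functions?
stat, p = stats.friedmanchisquare(*scores.values())
print(f"Friedman chi2 = {stat:.3f}, p = {p:.4f}")

# Wilcoxon signed-rank test: paired comparison against one rival at alpha = 0.05.
w, p = stats.wilcoxon(scores["AP-IVYPSO"], scores["PSO"])
print(f"Wilcoxon W = {w:.3f}, p = {p:.4f}")
```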
Table 7. Statistical information for the HPC dataset.
Type | Variable | Minimum | Maximum | Average | Standard Deviation | Unit
Input | cement | 102 | 540 | 281.1 | 104.54 | kg/m3
Input | blast furnace slag | 0 | 359.4 | 73.97 | 86.29 | kg/m3
Input | fly ash | 0 | 200.1 | 54.24 | 64.01 | kg/m3
Input | water | 121.8 | 247 | 181.55 | 21.35 | kg/m3
Input | superplasticizer | 0 | 32.2 | 6.21 | 5.97 | kg/m3
Input | coarse aggregate | 801 | 1145 | 972.92 | 77.79 | kg/m3
Input | fine aggregate | 594 | 992.6 | 773.58 | 80.21 | kg/m3
Input | age | 1 | 365 | 45.62 | 63.19 | days
Output | compressive strength | 2.33 | 82.6 | 35.82 | 16.71 | MPa
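As a minimal sketch of how this dataset can be prepared for the experiments (assuming a local CSV export of the UCI data [56] with an illustrative target column name; the split ratio and standardization are common choices rather than the paper's documented pipeline):

```python
import numpy as np
import pandas as pd

# "concrete.csv" is an assumed local export of the UCI dataset [56].
df = pd.read_csv("concrete.csv")
y = df["compressive_strength"].to_numpy(dtype=float)
X = df.drop(columns=["compressive_strength"]).to_numpy(dtype=float)

# z-score standardization of the eight inputs
X = (X - X.mean(axis=0)) / X.std(axis=0)

# simple 80/20 train/test split
rng = np.random.default_rng(42)
idx = rng.permutation(len(X))
cut = int(0.8 * len(X))
X_tr, y_tr = X[idx[:cut]], y[idx[:cut]]
X_te, y_te = X[idx[cut:]], y[idx[cut:]]
```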
Table 8. Parameter settings of BP, PSO-BP, GA-BP, IVY-BP, and AP-IVYPSO-BP.
Model | Parameter Setting
BP | Epochs = 1000; Error Goal = 0.000001; Learning Rate = 0.01
PSO-BP | C1 = C2 = 4.494; V ∈ (−1, 1); ω = 0.2
GA-BP | Selection pressure = 0.09; Crossover Rate = 2; Mutation Rate = [2, 50, 3]
IVY-BP | N = 50; α = 0.9 × (1 − t/MaxIter); GV ∈ [0, 1]
AP-IVYPSO-BP | C1 = C2 = 2.0; Vmax = 0.1; alpha = 0.2; ωmax = 0.9; ωmin = 0.4
AP-IVYPSO-BP C 1 = C 2 = 2.0 , V m a x = 0.1 , a l p h a = 0.2 , ω m a x = 0.9 , ω m i n = 0.4
Table 9. Prediction results of different models.
Index | BPNN | PSO-BP | GA-BP | IVY-BP | AP-IVYPSO-BP
R2 | 0.8533 | 0.8631 | 0.8885 | 0.9385 | 0.9542
MAE | 5.08 | 4.9142 | 4.2209 | 3.3559 | 3.0404
RMSE | 6.4495 | 6.2298 | 5.6238 | 4.1748 | 3.7991
Time | 0.4466 | 0.8946 | 0.9136 | 0.7836 | 0.9667
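For reference, the three accuracy metrics reported above are computed as follows (a minimal sketch over the test-set predictions):

```python
import numpy as np

def r2(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

def mae(y_true, y_pred):
    return float(np.mean(np.abs(y_true - y_pred)))

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```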
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Zhang, K.; Li, X.; Zhang, S.; Zhang, S. A Bio-Inspired Adaptive Probability IVYPSO Algorithm with Adaptive Strategy for Backpropagation Neural Network Optimization in Predicting High-Performance Concrete Strength. Biomimetics 2025, 10, 515. https://doi.org/10.3390/biomimetics10080515

AMA Style

Zhang K, Li X, Zhang S, Zhang S. A Bio-Inspired Adaptive Probability IVYPSO Algorithm with Adaptive Strategy for Backpropagation Neural Network Optimization in Predicting High-Performance Concrete Strength. Biomimetics. 2025; 10(8):515. https://doi.org/10.3390/biomimetics10080515

Chicago/Turabian Style

Zhang, Kaifan, Xiangyu Li, Songsong Zhang, and Shuo Zhang. 2025. "A Bio-Inspired Adaptive Probability IVYPSO Algorithm with Adaptive Strategy for Backpropagation Neural Network Optimization in Predicting High-Performance Concrete Strength" Biomimetics 10, no. 8: 515. https://doi.org/10.3390/biomimetics10080515

APA Style

Zhang, K., Li, X., Zhang, S., & Zhang, S. (2025). A Bio-Inspired Adaptive Probability IVYPSO Algorithm with Adaptive Strategy for Backpropagation Neural Network Optimization in Predicting High-Performance Concrete Strength. Biomimetics, 10(8), 515. https://doi.org/10.3390/biomimetics10080515

Article Metrics

Back to TopTop