Article

Crisscross Flower Fertilization Optimization (CCFFO): A Bio-Inspired Metaheuristic for Global and Reservoir Production Optimization

1 School of Geosciences, Yangtze University, Wuhan 430100, China
2 Key Laboratory of Exploration Technologies for Oil and Gas Resources, Yangtze University, Wuhan 430100, China
* Author to whom correspondence should be addressed.
Biomimetics 2025, 10(9), 633; https://doi.org/10.3390/biomimetics10090633
Submission received: 29 August 2025 / Revised: 15 September 2025 / Accepted: 17 September 2025 / Published: 19 September 2025
(This article belongs to the Special Issue Advances in Biological and Bio-Inspired Algorithms)

Abstract

Developing solutions for complex optimization problems is fundamental to progress in many scientific and engineering disciplines. The Flower Fertilization Optimization (FFO) algorithm, a powerful metaheuristic inspired by the reproductive processes of flowering plants, is one such method. Nevertheless, FFO’s effectiveness can be hampered by a decline in population diversity during the search process, which increases the risk of the algorithm stagnating in local optima. To address this shortcoming, this work proposes an improved method called Crisscross Flower Fertilization Optimization (CCFFO). It enhances the FFO framework by incorporating a crisscross (CC) operator, a mechanism that facilitates a structured exchange of information between different solutions. By doing so, CCFFO effectively boosts population diversity and improves its capacity to avoid local optima. Rigorous testing on the challenging CEC2017 benchmark suite confirms CCFFO’s superiority; it achieved the top overall rank when compared against ten state-of-the-art algorithms. Furthermore, its practical effectiveness is demonstrated on a complex reservoir production optimization problem, where CCFFO secured a higher Net Present Value (NPV) than its competitors. These results highlight CCFFO’s potential as a powerful and versatile tool for solving complex, real-world optimization tasks.

1. Introduction

Effective decision-making is the bedrock of success in any complex system, whether in technological design, financial management, or logistical planning [1]. In a quantitative framework, this translates to the search for an optimal solution within a constrained, high-dimensional decision space—a process formally known as optimization. The goal is to systematically identify a course of action that yields the best possible outcome according to a predefined metric of success [2]. From scheduling airline routes to minimize fuel consumption to designing drug molecules for maximum therapeutic effect, the impact of superior optimization is profound. As the scale and complexity of these problems escalate, they invariably push beyond the limits of human intuition and conventional analytical methods, creating a critical and ever-growing demand for advanced computational tools that can navigate these intricate challenges effectively [3,4,5].
Mathematically, these real-world challenges manifest as objective functions of formidable complexity. They are frequently non-differentiable, discontinuous, and defined over a high-dimensional domain where the number of potential solutions grows exponentially with the number of variables [6]. The solution landscape is typically multimodal, populated by a vast number of local optima that can mislead an optimizer away from the true global best [7]. The presence of intricate, often nonlinear constraints further complicates the search by creating disjoint feasible regions and making it difficult to even identify a valid solution, let alone the optimal one. This inherent structural complexity forms a significant barrier, rendering many classical optimization techniques either inapplicable or inefficient [8,9,10].
To navigate such challenges, a well-established portfolio of classical optimization techniques has been historically employed, encompassing gradient-based approaches like the conjugate gradient method [11] and direct search algorithms such as the simplex method [6]. The power of these methods stems from their rigorous mathematical foundations, which often guarantee convergence under specific, idealized conditions [1]. However, this very reliance on idealized properties becomes their critical vulnerability. When confronted with the non-convex, multimodal, and often discontinuous nature of real-world problems, their theoretical guarantees evaporate. Gradient-based methods, being inherently local in scope, frequently get trapped in the nearest suboptimal peak, while direct search and programming methods often become computationally intractable as problem dimensionality increases—a phenomenon famously known as the “curse of dimensionality” [12].
To address the shortcomings of classical methods, researchers have increasingly turned to a powerful family of solution techniques: metaheuristic algorithms [13,14]. These algorithms were conceived specifically to tackle the kind of complex optimization problems that are intractable for traditional approaches. Free from the restrictive requirements of continuity or differentiability, metaheuristics employ sophisticated stochastic operators to explore the solution space globally [15]. This means they interact with a problem solely by providing a potential solution and receiving a corresponding fitness value, making them directly applicable to a wide range of problem representations, including complex simulations or surrogate models. They achieve a robust search performance by mimicking successful strategies found in nature, such as Darwinian evolution or the foraging behavior of swarms. This nature-inspired foundation grants them the remarkable ability to avoid premature convergence and effectively home in on high-quality solutions in even the most challenging optimization landscapes [16].
The diverse strategies employed by metaheuristics generally fall into two major streams of thought. The first is rooted in evolutionary theory, called Evolutionary Algorithms (EAs) [7]. These algorithms, such as the widely used Genetic Algorithm (GA) [17] and Differential Evolution (DE) [18], iteratively breed superior solutions by applying operators that mimic genetic selection, crossover, and mutation. The second stream is inspired by the emergent problem-solving capabilities of social colonies, forming the basis of Swarm Intelligence (SI). Prominent examples like Particle Swarm Optimization (PSO) [19] and Ant Colony Optimization (ACO) [20] leverage a population of interacting agents that follow simple local rules, which collectively guide the entire swarm towards the global optimum.
Despite the proliferation and demonstrated success of these diverse metaheuristics, the No Free Lunch (NFL) theorem [21] mathematically demonstrates that, when averaged across all possible optimization problems, no single algorithm can outperform any other. This fundamental principle indicates that there is no universally superior or “one-size-fits-all” optimizer. An algorithm’s strong performance on one type of problem often comes at the expense of its effectiveness on another. This insight highlights the importance of continuing to explore and design novel or improved algorithms specifically adapted to address particular complex challenges more efficiently.
The principles of the NFL theorem are particularly evident in reservoir production optimization [22], a domain where the need for tailored algorithms is paramount. The primary objective in this high-stakes field is to maximize the economic value of an oilfield, typically quantified by the Net Present Value (NPV). This is achieved by strategically managing injection and production wells throughout the reservoir’s lifetime [23]. This task is exceptionally complex due to profound subsurface heterogeneity, highly nonlinear multiphase fluid dynamics, and a vast, high-dimensional decision space. Consequently, classical optimization methods that depend on simplified assumptions or local gradient information are fundamentally ill-equipped to handle the rugged, computationally expensive, and constrained nature of this problem, thereby creating a fertile ground for the application of advanced metaheuristic approaches.
Scholarly efforts in reservoir production optimization have largely bifurcated into two primary thrusts: developing high-level strategies to improve overall optimization efficiency and direct enhancement of the core optimization algorithms themselves. The first and arguably most critical research direction addresses the immense computational burden of full-physics simulations, which is the primary bottleneck in practical applications. The most prominent strategy in this category is the use of surrogate-assisted frameworks. For instance, Wang et al. [24] introduced a multi-surrogate framework (MSFASM) that adaptively selects models to reduce reliance on expensive evaluations. Beyond simply replacing simulations, another powerful efficiency-driven approach is knowledge transfer, where insights from previously solved problems are leveraged to accelerate new optimizations. The competitive knowledge transfer (CKT) method by Cao et al. (2023) [25] exemplifies this by intelligently reusing historical development schemes. The scope of this efficiency-focused research also extends to redefining the problem and solution paradigm. Oliver et al. (2025) [26], for example, tackled the multi-objective problem of balancing economic profit and CO2 emissions, while Zahedi-Seresht et al. (2024) [27] introduced a novel Q-learning approach from reinforcement learning to find optimal production rates in a model-free manner. These methods collectively aim to make the optimization process more tractable and cost-effective. While the aforementioned strategies enhance efficiency, another vital research thrust focuses on fundamentally improving the internal mechanics of the core optimization algorithms to bolster their search capabilities. A notable trend here is the integration of hybrid strategies to achieve a better balance between exploration and exploitation. 
These efforts seek to make the algorithms inherently more powerful, enabling them to find better solutions within a given computational budget.
Among the array of nature-inspired algorithms, the Flower Fertilization Optimization (FFO) algorithm [28], which simulates the pollination process of flowering plants, has distinguished itself as a particularly effective method. Its core strength lies in a dual-search mechanism that elegantly models both global pollination (cross-pollination) for exploration and local pollination (self-pollination) for exploitation. This bio-inspired design allows FFO to maintain a robust balance, leading to its successful application across numerous optimization benchmarks. However, a closer examination of its search operators reveals a structural limitation. The primary update mechanism guides individuals largely based on their own trajectory and the single global best solution, thereby offering limited channels for direct, peer-to-peer information exchange within the population. This relative isolation of search agents can lead to a gradual decline in population diversity as the search progresses, increasing the risk of premature convergence, especially on complex, deceptive landscapes. Therefore, enhancing inter-solution communication within the FFO framework presents a clear and promising avenue for improvement.
The motivation for this work stems from a structural weakness in the standard Flower Fertilization Optimization (FFO) algorithm. While FFO effectively balances global and local search, it has a significant limitation: individuals in the population primarily learn from the single global best solution. This approach restricts direct information sharing between other solutions. As a result, the population can lose diversity over time, making the algorithm prone to premature convergence, where it gets stuck on a suboptimal solution. To address this limitation, we propose the Crisscross Flower Fertilization Optimization (CCFFO). Our method introduces a crisscross (CC) operator that enables direct and structured information exchange between different individuals. We hypothesize that by improving inter-solution communication, CCFFO can better maintain population diversity, avoid premature convergence, and achieve superior optimization performance.
The contributions of this work are as follows:
  • A novel Crisscross Flower Fertilization Optimization (CCFFO) algorithm is introduced. By combining the balanced search strategy of FFO with a crisscross operator, CCFFO enhances population diversity, accelerates convergence, and improves overall optimization effectiveness.
  • The proposed method is thoroughly evaluated on the CEC2017 benchmark suite, where it is compared against a wide range of established metaheuristics. Its superiority is further confirmed through rigorous statistical analyses, including the Friedman and Wilcoxon signed-rank tests.
  • The practical value of CCFFO is demonstrated in a reservoir production optimization case study. Results show that the algorithm achieves higher economic returns (NPV) and exhibits strong robustness in tackling real-world optimization problems.
The remainder of this paper is organized as follows: Section 2 reviews the original FFO algorithm. Section 3 introduces the proposed CCFFO method and the integration of the crisscross operator. Section 4 presents the experimental design, benchmark results, and statistical evaluations. Section 5 applies CCFFO to reservoir production optimization. Finally, Section 6 concludes the work with a summary and directions for future research.

2. The Original FFO

The FFO algorithm, a metaheuristic algorithm proposed in 2025 by Albedran et al. [28], was inspired by the natural fertilization processes of flowering plants. The FFO algorithm emulates the journey of pollen grains towards ovules by integrating several key mechanisms: a global search strategy powered by Lévy flights to model long-distance pollination, a local search phase characterized by velocity reduction to simulate final approach, and a population mixing and survival strategy based on elitism. The primary mathematical model of the FFO algorithm is structured as follows:
1. Global Search via Lévy Flights: The global exploration capability of FFO is inspired by the long-distance dispersal of pollen grains. In nature, pollen carried by wind or pollinators can reach territories far from its origin. FFO models this with Lévy flights, a random walk pattern characterized by many short steps punctuated by occasional long-distance jumps, much as a bee forages locally before flying to a distant patch of flowers, or as a gust of wind carries pollen unpredictably far. These large, abrupt moves across the search space underpin the algorithm's global exploration capability and help prevent it from becoming trapped in local optima. The global search component influences the update of a solution's position X using a step size L derived from a Lévy distribution:
$$L = \frac{u}{|v|^{1/\xi}}$$

where $u \sim N(0, \sigma^2)$ and $v \sim N(0, 1)$, and $\sigma$ is calculated as

$$\sigma = \left( \frac{\Gamma(1+\xi)\,\sin(\pi\xi/2)}{\Gamma\!\left(\frac{1+\xi}{2}\right)\, \xi\, 2^{(\xi-1)/2}} \right)^{1/\xi}$$

where Γ is the Gamma function and ξ is the Lévy distribution parameter, typically set in the interval (1, 2], with 1.5 being a common value.
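As a concrete illustration, the Lévy step defined by the two equations above can be sampled with Mantegna's algorithm. The following Python sketch (the function name and defaults are ours, not from the paper) draws one D-dimensional step:

```python
import math
import numpy as np

def levy_step(dim, xi=1.5, rng=None):
    """Sample a Levy-distributed step via Mantegna's algorithm.

    xi is the Levy distribution parameter; 1.5 is the common value
    noted in the text. Returns a step vector of length `dim`.
    """
    rng = np.random.default_rng() if rng is None else rng
    sigma = (math.gamma(1 + xi) * math.sin(math.pi * xi / 2)
             / (math.gamma((1 + xi) / 2) * xi * 2 ** ((xi - 1) / 2))) ** (1 / xi)
    u = rng.normal(0.0, sigma, dim)    # u ~ N(0, sigma^2)
    v = rng.normal(0.0, 1.0, dim)      # v ~ N(0, 1)
    return u / np.abs(v) ** (1 / xi)   # L = u / |v|^(1/xi)
```

The heavy-tailed ratio u/|v|^(1/ξ) occasionally produces very large steps, which is exactly the long-jump behavior that drives FFO's global exploration.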
2. Local Search and Guided Movement: FFO’s local exploitation mechanism mimics the final, focused stage of fertilization. As a pollen grain nears an ovule, its movement slows and becomes more directed. The algorithm models this by reducing the velocity of solutions and guiding them towards the best-performing regions of the search space. This is achieved through two complementary actions: a velocity reduction controlled by a damping factor and a movement vector that pulls solutions towards a collective center defined by the best (X_first), worst (X_end), and median (X_middle) solutions in the current population. The velocity V is updated as:
$$V_i^{t+1} = V_i^{t} \, e^{\frac{1}{\gamma t + 1}}$$

where the reduction coefficient γ is damped over iterations t by a factor β. The overall position update integrates this local search with the global exploration of Lévy flights into a unified rule, as detailed in the original literature [28].
3. Population Mixing and Survival Strategy: To ensure the propagation of high-quality solutions and maintain a healthy level of competition, FFO employs a population mixing and survival strategy inspired by the concept of elitism. In nature, the fittest pollen grains are most likely to succeed. FFO simulates this by combining the newly generated solutions (the newpollen population) from the current iteration with the existing parent population. This merged group, containing both old and new individuals, is then subjected to a competitive exclusion process. The individuals are sorted based on their fitness (cost value), and only the top-ranking solutions—equivalent to the original population size—are selected to survive and proceed to the next generation. This mechanism guarantees that the best solution found at any point in the search is never lost and provides a strong selective pressure that continually guides the population towards more optimal regions of the search space.
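The merge-and-truncate survival step described above can be written in a few lines; here is a minimal Python sketch (the array layout and function name are our own assumptions):

```python
import numpy as np

def elitist_survival(pop, costs, new_pop, new_costs):
    """Merge the parent population with the newly generated pollen and
    keep only the N best individuals (competitive exclusion, minimization).

    pop, new_pop: (N, D) position arrays; costs, new_costs: (N,) fitness arrays.
    Returns the surviving population and costs, sorted in ascending cost order.
    """
    n = pop.shape[0]
    merged = np.vstack([pop, new_pop])
    merged_costs = np.concatenate([costs, new_costs])
    order = np.argsort(merged_costs)[:n]   # top-N ranking: the best are never lost
    return merged[order], merged_costs[order]
```

Because the merged pool always contains the previous generation, the incumbent best solution can never be discarded, which is the elitism guarantee the text describes.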
The flowchart of FFO is shown in Figure 1.
In the figures, “FES” denotes the current number of function evaluations, while “MaxFES” represents the maximum number of function evaluations allowed.

3. Proposed CCFFO

3.1. Crisscross Strategy

The CC strategy, first conceptualized within the Crisscross Optimization (CSO) algorithm, is a sophisticated search mechanism designed to enhance population diversity and accelerate convergence. It achieves this by implementing two distinct yet complementary search modalities: Horizontal Crossover (HC) and Vertical Crossover (VC). The HC modality facilitates a rich exchange of information between different individuals, enabling the population to collaboratively explore the search space. Conversely, the VC modality promotes exploration within a single individual by creating novel solutions from its own dimensional components. By integrating this dual-axis crossover strategy into the FFO framework, we aim to overcome the limitations of its original update rule, providing a more structured and potent mechanism for generating trial solutions and effectively avoiding premature stagnation in local optima.
For two distinct parent solutions P_a and P_b selected from the population, the Horizontal Crossover produces an offspring in each dimension d as

$$O_{a,d} = \rho_1 P_{a,d} + (1 - \rho_1) P_{b,d} + \alpha \left( P_{a,d} - P_{b,d} \right)$$

where ρ1 is a uniformly distributed random number in [0, 1], and α is a random scaling factor in [−1, 1] that controls the contribution of the difference vector. The scaling factor α allows for both interpolation and extrapolation, enhancing the algorithm's exploratory power. However, this process can generate new solutions that fall outside the predefined search boundaries. To ensure all solutions remain within the feasible region, a clamping mechanism is applied: if any dimension of a newly generated offspring exceeds the search limits, its value is reset to the nearest boundary. A similar operation is performed to generate a second offspring from P_b.
Subsequently, the Vertical Crossover is applied. For a given parent solution P_a, two distinct dimensions, d1 and d2, are chosen at random. The algorithm generates a new offspring by interpolating between them:

$$O_{a,d_1} = \rho_2 P_{a,d_1} + (1 - \rho_2) P_{a,d_2}$$

where ρ2 is another random number in [0, 1]. This new value replaces the original P_{a,d1}, creating a new candidate solution. Following crossover in both HC and VC, a greedy selection mechanism is applied: for each parent-offspring pair, only the solution with superior fitness survives into the subsequent generation.
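A compact sketch of the full CC strategy, combining HC, VC, boundary clamping, and greedy selection, is given below. The random pairing of parents for HC and the in-place update order are implementation details we assume here; they are not prescribed by the text:

```python
import numpy as np

def crisscross(pop, costs, f, lb, ub, rng=None):
    """One pass of the crisscross (CC) operator: Horizontal Crossover (HC)
    between random parent pairs, then Vertical Crossover (VC) within each
    parent, each followed by greedy parent-offspring selection (minimization).
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = pop.shape

    # --- Horizontal Crossover: peer-to-peer information exchange ---
    perm = rng.permutation(n)
    for k in range(0, n - 1, 2):
        a, b = perm[k], perm[k + 1]
        for p, q in ((a, b), (b, a)):              # one offspring per parent
            rho = rng.random(d)                     # rho_1 in [0, 1]
            alpha = rng.uniform(-1.0, 1.0, d)       # scaling factor in [-1, 1]
            off = rho * pop[p] + (1 - rho) * pop[q] + alpha * (pop[p] - pop[q])
            off = np.clip(off, lb, ub)              # clamp to the feasible region
            c = f(off)
            if c < costs[p]:                        # greedy selection
                pop[p], costs[p] = off, c

    # --- Vertical Crossover: recombine two dimensions of one parent ---
    for i in range(n):
        d1, d2 = rng.choice(d, size=2, replace=False)
        rho2 = rng.random()
        off = pop[i].copy()
        off[d1] = rho2 * pop[i][d1] + (1 - rho2) * pop[i][d2]
        c = f(off)
        if c < costs[i]:
            pop[i], costs[i] = off, c

    return pop, costs
```

Because every replacement is guarded by greedy selection, one CC pass can never worsen the best cost in the population, while the HC difference term and VC dimension mixing inject fresh diversity.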

3.2. The Proposed CCFFO

This section introduces the proposed Crisscross Flower Fertilization Optimization (CCFFO) algorithm. The CCFFO framework enhances the standard FFO by strategically integrating a crisscross (CC) operator into its iterative loop.
After initializing the population, each iteration begins by applying the standard FFO update rules. Subsequently, the CC operator is employed on the resulting population to facilitate multi-dimensional information exchange and generate new trial solutions. A selection mechanism then chooses the fittest individuals from the combined pool to form the next generation. This process repeats until a predefined stopping criterion is met. The complete workflow of CCFFO is depicted in Figure 2.
Algorithm 1 provides the pseudo-code for the CCFFO.
Algorithm 1 Pseudo-code of the CCFFO
Set parameters: MaxFES, population size N, dim, minL, maxL, β, γ
Initialize population P
FEs ← 0
For i = 1 : N
    P(i).Position ← random value in [minL, maxL]
    P(i).Velocity ← P(i).Position
    P(i).Cost ← f(P(i).Position)
    FEs ← FEs + 1
End For
Sort population by Cost in ascending order
GlobalBest ← P(1)
While FEs < MaxFES
    newPollen ← empty population
    For each pollen in P
        Calculate K = (X_Best + X_Median + X_Worst) / 3
        Generate a Lévy step size L
        Compute S = L · (X_current − V_current)
        Update velocity V_new = V_old · e^(1/(γt + 1))
        Update position X_new = X_current − V_new + S · K · rand()
        Clamp X_new to [minL, maxL]
        Compute cost of X_new
        FEs ← FEs + 1
        Add the new pollen to newPollen
    End For
    Merge P and newPollen
    Sort the merged population by Cost in ascending order
    P ← retain the top N agents from the sorted population
    GlobalBest ← P(1)
    /* CC Strategy */
    For i = 1 : N
        Perform Horizontal Crossover search to update P(i).Position
        Perform Vertical Crossover search to update P(i).Position
        P(i).Cost ← f(P(i).Position)
        FEs ← FEs + 1
    End For
    Sort population by Cost in ascending order
    GlobalBest ← P(1)
    Update γ_new = γ_old · β
End While
Return GlobalBest
The computational complexity of CCFFO is determined by four main components: population initialization, fitness evaluation, the standard FFO updates, and the integrated crisscross (CC) operator. Let T be the maximum number of iterations, N the population size, and D the problem dimension. The total complexity is the sum O(initialization) + O(fitness evaluation) + O(FFO update) + O(CC operator). Since each of these terms costs at most O(N × D) per iteration, the overall computational complexity of CCFFO simplifies to O(T × N × D). Although the CC operator adds computational steps, its per-iteration cost is also O(N × D), so it does not increase the algorithm's asymptotic complexity. CCFFO therefore maintains the same level of computational efficiency as its predecessor and other common metaheuristic algorithms, such as CSO.

4. Experimental Results and Analysis

A rigorous empirical investigation was conducted to assess the optimization capabilities of the proposed CCFFO algorithm. The CEC2017 test suite, comprising 29 benchmark functions, served as the primary testbed for this quantitative assessment. To ensure an unbiased comparison, the parameters for CCFFO were adopted from the original FFO study, and all algorithms were parameterized identically, with a population size of 30 and problem dimensionality of 30. The search process was constrained by a budget of 300,000 function evaluations. To address the stochastic nature of metaheuristics, 30 independent trials were performed for each function, with the mean and standard deviation of the outcomes used as the primary performance metrics.

4.1. Benchmark Functions Overview

The performance evaluation is based on the 29 functions from the CEC2017 test suite [29]. The CEC2017 suite encompasses four distinct function categories—unimodal, multimodal, hybrid, and composition—facilitating a thorough evaluation of algorithm performance across diverse fitness landscapes. Details of these benchmark functions are provided in Table 1.

4.2. Comparative Analysis on Benchmark Functions

This section presents a comprehensive comparative analysis of the proposed Crisscross Flower Fertilization Optimization (CCFFO) algorithm against its original counterpart, FFO, and eight other widely recognized metaheuristic algorithms: DE [18], GWO [30], MFO [31], SCA [32], PSO [33], PO [34], CSA [35], and HGS [36]. The performance evaluation was conducted on the CEC2017 benchmark suite, a standard for assessing optimization algorithms.
Table 2 provides a detailed summary of the experimental results, reporting the average fitness value (Avg) and standard deviation (Std) obtained by each algorithm over multiple independent runs. To provide a holistic measure of overall performance, the final column presents the overall rank of each algorithm based on the Friedman test across all 29 benchmark functions. In Table 2, the “AVG” column under “Overall Rank” denotes the average ranking calculated from the Friedman test, while the “+/−/=” column reports, for each competitor, the number of functions on which CCFFO performs better, worse, or equally.
The results in Table 2 demonstrate the superior performance of the proposed CCFFO. It achieved the top overall rank with an average rank of 1.931, outperforming all competitors. This performance gain is particularly stark when compared to the original FFO, which ranked last with an average of 9.690. This significant improvement highlights the efficacy of the integrated crisscross operator. Moreover, CCFFO surpassed the second-best algorithm, DE (average rank 2.828), by achieving better or equal results on 22 of the 29 functions (17 wins, 5 ties). Finally, the consistently low standard deviations reported for CCFFO across most functions underscore its high stability and reliability.
To further validate these findings, the statistical significance of performance disparities between CCFFO and its counterparts is quantified in Table 3, which reports the p-values derived from the Wilcoxon signed-rank test. A p-value less than 0.05 is typically considered to indicate a statistically significant difference. The results reveal that CCFFO achieves p-values well below 0.05 against the vast majority of algorithms on most functions. For instance, the performance improvement of CCFFO over the original FFO is statistically significant on all 29 functions. This confirms that the improvements are directly attributable to the crisscross strategy, which addresses a key limitation in the original FFO by facilitating structured, multi-dimensional information exchange among individuals. This mechanism enhances population diversity and strengthens the algorithm’s ability to escape from local optima, leading to a more substantial and reliable search performance.
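For reference, both statistical tests used here are available in SciPy. The toy example below uses synthetic fitness samples (not the paper's data) purely to show how such p-values are obtained:

```python
import numpy as np
from scipy.stats import wilcoxon, friedmanchisquare

rng = np.random.default_rng(42)
# Hypothetical per-run best fitness values (30 runs) for three algorithms;
# in the actual study these come from the CEC2017 experiments.
ccffo = rng.normal(100.0, 5.0, 30)
ffo = rng.normal(130.0, 10.0, 30)
de = rng.normal(105.0, 6.0, 30)

# Pairwise Wilcoxon signed-rank test: are CCFFO's paired results
# significantly different from FFO's on this function?
stat, p = wilcoxon(ccffo, ffo)
print(f"Wilcoxon CCFFO vs FFO: p = {p:.3e}")  # p < 0.05 -> significant

# The Friedman test ranks all algorithms jointly across the paired runs.
chi2, p_f = friedmanchisquare(ccffo, ffo, de)
print(f"Friedman: chi2 = {chi2:.2f}, p = {p_f:.3e}")
```

Per-function Wilcoxon comparisons (as in Table 3) would repeat the pairwise call for every benchmark function, while the Friedman average ranks yield the overall ordering reported in Table 2.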
Figure 3 illustrates the convergence profiles of CCFFO and its rival algorithms across various benchmark functions. In these plots, the x-axis denotes the number of function evaluations (FEs), while the y-axis represents the optimal fitness value attained. The plots clearly show that the CCFFO algorithm (solid red line) consistently converges faster and finds better final solutions than all other algorithms. The improvement over the original FFO (dotted orange line) is particularly significant, as FFO often gets stuck at much higher fitness values.
This demonstrates CCFFO’s superior search performance. On functions like F12 and F18, CCFFO shows a rapid initial descent, indicating strong exploitation. On problems like F6 and F20, its steady improvement without stalling highlights effective exploration and the ability to avoid local optima. In short, these curves visually confirm that the crisscross strategy significantly enhances FFO’s optimization capability.

5. Application to Production Optimization

The primary objective of the reservoir production application is to determine an optimal operational strategy for a set of production and injection wells that maximizes the field's economic output, quantified by the Net Present Value (NPV). This task is computationally demanding, as the large number of control variables interacting over a long production lifetime creates a vast and complex search space. This high dimensionality, coupled with the intricate, nonlinear interdependencies between well controls and reservoir response, makes metaheuristic algorithms well suited to finding robust solutions. In this study, the proposed CCFFO is applied to a three-channel reservoir model, implemented using the MATLAB Reservoir Simulation Toolbox (MRST2024), and its performance is benchmarked against several other leading optimizers. The optimization target is the NPV, which serves as the sole objective function, defined in Equation (6):
$$NPV(x, z) = \sum_{t=1}^{n} \frac{\Delta t \left[ (Q_{o,t} \, r_o) - (Q_{w,t} \, r_w) - (Q_{i,t} \, r_i) \right]}{(1 + b)^{p_t}}$$
where x represents the vector of control variables (e.g., well rates), z contains the model’s state parameters, and n is the total number of simulation steps. The terms Q o , t , Q w , t , and Q i , t signify the production rates of oil and water, and the injection rate of water at time step t , respectively. The parameters r o , r w , and r i are the per-unit revenue from oil, cost of water disposal, and cost of water injection. Finally, b is the discount rate, and p t is the cumulative time in years corresponding to step t .
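A direct transcription of Equation (6) into code is straightforward; the sketch below (the function name, array conventions, and the days-to-years conversion are our assumptions) evaluates the NPV for per-step average rates:

```python
import numpy as np

def npv(q_oil, q_wat, q_inj, dt_days, r_o=80.0, r_w=3.0, r_i=3.0, b=0.0):
    """Net Present Value in the spirit of Eq. (6).

    q_oil, q_wat, q_inj: length-n arrays of oil production, water production,
    and water injection rates (STB/day); dt_days: step lengths in days;
    r_o, r_w, r_i: revenue/costs in USD/STB; b: annual discount rate.
    """
    q_oil, q_wat, q_inj = map(np.asarray, (q_oil, q_wat, q_inj))
    dt = np.asarray(dt_days, dtype=float)
    p_t = np.cumsum(dt) / 365.0                 # cumulative time in years
    cash = dt * (q_oil * r_o - q_wat * r_w - q_inj * r_i)  # per-step cash flow
    return float(np.sum(cash / (1.0 + b) ** p_t))
```

With b = 0, as in this study, the discount factor is 1 and the NPV reduces to the undiscounted cumulative cash flow.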

5.1. Reservoir Model Description

This research incorporates a case study based on a two-dimensional, synthetic, and heterogeneous reservoir model. It is specifically configured to simulate a complex fluvial channel system. The established layout is a conventional five-spot well pattern, which includes a central production well (PRO1) and four peripheral injection wells (INJ1–INJ4), visually represented in Figure 4. The reservoir space is divided into a 25 × 25 Cartesian grid, comprising 625 active cells. Every grid block measures 20 m by 20 m with consistent thickness. A uniform porosity of 0.2 is also maintained across the entire model.
The heterogeneity of this model stems from its permeability field, which was stochastically generated. This process resulted in clearly delineated high-permeability channels (depicted in red and yellow) and low-permeability barriers (shown in dark blue). Figure 4 provides a visual representation of the spatial distribution of the natural logarithm of permeability, ln(K), highlighting its critical role in dictating fluid flow paths between the injection and production wells. The reservoir begins in a state of oil and water saturation, with the primary objective of the optimization being the efficient management of the waterflooding operation.
The optimization task is defined by the objective to maximize the Net Present Value (NPV). This maximization occurs across a production lifespan totaling 2000 days, segmented into 10 distinct control steps, each lasting 200 days. The set of decision variables encompasses the operational settings for all five wells—specifically, the four injectors and one producer—for every one of these 10 control steps. Consequently, this leads to an optimization problem characterized by a total of 50 dimensions (derived from 5 wells multiplied by 10 control steps).
In this production optimization problem, the optimization variables are the injection rate of each injection well and the extraction rate of the production well, each ranging from 0 to 200 STB/day. When all well rates are fixed at 100 STB/day, the resulting NPV is 7.3287 × 10^7 USD, which serves as the baseline.
The NPV functions as the fitness criterion for the optimization algorithms, its computation relying on specific economic values. These include an oil price set at 80 USD per stock tank barrel (STB), a water injection expense of 3 USD/STB, and an equivalent water processing charge of 3 USD/STB. To streamline the analysis and concentrate solely on the effectiveness of volumetric recovery, an annual discount rate of 0% is applied within the scope of this study.
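Putting the setup together, a candidate solution is a 50-dimensional vector of well rates bounded by [0, 200] STB/day. A minimal sketch of this encoding follows; names such as `decode` are illustrative and not part of MRST.

```python
import numpy as np

N_WELLS, N_STEPS = 5, 10          # 4 injectors + 1 producer, 10 control steps
DIM = N_WELLS * N_STEPS           # 50 decision variables in total
LB, UB = 0.0, 200.0               # STB/day bounds on every rate

def decode(x):
    """Clip a flat candidate to its bounds and reshape it into a
    (wells x control steps) rate schedule for the simulator."""
    x = np.clip(np.asarray(x, dtype=float), LB, UB)
    return x.reshape(N_WELLS, N_STEPS)

# The 100 STB/day baseline schedule used as the reference point:
baseline = decode(np.full(DIM, 100.0))
```

Each optimizer then searches this 50-dimensional box, passing decoded schedules to the reservoir simulator and receiving the NPV as fitness.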

5.2. Analysis and Discussion of Experimental Results

This section provides a comprehensive performance evaluation of the proposed CCFFO algorithm alongside nine other metaheuristic approaches—FFO, DE, GWO, MFO, SCA, PSO, PO, CSA, and HGS—on the reservoir production optimization problem. Given the computational demands of this application, each algorithm underwent 10 independent executions to ensure a statistically robust comparison, with each run limited to 100 iterations. Table 4 compiles the key statistical indicators: the average, standard deviation (Std), and the highest and lowest Net Present Value (NPV) obtained across these trials.
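The reported indicators are plain order statistics over the independent runs; with hypothetical NPV values (illustrative only, not the study’s data), they can be computed as:

```python
import numpy as np

# NPVs (USD) from 10 independent runs -- illustrative values only
npvs = np.array([9.61e7, 9.55e7, 9.70e7, 9.48e7, 9.62e7,
                 9.58e7, 9.66e7, 9.52e7, 9.59e7, 9.64e7])

stats = {"Mean":  npvs.mean(),
         "Std":   npvs.std(ddof=1),   # sample standard deviation
         "Best":  npvs.max(),
         "Worst": npvs.min()}
```

The sample standard deviation (`ddof=1`) is the conventional choice when summarizing a small number of independent trials.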
An examination of the data in Table 4 distinctly highlights CCFFO’s superior efficacy. CCFFO yielded the most favorable average NPV, reaching 9.6172 × 10^7 USD, decisively affirming its enhanced capacity to consistently pinpoint highly profitable production schemes. Moreover, CCFFO registered the lowest standard deviation at 1.2971 × 10^6 USD, attesting to its exceptional operational consistency and dependability relative to all peer algorithms, notably outperforming DE (1.8852 × 10^6) and HGS (2.0721 × 10^6).
The performance gain is particularly notable when compared to the original FFO, which not only achieved a significantly lower mean NPV of 8.9711 × 10^7 USD but also showed nearly double the instability, with a standard deviation of 2.1932 × 10^6. This demonstrates that the crisscross operator enhances both the exploratory power and the exploitation consistency of the algorithm. These quantitative findings underscore CCFFO’s ability to traverse an intricate search landscape to maximize financial gains while remaining remarkably consistent across repeated executions.
Figure 5 graphically depicts the Net Present Value (NPV) convergence trajectories for CCFFO and nine comparative algorithms across 100 iterations. With iterations on the horizontal axis and mean NPV on the vertical, CCFFO consistently demonstrates the most rapid convergence and secures the highest ultimate NPV, distinctly surpassing all rival methods.
The algorithm demonstrates a rapid ascent in the initial 30 iterations, quickly identifying a high-value solution and maintaining its superiority throughout the optimization process. In contrast, DE, the second-best algorithm, also converges quickly but stabilizes at a distinctly lower NPV. The performance of the original FFO is significantly inferior to CCFFO, showing both a slower convergence rate and a lower final NPV, which underscores the effectiveness of the integrated crisscross operator. The other algorithms, including GWO and HGS, show moderate performance, while the remaining competitors like PO, MFO, SCA, CSA, and PSO display much slower convergence and ultimately find solutions of considerably lower economic value.
In summary, the convergence plot visually confirms CCFFO’s superior performance. It not only achieves the highest NPV but also converges faster than the other algorithms, demonstrating its efficiency and robustness in solving the complex reservoir optimization problem.

6. Conclusions

This study introduced CCFFO, a variant of the Flower Fertilization Optimization algorithm, designed to remedy a structural limitation in its information-sharing mechanism. By integrating a crisscross (CC) operator, CCFFO facilitates direct, multi-dimensional information exchange between individuals. This enhanced communication protocol is designed to generate more diverse and high-quality trial solutions, thereby mitigating premature convergence and improving the algorithm’s overall search efficacy.
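While the exact operator parameters are not restated here, the crisscross mechanism is commonly formulated as a horizontal crossover (an arithmetic exchange between two individuals in every dimension) plus a vertical crossover (a blend of two dimensions within one individual, which can revive dimensions stuck at a local optimum). The sketch below follows that common formulation; details such as the expansion coefficients may differ from the implementation in this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def horizontal_crossover(xi, xj):
    """Arithmetic exchange between two parent solutions in every dimension.
    r1, r2 in [0, 1) blend the parents; c1, c2 in [-1, 1) expand the search
    slightly beyond the segment joining them."""
    r1, r2 = rng.random(xi.size), rng.random(xi.size)
    c1, c2 = rng.uniform(-1, 1, xi.size), rng.uniform(-1, 1, xi.size)
    yi = r1 * xi + (1 - r1) * xj + c1 * (xi - xj)
    yj = r2 * xj + (1 - r2) * xi + c2 * (xj - xi)
    return yi, yj

def vertical_crossover(x, d1, d2):
    """Blend dimension d2 into dimension d1 of a single individual,
    leaving every other dimension unchanged."""
    y = x.copy()
    r = rng.random()
    y[d1] = r * x[d1] + (1 - r) * x[d2]
    return y
```

In a typical crisscross scheme, offspring from both moves are kept only if they improve on their parents, which preserves elite solutions while injecting the diversity the original FFO lacks.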
The superiority of CCFFO was validated through two comprehensive experiments. First, on the CEC2017 benchmark suite, CCFFO significantly outperformed nine other metaheuristics, achieving the top overall rank and demonstrating a substantial performance gain over the canonical FFO. Second, in its application to a complex reservoir production optimization scenario, CCFFO achieved the maximum NPV and demonstrated the most rapid convergence. This performance substantiates its practical efficacy and resilience when addressing critical engineering problems.
Future research will proceed along two primary avenues. From an algorithmic perspective, efforts will focus on enhancing the scalability of CCFFO for high-dimensional optimization and extending its framework to address multi-objective problems. From an application standpoint, we intend to investigate its performance across a broader spectrum of complex engineering and data science domains to further establish its versatility.

Author Contributions

Conceptualization, X.W.; software, X.W.; data curation, X.W.; investigation, X.W.; writing—original draft, X.W. and J.S.; project administration, X.W.; methodology, J.S.; writing—review and editing, J.S.; validation, J.S.; formal analysis, J.S.; supervision, J.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (grant number 41972098).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The numerical and experimental data used to support the findings of this study are included within the article.

Acknowledgments

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. Flowchart of the FFO.
Figure 2. Flowchart of the CCFFO.
Figure 3. Convergence curves of the CCFFO on benchmarks with other algorithms.
Figure 4. Reservoir Permeability Distribution: Logarithmic Scale.
Figure 5. Performance Convergence: NPV Over Iterations.
Table 1. CEC2017 benchmark functions.
| Function | Function Name | Class | Optimum |
|---|---|---|---|
| F1 | Shifted and Rotated Bent Cigar Function | Unimodal | 100 |
| F2 | Shifted and Rotated Zakharov Function | Unimodal | 300 |
| F3 | Shifted and Rotated Rosenbrock’s Function | Multimodal | 400 |
| F4 | Shifted and Rotated Rastrigin’s Function | Multimodal | 500 |
| F5 | Shifted and Rotated Expanded Schaffer’s F6 Function | Multimodal | 600 |
| F6 | Shifted and Rotated Lunacek Bi-Rastrigin Function | Multimodal | 700 |
| F7 | Shifted and Rotated Non-Continuous Rastrigin’s Function | Multimodal | 800 |
| F8 | Shifted and Rotated Lévy Function | Multimodal | 900 |
| F9 | Shifted and Rotated Schwefel’s Function | Multimodal | 1000 |
| F10 | Hybrid Function 1 (N = 3) | Hybrid | 1100 |
| F11 | Hybrid Function 2 (N = 3) | Hybrid | 1200 |
| F12 | Hybrid Function 3 (N = 3) | Hybrid | 1300 |
| F13 | Hybrid Function 4 (N = 4) | Hybrid | 1400 |
| F14 | Hybrid Function 5 (N = 4) | Hybrid | 1500 |
| F15 | Hybrid Function 6 (N = 4) | Hybrid | 1600 |
| F16 | Hybrid Function 6 (N = 5) | Hybrid | 1700 |
| F17 | Hybrid Function 6 (N = 5) | Hybrid | 1800 |
| F18 | Hybrid Function 6 (N = 5) | Hybrid | 1900 |
| F19 | Hybrid Function 6 (N = 6) | Hybrid | 2000 |
| F20 | Composition Function 1 (N = 3) | Composition | 2100 |
| F21 | Composition Function 2 (N = 3) | Composition | 2200 |
| F22 | Composition Function 3 (N = 4) | Composition | 2300 |
| F23 | Composition Function 4 (N = 4) | Composition | 2400 |
| F24 | Composition Function 5 (N = 5) | Composition | 2500 |
| F25 | Composition Function 6 (N = 5) | Composition | 2600 |
| F26 | Composition Function 7 (N = 6) | Composition | 2700 |
| F27 | Composition Function 8 (N = 6) | Composition | 2800 |
| F28 | Composition Function 9 (N = 3) | Composition | 2900 |
| F29 | Composition Function 10 (N = 3) | Composition | 3000 |
Table 2. Results of the CCFFO and Other Algorithms on CEC2017.
| Algorithm | F1 Avg | F1 Std | F2 Avg | F2 Std | F3 Avg | F3 Std |
|---|---|---|---|---|---|---|
| CCFFO | 3.3881 × 10^3 | 4.0890 × 10^3 | 4.7065 × 10^3 | 1.9089 × 10^3 | 4.6804 × 10^2 | 2.7523 × 10^1 |
| FFO | 3.2720 × 10^10 | 9.4736 × 10^9 | 7.3720 × 10^4 | 6.9991 × 10^3 | 7.9199 × 10^3 | 4.3161 × 10^3 |
| DE | 1.3949 × 10^3 | 2.4372 × 10^3 | 1.9102 × 10^4 | 4.5464 × 10^3 | 4.8701 × 10^2 | 1.8843 × 10^0 |
| GWO | 1.5882 × 10^9 | 1.2995 × 10^9 | 3.4227 × 10^4 | 1.2012 × 10^4 | 6.3126 × 10^2 | 1.8522 × 10^2 |
| MFO | 1.0622 × 10^10 | 6.1853 × 10^9 | 8.1494 × 10^4 | 6.5255 × 10^4 | 1.2362 × 10^3 | 7.1530 × 10^2 |
| SCA | 1.2621 × 10^10 | 1.9108 × 10^9 | 3.9515 × 10^4 | 5.5319 × 10^3 | 1.4592 × 10^3 | 2.6218 × 10^2 |
| PSO | 2.5666 × 10^3 | 2.7181 × 10^3 | 3.0000 × 10^2 | 7.6775 × 10^−3 | 4.6428 × 10^2 | 2.3198 × 10^1 |
| PO | 6.0914 × 10^7 | 6.8054 × 10^7 | 5.1560 × 10^3 | 2.7670 × 10^3 | 5.1882 × 10^2 | 2.3781 × 10^1 |
| CSA | 2.5045 × 10^3 | 3.3705 × 10^3 | 3.0001 × 10^2 | 8.1843 × 10^−3 | 5.0394 × 10^2 | 3.6066 × 10^1 |
| HGS | 8.5018 × 10^3 | 7.6934 × 10^3 | 1.2486 × 10^3 | 2.8687 × 10^3 | 4.8301 × 10^2 | 3.2918 × 10^1 |
| Algorithm | F4 Avg | F4 Std | F5 Avg | F5 Std | F6 Avg | F6 Std |
|---|---|---|---|---|---|---|
| CCFFO | 6.0979 × 10^2 | 2.9360 × 10^1 | 6.0000 × 10^2 | 9.0260 × 10^−7 | 8.4065 × 10^2 | 3.9823 × 10^1 |
| FFO | 7.8534 × 10^2 | 3.3318 × 10^1 | 6.6492 × 10^2 | 4.7096 × 10^0 | 1.2651 × 10^3 | 5.4144 × 10^1 |
| DE | 6.0696 × 10^2 | 1.0359 × 10^1 | 6.0000 × 10^2 | 0.0000 × 10^0 | 8.4260 × 10^2 | 9.6547 × 10^0 |
| GWO | 6.0753 × 10^2 | 2.9419 × 10^1 | 6.0657 × 10^2 | 3.1605 × 10^0 | 8.7307 × 10^2 | 4.6072 × 10^1 |
| MFO | 7.1332 × 10^2 | 5.7166 × 10^1 | 6.3885 × 10^2 | 1.2289 × 10^1 | 1.1053 × 10^3 | 1.7602 × 10^2 |
| SCA | 7.7198 × 10^2 | 1.8333 × 10^1 | 6.5144 × 10^2 | 4.9137 × 10^0 | 1.1250 × 10^3 | 4.7770 × 10^1 |
| PSO | 7.0426 × 10^2 | 3.0113 × 10^1 | 6.4486 × 10^2 | 9.1094 × 10^0 | 1.0207 × 10^3 | 6.4600 × 10^1 |
| PO | 7.1488 × 10^2 | 4.9183 × 10^1 | 6.5833 × 10^2 | 9.6541 × 10^0 | 1.1449 × 10^3 | 8.5649 × 10^1 |
| CSA | 6.9130 × 10^2 | 3.2097 × 10^1 | 6.4821 × 10^2 | 8.4799 × 10^0 | 9.9714 × 10^2 | 7.1038 × 10^1 |
| HGS | 6.2181 × 10^2 | 2.9955 × 10^1 | 6.0216 × 10^2 | 2.3036 × 10^0 | 9.0443 × 10^2 | 4.6124 × 10^1 |
| Algorithm | F7 Avg | F7 Std | F8 Avg | F8 Std | F9 Avg | F9 Std |
|---|---|---|---|---|---|---|
| CCFFO | 8.9851 × 10^2 | 1.8968 × 10^1 | 1.7471 × 10^3 | 5.9355 × 10^2 | 3.9159 × 10^3 | 5.4702 × 10^2 |
| FFO | 1.0194 × 10^3 | 2.2381 × 10^1 | 6.2413 × 10^3 | 3.8207 × 10^2 | 5.8887 × 10^3 | 6.3741 × 10^2 |
| DE | 9.0837 × 10^2 | 1.0189 × 10^1 | 9.0000 × 10^2 | 9.6743 × 10^−14 | 5.7651 × 10^3 | 2.9940 × 10^2 |
| GWO | 8.8904 × 10^2 | 2.8199 × 10^1 | 1.8415 × 10^3 | 5.0349 × 10^2 | 3.9461 × 10^3 | 4.7637 × 10^2 |
| MFO | 1.0126 × 10^3 | 5.1630 × 10^1 | 7.7088 × 10^3 | 3.6124 × 10^3 | 5.5687 × 10^3 | 8.1903 × 10^2 |
| SCA | 1.0507 × 10^3 | 1.7524 × 10^1 | 5.4695 × 10^3 | 9.7933 × 10^2 | 8.1652 × 10^3 | 2.6112 × 10^2 |
| PSO | 9.5080 × 10^2 | 3.2770 × 10^1 | 4.2856 × 10^3 | 6.2042 × 10^2 | 4.8491 × 10^3 | 5.6152 × 10^2 |
| PO | 9.7420 × 10^2 | 3.0895 × 10^1 | 4.8893 × 10^3 | 8.3169 × 10^2 | 5.7592 × 10^3 | 7.7801 × 10^2 |
| CSA | 9.3647 × 10^2 | 1.6700 × 10^1 | 3.5509 × 10^3 | 7.7713 × 10^2 | 5.0778 × 10^3 | 6.1294 × 10^2 |
| HGS | 9.1477 × 10^2 | 2.6506 × 10^1 | 3.5684 × 10^3 | 1.0017 × 10^3 | 3.9738 × 10^3 | 4.3318 × 10^2 |
| Algorithm | F10 Avg | F10 Std | F11 Avg | F11 Std | F12 Avg | F12 Std |
|---|---|---|---|---|---|---|
| CCFFO | 1.1386 × 10^3 | 2.3581 × 10^1 | 2.6791 × 10^5 | 1.4828 × 10^5 | 1.3036 × 10^4 | 7.3841 × 10^3 |
| FFO | 4.3646 × 10^3 | 1.5122 × 10^3 | 4.8593 × 10^9 | 2.6705 × 10^9 | 3.1396 × 10^9 | 3.2213 × 10^9 |
| DE | 1.1643 × 10^3 | 2.3151 × 10^1 | 1.5039 × 10^6 | 6.3398 × 10^5 | 3.0019 × 10^4 | 1.7935 × 10^4 |
| GWO | 1.8942 × 10^3 | 7.9974 × 10^2 | 7.6671 × 10^7 | 1.0500 × 10^8 | 1.5481 × 10^7 | 5.5450 × 10^7 |
| MFO | 4.1124 × 10^3 | 3.5171 × 10^3 | 2.9710 × 10^8 | 6.8070 × 10^8 | 3.8744 × 10^7 | 1.9323 × 10^8 |
| SCA | 2.0866 × 10^3 | 2.7922 × 10^2 | 1.1559 × 10^9 | 2.5813 × 10^8 | 4.0446 × 10^8 | 1.7268 × 10^8 |
| PSO | 1.2121 × 10^3 | 3.1334 × 10^1 | 4.1419 × 10^4 | 1.9971 × 10^4 | 2.0171 × 10^4 | 1.8112 × 10^4 |
| PO | 1.3175 × 10^3 | 6.6615 × 10^1 | 2.0095 × 10^7 | 2.4767 × 10^7 | 1.2096 × 10^5 | 6.0060 × 10^4 |
| CSA | 1.2498 × 10^3 | 4.9018 × 10^1 | 2.0200 × 10^6 | 1.5415 × 10^6 | 2.3269 × 10^4 | 1.2670 × 10^4 |
| HGS | 1.2216 × 10^3 | 3.7360 × 10^1 | 9.1223 × 10^5 | 7.7382 × 10^5 | 3.0920 × 10^4 | 2.5837 × 10^4 |
| Algorithm | F13 Avg | F13 Std | F14 Avg | F14 Std | F15 Avg | F15 Std |
|---|---|---|---|---|---|---|
| CCFFO | 2.5111 × 10^4 | 2.7365 × 10^4 | 2.2860 × 10^3 | 1.1378 × 10^3 | 2.5904 × 10^3 | 2.9099 × 10^2 |
| FFO | 1.4639 × 10^6 | 9.7417 × 10^5 | 1.1528 × 10^8 | 2.1154 × 10^8 | 3.7903 × 10^3 | 4.2718 × 10^2 |
| DE | 5.1970 × 10^4 | 3.9254 × 10^4 | 6.9884 × 10^3 | 3.3943 × 10^3 | 2.0457 × 10^3 | 1.4870 × 10^2 |
| GWO | 2.6527 × 10^5 | 3.5817 × 10^5 | 5.7499 × 10^5 | 1.0704 × 10^6 | 2.3967 × 10^3 | 2.6304 × 10^2 |
| MFO | 2.0365 × 10^5 | 4.2357 × 10^5 | 8.6397 × 10^4 | 1.2295 × 10^5 | 3.0842 × 10^3 | 3.8883 × 10^2 |
| SCA | 1.2206 × 10^5 | 6.5504 × 10^4 | 1.0715 × 10^7 | 1.0103 × 10^7 | 3.5441 × 10^3 | 2.5708 × 10^2 |
| PSO | 7.9298 × 10^3 | 6.2803 × 10^3 | 8.7293 × 10^3 | 9.5830 × 10^3 | 2.9752 × 10^3 | 3.3283 × 10^2 |
| PO | 4.5665 × 10^4 | 2.8233 × 10^4 | 6.1031 × 10^4 | 6.3175 × 10^4 | 3.2313 × 10^3 | 4.0381 × 10^2 |
| CSA | 1.6431 × 10^3 | 6.1823 × 10^1 | 1.0536 × 10^4 | 5.9958 × 10^3 | 2.9307 × 10^3 | 3.1545 × 10^2 |
| HGS | 4.0731 × 10^4 | 3.3024 × 10^4 | 1.9925 × 10^4 | 1.5766 × 10^4 | 2.6777 × 10^3 | 2.9716 × 10^2 |
| Algorithm | F16 Avg | F16 Std | F17 Avg | F17 Std | F18 Avg | F18 Std |
|---|---|---|---|---|---|---|
| CCFFO | 2.0061 × 10^3 | 1.6557 × 10^2 | 2.5342 × 10^5 | 1.9617 × 10^5 | 5.1152 × 10^3 | 1.6276 × 10^3 |
| FFO | 2.7891 × 10^3 | 6.1977 × 10^2 | 1.2063 × 10^7 | 2.4887 × 10^7 | 3.0165 × 10^7 | 6.5274 × 10^7 |
| DE | 1.8430 × 10^3 | 4.1259 × 10^1 | 3.3449 × 10^5 | 1.7845 × 10^5 | 8.2305 × 10^3 | 4.5262 × 10^3 |
| GWO | 2.0128 × 10^3 | 1.6275 × 10^2 | 5.9358 × 10^5 | 1.0739 × 10^6 | 7.2434 × 10^5 | 1.6454 × 10^6 |
| MFO | 2.5180 × 10^3 | 2.3231 × 10^2 | 3.6811 × 10^6 | 7.2508 × 10^6 | 1.2124 × 10^7 | 3.6755 × 10^7 |
| SCA | 2.3980 × 10^3 | 1.8503 × 10^2 | 2.8225 × 10^6 | 1.7072 × 10^6 | 2.4757 × 10^7 | 1.1562 × 10^7 |
| PSO | 2.3648 × 10^3 | 2.3968 × 10^2 | 1.5927 × 10^5 | 1.0753 × 10^5 | 7.4746 × 10^3 | 9.8525 × 10^3 |
| PO | 2.3028 × 10^3 | 2.4955 × 10^2 | 5.5040 × 10^5 | 4.3576 × 10^5 | 8.2365 × 10^5 | 6.6923 × 10^5 |
| CSA | 2.2590 × 10^3 | 2.3853 × 10^2 | 2.4300 × 10^4 | 1.1406 × 10^4 | 5.2186 × 10^3 | 5.8926 × 10^3 |
| HGS | 2.3202 × 10^3 | 1.9175 × 10^2 | 2.9100 × 10^5 | 2.5191 × 10^5 | 1.7732 × 10^4 | 1.6537 × 10^4 |
| Algorithm | F19 Avg | F19 Std | F20 Avg | F20 Std | F21 Avg | F21 Std |
|---|---|---|---|---|---|---|
| CCFFO | 2.2693 × 10^3 | 1.3807 × 10^2 | 2.3694 × 10^3 | 3.6153 × 10^1 | 2.3000 × 10^3 | 3.5827 × 10^−13 |
| FFO | 2.5874 × 10^3 | 1.3039 × 10^2 | 2.6041 × 10^3 | 4.8142 × 10^1 | 7.0276 × 10^3 | 1.0755 × 10^3 |
| DE | 2.1296 × 10^3 | 8.1035 × 10^1 | 2.4093 × 10^3 | 1.0166 × 10^1 | 4.1104 × 10^3 | 2.0812 × 10^3 |
| GWO | 2.3629 × 10^3 | 1.4547 × 10^2 | 2.3832 × 10^3 | 1.6838 × 10^1 | 4.3958 × 10^3 | 1.6311 × 10^3 |
| MFO | 2.7506 × 10^3 | 1.9564 × 10^2 | 2.5059 × 10^3 | 3.8579 × 10^1 | 6.4952 × 10^3 | 1.5493 × 10^3 |
| SCA | 2.5935 × 10^3 | 1.1895 × 10^2 | 2.5506 × 10^3 | 1.8541 × 10^1 | 8.2068 × 10^3 | 2.4221 × 10^3 |
| PSO | 2.6340 × 10^3 | 2.2488 × 10^2 | 2.4535 × 10^3 | 5.6795 × 10^1 | 4.4673 × 10^3 | 2.2543 × 10^3 |
| PO | 2.5393 × 10^3 | 1.7828 × 10^2 | 2.5153 × 10^3 | 5.4875 × 10^1 | 3.3303 × 10^3 | 1.8239 × 10^3 |
| CSA | 2.4747 × 10^3 | 1.2842 × 10^2 | 2.4713 × 10^3 | 4.1592 × 10^1 | 2.4707 × 10^3 | 9.2913 × 10^2 |
| HGS | 2.4980 × 10^3 | 2.0102 × 10^2 | 2.4187 × 10^3 | 2.8084 × 10^1 | 4.8602 × 10^3 | 1.5246 × 10^3 |
| Algorithm | F22 Avg | F22 Std | F23 Avg | F23 Std | F24 Avg | F24 Std |
|---|---|---|---|---|---|---|
| CCFFO | 2.7272 × 10^3 | 2.2919 × 10^1 | 2.8987 × 10^3 | 1.8880 × 10^1 | 2.8939 × 10^3 | 1.5141 × 10^1 |
| FFO | 3.2560 × 10^3 | 1.4457 × 10^2 | 3.6733 × 10^3 | 2.8431 × 10^2 | 3.7793 × 10^3 | 4.3463 × 10^2 |
| DE | 2.7578 × 10^3 | 7.9885 × 10^0 | 2.9590 × 10^3 | 1.2101 × 10^1 | 2.8874 × 10^3 | 3.9882 × 10^−1 |
| GWO | 2.7478 × 10^3 | 3.3344 × 10^1 | 2.9121 × 10^3 | 3.3407 × 10^1 | 2.9764 × 10^3 | 3.5718 × 10^1 |
| MFO | 2.8316 × 10^3 | 3.5175 × 10^1 | 3.0006 × 10^3 | 4.2546 × 10^1 | 3.2953 × 10^3 | 4.8754 × 10^2 |
| SCA | 2.9844 × 10^3 | 2.6955 × 10^1 | 3.1613 × 10^3 | 2.3055 × 10^1 | 3.1830 × 10^3 | 5.7385 × 10^1 |
| PSO | 3.3144 × 10^3 | 1.7054 × 10^2 | 3.3819 × 10^3 | 1.0675 × 10^2 | 2.8817 × 10^3 | 1.2739 × 10^1 |
| PO | 2.9581 × 10^3 | 7.0920 × 10^1 | 3.1055 × 10^3 | 6.3218 × 10^1 | 2.9414 × 10^3 | 2.6784 × 10^1 |
| CSA | 3.0959 × 10^3 | 1.0424 × 10^2 | 3.2092 × 10^3 | 1.2090 × 10^2 | 2.9316 × 10^3 | 2.1021 × 10^1 |
| HGS | 2.7752 × 10^3 | 3.1514 × 10^1 | 3.0324 × 10^3 | 4.8507 × 10^1 | 2.8883 × 10^3 | 7.5245 × 10^0 |
| Algorithm | F25 Avg | F25 Std | F26 Avg | F26 Std | F27 Avg | F27 Std |
|---|---|---|---|---|---|---|
| CCFFO | 3.4728 × 10^3 | 1.1033 × 10^3 | 3.2210 × 10^3 | 1.0616 × 10^1 | 3.1424 × 10^3 | 4.7079 × 10^1 |
| FFO | 9.3492 × 10^3 | 7.9015 × 10^2 | 3.8081 × 10^3 | 2.2961 × 10^2 | 5.2444 × 10^3 | 5.5411 × 10^2 |
| DE | 4.6204 × 10^3 | 1.0661 × 10^2 | 3.2060 × 10^3 | 2.9832 × 10^0 | 3.1768 × 10^3 | 5.6392 × 10^1 |
| GWO | 4.7008 × 10^3 | 3.8852 × 10^2 | 3.2534 × 10^3 | 2.9753 × 10^1 | 3.4148 × 10^3 | 9.1294 × 10^1 |
| MFO | 6.0385 × 10^3 | 4.9203 × 10^2 | 3.2464 × 10^3 | 2.0168 × 10^1 | 4.5376 × 10^3 | 9.7285 × 10^2 |
| SCA | 7.0123 × 10^3 | 2.9692 × 10^2 | 3.4056 × 10^3 | 3.4502 × 10^1 | 3.8128 × 10^3 | 1.2535 × 10^2 |
| PSO | 6.7644 × 10^3 | 2.1414 × 10^3 | 3.2406 × 10^3 | 2.7363 × 10^2 | 3.1621 × 10^3 | 6.3606 × 10^1 |
| PO | 6.3874 × 10^3 | 1.4138 × 10^3 | 3.3174 × 10^3 | 7.8114 × 10^1 | 3.2975 × 10^3 | 3.5404 × 10^1 |
| CSA | 4.9366 × 10^3 | 2.2006 × 10^3 | 3.6000 × 10^3 | 1.8595 × 10^2 | 3.2145 × 10^3 | 1.8546 × 10^1 |
| HGS | 4.8055 × 10^3 | 4.8203 × 10^2 | 3.2264 × 10^3 | 1.4534 × 10^1 | 3.1937 × 10^3 | 5.3486 × 10^1 |
| Algorithm | F28 Avg | F28 Std | F29 Avg | F29 Std |
|---|---|---|---|---|
| CCFFO | 3.5697 × 10^3 | 1.7701 × 10^2 | 6.5019 × 10^3 | 7.9346 × 10^2 |
| FFO | 5.3737 × 10^3 | 6.4659 × 10^2 | 1.4443 × 10^8 | 5.4055 × 10^8 |
| DE | 3.5174 × 10^3 | 6.2973 × 10^1 | 1.1730 × 10^4 | 2.0397 × 10^3 |
| GWO | 3.7025 × 10^3 | 1.2777 × 10^2 | 5.0522 × 10^6 | 4.5566 × 10^6 |
| MFO | 4.2545 × 10^3 | 3.4156 × 10^2 | 4.7871 × 10^5 | 5.8888 × 10^5 |
| SCA | 4.6363 × 10^3 | 2.3954 × 10^2 | 7.2163 × 10^7 | 3.8671 × 10^7 |
| PSO | 4.0002 × 10^3 | 2.8736 × 10^2 | 5.3397 × 10^3 | 2.7270 × 10^3 |
| PO | 4.5231 × 10^3 | 3.4208 × 10^2 | 7.1810 × 10^6 | 5.2589 × 10^6 |
| CSA | 4.4017 × 10^3 | 3.6065 × 10^2 | 1.1669 × 10^5 | 9.3591 × 10^4 |
| HGS | 3.7414 × 10^3 | 1.9507 × 10^2 | 1.2730 × 10^5 | 1.6515 × 10^5 |
Overall Rank

| Algorithm | Rank | +/=/− | Avg |
|---|---|---|---|
| CCFFO | 1 | ~ | 1.931 |
| FFO | 10 | 29/0/0 | 9.6897 |
| DE | 2 | 17/5/7 | 2.8276 |
| GWO | 6 | 21/7/1 | 4.7931 |
| MFO | 8 | 29/0/0 | 7.4828 |
| SCA | 9 | 29/0/0 | 8.5172 |
| PSO | 4 | 18/5/6 | 4.4138 |
| PO | 7 | 28/1/0 | 6.5517 |
| CSA | 5 | 25/1/3 | 4.5862 |
| HGS | 3 | 21/7/1 | 4.2069 |
Table 3. The p-values of the CCFFO versus other algorithms on CEC2017.
| Function | FFO | DE | GWO | MFO | SCA | PSO | PO | CSA | HGS |
|---|---|---|---|---|---|---|---|---|---|
| F1 | 1.7344 × 10^−6 | 8.7297 × 10^−3 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 7.3433 × 10^−1 | 1.7344 × 10^−6 | 1.9152 × 10^−1 | 1.2866 × 10^−3 |
| F2 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 6.9838 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 5.4401 × 10^−1 | 1.7344 × 10^−6 | 1.3595 × 10^−4 |
| F3 | 1.7344 × 10^−6 | 1.3820 × 10^−3 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 9.7539 × 10^−1 | 5.2165 × 10^−6 | 2.6134 × 10^−4 | 6.2683 × 10^−2 |
| F4 | 1.7344 × 10^−6 | 9.7539 × 10^−1 | 4.5281 × 10^−1 | 2.8786 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 2.6033 × 10^−6 | 8.9718 × 10^−2 |
| F5 | 1.7344 × 10^−6 | 3.9063 × 10^−3 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 |
| F6 | 1.7344 × 10^−6 | 7.0356 × 10^−1 | 1.1079 × 10^−2 | 2.1266 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.9209 × 10^−6 | 1.7988 × 10^−5 |
| F7 | 1.7344 × 10^−6 | 1.8519 × 10^−2 | 7.1903 × 10^−2 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.0246 × 10^−5 | 1.7344 × 10^−6 | 3.5152 × 10^−6 | 3.8723 × 10^−2 |
| F8 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.8462 × 10^−1 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.9209 × 10^−6 | 1.7344 × 10^−6 | 1.9209 × 10^−6 | 4.7292 × 10^−6 |
| F9 | 2.3534 × 10^−6 | 1.7344 × 10^−6 | 8.6121 × 10^−1 | 2.8786 × 10^−6 | 1.7344 × 10^−6 | 8.9187 × 10^−5 | 1.9209 × 10^−6 | 8.4661 × 10^−6 | 7.4987 × 10^−1 |
| F10 | 1.7344 × 10^−6 | 3.8811 × 10^−4 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 4.2857 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 2.6033 × 10^−6 |
| F11 | 1.7344 × 10^−6 | 2.1266 × 10^−6 | 1.7344 × 10^−6 | 2.8786 × 10^−6 | 1.7344 × 10^−6 | 2.3534 × 10^−6 | 1.7344 × 10^−6 | 2.1266 × 10^−6 | 1.0357 × 10^−3 |
| F12 | 1.7344 × 10^−6 | 4.8603 × 10^−5 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.4139 × 10^−1 | 1.7344 × 10^−6 | 1.8326 × 10^−3 | 1.1138 × 10^−3 |
| F13 | 1.7344 × 10^−6 | 1.1973 × 10^−3 | 1.7423 × 10^−4 | 1.3820 × 10^−3 | 2.6033 × 10^−6 | 8.9443 × 10^−4 | 6.4242 × 10^−3 | 1.7344 × 10^−6 | 4.9498 × 10^−2 |
| F14 | 1.7344 × 10^−6 | 2.8786 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 3.0650 × 10^−4 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 2.8786 × 10^−6 |
| F15 | 1.7344 × 10^−6 | 2.8786 × 10^−6 | 3.3789 × 10^−3 | 1.2506 × 10^−4 | 2.1266 × 10^−6 | 1.2506 × 10^−4 | 1.2381 × 10^−5 | 8.9443 × 10^−4 | 2.0589 × 10^−1 |
| F16 | 1.9209 × 10^−6 | 1.3595 × 10^−4 | 9.0993 × 10^−1 | 2.8786 × 10^−6 | 2.1266 × 10^−6 | 2.8434 × 10^−5 | 4.8603 × 10^−5 | 2.5967 × 10^−5 | 4.7292 × 10^−6 |
| F17 | 1.7344 × 10^−6 | 1.2044 × 10^−1 | 2.4308 × 10^−2 | 4.5336 × 10^−4 | 1.7344 × 10^−6 | 3.8723 × 10^−2 | 1.1138 × 10^−3 | 1.7344 × 10^−6 | 6.4352 × 10^−1 |
| F18 | 1.7344 × 10^−6 | 2.2551 × 10^−3 | 1.7344 × 10^−6 | 6.9838 × 10^−6 | 1.7344 × 10^−6 | 5.3044 × 10^−1 | 1.7344 × 10^−6 | 2.3038 × 10^−2 | 1.1499 × 10^−4 |
| F19 | 1.1265 × 10^−5 | 1.0357 × 10^−3 | 1.3975 × 10^−2 | 2.3534 × 10^−6 | 6.3391 × 10^−6 | 4.2857 × 10^−6 | 4.2857 × 10^−6 | 1.9729 × 10^−5 | 6.3198 × 10^−5 |
| F20 | 1.7344 × 10^−6 | 2.1266 × 10^−6 | 8.2206 × 10^−2 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 3.1123 × 10^−5 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 5.7517 × 10^−6 |
| F21 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.3101 × 10^−4 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 |
| F22 | 1.7344 × 10^−6 | 5.2165 × 10^−6 | 4.7162 × 10^−2 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 5.2165 × 10^−6 |
| F23 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.5286 × 10^−1 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 |
| F24 | 1.7344 × 10^−6 | 3.4935 × 10^−1 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 5.2872 × 10^−4 | 5.2165 × 10^−6 | 2.5967 × 10^−5 | 8.9718 × 10^−2 |
| F25 | 1.7344 × 10^−6 | 1.0570 × 10^−4 | 1.1499 × 10^−4 | 2.8786 × 10^−6 | 1.7344 × 10^−6 | 2.5970 × 10^−5 | 3.5152 × 10^−6 | 1.8326 × 10^−3 | 2.0515 × 10^−4 |
| F26 | 1.7344 × 10^−6 | 2.6033 × 10^−6 | 4.2857 × 10^−6 | 3.1123 × 10^−5 | 1.7344 × 10^−6 | 3.5888 × 10^−4 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 8.2206 × 10^−2 |
| F27 | 1.7344 × 10^−6 | 1.3194 × 10^−2 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 2.7116 × 10^−1 | 1.7344 × 10^−6 | 3.1817 × 10^−6 | 2.8308 × 10^−4 |
| F28 | 1.7344 × 10^−6 | 2.0589 × 10^−1 | 1.4839 × 10^−3 | 2.3534 × 10^−6 | 1.7344 × 10^−6 | 9.3157 × 10^−6 | 1.7344 × 10^−6 | 1.9209 × 10^−6 | 2.9575 × 10^−3 |
| F29 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 7.2710 × 10^−3 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 |
Table 4. The results of CCFFO and other algorithms on the oil reservoir production optimization.
| Algorithm | Mean NPV (USD) | Std (USD) | Best NPV (USD) | Worst NPV (USD) |
|---|---|---|---|---|
| CCFFO | 9.6172 × 10^7 | 1.2971 × 10^6 | 9.9131 × 10^7 | 9.3212 × 10^7 |
| FFO | 8.9711 × 10^7 | 2.1932 × 10^6 | 9.3259 × 10^7 | 8.4846 × 10^7 |
| DE | 9.2935 × 10^7 | 1.8852 × 10^6 | 9.7418 × 10^7 | 8.8915 × 10^7 |
| GWO | 9.0331 × 10^7 | 2.2142 × 10^6 | 9.5422 × 10^7 | 8.5497 × 10^7 |
| MFO | 8.2722 × 10^7 | 1.9501 × 10^6 | 8.6192 × 10^7 | 7.7980 × 10^7 |
| SCA | 8.2555 × 10^7 | 2.9418 × 10^6 | 8.7861 × 10^7 | 7.6303 × 10^7 |
| PSO | 7.7534 × 10^7 | 3.6438 × 10^6 | 8.6419 × 10^7 | 7.0297 × 10^7 |
| PO | 8.5447 × 10^7 | 2.4544 × 10^6 | 8.9692 × 10^7 | 7.9498 × 10^7 |
| CSA | 7.9733 × 10^7 | 3.5607 × 10^6 | 8.5280 × 10^7 | 7.1727 × 10^7 |
| HGS | 8.8127 × 10^7 | 2.0721 × 10^6 | 9.2660 × 10^7 | 8.3578 × 10^7 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
