Article

An Improved Chaotic Game Optimization Algorithm and Its Application in Air Quality Prediction

1 College of Big Data Statistics, Guizhou University of Finance and Economics, Guiyang 550025, China
2 College of Environmental Science and Engineering, China West Normal University, Nanchong 637002, China
3 School of Information, Guizhou University of Finance and Economics, Guiyang 550025, China
4 School of Humanities, Guizhou University of Finance and Economics, Guiyang 550025, China
* Author to whom correspondence should be addressed.
Axioms 2025, 14(4), 235; https://doi.org/10.3390/axioms14040235
Submission received: 27 February 2025 / Revised: 17 March 2025 / Accepted: 19 March 2025 / Published: 21 March 2025

Abstract

Air pollution poses significant threats to public health and ecological sustainability, necessitating precise air quality prediction to facilitate timely preventive measures and policymaking. Although Long Short-Term Memory (LSTM) networks demonstrate effectiveness in air quality prediction, their performance critically depends on appropriate hyperparameter configuration. Traditional manual parameter tuning methods prove inefficient and prone to suboptimal solutions. While conventional swarm intelligence algorithms have proven effective in optimizing the hyperparameters of LSTM models, they still face challenges in prediction accuracy and model generalizability. To address these limitations, this study proposes an improved chaotic game optimization (ICGO) algorithm incorporating multiple improvement strategies, subsequently developing an ICGO-LSTM hybrid model for Chengdu's air quality prediction. The experimental validation comprises two phases: First, comprehensive benchmarking on 23 mathematical functions reveals that the proposed ICGO algorithm achieves superior mean values across all test functions and optimal variance metrics on 22 functions, demonstrating enhanced global convergence capability and algorithmic robustness. Second, comparative analysis with seven swarm-optimized LSTM models and six machine learning benchmarks on Chengdu's air quality dataset shows the ICGO-LSTM model's superior performance. Extensive evaluations show that the proposed model achieves minimal error metrics, MAE = 3.2865, MAPE = 0.720%, and RMSE = 4.8089, along with an exceptional coefficient of determination (R² = 0.98512). These results indicate that the proposed ICGO-LSTM model significantly outperforms comparative models in predictive accuracy and reliability, suggesting substantial practical implications for urban environmental management.

1. Introduction

Automotive and industrial expansion has become a major pollution source, with vehicular and industrial emissions severely degrading atmospheric conditions and compounding public health risks [1]. Urban air quality deterioration now presents dual threats: environmental degradation and direct physiological harm to residents. Current mitigation strategies relying mainly on real-time monitoring stations prove inadequate for proactive management. The Air Quality Index (AQI) serves as a vital standardized metric, quantifying pollutant concentrations and their health consequences [2]. This underscores the urgent imperative to develop predictive air quality models—a critical tool for enabling preemptive interventions and targeted emission control strategies.
The calculation of the AQI usually relies on a large number of observation data points and complex mathematical models, a process that often demands considerable manpower and time. A surrogate model is a data model used to approximate complex optimization problems, which can not only improve the accuracy of the solution but also greatly reduce the amount of computation [3]. Therefore, surrogate models have been widely used in the computation of AQIs. Traditional surrogate models mainly include regression models [4], the ARIMA model [5], and the Holt-Winters model [6]. However, the AQI is usually affected by many factors, such as air pollutants and atmospheric conditions, and thus exhibits complex nonlinear characteristics [7]. In this case, machine learning techniques can produce more accurate AQI predictions by effectively identifying complex patterns and nonlinear trends in the data [8,9,10]. For example, Simu et al. [11] used surrogate models such as random forests to predict pollutant concentrations emitted from a stack and examined model prediction performance against the root mean square error. Leong et al. [12] used a Support Vector Machine (SVM) to predict the AQI and analyzed the model results using the sum of squared errors and the coefficient of determination (R²). Kumar et al. [13] selected six years of air pollution data from India and used five machine learning models to make predictions. Guo et al. [14] took PM2.5 and meteorological elements as input variables of an artificial neural network to predict PM2.5 values for Chongqing, finding that an artificial neural network is effective in predicting PM2.5 concentrations.
Among AQI prediction methods, the Long Short-Term Memory (LSTM) model stands out due to its strong nonlinear fitting ability [15,16]. The internal gate mechanism of LSTM effectively manages and utilizes historical information and can better discover the long-term dependencies existing in the data, thereby improving the prediction accuracy for complex time series data such as AQI data [17,18]. However, inappropriate parameter selection not only easily degrades the prediction ability of LSTM but is also prone to causing vanishing gradients and similar problems [19]. Consequently, the selection of an appropriate optimization methodology for LSTM neural network parameter tuning becomes crucial in machine learning applications. This critical task inherently constitutes an intricate non-convex optimization challenge, particularly due to the high-dimensional parameter space and complex temporal dependency characteristics of LSTM architectures, rendering the solution process computationally demanding. Traditional gradient-based optimization techniques, constrained by their dependence on function differentiability and local gradient information, frequently exhibit diminished efficacy in such scenarios, often converging to suboptimal local minima [20]. In contrast, metaheuristic (MH) algorithms demonstrate distinct advantages through their derivative-free optimization mechanism and proven global convergence properties [21,22,23]. These population-based stochastic methods, inspired by natural phenomena and evolutionary principles, have shown remarkable success in navigating complex solution landscapes, thereby attracting considerable research attention in deep learning optimization domains. The inherent adaptability of MH algorithms enables the simultaneous exploration of hyperparameter configurations and architectural parameters, which is particularly beneficial for optimizing LSTM networks with their inherent temporal dependencies and gate mechanism complexities. For example, Drewil et al. [24] combined the Genetic Algorithm (GA) with LSTM and used the resulting GA-LSTM model to predict pollution levels. Baniasadi et al. [25] proposed a model combining the binary chimp optimization algorithm and an LSTM network to predict the AQI and used cross-validation to evaluate the prediction results; experiments showed that the optimized LSTM model has higher prediction performance. Duan et al. [26] used the ARIMA model to fit the linear part of the AQI and the dung beetle optimizer to tune a CNN-LSTM model that fits the remaining part of the data, concluding that the combined model had a higher prediction accuracy. Wang et al. [27] used the particle swarm optimization (PSO) algorithm to optimize LSTM network weights and found that parameter optimization significantly improved the prediction accuracy of LSTM in air quality prediction. In conclusion, using MH algorithms for LSTM hyperparameter optimization is beneficial for improving model prediction performance.
Despite their widespread applications, traditional MH algorithms still exhibit limitations in global exploration capabilities, particularly manifesting as insufficient convergence precision and premature convergence to local optima [28,29,30]. The recently developed chaos game optimization (CGO) algorithm demonstrates promising optimization potential in many application fields [31]. However, similar to other swarm intelligence algorithms, CGO tends to face an elevated risk of local optima entrapment during later iterations [32]. To address these challenges, this study proposes an improved CGO (ICGO) algorithm incorporating multiple improvement strategies to enhance the global optimization ability in practical application scenarios. The proposed ICGO is subsequently employed to optimize LSTM parameters, specifically targeting accuracy improvements in the AQI prediction. The main work of this paper is as follows:
  • To enhance the global optimization ability of the CGO algorithm, we developed an ICGO algorithm by incorporating four improvement strategies: the logistic–sine chaos mapping strategy, Q-learning-based dynamic parameter adjustment, Whale Optimization Algorithm (WOA)-inspired prey-encircling strategy, and Human Behavior Evolution Algorithm (HEOA)-derived leader strategy. Extensive evaluations across 23 benchmark functions demonstrate the enhanced convergence accuracy and global search capability of our ICGO algorithm compared to other algorithms.
  • To address the hyperparameter sensitivity of LSTM networks in time series prediction, we propose a hybrid ICGO-LSTM model that integrates the ICGO algorithm for automated parameter tuning. Extensive experiment results for the ICGO-LSTM model are provided for air quality prediction in Chengdu, demonstrating its effectiveness in handling complex environmental data while maintaining generalization capabilities across diverse conditions.
  • Simulation results show that, compared with seven LSTM models optimized by other MH algorithms, the ICGO-LSTM model obtains the best value on each evaluation index, which proves the effectiveness of the model. Meanwhile, compared with six machine learning algorithms, the ICGO-LSTM model still achieves the best evaluation indices, which further demonstrates its good prediction performance.
The remainder of this paper is structured as follows. Section 2 presents the theoretical approach. Section 3 presents the experimental results and performance analysis of the ICGO algorithm. Section 4 presents the application of the ICGO-LSTM model in air quality prediction. Section 5 summarizes the conclusions.

2. Theoretical Approach

2.1. CGO Algorithm

As a new algorithm, CGO is inspired by chaos theory and fractal geometry [31]. By building fractal graphs from polygons and randomly generated initial points, the CGO algorithm solves the optimization problem and obtains the solution. In this section, we present the mathematical formulation of the CGO algorithm. The CGO algorithm maintains many solutions X, which represent eligible seeds in the Sierpinski triangle (the search space) [33]. Each candidate solution X consists of a series of decision variables $x_{i,j}$ that can be located within the Sierpinski triangle. The CGO algorithm uses random initialization to construct the initial solutions, which can be expressed as follows:
$x_i^j(0) = x_{i,\min}^j + r \left( x_{i,\max}^j - x_{i,\min}^j \right), \quad i \in \{1, \dots, n\},\; j \in \{1, 2, \dots, dim\},$ (1)
where $x_i^j(0)$ represents the initial position of the j-th variable of the i-th solution, $x_{i,\min}^j$ and $x_{i,\max}^j$ represent the minimum and maximum values of that variable, respectively, and $r \in [0, 1]$ is a random number. $dim$ and $n$ denote the dimension size and the number of candidate solutions, respectively.
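To make Equation (1) concrete, the following is a minimal NumPy sketch of the random initialization step; the function and variable names are ours, not from the paper:

```python
import numpy as np

def init_population(n, dim, x_min, x_max, seed=None):
    """Random CGO-style initialization per Eq. (1): one row per candidate solution."""
    rng = np.random.default_rng(seed)
    r = rng.random((n, dim))              # r ~ U[0, 1], drawn independently per variable
    return x_min + r * (x_max - x_min)    # broadcasts scalar or per-dimension bounds

# Example: 30 candidates in 30 dimensions on [-100, 100] (the F1 search range)
X = init_population(30, 30, -100.0, 100.0)
```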
The CGO algorithm constructs a temporary Sierpinski triangle by generating eligible seeds in the search space, which helps in the efficient search for solutions. For the i-th seed $X_i$, the three vertices of its temporary triangle correspond to the seed itself, the global best position ($gb$), and the mean position of a group of randomly selected seeds ($mg_i$) [34]. The CGO algorithm then uses these three seeds and a random seed to generate new qualified seeds. Specifically, each candidate solution $X_i$ produces four qualified seeds, namely $SEED^1$, $SEED^2$, $SEED^3$, and $SEED^4$, by locating four points within its temporary Sierpinski triangle [31], whose mathematical expressions for $i \in \{1, 2, \dots, n\}$ can be respectively given by
$SEED_i^1 = X_i + \alpha_i \times (\beta_i \times gb - \gamma_i \times mg_i),$ (2)
$SEED_i^2 = gb + \alpha_i \times (\beta_i \times X_i - \gamma_i \times mg_i),$ (3)
$SEED_i^3 = mg_i + \alpha_i \times (\beta_i \times X_i - \gamma_i \times gb),$ (4)
$SEED_i^4 = X_i \ (x_i^k = x_i^k + R), \quad k \in \{1, 2, \dots, dim\},$ (5)
where $\alpha_i$ represents a random movement factor, $\beta_i$ and $\gamma_i$ are random integers equal to 0 or 1, and $R$ denotes a uniformly distributed random number in the interval [0, 1]. The random factor $\alpha_i$ can be given by
$\alpha_i = \begin{cases} r \\ 2 \times r \\ (\delta \times r) + 1 \\ (\epsilon \times r) + (1 - \epsilon) \end{cases}$ (6)
where $r$ is a random number in [0, 1], and $\delta$ and $\epsilon$ represent random integers in the range [0, 1].
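As an illustration of Equations (2)-(6), here is a minimal sketch of the four-seed generation step for one candidate. It assumes $\beta_i, \gamma_i, \delta, \epsilon \in \{0, 1\}$ (as in Table 2) and reads Equation (5) as perturbing one randomly chosen decision variable; both points are our interpretation:

```python
import numpy as np

rng = np.random.default_rng()

def make_seeds(X_i, gb, mg_i):
    """Generate the four CGO seeds of Eqs. (2)-(5) for one candidate X_i."""
    beta, gamma = rng.integers(0, 2, size=2)       # beta_i, gamma_i in {0, 1}
    delta, eps = rng.integers(0, 2, size=2)
    r = rng.random()
    alpha = rng.choice([r, 2 * r, delta * r + 1, eps * r + (1 - eps)])  # Eq. (6)
    seed1 = X_i + alpha * (beta * gb - gamma * mg_i)
    seed2 = gb + alpha * (beta * X_i - gamma * mg_i)
    seed3 = mg_i + alpha * (beta * X_i - gamma * gb)
    seed4 = X_i.copy()
    k = rng.integers(0, X_i.size)                  # one randomly chosen variable index
    seed4[k] += rng.random()                       # R ~ U[0, 1], Eq. (5)
    return seed1, seed2, seed3, seed4
```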

2.2. ICGO Algorithm

2.2.1. Logistic–Sine Mapping Strategy

The random initialization method in the traditional CGO tends to generate populations with low diversity, potentially delaying convergence. Chaotic mapping refers to a mapping function that describes the dynamic evolution of a chaotic system. In chaotic mapping, small changes in the initial conditions can lead to large differences in system behavior, making the evolution of the system difficult to predict. As a typical chaotic mapping [35], logistic mapping has strong randomness, which can ensure a maximum mapping range and ergodic results, thus enhancing the population diversity of the CGO algorithm. In order to distribute the population more evenly in the search space, this paper combines logistic and sine chaotic mappings [36] into a mixed logistic–sine chaotic mapping for the CGO algorithm, increasing the diversity of the seed population and thereby significantly improving the convergence speed in the initial stage.
The logistic–sine mixed mapping can be described as
$x_{i+1} = \mu \times x_i \times (1 - x_i) + (4 - \mu) \times \sin(\pi \times x_i)/4,$ (7)
where $\mu = 0.5$ and $x_0 = rand$.
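A minimal sketch of chaotic initialization with Equation (7), assuming (as is common, but not stated in the paper) that the chaotic values in [0, 1] are mapped linearly into the search bounds:

```python
import numpy as np

def logistic_sine_population(n, dim, x_min, x_max, mu=0.5):
    """Chaotic initialization via the logistic-sine map of Eq. (7)."""
    c = np.random.rand(dim)                    # x_0 = rand, one trajectory per dimension
    pop = np.empty((n, dim))
    for i in range(n):
        c = mu * c * (1 - c) + (4 - mu) * np.sin(np.pi * c) / 4
        pop[i] = x_min + c * (x_max - x_min)   # map chaotic values into the search range
    return pop
```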

2.2.2. Q-Learning Strategy

As a reinforcement learning algorithm based on a value function, Q-learning aims to learn a policy to take the optimal action in a specific environment to obtain the maximum cumulative reward [37]. The main components of the Q-learning algorithm are shown in Figure 1. An agent continuously interacts with the environment, adjusting the value of each action in a given state based on rewards and the current state. These updated values are then stored in a Q table. The update rules for the Q table can be given by
$Q(s_{t+1}, a_{t+1}) = Q(s_t, a_t) + \lambda \times \left[ Rein_t + \gamma \times \max_a Q(s_{t+1}, a) - Q(s_t, a_t) \right],$ (8)
where $Q(s_{t+1}, a_{t+1})$ represents the Q value in the next state $s_{t+1}$, $\lambda \in [0, 1]$ is the learning rate, $Rein_t$ is the immediate reward obtained for taking action $a_t$, and $\gamma \in [0, 1]$ represents the discount factor.
Since CGO has the defect of an insufficient global search ability, Q-learning is introduced into CGO to address this problem. By leveraging the Q-table update mechanism, the movement direction of seeds can be dynamically adjusted according to historical patterns and reward signals, enabling more efficient navigation toward regions likely to contain globally optimal solutions. After introducing the Q-learning mechanism, the seed position in the CGO algorithm is regarded as the state in Q-learning, and the forward, backward, left, and right movements of the seed are regarded as four actions. According to the state and action, the seed receives a reward, and these rewards are the R values in the table. The position $X_{elite}$ is selected as the state corresponding to the maximum Q value in the Q table, i.e., $X_{elite} = \arg\max_s Q(s, a)$. This combination enables the algorithm to choose the action path more intelligently and achieve a better global search effect. For the ICGO algorithm, the Q table is updated using Equation (8). The reward function [38] is given as follows:
$Rein = Reinf(S+1) - Reinf(S),$ (9)
where $Reinf(S+1)$ and $Reinf(S)$ represent the fitness values in the next state and the current state, respectively.
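The following sketch shows one tabular Q update per Equations (8)-(9); the state/action encoding and the sign convention of the reward (fitness change between states) are our assumptions:

```python
import numpy as np

n_states, n_actions = 16, 4            # 4 actions: forward, backward, left, right
Q = np.zeros((n_states, n_actions))
lam, gamma = 0.1, 0.9                  # learning rate and discount factor (assumed values)

def q_update(s, a, s_next, fit_s, fit_s_next):
    """One tabular Q-learning step per Eqs. (8)-(9)."""
    rein = fit_s_next - fit_s          # Eq. (9): reward as fitness change between states
    Q[s, a] += lam * (rein + gamma * Q[s_next].max() - Q[s, a])

x_elite_state = np.unravel_index(Q.argmax(), Q.shape)[0]   # state index of max Q value
```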

2.2.3. WOA Encircling Prey Mechanism

In the CGO algorithm, $SEED^4$ embodies the exploration ability of the algorithm. However, because $SEED^4$ relies on a purely random perturbation, the algorithm cannot reliably jump out of local optima. The WOA, which simulates the behavior of a whale searching for and attacking prey, is an optimization algorithm with good convergence and exploration performance [36]. In the WOA, the prey-encircling strategy allows the algorithm to converge quickly and significantly improves its performance. Therefore, this paper introduces the encircling prey strategy of the WOA into CGO to better enhance the exploration ability in the solution space. The prey-encircling mechanism in ICGO can be given by
$X(t+1) = X_{elite} - M \times |E \times X_{elite} - X_i(t)|,$ (10)
where $X_i$ represents the position of the current whale, $X_{elite}$ represents the position of the prey (the optimal whale position), $t$ represents the current iteration number, and $M$ and $E$ are auxiliary coefficients of the position update, whose expressions are shown in the following equation:
$M = 2 b r - b, \quad E = 2 r, \quad b = 2 - t \, (2 / T),$ (11)
where T is the maximum number of iterations.
The expression of $SEED^4$ is then updated according to Equation (12):
$SEED_i^4 = X_{elite}(t) - M \times |E \times X_{elite} - X_i(t)|.$ (12)
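A minimal NumPy rendering of Equations (10)-(12), vectorized over the decision variables; the per-dimension random draw is an assumption carried over from the standard WOA:

```python
import numpy as np

def encircle_seed4(X_i, X_elite, t, T, rng=None):
    """WOA-style SEED4 update of Eqs. (10)-(12)."""
    rng = rng or np.random.default_rng()
    b = 2 - t * (2 / T)                       # b decreases linearly from 2 to 0
    r = rng.random(X_i.shape)
    M = 2 * b * r - b                         # Eq. (11)
    E = 2 * r
    return X_elite - M * np.abs(E * X_elite - X_i)
```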

2.2.4. HEOA Leadership Strategy

In the HEOA, the whole iterative process is divided into two stages, i.e., human exploration and human development [39], where the stage of human development creates a more favorable environment for the HEOA to explore globally optimal solutions [40]. Therefore, the leader strategy with higher global exploration performance in the human development stage is applied to the CGO algorithm to strengthen its optimization.
The leader update process in the HEOA can be expressed as
$X_i^{t+1} = \begin{cases} rn \cdot \mathrm{ones}(1, dim) + \xi \cdot X_i^t, & \text{if } r \ge B \\ \xi \cdot X_i^t \cdot \exp\left( \dfrac{-t}{rand \cdot T} \right), & \text{if } r < B \end{cases}$ (13)
where $B = 0.6$, $\xi = 0.2 \cos\left( \frac{\pi}{2} \left( 1 - \frac{t}{T} \right) \right)$, and $rn$ represents a random number following a standard normal distribution.
In ICGO, the new seeds with the top fitness values are selected and updated with this leader equation to enhance the convergence accuracy and global optimization ability. After introducing the leader strategy of the HEOA, the positions of these seeds are updated according to Equation (14):
$Seed_{new,i}^{t+1} = \begin{cases} rn \cdot \mathrm{ones}(1, dim) + \xi \cdot X_i^t, & \text{if } r \ge B \\ \xi \cdot X_i^t \cdot \exp\left( \dfrac{-t}{rand \cdot T} \right), & \text{if } r < B \end{cases}$ (14)
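A sketch of the leader move of Equation (14); the form of the exponential argument follows our reading of the equation above and should be treated as an assumption:

```python
import numpy as np

def leader_update(X_i, t, T, B=0.6, rng=None):
    """HEOA-derived leader move per Eq. (14)."""
    rng = rng or np.random.default_rng()
    xi = 0.2 * np.cos(np.pi / 2 * (1 - t / T))
    if rng.random() >= B:
        return rng.standard_normal() * np.ones_like(X_i) + xi * X_i
    return xi * X_i * np.exp(-t / (rng.random() * T))
```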
Algorithm 1 shows the pseudo-code of ICGO.
Algorithm 1 ICGO algorithm.
Require: Population size pop, maximum iterations T_max, dimension Dim.
Ensure: Optimal solution X_elite and optimal function value f_best.
 Initialize the CGO population X_i (i = 1, 2, ..., pop) according to Equation (7).
 Calculate the fitness value of each seed.
while t < T_max do
    for i = 1 to pop do
        Update the Q table according to Equation (8), and obtain X_elite = argmax_s Q(s, a).
        Calculate the random factor alpha_i according to Equation (6).
        Generate the locations of new seeds according to Equations (2)-(4).
        Update SEED^4 according to Equation (12).
    end for
    Boundary check.
    Calculate the objective function value of each new seed, keeping whichever of the new and previous seeds has the better value.
    for i = 1 to pop do
        Update the locations of seeds according to Equation (14).
    end for
    Boundary check.
    Calculate the objective function value of each updated seed; if it is better than the previous value, replace the seed.
    Update the Q table and obtain X_elite = argmax_s Q(s, a).
end while
return X_elite and f_best

2.3. LSTM Model

The recurrent neural network (RNN) suffers from gradient explosion when dealing with time series data, which complicates data modeling [41]. Since the LSTM model controls the flow of information by introducing a gate mechanism and a cell state, thus processing time series data more effectively, it has been widely used in time series prediction [41,42,43]. The structure of the LSTM network is shown in Figure 2 [43], and its main update process can be expressed as
$i_t = sigma(w_i \cdot [h_{t-1}, X_t] + b_i)$
$o_t = sigma(w_o \cdot [h_{t-1}, X_t] + b_o)$
$f_t = sigma(w_f \cdot [h_{t-1}, X_t] + b_f)$
$g_t = \tanh(w_g \cdot [h_{t-1}, X_t] + b_g)$
$c_t = f_t \odot c_{t-1} + i_t \odot g_t$
$h_t = o_t \odot \tanh(c_t)$ (15)
where $w$ and $b$ denote the weight coefficients and the bias vectors, respectively. $i_t$, $o_t$, $f_t$, $g_t$, $c_t$, and $h_t$ represent the outputs corresponding to the input gate, the output gate, the forget gate, the candidate memory unit vector, the memory unit vector, and the final output vector at time t, respectively. $sigma(x)$ and $\tanh(x)$ represent the sigmoid activation function and the hyperbolic tangent function, respectively. Their expressions can be given by
$\tanh(x) = \dfrac{1 - e^{-2x}}{1 + e^{-2x}}, \quad sigma(x) = \dfrac{1}{1 + e^{-x}}.$ (16)
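The cell update of Equation (15) can be written out directly; below is a minimal NumPy sketch of one time step (the parameter shapes and the dictionary layout are our choices, not the paper's):

```python
import numpy as np

def sigma(x):
    return 1 / (1 + np.exp(-x))                    # sigmoid of Eq. (16)

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM cell update per Eq. (15); W[k] has shape (n_hidden, n_hidden + n_in)."""
    z = np.concatenate([h_prev, x_t])              # [h_{t-1}, X_t]
    i = sigma(W['i'] @ z + b['i'])                 # input gate
    o = sigma(W['o'] @ z + b['o'])                 # output gate
    f = sigma(W['f'] @ z + b['f'])                 # forget gate
    g = np.tanh(W['g'] @ z + b['g'])               # candidate memory
    c = f * c_prev + i * g                         # memory cell update
    h = o * np.tanh(c)                             # hidden state / output
    return h, c
```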

2.4. Model Evaluation Index

The relevant performance of the model was further measured by selecting different indicators, namely the root mean square error (RMSE), mean absolute error (MAE), mean absolute percentage error (MAPE), and coefficient of determination R² [41]. The corresponding computational expressions can be respectively represented as
$R^2 = \dfrac{\left( \sum_{i=1}^{n} \hat{Z}_i Z_i - \frac{\sum_{i=1}^{n} \hat{Z}_i \sum_{i=1}^{n} Z_i}{n} \right)^2}{\left( \sum_{i=1}^{n} \hat{Z}_i^2 - \frac{(\sum_{i=1}^{n} \hat{Z}_i)^2}{n} \right) \left( \sum_{i=1}^{n} Z_i^2 - \frac{(\sum_{i=1}^{n} Z_i)^2}{n} \right)}$
$RMSE = \sqrt{\dfrac{1}{n} \sum_{i=1}^{n} (Z_i - \hat{Z}_i)^2}$
$MAPE = \dfrac{1}{n} \sum_{i=1}^{n} \left| \dfrac{Z_i - \hat{Z}_i}{Z_i} \right| \times 100\%$
$MAE = \dfrac{1}{n} \sum_{i=1}^{n} |Z_i - \hat{Z}_i|$ (17)
where $n$ represents the number of samples, and $Z_i$ and $\hat{Z}_i$ denote the actual and predicted values of the i-th sample, respectively.
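These four indices are straightforward to compute; the sketch below implements Equation (17), with R² written in the squared-correlation form shown above:

```python
import numpy as np

def evaluate(z, z_hat):
    """RMSE, MAE, MAPE (%), and R^2 per Eq. (17) for actual z and predicted z_hat."""
    err = z - z_hat
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    mape = np.mean(np.abs(err / z)) * 100
    num = (np.sum(z_hat * z) - np.sum(z_hat) * np.sum(z) / len(z)) ** 2
    den = ((np.sum(z_hat ** 2) - np.sum(z_hat) ** 2 / len(z))
           * (np.sum(z ** 2) - np.sum(z) ** 2 / len(z)))
    return rmse, mae, mape, num / den
```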

3. Performance Analysis of the ICGO Algorithm

3.1. Test Function

Twenty-three test functions were used as evaluation tools, including unimodal functions (F1–F7) and complex multimodal functions (F8–F23), to measure the local exploitation and global exploration performance of the ICGO algorithm [35,41]. Evaluating these diverse test functions effectively tests the robustness and effectiveness of the ICGO algorithm in dealing with various challenges. Table 1 describes these functions in detail. In addition, seven algorithms, including CGO [31], SHO [44], GWO [29], SABO [45], WOA [36], ARO [46], and AO [47], were selected as comparison algorithms to verify the optimization performance of ICGO. Table 2 shows the relevant information of each algorithm. The number of independent runs was set to 30, the population size to 30, and the maximum number of iterations to 500 [31,35,44]. Finally, the mean (avg) and standard deviation (std) [44] of the 30 results were calculated to better measure the global optimization ability of each algorithm. All experiments were implemented in MATLAB R2022a on a computer with an Intel(R) Core(TM) i5-12500H CPU and 16 GB of RAM, running Microsoft Windows 11 Home (64-bit).

3.2. Analysis of Statistical Results

Table 3 shows the avg and std obtained by the ICGO algorithm and each comparison algorithm over 30 runs. On the unimodal functions (F1–F7), the mean values of the ICGO algorithm on F1 to F4 all reach the global optimal solution, with zero standard deviation. Although the mean values of the ICGO algorithm on F5, F6, and F7 do not reach the global optima of the objective functions, they are still better than those of the other seven algorithms, showing better local optimal solutions. On function F8, the ICGO algorithm has a better mean value than the other algorithms, but its stability is slightly lower than that of the ARO, CGO, SHO, GWO, and SABO algorithms. This indicates that ICGO exhibits the strongest local exploitation. On the multimodal functions (F8–F23), the ICGO algorithm achieves the optimal mean value on functions F9, F11, and F18 to F23, and shows higher stability than the other seven algorithms. Although the mean values of the ICGO algorithm on F10, F12, F13, F14, F15, F16, and F17 do not reach the global optima of the objective functions, it still converges to better local optima than the other seven algorithms and remains stable around those values. In general, compared with the other seven algorithms, the ICGO algorithm shows better local exploitation, global exploration ability, and stronger stability on the 23 test functions.
The convergence curves [35] of the ICGO algorithm and the other algorithms on selected functions are shown in Figure 3. It can be seen that, compared with the other seven algorithms, ICGO exploits the search space more effectively on functions F1 to F4. On functions F5 to F13 and F15, the ICGO algorithm first rapidly converges to a local optimum, then fluctuates around the corresponding local optimum in later iterations, and finally settles at a stable value. In summary, the proposed ICGO algorithm shows strong robustness and effectiveness.

3.3. Wilcoxon Signed-Rank Test

In order to investigate the performance difference between the ICGO algorithm and the other seven algorithms, we used the Wilcoxon signed-rank test as a research tool for correlation analysis [35]. The p-values of the Wilcoxon signed-rank test between the ICGO algorithm and the other algorithms at the 95% confidence level are shown in Table 4. In the table, '+' indicates that ICGO is superior to the comparison algorithm, '−' indicates that the ICGO algorithm performs worse than the comparison algorithm, and '≈' indicates that there is little difference between the ICGO algorithm and the comparison algorithm. From the data in Table 4, the following results can be observed: Compared with the CGO, SHO, and SABO algorithms, the proposed ICGO algorithm has a stronger convergence performance on 21 functions, and its performance on functions F9 and F11 is similar to that of these three algorithms. Compared with the GWO and WOA algorithms, the proposed ICGO algorithm has a stronger convergence performance on all 23 functions. Compared with the ARO and AO algorithms, the proposed ICGO algorithm shows stronger local exploitation and global exploration performance on 20 functions, and the convergence performance differs little from these two algorithms only on functions F9 to F11.

4. Application of the ICGO-LSTM Model in Air Quality Prediction

4.1. Data Preprocessing

Chengdu is located in Sichuan Province, China [41]. At present, Chengdu has a population of 21.4 million, making it the most populous city in southwest China. As an industrial city in southwest China, Chengdu also has a considerable economic scale within the region. However, Chengdu's location in a basin surrounded by mountains hinders air circulation and makes it difficult for pollutants to disperse. In addition, as an industrial city, Chengdu produces a large volume of industrial emissions, which further aggravates the deterioration of air quality. Given the serious impact of worsening air pollution on Chengdu's citizens, accurate forecasting of Chengdu's air pollution has become crucial. Timely and accurate air quality prediction can help Chengdu residents take preventive measures in advance, such as preventing the spread of diseases related to air pollution. To this end, Chengdu was selected as the research object to address the complex issue of air pollution and its adverse effects on public health and the environment. In this study, using web crawling, we obtained a daily air quality dataset for Chengdu from the website (http://www.tianqihoubao.com/) covering 1 January 2020 to 31 August 2024. The air quality dataset contains seven indicators, namely the Air Quality Index (AQI) and six major pollutants: PM2.5, PM10, NO2, CO, SO2, and O3. To ensure the accuracy of the prediction model, we adopted the max-min normalization method [25] to normalize the dataset and generate a time series dataset, where the normalization expression is as follows:
$X' = \dfrac{x - x_{\min}}{x_{\max} - x_{\min}} \, (\max - \min) + \min,$ (18)
where $x$ represents the air quality data, $x_{\max}$ and $x_{\min}$ represent the maximum and minimum values of the data, respectively, and $\max$ and $\min$ define the target range of the normalized data, generally set to 1 and 0, respectively.
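A one-line NumPy version of Equation (18), applied column-wise so that each indicator is scaled independently (the column-wise choice is our assumption):

```python
import numpy as np

def min_max_scale(x, lo=0.0, hi=1.0):
    """Max-min normalization of Eq. (18); x has shape (samples, indicators)."""
    x_min, x_max = x.min(axis=0), x.max(axis=0)
    return (x - x_min) / (x_max - x_min) * (hi - lo) + lo
```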

4.2. Establishment of ICGO-LSTM Model

Based on the above experimental results, it can be seen that, compared with the CGO algorithm, the ICGO algorithm shows higher accuracy and better global search ability. The prediction accuracy of the LSTM model is affected by the number of hidden neurons h, the learning rate l, the maximum number of iterations m, and the mini-batch size s. However, the traditional manual parameter adjustment method is cumbersome and inefficient. Therefore, in this study, the ICGO algorithm was used to optimize the hyperparameter combination (h, l, m, s) of the LSTM model to achieve better prediction performance. A flowchart of the ICGO-LSTM model is shown in Figure 4, and the corresponding steps are summarized as follows (a code sketch of the fitness wrapper is given after the steps):
Step 1: Preprocess the Chengdu air quality dataset, including data normalization, and divide it into a training set and a test set at a ratio of 8:2.
Step 2: Construct the LSTM model.
Step 3: Set the parameters of the ICGO algorithm and the range of the hyperparameter combination ( h , l , m , s ) .
Step 4: The ICGO algorithm is used to optimize hyperparameters of the LSTM model, where the update procedure of the ICGO algorithm is shown in Algorithm 1.
Step 5: If the ICGO algorithm meets the termination condition, go to Step 6; otherwise, go back to Step 4.
Step 6: The optimal result of the parameter combination ( h , l , m , s ) is derived, and the trained LSTM model is obtained.
Step 7: The prediction results and errors are obtained from the trained LSTM model.
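Putting Steps 3-7 together, the sketch below shows how the ICGO search space and fitness function might be wired up. Here `train_and_eval` is a hypothetical placeholder for "train an LSTM with these hyperparameters and return its validation RMSE", and the bounds are illustrative assumptions, not the paper's settings:

```python
import numpy as np

# Illustrative bounds for the hyperparameter vector (h, l, m, s); values are assumptions.
BOUNDS = np.array([[10, 200],      # h: number of hidden neurons
                   [1e-4, 1e-1],   # l: learning rate
                   [50, 500],      # m: maximum training iterations
                   [16, 256]])     # s: mini-batch size

def fitness(params, train_and_eval):
    """Objective for ICGO: validation RMSE of an LSTM trained with these settings."""
    h, l, m, s = params
    return train_and_eval(hidden=int(h), lr=float(l), epochs=int(m), batch=int(s))

# best_params = icgo_minimize(fitness, BOUNDS, pop=30, T=500)  # per Algorithm 1 (sketch)
```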

4.3. Comparison of AQI Prediction Results Between ICGO-LSTM and Other MH Algorithms

CGO-LSTM, SHO-LSTM, GWO-LSTM, SABO-LSTM, WOA-LSTM, ARO-LSTM, and AO-LSTM were used as comparison models to predict the AQI of Chengdu and verify the prediction performance of the proposed ICGO-LSTM. Table 5 shows the evaluation results of each algorithm on the air quality dataset. The MAE, MAPE, RMSE, and R² values obtained by CGO-LSTM were 9.3641, 2.0584%, 10.9916, and 0.96892, respectively, while those obtained by the ICGO-LSTM model were 3.2865, 0.720%, 4.9098, and 0.98512, respectively. The ICGO-LSTM model therefore has lower MAE, MAPE, and RMSE values and a higher R² value than the CGO-LSTM model, indicating better prediction performance. Moreover, compared with the SHO-LSTM, GWO-LSTM, SABO-LSTM, WOA-LSTM, ARO-LSTM, and AO-LSTM models, the ICGO-LSTM model also shows lower MAE, MAPE, and RMSE values, which shows that ICGO-LSTM has better predictive ability than the other algorithms. In addition, the prediction fitting value R² of the ICGO-LSTM model is the highest, reaching 0.98512, which indicates that the proposed ICGO-LSTM model fits the AQI better than the other comparison algorithms. The ICGO-LSTM model also consumes the least time.
We drew the comparison results of each algorithm on the air quality dataset for the four indicators, as shown in Figure 5. In addition, the convergence curves of the different algorithms were plotted, as shown in Figure 6. It can be observed from the figure that the ICGO-LSTM model can quickly converge to a good objective function value.
Meanwhile, we also provide the prediction results obtained with the ICGO-LSTM, CGO-LSTM, SHO-LSTM, GWO-LSTM, SABO-LSTM, WOA-LSTM, ARO-LSTM, and AO-LSTM models and the fitting graphs of the real results, as shown in Figure 7. It can be seen that the predicted-value curve obtained by the ICGO-LSTM model almost completely coincides with the true-value curve. Therefore, compared with the predicted-value curves obtained by other algorithms, the ICGO-LSTM model shows a higher fitting accuracy. In addition, we plotted the Taylor plots of the predicted AQI values of the ICGO-LSTM model and each of the comparison algorithms, as shown in Figure 8. In looking at the value of the correlation coefficient, it is found that the correlation between the algorithms is between 0.9 and 0.99. Among them, the ICGO-LSTM model presents the highest correlation coefficient, reaching 0.99, which indicates that this model has a strong predictive ability in AQI prediction.

4.4. Comparison of AQI Prediction Results Between ICGO-LSTM Model and Other Machine Learning Methods

Six commonly used machine learning methods, i.e., LSTM [42], SVM [48], Elman [49], CNN [50], BP [51], and RBFNN [52], were selected as comparison models for AQI prediction to measure the prediction ability of the ICGO-LSTM model. The values of the four indicators obtained by ICGO-LSTM and the six machine learning algorithms on the test set are listed in Table 6. As can be seen from Table 6, the MAE, MAPE, RMSE, and R² values obtained by the traditional LSTM model were 7.8077, 1.0678%, 13.3791, and 0.90864, respectively. The experimental results show that the ICGO-LSTM model achieves lower MAE, MAPE, and RMSE values and a higher R² value, which means that the proposed ICGO-LSTM model has good prediction ability. Meanwhile, compared with the other five machine learning algorithms, the MAE, MAPE, and RMSE values obtained by the ICGO-LSTM model are still the lowest. In addition, the R² values obtained by the SVM, Elman, CNN, BP, and RBFNN models were 0.95781, 0.87793, 0.93836, 0.95572, and 0.93876, respectively, while the R² value obtained by the ICGO-LSTM model was 0.98512. This further verifies the high prediction accuracy of the proposed ICGO-LSTM model. Finally, although the proposed ICGO-LSTM model consumes the longest time, its prediction accuracy is the highest. This section plots the evaluation results obtained by the above models on the test set, as shown in Figure 9.
Figure 10 shows the fitting plot of the predicted results and the true results obtained by ICGO-LSTM and six of the other models. It can be seen that the difference between the predicted value and the true value of the ICGO-LSTM model is the lowest. This further validates the previous views, and it is concluded that the ICGO-LSTM model has better prediction performance in AQI prediction. In addition, we plotted the Taylor plots of the predicted AQI values of the proposed ICGO-LSTM model and other machine learning models, as shown in Figure 11. In observing the data in Figure 11, it is found that the correlation coefficient of the proposed ICGO-LSTM model is also the highest, which further verifies its strong prediction ability.

5. Conclusions

Nowadays, the world faces the pressing problem of air pollution, which has seriously impacted human health, ecological balance, and economic development. In this context, the importance of accurately predicting the AQI is self-evident. Although the LSTM model is effective in air quality prediction, its performance is easily affected by hyperparameters. Therefore, we proposed a novel ICGO algorithm to optimize the hyperparameters of the LSTM model and applied it to AQI prediction in Chengdu.
In the experiments, the logistic–sine mapping, Q-learning strategy, WOA prey-encircling strategy, and HEOA leader strategy were introduced into the CGO algorithm, producing a new swarm intelligence method called ICGO. We verified the performance of ICGO on 23 test functions, where the results showed that ICGO achieves the best mean on all 23 functions and the best standard deviation on 22 of them. Compared with seven well-known swarm intelligence algorithms, ICGO performs better on complex function problems. Secondly, ICGO-LSTM was used to predict the air quality of Chengdu. The prediction results show that, compared with seven MH-optimized models and six machine learning models, the ICGO-LSTM model has the lowest MAE, MAPE, and RMSE values and the highest R² value, at 3.2865, 0.720%, 4.8089, and 0.98512, respectively. This indicates that the proposed ICGO-LSTM model has higher prediction performance in AQI prediction. In addition, from the perspective of time consumption, the running time of the ICGO-LSTM model is lower than that of the seven swarm-intelligence-optimized models but higher than that of the six machine learning models.
Despite these results, several aspects of the proposed ICGO-LSTM model still need improvement: the selected air quality indicators are limited, seasonal factors are not considered, and the model is relatively homogeneous, lacking integration with other machine learning models. These problems need to be further investigated in future research.

Author Contributions

Conceptualization, Y.L., R.Z., B.L. and C.T.; methodology, Y.L., R.Z. and F.S.; software, Y.L., B.Y. and B.L.; validation, Y.L. and B.Y.; data curation, B.Y. and F.S.; writing—original draft preparation, R.Z.; writing—review and editing, Y.L. and C.T.; visualization, R.Z. and F.S.; funding acquisition, C.T. and B.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the Guizhou University of Finance and Economics Innovation Exploration and Academic Emerging Project under Grant 2022XSXMB14, and in part by the Guizhou Provincial Basic Research Program (Natural Science) under Grant MS[2025]226.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Tran, V.V.; Park, D.; Lee, Y. Indoor air pollution, related human diseases, and recent trends in the control and improvement of indoor air quality. Int. J. Environ. Res. Public Health 2020, 17, 2927. [Google Scholar] [CrossRef]
  2. Guo, Z.; Jing, X.; Ling, Y.; Yang, Y.; Jing, N.; Yuan, R.; Liu, Y. Optimized air quality management based on air quality index prediction and air pollutants identification in representative cities in China. Sci. Rep. 2024, 14, 17923. [Google Scholar] [CrossRef]
  3. Nie, J.; Yu, Z.; Li, J. Multi-objective optimization of the robustness of complex networks based on the mixture of weighted surrogates. Axioms 2023, 12, 404. [Google Scholar] [CrossRef]
  4. Song, Z.; Deng, Q.; Ren, Z. Correlation and principal component regression analysis for studying air quality and meteorological elements in Wuhan, China. Environ. Prog. Sustain. Energy 2020, 39, 13278. [Google Scholar] [CrossRef]
  5. Mani, G.; Viswanadhapalli, J.K. Prediction and forecasting of air quality index in Chennai using regression and ARIMA time series models. J. Eng. Res. 2022, 10, 179–194. [Google Scholar] [CrossRef]
  6. Syafei, A.D.; Ramadhan, N.; Hermana, J.; Slamet, A.; Boedisantoso, R.; Assomadi, A.F. Application of exponential smoothing Holt-Winters and ARIMA models for predicting air pollutant concentrations. EnvironmentAsia 2018, 11, 251–262. [Google Scholar]
  7. Janarthanan, R.; Partheeban, P.; Somasundaram, K.; Elamparithi, P.N. A deep learning approach for prediction of air quality index in a metropolitan city. Sustain. Cities Soc. 2021, 67, 102720. [Google Scholar] [CrossRef]
  8. Gu, Y.; Li, B.; Meng, Q. Hybrid interpretable predictive machine learning model for air pollution prediction. Neurocomputing 2022, 468, 123–136. [Google Scholar] [CrossRef]
  9. Gupta, N.S.; Mohta, Y.; Heda, K.; Armaan, R.; Valarmathi, B.; Arulkumaran, G. Prediction of air quality index using machine learning techniques: A comparative analysis. J. Environ. Public Health 2023, 2023, 4916267. [Google Scholar] [CrossRef]
  10. Hardini, M.; Sunarjo, R.A.; Asfi, M.; Chakim, M.H.R.; Sanjaya, Y.P.A. Predicting air quality index using ensemble machine learning. Adi J. Recent Innov. 2023, 5, 78–86. [Google Scholar] [CrossRef]
  11. Simu, S.; Turkar, V.; Martires, R.; Asolkar, V.; Monteiro, S.; Fernandes, V.; Salgaoncary, V. Air pollution prediction using machine learning. In Proceedings of the 2020 IEEE Bombay Section Signature Conference (IBSSC), Coimbatore, India, 2–4 February 2020; pp. 231–236. [Google Scholar]
  12. Leong, W.C.; Kelani, R.O.; Ahmad, Z. Prediction of air pollution index (API) using support vector machine (SVM). J. Environ. Chem. Eng. 2020, 8, 103208. [Google Scholar]
  13. Kumar, K.; Pande, B.P. Air pollution prediction with machine learning: A case study of Indian cities. Int. J. Environ. Sci. Technol. 2023, 20, 5333–5348. [Google Scholar]
  14. Guo, Q.; He, Z.; Wang, Z. Prediction of hourly PM2.5 and PM10 concentrations in Chongqing City in China based on artificial neural network. Aerosol Air Qual. Res. 2023, 23, 220448. [Google Scholar]
  15. Mishra, A.; Gupta, Y. Comparative analysis of air quality index prediction using deep learning algorithms. Spat. Inf. Res. 2024, 32, 63–72. [Google Scholar] [CrossRef]
  16. Zhang, J.; Li, S. Air quality index forecast in Beijing based on CNN-LSTM multi-model. Chemosphere 2022, 308, 136180. [Google Scholar]
  17. Sarkar, N.; Gupta, R.; Keserwani, P.K.; Govil, M.C. Air quality index prediction using an effective hybrid deep learning model. Environ. Pollut. 2022, 315, 120404. [Google Scholar]
  18. Kim, D.; Han, H.; Wang, W.; Kang, Y.; Lee, H.; Kim, H.S. Application of deep learning models and network method for comprehensive air-quality index prediction. Appl. Sci. 2022, 12, 6699. [Google Scholar] [CrossRef]
  19. Bacanin, N.; Stoean, C.; Zivkovic, M.; Rakic, M.; Strulak-Wójcikiewicz, R.; Stoean, R. On the benefits of using metaheuristics in the hyperparameter tuning of deep learning models for energy load forecasting. Energies 2023, 16, 1434. [Google Scholar] [CrossRef]
  20. Chakraborty, S.; Saha, A.K.; Sharma, S.; Chakraborty, R.; Debnath, S. A hybrid whale optimization algorithm for global optimization. J. Ambient. Intell. Humaniz. Comput. 2023, 14, 431–467. [Google Scholar]
  21. Dehghani, M.; Trojovskỳ, P. Osprey optimization algorithm: A new bio-inspired metaheuristic algorithm for solving engineering optimization problems. Front. Mech. Eng. 2023, 8, 1126450. [Google Scholar]
  22. Huang, Y.; Yu, J.; Dai, X.; Huang, Z.; Li, Y. Air-quality prediction based on the MD–IPSO–LSTM combination model. Sustainability 2022, 14, 4889. [Google Scholar]
  23. Nguyen, A.T.; Pham, D.H.; Oo, B.L.; Ahn, Y.; Lim, B.T.H. Predicting air quality index using attention hybrid deep learning and quantum-inspired particle swarm optimization. J. Big Data 2024, 11, 71. [Google Scholar]
  24. Drewil, G.I.; Al-Bahadili, R.J. Air pollution prediction using LSTM deep learning and metaheuristic algorithms. Meas. Sens. 2022, 24, 100546. [Google Scholar]
  25. Baniasadi, S.; Salehi, R.; Soltani, S.; Martín, D.; Pourmand, P.; Ghafourian, E. Optimizing long short-term memory network for air pollution prediction using a novel binary chimp optimization algorithm. Electronics 2023, 12, 3985. [Google Scholar] [CrossRef]
  26. Duan, J.; Gong, Y.; Luo, J.; Zhao, Z. Air-quality prediction based on the ARIMA-CNN-LSTM combination model optimized by dung beetle optimizer. Sci. Rep. 2023, 13, 12127. [Google Scholar]
  27. Wang, S.; Li, P.; Ji, H.; Zhan, Y.; Li, H. Prediction of air particulate matter in Beijing, China, based on the improved particle swarm optimization algorithm and long short-term memory neural network. J. Intell. Fuzzy Syst. 2021, 41, 1869–1885. [Google Scholar]
  28. Li, J.; Chen, J.; Shi, J. Evaluation of new sparrow search algorithms with sequential fusion of improvement strategies. Comput. Ind. Eng. 2023, 182, 109425. [Google Scholar] [CrossRef]
  29. Zhao, X.; Chen, Y.; Wei, G.; Pang, L.; Xu, C. A comprehensive compensation method for piezoresistive pressure sensor based on surface fitting and improved grey wolf algorithm. Measurement 2023, 207, 112387. [Google Scholar]
  30. Duan, Y.; Yu, X. A collaboration-based hybrid GWO-SCA optimizer for engineering optimization problems. Expert Syst. Appl. 2023, 213, 119017. [Google Scholar]
  31. Talatahari, S.; Azizi, M. Chaos game optimization: A novel metaheuristic algorithm. Artif. Intell. Rev. 2021, 54, 917–1004. [Google Scholar]
  32. Alam, A.; Muqeem, M. An optimal heart disease prediction using chaos game optimization-based recurrent neural model. Int. J. Inf. Technol. 2024, 16, 3359–3366. [Google Scholar] [CrossRef]
  33. Goodarzimehr, V.; Talatahari, S.; Shojaee, S.; Hamzehei-Javaran, S.; Sareh, P. Structural design with dynamic constraints using weighted chaos game optimization. J. Comput. Des. Eng. 2022, 9, 2271–2296. [Google Scholar] [CrossRef]
  34. Shaheen, M.A.M.; Hasanien, H.M.; Mekhamer, S.F.; Talaat, H.E.A. A chaos game optimization algorithm-based optimal control strategy for performance enhancement of offshore wind farms. Renew. Energy Focus 2024, 49, 100578. [Google Scholar] [CrossRef]
  35. Özbay, F.A. A modified seahorse optimization algorithm based on chaotic maps for solving global optimization and engineering problems. Eng. Sci. Technol. Int. J. 2023, 41, 101408. [Google Scholar] [CrossRef]
  36. Hsieh, C.; Zhang, Q.; Xu, Y.; Wang, Z. CMAIS-WOA: An improved WOA with chaotic mapping and adaptive iterative strategy. Discret. Dyn. Nat. Soc. 2023, 2023, 8160121. [Google Scholar] [CrossRef]
  37. Hu, Z.; Yu, X. Reinforcement learning-based comprehensive learning grey wolf optimizer for feature selection. Appl. Soft Comput. 2023, 149, 110959. [Google Scholar] [CrossRef]
  38. Gao, X.; Zhou, Y.; Xu, L.; Zhao, D. Optimal Security Protection Strategy Selection Model Based on Q-Learning Particle Swarm Optimization. Entropy 2022, 24, 1727. [Google Scholar] [CrossRef]
  39. Lian, J.; Hui, G. Human evolutionary optimization algorithm. Expert Syst. Appl. 2024, 241, 122638. [Google Scholar]
  40. Cheng, M.; Zhang, Q.; Cao, Y. An early warning model for turbine intermediate-stage flux failure based on an improved HEOA algorithm optimizing DMSE-GRU model. Energies 2024, 17, 3629. [Google Scholar] [CrossRef]
  41. Song, Q.; Zou, J.; Xu, M.; Xi, M.; Zhou, Z. Air quality prediction for Chengdu based on long short-term memory neural network with improved jellyfish search optimizer. Environ. Sci. Pollut. Res. 2023, 30, 64416–64442. [Google Scholar] [CrossRef]
  42. Chang, Y.; Chiao, H.; Abimannan, S.; Huang, Y.P.; Tsai, Y.T.; Lin, K. An LSTM-based aggregated model for air pollution forecasting. Atmos. Pollut. Res. 2020, 11, 1451–1463. [Google Scholar] [CrossRef]
  43. Wang, J.; Li, J.; Wang, X.; Wang, J.; Huang, M. Air quality prediction using CT-LSTM. Neural Comput. Appl. 2021, 33, 4779–4792. [Google Scholar] [CrossRef]
  44. Zhao, S.; Zhang, T.; Ma, S.; Wang, M. Sea-horse optimizer: A novel nature-inspired meta-heuristic for global optimization problems. Appl. Intell. 2023, 53, 11833–11860. [Google Scholar] [CrossRef]
  45. Trojovskỳ, P.; Dehghani, M. Subtraction-average-based optimizer: A new swarm-inspired metaheuristic algorithm for solving optimization problems. Biomimetics 2023, 8, 149. [Google Scholar] [CrossRef] [PubMed]
  46. Wang, L.; Cao, Q.; Zhang, Z.; Mirjalili, S.; Zhao, W. Artificial rabbits optimization: A new bio-inspired meta-heuristic algorithm for solving engineering optimization problems. Eng. Appl. Artif. Intell. 2022, 114, 105082. [Google Scholar] [CrossRef]
  47. Abualigah, L.; Yousri, D.; Elaziz, M.A.; Ewees, A.A.; Al-Qaness, M.A.; Gandomi, A.H. Aquila optimizer: A novel meta-heuristic optimization algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
  48. Abdullah, D.M.; Abdulazeez, A.M. Machine learning applications based on SVM classification: A review. Qubahan Acad. J. 2021, 1, 81–90. [Google Scholar] [CrossRef]
  49. Guo, Y.; Yang, D.; Zhang, Y.; Wang, L.; Wang, K. Online estimation of SOH for lithium-ion battery based on SSA-Elman neural network. Prot. Control Mod. Power Syst. 2022, 7, 40. [Google Scholar] [CrossRef]
  50. Yan, R.; Liao, J.; Yang, J.; Sun, W.; Nong, M.; Li, F. Multi-hour and multi-site air quality index forecasting in Beijing using CNN, LSTM, CNN-LSTM, and spatiotemporal clustering. Expert Syst. Appl. 2021, 169, 114513. [Google Scholar] [CrossRef]
  51. Huang, Y.; Xiang, Y.; Zhao, R.; Cheng, Z. Air quality prediction using improved PSO-BP neural network. IEEE Access 2020, 8, 99346–99353. [Google Scholar] [CrossRef]
  52. Du, J. Mechanism analysis and self-adaptive RBFNN based hybrid soft sensor model in energy production process: A case study. Sensors 2022, 22, 1333. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Diagram of the Q-learning structure.
Figure 2. Diagram of the LSTM network structure.
Figure 3. Convergence of the ICGO algorithm and other algorithms on functions.
Figure 4. Flowchart of the proposed ICGO-LSTM model.
Figure 5. Comparison of evaluation indicators between ICGO-LSTM and other algorithms.
Figure 6. Convergence curve of each algorithm.
Figure 7. AQI prediction results of the ICGO-LSTM and each of the comparison algorithms.
Figure 8. Taylor plots of the predicted AQI values of the ICGO-LSTM model and the comparison algorithms.
Figure 9. Comparison of evaluation results of ICGO-LSTM and the other machine learning models.
Figure 10. AQI prediction results of the proposed ICGO-LSTM model and other machine learning models.
Figure 11. Taylor plot of AQI prediction values for the ICGO-LSTM model and other machine learning models.
Table 1. Twenty-three functions.

Function | Range | Dim | f_min
$F_1(Y) = \sum_{i=1}^{n} y_i^2$ | [−100, 100] | 30 | 0
$F_2(Y) = \sum_{i=1}^{n} |y_i| + \prod_{i=1}^{n} |y_i|$ | [−10, 10] | 30 | 0
$F_3(Y) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} y_j \right)^2$ | [−100, 100] | 30 | 0
$F_4(Y) = \max_i \{ |y_i|, 1 \le i \le n \}$ | [−100, 100] | 30 | 0
$F_5(Y) = \sum_{i=1}^{n-1} [100 (y_{i+1} - y_i^2)^2 + (y_i - 1)^2]$ | [−30, 30] | 30 | 0
$F_6(Y) = \sum_{i=1}^{n} ([y_i + 0.5])^2$ | [−100, 100] | 30 | 0
$F_7(Y) = \sum_{i=1}^{n} i y_i^4 + random[0, 1)$ | [−1.28, 1.28] | 30 | 0
$F_8(Y) = \sum_{i=1}^{n} -y_i \sin(\sqrt{|y_i|})$ | [−500, 500] | 30 | −12,569.5
$F_9(Y) = \sum_{i=1}^{n} [y_i^2 - 10 \cos(2\pi y_i) + 10]$ | [−5.12, 5.12] | 30 | 0
$F_{10}(Y) = -20 \exp\left( -0.2 \sqrt{\frac{1}{n} \sum_{i=1}^{n} y_i^2} \right) - \exp\left( \frac{1}{n} \sum_{i=1}^{n} \cos(2\pi y_i) \right) + 20 + e$ | [−32, 32] | 30 | 0
$F_{11}(Y) = \frac{1}{4000} \sum_{i=1}^{n} y_i^2 - \prod_{i=1}^{n} \cos\left( \frac{y_i}{\sqrt{i}} \right) + 1$ | [−600, 600] | 30 | 0
$F_{12}(Y) = \frac{\pi}{n} \left\{ 10 \sin^2(\pi \bar{y}_1) + \sum_{i=1}^{n-1} (\bar{y}_i - 1)^2 [1 + 10 \sin^2(\pi \bar{y}_{i+1})] + (\bar{y}_n - 1)^2 \right\} + \sum_{i=1}^{n} u(y_i, 10, 100, 4)$, where $\bar{y}_i = 1 + \frac{y_i + 1}{4}$ and $u(y_i, a, k, m) = \begin{cases} k (y_i - a)^m, & y_i > a \\ 0, & -a \le y_i \le a \\ k (-y_i - a)^m, & y_i < -a \end{cases}$ | [−50, 50] | 30 | 0
$F_{13}(Y) = 0.1 \left\{ \sin^2(3\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 [1 + \sin^2(3\pi y_{i+1})] + (y_n - 1)^2 [1 + \sin^2(2\pi y_n)] \right\} + \sum_{i=1}^{n} u(y_i, 5, 100, 4)$ | [−50, 50] | 30 | 0
$F_{14}(Y) = \left( \frac{1}{500} + \sum_{j=1}^{25} \frac{1}{j + \sum_{i=1}^{2} (y_i - a_{ij})^6} \right)^{-1}$ | [−65, 65] | 2 | 1
$F_{15}(Y) = \sum_{i=1}^{11} \left[ a_i - \frac{y_1 (b_i^2 + b_i y_2)}{b_i^2 + b_i y_3 + y_4} \right]^2$ | [−5, 5] | 4 | 0.000308
$F_{16}(Y) = 4 y_1^2 - 2.1 y_1^4 + \frac{1}{3} y_1^6 + y_1 y_2 - 4 y_2^2 + 4 y_2^4$ | [−5, 5] | 2 | −1.0316
$F_{17}(Y) = \left( y_2 - \frac{5.1}{4\pi^2} y_1^2 + \frac{5}{\pi} y_1 - 6 \right)^2 + 10 \left( 1 - \frac{1}{8\pi} \right) \cos y_1 + 10$ | [−5, 10], [0, 15] | 2 | 0.398
$F_{18}(Y) = [1 + (y_1 + y_2 + 1)^2 (19 - 14 y_1 + 3 y_1^2 - 14 y_2 + 6 y_1 y_2 + 3 y_2^2)] \times [30 + (2 y_1 - 3 y_2)^2 (18 - 32 y_1 + 12 y_1^2 + 48 y_2 - 36 y_1 y_2 + 27 y_2^2)]$ | [−2, 2] | 2 | 3
$F_{19}(Y) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{3} a_{ij} (y_j - p_{ij})^2 \right)$ | [0, 1] | 3 | −3.86
$F_{20}(Y) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{6} a_{ij} (y_j - p_{ij})^2 \right)$ | [0, 1] | 6 | −3.32
$F_{21}(Y) = -\sum_{i=1}^{5} [(Y - a_i)(Y - a_i)^T + c_i]^{-1}$ | [0, 10] | 4 | −10.15
$F_{22}(Y) = -\sum_{i=1}^{7} [(Y - a_i)(Y - a_i)^T + c_i]^{-1}$ | [0, 10] | 4 | −10.4
$F_{23}(Y) = -\sum_{i=1}^{10} [(Y - a_i)(Y - a_i)^T + c_i]^{-1}$ | [0, 10] | 4 | −10.536
Table 2. Parameter settings.

Algorithm | Parameters | Value
CGO | β, γ | A random integer of 0 or 1
CGO | R | [0, 1]
SHO | r1 | 0
SHO | r2 | 0.1
GWO | alpha | Linear reduction from 2 to 0
SABO | v | [1, 2]
WOA | r2 | 0.1
WOA | r, l | [0, 1], [−1, 1]
ARO | r1, r2, r3 | [0, 1]
AO | r1 | [1, 20]
AO | v | 0.0265
AO | w | 0.005
Table 3. Results of ICGO and each algorithm on test functions.
Table 3. Results of ICGO and each algorithm on test functions.
Function | ICGO | CGO | SHO | GWO | SABO | WOA | ARO | AO | Index
F1 | 0.00e+00 | 8.15e-25 | 1.49e-141 | 9.45e-28 | 3.10e-196 | 2.08e-73 | 3.40e-57 | 3.66e-122 | avg
F1 | 0.00e+00 | 1.53e-24 | 4.83e-141 | 1.49e-27 | 0.00e+00 | 5.53e-73 | 1.04e-56 | 1.16e-121 | std
F2 | 0.00e+00 | 1.81e-13 | 1.17e-78 | 8.05e-17 | 5.86e-111 | 1.31e-47 | 1.12e-31 | 2.37e-59 | avg
F2 | 0.00e+00 | 2.21e-13 | 1.83e-78 | 8.54e-17 | 1.80e-110 | 3.53e-47 | 3.54e-31 | 7.50e-59 | std
F3 | 0.00e+00 | 1.18e-17 | 5.26e-99 | 1.50e-05 | 2.06e-25 | 4.24e+04 | 7.33e-39 | 3.65e-101 | avg
F3 | 0.00e+00 | 3.54e-17 | 9.17e-98 | 3.02e-05 | 9.21e-25 | 1.51e+04 | 2.32e-38 | 1.15e-100 | std
F4 | 0.00e+00 | 1.18e-10 | 2.91e-56 | 5.64e-07 | 1.34e-77 | 4.41e+01 | 8.55e-25 | 7.45e-57 | avg
F4 | 0.00e+00 | 1.59e-10 | 1.18e-55 | 7.70e-07 | 1.69e-77 | 3.19e+01 | 1.27e-24 | 2.35e-56 | std
F5 | 1.77e-08 | 2.53e+01 | 2.81e+01 | 2.71e+01 | 2.84e+01 | 2.78e+01 | 3.17e+00 | 4.66e-03 | avg
F5 | 4.95e-08 | 5.74e-01 | 2.89e-01 | 2.85e-01 | 2.88e-01 | 4.22e-01 | 8.13e+00 | 5.22e-03 | std
F6 | 7.43e-12 | 3.73e-11 | 3.17e+00 | 8.81e-01 | 2.73e+00 | 3.49e-01 | 1.35e-03 | 2.82e-04 | avg
F6 | 3.57e-11 | 2.00e-01 | 5.53e-01 | 3.67e-01 | 5.23e-01 | 1.84e-01 | 7.09e-04 | 3.07e-04 | std
F7 | 3.21e-07 | 1.41e-03 | 8.30e-05 | 6.39e-04 | 6.43e-06 | 2.93e-03 | 8.73e-04 | 1.06e-04 | avg
F7 | 5.39e-07 | 6.82e-04 | 8.94e-05 | 1.15e-03 | 8.79e-05 | 3.46e-03 | 5.29e-04 | 9.41e-05 | std
F8 | -1.08e+04 | -7.98e+03 | -6.02e+03 | -5.84e+03 | -3.18e+03 | -1.01e+04 | -9.21e+03 | -9.04e+03 | avg
F8 | 1.07e+03 | 4.37e+02 | 6.37e+02 | 9.33e+02 | 3.45e+02 | 1.93e+03 | 3.87e+02 | 3.90e+03 | std
F9 | 0.00e+00 | 0.00e+00 | 0.00e+00 | 3.82e+00 | 0.00e+00 | 5.68e-15 | 0.00e+00 | 0.00e+00 | avg
F9 | 0.00e+00 | 0.00e+00 | 0.00e+00 | 4.22e+00 | 0.00e+00 | 2.54e-14 | 0.00e+00 | 0.00e+00 | std
F10 | 4.44e-16 | 1.96e-13 | 4.00e-15 | 9.92e-14 | 4.00e-15 | 3.46e-15 | 4.44e-16 | 4.44e-16 | avg
F10 | 0.00e+00 | 3.58e-13 | 0.00e+00 | 1.73e-14 | 0.00e+00 | 2.38e-15 | 0.00e+00 | 0.00e+00 | std
F11 | 0.00e+00 | 0.00e+00 | 0.00e+00 | 7.06e-03 | 0.00e+00 | 1.18e-02 | 0.00e+00 | 0.00e+00 | avg
F11 | 0.00e+00 | 0.00e+00 | 0.00e+00 | 1.09e-02 | 0.00e+00 | 5.29e-02 | 0.00e+00 | 0.00e+00 | std
F12 | 2.08e-13 | 7.40e-03 | 9.52e-02 | 3.84e-02 | 2.19e-01 | 1.13e-01 | 8.94e-05 | 1.15e-06 | avg
F12 | 6.84e-13 | 2.63e-02 | 7.79e-02 | 1.58e-02 | 8.59e-02 | 3.80e-01 | 6.10e-05 | 1.03e-06 | std
F13 | 7.95e-11 | 2.36e-01 | 1.95e+00 | 5.56e-01 | 2.75e+00 | 5.51e-01 | 2.83e-03 | 6.62e-05 | avg
F13 | 3.05e-10 | 5.46e-01 | 3.27e-01 | 2.26e-01 | 5.49e-01 | 2.75e-01 | 4.70e-03 | 9.29e-05 | std
F14 | 9.98e-01 | 2.24e+00 | 5.11e+00 | 4.98e+00 | 3.23e+00 | 2.61e+00 | 9.98e-01 | 1.89e+00 | avg
F14 | 0.00e+00 | 2.54e+00 | 4.29e+00 | 4.53e+00 | 1.80e+00 | 2.39e+00 | 0.00e+00 | 8.69e-01 | std
F15 | 3.07e-04 | 5.69e-03 | 5.09e-04 | 4.37e-03 | 1.27e-03 | 7.70e-04 | 4.49e-04 | 4.65e-04 | avg
F15 | 1.94e-19 | 9.00e-03 | 3.57e-04 | 8.21e-03 | 2.83e-03 | 5.11e-04 | 2.97e-04 | 9.08e-05 | std
F16 | -1.03e+00 | -1.03e+00 | -1.03e+00 | -1.03e+00 | -1.02e+00 | -1.03e+00 | -1.03e+00 | -1.03e+00 | avg
F16 | 5.78e-16 | 6.25e-16 | 7.33e-09 | 1.59e-08 | 1.94e-02 | 1.09e-09 | 1.28e-16 | 7.25e-04 | std
F17 | 3.98e-01 | 3.98e-01 | 3.98e-01 | 3.98e-01 | 5.21e-01 | 3.98e-01 | 3.98e-01 | 3.98e-01 | avg
F17 | 0.00e+00 | 0.00e+00 | 1.28e-03 | 1.50e-06 | 2.09e-01 | 3.67e-05 | 0.00e+00 | 2.47e-04 | std
F18 | 3.00e+00 | 3.00e+00 | 3.00e+00 | 3.00e+00 | 4.17e+00 | 3.00e+00 | 0.00e+00 | 3.03e+00 | avg
F18 | 8.25e-16 | 4.73e-15 | 5.16e-09 | 2.35e-05 | 2.15e+00 | 2.66e-04 | 3.31e-16 | 3.86e-02 | std
F19 | -3.86e+00 | -3.86e+00 | -3.86e+00 | -3.86e+00 | -3.65e+00 | -3.86e+00 | -3.86e+00 | -3.86e+00 | avg
F19 | 2.51e-15 | 2.58e-15 | 3.80e-03 | 2.19e-03 | 1.86e-01 | 5.09e-03 | 9.36e-16 | 6.41e-03 | std
F20 | -3.32e+00 | -3.25e+00 | -3.03e+00 | -3.26e+00 | -3.26e+00 | -3.20e+00 | -3.25e+00 | -3.20e+00 | avg
F20 | 1.36e-02 | 3.92e-02 | 2.04e-01 | 7.74e-02 | 8.00e-02 | 6.07e-02 | 6.14e-02 | 2.00e-01 | std
F21 | -1.15e+01 | -7.38e+00 | -5.85e+00 | -9.65e+00 | -5.04e+00 | -8.74e+00 | -9.64e+00 | -1.01e+01 | avg
F21 | 7.17e-15 | 3.12e+00 | 2.47e+00 | 1.56e+00 | 1.64e-01 | 2.52e+00 | 1.61e+00 | 9.11e-02 | std
F22 | -1.04e+01 | -7.59e+00 | -5.32e+00 | -1.00e+01 | -4.94e+00 | -6.57e+00 | -9.20e+00 | -1.04e+01 | avg
F22 | 1.78e-15 | 3.36e+00 | 1.94e+00 | 1.71e+00 | 3.72e-01 | 2.97e+00 | 2.55e+00 | 9.13e-03 | std
F23 | -1.05e+01 | -8.39e+00 | -5.62e+00 | -1.01e+01 | -4.69e+00 | -6.93e+00 | -9.33e+00 | -1.05e+01 | avg
F23 | 7.38e-16 | 2.93e+00 | 1.88e+00 | 1.81e+00 | 1.25e+00 | 3.28e+00 | 2.57e+00 | 1.47e-02 | std
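The avg/std entries are per-function statistics over repeated independent runs of each optimizer. A minimal sketch of such a harness is below; the run count of 30 is our assumption (a common convention in such comparisons), and `optimizer` stands for any of the compared algorithms returning the best fitness found in one run — this is an illustration, not the authors' code.

```python
import numpy as np

# Hypothetical evaluation harness behind Table 3: run an optimizer
# n_runs times on one benchmark function and report the mean ("avg")
# and standard deviation ("std") of the best fitness values.
# n_runs = 30 is an assumption, not a quote from the paper.
def summarize(optimizer, func, n_runs=30):
    best = np.array([optimizer(func) for _ in range(n_runs)])
    return best.mean(), best.std()
```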
Table 4. Wilcoxon test results of the ICGO algorithm and each comparison algorithm on 23 functions.
Function | ICGO vs. CGO | ICGO vs. SHO | ICGO vs. GWO | ICGO vs. SABO | ICGO vs. WOA | ICGO vs. ARO | ICGO vs. AO | Index
F1 | 6.25e-10 | 1.21e-10 | 1.21e-11 | 1.01e-12 | 1.21e-12 | 2.21e-11 | 2.01e-12 | p
F2 | 4.57e-12 | 1.01e-10 | 1.61e-12 | 1.16e-12 | 1.21e-12 | 2.11e-10 | 1.81e-11 | p
F3 | 1.21e-12 | 1.51e-12 | 1.11e-12 | 1.01e-10 | 1.51e-10 | 1.27e-11 | 1.27e-10 | p
F4 | 1.93e-10 | 1.71e-10 | 1.41e-12 | 1.27e-12 | 2.21e-10 | 1.61e-09 | 3.21e-09 | p
F5 | 3.02e-11 | 1.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | p
F6 | 3.02e-11 | 3.18e-11 | 3.09e-10 | 3.12e-09 | 2.02e-11 | 1.32e-12 | 2.02e-10 | p
F7 | 3.34e-11 | 1.49e-06 | 3.12e-08 | 2.44e-09 | 3.02e-11 | 3.69e-11 | 3.59e-05 | p
F8 | 2.28e-05 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 7.28e-03 | 7.28e-03 | 7.28e-03 | p
F9 | NaN | NaN | 1.20e-12 | NaN | 1.61e-02 | NaN | NaN | p
F10 | 5.59e-05 | 1.17e-13 | 1.12e-12 | 1.69e-14 | 7.75e-10 | NaN | NaN | p
F11 | NaN | NaN | 2.15e-02 | NaN | 3.34e-03 | NaN | NaN | p
F12 | 2.02e-08 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 2.02e-12 | 1.02e-11 | p
F13 | 1.07e-07 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 4.02e-09 | 2.02e-12 | p
F14 | 5.32e-10 | 4.71e-03 | 1.00e-02 | 1.29e-02 | 2.02e-02 | 1.49e-08 | 3.71e-02 | p
F15 | 2.90e-05 | 3.46e-10 | 1.61e-10 | 3.46e-10 | 3.46e-10 | 4.37e-10 | 5.56e-10 | p
F16 | 6.72e-09 | 5.36e-15 | 5.36e-15 | 5.36e-15 | 5.36e-15 | 2.23e-02 | 5.36e-15 | p
F17 | 3.30e-02 | 2.80e-16 | 2.80e-16 | 2.80e-16 | 2.80e-16 | 3.30e-02 | 2.80e-16 | p
F18 | 1.80e-15 | 1.43e-14 | 1.43e-14 | 1.43e-14 | 1.43e-14 | 1.22e-15 | 1.43e-14 | p
F19 | 2.23e-08 | 1.91e-13 | 2.26e-10 | 2.40e-14 | 1.68e-11 | 6.00e-05 | 2.19e-11 | p
F20 | 2.09e-04 | 3.87e-11 | 2.02e-04 | 1.44e-02 | 1.73e-04 | 4.32e-02 | 1.15e-06 | p
F21 | 3.44e-05 | 1.53e-11 | 1.53e-11 | 1.53e-11 | 1.13e-11 | 3.18e-04 | 1.53e-12 | p
F22 | 2.11e-04 | 1.60e-11 | 1.60e-11 | 1.60e-11 | 1.60e-08 | 2.39e-02 | 2.60e-09 | p
F23 | 1.69e-07 | 1.74e-11 | 1.94e-11 | 1.31e-11 | 1.24e-09 | 1.57e-02 | 1.74e-11 | p
Total | 21/2/0 | 21/2/0 | 23/0/0 | 21/2/0 | 23/0/0 | 20/3/0 | 20/3/0 | +/=/−
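In the Total row, "+" counts functions on which ICGO is significantly better at the 0.05 level, "=" those where the difference is not significant, and "−" those where it is significantly worse; the NaN entries arise where both algorithms return identical constant samples (e.g., all-zero results on F9 and F11), for which the test is degenerate. Below is a hedged sketch of one pairwise comparison using SciPy's rank-sum test — our illustration of the procedure, not the authors' code.

```python
import numpy as np
from scipy.stats import ranksums

# One pairwise comparison behind Table 4: a two-sided Wilcoxon
# rank-sum test on the per-run best values of two algorithms, plus
# the +/=/- verdict at the 5% significance level.
def compare(icgo_runs, rival_runs, alpha=0.05):
    icgo_runs = np.asarray(icgo_runs, dtype=float)
    rival_runs = np.asarray(rival_runs, dtype=float)
    _, p = ranksums(icgo_runs, rival_runs)
    if p >= alpha:
        return p, "="  # statistically equivalent
    # Minimization: the lower mean best value counts as the win.
    return p, "+" if icgo_runs.mean() < rival_runs.mean() else "-"
```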
Table 5. Evaluation results of the ICGO-LSTM and other algorithms.
Model | MAE | MAPE | RMSE | R2 | Running Time (s)
ICGO-LSTM | 3.2865 | 0.72% | 4.8089 | 0.98512 | 880
CGO-LSTM | 9.3641 | 2.06% | 10.9916 | 0.96892 | 1040
SHO-LSTM | 21.8496 | 6.74% | 23.7297 | 0.90121 | 186
GWO-LSTM | 12.7431 | 5.50% | 14.7008 | 0.95667 | 925
SABO-LSTM | 14.2047 | 5.77% | 16.1304 | 0.95355 | 1185
WOA-LSTM | 12.7802 | 6.24% | 15.7324 | 0.95003 | 1021
ARO-LSTM | 5.8866 | 5.89% | 8.9888 | 0.94153 | 972
AO-LSTM | 13.3549 | 1.97% | 17.0635 | 0.89206 | 1208
Table 6. Evaluation results of ICGO-LSTM and other machine learning models.
Model | MAE | MAPE | RMSE | R2 | Running Time (s)
ICGO-LSTM | 3.2865 | 0.72% | 4.8089 | 0.98512 | 880
LSTM | 7.8077 | 1.07% | 13.3791 | 0.90864 | 23
SVM | 4.5021 | 1.84% | 7.2993 | 0.95781 | 10
Elman | 8.7056 | 1.70% | 12.4156 | 0.87793 | 19
CNN | 6.1785 | 1.91% | 8.8223 | 0.93836 | 16
BP | 4.2732 | 1.27% | 7.4778 | 0.95572 | 20
RBFNN | 4.3583 | 1.08% | 8.7939 | 0.93876 | 18
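The four accuracy metrics in Tables 5 and 6 follow their standard definitions; a minimal NumPy sketch is given below, with placeholder arrays rather than the paper's Chengdu AQI data.

```python
import numpy as np

# Standard definitions of the Table 5/6 metrics; y_true and y_pred
# are placeholders, not the paper's Chengdu AQI series.
def evaluate(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae = np.mean(np.abs(err))
    mape = 100.0 * np.mean(np.abs(err / y_true))  # reported in %
    rmse = np.sqrt(np.mean(err ** 2))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    return mae, mape, rmse, r2
```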
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
