Article

A Novel BiGRU-Attention Model for Predicting Corn Market Prices Based on Multi-Feature Fusion and Grey Wolf Optimization

1 College of Information Engineering, Sichuan Agricultural University, Chengdu 611130, China
2 College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China
* Author to whom correspondence should be addressed.
Agriculture 2025, 15(5), 469; https://doi.org/10.3390/agriculture15050469
Submission received: 16 January 2025 / Revised: 11 February 2025 / Accepted: 14 February 2025 / Published: 21 February 2025
(This article belongs to the Section Agricultural Economics, Policies and Rural Management)

Abstract

Accurately predicting corn market prices is crucial for ensuring corn production, enhancing farmers’ income, and maintaining the stability of the grain market. However, corn price fluctuations are influenced by various factors, exhibiting non-stationarity, nonlinearity, and high volatility, making prediction challenging. Therefore, this paper proposes a comprehensive, efficient, and accurate method for predicting corn prices. First, in the data processing phase, the seasonal and trend decomposition using LOESS (STL) algorithm was used to extract the trend, seasonality, and residual components of corn prices, combined with the GARCH-in-mean (GARCH-M) model to delve into the volatility clustering characteristics. Next, the kernel principal component analysis (KPCA) was employed for nonlinear dimensionality reduction to extract key information and accelerate model convergence. Finally, a BiGRU-Attention model, optimized by the grey wolf optimizer (GWO), was constructed to predict corn market prices accurately. The effectiveness of the proposed model was assessed through cross-sectional and longitudinal validation experiments. The empirical results indicated that the proposed STLG-KPCA-GWO-BiGRU-Attention (SGKGBA) model exhibited significant advantages in terms of MAE (0.0159), RMSE (0.0215), MAPE (0.5544%), and R2 (0.9815). This model effectively captures price fluctuation features, significantly enhances prediction accuracy, and offers reliable trend forecasts for decision makers regarding corn market prices.

1. Introduction

Securing a stable supply of food and agricultural products is vital for national and public well-being, and price volatility significantly influences both macroeconomic stability and daily life [1,2]. Corn, the most widely planted food crop in China [3], plays a particularly critical strategic role: it occupies about 35% of the total area sown to food crops, and its yield accounts for 40% of total annual grain production. Corn is not only a crucial source of staple food but also a major provider of feed for the livestock sector and of raw materials for industry, making it indispensable to the national food security system [4].
The key to implementing a food security strategy lies in establishing an effective price regulation mechanism that keeps the food market operating stably [5]. Corn price fluctuations have profound impacts on agricultural production and farmers’ livelihoods. When prices decline, reduced earnings directly undermine farmers’ motivation to produce, potentially leading to a contraction in agricultural production, which in turn threatens national food security. When prices rise, production costs for livestock farming and industry increase, affecting the stable development of the related industrial chains. Therefore, constructing a scientifically sound corn price prediction and regulation system is not only an important measure to ensure food market stability but also a critical step to protect farmers’ interests and promote sustainable agricultural development.
In recent years, influenced by changes in international relations, frequent extreme weather events, energy price fluctuations, and other uncertain factors, the price fluctuations in China’s corn market have intensified. The volatility characteristics exhibit significant nonlinearity, non-stationarity, and volatility clustering [6]. This complex market environment makes accurately predicting the price of China’s corn market a challenge. There is an urgent need to establish a new forecasting method to accurately capture the price fluctuation patterns [7].
In previous studies on forecasting methods, traditional forecasting models offer strong interpretability for problems and predict variables based on mathematical statistics [8]. Representative models include seasonal autoregressive integrated moving average (SARIMA) [9] and exponential smoothing (ES) [10]. SARIMA, by combining autoregressive, differencing, and moving average components, can effectively capture the trend in time series data. These univariate statistical models are simple to implement, but they may result in uncertain predictions when handling complex nonlinear problems. In recent years, multivariate extension models, such as vector autoregression (VAR) [11] and Bayesian structural time series (BSTS) [12] have emerged, which significantly improve the robustness of forecasts by incorporating multivariate analysis and prior distributions. Meanwhile, the development of machine learning models addresses the limitations of traditional statistical methods in handling nonlinear problems [13]. Representative models include regression techniques [14] and support vector regression (SVR) [15]. Machine learning models have the advantage of automatically learning patterns from data, but their ability to handle complex, nonlinear problems is limited, and they suffer from slow convergence and local optima issues. Recent machine learning research has focused on ensemble learning (such as XGBoost, LightGBM) [16] and transfer learning [17]. In recent years, deep learning models have been widely applied. For example, Murugesan et al. [18] applied five LSTM techniques and achieved good prediction results for five different commodities. Cheung et al. [19] proposed a 3D-CNN model for predicting corn futures prices, which further improved prediction accuracy by integrating the time series variables of different frequencies. Recent advancements in deep learning include transformer models [20], generative adversarial networks (GAN) [21], and graph neural networks (GNN) [22], which significantly enhance the predictive ability for complex nonlinear problems through self-attention mechanisms, data generation, and relational modeling.
With the development of model forecasting methods, ensemble models have emerged as a more scientific approach to predictions [23]. Ensemble models combine multiple models to predict variables. For example, Ray et al. [24] proposed the RF-ARIMA-LSTM hybrid model, which combines the advantages of random forest, ARIMA, and LSTM, effectively capturing both linear and nonlinear features. However, such simple sequential combination models suffer from error accumulation, and their effectiveness remains limited when dealing with multivariate, non-stationary time series.
In summary, for multivariate forecasting, deep learning models handle complex nonlinear problems more effectively than statistical methods and machine learning models. Combination models can integrate the strengths of different models to improve prediction accuracy, but they suffer from error accumulation. Therefore, to overcome these limitations and simultaneously address the significant nonlinearity, non-stationarity, and volatility clustering of corn price data, this study proposes a multivariate ensemble prediction model, STLG-KPCA-GWO-BiGRU-Attention, built on the deep learning model BiGRU and suited to forecasting China’s corn market prices. Building on the multivariate analysis, the model enriches the input features by extracting the inherent volatility characteristics of the corn data with GARCH-M and STL. KPCA is then used for feature dimensionality reduction, and the resulting features are input into the BiGRU-Attention model for prediction. BiGRU enhances the model’s memory capacity by combining forward and backward GRUs, allowing it to learn the dynamic changes in the data from both directions. The attention mechanism, connected after the BiGRU layer, enables the model to focus on the most important parts of the sequence and learn more complex and abstract feature representations. In addition, the GWO algorithm is used to optimize the model’s hyperparameters, further improving prediction accuracy. The main contributions of this paper are:
(1)
Comprehensive integration of multi-dimensional features. A systematic review of various external factors affecting corn market prices was conducted, including planting costs, port inventory, feed farming, deep processing enterprises, corn substitutes, freight, and economic indicators. These multi-dimensional features provide more comprehensive information input for the model.
(2)
In-depth mining of the complex price series volatility characteristics. The STL algorithm is used to extract the trend, seasonality, and residual components of the corn price series, and the GARCH-M model is combined to uncover the volatility clustering characteristics of the prices, integrating the inherent volatility patterns of the time series with external factors, significantly enhancing the model’s predictive ability.
(3)
Nonlinear dimensionality reduction and optimization of the model. KPCA was employed to reduce the dimensionality of the high-dimensional feature set, and a BiGRU-Attention model optimized by GWO was subsequently built. The model was optimized by automatically searching for key parameters, achieving high-precision predictions for corn market prices.
The remainder of this paper is organized as follows: Section 2 covers the materials and methods; Section 3 provides the analysis and discussion of the experimental results; Section 4 presents the conclusions and perspectives on future work.

2. Materials and Methods

2.1. Factors Influencing Corn Market Price Fluctuations

Exploring the factors that influence corn prices provides the input data for the model, starting from the basic supply–demand relationship. On the supply side, production costs and inventory are analyzed; on the demand side, the status of livestock farming, feed processing, and deep processing enterprises is considered. The price of corn, as a market product, fluctuates dynamically with changes in market supply and demand [25], and supply–demand relations are the main driver of corn price fluctuations. Production, consumption, and inventory are the key supply–demand factors behind corn price movements [26].
On the supply side, the rising costs of corn planting and subsequent processing directly affect corn prices. The cost of corn cultivation primarily consists of fertilizers, a critical factor in enhancing grain production [27]. Fertilizer prices have a direct impact on production costs. In recent years, these costs have been rising, leading to increased corn planting costs, which in turn raise corn prices.
On the demand side, corn is a widely used bulk agricultural product that serves as food, animal feed, and industrial raw material [28]. Feed consumption, industrial processing, and export conditions are the major factors affecting the demand for corn. The main domestic uses of corn include food, feed, industry, and seed, with feed accounting for the largest share, about 60%, and pig farming taking the largest share of feed consumption. With the improvement in living standards, the demand for high-protein foods, such as meat and dairy products, is also rising, and the production of these foods relies heavily on corn feed [29]. Transportation costs also form part of the final corn price [30].

2.2. GARCH-M

GARCH-M (GARCH-in-mean) extends the traditional generalized autoregressive conditional heteroscedasticity (GARCH) model by allowing the conditional variance to enter the mean equation [31]. The standard GARCH-M model is specified as follows:
Conditional mean equation:
y_t = \gamma + \lambda \sigma_t^2 + \epsilon_t
Variance equation:
\sigma_t^2 = \omega + \alpha_1 \epsilon_{t-1}^2 + \cdots + \alpha_p \epsilon_{t-p}^2 + \beta_1 \sigma_{t-1}^2 + \cdots + \beta_q \sigma_{t-q}^2
y_t represents the return of corn market prices, and σ_t^2 is the conditional variance.
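For illustration, the short Python sketch below implements the GARCH(1,1)-M recursion defined by the two equations above. It is a minimal sketch, not the estimation procedure used in this study: the residual series eps and the parameter values are placeholders supplied by the user.

import numpy as np

def garch_m_filter(eps, omega, alpha, beta, gamma, lam):
    # GARCH(1,1)-M filter: given residuals eps and fixed parameters, return the
    # conditional variance path sigma2 and the implied conditional mean gamma + lam * sigma2.
    T = len(eps)
    sigma2 = np.empty(T)
    sigma2[0] = np.var(eps)                                  # a common initialization choice
    for t in range(1, T):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    mean = gamma + lam * sigma2                              # the "in-mean" term lambda * sigma_t^2
    return sigma2, mean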

2.3. STL

STL decomposes a time series into three components—trend, seasonal, and residual—effectively highlighting the underlying trends and patterns in the data. This method was proposed by Cleveland in 1990 [32]. STL consists of an inner loop and an outer loop, where the inner loop primarily performs trend fitting and the calculation of the seasonal component. Figure 1 is a flowchart of the inner loop of STL.
Y_t = T_t + S_t + R_t, \quad t = 1, 2, \ldots, N
Y_t represents the observed value at time t, and T_t, S_t, and R_t represent the trend, seasonal, and residual components at time t, respectively.
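For reference, a minimal sketch of this decomposition using the STL implementation in statsmodels is given below; the variable name price is an assumption about how the weekly series is stored, and the parameter values follow Table 6.

from statsmodels.tsa.seasonal import STL

# price: weekly corn market price as a pandas Series (hypothetical variable name)
result = STL(price, period=53, seasonal=71, low_pass=55, robust=True).fit()
trend, seasonal, resid = result.trend, result.seasonal, result.resid   # T_t, S_t, R_t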

2.4. KPCA

KPCA is a nonlinear extension of principal component analysis (PCA). This method was proposed by Schölkopf et al. [33]. The basic idea of KPCA is to use a kernel function to nonlinearly map the original data to a high-dimensional feature space and then perform linear PCA in this high-dimensional space.
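A minimal sketch of this step with scikit-learn’s KernelPCA follows; the feature matrix X is a placeholder, and the kernel and component count shown here anticipate the choices made later in Section 3.7 (sigmoid kernel, eight components).

from sklearn.decomposition import KernelPCA

# X: (n_samples, n_features) matrix of corn price features (hypothetical name)
kpca = KernelPCA(n_components=8, kernel="sigmoid")   # kernel choice follows Section 3.7
Z = kpca.fit_transform(X)                            # dimensionality-reduced features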

2.5. GWO

GWO is a population-based intelligence optimization algorithm inspired by the hunting behavior and social hierarchy of grey wolves, proposed by Mirjalili et al. in 2014. Compared to traditional algorithms, GWO has a simple structure, few adjustable parameters, and strong search capability, making it suitable for multi-objective optimization tasks [34]. The grey wolf pack has a clear hierarchy, ranked from highest to lowest as the alpha, beta, delta, and omega wolves. Figure 2 shows the algorithm flow.
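The sketch below is an illustrative Python implementation of the GWO search loop described here (the encircling and position-update equations are given formally in Section 2.7); it is not the exact code used in the study, and the bounds and population settings are placeholders.

import numpy as np

def gwo(fitness, lb, ub, n_wolves=10, n_iter=50, seed=0):
    # Minimal grey wolf optimizer sketch: minimizes `fitness` over the box bounds [lb, ub].
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    X = rng.uniform(lb, ub, size=(n_wolves, dim))            # initial pack positions
    scores = np.array([fitness(x) for x in X])
    for it in range(n_iter):
        order = np.argsort(scores)
        alpha, beta, delta = X[order[0]].copy(), X[order[1]].copy(), X[order[2]].copy()
        a = 2 - 2 * it / n_iter                              # a decreases linearly from 2 to 0
        for i in range(n_wolves):
            step_sum = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2                # coefficient vectors
                D = np.abs(C * leader - X[i])                # distance to the leader
                step_sum += leader - A * D                   # candidate position toward this leader
            X[i] = np.clip(step_sum / 3.0, lb, ub)           # average of the three candidates
            scores[i] = fitness(X[i])
    best = int(np.argmin(scores))
    return X[best], scores[best]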

2.6. BiGRU-Attention

Bidirectional GRU (BiGRU) extracts deep features by processing the time series in both the forward and backward directions, which reduces the negative impact of the input ordering on the final output of a single gated recurrent unit (GRU), thereby improving model accuracy [35]. Figure 3 shows the structure.
In this study, the attention mechanism is used to identify the more critical factors influencing the BiGRU output. The structure is shown in Figure 4, and the formulas are given in Equations (4)–(6).
e_i = v^T \tanh(w_i h_i + b_i)
a_i = \frac{\exp(e_i)}{\sum_{j=1}^{n} \exp(e_j)}
c_i = \sum_{i=1}^{t} a_i h_i
e_i is the attention score computed from the hidden state h_i, v^T and w_i are the network weight matrices, b_i is the bias, a_i is the attention weight for the hidden layer output h_i, and c_i represents the weighted feature.
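A minimal Keras sketch of this architecture is given below, assuming TensorFlow 2; the window length, feature count, and layer sizes are illustrative assumptions (Table 7 lists the baseline hyperparameters actually used in the experiments).

import tensorflow as tf
from tensorflow.keras import layers, models

def build_bigru_attention(timesteps, n_features, units=128):
    inputs = layers.Input(shape=(timesteps, n_features))
    h = layers.Bidirectional(layers.GRU(units, return_sequences=True))(inputs)  # h_i, shape (batch, T, 2*units)
    score = layers.Dense(units, activation="tanh")(h)         # tanh(w_i h_i + b_i), Eq. (4)
    score = layers.Dense(1, use_bias=False)(score)            # e_i = v^T tanh(.)
    a = layers.Softmax(axis=1)(score)                         # attention weights a_i over time steps, Eq. (5)
    context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([h, a])  # weighted feature, Eq. (6)
    output = layers.Dense(1)(context)                         # next-week corn price
    return models.Model(inputs, output)

model = build_bigru_attention(timesteps=8, n_features=12)     # window length and feature count are assumptions
model.compile(optimizer="adam", loss="mse")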

2.7. The Methodology

To effectively capture price volatility characteristics, a hybrid corn price prediction model, SGKGBA, is proposed, which combines the STL algorithm with the GARCH-M model to explore the underlying volatility features of the price. The base layer of the proposed model, BiGRU, enhances the model’s memory capability by combining forward and backward GRUs, allowing it to learn the dynamic changes in the data from both directions. The subsequent attention layer then focuses on the most important parts of the BiGRU output sequence, learning more complex and abstract feature representations. STL and GARCH-M, as important methods for capturing volatility features, uncover latent patterns in the historical data and enrich the feature inputs. KPCA reduces the dimensionality of the data from multiple sources to accelerate model training while retaining most of the important information. Finally, GWO searches for the optimal hyperparameters, under which the model performs best. Figure 5 presents the process.
The workflow is as follows:
(1)
Data preprocessing: The acquired multi-factor data are temporally aligned based on the date dimension of the corn market price series. For missing values, a combined interpolation strategy using the nearest-neighbor interpolation method and cubic spline interpolation is employed.
(2)
Spearman’s rank correlation analysis: Spearman’s correlation analysis is applied to filter the key influencing factors of corn market prices. Before this, the Shapiro–Wilk test is applied to test the normality of the data.
(3)
Extraction of latent volatility information patterns and feature reduction: To allow the model to learn latent information patterns and enhance the input information for better fitting, the GARCH-M and STL algorithms are employed to extract the complex fluctuation characteristics of the corn market price series and integrate them with the external influencing factors, enhancing the model’s information expression, and KPCA is used to reduce the dimensionality of the constructed input feature matrix. In the GARCH-M model, the conditional mean equation for y_t and the variance equation for σ_t^2 are formulated, and the resulting volatility clustering feature is labeled I_t. Next, STL is used to obtain the component information, which is divided into two steps: the inner and outer loops. Assume that T_t^{(k)} and S_t^{(k)} are the trend and seasonal components at the end of the k-th pass of the inner loop, with the initial condition T_t^{(0)} = 0, and that the parameters are n_i, n_o, n_p, n_s, n_l, and n_t.
Step 1: subtract the trend component obtained in the previous pass:
Y_t^{detrend} = Y_t - T_t^{(k)}, \quad T_t^{(0)} = 0
Y_t^{detrend} denotes the detrended corn price, and T_t^{(k)} is the trend value at the k-th iteration.
Step 2: apply LOESS regression (q = n_s, d = 1) to each cyclic subseries, extending it one cycle forward and backward; the smoothed results form the temporary seasonal series, denoted C_t^{(k+1)}.
Step 3: apply moving averages of lengths n_p, n_p, and 3 sequentially to C_t^{(k+1)}, followed by LOESS regression (q = n_l, d = 1), to obtain the sequence L_t^{(k+1)} for t = 1, 2, ..., N; this is equivalent to low-pass filtering of the cyclic subseries.
Step 4: obtain the seasonal component:
S_t^{(k+1)} = C_t^{(k+1)} - L_t^{(k+1)}
Step 5: subtract the seasonal component:
Y_t^{deseason} = Y_t - S_t^{(k+1)}
Step 6: apply LOESS regression (q = n_t, d = 1) to the deseasonalized sequence Y_t^{deseason} from Step 5 to obtain the trend component T_t^{(k+1)}.
The outer loop primarily adjusts the robustness weights, meaning it computes and updates the robustness weight ρ t for each sample point t . If there are outliers in the data sequence, the residuals will be larger.
In Step 2 and Step 6 of the corresponding inner loop, the neighborhood weights are multiplied by ρ_t, which is calculated as follows:
\rho_t = B\left(\frac{|R_t|}{h}\right), \quad h = 6\,\mathrm{median}(|R_t|)
The B function is the bisquare weight function:
B(u) = \begin{cases} (1 - u^2)^2, & 0 \le u \le 1 \\ 0, & u > 1 \end{cases}
After the outer loop ends, the corn price series is ultimately decomposed into T_t, S_t, and R_t, as follows:
R_t^{(k+1)} = Y_t - S_t^{(k+1)} - T_t^{(k+1)}
Step 7: the corn price series components T_t, S_t, R_t and the volatility clustering feature I_t are added to the original corn feature data matrix X = [x_1, x_2, ..., x_n] ∈ R^{d×n}, where d is the data dimension after incorporating the volatility features and n is the number of samples; the data are then centered by computing the mean \mu = \frac{1}{n}\sum_{i=1}^{n} x_i and setting X_c = X - \mu e^T.
Step 8: select an appropriate kernel function κ(x_i, x_j) to map the original corn feature data into a high-dimensional feature space φ(x); the Gaussian kernel function is defined as \kappa(x_i, x_j) = \exp\left(-\frac{\|x_i - x_j\|^2}{\sigma^2}\right).
Step 9: calculate the kernel matrix K_{ij} = κ(x_i, x_j).
Step 10: perform eigenvalue decomposition on the centered kernel matrix K_c = (I_n - \frac{1}{n} e e^T) K (I_n - \frac{1}{n} e e^T), where I_n is the n × n identity matrix; the eigenvectors of K_c correspond to the principal component directions in the high-dimensional feature space; the eigenvalues of K_c are sorted in descending order, and the eigenvectors corresponding to the m largest eigenvalues, V = [v_1, v_2, ..., v_m], are chosen, with m < n.
Step 11: project the original data onto the low-dimensional space formed by the first m principal components: Y = V^T K_c, where Y ∈ R^{m×n} is the dimensionality-reduced corn price feature data.
(4)
GWO: The dimension-reduced corn price feature data Y are input into the BiGRU. In this paper, GWO is used to optimize the key parameters of the attention-based BiGRU model. The GWO algorithm continuously adjusts these parameters during training, searching for the optimal solution to improve model performance.
Step 12: to mathematically simulate this encircling behavior, the following equation is proposed:
D = |C \cdot X_p(t) - X(t)|
X(t+1) = X_p(t) - A \cdot D
D represents the distance between the grey wolf and the prey, X_p(t) is the position vector of the prey at iteration t, and X(t) and X(t+1) are the position vectors of the wolf at iterations t and t+1, respectively. A and C are coefficient vectors used to adjust the position update of the grey wolf, and they are defined as follows:
A = 2 a(t) \cdot r_1 - a(t)
C = 2 \cdot r_2
a(t) decreases linearly from 2 to 0 over the iterations, and r_1 and r_2 are random vectors in the range [0, 1].
Step 13: we assume that α , β , and δ have a better understanding of the potential location of the prey; as a result, we store the top three best solutions obtained so far and require other search agents to update their positions based on the position of the best search agent; the following equation is introduced in this context:
Estimate the position of the prey using the different positions of the α ,   β , and δ wolves:
D_\alpha = |C_1 \cdot X_\alpha(t) - X(t)|, \quad D_\beta = |C_2 \cdot X_\beta(t) - X(t)|, \quad D_\delta = |C_3 \cdot X_\delta(t) - X(t)|
D_α, D_β, and D_δ represent the approximate distances between the current wolf and the α, β, and δ wolves, respectively. The position update for each grey wolf is as follows:
X_1 = X_\alpha - A_1 \cdot D_\alpha, \quad X_2 = X_\beta - A_2 \cdot D_\beta, \quad X_3 = X_\delta - A_3 \cdot D_\delta
X(t+1) = \frac{X_1 + X_2 + X_3}{3}
X_α, X_β, and X_δ represent the current positions of the alpha, beta, and delta wolves, respectively. X_1, X_2, and X_3 represent the step lengths and directions of the ω wolf moving toward the α, β, and δ wolves, respectively, and X(t+1) is the updated position of the wolf.
Step 14: when the prey stops moving, the grey wolves stop hunting and attack; this is expressed mathematically by reducing a(t) from 2 to 0, and the grey wolves attack the prey when the random value of A falls within the range [−1, 1].
Step 15: when the maximum number of iterations is reached, the output corresponds to the historical best position, which represents the optimal solution found by the optimization algorithm.
(5)
Prediction: The optimal parameters obtained from the search are used as the model parameters for training. The model is then evaluated on the test set to obtain the final predicted values ŷ_t, and the evaluation metrics MAE, RMSE, MAPE, and R2 are computed by comparing the predicted values ŷ_t with the true values y_t.
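To make step (4) concrete, the sketch below shows one way to wire the GWO routine from Section 2.5 to the BiGRU-Attention builder from Section 2.6; the search bounds, the data arrays (X_train, y_train, X_val, y_val), and the window settings are all illustrative assumptions rather than the study’s actual configuration.

import numpy as np

def fitness(params):
    # params = (neurons, batch size, epochs); GWO works on continuous values, so round to integers
    units, batch, epochs = (int(round(p)) for p in params)
    model = build_bigru_attention(timesteps, n_features, units=units)   # sketch from Section 2.6
    model.compile(optimizer="adam", loss="mse")
    model.fit(X_train, y_train, batch_size=batch, epochs=epochs, verbose=0)
    pred = model.predict(X_val, verbose=0).ravel()
    return float(np.sqrt(np.mean((pred - y_val) ** 2)))                 # validation RMSE as the fitness

# bounds are chosen for illustration only; the paper's search returned 51 units, batch size 94, 71 epochs
best_params, best_rmse = gwo(fitness, lb=[16, 16, 20], ub=[128, 128, 120], n_wolves=8, n_iter=20)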

3. Analysis and Discussion of Experimental Results

3.1. Data Source

Our dataset includes weekly market price data for Chinese corn from January 2014 to June 2024, as well as data on 21 different factors. All data are sourced from two database websites: https://d.qianzhan.com/ (accessed on 25 July 2024) and http://www.agdata.cn/ (accessed on 27 July 2024). Both the corn price data and the influencing factor data can be considered as time series data. The corn market price data are recorded weekly. Table 1 is an overview of the selected statistical data.

3.2. Data Preprocessing

Other feature data were aligned to the date dimension of the corn market price sequence to ensure data reliability and continuity. After aligning the data by the time dimension, missing values were present. This study aims to fill in missing data as much as possible.
To retain the original data patterns, a combination of nearest-neighbor and cubic spline interpolation was used to handle missing values [36]. Figure 6a,b shows the corn market price sequences before and after processing. From Figure 6, it can be observed that the interpolation effectively fills the breakpoints, and the constructed sequence retains the regularity of the original sequence. The same treatment was applied to the data of the various influencing factors. This interpolation provided good experimental data for the subsequent research.
To visually present the data range, understand the basic characteristics of the data, and comprehensively compare the differences between variables, half-violin plots for each sequence are drawn based on the data value intervals. As shown in Figure 7, it can be observed that the dataset has almost no outliers.
To remove the interference caused by inconsistent data dimensionalities, this study uses min–max normalization to process the dataset, scaling each data variable’s values linearly between 0 and 1. The formula is:
x_i = \frac{x - x_{min}}{x_{max} - x_{min}}
x_max and x_min represent the maximum and minimum values of the sample, x is the original value, and x_i is the normalized value.
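A brief pandas sketch of the two preprocessing steps above (interpolation and min–max scaling) follows; the frame name df is hypothetical, and the exact rule for combining nearest-neighbor and cubic spline interpolation is not specified in the text, so the ordering here is an assumption.

import pandas as pd

# df: weekly multi-factor frame aligned to the corn market price dates (hypothetical name)
filled = df.interpolate(method="cubicspline")                    # interior gaps via cubic spline
filled = filled.interpolate(method="nearest").ffill().bfill()    # remaining gaps via nearest values / edge fill

# min-max normalization of each column to [0, 1]
scaled = (filled - filled.min()) / (filled.max() - filled.min())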

3.3. Extraction of Volatility Clustering Characteristics Using the GARCH-M Model

This research utilizes the GARCH-M model to extract the volatility clustering characteristics of corn market prices. The specific steps are as follows:
(1)
Normality test and descriptive statistics of corn market price returns
The GARCH-M model is constructed by using the first-order differences of the natural logarithm of the weekly corn market prices, as shown in the following equation:
R_t = \ln P_t - \ln P_{t-1}
R_t represents the price return in week t, and P_t and P_{t-1} are the corn market prices in weeks t and t−1, respectively.
Figure 8 shows the series of corn market price returns, revealing a clustering effect in return volatility, a phenomenon that aligns with the patterns observed in most financial markets. After testing, the skewness is 0.559335 > 0, indicating a right-skewed distribution, and the kurtosis is 11.73145 > 3, indicating a leptokurtic distribution. Thus, the return series does not follow a normal distribution.
(2)
Stationarity Test of the Return Series
The ADF test is used for stationarity testing to avoid spurious regression and meet modeling requirements. From Table 2, it can be seen that the p-value is 0 < 0.05, rejecting the null hypothesis. The t-test statistic is −8.048156, smaller than the critical values at the 1%, 5%, and 10% significance levels. Thus, the return series is stationary, satisfying the stationarity requirement for the GARCH-M model.
(3)
Determining the Lag Order of the Conditional Mean Equation
The lag order of the conditional mean equation is determined using the ARDL model. Afterward, autocorrelation tests are conducted on the return series based on the conditional mean model. According to the autocorrelation test results in Table 3, the random error term shows no serial correlation. Thus, the choice of the lag-2 explanatory variable is appropriate, meeting the prerequisites for constructing the GARCH-M model.
(4)
ARCH Effect Test
To fulfill the requirements for the next step in modeling, the ARCH effect of the disturbance term in the return series must be tested. According to the results in Table 4, both the F-statistic and Obs*R-squared are significant at the 1% level. Hence, the corn market price return series displays significant ARCH effects, and the GARCH-M model can be applied to analyze the return volatility characteristics.
(5)
Determination of Optimal Lag Order for the GARCH-M Model
To establish the optimal GARCH-M model, the error distribution is modeled using Student’s t and generalized error distribution (GED), with different lag orders (q, p, r) set to determine the optimal lag structure. There is no leverage effect in the corn market price return series, so r is set to 0. The optimal lag order of the model is assessed using three criteria: AIC, SC, and HQ. The process of selecting the optimal lag order for the GARCH-M model is shown in Table 5. From the results in the table, the optimal model is the GARCH-M(1,1,0) under the Student’s t-distribution. The extracted volatility clustering features, as shown in Figure 9, are used as input volatility features in the subsequent model.
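The preliminary tests above can be reproduced with standard statistics libraries. The sketch below (with price as a hypothetical weekly price series) illustrates the return construction, the descriptive and stationarity checks, and the ARCH-LM test; in practice the ARCH test is applied to the residuals of the lag-2 conditional mean model rather than to the raw returns, so this is only an illustration.

import numpy as np
from scipy import stats
from statsmodels.tsa.stattools import adfuller
from statsmodels.stats.diagnostic import het_arch

r = np.diff(np.log(price))                                   # R_t = ln P_t - ln P_{t-1}

print("skewness:", stats.skew(r))                            # > 0 indicates a right-skewed distribution
print("kurtosis:", stats.kurtosis(r, fisher=False))          # > 3 indicates a leptokurtic distribution
print("ADF p-value:", adfuller(r)[1])                        # stationarity test
lm_stat, lm_p, f_stat, f_p = het_arch(r)                     # ARCH-LM effect test
print("ARCH LM p-value:", lm_p, "F-test p-value:", f_p)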

3.4. Corn Market Price Feature Extraction Based on STL Decomposition

The seasonal parameter in the STL algorithm is crucial for the decomposition performance. Thus, its best value needs to be determined through experiments and analysis to achieve the optimal decomposition. Other STL parameter settings are also important. Table 6 shows the detailed STL parameter settings.
Figure 10 shows the decomposition results under various STL parameters. The trend component fits the long-term trend of the original series well at every seasonal value, exhibiting strong stability. Once the seasonal parameter exceeds 11, the seasonal component effectively captures the seasonal volatility features of corn prices, with the overall amplitude first decreasing and then increasing. The fit of the residual component also gradually improves when the seasonal parameter exceeds 11. Considering the fit of all components, a seasonal value of 71 achieves a good balance among the components and yields the best decomposition results.
After determining the optimal key parameter, seasonal = 71, Figure 11 shows the STL decomposition of the corn market price series, generating four sequences; MPC represents the original corn market price series, and the seasonal component sequence reflects the seasonal periodicity of the corn market price. The trend component sequence accurately reflects the overall trend of the corn market price. From the figure, it can be observed that the trend component has a larger value and a smoother curve, which helps us capture the long-term direction of prices. The residual component sequence of random terms corresponds to the residual noise in MPC, displaying random and irregular variations.
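The comparison across seasonal values can be scripted directly. The sketch below fits STL for each candidate value from Table 6 and reports the residual standard deviation as one simple numeric proxy; the study’s final choice of 71 was based on visually comparing all three components, not on this statistic alone, and price is a hypothetical name for the weekly series.

from statsmodels.tsa.seasonal import STL

# price: weekly corn market price series (hypothetical name); period and low_pass follow Table 6
for s in (11, 23, 35, 47, 59, 71):
    res = STL(price, period=53, seasonal=s, low_pass=55, robust=True).fit()
    print(f"seasonal={s:2d}  residual std={res.resid.std():.4f}")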

3.5. Performance Evaluation Metrics and Model Parameter Settings

To make it easier to observe the prediction performance of different models under multiple influencing factors, four error evaluation metrics—MAE, RMSE, MAPE (%), and R2—are computed. The formulas are as follows:
MAE = \frac{1}{n}\sum_{t=1}^{n} \left| \hat{y}_t - y_t \right|
RMSE = \sqrt{\frac{1}{n}\sum_{t=1}^{n} \left( \hat{y}_t - y_t \right)^2}
MAPE = \frac{1}{n}\sum_{t=1}^{n} \left| \frac{\hat{y}_t - y_t}{y_t} \right| \times 100\%
R^2 = 1 - \frac{\sum_{t=1}^{n} (\hat{y}_t - y_t)^2}{\sum_{t=1}^{n} (y_t - \bar{y}_t)^2}
n represents the sample size of the test set, ŷ_t is the predicted corn market price for week t, y_t is the actual corn market price for week t, and ȳ_t is the mean of the actual values.
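The four metrics can be computed with a small helper, sketched below using scikit-learn and numpy; the array names are placeholders.

import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

def evaluate(y_true, y_pred):
    mae = mean_absolute_error(y_true, y_pred)
    rmse = np.sqrt(mean_squared_error(y_true, y_pred))
    mape = np.mean(np.abs((y_pred - y_true) / y_true)) * 100   # in percent
    r2 = r2_score(y_true, y_pred)
    return mae, rmse, mape, r2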
The experiment used RF, LightGBM, XGBoost, LSTM, GRU, BiGRU, and BiGRU-Attention as the baseline models for corn market price prediction analysis. The training and testing sets were divided into an 8:2 ratio. To maximize the prediction performance of the models, the key parameters of each model were optimized before testing, and the same hyperparameters were applied to similar structures across different models. Table 7 shows hyperparameter settings.

3.6. Cross-Sectional Forecasting with Multiple Influencing Factors

To extract more relevant information and help the model fit with significant features, this study employs Spearman’s correlation analysis to explore the influencing factors of corn market prices; before this, the Shapiro–Wilk test is conducted to check the normality of the data. Figure 12 shows the correlation heatmap. Factors whose correlation coefficient with the corn market price series is lower than 0.4 are eliminated as weakly correlated. The 12 selected influencing factors effectively reflect planting costs, port inventory, feed farming, deep processing enterprises, corn substitutes, freight, and economic indicators, and thus can serve as significant features for the corn market price series.
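A sketch of this screening step with pandas and scipy is given below; df is a hypothetical frame whose columns are MPC and the 21 candidate factors from Table 1, and the 0.4 threshold follows the text.

import pandas as pd
from scipy import stats

normality_p = df.apply(lambda col: stats.shapiro(col).pvalue)     # Shapiro-Wilk normality test per column
rho = df.corr(method="spearman")["MPC"].drop("MPC")               # Spearman correlation with the corn price
selected = rho[rho.abs() >= 0.4].index.tolist()                   # keep the strongly correlated factors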
After conducting cross-sectional experiments, the prediction results are shown in Figure 13. It can be observed that the ensemble learning model is able to capture the overall trend of corn market price changes to a certain extent, but its performance is not satisfactory, especially when the corn market price experiences jump-like fluctuations. The deep learning model’s prediction results, as shown in Figure 14, reveal that deep learning models fit the curve better than machine learning models, though they still fail to effectively fit the trend.
From Table 8, it can be observed that the BiGRU-Attention model performs the best for predicting the Chinese corn market price, with MAE (0.0433), RMSE (0.0569), MAPE (1.4884%), and R2 (0.8708). In this case, deep learning models exhibit overall smaller prediction errors compared to machine learning models. Although all models incorporate exogenous variable information to some extent for fitting, none of the models achieved the expected performance. The BiGRU-Attention model performs the best, effectively capturing the volatility characteristics of corn market prices. The model’s good performance is attributed to the use of both forward and backward GRUs, which enhance its memory capacity and allow it to learn the dynamic changes in data from both directions. Additionally, the attention mechanism is connected after the BiGRU layer, enabling the model to focus on the most important parts of the sequence and learn more complex and abstract feature representations. The next step is to design longitudinal experiments to improve the BiGRU-Attention model and validate the advantages of the proposed model.

3.7. Longitudinal Forecasting with Multi-Feature Fusion and Grey Wolf Optimization

To enable the model to better capture and fit important latent information, the volatility features extracted by the GARCH-M and STL models are combined with the external influencing factors, and KPCA is used to construct the model inputs. Longitudinal comparative experiments are then conducted to explore how the different feature input patterns improve the model’s performance and to verify the effectiveness of the proposed model.
Before conducting the longitudinal experiments, this study analyzes the cumulative contribution rates under different kernel functions to select an appropriate kernel function for dimensionality reduction of the feature set. Based on a preset threshold of 95% information retention and the principle of maximizing the cumulative contribution rate while minimizing the number of components, the optimal kernel function is selected. The specific steps are as follows: first, the KPCA algorithm with different kernel functions (linear, poly, rbf, sigmoid) is used to perform dimensionality reduction experiments on the feature set; the contribution rate and cumulative contribution rate for each kernel function are then plotted and compared. The experimental results are shown in Figure 15, where Figure 15d shows that the cumulative contribution rate is highest when eight principal components are selected. Therefore, the best kernel function is sigmoid, and KPCA uses the sigmoid kernel for dimensionality reduction to construct the model inputs.
The error metrics in Table 9 are analyzed as follows:
When the volatility clustering feature is integrated with multiple influencing factors for prediction, the GM-BiGRU-Attention model, which accounts for this feature, reduces MAE by 0.0034, RMSE by 0.0076, and MAPE by 0.1341, while R2 improves by 0.0319 compared to the BiGRU-Attention model. This result demonstrates that the volatility clustering feature extracted by the GARCH-M model significantly enhances corn market price predictions.
The decomposed trend, seasonal, and residual components are added as new features to the BiGRU-Attention model used in the GARCH-M model experiment. Keeping other experimental conditions constant, only the input features are extended. Compared to the BiGRU-Attention model, the STL-BiGRU-Attention model reduces MAE by 0.0051, RMSE by 0.0099, MAPE by 0.1687, and improves R2 by 0.041. This indicates that the fluctuation feature information extracted by the STL algorithm positively impacts corn market price prediction. Moreover, compared to volatility clustering features, STL features contribute more significantly to enhancing model prediction performance.
In the STLG-BiGRU-Attention model, which combines two fluctuation features, MAE decreases by 0.0168, RMSE by 0.0174, and MAPE by 0.5643, indicating further improvement in prediction accuracy. However, the dimensionality of the data increases, so dimensionality reduction is applied to the fused feature matrix to fully utilize the two fluctuation features and external influencing factors information. After dimensionality reduction, the STLG-KPCA-BiGRU-Attention model achieves further reductions in error across all metrics. The weights of the feature variables after KPCA dimensionality reduction are presented in Figure 16. Using the GWO algorithm, the STLG-KPCA-GWO-BiGRU-Attention model achieves the highest accuracy, and the optimal hyperparameters of the model searched by the GWO algorithm are 51 neurons, a batch size of 94, and 71 epochs. The loss convergence process chart of the proposed model is shown in Figure 17, where the model reaches complete convergence at 37 epochs.
Finally, Figure 18 shows the experimental results of the model’s predictions. From Figure 19, we can see the R2 values of the predictions for each model. This proposed model (R2 = 0.9815) can be regarded as an effective approach for analyzing and forecasting the corn market price in China.

4. Conclusions

Accurate prediction of corn market prices is crucial for corn production, farmer income, and the stability of the corn market. In this context, the paper employs both qualitative and quantitative analyses, considering a range of factors affecting corn market prices, and constructs a multi-feature fusion prediction model, STLG-KPCA-GWO-BiGRU-Attention, to accurately predict the weekly prices of corn in China. The results indicate that the model is able to effectively capture the volatility patterns of corn prices and provides highly reliable decision-making support for stakeholders.
In the research process, this paper improves the model’s prediction accuracy through multi-stage integrated processing. Initially, the STL decomposes the corn market price into trend, seasonal, and residual components, combined with the GARCH-M model to analyze the price’s volatility clustering in detail. This serves as the basis for extracting various volatility features for the STLG-KPCA-GWO-BiGRU-Attention model. KPCA further reduces the feature dimensions, improves the model’s training efficiency, and avoids overfitting. Finally, after GWO optimization, the model’s nonlinear fitting ability is strengthened, and the attention mechanism assigns greater weight to key features, thus enhancing the prediction accuracy.
This model not only provides highly reliable decision support for corn producers, farmers, and other stakeholders but also has broad application potential. Firstly, corn market price forecasting can help farmers make more rational decisions in planting and selling, increasing farmers’ income and ensuring market stability. Secondly, the methodological framework proposed in this paper has strong adaptability and can be applied to price forecasting in other agricultural commodity markets, such as wheat, soybeans, etc., helping policymakers better understand market dynamics and conduct effective macro control.
Although this study has achieved significant results in several aspects, it still has certain limitations. Firstly, the model is highly sensitive to certain variables, especially to changes in price volatility and economic indicators. Therefore, in the case of missing data or numerous outliers, the model may produce unstable prediction results. Extreme data points (such as natural disasters or sudden policy changes) may have a significant impact on the model, leading to a decline in prediction accuracy. Secondly, the current model primarily relies on historical data and volatility features. Future research could integrate more real-time data sources, such as climate change, policy adjustments, and international market fluctuations, to further improve prediction accuracy and timeliness.
In addition, the interpretability of the model continues to be a significant research focus. The current model improves feature interpretability to some extent through the attention mechanism, but how to further reveal the underlying mechanisms in the decision-making process of the model remains a challenge for future research. Through explainable artificial intelligence techniques, stakeholders will be able to better understand the model’s prediction logic, thereby enhancing the model’s transparency and trustworthiness.

Author Contributions

Conceptualization, Y.F. and Y.G.; methodology, Y.F.; writing—original draft preparation, Y.F.; writing—review and editing, X.H.; supervision, project administration, S.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Natural Science Foundation of Sichuan Province, grant number 2024NSFSC1059.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declared no conflicts of interest.

References

  1. Kyriazi, F.; Thomakos, D.D.; Guerard, J.B. Adaptive learning forecasting, with applications in forecasting agricultural prices. Int. J. Forecast. 2019, 35, 1356–1369. [Google Scholar] [CrossRef]
  2. Avinash, G.; Ramasubramanian, V.; Ray, M.; Paul, R.K.; Godara, S.; Nayak, G.H.; Kumar, R.R.; Manjunatha, B.; Dahiya, S.; Iquebal, M.A. Hidden Markov guided Deep Learning models for forecasting highly volatile agricultural commodity prices. Appl. Soft Comput. 2024, 158, 111557. [Google Scholar] [CrossRef]
  3. Sun, M.; Li, S.; Yang, W.; Zhao, B.; Wang, Y.; Liu, X. Commercial genetically modified corn and soybean are poised following pilot planting in China. Mol. Plant 2024, 17, 519–521. [Google Scholar] [CrossRef]
  4. Sun, Q.; Li, Y.; Gong, D.; Hu, A.; Zhong, W.; Zhao, H.; Ning, Q.; Tan, Z.; Liang, K.; Mu, L. A NAC-EXPANSIN module enhances maize kernel size by controlling nucellus elimination. Nat. Commun. 2022, 13, 5708. [Google Scholar] [CrossRef]
  5. Mu, W.; Kleter, G.A.; Bouzembrak, Y.; Dupouy, E.; Frewer, L.J.; Radwan Al Natour, F.N.; Marvin, H. Making food systems more resilient to food safety risks by including artificial intelligence, big data, and internet of things into food safety early warning and emerging risk identification tools. Compr. Rev. Food Sci. Food Saf. 2024, 23, e13296. [Google Scholar] [CrossRef]
  6. Wang, J.; Wang, Z.; Li, X.; Zhou, H. Artificial bee colony-based combination approach to forecasting agricultural commodity prices. Int. J. Forecast. 2022, 38, 21–34. [Google Scholar] [CrossRef]
  7. Zeng, L.; Ling, L.; Zhang, D.; Jiang, W. Optimal forecast combination based on PSO-CS approach for daily agricultural future prices forecasting. Appl. Soft Comput. 2023, 132, 109833. [Google Scholar] [CrossRef]
  8. Mohsin, M.; Jamaani, F. A novel deep-learning technique for forecasting oil price volatility using historical prices of five precious metals in context of green financing—A comparison of deep learning, machine learning, and statistical models. Resour. Policy 2023, 86, 104216. [Google Scholar] [CrossRef]
  9. Khadka, R.; Chi, Y.N. Forecasting the Global Price of Corn: Unveiling Insights with SARIMA Modelling Amidst Geopolitical Events and Market Dynamics. Am. J. Appl. Stat. Econ. 2024, 3, 124–135. [Google Scholar] [CrossRef]
  10. Zhou, Z.; Zhang, Y.; Zhao, J. Research on Vegetable Pricing and Replenishment Based on Exponential Smoothing Model Prediction. Trans. Comput. Appl. Math. 2024, 4, 11–18. [Google Scholar]
  11. Kilian, L. How to construct monthly VAR proxies based on daily surprises in futures markets. J. Econ. Dyn. Control 2024, 168, 104966. [Google Scholar] [CrossRef]
  12. Hassoo, A.K. Forecasting Maize Production In Romania: A BSTS Model Approach. Rom. Stat. Rev. 2024. [Google Scholar] [CrossRef]
  13. Caton, S.; Haas, C. Fairness in machine learning: A survey. ACM Comput. Surv. 2024, 56, 1–38. [Google Scholar] [CrossRef]
  14. Jin, B.; Xu, X. Forecasting wholesale prices of yellow corn through the Gaussian process regression. Neural Comput. Appl. 2024, 36, 8693–8710. [Google Scholar] [CrossRef]
  15. Xiang, X.; Xiao, J.; Wen, H.; Li, Z.; Huang, J. Prediction of landslide step-like displacement using factor preprocessing-based hybrid optimized SVR model in the Three Gorges Reservoir, China. Gondwana Res. 2024, 126, 289–304. [Google Scholar] [CrossRef]
  16. Liu, Z.L. Ensemble learning. In Artificial Intelligence for Engineers: Basics and Implementations; Springer: Cham, Switzerland, 2025; pp. 221–242. [Google Scholar]
  17. Zhao, Z.; Alzubaidi, L.; Zhang, J.; Duan, Y.; Gu, Y. A comparison review of transfer learning and self-supervised learning: Definitions, applications, advantages and limitations. Expert Syst. Appl. 2024, 242, 122807. [Google Scholar] [CrossRef]
  18. Murugesan, R.; Mishra, E.; Krishnan, A.H. Forecasting agricultural commodities prices using deep learning-based models: Basic LSTM, bi-LSTM, stacked LSTM, CNN LSTM, and convolutional LSTM. Int. J. Sustain. Agric. Manag. Inform. 2022, 8, 242–277. [Google Scholar] [CrossRef]
  19. Cheung, L.; Wang, Y.; Lau, A.S.; Chan, R.M. Using a novel clustered 3D-CNN model for improving crop future price prediction. Knowl. Based Syst. 2023, 260, 110133. [Google Scholar] [CrossRef]
  20. Wu, B.; Wang, Z.; Wang, L. Interpretable corn future price forecasting with multivariate time series. J. Forecast. 2024. [Google Scholar] [CrossRef]
  21. Wang, Y.; Wang, X.; Li, L. EMD-TCN-TimeGAN: A Data Augmentation Model for Enhancing the Accuracy of Agricultural Product Price Prediction. In Proceedings of the 2024 IEEE International Conference on Big Data (BigData), Washington, DC, USA, 15–18 December 2024; pp. 5161–5168. [Google Scholar]
  22. Ye, Z.; Zhai, X.; She, T.; Liu, X.; Hong, Y.; Wang, L.; Zhang, L.; Wang, Q. Winter Wheat Yield Prediction Based on the ASTGNN Model Coupled with Multi-Source Data. Agronomy 2024, 14, 2262. [Google Scholar] [CrossRef]
  23. Sun, F.; Meng, X.; Zhang, Y.; Wang, Y.; Jiang, H.; Liu, P. Agricultural Product Price Forecasting Methods: A Review. Agriculture 2023, 13, 1671. [Google Scholar] [CrossRef]
  24. Ray, S.; Lama, A.; Mishra, P.; Biswas, T.; Das, S.S.; Gurung, B. An ARIMA-LSTM model for predicting volatile agricultural price series with random forest technique. Appl. Soft Comput. 2023, 149, 110939. [Google Scholar] [CrossRef]
  25. Wang, D.; He, Z.; He, S.; Zhang, Z.; Zhang, Y. Dynamic pricing of two-dimensional extended warranty considering the impacts of product price fluctuations and repair learning. Reliab. Eng. Syst. Saf. 2021, 210, 107516. [Google Scholar] [CrossRef]
  26. Ge, Y.; Wu, H. Prediction of corn price fluctuation based on multiple linear regression analysis model under big data. Neural Comput. Appl. 2020, 32, 16843–16855. [Google Scholar] [CrossRef]
  27. Van Wesenbeeck, C.; Keyzer, M.; Van Veen, W.; Qiu, H. Can China’s overuse of fertilizer be reduced without threatening food security and farm incomes? Agric. Syst. 2021, 190, 103093. [Google Scholar] [CrossRef]
  28. Cao, Y.; Cheng, S. Impact of COVID-19 outbreak on multi-scale asymmetric spillovers between food and oil prices. Resour. Policy 2021, 74, 102364. [Google Scholar] [CrossRef]
  29. Yin, D.; Wang, Y.; Wang, L.; Wu, Y.; Bian, X.; Aggrey, S.E.; Yuan, J. Insights into the proteomic profile of newly harvested corn and metagenomic analysis of the broiler intestinal microbiota. J. Anim. Sci. Biotechnol. 2022, 13, 26. [Google Scholar] [CrossRef]
  30. Xu, Q.; Meng, T.; Sha, Y.; Jiang, X. Volatility in metallic resources prices in COVID-19 and financial Crises-2008: Evidence from global market. Resour. Policy 2022, 78, 102927. [Google Scholar] [CrossRef]
  31. Zeng, H.; Shao, B.; Dai, H.; Yan, Y.; Tian, N. Prediction of fluctuation loads based on GARCH family-CatBoost-CNNLSTM. Energy 2023, 263, 126125. [Google Scholar] [CrossRef]
  32. Cleveland, R.B.; Cleveland, W.S.; McRae, J.E.; Terpenning, I. STL: A seasonal-trend decomposition procedure based on loess. J. Off. Stat. 1990, 6, 3–73. [Google Scholar]
  33. Schölkopf, B.; Smola, A.; Müller, K.-R. Nonlinear component analysis as a kernel eigenvalue problem. Neural Comput. 1998, 10, 1299–1319. [Google Scholar] [CrossRef]
  34. Gong, R.; Li, X. A short-term load forecasting model based on crisscross grey wolf optimizer and dual-stage attention mechanism. Energies 2023, 16, 2878. [Google Scholar] [CrossRef]
  35. Tang, J.; Hou, H.; Chen, H.; Wang, S.; Sheng, G.; Jiang, C. Concentration prediction method based on Seq2Seq network improved by BI-GRU for dissolved gas in transformer oil. Electr. Power Autom. Equip. 2022, 42, 196–202. [Google Scholar]
  36. Xu, X.; Zhang, Y. Corn cash price forecasting with neural networks. Comput. Electron. Agric. 2021, 184, 106120. [Google Scholar] [CrossRef]
Figure 1. STL inner loop flowchart.
Figure 2. Grey wolf optimizer algorithm flow.
Figure 3. Internal structure of the BiGRU.
Figure 4. Structure of the attention mechanism.
Figure 5. SGKGBA model process flow.
Figure 6. Comparison of corn price sequence before and after filling missing values.
Figure 7. Semi-violin plot of the corn data and its influencing factors.
Figure 8. Time series graph of corn market price returns.
Figure 9. Sequence plot of the volatility clustering feature in corn market prices.
Figure 10. Results of STL decomposition: (a) trend, (b) seasonal, and (c) residual.
Figure 11. Corn time series and the three components obtained from the STL decomposition.
Figure 12. Correlation heatmap of corn market prices.
Figure 13. Prediction results of machine learning models with multiple influencing factors.
Figure 14. Prediction results of deep learning models with multiple influencing factors.
Figure 15. Cumulative contribution rate chart of different kernel functions.
Figure 16. Weight distribution diagram of each feature variable.
Figure 17. The loss convergence process chart of the proposed model.
Figure 18. Prediction results of each model from the experiment.
Figure 19. Marginal histogram fit for each model.
Table 1. Overview of selected statistical data.

Data Category | Name | Unit | Description
Target Variable | MPC | Yuan/kg | Corn market price
Planting Cost | PTCF | Yuan/t | Fertilizer price
Planting Cost | UREAP | Yuan/t | Fertilizer price
Planting Cost | PCP | Yuan/t | Fertilizer price
Port Inventory | NPI | Mt | Northern port inventory
Port Inventory | GDPI | Mt | Guangdong port inventory
Feed Farming | EMP | Yuan/kg | Egg market price
Feed Farming | PMP | Yuan/kg | Piglet wholesale market price
Feed Farming | AEFPWMS | Yuan/kg | Average carcass price
Feed Farming | MPLP | Yuan/kg | Live pig exit price
Feed Farming | PBPAG | - | Pig-to-grain price ratio
Feed Farming | PFLHF | Yuan/100 | Profit from layer hen farming
Feed Farming | GPOFPF | Yuan/t | Gross profit of fattening pig feed
Feed Farming | GPOECF | Yuan/t | Gross profit of layer feed
Feed Farming | GPOBF | Yuan/t | Gross profit of broiler feed
Enterprises | SROSE | % | Starch enterprise operating rate
Enterprises | PFCAP | Yuan/t | Corn ethanol processing profit
Enterprises | OROAE | % | Alcohol enterprise operating rate
Substitutes | SMP | Yuan/kg | Soybean meal price
Substitutes | WMP | Yuan/t | Wheat market price
Freight Rate | CCBFIG | - | CCBFI: grain index
Price Index | CCPIAP | - | Economic indicators
Table 2. Stationarity test results of corn market price return series.

 | t-Statistic | p-Value
ADF test statistic | −8.048156 | 0.0000
1% significance level | −2.569228 | -
5% significance level | −1.941407 | -
10% significance level | −1.616307 | -
Table 3. Autocorrelation test results.

 | Statistical Value | p-Value
F-statistic | 0.078003 | 0.7801
Obs*R-squared | 0.078425 | 0.7794
Table 4. ARCH effect test results for corn market price return series.

 | Statistical Value | p-Value
F-statistic | 16.20240 | 0.0000
Obs*R-squared | 135.7961 | 0.0000
Table 5. Optimal lag order selection for the model.

Error Distribution | (q,p,r) | AIC | SC | HQ
Student’s t | (3,3,0) | −8.249501 | −8.162451 | −8.215464
Student’s t | (3,2,0) | −8.250523 | −8.171387 | −8.219581
Student’s t | (3,1,0) | −8.249372 | −8.178149 | −8.221524
Student’s t | (2,3,0) | −8.251125 | −8.171989 | −8.220183
Student’s t | (2,2,0) | −8.253129 | −8.181906 | −8.225280
Student’s t | (2,1,0) | −8.253046 | −8.189736 | −8.228292
Student’s t | (1,3,0) | −8.253277 | −8.182054 | −8.225429
Student’s t | (1,2,0) | −8.253092 | −8.189783 | −8.228338
Student’s t | (1,1,0) | −8.256576 * | −8.201180 * | −8.234916 *
GED | (3,3,0) | −8.225419 | −8.138369 | −8.191382
GED | (3,2,0) | −8.212415 | −8.133279 | −8.181473
GED | (3,1,0) | −8.215526 | −8.144304 | −8.187678
GED | (2,3,0) | −8.210055 | −8.130919 | −8.179113
GED | (2,2,0) | −8.209535 | −8.138312 | −8.181687
GED | (2,1,0) | −8.214488 | −8.151179 | −8.189734
GED | (1,3,0) | −8.209567 | −8.138345 | −8.181719
GED | (1,2,0) | −8.212434 | −8.149125 | −8.187680
GED | (1,1,0) | −8.212317 | −8.156921 | −8.190657
* denotes the minimum value under the AIC, SC, and HQ criteria.
Table 6. Parameter settings for STL decomposition.

Parameter | Value | Explanation
period | 53 | Based on the periodic characteristics of the time series
low_pass | 55 | The smallest odd number greater than the period
seasonal | {11, 23, 35, 47, 59, 71} | Generally required to be an odd number no less than 7
robust | True | Perform robust decomposition
seasonal_jump | 1 | Not exceeding 10–20% of the seasonal and low_pass values
trend_jump | 1 | Not exceeding 10–20% of the seasonal and low_pass values
low_pass_jump | 1 | Not exceeding 10–20% of the seasonal and low_pass values
Table 7. Hyperparameter settings for all models.

Model | Hyperparameter Settings
RF | n_estimators = 100, max_depth = 5, random_state = 1, max_leaf_nodes = 10
LightGBM | num_leaves = 31, learning_rate = 0.1, max_depth = 10
XGBoost | n_estimators = 300, learning_rate = 0.2, max_depth = 2, min_child_weight = 1
LSTM | lstm_units = 64, batch_size = 16, epochs = 100, activation function = relu, optimizer = adam
GRU | gru_units = 128, batch_size = 16, epochs = 100, activation function = relu, optimizer = adam
BiGRU | bigru_units = 128, batch_size = 16, epochs = 100, activation function = relu, optimizer = adam
BiGRU-Attention | bigru_units = 128, batch_size = 16, epochs = 100, activation function = relu, optimizer = adam, att activation function = sigmoid
Table 8. Evaluation metrics of prediction errors for each model under multiple influencing factors.

Model | MAE | RMSE | MAPE (%) | R2
RF | 0.0926 | 0.1066 | 3.2034 | 0.6129
LightGBM | 0.0897 | 0.1041 | 3.0980 | 0.6306
XGBoost | 0.0803 | 0.0937 | 2.7681 | 0.7011
LSTM | 0.0578 | 0.0726 | 1.9507 | 0.7893
GRU | 0.0534 | 0.0643 | 1.8796 | 0.8351
BiGRU | 0.0508 | 0.0599 | 1.7592 | 0.8566
BiGRU-Attention | 0.0433 | 0.0569 | 1.4884 | 0.8708
Table 9. Prediction error evaluation metrics for the improved models.

Model | MAE | RMSE | MAPE (%) | R2
BiGRU-Attention | 0.0433 | 0.0569 | 1.4884 | 0.8708
GM-BiGRU-Attention | 0.0399 | 0.0493 | 1.3543 | 0.9027
STL-BiGRU-Attention | 0.0382 | 0.0470 | 1.3197 | 0.9118
STLG-BiGRU-Attention | 0.0265 | 0.0395 | 0.9241 | 0.9376
STLG-KPCA-BiGRU-Attention | 0.0225 | 0.0279 | 0.7689 | 0.9687
STLG-KPCA-GWO-BiGRU-Attention | 0.0159 | 0.0215 | 0.5544 | 0.9815

