Natural Gas Consumption Forecasting Model Based on Feature Optimization and Incremental Long Short-Term Memory
Abstract
1. Introduction
- (1) We propose a missing/anomalous-data sampling method based on Gaussian mixture distributions, enhancing the robustness and generalization capability of the network (a minimal sampling sketch follows this list);
- (2) We develop a weakly supervised cascade network feature selector, which strengthens the coupling between historical and current data, enabling the network to achieve more accurate predictions when dealing with sensitive data;
- (3) We introduce an incremental prediction method to address the prediction lag caused by anomalous data, fully leveraging the model's feature-awareness capabilities;
- (4) Lastly, we design a regression difference loss used jointly with the mean squared error loss to enhance the feature fusion of long-range data associations, improving the accuracy of natural gas consumption forecasting.
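To make contribution (1) concrete, the following is a minimal sketch of Gaussian-mixture-based resampling, assuming scikit-learn's `GaussianMixture`, three mixture components, and a 3-sigma anomaly rule; the component count and the anomaly criterion are illustrative choices, not values taken from the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_resample(series, n_components=3, seed=0):
    """Replace NaNs and gross outliers in a 1-D series with GMM samples."""
    x = np.asarray(series, dtype=float).copy()
    valid = ~np.isnan(x)
    mu, sigma = x[valid].mean(), x[valid].std()
    # Assumed rule: flag points farther than 3 sigma from the mean as anomalous.
    bad = ~valid | (np.abs(x - mu) > 3 * sigma)
    # Fit the mixture on the clean observations only.
    gmm = GaussianMixture(n_components=n_components, random_state=seed)
    gmm.fit(x[~bad].reshape(-1, 1))
    if bad.any():
        samples, _ = gmm.sample(int(bad.sum()))  # one replacement per flagged entry
        x[bad] = samples.ravel()
    return x
```

Because replacements are drawn from the fitted mixture rather than, say, a column mean, the resampled series preserves the multi-modal structure of the original consumption data.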
2. Materials and Methods
2.1. Data Processing
2.2. Weakly Supervised Cascade Network Feature Selection
- (1) First, for the input samples, we construct the feature mapping relationship by minimizing the Gini index criterion; the Gini index is given in Equation (5) (a minimal sketch of this criterion follows this list).
- (2) Subsequently, we use a multi-cascade network to evaluate feature responses in the dataset. Initially, the model analyzes factors such as time, holidays, special events, and school schedules to establish feature mappings that influence the daily natural gas load. To address uncertainties in feature representation, we incorporate an uncertainty gradient into the mean squared error loss, enabling fine-tuning of the model's parameters. This helps the model adapt to datasets with similar characteristics and enhances its feature selection capability. By iterating this process, the model constructs highly responsive relationships for daily natural gas load forecasting. Equation (6) describes this process in detail.
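The sketch below computes the Gini index of a label set and the weighted Gini of a candidate split, assuming Equation (5) takes the standard form Gini(D) = 1 − Σₖ pₖ²; the cascade network built on top of this criterion is not reproduced here.

```python
import numpy as np

def gini_index(labels):
    """Gini(D) = 1 - sum_k p_k^2 over the class proportions p_k (standard form)."""
    if len(labels) == 0:
        return 0.0
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - float(np.sum(p ** 2))

def gini_split(feature, labels, threshold):
    """Weighted Gini index after splitting on `feature <= threshold`; lower is purer."""
    mask = feature <= threshold
    n = len(labels)
    return (mask.sum() / n) * gini_index(labels[mask]) + \
           ((~mask).sum() / n) * gini_index(labels[~mask])
```

Feature selection in this framing amounts to scanning candidate features and thresholds and keeping those that minimize `gini_split`.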
2.3. LSTM Network Architecture
- (1) Input gate: for the input data X, in our time-series setting, the input sequence is denoted as {x_1, x_2, …, x_l}, where each x_t is the feature vector at time step t and l is the feature sequence length. Each x_t is first passed through an activation function for feature regularization.
- (2) Forget gate: through cascade network feature selection under weak supervision, the forget gate responds more strongly to high-weight features while suppressing the interference of disruptive features on the network.
- (3) Output gate: features interact through short-distance skip connections, effectively preventing gradient vanishing (the standard gate equations are sketched after this list).
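For reference, a numpy sketch of one step of the standard LSTM cell follows; the weakly supervised gating and the skip connections described above are specific to this paper and are omitted here.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step; W/U/b hold per-gate input weights, recurrent weights, biases."""
    i = sigmoid(W["i"] @ x_t + U["i"] @ h_prev + b["i"])  # input gate
    f = sigmoid(W["f"] @ x_t + U["f"] @ h_prev + b["f"])  # forget gate
    o = sigmoid(W["o"] @ x_t + U["o"] @ h_prev + b["o"])  # output gate
    g = np.tanh(W["g"] @ x_t + U["g"] @ h_prev + b["g"])  # candidate cell state
    c = f * c_prev + i * g                                # keep old memory, add new
    h = o * np.tanh(c)                                    # expose gated hidden state
    return h, c
```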
2.4. Incremental Learning for Regression Prediction
2.5. Incremental Loss Function
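Contribution (4) couples the mean squared error loss with a regression difference loss, and the sweep over the weighting ratio r reported later adopts r = 0.5. A hedged sketch follows: here the difference term is taken to penalize errors in the first-order differences of the sequence, which is one plausible reading; the exact formulation in the paper may differ.

```python
import numpy as np

def incremental_loss(pred, target, r=0.5):
    """MSE on levels plus r-weighted MSE on day-to-day changes (assumed form)."""
    mse = np.mean((pred - target) ** 2)
    diff_err = np.mean((np.diff(pred) - np.diff(target)) ** 2)
    return float(mse + r * diff_err)
```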
3. Experiment and Result Analysis
3.1. Data Description
- (1) Climate dependency: the model is developed under a subtropical monsoon climate (Köppen Cfa) and requires recalibration of temperature thresholds when applied to tropical or high-latitude regions.
- (2) Differences in gas consumption structure: in Wuhan, industrial usage accounts for 42% of total gas consumption, significantly higher than in many European and North American cities (typically <30%), which may lead to an overestimation of industrial demand when transferring the model.
- (3) Pipeline network topology: Wuhan adopts a looped gas network with 28% redundancy, whereas cities such as Paris use a radial topology with only 12% redundancy, leading to different pressure response dynamics under the same load variations.
3.2. Implementation Details
3.3. Evaluation Metrics
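The result tables report errors per forecast day for heating-degree days (HDD) and non-heating days (NHD); assuming the headline metric is MAPE, as listed in the acronym table, a reference implementation is:

```python
import numpy as np

def mape(pred, target):
    """Mean Absolute Percentage Error, as a fraction; `target` must be nonzero."""
    pred, target = np.asarray(pred, float), np.asarray(target, float)
    return float(np.mean(np.abs((target - pred) / target)))
```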
3.4. Experimental Analysis
3.5. Ablation Study
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
| | Module | Input Dimension | Output Dimension | Parameters | Activation Function |
|---|---|---|---|---|---|
| | Input Layer | [N, M] | [N, D] | No trained parameters | None |
| R× | Hidden Layer | [N, D] | [N, H] | Weights: [N, H]; bias: [H] | ReLU |
| | Hidden Layer | [N, H] | [N, D] | Weights: [N, D]; bias: [D] | ReLU |
| | Output Layer | [N, H] | [N, P] | Weights: [N, P]; bias: [P] | Linear |
| N | D | H | P |
|---|---|---|---|
| Batch size = 64 | Hidden output feature dimension: 32 | Number of neurons in hidden layers: H = 64 | Output dimension: P = 1 |
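The BP baseline in the table above can be instantiated as follows. This is a sketch under stated assumptions: PyTorch is an implementation choice, the repeat count R is left as a parameter since the table does not fix it, and the table's parameter-free input layer is realized here as a linear projection to obtain the M→D reshape.

```python
import torch.nn as nn

def build_bp_network(M, D=32, H=64, P=1, R=2):
    """BP baseline per the Appendix A table: [N, M] -> D=32 -> R hidden blocks (H=64, ReLU) -> P=1."""
    layers = [nn.Linear(M, D)]                       # input projection: [N, M] -> [N, D]
    for _ in range(R):                               # R repeated hidden blocks
        layers += [nn.Linear(D, H), nn.ReLU(),       # [N, D] -> [N, H]
                   nn.Linear(H, D), nn.ReLU()]       # [N, H] -> [N, D]
    layers += [nn.Linear(D, H), nn.ReLU(),
               nn.Linear(H, P)]                      # output head: [N, H] -> [N, P], linear
    return nn.Sequential(*layers)
```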
Module | Input Dimension | Output Dimension | Parameter |
---|---|---|---|
Input Embedding | [N, T, M] | [N, T, d_model] | Linear projection: [M→d_model] |
Positional Encoding | [N, T, d_model] | [N, T, d_model] | Learnable positional embeddings |
Multi-Head Attention | [N, T, d_model] | [N, T, d_model] | Heads: h; single-head dim: d_k = d_v = d_model/h; weights: [d_model, h × d_k] |
Feed-Forward Network | [N, T, d_model] | [N, T, d_model] | Hidden dim: d_ff = 4 × d_model; Linear layer: d_model→d_ff→d_model |
Layer Normalization | [N, T, d_model] | [N, T, d_model] | Normalization along the last dimension: d_model |
Residual Connection | Input + sublayer output | Same as input | Element-wise addition |
Output Projection | [N, T, d_model] | [N, T, P] | Linear layer: [d_model→P] |
| N | T | M | d_model | h | d_ff | P |
|---|---|---|---|---|---|---|
| Batch size = 64 | Sequence length (number of time steps) | Input feature dimension | Hidden dimension of the model | Number of attention heads (default = 8) | Hidden dimension of the feed-forward network (4 × d_model) | Output dimension (P = 1) |
Module Name | Input Dimensions | Output Dimensions | Hidden Layer | Activation Function |
---|---|---|---|---|
Variable Selection Network | Static: [N, S]; past: [N, T_past, C]; future: [N, S] | Same as input | MLP: [S→64→32→S] | GELU |
Static Covariate Encoder | Static features: [N, S] | [N, 4 × d] | MLP: [S→64→32→4 × d] | GELU + LayerNorm |
LSTM Layer | Past features: [N, T_past, C] | Hidden states: [N, T_past, d] | Hidden units: d; single-layer unidirectional | Tanh |
Temporal Self-Attention | Concatenated features: [N, T_total, d]; T_total = T_past + T_feature | Attention output: [N, T_total, d] | Multi-head (4 heads); key/value dim: d_k = d_v = d//4 | Scaled dot-product |
Gated Residual Network | Primary input: [N, T, d] | Gated output: [N, T, d] | FC layer: [d→64→32→d] Skip Connection | GLU (gated linear unit) |
Quantile Prediction | Final feature [N, T_feature, d] | [N, T_total, Q] (Q: number of quantiles) | FC layer: [d→32→Q] | Linear |
| N | S | C | K | T_past | T_feature | d | Q |
|---|---|---|---|---|---|---|---|
| Batch size = 64 | Static feature dimension (e.g., temperature + holiday = 2) | Past dynamic feature dimension | Feature dimension | Historical time steps | Forecast horizon (e.g., 7 days) | Hidden dimension (default = 64) | Number of quantiles |
| | Module | Input Dimension | Output Dimension | Parameters |
|---|---|---|---|---|
| | Weakly Supervised Cascade Network Feature Selection | Key features: [N, F] | Selected key features: [N, M] | Weights: [N, M]; bias: [M] |
| | Input Layer | [N, M] | [N, F] | Weights: [N, F]; bias: [F] |
| R× | LSTM Layer | [N, F] | [N, 64] | Weights: [N, 64]; bias: [64] |
| | Dropout Layer | [N, 64] | [N, 64] | Rate = 0.5 |
| | LSTM Output Layer | [N, D] | [N, L] | Weights: [N, L]; bias: [L]; D: (64→32→64); L = 64 |
| Prediction Head | Incremental Learning for Regression Prediction (LSTM layer + dropout layer) | [N, L] | [N, T] | L: (64→128→64); T = 64 |
| | Dense Layer (n = 1) | [N, T] | [N, 1] | T: (64→32→1) |
| | Dense Layer | [N, L] | [N, 1] | Activation = sigmoid |
| Category | Term | Definition |
|---|---|---|
| Acronyms | LSTM | Long Short-Term Memory |
| | ARIMA | Autoregressive Integrated Moving Average |
| | SVM | Support Vector Machine |
| | KNN | K-Nearest Neighbors |
| | ANN | Artificial Neural Network |
| | RNN | Recurrent Neural Network |
| | TCN | Temporal Convolutional Network |
| | MSTL | Multi-Seasonal Time-series Decomposition |
| | GMM | Gaussian Mixture Model |
| | MAPE | Mean Absolute Percentage Error |
| | BP | Backpropagation Neural Network |
| | HDD | Heating Degree Day |
| | NHD | Non-Heating Day |
| | TFT | Temporal Fusion Transformer |
| Abbreviations | et al. | and others |
| | Eq. | Equation |
| Symbols | X | Input feature |
| | l | Feature sequence length |
| Feature Type | Features Related to the Daily Natural Gas Load |
|---|---|
| Next-date events (L1) | Time, holiday, special events, school schedule |
| Current-date events (L2) | Time, holiday, special events, school schedule |
| Current monitoring features (L3) | Weather, humidity, wind speed, precipitation, heating status |
| Previous monitoring features (L4) | Natural gas consumption, air pressure, heat index, temperature variation, humidity difference, wind difference |
| Model | Hyperparameters | Seed | Training Strategy | Input Features | Dataset | Loss Function |
|---|---|---|---|---|---|---|
| BP Neural Network | Batch size: 64; optimizer: Adam; iterations: 5000; initial learning rate: 1 × 10−3; decay rate: 1 × 10−2 | 1234 | Early stopping | Temperature, humidity, wind speed, atmospheric pressure, precipitation, time, and holidays | WNGD | MSE loss |
| Transformer | (same as above) | (same as above) | (same as above) | (same as above) | (same as above) | MSE loss |
| TFT | (same as above) | (same as above) | (same as above) | (same as above) | (same as above) | MSE loss + probabilistic loss |
| Ours | (same as above) | (same as above) | (same as above) | (same as above) | (same as above) | MSE loss + regression difference loss |
| Method | Predict Date | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| BP Neural Network | HDD | 0.066 | 0.108 | 0.158 | 0.191 | 0.221 | 0.248 | 0.297 | 0.311 | 0.327 | 0.358 |
| | NHD | 0.041 | 0.065 | 0.070 | 0.090 | 0.152 | 0.080 | 0.083 | 0.089 | 0.093 | 0.103 |
| | Predict Date | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 |
| | HDD | 0.371 | 0.395 | 0.396 | 0.408 | 0.428 | 0.435 | 0.476 | 0.485 | 0.505 | 0.513 |
| | NHD | 0.105 | 0.110 | 0.109 | 0.111 | 0.108 | 0.122 | 0.121 | 0.124 | 0.127 | 0.122 |
| | Predict Date | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 |
| | HDD | 0.487 | 0.525 | 0.534 | 0.522 | 0.541 | 0.561 | 0.573 | 0.549 | 0.594 | 0.587 |
| | NHD | 0.116 | 0.121 | 0.127 | 0.131 | 0.127 | 0.133 | 0.125 | 0.132 | 0.131 | 0.148 |
| | Predict Date | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 |
| | HDD | 0.590 | 0.598 | 0.604 | 0.628 | 0.602 | 0.610 | 0.613 | 0.625 | 0.618 | 0.631 |
| | NHD | 0.153 | 0.150 | 0.159 | 0.169 | 0.166 | 0.169 | 0.180 | 0.186 | 0.194 | 0.203 |
| Transformer | Predict Date | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| | HDD | 0.0321 | 0.0552 | 0.0621 | 0.0683 | 0.0695 | 0.0712 | 0.0715 | 0.0777 | 0.0837 | 0.0872 |
| | NHD | 0.0295 | 0.0381 | 0.0433 | 0.0428 | 0.0464 | 0.0506 | 0.0527 | 0.0539 | 0.0580 | 0.0608 |
| | Predict Date | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 |
| | HDD | 0.0928 | 0.0917 | 0.0959 | 0.0963 | 0.0950 | 0.1086 | 0.1178 | 0.1219 | 0.1219 | 0.1205 |
| | NHD | 0.0619 | 0.0585 | 0.0602 | 0.0619 | 0.0604 | 0.0633 | 0.0704 | 0.0646 | 0.0653 | 0.0651 |
| | Predict Date | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 |
| | HDD | 0.1253 | 0.1235 | 0.1239 | 0.1225 | 0.1174 | 0.1294 | 0.1337 | 0.1386 | 0.1423 | 0.1373 |
| | NHD | 0.0504 | 0.0505 | 0.0501 | 0.0493 | 0.0507 | 0.0491 | 0.0494 | 0.0562 | 0.0501 | 0.0494 |
| | Predict Date | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 |
| | HDD | 0.1174 | 0.1169 | 0.1135 | 0.1101 | 0.1189 | 0.1184 | 0.1247 | 0.1118 | 0.1263 | 0.1285 |
| | NHD | 0.0532 | 0.04833 | 0.0494 | 0.0497 | 0.0481 | 0.0495 | 0.0498 | 0.0518 | 0.0527 | 0.0544 |
| TFT | Predict Date | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| | HDD | 0.0298 | 0.0526 | 0.0583 | 0.0641 | 0.0662 | 0.0679 | 0.0687 | 0.0740 | 0.0797 | 0.0789 |
| | NHD | 0.0272 | 0.0359 | 0.0411 | 0.0406 | 0.0439 | 0.0479 | 0.0501 | 0.0513 | 0.0551 | 0.0577 |
| | Predict Date | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 |
| | HDD | 0.0886 | 0.0874 | 0.0915 | 0.0919 | 0.0906 | 0.1037 | 0.1038 | 0.1126 | 0.1164 | 0.1213 |
| | NHD | 0.0588 | 0.0555 | 0.0571 | 0.0588 | 0.0573 | 0.0612 | 0.0669 | 0.0613 | 0.0621 | 0.0618 |
| | Predict Date | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 |
| | HDD | 0.1149 | 0.1196 | 0.1178 | 0.1231 | 0.1117 | 0.1236 | 0.1275 | 0.1324 | 0.1358 | 0.1312 |
| | NHD | 0.0477 | 0.0478 | 0.0474 | 0.0467 | 0.0481 | 0.0466 | 0.0468 | 0.0534 | 0.0475 | 0.0468 |
| | Predict Date | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 |
| | HDD | 0.1120 | 0.1115 | 0.1083 | 0.1051 | 0.1136 | 0.1138 | 0.1192 | 0.1066 | 0.1204 | 0.1249 |
| | NHD | 0.0501 | 0.0458 | 0.0469 | 0.0472 | 0.0456 | 0.0471 | 0.0473 | 0.0492 | 0.0501 | 0.0516 |
| Our network | Predict Date | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| | HDD | 0.026 | 0.045 | 0.050 | 0.056 | 0.057 | 0.058 | 0.059 | 0.064 | 0.069 | 0.072 |
| | NHD | 0.024 | 0.031 | 0.035 | 0.035 | 0.040 | 0.041 | 0.043 | 0.044 | 0.048 | 0.050 |
| | Predict Date | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 |
| | HDD | 0.077 | 0.075 | 0.079 | 0.079 | 0.078 | 0.090 | 0.090 | 0.098 | 0.101 | 0.105 |
| | NHD | 0.051 | 0.048 | 0.050 | 0.051 | 0.050 | 0.052 | 0.058 | 0.053 | 0.054 | 0.055 |
| | Predict Date | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 |
| | HDD | 0.100 | 0.104 | 0.102 | 0.106 | 0.097 | 0.107 | 0.110 | 0.115 | 0.118 | 0.114 |
| | NHD | 0.041 | 0.041 | 0.043 | 0.042 | 0.044 | 0.043 | 0.042 | 0.048 | 0.043 | 0.042 |
| | Predict Date | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 |
| | HDD | 0.097 | 0.096 | 0.098 | 0.095 | 0.102 | 0.102 | 0.108 | 0.096 | 0.109 | 0.101 |
| | NHD | 0.045 | 0.042 | 0.044 | 0.044 | 0.043 | 0.044 | 0.044 | 0.046 | 0.047 | 0.048 |
| Method | Predict Date | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| No incremental loss | HDD | 0.059 | 0.057 | 0.055 | 0.061 | 0.058 | 0.058 | 0.065 | 0.064 | 0.060 | 0.063 |
| | NHD | 0.054 | 0.059 | 0.063 | 0.063 | 0.058 | 0.062 | 0.064 | 0.062 | 0.058 | 0.062 |
| | Predict Date | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 |
| | HDD | 0.065 | 0.063 | 0.066 | 0.061 | 0.076 | 0.074 | 0.074 | 0.085 | 0.076 | 0.084 |
| | NHD | 0.064 | 0.055 | 0.058 | 0.069 | 0.061 | 0.069 | 0.062 | 0.060 | 0.063 | 0.067 |
| | Predict Date | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 |
| | HDD | 0.089 | 0.087 | 0.083 | 0.084 | 0.083 | 0.085 | 0.090 | 0.091 | 0.085 | 0.114 |
| | NHD | 0.053 | 0.052 | 0.055 | 0.055 | 0.054 | 0.058 | 0.061 | 0.052 | 0.056 | 0.065 |
| | Predict Date | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 |
| | HDD | 0.094 | 0.097 | 0.094 | 0.102 | 0.091 | 0.104 | 0.099 | 0.099 | 0.101 | 0.098 |
| | NHD | 0.058 | 0.061 | 0.069 | 0.059 | 0.057 | 0.058 | 0.060 | 0.050 | 0.056 | 0.056 |
| Our incremental loss | Predict Date | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| | HDD | 0.026 | 0.045 | 0.050 | 0.056 | 0.057 | 0.058 | 0.059 | 0.064 | 0.069 | 0.072 |
| | NHD | 0.024 | 0.031 | 0.035 | 0.035 | 0.040 | 0.041 | 0.043 | 0.044 | 0.048 | 0.050 |
| | Predict Date | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 |
| | HDD | 0.077 | 0.075 | 0.079 | 0.079 | 0.078 | 0.090 | 0.090 | 0.098 | 0.101 | 0.105 |
| | NHD | 0.051 | 0.048 | 0.050 | 0.051 | 0.050 | 0.052 | 0.058 | 0.053 | 0.054 | 0.055 |
| | Predict Date | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 |
| | HDD | 0.100 | 0.104 | 0.102 | 0.106 | 0.097 | 0.107 | 0.110 | 0.115 | 0.118 | 0.114 |
| | NHD | 0.041 | 0.041 | 0.043 | 0.042 | 0.044 | 0.043 | 0.042 | 0.048 | 0.043 | 0.042 |
| | Predict Date | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 |
| | HDD | 0.097 | 0.096 | 0.098 | 0.095 | 0.102 | 0.102 | 0.108 | 0.096 | 0.109 | 0.101 |
| | NHD | 0.045 | 0.042 | 0.044 | 0.044 | 0.043 | 0.044 | 0.044 | 0.046 | 0.047 | 0.048 |
| Method | Predict Date | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| No feature selection | HDD | 0.026 | 0.041 | 0.051 | 0.056 | 0.064 | 0.062 | 0.061 | 0.070 | 0.073 | 0.073 |
| | NHD | 0.027 | 0.036 | 0.042 | 0.044 | 0.046 | 0.054 | 0.046 | 0.053 | 0.052 | 0.052 |
| | Predict Date | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 |
| | HDD | 0.079 | 0.082 | 0.076 | 0.080 | 0.083 | 0.091 | 0.097 | 0.097 | 0.107 | 0.106 |
| | NHD | 0.056 | 0.056 | 0.059 | 0.058 | 0.055 | 0.055 | 0.051 | 0.053 | 0.056 | 0.056 |
| | Predict Date | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 |
| | HDD | 0.116 | 0.107 | 0.110 | 0.102 | 0.103 | 0.106 | 0.115 | 0.121 | 0.107 | 0.105 |
| | NHD | 0.042 | 0.044 | 0.045 | 0.043 | 0.045 | 0.044 | 0.042 | 0.045 | 0.044 | 0.044 |
| | Predict Date | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 |
| | HDD | 0.113 | 0.104 | 0.095 | 0.104 | 0.103 | 0.107 | 0.115 | 0.105 | 0.106 | 0.106 |
| | NHD | 0.043 | 0.043 | 0.048 | 0.046 | 0.045 | 0.046 | 0.048 | 0.055 | 0.049 | 0.052 |
| Our feature selection | Predict Date | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| | HDD | 0.027 | 0.041 | 0.053 | 0.053 | 0.057 | 0.058 | 0.058 | 0.068 | 0.069 | 0.071 |
| | NHD | 0.028 | 0.033 | 0.041 | 0.040 | 0.042 | 0.044 | 0.045 | 0.048 | 0.050 | 0.053 |
| | Predict Date | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 |
| | HDD | 0.074 | 0.075 | 0.075 | 0.074 | 0.077 | 0.095 | 0.087 | 0.093 | 0.101 | 0.112 |
| | NHD | 0.055 | 0.050 | 0.053 | 0.055 | 0.056 | 0.054 | 0.055 | 0.058 | 0.054 | 0.056 |
| | Predict Date | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 |
| | HDD | 0.110 | 0.102 | 0.098 | 0.102 | 0.100 | 0.105 | 0.126 | 0.110 | 0.111 | 0.102 |
| | NHD | 0.041 | 0.041 | 0.043 | 0.042 | 0.042 | 0.043 | 0.043 | 0.043 | 0.045 | 0.043 |
| | Predict Date | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 |
| | HDD | 0.113 | 0.095 | 0.097 | 0.104 | 0.107 | 0.110 | 0.111 | 0.106 | 0.102 | 0.100 |
| | NHD | 0.041 | 0.041 | 0.043 | 0.046 | 0.050 | 0.045 | 0.045 | 0.043 | 0.046 | 0.049 |
| Predict Date | Metric | r = 0.1 | r = 0.2 | r = 0.3 | r = 0.4 | r = 0.5 (Ours) | r = 0.8 | r = 1.0 |
|---|---|---|---|---|---|---|---|---|
| 1 | HDD | 0.0436 | 0.0372 | 0.0308 | 0.0281 | 0.063 | 0.0349 | 0.0462 |
| | NHD | 0.0392 | 0.0327 | 0.0274 | 0.0251 | 0.0241 | 0.0318 | 0.0415 |
| 2 | HDD | 0.0738 | 0.0631 | 0.0524 | 0.0489 | 0.0607 | 0.0794 | 0.0452 |
| | NHD | 0.0504 | 0.0421 | 0.0357 | 0.0332 | 0.0309 | 0.0419 | 0.0543 |
| 3 | HDD | 0.0835 | 0.0702 | 0.0589 | 0.0557 | 0.0503 | 0.0683 | 0.0881 |
| | NHD | 0.0581 | 0.0489 | 0.0413 | 0.0385 | 0.0355 | 0.0476 | 0.0612 |
| 4 | HDD | 0.0921 | 0.0794 | 0.0662 | 0.0618 | 0.0057 | 0.0765 | 0.0973 |
| | NHD | 0.0579 | 0.0489 | 0.0411 | 0.0389 | 0.0351 | 0.0474 | 0.0611 |
| 5 | HDD | 0.0954 | 0.0819 | 0.0687 | 0.0632 | 0.0569 | 0.0789 | 0.1012 |
| | NHD | 0.0653 | 0.0552 | 0.0468 | 0.0437 | 0.0399 | 0.0539 | 0.0691 |
| 6 | HDD | 0.0987 | 0.0853 | 0.0715 | 0.0651 | 0.0584 | 0.0814 | 0.1056 |
| | NHD | 0.0682 | 0.0578 | 0.0491 | 0.0459 | 0.0414 | 0.0564 | 0.0723 |
| 7 | HDD | 0.1002 | 0.0867 | 0.0728 | 0.0664 | 0.0588 | 0.0829 | 0.1078 |
| | NHD | 0.0707 | 0.0599 | 0.0511 | 0.0477 | 0.0431 | 0.0585 | 0.0751 |
| 8 | HDD | 0.1073 | 0.0921 | 0.0775 | 0.0719 | 0.0639 | 0.0883 | 0.1154 |
| | NHD | 0.0724 | 0.0614 | 0.0523 | 0.0489 | 0.0444 | 0.0599 | 0.0768 |
| 9 | HDD | 0.1146 | 0.0984 | 0.0829 | 0.0772 | 0.0689 | 0.0947 | 0.1231 |
| | NHD | 0.0779 | 0.0661 | 0.0564 | 0.0528 | 0.0477 | 0.0643 | 0.0821 |
| 10 | HDD | 0.1198 | 0.1037 | 0.0873 | 0.0815 | 0.0715 | 0.0992 | 0.1289 |
| | NHD | 0.0815 | 0.0693 | 0.0592 | 0.0554 | 0.0672 | 0.0857 | 0.0511 |