Gate-iInformer: Enhancing Long-Sequence Fuel Forecasting in Aviation via Inverted Transformers and Gating Networks
Abstract
1. Introduction
2. Methods
2.1. Data Preprocessing
2.1.1. Alignment, Integration, and Equal-Interval Sampling of Source Data
2.1.2. One-Hot Encoding of Aircraft Type Features
2.1.3. Relative Processing of Spatiotemporal Features and Target Variables
2.1.4. Standardization
The standardized value is computed as

$$z = \frac{x - \mu}{\sigma}$$

where:

- $z$: standardized value.
- $x$: a value in the original data.
- $\mu$: mean of the original data.
- $\sigma$: standard deviation of the original data, calculated as $\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(x_i - \mu)^2}$.
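The standardization above is the usual per-variable z-score; a minimal NumPy sketch (with hypothetical toy data, not the paper's flight dataset):

```python
import numpy as np

def standardize(x):
    """Column-wise z-score standardization: z = (x - mu) / sigma."""
    mu = x.mean(axis=0)    # mean of the original data
    sigma = x.std(axis=0)  # population standard deviation, as defined above
    return (x - mu) / sigma

# Toy matrix (rows: time steps, columns: flight-parameter variables)
data = np.array([[1.0, 10.0],
                 [2.0, 20.0],
                 [3.0, 30.0]])
z = standardize(data)
# Each column of z now has zero mean and unit standard deviation.
```

In practice the mean and standard deviation would be estimated on the training split only and reused for validation and test data, to avoid information leakage.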
2.1.5. Dataset Partitioning Strategy
2.2. Gate-iInformer
3. Results and Discussion
3.1. Inversion Difference Experiment
3.2. Fuel Quantity Constraint Experiment
3.3. Experiment on Increasing Lookback Length
3.4. Gating Network Experiment
3.5. Comparative Experiments
4. Conclusions and Future Work
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Nomenclature
| Symbol | Definition |
|---|---|
| | Multi-dimensional time-series matrix |
| | Transposed time-series matrix |
| | Lookback length |
| | Number of flight parameter variables |
| | Initial token of the n-th variable |
| TrmBlock | Transformer block |
| | Number of Transformer block layers |
| | Future sequence prediction result of the n-th variable |
| | Gating hidden vector |
| | Relative longitude change at time t |
| | Relative latitude change at time t |
| | Relative fuel change at time t |
| | Weight of the k-th sub-model |
| MAE | Mean Absolute Error |
| RMSE | Root Mean Square Error |
| iInformer | Inverted Informer |
| CNN-BiLSTM | Convolutional Neural Network–Bidirectional Long Short-Term Memory |
| ADS-B | Automatic Dependent Surveillance–Broadcast |
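The two error metrics listed in the nomenclature (MAE and RMSE) follow their standard definitions; an illustrative NumPy sketch with hypothetical values:

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean Absolute Error: average magnitude of the prediction errors
    return float(np.mean(np.abs(y_true - y_pred)))

def rmse(y_true, y_pred):
    # Root Mean Square Error: penalizes large errors more heavily
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 2.0, 2.0])
err_mae = mae(y_true, y_pred)    # (0.5 + 0.0 + 1.0) / 3 = 0.5
err_rmse = rmse(y_true, y_pred)  # sqrt((0.25 + 0.0 + 1.0) / 3)
```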
| Hyperparameter | Selection |
|---|---|
| Loss function | Mean Square Error (MSE) |
| Dropout | 0.1 |
| Optimizer | Adam |
| Learning rate | 0.0001 |
| Batch size | 32 |
| Number of epochs | 24 |
| Factor | 1 |
| `moving_avg` | 25 |
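For context, the settings in the hyperparameter table can be collected into a configuration sketch. This is a hypothetical, framework-agnostic illustration (the dataset size and batch arithmetic below are assumptions, not the authors' training script):

```python
# Configuration mirroring the hyperparameter table above.
config = {
    "loss": "MSE",           # Mean Square Error
    "dropout": 0.1,
    "optimizer": "Adam",
    "learning_rate": 1e-4,
    "batch_size": 32,
    "epochs": 24,
    "factor": 1,             # presumably Informer's ProbSparse sampling factor
    "moving_avg": 25,        # presumably a moving-average window size
}

def num_batches(n_samples, batch_size):
    # Full batches per epoch (last partial batch dropped).
    return n_samples // batch_size

# With a hypothetical 8000 training windows:
total_updates = config["epochs"] * num_batches(8000, config["batch_size"])
```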
| Lookback Length | Model | MAE | Improvement | p-Value (α = 0.05) | 95% Confidence Interval (Improvement) |
|---|---|---|---|---|---|
| 50 | Informer | 0.202 ± 0.005 | – | – | – |
| 50 | iInformer | 0.196 ± 0.004 | 2.97% ± 0.41% | 0.021 | [0.003, 0.009] |
| 150 | Informer | 0.152 ± 0.004 | – | – | – |
| 150 | iInformer | 0.134 ± 0.002 | 11.84% ± 0.52% | 0.012 | [0.016, 0.020] |
| 250 | Informer | 0.114 ± 0.003 | – | – | – |
| 250 | iInformer | 0.094 ± 0.002 | 17.54% ± 0.61% | 0.008 | [0.018, 0.022] |
| 350 | Informer | 0.148 ± 0.004 | – | – | – |
| 350 | iInformer | 0.068 ± 0.002 | 53.38% ± 1.25% | 0.002 | [0.076, 0.084] |
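The confidence intervals in the table come from a paired significance test on the MAE improvements. A minimal sketch of how such an interval can be constructed, using a normal (z) approximation and hypothetical per-run differences rather than the paper's actual runs:

```python
import math

def paired_improvement_ci(diffs, z=1.96):
    """Normal-approximation 95% CI for the mean paired improvement.

    `diffs` holds per-run MAE differences (baseline minus proposed model).
    A proper paired t-test would use the t distribution; the z value here
    is an illustrative approximation.
    """
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                              # standard error
    return mean - z * se, mean + z * se

# Hypothetical per-run MAE improvements (Informer minus iInformer):
diffs = [0.020, 0.018, 0.022, 0.020, 0.020]
lo, hi = paired_improvement_ci(diffs)
# The interval is symmetric around the mean improvement.
```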
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wu, Y.; Fu, J.; Li, Y.; Feng, W.; Zhu, Y.; Li, L. Gate-iInformer: Enhancing Long-Sequence Fuel Forecasting in Aviation via Inverted Transformers and Gating Networks. Aerospace 2025, 12, 904. https://doi.org/10.3390/aerospace12100904