Development of a Tractor Hydrostatic Transmission Efficiency Prediction Model Using Novel Hybrid Deep Kernel Learning and Residual Radial Basis Function Interpolator Model
Abstract
1. Introduction
- (1)
- The proposed predictive model was developed using DKL, which is advantageous when only a limited number of training samples is available, and was further enhanced by incorporating a residual Radial Basis Function (RBF) interpolator to compensate for residual errors (a minimal implementation sketch follows this list).
- (2)
- For model training, the input datasets were constructed using the maximum, minimum, and mean values of the input variables. The input variables are the input shaft speed, HST ratio, and load, and the output variable is the overall efficiency.
- (3)
- Despite relying on minimal datasets, the proposed model is expected to provide stable and accurate HST efficiency predictions across the full operating range of agricultural tractors.
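As a concrete illustration of the architecture summarized above, the sketch below pairs a GPyTorch deep-kernel-learning regressor with a SciPy RBF interpolator fitted to the DKL training residuals. It is a minimal sketch, not the authors' implementation: the hyperparameter values (64 neurons, learning rate 3.147 × 10⁻², weight decay 7.749 × 10⁻⁵, RBF smoothing 9.100 × 10⁻⁶, 59 epochs) come from the hyperparameter table reported in the Results, while the network depth, feature dimension, base kernel, and RBF kernel choice are assumptions.

```python
# Minimal sketch of a DKL regressor with residual RBF correction (assumptions:
# network depth, feature dimension, and kernel choices are illustrative only).
import torch
import gpytorch
from scipy.interpolate import RBFInterpolator


class DKLModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood, n_neurons=64, feat_dim=2):
        super().__init__(train_x, train_y, likelihood)
        # Small feed-forward feature extractor placed in front of the GP kernel.
        self.feature_extractor = torch.nn.Sequential(
            torch.nn.Linear(train_x.shape[-1], n_neurons),
            torch.nn.ReLU(),
            torch.nn.Linear(n_neurons, feat_dim),
        )
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        z = self.feature_extractor(x)
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(z), self.covar_module(z)
        )


def fit_hybrid(train_x, train_y, epochs=59, lr=3.147e-2,
               weight_decay=7.749e-5, smoothing=9.100e-6):
    """train_x: float32 tensor (n, 3); train_y: float32 tensor (n,) of efficiency [%]."""
    likelihood = gpytorch.likelihoods.GaussianLikelihood()
    model = DKLModel(train_x, train_y, likelihood)
    mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr, weight_decay=weight_decay)

    model.train()
    likelihood.train()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = -mll(model(train_x), train_y)  # negative marginal log likelihood
        loss.backward()
        optimizer.step()

    # Fit a thin-plate-spline RBF interpolator to the DKL residuals on the training set.
    model.eval()
    likelihood.eval()
    with torch.no_grad(), gpytorch.settings.debug(False):  # debug(False): allow prediction at training inputs
        dkl_train_mean = likelihood(model(train_x)).mean
    residuals = (train_y - dkl_train_mean).numpy()
    rbf = RBFInterpolator(train_x.numpy(), residuals, smoothing=smoothing)

    def predict(x_new):
        with torch.no_grad():
            base = likelihood(model(x_new)).mean.numpy()
        return base + rbf(x_new.numpy())  # DKL prediction + residual correction

    return predict
```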
2. Materials and Methods
2.1. HST System
2.2. HST Test Bench
2.3. Efficiency Prediction Model
2.3.1. Data Sampling from an Experimental Design Perspective
2.3.2. Data Partitioning and Preprocessing
2.3.3. DKL Regression Model
2.3.4. Residual Modeling with the RBF Interpolator
2.3.5. Hyperparameter Extraction via Bayesian Optimization
Neuron Numbers
Learning Rates
Weight Decay
RBF Smoothing
Epochs
2.4. Model Training and Validation
3. Results
3.1. Experimental Data Measurement Using HST Test Bench
3.2. Results of Data Partitioning and Preprocessing
3.3. Hyperparameter Extraction Through Bayesian Optimization
3.4. Validation of Prediction Model
3.4.1. Accuracy Comparison with Commonly Used Prediction Models
3.4.2. DKL Only, Residual RBF Interpolator Only, DKL with Residual RBF Interpolator
4. Discussion
5. Conclusions
- (1)
- A prediction model was developed for accurate HST efficiency estimation using 27 sets of experimental data. The model was constructed based on DKL, which is advantageous for prediction with limited datasets, and was further enhanced by integrating a residual RBF interpolator for error correction.
- (2)
- A minimal dataset was constructed by extracting samples at the maximum, minimum, and mean values of each input variable (input shaft speed, HST ratio, and load). Consequently, 27 data points were selected from the full set of 5092 experimental samples (a sampling sketch follows this list).
- (3)
- Optimal hyperparameters were extracted to enhance the model’s prediction accuracy. Bayesian optimization was performed, and the neuron number, learning rate, weight decay, RBF smoothing, and number of epochs were determined.
- (4)
- The performance of the proposed DKL with residual RBF interpolator model was validated by comparing its predictive performance with that of existing models. The proposed model achieved high prediction accuracy (R² = 0.93; MAPE = 5.94%; RMSE = 4.05), even with limited training data, exceeding the NN (R² = 0.41; MAPE = 19.57%; RMSE = 11.94), Random Forest (R² = 0.11; MAPE = 19.59%; RMSE = 14.70), XGBoost (R² = −2.25; MAPE = 39.35%; RMSE = 28.11), GP (R² = 0.63; MAPE = 14.53%; RMSE = 9.46), and SVR (R² = 0.54; MAPE = 16.16%; RMSE = 10.57) models.
- (5)
- The proposed model was compared with the DKL-only and residual-RBF-interpolator-only models. The proposed model exhibited higher prediction accuracy than both the DKL-only model (R² = 0.83; MAPE = 9.26%; RMSE = 6.20) and the residual-RBF-interpolator-only model (R² = 0.71; MAPE = 12.34%; RMSE = 8.31), confirming that the combined model outperforms either component used alone.
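As referenced in item (2), the 27-point training set corresponds to a 3 × 3 × 3 full-factorial combination of the minimum, mean, and maximum levels of the three input variables. The sketch below shows one plausible way to extract such a subset from the full measurement table; the column names and the nearest-sample matching step are assumptions for illustration, not the authors' exact procedure.

```python
# Hedged sketch: pick the measured sample closest to each of the 27 (min, mean,
# max) level combinations of the three input variables. Column names are hypothetical.
import itertools
import pandas as pd

def select_factorial_subset(df: pd.DataFrame,
                            inputs=("speed_rpm", "hst_ratio", "load_bar")) -> pd.DataFrame:
    levels = {c: (df[c].min(), df[c].mean(), df[c].max()) for c in inputs}
    scale = {c: df[c].max() - df[c].min() for c in inputs}  # for distance normalization
    picked = []
    for combo in itertools.product(*(levels[c] for c in inputs)):  # 3**3 = 27 targets
        # Normalized squared distance of every measured point to the target combination.
        dist = sum(((df[c] - v) / scale[c]) ** 2 for c, v in zip(inputs, combo))
        picked.append(dist.idxmin())  # index of the nearest measured sample
    return df.loc[pd.unique(picked)]  # up to 27 unique measured samples

# Usage (hypothetical): subset = select_factorial_subset(full_measurements_df)
```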
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| HST | Hydrostatic transmission |
| HST ratio | Hydrostatic transmission pump swash-plate ratio (%) |
| Load | Hydrostatic transmission line pressure difference (bar) |
| NN | Neural network |
| DKL | Deep kernel learning |
| RBF | Radial basis function |
| GP | Gaussian process |
| RMSE | Root mean squared error |
| LOOCV | Leave-one-out cross-validation |
| MAPE | Mean absolute percentage error (%) |
References
- Kim, W.S.; Kim, Y.J.; Park, S.U.; Hong, S.J.; Kim, Y.S. Evaluation of PTO severeness for 78 kW-class tractor according to disk plow tillage and rotary tillage. J. Drive Control 2019, 16, 23–31. [Google Scholar] [CrossRef]
- Kim, W.S.; Kim, Y.J.; Kim, Y.S.; Baek, S.Y.; Baek, S.M.; Lee, D.H.; Nam, K.C.; Kim, T.B.; Lee, H.J. Development of control system for automated manual transmission of 45-kW agricultural tractor. Appl. Sci. 2020, 10, 2930. [Google Scholar] [CrossRef]
- Lukas, B.; Patrick, B.; Leon, S.; Markus, K. Enhanced efficiency prediction of an electrified off-highway vehicle transmission utilizing machine learning methods. Procedia Comput. Sci. 2021, 192, 417–426. [Google Scholar] [CrossRef]
- Park, Y.J.; Kim, S.C.; Kim, J.G. Analysis and verification of power transmission characteristics of the hydromechanical transmission for agricultural tractors. J. Mech. Sci. Technol. 2016, 30, 5063–5072. [Google Scholar] [CrossRef]
- Kim, D.M.; Kim, S.C.; Noh, D.K.; Jang, J.S. Jerk phenomenon of the hydrostatic transmission through the experiment and analysis. Int. J. Automot. Technol. 2015, 16, 783–790. [Google Scholar] [CrossRef]
- Ho, T.H.; Ahn, K.K. Modeling and simulation of hydrostatic transmission system with energy regeneration using hydraulic accumulator. J. Mech. Sci. Technol. 2010, 24, 1163–1175. [Google Scholar] [CrossRef]
- Jung, G.H. Gear train design of 8-speed automatic transmission for tractor. J. Drive Control 2013, 10, 30–36. [Google Scholar] [CrossRef]
- Manring, N.D. Mapping the efficiency for a hydrostatic transmission. J. Dyn. Sys. Meas. Control 2016, 138, 031004. [Google Scholar] [CrossRef]
- Kim, S.D.; Cho, H.S.; Lee, C.O. A parameter sensitivity analysis for the dynamic model of a variable displacement axial piston pump. Proc. Inst. Mech. Eng. 1987, 201, 235–243. [Google Scholar] [CrossRef]
- Manring, N.D.; Luecke, G.R. Modeling and designing a hydrostatic transmission with a fixed-displacement motor. J. Dyn. Sys. Meas. Control 1998, 120, 45–49. [Google Scholar] [CrossRef]
- Dasgupta, K. Analysis of a hydrostatic transmission system using low speed high torque motor. Mech. Mach. Theory 2000, 35, 1481–1499. [Google Scholar] [CrossRef]
- Mandal, S.K.; Singh, A.K.; Verma, Y.; Dasgupta, K. Performance investigation of hydrostatic transmission system as a function of pump speed and load torque. J. Inst. Eng. India Ser. C 2012, 93, 187–193. [Google Scholar] [CrossRef]
- Pandey, A.K.; Vardhan, A.; Dasgupta, K. Theoretical and experimental studies of the steady-state performance of a primary and secondary-controlled closed-circuit hydrostatic drive. J. Process Mech. Eng. 2019, 233, 1024–1035. [Google Scholar] [CrossRef]
- Singh, V.P.; Pandey, A.K.; Dasgupta, K. Steady-state performance investigation of closed-circuit hydrostatic drive using variable displacement pump and variable displacement motor. J. Process Mech. Eng. 2020, 235, 249–258. [Google Scholar] [CrossRef]
- Wöhling, T.; Delgadillo, A.O.C.; Kraft, M.; Guthke, A. Comparing Physics-Based, Conceptual and Machine-Learning Models to Predict Groundwater Levels by BMA. Groundwater 2025, 63, 484–505. [Google Scholar] [CrossRef]
- Choi, D.; An, Y.; Lee, N.; Park, J.; Lee, J. Comparative Study of Physics-Based Modeling and Neural Network Approach to Predict Cooling in Vehicle Integrated Thermal Management System. Energies 2020, 13, 5301. [Google Scholar] [CrossRef]
- Gumiere, S.J.; Camporese, M.; Botto, A.; Lafond, J.A.; Paniconi, C.; Gallichand, J.M.; Rousseau, A.N. Machine Learning vs. Physics-Based Modeling for Real-Time Irrigation Management. Front. Water 2020, 2, 8. [Google Scholar] [CrossRef]
- Brunton, S.L.; Noack, B.R.; Koumoutsakos, P. Machine learning for fluid mechanics. Annu. Rev. Fluid Mech. 2020, 52, 477–508. [Google Scholar] [CrossRef]
- Lee, J.W.; Kim, H.G.; Jang, J.H.; Park, S.D. An experimental study of the characteristics of hydrostatic transmission. Adv. Robot. 2015, 29, 939–946. [Google Scholar] [CrossRef]
- Lu, K.; Liu, M.; Lu, Z.; Shi, J.; Xing, P.; Wang, L. Research of transmission efficiency prediction of heavy-duty tractors HMCVT based on VMD and PSO-BP. Agriculture 2024, 14, 539. [Google Scholar] [CrossRef]
- Li, C.; Xia, Z.; Tang, Y. Prediction and dynamic simulation verification of output characteristics of radial piston motors based on neural networks. Machines 2024, 12, 491. [Google Scholar] [CrossRef]
- Hagiwara, K.; Fukumizu, K. Relation between Weight Size and Degree of Over-Fitting in Neural Network Regression. Neural Netw. 2008, 21, 377–386. [Google Scholar] [CrossRef]
- Baiz, A.A.; Ahmadi, H.; Shariatmadari, F.; Karimi Torshizi, M.A. A Gaussian Process Regression Model to Predict Energy Contents of Corn for Poultry. Poult. Sci. 2020, 99, 5838–5843. [Google Scholar] [CrossRef]
- Shi, X.; Jiang, D.; Qian, W.; Liang, Y. Application of the Gaussian process regression method based on a combined kernel function in engine performance prediction. ACS Omega 2022, 7, 41732–41743. [Google Scholar] [CrossRef]
- Mallick, A.; Dwivedi, C.; Kailkhura, B.; Joshi, G.; Han, T.Y. Deep kernels with probabilistic embeddings for small-data learning. arXiv 2021, arXiv:1910.05858. [Google Scholar] [CrossRef]
- Wilson, A.G.; Hu, Z.; Salakhutdinov, R.; Xing, E.P. Deep Kernel Learning. arXiv 2016, arXiv:1511.02222. [Google Scholar] [CrossRef]
- Singh, S.; Hernández-Lobato, J.M. Deep Kernel learning for reaction outcome prediction and optimization. Commun. Chem. 2024, 7, 136. [Google Scholar] [CrossRef]
- Costa, G.K.; Sepehri, N. Understanding overall efficiency of hydrostatic pumps and motors. Int. J. Fluid Power 2018, 19, 106–116. [Google Scholar] [CrossRef]
- Skorek, G. Study of losses and energy efficiency of hydrostatic drives with hydraulic cylinder. Pol. Marit. Res. 2018, 25, 114–129. [Google Scholar] [CrossRef]
- KS B 6516:2021; Test Methods for Electronically Controlled Oil Hydraulic Pumps. Korean Standards Association: Seoul, Republic of Korea, 2021.
- Van Hoeven, L.R.; Janssen, M.P.; Roes, K.C.; Koffijberg, H. Aiming for a representative sample: Simulating random versus purposive strategies for hospital selection. BMC Med. Res. Methodol. 2015, 15, 90. [Google Scholar] [CrossRef]
- Palinkas, L.A.; Horwitz, S.M.; Green, C.A.; Wisdom, J.P.; Duan, N.; Hoagwood, K. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Adm. Policy Ment. Health Ment. Health Serv. Res. 2015, 42, 533–544. [Google Scholar] [CrossRef]
- Box, G.E.P.; Wilson, K.B. On the Experimental Attainment of Optimum Conditions. J. R. Stat. Soc. Ser. B 1951, 13, 1–38. [Google Scholar] [CrossRef]
- Baker, T.B.; Smith, S.S.; Bolt, D.M.; Loh, W.Y.; Mermelstein, R.; Fiore, M.C.; Piper, M.E.; Collins, L.M. Implementing Clinical Research Using Factorial Designs: A Primer. Behav. Ther. 2017, 48, 567–580. [Google Scholar] [CrossRef]
- Mukangango, J.; Muyskens, A.; Priest, B.W. A robust approach to Gaussian process implementation. Adv. Stat. Clim. Meteorol. Oceanogr. 2024, 10, 143–158. [Google Scholar] [CrossRef]
- Islam, M.J.; Ahmad, S.; Haque, F.; Reaz, M.B.I.; Bhuiyan, M.A.S.; Islam, M.R. Application of min-max normalization on subject-invariant EMG pattern recognition. IEEE Trans. Instrum. Meas. 2022, 71, 2521612. [Google Scholar] [CrossRef]
- Zinage, S.; Mondal, S.; Sarkar, S. DKL-KAN: Scalable deep kernel learning using Kolmogorov–Arnold networks. arXiv 2024, arXiv:2407.21176. [Google Scholar] [CrossRef]
- Milsom, E.; Anson, B.; Aitchison, L. Convolutional deep kernel machines. arXiv 2023, arXiv:2309.09814. [Google Scholar] [CrossRef]
- Ament, S.; Santorella, E.; Eriksson, D.; Letham, B.; Balandat, M.; Bakshy, E. Robust Gaussian processes via relevance pursuit. Adv. Neural Inf. Process Syst. 2024, 37, 61700–61734. [Google Scholar]
- Wenger, J.; Wu, K.; Hennig, P.; Gardner, J.; Pleiss, G.; Cunningham, J.P. Computation-aware gaussian processes: Model selection and linear-time inference. Adv. Neural Inf. Process Syst. 2024, 37, 31316–31349. [Google Scholar]
- Wang, X.; Aitchison, L. How to set AdamW’s weight decay as you scale model and dataset size. arXiv 2024, arXiv:2405.13698. [Google Scholar] [CrossRef]
- Hussein, B.M.; Shareef, S.M. An empirical study on the correlation between early stopping patience and epochs in deep learning. In Proceedings of the ITM Web of Conferences, Erbil, Iraq, 20–21 May 2024; Volume 64, p. 01003. [Google Scholar] [CrossRef]
- Candido, A.; Debbio, L.D.; Giani, T.; Petrillo, G. Bayesian inference with Gaussian processes for the determination of parton distribution functions. Eur. Phys. J. C 2024, 84, 716. [Google Scholar] [CrossRef]
- Kapadia, H.; Feng, L.; Benner, P. Active-learning-driven surrogate modeling for efficient simulation of parametric nonlinear systems. Comput. Methods Appl. Mech. Eng. 2024, 419, 116657. [Google Scholar] [CrossRef]
- Franco-Villoria, M.; Ignaccolo, R. Universal, Residual, and External Drift Functional Kriging. In Geostatistical Functional Data Analysis; Mateu, J., Giraldo, R., Eds.; Wiley: Hoboken, NJ, USA, 2022; pp. 55–72. [Google Scholar]
- Nakamura, A.; Yamanaka, Y.; Nomura, R.; Moriguchi, S.; Terada, K. Radial basis function-based surrogate computational homogenization for elastoplastic composites at finite strain. Comput. Methods Appl. Mech. Eng. 2025, 436, 117708. [Google Scholar] [CrossRef]
- He, H.; Chen, Z.; He, C.; Ni, L.; Chen, G. A hierarchical updating method for finite element model of airbag buffer system under landing impact. Chin. J. Aeronaut. 2015, 28, 1629–1639. [Google Scholar] [CrossRef]
- Huang, T.; Liu, Y.; Pan, Z. Deep residual surrogate model. Inf. Sci. 2022, 605, 86–98. [Google Scholar] [CrossRef]
- de Gooijer, B.M.; Havinga, J.; Geijselaers, H.J.; van den Boogaard, A.H. Radial basis function interpolation of fields resulting from nonlinear simulations. Eng. Comput. 2024, 40, 129–145. [Google Scholar] [CrossRef]
- Poggio, T.; Girosi, F.; Jones, M. From regularization to radial, tensor and additive splines. In Proceedings of the 1993 International Conference on Neural Networks (IJCNN), Nagoya, Japan, 25–29 October 1993. [Google Scholar]
- Guo, Y.; Nath, P.; Mahadevan, S.; Witherell, P. Active learning for adaptive surrogate model improvement in high-dimensional problems. Struct. Multidiscip. Optim. 2024, 67, 122. [Google Scholar] [CrossRef]
- Pietrenko-Dabrowska, A.; Koziel, S.; Golunski, L. Two-stage variable-fidelity modeling of antennas with domain confinement. Sci. Rep. 2022, 12, 17275. [Google Scholar] [CrossRef]
- Ludot, A.; Snedker, T.H.; Kolios, A.; Bayati, I. Data-Driven Surrogate Models for Real-Time Fatigue Monitoring of Chain Mooring Lines in Floating Wind Turbines. Wind Energy Sci. 2025, 2025, 1–36. [Google Scholar] [CrossRef]
- Iyengar, G.; Lam, H.; Wang, T. Is cross-validation the gold standard to estimate out-of-sample model performance? Adv. Neural Inf. Process Syst. 2024, 37, 94736–94775. [Google Scholar] [CrossRef]
- Greber, K.E.; Topka Kłończyński, K.; Nicman, J.; Judzińska, B.; Jarzyńska, K.; Singh, Y.R.; Sawicki, W.; Puzyn, T.; Jagiello, K.; Ciura, K. Application of biomimetic chromatography and QSRR approach for characterizing organophosphate pesticides. Int. J. Mol. Sci. 2025, 26, 1855. [Google Scholar] [CrossRef]
- Sudhakar, A.; Sujatha, S.; Sathiya, M.; Sivaramakrishnan, A.; Subramanian, B.; Venkata, R.K. Bayesian Optimization for Hyperparameter Tuning in Healthcare for Diabetes Prediction. Informing Sci. 2025, 28, 8. [Google Scholar] [CrossRef]
- Cavieres, J.; Karkulik, M. Efficient estimation for a smoothing thin plate spline in a two-dimensional space. arXiv 2024, arXiv:2404.01902. [Google Scholar] [CrossRef]
- Al-Shedivat, M.; Wilson, A.G.; Saatchi, Y.; Hu, Z.; Xing, E.P. Learning scalable deep kernels with recurrent structure. J. Mach. Learn. Res. 2017, 18, 1–37. [Google Scholar]
- van der Lende, M.; Ferrao, J.L.; Müller-Hof, N. Evaluating Uncertainty in Deep Gaussian Processes. arXiv 2025, arXiv:2504.17719. [Google Scholar] [CrossRef]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1097–1105. [Google Scholar] [CrossRef]
- Yong, H.; Huang, J.; Hua, X.; Zhang, L. Gradient centralization: A new optimization technique for deep neural networks. arXiv 2020, arXiv:2004.01461. [Google Scholar] [CrossRef]
- Sun, X.; Wang, N.; Chen, C.Y.; Ni, J.; Agrawal, A.; Cui, X.; Venkataramani, S.; Maghraoui, K.E.; Srinivasan, V.; Gopalakrishnan, K. Ultra-low precision 4-bit training of deep neural networks. Adv. Neural Inf. Process. Syst. 2020, 33, 1796–1807. [Google Scholar]
- Issa, Z.; Horvath, B. Non-parametric online market regime detection and regime clustering for multidimensional and path-dependent data structures. arXiv 2023, arXiv:2306.15835. [Google Scholar] [CrossRef]
- Horňas, J.; Běhal, J.; Homola, P.; Doubrava, R.; Holzleitner, M.; Senck, S. A machine learning based approach with an augmented dataset for fatigue life prediction of additively manufactured Ti-6Al-4V samples. Eng. Fract. Mech. 2023, 293, 109709. [Google Scholar] [CrossRef]
- Ober, S.W.; Rasmussen, C.E.; van der Wilk, M. The Promises and Pitfalls of Deep Kernel Learning. arXiv 2021, arXiv:2102.12108. [Google Scholar] [CrossRef]
- Zhang, K.; Karanth, S.; Patel, B.; Murphy, R.; Jiang, X. Real-time prediction for mechanical ventilation in COVID-19 patients using a multi-task Gaussian process multi-objective self-attention network. arXiv 2021, arXiv:2102.01147. [Google Scholar] [CrossRef]
- Plevris, V.; Solorzano, G.; Bakas, N.P.; Seghier, M.E.A. Investigation of Performance Metrics in Regression Analysis and Machine Learning-Based Prediction Models. In Proceedings of the 8th European Congress on Computational Methods in Applied Sciences and Engineering (ECCOMAS 2022), Oslo, Norway, 5–9 June 2022. [Google Scholar]
- Botchkarev, A. Performance metrics (error measures) in machine learning regression, forecasting and prognostics: Properties and typology. arXiv 2018, arXiv:1809.03306. [Google Scholar] [CrossRef]
- Chicco, D.; Warrens, M.J.; Jurman, G. The Coefficient of Determination R-Squared Is More Informative than SMAPE, MAE, MAPE, MSE and RMSE in Regression Analysis Evaluation. PeerJ Comput. Sci. 2021, 7, e623. [Google Scholar] [CrossRef]
- Kern, C.; Klausch, T.; Kreuter, F. Tree-based Machine Learning Methods for Survey Research. Surv. Res. Methods 2019, 13, 73–93. [Google Scholar]
- Cao, J.; Tao, T. Using machine-learning models to understand nonlinear relationships between land use and travel. Transp. Res. Part D 2023, 123, 103930. [Google Scholar] [CrossRef]
- Lyu, C.; Liu, X.; Mihaylova, L. Review of Recent Advances in Gaussian Process Regression Methods. arXiv 2022, arXiv:2409.08112. [Google Scholar] [CrossRef]
- Drucker, H.; Burges, C.J.C.; Kaufman, L.; Smola, A.; Vapnik, V. Support vector regression machines. Adv. Neural Inf. Process. Syst. 1997, 9, 155–161. [Google Scholar]
- Rivas-Perea, P.; Cota-Ruiz, J.; Garcia Chaparro, D.; Perez Venzor, J.A.; Quezada Carreón, A.; Rosiles, J.G. Support vector machines for regression: A succinct review of large-scale and linear programming formulations. Int. J. Intell. Sci. 2013, 3, 5–14. [Google Scholar] [CrossRef]
- Bloch, G.; Lauer, F.; Colin, G.; Chamaillard, Y. Support vector regression from simulation data and few experimental samples. Inf. Sci. 2008, 178, 3813–3827. [Google Scholar] [CrossRef]
- Kaliappan, J.; Srinivasan, K.; Qaisar, S.M.; Sundararajan, K.; Chang, C.Y.; C, S. Performance evaluation of regression models for the prediction of the COVID-19 reproductive rate. Front. Public Health 2021, 9, 729795. [Google Scholar] [CrossRef]
- Chen, Y.; Zhang, Y.; Li, C.; Zhou, J. Application of XGBoost model optimized by multi-algorithm ensemble in predicting FRP-concrete interfacial bond strength. Materials 2025, 18, 2868. [Google Scholar] [CrossRef]
- Yang, H.; Guo, S.; Xie, H.; Wen, J.; Wang, J. Evaluation of Machine Learning Models for Predicting Performance Metrics of Aero-Engine Combustors. Case Stud. Therm. Eng. 2025, 65, 105627. [Google Scholar] [CrossRef]
- Lee, J.; Cho, Y. National-Scale Electricity Peak Load Forecasting: Traditional, Machine Learning, or Hybrid Model? Energy 2022, 239, 122366. [Google Scholar] [CrossRef]
- Brunzema, P.; Jordahn, M.; Willes, J.; Trimpe, S.; Snoek, J.; Harrison, J. Bayesian Optimization via Continual Variational Last Layer Training. arXiv 2024, arXiv:2412.09477. [Google Scholar] [CrossRef]
- Cong, H.; Wang, B.; Wang, Z. A novel Gaussian process surrogate model with expected prediction error for optimization under constraints. Mathematics 2024, 12, 1115. [Google Scholar] [CrossRef]
- Cha, G.W.; Moon, H.J.; Kim, Y.C. Comparison of Random Forest and Gradient Boosting Machine Models for Predicting Demolition Waste Based on Small Datasets and Categorical Variables. Int. J. Environ. Res. Public Health 2021, 18, 8530. [Google Scholar] [CrossRef]
- Alexander, D.L.; Tropsha, A.; Winkler, D.A. Beware of R2: Simple, unambiguous assessment of the prediction accuracy of QSAR and QSPR models. J. Chem. Inf. Model. 2015, 55, 1316–1322. [Google Scholar] [CrossRef] [PubMed]
- Chen, Y.; Hua, C.; Tong, C.; Zang, Y.; Ruan, J. Volumetric Efficiency in Motor-Driven Two-Dimensional Piston Pumps with Leakage and Reverse Flow under High Pressure. Sci. Rep. 2024, 14, 28963. [Google Scholar] [CrossRef]
- Huang, Y.; Ruan, J.; Zhang, C.; Ding, C.; Li, S. Research on the Mechanical Efficiency of High-Speed 2D Piston Pumps. Processes 2020, 8, 853. [Google Scholar] [CrossRef]
- Kauranne, H. Effect of Operating Parameters on Efficiency of Swash-Plate Type Axial Piston Pump. Energies 2022, 15, 4030. [Google Scholar] [CrossRef]
- Manring, N.D.; Mehta, V.S.; Nelson, B.E.; Graf, K.J.; Kuehn, J.L. Increasing the Power Density for Axial-Piston Swash-Plate Type Hydrostatic Machines. J. Mech. Des. 2013, 135, 071002. [Google Scholar] [CrossRef]
- Wang, Z.; Xing, W.; Kirby, R.; Zhe, S. Physics Informed Deep Kernel Learning. arXiv 2022, arXiv:2006.04976. [Google Scholar] [CrossRef]
- Inaguma, Y.; Yoshida, N. Mathematical Analysis of Influence of Oil Temperature on Efficiencies in Hydraulic Pumps for Automatic Transmissions. SAE Int. J. Passeng. Cars–Mech. Syst. 2013, 6, 786–797. [Google Scholar] [CrossRef]
- Gaugel, S.; Reichert, M. Industrial Transfer Learning for Multivariate Time Series Segmentation: A Case Study on Hydraulic Pump Testing Cycles. Sensors 2023, 23, 3636. [Google Scholar] [CrossRef]
- Orhan, N. Predicting deep well pump performance with machine learning methods during hydraulic head changes. Heliyon 2024, 10, e31505. [Google Scholar] [CrossRef]
- Peric, B.; Engler, M.; Schuler, M.; Gutsche, K.; Woias, P. Using Neural Networks as a Data-Driven Model to Predict the Behavior of External Gear Pumps. Processes 2024, 12, 526. [Google Scholar] [CrossRef]
| Variable | Range | Interval |
|---|---|---|
| Input shaft speed [rpm] | 1000.00–2600.00 | 200.00 |
| HST ratio [%] | 10.00–100.00 | 5.00 |
| Load [bar] | 50.00–340.00 | 10.00 |
| Variable | Minimum | Mean | Maximum | STD | Range |
|---|---|---|---|---|---|
| Input shaft speed [rpm] | 1000.00 | 1800.00 | 2600.00 | 655.64 | 1600.00 |
| HST ratio [%] | 10.00 | 55.56 | 100.00 | 36.78 | 90.00 |
| Load [bar] | 50.00 | 194.81 | 340.00 | 120.24 | 290.00 |
| Efficiency [%] | 8.41 | 48.49 | 78.10 | 25.07 | 69.69 |
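The descriptive statistics above supply the ranges needed for input scaling. Because the reference list cites min-max normalization (Islam et al.), the sketch below assumes min-max scaling to [0, 1] as the preprocessing step; the variable names and their ordering are illustrative assumptions.

```python
# Hedged sketch of min-max scaling using the ranges in the statistics table above.
import numpy as np

RANGES = {  # (minimum, maximum) per input variable, from the table above
    "speed_rpm": (1000.0, 2600.0),
    "hst_ratio": (10.0, 100.0),
    "load_bar": (50.0, 340.0),
}

def min_max_scale(x: np.ndarray, names=("speed_rpm", "hst_ratio", "load_bar")) -> np.ndarray:
    """Scale columns of x (shape n x 3, ordered as `names`) to the [0, 1] interval."""
    lo = np.array([RANGES[n][0] for n in names])
    hi = np.array([RANGES[n][1] for n in names])
    return (x - lo) / (hi - lo)

# Example: one operating point at 1800 rpm, 55% HST ratio, 195 bar load.
print(min_max_scale(np.array([[1800.0, 55.0, 195.0]])))
```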
| Hyperparameters | Range | Data Type | Selected Value |
|---|---|---|---|
| Neuron numbers | [64, 128, 256] | Continuous | 64 |
| Learning rate | [1 × 10⁻³, 1 × 10⁻¹] | Log-uniform | 3.147 × 10⁻² |
| Weight decay | [1 × 10⁻⁵, 1 × 10⁻²] | Log-uniform | 7.749 × 10⁻⁵ |
| RBF smoothing | [1 × 10⁻⁶, 1 × 10⁻¹] | Log-uniform | 9.100 × 10⁻⁶ |
| Epochs | [20, 100] | Integer | 59 |
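The hyperparameter table above defines the Bayesian-optimization search space. The sketch below shows how such a search could be set up with scikit-optimize; the library choice, the number of optimization calls, and the stand-in objective are assumptions, since the outline does not specify the optimization toolchain. In practice, the objective would train the DKL with residual RBF interpolator model and return its LOOCV error; the discrete neuron counts are treated here as a categorical dimension.

```python
# Hedged sketch of Bayesian hyperparameter optimization with scikit-optimize
# over the search space listed in the hyperparameter table above.
from skopt import gp_minimize
from skopt.space import Categorical, Integer, Real

space = [
    Categorical([64, 128, 256], name="n_neurons"),
    Real(1e-3, 1e-1, prior="log-uniform", name="learning_rate"),
    Real(1e-5, 1e-2, prior="log-uniform", name="weight_decay"),
    Real(1e-6, 1e-1, prior="log-uniform", name="rbf_smoothing"),
    Integer(20, 100, name="epochs"),
]

def loocv_rmse(params):
    n_neurons, lr, wd, smoothing, epochs = params
    # Stand-in objective for illustration only: replace with a function that
    # trains the DKL + residual-RBF model with these settings and returns its
    # leave-one-out RMSE over the 27 training samples.
    return (lr - 3.147e-2) ** 2 + (wd - 7.749e-5) ** 2 + smoothing + epochs * 1e-6

result = gp_minimize(loocv_rmse, space, n_calls=60, random_state=0)
print("best hyperparameters:", result.x, "  best objective:", result.fun)
```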
| Model | R² | MAPE (%) | RMSE |
|---|---|---|---|
| NN | 0.41 | 19.57 | 11.94 |
| Random Forest | 0.11 | 19.59 | 14.70 |
| XGBoost | −2.25 | 39.35 | 28.11 |
| GP | 0.63 | 14.53 | 9.46 |
| SVR | 0.54 | 16.16 | 10.57 |
| Model | R² | MAPE (%) | RMSE |
|---|---|---|---|
| DKL only | 0.83 | 9.26 | 6.20 |
| Residual RBF interpolator only | 0.71 | 12.34 | 8.31 |
| DKL with residual RBF interpolator | 0.93 | 5.94 | 4.05 |
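For reference, the R², MAPE, and RMSE values reported above can be computed as in the sketch below; the numerical arrays are hypothetical and only illustrate the metric definitions (scikit-learn returns MAPE as a fraction, hence the factor of 100).

```python
# Minimal sketch of the three reported metrics, computed with scikit-learn.
import numpy as np
from sklearn.metrics import mean_absolute_percentage_error, mean_squared_error, r2_score

y_true = np.array([48.5, 62.1, 30.7, 71.9])  # hypothetical measured efficiency [%]
y_pred = np.array([50.2, 60.8, 33.0, 70.1])  # hypothetical predicted efficiency [%]

r2 = r2_score(y_true, y_pred)
mape = mean_absolute_percentage_error(y_true, y_pred) * 100.0  # expressed as a percentage
rmse = np.sqrt(mean_squared_error(y_true, y_pred))
print(f"R^2 = {r2:.2f}, MAPE = {mape:.2f}%, RMSE = {rmse:.2f}")
```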
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).