Learning Data-Driven Stable Corrections of Dynamical Systems—Application to the Simulation of the Top-Oil Temperature Evolution of a Power Transformer
Abstract
1. Introduction
1.1. The Three Main Simulation-Based Engineering Methodologies Revisited
- Physics-based model improvement. This approach consists of refining the modeling by enriching the model itself, such that its solution exhibits a smaller error with respect to the reference solution;
- Fully data-driven description. The data-driven route consists of widely sampling the parametric space with a large enough number of points, located so as to maximize the domain coverage. These points are grouped into a dataset, whose coverage is defined by its convex hull: interpolation is ensured inside the hull, and the risky extrapolation is limited to the region outside it. Factorial samplings try to maximize the coverage; however, factorial samplings, or those based on the use of Gauss–Lobatto quadratures, related to approximations making use of orthogonal polynomials [1], fail when the dimensionality of the space increases. In that case, sparse sampling is preferred, the Latin hypercube sampling, for instance. Samplings based on Gaussian processes (GPs) aim at placing the points where the uncertainty of the predictions inferred from the previously collected data is maximum. Finally, the so-called active learning techniques drive the sampling with the aim of maximizing the representation of a certain goal-oriented quantity of interest [2]. In what follows, we assume a generic sampling giving access to the reference solution, and perfect measurability. Now, to infer the solution at a new point, it suffices to construct an interpolation or approximation or, more generally, an adequate regression. Different possibilities exist, including regularized polynomial regressions [3], neural networks (NNs) [4,5], support vector regression (SVR) [6], and decision trees and their random forest counterparts [7,8], to name a few. The trickiest issue concerns the error evaluation, which is quantified on a part of the data kept outside the training set, the so-called test set, used to quantify the performance of the trained regression. The main challenges of such a general procedure, particularly exacerbated in the multi-dimensional case, are the following:
- The ability to explain the regression;
- The size of the dataset, which scales with the problem dimensionality;
- The optimal sampling needed to cover the parametric domain while guaranteeing the accuracy of the regression or that of the goal-oriented quantities of interest;
- Hybrid approach. The hybrid approach proceeds by combining the physics-based and data-driven approaches. As described in the next section, it can improve the physics-based accuracy (while profiting from the physics-based explanatory capabilities) through a data-driven enrichment, which, for its part, and under certain conditions, needs less data than the fully data-driven approach just discussed [9]. A minimal illustrative sketch contrasting the fully data-driven and hybrid routes follows this list.
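To make the contrast between the last two routes concrete, the following is a minimal Python sketch (not the paper's implementation): it fits a fully data-driven regression to sampled reference data and, alternatively, learns only a correction on top of a coarse physics-based prediction. The reference function, the coarse model, the regressor, and the sample sizes are all illustrative assumptions.

```python
# Illustrative sketch: fully data-driven regression vs. hybrid (physics + learned correction).
# The "reference" solution, the coarse physics-based model, and all sample sizes are
# assumptions chosen for illustration; they are not the models used in the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def reference(x):            # stands for the (measurable) reference solution
    return np.sin(3 * x) + 0.3 * x**2

def physics_model(x):        # stands for a coarse physics-based prediction
    return 0.3 * x**2        # captures the trend but misses the oscillation

# Sampling of the parametric domain (here 1D; LHS or factorial designs in higher dimension)
x = rng.uniform(-2.0, 2.0, size=200).reshape(-1, 1)
u = reference(x).ravel()

x_train, x_test, u_train, u_test = train_test_split(x, u, test_size=0.25, random_state=0)

# Route 1: fully data-driven regression of the reference solution itself
dd = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
dd.fit(x_train, u_train)

# Route 2: hybrid -- learn only the gap between the measurements and the physics-based model
gap_train = u_train - physics_model(x_train).ravel()
corr = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
corr.fit(x_train, gap_train)

def hybrid_predict(x_new):
    return physics_model(x_new).ravel() + corr.predict(x_new)

# Test-set error, as discussed above, quantifies the performance of each route
err_dd = np.sqrt(np.mean((dd.predict(x_test) - u_test) ** 2))
err_hy = np.sqrt(np.mean((hybrid_predict(x_test) - u_test) ** 2))
print(f"fully data-driven RMSE: {err_dd:.3f}   hybrid RMSE: {err_hy:.3f}")
```

Under the assumptions of this sketch, the learned correction is closer to zero (and smoother) than the full solution, which is why the hybrid route can need less data, as Section 2 illustrates with a simple linear regression reasoning.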
1.2. Paper Organization
2. Illustrating the Hybrid Approach
2.1. A Simple Linear Regression Reasoning
2.2. General Remarks
2.3. On the Domain of Application of the Hybrid Modeling Approach
- Sometimes, the physics-based model operates very accurately in a part of the domain, whereas strong nonlinearities localize in a small region, which can, in that case, be captured by a data-driven learning model, as considered in [12] to address the inelastic behavior of spot-welds;
- When considering the constitutive modeling of materials, the augmented rationale (or hybrid paradigm) expresses the real behavior as a first-order behavior (calibrated from the available data) complemented by an enrichment (or correction) filling the gap between the collected data and the predictions obtained from the assumed model [13];
- When addressing plates (or shells) with noticeable 3D behaviors (deviating from the usual shell theory), a valuable solution consists of an enriched kinematics combining two contributions: the first-order one (the usual shell kinematics) enriched with a second-order contribution [14] that can be learned from data;
- The hybrid modeling can also transfer existing knowledge slightly outside its domain of applicability with small amounts of collected data, as performed in [15] to correct state-of-the-art structural beam models;
- Sometimes, the discrepancy stems from an imperfect alignment between the predicted and measured solutions. That discrepancy may seem very high when evaluated pointwise; however, a small transport suffices to align both solutions. Optimal transport is well suited to these situations, where the hybrid model consists of the usual nominal model enriched with a parametric correction formulated in an optimal transport setting, as described in [16,17];
- In [18], a correction of a Mises yield function was performed from the deviation between the results predicted by using it and the measures obtained at the structure level;
3. Methods
3.1. On the Integration Stability
3.2. Learning Integrators
3.2.1. Recurrent Neural Network
3.2.2. On the Model Memory
3.2.3. Learners with Larger Memory
3.2.4. Residual Nets
4. Simple Procedures for Stabilizing ResNet-Based Integration
4.1. Learning Stable Linear Dynamical Systems
4.2. Learning Stable Nonlinear Dynamical Systems
4.3. Numerical Examples and Discussion
4.3.1. Linear Dynamical System
4.3.2. Nonlinear Dynamical System
4.4. Final Remarks
5. Application to the Evaluation of the Top-Oil Temperature of an Electric Power Transformer
- the position of the tap-changer;
- the load factor, i.e., the ratio between the actual current and the nominal load current;
- P, the iron, copper, and supplementary losses. The power that heats the oil is composed of the losses that do not depend on the transformer load (the iron losses, assumed to be constant) and the losses that do depend on it (the copper and supplementary losses), the latter depending on the average winding temperature and on the load factor (with k a correction factor related to the material resistivity);
- the ambient temperature;
- the simulated top-oil temperature;
- the temperature difference between the simulated top-oil temperature and the ambient temperature;
- the thermal resistance and the thermal capacitance of the equivalent transformer thermal circuit;
- the average winding temperature;
- the difference between the average winding temperature and the simulated oil temperature, assumed to be constant and determined during the commissioning test (as prescribed by the standards). A minimal sketch of how these quantities combine in a first-order thermal model is given after this list.
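As an illustration of how the quantities above combine, the following is a minimal sketch of a first-order top-oil thermal model (an equivalent RC thermal circuit driven by the losses), integrated with an explicit Euler scheme. The numerical values of the thermal resistance, capacitance, losses, and time step are placeholders, not the calibrated parameters of the transformer studied in the paper, and the temperature correction of the copper losses is omitted.

```python
# Minimal sketch of a first-order (RC-circuit) top-oil thermal model.
# All numerical values are illustrative placeholders, not the paper's calibrated parameters.
import numpy as np

R_th = 1.0e-3            # equivalent thermal resistance [K/W]          (assumed)
C_th = 1.0e7             # equivalent thermal capacitance [J/K]         (assumed)
P_iron = 10.0e3          # load-independent (iron) losses [W]           (assumed)
P_copper_rated = 50.0e3  # load-dependent losses at rated load [W]      (assumed)

def losses(K):
    """Total power heating the oil: iron losses plus load-dependent losses.
    The temperature correction of the copper losses (factor k) is omitted here."""
    return P_iron + P_copper_rated * K**2

def simulate_top_oil(theta_amb, K, dt=60.0, theta_oil0=None):
    """Explicit Euler integration of  C_th * d(dtheta)/dt = P - dtheta / R_th,
    with dtheta the top-oil over-temperature and theta_oil = theta_amb + dtheta."""
    theta_amb = np.asarray(theta_amb, dtype=float)
    K = np.asarray(K, dtype=float)
    dtheta = (theta_oil0 - theta_amb[0]) if theta_oil0 is not None else 0.0
    theta_oil = np.empty_like(theta_amb)
    for n in range(len(theta_amb)):
        dtheta += dt / C_th * (losses(K[n]) - dtheta / R_th)
        theta_oil[n] = theta_amb[n] + dtheta
    return theta_oil

# Example: 24 h at one-minute resolution, constant ambient temperature and 80% load
t = np.arange(0, 24 * 3600, 60.0)
theta = simulate_top_oil(theta_amb=np.full(t.shape, 25.0), K=np.full(t.shape, 0.8))
print(f"steady-state top-oil temperature: {theta[-1]:.1f} degC")
```

In the paper, such a physics-based prediction serves as the first-order model that the learned, stabilized correction then enriches.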
- The hybrid approach improves the performance of the physics-based model;
- Enriching the richer, nonlinear physics-based model produced better results than enriching its simplified, linear counterpart;
- When the considered physics-based models were too far from the reference solution (the experimental data), the data-driven model could outperform the hybrid modeling.
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Borzacchiello, D.; Aguado, J.V.; Chinesta, F. Non-intrusive sparse subspace learning for parametrized problems. Arch. Comput. Methods Eng. 2019, 26, 303–326.
2. Settles, B. Active Learning Literature Survey; Computer Sciences Technical Report 1648; University of Wisconsin-Madison: Madison, WI, USA, 2009.
3. Sancarlos, A.; Champaney, V.; Cueto, E.; Chinesta, F. Regularized regressions for parametric models based on separated representations. Adv. Model. Simul. Eng. Sci. 2023, 10, 4.
4. Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016.
5. Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw. 2015, 61, 85–117.
6. Cristianini, N.; Shawe-Taylor, J. An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods; Cambridge University Press: New York, NY, USA, 2000.
7. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
8. Kirkwood, C.W. Decision Tree Primer. 2002. Available online: https://www.public.asu.edu/~kirkwood/DAStuff/refs/decisiontrees/index.html (accessed on 24 June 2023).
9. Chinesta, F.; Cueto, E.; Abisset-Chavanne, E.; Duval, J.L.; Khaldi, F.E. Virtual, Digital and Hybrid Twins: A New Paradigm in Data-Based Engineering and Engineered Data. Arch. Comput. Methods Eng. 2020, 27, 105–134.
10. Sancarlos, A.; Cameron, M.; Abel, A.; Cueto, E.; Duval, J.L.; Chinesta, F. From ROM of electrochemistry to AI-based battery digital and hybrid twin. Arch. Comput. Methods Eng. 2021, 28, 979–1015.
11. Sancarlos, A.; Cameron, M.; Peuvedic, J.M.L.; Groulier, J.; Duval, J.L.; Cueto, E.; Chinesta, F. Learning stable reduced-order models for hybrid twins. Data Centric Eng. 2021, 2, e10.
12. Reille, A.; Champaney, V.; Daim, F.; Tourbier, Y.; Hascoet, N.; Gonzalez, D.; Cueto, E.; Duval, J.L.; Chinesta, F. Learning data-driven reduced elastic and inelastic models of spot-welded patches. Mech. Ind. 2021, 22, 32.
13. Gonzalez, D.; Chinesta, F.; Cueto, E. Learning corrections for hyper-elastic models from data. Front. Mater.-Sect. Comput. Mater. Sci. 2019, 6, 14.
14. Quaranta, G.; Ziane, M.; Haug, E.; Duval, J.L.; Chinesta, F. A minimally-intrusive fully 3D separated plate formulation in computational structural mechanics. Adv. Model. Simul. Eng. Sci. 2019, 6, 11.
15. Moya, B.; Badias, A.; Alfaro, I.; Chinesta, F.; Cueto, E. Digital twins that learn and correct themselves. Int. J. Numer. Methods Eng. 2022, 123, 3034–3044.
16. Torregrosa, S.; Champaney, V.; Ammar, A.; Hebert, V.; Chinesta, F. Surrogate Parametric Metamodel based on Optimal Transport. Math. Comput. Simul. 2022, 194, 36–63.
17. Torregrosa, S.; Champaney, V.; Ammar, A.; Herbert, V.; Chinesta, F. Hybrid Twins based on Optimal Transport. Comput. Math. Appl. 2022, 127, 12–24.
18. Ibanez, R.; Abisset-Chavanne, E.; Gonzalez, D.; Duval, J.L.; Cueto, E.; Chinesta, F. Hybrid Constitutive Modeling: Data-driven learning of corrections to plasticity models. Int. J. Mater. Form. 2019, 12, 717–725.
19. Argerich, C.; Carazo, A.; Sainges, O.; Petiot, E.; Barasinski, A.; Piana, M.; Ratier, L.; Chinesta, F. Empowering Design Based on Hybrid Twin: Application to Acoustic Resonators. Designs 2020, 4, 44.
20. Casteran, F.; Delage, K.; Cassagnau, P.; Ibanez, R.; Argerich, C.; Chinesta, F. Application of Machine Learning tools for the improvement of reactive extrusion simulation. Macromol. Mater. Eng. 2020, 305, 2000375.
21. Ghanem, R.; Soize, C.; Mehrez, L.; Aitharaju, V. Probabilistic learning and updating of a digital twin for composite material systems. Int. J. Numer. Methods Eng. 2022, 123, 3004–3020.
22. Ghnatios, C.; Gérard, P.; Barasinski, A. An advanced resin reaction modeling using data-driven and digital twin techniques. Int. J. Mater. Form. 2023, 16, 5.
23. Kapteyn, M.G.; Willcox, K.E. From Physics-Based Models to Predictive Digital Twins via Interpretable Machine Learning. arXiv 2020, arXiv:2004.11356v3.
24. Tuegel, E.J.; Ingraffea, A.R.; Eason, T.G.; Spottswood, S.M. Reengineering Aircraft Structural Life Prediction Using a Digital Twin. Int. J. Aerosp. Eng. 2011, 2011, 154798.
25. Distefano, G.P. Stability of numerical integration techniques. AIChE J. 1968, 14, 946–955.
26. Dar, S.H.; Chen, W.; Zheng, F.; Gao, S.; Hu, K. An LSTM with Differential Structure and Its Application in Action Recognition. Math. Probl. Eng. 2022, 2022, 7316396.
27. Zhou, G.B.; Wu, J.; Zhang, C.L.; Zhou, Z.H. Minimal gated unit for recurrent neural networks. Int. J. Autom. Comput. 2016, 13, 226–234.
28. Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780.
29. Blaud, P.C.; Chevrel, P.; Claveau, F.; Haurant, P.; Mouraud, A. ResNet and PolyNet based identification and (MPC) control of dynamical systems: A promising way. IEEE Access 2022, 11, 20657–20672.
30. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
31. He, K.; Zhang, X.; Ren, S.; Sun, J. Identity Mappings in Deep Residual Networks. In Computer Vision—ECCV 2016; Lecture Notes in Computer Science; Leibe, B., Matas, J., Sebe, N., Welling, M., Eds.; Springer: Cham, Switzerland, 2016; Volume 9908.
32. Schmid, P.J. Dynamic mode decomposition of numerical and experimental data. J. Fluid Mech. 2010, 656, 5–28.
33. Chen, R.T.Q.; Rubanova, Y.; Bettencourt, J.; Duvenaud, D. Neural ordinary differential equations. In Proceedings of the Conference on Neural Information Processing Systems (NeurIPS 2018), Montreal, QC, Canada, 2–8 December 2018; Volume 32, pp. 1–18.
34. Chen, R.T.Q.; Amos, B.; Nickel, M. Learning Neural Event Functions for Ordinary Differential Equations. arXiv 2020, arXiv:2011.03902.
35. Enciso-Salas, L.; Perez-Zuniga, G.; Sotomayor-Moriano, J. Fault Detection and Isolation for UAVs using Neural Ordinary Differential Equations. IFAC-PapersOnLine 2022, 55, 643–648.
36. Kestelyn, X.; Denis, G.; Champaney, V.; Hascoet, N.; Ghnatios, C.; Chinesta, F. Towards a hybrid twin for infrastructure asset management: Investigation on power transformer asset maintenance management. In Proceedings of the 7th International Advanced Research Workshop on Transformers (ARWtr), Baiona, Spain, 23–26 October 2022; pp. 109–114.
37. IEC 60076-7:2018; Loading Guide for Mineral-Oil-Immersed Power Transformers. International Electrotechnical Commission: Geneva, Switzerland, 2018.
| Layer | Building Block | Activation |
|---|---|---|
| 1 | LSTM layer with five outputs, return sequences set to true | sigmoid + tanh |
| 2 | Flatten | no activation |
| 3 | Dense connection with one output | ReLU |
| 4 | Lambda layer returning the inputs | no activation |
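The table above can be read, for instance, as the following Keras sketch. The input window length, the number of features, and the interpretation of the Lambda layer as extracting the current state for a residual (skip) connection are assumptions made for illustration, not details taken from the paper.

```python
# One plausible Keras reading of the architecture listed above (residual, ResNet-like variant).
# Window length, feature count, and the role of the Lambda layer are illustrative assumptions.
from tensorflow.keras import layers, Model

window, n_features = 10, 1          # assumed: history of the (scalar) state fed to the LSTM

inputs = layers.Input(shape=(window, n_features))
h = layers.LSTM(5, return_sequences=True)(inputs)           # layer 1: 5 outputs, sigmoid + tanh gates
h = layers.Flatten()(h)                                     # layer 2
increment = layers.Dense(1, activation="relu")(h)           # layer 3: predicted increment
last_state = layers.Lambda(lambda x: x[:, -1, :])(inputs)   # layer 4: returns the (last) input state
outputs = layers.Add()([last_state, increment])             # residual update: next state = state + increment
model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.summary()
```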
| Layer | Building Block | Activation |
|---|---|---|
| 1 | LSTM layer with five outputs, return sequences set to true | sigmoid + tanh |
| 2 | Flatten | no activation |
| 3 | Dense connection with one output | linear |
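Once trained, either network is used recursively as a time integrator: each prediction is appended to the input window to produce the next one, which is where errors, and possibly instabilities, can accumulate, motivating the stabilization procedures of Section 4. The following is a minimal rollout sketch; the helper name `rollout` and the input/output shapes (matching the tables above) are assumptions for illustration.

```python
# Minimal rollout sketch: using a trained one-step model as a recurrent time integrator.
# `model` is assumed to map a history window of shape (window_length, n_features) to the next state.
import numpy as np

def rollout(model, initial_window, n_steps):
    """Integrate n_steps forward by repeatedly feeding the model its own predictions."""
    window = np.array(initial_window, dtype=float)      # shape (window_length, n_features)
    trajectory = []
    for _ in range(n_steps):
        next_state = model.predict(window[None, ...], verbose=0)[0]          # shape (n_features,)
        trajectory.append(next_state)
        window = np.concatenate([window[1:], next_state[None, :]], axis=0)   # slide the window
    return np.array(trajectory)

# Usage (illustrative): predicted = rollout(model, measured_history, n_steps=500)
```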
| ResNet | Fully Data-Driven | HT from a Linear Physical Model | HT from a Nonlinear Physical Model |
|---|---|---|---|
| Linear stabilized | 2.173 | 3.143 | 1.620 |
| Nonlinear stabilized | 1.716 | 1.516 | 1.439 |