Article

Bayesian Uncertainty Quantification with Multi-Fidelity Data and Gaussian Processes for Impedance Cardiography of Aortic Dissection

1 Institute of Theoretical Physics-Computational Physics, Graz University of Technology, 8010 Graz, Austria
2 Institute of Mechanics, Graz University of Technology, 8010 Graz, Austria
3 Institute of Fundamentals and Theory in Electrical Engineering, Graz University of Technology, 8010 Graz, Austria
* Authors to whom correspondence should be addressed.
Entropy 2020, 22(1), 58; https://doi.org/10.3390/e22010058
Received: 26 November 2019 / Revised: 26 December 2019 / Accepted: 27 December 2019 / Published: 31 December 2019
In 2000, Kennedy and O’Hagan proposed a model for uncertainty quantification that combines data of several levels of sophistication, fidelity, quality, or accuracy, e.g., a coarse and a fine mesh in finite-element simulations. They assumed each level to be describable by a Gaussian process, and used low-fidelity simulations to improve inference on costly high-fidelity simulations. Departing from there, we move away from the common non-Bayesian practice of optimization and marginalize the parameters instead. Thus, we avoid the awkward logical dilemma of having to choose parameters and of neglecting that choice’s uncertainty. We propagate the parameter uncertainties by averaging the predictions and the prediction uncertainties over all possible parameters. This is done analytically for all but the nonlinear or inseparable kernel function parameters. What is left is a low-dimensional and feasible numerical integral depending on the choice of kernels, thus allowing for a fully Bayesian treatment. By quantifying the uncertainties of the parameters themselves, we show that “learning” or optimizing those parameters has little meaning when data are scarce, and thereby justify our mathematical efforts. The recent hype about machine learning has long spilled over into computational engineering but fails to acknowledge that machine learning is a big-data problem, whereas in computational engineering we usually face a small-data problem. We devise the fully Bayesian uncertainty quantification method in a notation following the tradition of E.T. Jaynes and find that generalization to an arbitrary number of levels of fidelity and parallelization become rather easy. We scrutinize the method with mock data and demonstrate its advantages in its natural application, where high-fidelity data are scarce but low-fidelity data are plentiful. We then apply the method to quantify the uncertainties in finite element simulations of impedance cardiography of aortic dissection.
Aortic dissection is a cardiovascular disease that frequently requires immediate surgical treatment and, thus, a fast diagnosis beforehand. While traditional medical imaging techniques such as computed tomography, magnetic resonance tomography, or echocardiography certainly do the job, impedance cardiography is also a clinical standard tool and promises to allow earlier diagnoses as well as to detect patients who would otherwise go under the radar for too long.
Keywords: uncertainty quantification; multi-fidelity; Gaussian processes; probability theory; Bayes; impedance cardiography; aortic dissection
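The Kennedy–O'Hagan construction summarized in the abstract can be illustrated with a short sketch: the high-fidelity process is modeled as a scaled low-fidelity process plus an independent discrepancy, f_hi(x) = ρ·f_lo(x) + δ(x), and the scaling parameter ρ is marginalized over a grid (weighted by the marginal likelihood) rather than optimized. This is a minimal illustration, not the paper's implementation; the squared-exponential kernels, length scales, and grid bounds below are illustrative assumptions.

```python
import numpy as np

def rbf(a, b, ell=0.3, var=1.0):
    # Squared-exponential kernel between 1-D input arrays a and b.
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

def joint_cov(xl, xh, rho, noise=1e-6):
    # Kennedy-O'Hagan two-level model: f_hi(x) = rho * f_lo(x) + delta(x),
    # with independent GP priors on f_lo and delta.
    Kll = rbf(xl, xl)                                       # low-fidelity block
    Klh = rbf(xl, xh)                                       # cross block
    Khh = rho**2 * rbf(xh, xh) + rbf(xh, xh, ell=0.5, var=0.1)
    K = np.block([[Kll, rho * Klh],
                  [rho * Klh.T, Khh]])
    return K + noise * np.eye(len(xl) + len(xh))

def predict_mean(xs, xl, yl, xh, yh, rho):
    # Posterior mean of the high-fidelity process at xs for a fixed rho.
    K = joint_cov(xl, xh, rho)
    y = np.concatenate([yl, yh])
    ks = np.hstack([rho * rbf(xs, xl),
                    rho**2 * rbf(xs, xh) + rbf(xs, xh, ell=0.5, var=0.1)])
    return ks @ np.linalg.solve(K, y)

def marginal_predict(xs, xl, yl, xh, yh, rhos):
    # Marginalize rho instead of optimizing it: weight each fixed-rho
    # prediction by the Gaussian marginal likelihood of the data.
    y = np.concatenate([yl, yh])
    logw, means = [], []
    for r in rhos:
        K = joint_cov(xl, xh, r)
        _, logdet = np.linalg.slogdet(K)
        logw.append(-0.5 * (y @ np.linalg.solve(K, y) + logdet))
        means.append(predict_mean(xs, xl, yl, xh, yh, r))
    w = np.exp(np.array(logw) - max(logw))
    w /= w.sum()
    return np.sum(w[:, None] * np.array(means), axis=0)

# Synthetic example: many cheap low-fidelity points, few expensive high-fidelity ones.
xl = np.linspace(0.0, 1.0, 11)
yl = 0.8 * np.sin(2 * np.pi * xl)          # low fidelity: scaled truth
xh = np.array([0.0, 0.4, 0.8])
yh = np.sin(2 * np.pi * xh)                # high fidelity: the truth
pred = marginal_predict(np.array([0.2]), xl, yl, xh, yh, np.linspace(0.5, 1.5, 21))
```

Averaging over the ρ grid is a toy stand-in for the paper's analytic marginalization of all but the nonlinear kernel parameters; with only three high-fidelity points, the posterior weight over ρ remains broad, which is exactly the small-data regime where point estimates of parameters are unreliable.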
MDPI and ACS Style

Ranftl, S.; Melito, G.M.; Badeli, V.; Reinbacher-Köstinger, A.; Ellermann, K.; von der Linden, W. Bayesian Uncertainty Quantification with Multi-Fidelity Data and Gaussian Processes for Impedance Cardiography of Aortic Dissection. Entropy 2020, 22, 58. https://doi.org/10.3390/e22010058
