Solving fluid dynamics problems relies mainly on experimental methods and numerical simulation. However, experiments struggle to reproduce real physical conditions and are economically costly, while numerical simulation is sensitive to the meshing of complicated geometries and time-consuming owing to the billions of degrees of freedom in the relevant spatio-temporal flow fields. Constructing a cost-effective model for fluid dynamics problems is therefore of significant value. Deep learning (DL), with its ability to handle strong nonlinearity and high dimensionality, has attracted much attention for solving fluid problems. Unfortunately, most DL surrogate models are black boxes and lack interpretability. In this paper, a Physics-Informed Neural Network (PINN) combined with Resnet blocks is proposed to solve fluid flows: the governing partial differential equations (i.e., the Navier-Stokes equations) are embedded into the loss function of the deep neural network to drive the model, and the initial and boundary conditions are included in the loss as well. To validate the performance of the PINN with Resnet blocks, Burgers' equation, which has a discontinuous solution, and the Navier-Stokes (N-S) equations, which have a continuous solution, are selected. The results show that the PINN with Resnet blocks (Res-PINN) has stronger predictive ability than traditional deep learning methods. Moreover, Res-PINN can predict the whole velocity and pressure fields of spatio-temporal fluid flows; the magnitude of the mean squared error of the flow field reaches . Inverse problems for the fluid flows are also conducted successfully: the errors of the inverse parameters are 0.98% and 3.1% on clean data and 0.99% and 3.1% on noisy data.
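The core idea described above — embedding the PDE residual, together with initial and boundary conditions, into the loss of a network built from residual blocks — can be sketched as follows. This is a minimal illustrative example in PyTorch for the 1-D viscous Burgers' equation; the class and function names (`ResBlock`, `ResPINN`, `pinn_loss`) and all hyperparameters are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """Fully connected Resnet-style block with a skip connection (illustrative)."""
    def __init__(self, width):
        super().__init__()
        self.fc1 = nn.Linear(width, width)
        self.fc2 = nn.Linear(width, width)
        self.act = nn.Tanh()

    def forward(self, x):
        # output = nonlinear path + identity shortcut
        return self.act(self.fc2(self.act(self.fc1(x)))) + x

class ResPINN(nn.Module):
    """PINN surrogate u(x, t) built from stacked residual blocks (illustrative)."""
    def __init__(self, width=20, n_blocks=3):
        super().__init__()
        self.inp = nn.Linear(2, width)    # inputs: (x, t)
        self.blocks = nn.Sequential(*[ResBlock(width) for _ in range(n_blocks)])
        self.out = nn.Linear(width, 1)    # output: u

    def forward(self, xt):
        return self.out(self.blocks(torch.tanh(self.inp(xt))))

def pde_residual(model, xt, nu=0.01):
    """Residual of Burgers' equation u_t + u*u_x - nu*u_xx at collocation points."""
    xt = xt.clone().requires_grad_(True)
    u = model(xt)
    grads = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
    u_x, u_t = grads[:, 0:1], grads[:, 1:2]
    u_xx = torch.autograd.grad(u_x, xt, torch.ones_like(u_x),
                               create_graph=True)[0][:, 0:1]
    return u_t + u * u_x - nu * u_xx

def pinn_loss(model, xt_f, xt_bc, u_bc):
    """Physics residual at collocation points + data mismatch at IC/BC points."""
    mse_f = pde_residual(model, xt_f).pow(2).mean()
    mse_bc = (model(xt_bc) - u_bc).pow(2).mean()
    return mse_f + mse_bc
```

Minimizing `pinn_loss` over network weights drives the surrogate toward satisfying both the governing equation and the imposed initial/boundary data; for an inverse problem, an unknown coefficient such as `nu` would itself be made a trainable parameter.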
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.