Review

Can Artificial Intelligence Accelerate Fluid Mechanics Research?

by Dimitris Drikakis 1,* and Filippos Sofos 2,*

1 Institute for Advanced Modelling and Simulation, University of Nicosia, CY-2417 Nicosia, Cyprus
2 Condensed Matter Physics Laboratory, Department of Physics, University of Thessaly, 35100 Lamia, Greece
* Authors to whom correspondence should be addressed.
Fluids 2023, 8(7), 212; https://doi.org/10.3390/fluids8070212
Submission received: 25 May 2023 / Revised: 27 June 2023 / Accepted: 11 July 2023 / Published: 19 July 2023
(This article belongs to the Special Issue Machine Learning and Artificial Intelligence in Fluid Mechanics)

Abstract

The significant growth of artificial intelligence (AI) methods in machine learning (ML) and deep learning (DL) has opened opportunities for fluid dynamics and its applications in science, engineering and medicine. Developing AI methods for fluid dynamics encompasses different challenges than applications with massive data, such as the Internet of Things. For many scientific, engineering and biomedical problems, the data are not massive, which poses limitations and algorithmic challenges. This paper reviews ML and DL research for fluid dynamics, presents algorithmic challenges and discusses potential future directions.

1. Introduction

In the past few years, machine learning (ML) and deep learning (DL) have been pursued in most social, scientific, engineering, and industrial branches. Initially established at the meeting point of computer science and statistics, ML evolution has been driven by advancements in Artificial Intelligence (AI) and its related algorithms, fortified by "big data" and access to cost-effective computational architectures. ML-based approaches span fields such as construction engineering, aerospace, biomedicine, materials science, education, financial modelling, and marketing [1,2].
Machine learning techniques fall into three groups: unsupervised, semi-supervised, and supervised [2,3]. This paper considers DL as a subset of the broader ML field. Supervised approaches rely on labelled data to train the ML algorithm effectively. This is the most common category, with wide applicability in science and technology [4,5,6,7]. Unsupervised ML involves extracting features from high-dimensional data sets without the need for pre-labelled training data. Well-established clustering methods, such as K-means, hierarchical agglomerative clustering, and DBSCAN [8,9], apply here. Furthermore, dimensionality reduction techniques such as proper orthogonal decomposition (POD) [10], principal component analysis (PCA) [11,12] and dynamic mode decomposition (DMD) [13] emerged from fundamental statistical analysis, and reduced-order modelling techniques have also been associated with unsupervised ML. Meanwhile, semi-supervised learning combines elements of supervised and unsupervised ML and has been successfully incorporated in applications concerning time dependence and image data [14,15].
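To make the dimensionality-reduction idea concrete, the following is a minimal sketch of extracting POD/PCA modes from a snapshot matrix via the singular value decomposition; the random snapshot data and the 95% energy threshold are illustrative assumptions, not taken from any cited study.

import numpy as np

# Synthetic snapshot matrix: each column is one flow-field snapshot
# (random data stands in for real velocity fields here).
rng = np.random.default_rng(0)
n_points, n_snapshots = 1000, 50
X = rng.standard_normal((n_points, n_snapshots))

# Subtract the temporal mean so the modes describe fluctuations about it.
X_fluct = X - X.mean(axis=1, keepdims=True)

# POD/PCA via the thin singular value decomposition:
# columns of U are spatial modes, S holds the modal amplitudes.
U, S, Vt = np.linalg.svd(X_fluct, full_matrices=False)

# Keep the leading modes that capture 95% of the fluctuation energy.
energy = S**2 / np.sum(S**2)
r = int(np.searchsorted(np.cumsum(energy), 0.95)) + 1
X_reduced = U[:, :r].T @ X_fluct  # low-dimensional representation

print(f"{r} modes retain 95% of the fluctuation energy")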
At the heart of ML theories, the algorithms proposed range from linear feed-forward models to complex DL implementations. To distinguish the complexity of various models, the algorithms may fall into shallow or deep ML, with possible combinations. Shallow learning (SL) algorithms, used in both classification and regression tasks, involve linear and multiple linear regression, regularized variants such as LASSO and Ridge, support vector machines (SVM), stochastic methods based on Gaussian processes and Bayes' theorem, and decision trees [16]. Ensemble and super-learner methods, which combine multiple shallow algorithms into more complex estimators, such as the random forest, gradient boosting, and extremely randomized trees, have also grown in popularity in recent years [17].
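As an illustration of the shallow learners listed above, the following sketch compares a few of them with scikit-learn on synthetic tabular data; the dataset, hyperparameters, and target scaling are arbitrary assumptions chosen only for demonstration.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import Ridge, Lasso
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

# Synthetic tabular data standing in for, e.g., fluid property records.
X, y = make_regression(n_samples=300, n_features=8, noise=5.0, random_state=0)
y = (y - y.mean()) / y.std()  # standardize the target so the kernel method behaves well

models = {
    "Ridge": Ridge(alpha=1.0),
    "LASSO": Lasso(alpha=0.01),
    "SVM (RBF kernel)": SVR(C=10.0),
    "Decision tree": DecisionTreeRegressor(max_depth=5),
}

# Five-fold cross-validated R^2 gives a quick, like-for-like comparison.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name:18s} R^2 = {scores.mean():.3f} +/- {scores.std():.3f}")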
Notwithstanding its wide acceptance, the applicability of SL is limited to small-to-medium-sized data sets, which might be a problem in demanding fluid mechanics tasks. Thus, the need for more robust methods has emerged, and DL has come to the fore [18]. The idea of DL is based on the multi-layer perceptron, the artificial neural network (ANN) architecture that loosely mimics the biological functions of the neural networks of the human brain [6].
DL models embed complex mathematical concepts and advanced computer programming techniques in a great number of interconnected nodes and layers [19], making them an appropriate choice for applications that involve image and video processing [20,21,22,23], speech recognition [24], biological data manipulation [25], and flow reconstruction [26]. Nevertheless, a physical interpretation of the outcome is still lacking, which limits their application in the physical sciences, where interpretability has a central role. In such cases, it would be beneficial to adopt models that produce meaningful results, i.e., mathematical expressions, which can reveal correlations with existing empirical relations and propose an analytical approach bound to physical laws.
Conventional neural networks operate solely on data and are often considered "black box" models. On the other hand, symbolic regression (SR) is a class of ML algorithms in which mathematical equations are derived exclusively from data. In contrast to usual regression procedures, SR searches, typically via evolutionary algorithms, over a wide set of mathematical operators and constants to connect input features and produce meaningful outputs [27]. This ability to extract results that are simultaneously interpretable and exploitable has led to the development of numerous genetic programming-based tools that implement SR for disciplines from materials science to construction engineering to medical science [28].
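A minimal sketch of genetic-programming-based symbolic regression follows, using the open-source gplearn package as one possible tool (our choice, not one prescribed by the cited works); the synthetic target law and the restricted operator set are illustrative assumptions.

import numpy as np
from gplearn.genetic import SymbolicRegressor

# Synthetic data with a known analytical law, y = 2*x0^2 + x1, plus a little noise.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
y = 2.0 * X[:, 0] ** 2 + X[:, 1] + 0.01 * rng.standard_normal(200)

# Evolutionary search over a restricted operator set.
sr = SymbolicRegressor(
    population_size=1000,
    generations=20,
    function_set=("add", "sub", "mul", "div"),
    parsimony_coefficient=0.001,  # penalize overly long expressions
    random_state=0,
)
sr.fit(X, y)

# The fitted model is a human-readable expression rather than a black box.
print(sr._program)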
ML methods have been employed in fluid dynamics research but have not yet been widely adopted in engineering practice. Computational fluid dynamics (CFD) discretizes space using meshes: as the grid density increases, so does the accuracy, but at the expense of computational cost. Such techniques have benefited the aerospace, automotive and biomedical industries. The challenges are numerical and physical, and several reviews have been written on specific industrial applications or areas of physics [29,30,31,32]. However, despite significant advances in the field, there is still a need to improve efficiency and accuracy.
The most critical example is turbulence, a flow regime characterised by velocity and pressure fluctuations, instabilities, and rotating fluid structures (eddies) spanning a vast range of scales, from macroscopic down to molecular. Accurately describing transient flow physics using CFD would require many grid points. The approach in which the Navier–Stokes equations are solved directly is called direct numerical simulation (DNS) and is accompanied by tremendous computational overhead. As such, it is limited to specific problems and requires large high-performance computing (HPC) facilities and long run times. Consequently, reduced turbulence models are often used at the expense of accuracy.
Furthermore, while many such techniques exist, no universal model works consistently across all turbulent flows; although many computational methods exist for different problems, the computational barrier is often unavoidable. Recent efforts in numerical modelling frequently focus on how to speed up computations, a task that would benefit many industries. A growing school of thought is to couple these numerical methods with data-driven algorithms to improve accuracy and performance. This is where ML becomes relevant.
The topic of ML and its applications in fluid mechanics is broad, and a single review article does not suffice to cover everything. Our primary aim is to touch on some applications as indicative examples, summarise the ML methods used, and identify challenges and future perspectives.

2. A Brief Overview and Methods Classification

Many articles and reviews are produced each year on subjects related to ML. This review covers the past five years, during which a vast increase in reviews on these subjects was observed. Figure 1 shows the number of publications referring to ML and DL, along with the number of relevant reviews, with data taken from Scopus. Further analysis of the results is presented in the pie charts in Figure 2. The key terms used in the Scopus search were: Machine Learning and Review, from 2017 to 2023.
The findings from this survey revealed some interesting points. The great number of reviews implies that an even greater number of original research articles have been published. If the Scopus search is limited to reviews related to fluids, three major fields of application emerge that have attracted research interest in applying ML methods: (a) theoretical and industrial oil and gas [33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52], (b) CFD simulations [53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68], and (c) health and medical applications [69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86] (Figure 2a).
Another popular subject is fluid property estimation with ML algorithms, a promising alternative to prohibitively expensive experimental measurements. Most reviews have exploited historical data from experiments or simulations to predict fluid behaviour and calculate the properties of interest [87,88,89,90,91,92,93,94,95,96]. Furthermore, heat transfer applications, concerning the addition of nanoparticles in simple fluids [97,98,99,100,101,102] or the investigation of flows in microfluidic channels [103,104,105,106,107,108], play a central role. Of equal significance, though with fewer instances, several recent reviews refer to aerodynamics [109,110,111,112,113], geosciences, geoengineering and environmental subjects [114,115,116,117,118,119,120,121,122,123,124,125,126], energy applications [127,128,129,130], biological processes [131,132,133,134], control processes [135,136,137], multiscale simulations [138,139,140,141], and general ML [2,142].
Common to all these applications is the difficulty of obtaining accurate data; industrial processes usually obtain them from field sensors. In addition to data acquisition and resolution, compression might be needed to handle the data volume effectively, especially in real-time applications [143]. Data availability is a prerequisite: for example, the lack of available data in industrial multiphase oil and gas flows makes it hard to experiment with ML techniques, and the need for interdisciplinary collaboration between industry and academia has been highlighted [42]. Using only high-quality data is important to avoid incorporating erroneous, missing, or redundant data points. Before data are applied to a specific ML algorithmic platform, an extensive statistical analysis is conducted on the dataset, and feature engineering is employed to curate the data. Feature engineering involves manipulating raw data and extracting characteristic features before transforming them into an acceptable ML format. This process encompasses a set of statistical tools (e.g., feature importance selection, correlation tests, and variable transformation and normalization) that can be used to predict the desired outcome and enhance the performance of the model.
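The following sketch illustrates this curation workflow (correlation screening, normalization, and importance ranking) with pandas and scikit-learn; the column names, thresholds, and synthetic target are illustrative assumptions.

import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestRegressor

# Illustrative raw data: sensor-like features and a target property.
rng = np.random.default_rng(2)
df = pd.DataFrame(rng.standard_normal((500, 4)),
                  columns=["pressure", "temperature", "flow_rate", "noise"])
df["redundant"] = df["pressure"] * 1.01  # a nearly duplicated feature
target = 3 * df["pressure"] - 2 * df["temperature"] + rng.standard_normal(500)

# 1) Drop highly correlated (redundant) features.
corr = df.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [c for c in upper.columns if (upper[c] > 0.95).any()]
X = df.drop(columns=to_drop)

# 2) Normalize so all features share a comparable scale.
X_scaled = StandardScaler().fit_transform(X)

# 3) Rank feature importance with a tree ensemble.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_scaled, target)
for name, imp in sorted(zip(X.columns, rf.feature_importances_), key=lambda p: -p[1]):
    print(f"{name:12s} importance = {imp:.3f}")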
All the studies above have incorporated several supervised ML algorithms, while only a few unsupervised algorithms have been exploited for dimensionality reduction (DR) tasks. Figure 2b summarizes the algorithms most frequently encountered, most of which appear in reviews arguing for their applicability to a specific subject. Property prediction with historical data can be realized with SL algorithms such as decision trees, linear regression [144] and simple NNs like the multi-layer perceptron (MLP) [11]. In general, the most widely incorporated SL algorithms in fluid research can be categorized into five areas [96]: (i) linear models based mainly on linear regression techniques and their variants such as LASSO and Ridge, (ii) algorithms that employ kernel functions to transform input data into a higher-dimensional space (e.g., support vector machines (SVMs)), (iii) decision-tree methods, which exploit a non-linear tree structure to achieve accurate predictions, (iv) neural networks built on the perceptron architecture, and (v) instance-based methods, such as K-nearest neighbours, which exploit the proximity of instances inside the training set.
A new category, the ensemble technique, has emerged to boost prediction accuracy further. This approach incorporates more than one base ML algorithm to construct a combined estimator, ensuring accuracy, robustness and lower variance [145]. Individual predictions are combined by voting in classification tasks or by averaging in regression tasks. In this way, each base learner in an ensemble model can achieve better results, as it is trained on a specific region of the dataset, thereby improving prediction accuracy and generalizability. Some widely applied ensemble techniques are the random forest (RF), the gradient boosting machine (GBM), and extreme gradient boosting (XGBoost) [146].
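A minimal sketch of the ensemble idea follows, assuming scikit-learn's random forest and gradient boosting implementations (XGBoost lives in a separate package and is omitted here); the synthetic data and settings are illustrative.

from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Ensembles average (bagging) or sequentially correct (boosting) many weak
# learners, typically lowering variance relative to a single tree.
models = {
    "Single decision tree": DecisionTreeRegressor(max_depth=6, random_state=0),
    "Random forest (bagging)": RandomForestRegressor(n_estimators=300, random_state=0),
    "Gradient boosting": GradientBoostingRegressor(n_estimators=300, random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:26s} test R^2 = {model.score(X_te, y_te):.3f}")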
On the other hand, DL architectures with convolutional NNs (CNNs) or recurrent NNs (RNNs) are incorporated to address flow reconstruction problems [26]. Moreover, as imaging applications in most branches of science, engineering and medical studies become more demanding, DL techniques are further combined with ensemble techniques [147]. Physics-informed neural networks (PINNs), another class of algorithms, are gaining ground since they add physical intuition to standard NNs and can bind experimental data to the Navier–Stokes equations in fluid mechanics [140,141,148,149,150]. Algorithms based on genetic programming (GP) are best suited for flow control in various applications [151] and can be an excellent choice when the output needs to be interpreted as a mathematical equation [8].
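To show the PINN idea at its simplest, the following PyTorch sketch penalizes the residual of the 1D heat equation rather than the Navier–Stokes equations; the network size, diffusivity value, collocation sampling, and omission of boundary conditions are simplifying assumptions made for brevity.

import math
import torch
import torch.nn as nn

torch.manual_seed(0)

# Small fully connected network approximating u(x, t).
net = nn.Sequential(
    nn.Linear(2, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)
nu = 0.05  # assumed diffusivity for the 1D heat equation u_t = nu * u_xx

def pde_residual(x, t):
    """Residual u_t - nu * u_xx evaluated with automatic differentiation."""
    u = net(torch.cat([x, t], dim=1))
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_t - nu * u_xx

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    # Collocation points inside the domain enforce the physics loss.
    x = (torch.rand(256, 1) * 2 - 1).requires_grad_(True)
    t = torch.rand(256, 1).requires_grad_(True)
    loss_pde = pde_residual(x, t).pow(2).mean()

    # Initial condition u(x, 0) = sin(pi * x) acts as the "data" term.
    x0 = torch.rand(256, 1) * 2 - 1
    u0 = net(torch.cat([x0, torch.zeros_like(x0)], dim=1))
    loss_ic = (u0 - torch.sin(math.pi * x0)).pow(2).mean()

    loss = loss_pde + loss_ic
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final combined loss: {loss.item():.4e}")

In a fluid mechanics setting, the same structure applies with the momentum and continuity residuals replacing the heat-equation residual and with sparse measurements entering the data term.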
Another research direction that has attracted attention is Physics-informed machine learning (PIML), which integrates ML algorithms with high-fidelity numerical simulations [58]. PIML could potentially reduce computing time in complex turbulence simulations.

3. Applications of ML in Fluid Mechanics

As mentioned in the Introduction, a long-standing problem in fluid dynamics is turbulence, a flow regime characterised by unpredictable, time-dependent velocity and pressure fluctuations [152,153,154,155]. The use of data-driven methods has increased significantly over the last few years, providing new modelling tools for complex turbulent flows [156]. ML has been used to model turbulence, for example via ANNs that predict the instantaneous velocity vector field in turbulent wake flows [157] and turbulent channel flows [158,159]. The growing adoption of ML methods, especially deep learning, as a promising route for solving complex fluid flow problems is also supported by high-quality open-source data sets published by researchers [160,161,162].
Much effort is currently put into using ML algorithms to improve the accuracy of Reynolds-averaged Navier–Stokes (RANS) models, for which GPs have been used to quantify and reduce uncertainties [163]. Tracey et al. [164] used an ANN that takes flow properties as inputs and reconstructs the Spalart–Allmaras (SA) closure equations. The authors studied how different components of ML algorithms, such as the choice of the cost function and feature scaling, affected their performance. Other examples include using DL to model the Reynolds stress anisotropy tensor [165,166]. Bayesian inference has also been used to optimize RANS model coefficients [167,168], as well as to derive functions that quantify and reduce the gap between model predictions and high-fidelity data, such as DNS data [169,170,171]. Furthermore, a Bayesian neural network (BNN) has been used to improve the predictions of RANS models and to quantify the uncertainty associated with the predictions [172].
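As an illustration of uncertainty-aware surrogate modelling in this spirit, the following sketch fits a Gaussian-process regressor to a synthetic model-form correction; the one-dimensional feature and the data are assumptions for demonstration only and do not reproduce any of the cited studies.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Illustrative 1D surrogate: map a flow feature (e.g., a normalized strain rate)
# to a model-form correction, with synthetic "high-fidelity" samples.
rng = np.random.default_rng(3)
X_train = rng.uniform(0, 1, size=(25, 1))
y_train = np.sin(2 * np.pi * X_train[:, 0]) + 0.1 * rng.standard_normal(25)

kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# The GP returns both a mean correction and a standard deviation, i.e., an
# uncertainty estimate that can flag regions poorly covered by training data.
X_test = np.linspace(0, 1, 5).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)
for x, m, s in zip(X_test[:, 0], mean, std):
    print(f"feature = {x:.2f}  correction = {m:+.3f} +/- {s:.3f}")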
In the field of large eddy simulations (LES), an ANN was used to replace traditional subgrid models [173,174]. Convolutional neural networks (CNNs) have also been used to calculate subgrid closure terms [175]. Moreover, DL modelling using input vorticity and stream function stencils was implemented to predict a dynamic, time- and space-dependent closure term for the subgrid models [176]. More specifically, starting from detailed DNS output, the main focus was to train a NN model on low-resolution LES data and reconstruct the fluid dynamics in order to replace DNS where possible. To this end, many relevant applications have emerged. Bao et al. [177] have suggested combining physics-guided, two-component, PDE-based techniques with recurrent temporal data models and super-resolution spatial data methods. An extension of this approach includes a multi-scale temporal path UNet (MST-UNet) model that can reconstruct both temporal and spatial flow information from low-resolution data [177].
Fukami et al. [178] have applied super-resolution techniques, common in image reconstruction applications, to 2D coarse-flow fields and managed to reconstruct laminar and turbulent flows. A more recent version of this method employed autoencoders to achieve order reduction for fluid flows [179]. A complete super-resolution case study is presented in another work, covering all the details of a CNN-based super-resolution architecture for turbulent fluid flows with DNS training data for various Reynolds numbers [180]. Nevertheless, generative models are often incorporated because DNS data are hard to obtain. In the review of Buzzicotti [53], three prevalent models are discussed: variational autoencoders (VAEs), generative adversarial networks (GANs), and diffusion models. These models are based on CNNs, and their implementation is focused on reproducing the multiscale and multifrequency nature of fluid dynamics, which is more complex than classical image recognition applications.
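The following is a minimal sketch of the super-resolution idea: a small CNN learns to map a coarsened field back to its fine-resolution counterpart. The random "DNS" snapshots, the 4x coarsening factor, and the network depth are illustrative assumptions, not the architectures of the cited works.

import torch
import torch.nn as nn

class SRNet(nn.Module):
    """Toy super-resolution CNN: upsample a coarse 2D field by a factor of 4."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Upsample(scale_factor=4, mode="bicubic", align_corners=False),
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.body(x)

# Synthetic training pair: a "fine" field and its 4x-coarsened counterpart.
torch.manual_seed(0)
fine = torch.randn(8, 1, 64, 64)                        # stand-in for DNS snapshots
coarse = nn.functional.avg_pool2d(fine, kernel_size=4)  # stand-in for coarse/LES data

model = SRNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(coarse), fine)  # learn the coarse-to-fine mapping
    loss.backward()
    optimizer.step()

print(f"reconstruction MSE: {loss.item():.4e}")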
Another exciting application of ML in fluid mechanics is related to food processing [181]. The ML models aim to optimize process parameters and kinetics for reduced energy consumption, speed up the processing time, and continuously improve product quality. Several food processing operations, including drying, frying, baking, and extrusion, each require the development of ML algorithms and their training and validation against accurate data.
Furthermore, ML has found application in many chemical engineering flows. These applications include transport in porous media, catalytic reactors, batteries, and carbon dioxide storage [182]. Many of these problems are linked to energy transition efforts such as batteries, where long-term safety, energy density, and cycle life are particularly interesting. Diverse ML applications in fluid mechanics include nanofluids and ternary hybrid nanofluids [183] as well as metallurgy processes [184]. The domain of applications is broad and goes beyond the scope of a single review article.

4. Challenges

The data produced by fluid simulations or flow sensors are vast, with a prohibitive number of degrees of freedom to cope with. Bearing in mind also that 2D flow fields in pictorial form are gaining ground as research items for fluid flow characterization, with added complexity when they evolve in time (e.g., multiple images), it becomes clear that great effort is needed to harness them and derive meaning from their processing and analysis. Conversely, there are cases where the available data are sparse, yet the chosen ML algorithm must remain accurate. Techniques such as GANs [185,186], which can train an ML model with synthetic data without compromising accuracy, could be exploited here.
We summarize the challenges of ML in fluid mechanics in Figure 3 and detail them below.
  • ML algorithms are well-defined mathematically and widely supported by software. An open question on their applicability in high-precision fluid mechanics, such as in biological or aerodynamic flows, is the availability of high-fidelity data to train ML models effectively [187].
  • Open databases should become available to the scientific community and play a crucial role in machine learning, enabling knowledge sharing and advanced development.
  • Modern computers and experimental devices have higher speed, power, and flexibility, making access to data easy. However, concerns about storage, retrieval, post-processing, and other challenges may arise, especially when dealing with massive turbulence data.
  • ML in fluids extends beyond image processing as fluid imaging is not static and includes dynamic information [3]. Furthermore, when flow images are provided in time sequence, the computational load increases dramatically, and algorithmic selection is essential.
  • Algorithms perform differently depending on the problem they are called on to solve. For example, there is no need to dive into a complex DL architecture when the task is the property prediction of a specific fluid based on several variable-type input parameters [188].
  • Differential equation solutions with PINNs may fail in complex physical phenomena, such as turbulence, compared to traditional numerical methods [189]. Much remains to be done before PINNs can be considered the dominant PDE solver for fluid dynamics.
  • At the nanoscale, problems originate from time and length scales. Complex aqueous environments and fluid/solid interfaces can only come to light through atomistic simulation techniques with first-principles accuracy. Machine learning potentials have been successfully introduced over the past years and are now close to standardization as ab-initio simulation alternatives [190].
  • There are many instances in the physical sciences where decision-making is based on empirical relations. The challenge for future ML methods is to provide accurate data-derived, physics-based equations rather than empirical ones. Symbolic regression can help towards this goal [191].

5. Perspectives

Following the discussion on state-of-the-art ML in fluid dynamics and considering the emerging challenges, it is imperative to summarize where scientists and engineers should focus next for ML fluid dynamics applications.
As was made clear in the literature review, AI and ML methods have been widely incorporated in industry, from oil and gas to energy production and aerospace engineering, particularly in research. There are ample historical and newly generated data to be analyzed in these fields, and ML has much to offer in promoting domain knowledge. The statistical nature of ML, whether in interpolation or short extrapolation instances, will continue to evolve.
Nevertheless, these applications exploit traditional ML methods. Although widely incorporated, they were developed in a different era and cannot keep pace with current research needs. Moreover, issues originating from big data, such as size, heterogeneity, real-time processing speed or various complex forms (e.g., 2D and 3D), will continue to pose serious obstacles [192].
Novel algorithmic implementations are expected to continue to adapt to these profound demands. This is unavoidable from a computer scientist's and engineer's point of view, since new versions of the traditional algorithms should be proposed to handle big data. This is where domain knowledge comes into play. Data and the hidden patterns within them may be enlightening, but it is doubtful whether they can replace physical understanding. Therefore, fluid dynamics researchers should work in parallel to embed fluid concepts and physical laws into the backbone of the proposed algorithms.
The increasing interest in SR algorithmic implementations can significantly boost computational science by suggesting practical and physically meaningful analytical equations that outperform established empirical or approximate equations [8]. This direction opens up the "black box" nature of ML, enabling these approaches to play a prominent role in predicting properties and guiding fluid dynamics research. This field of investigation still has much to offer, and extensive research on real fluids and mixtures, although computationally demanding and challenging to interpret, is still to come.
In another direction, Vinuesa and Brunton [193] have highlighted areas of future research on DNS acceleration, turbulence closure modelling improvement, and new reduced-order model (ROM) establishment, i.e., areas where numerical computations are highly demanding and in many cases impossible to perform. Care has to be taken here, since physics-informed procedures do not always adhere to physical consistency. When complex NNs are constructed for an intensive application, there is no guarantee that they will generalize. Although NNs may produce valuable predictions, they remain a "black-box" model.
Another engineering area of increasing importance concerns the effort and cost of performing expensive numerical simulations and laboratory experiments. This cost could be reduced by using ML models that use coarse data to produce refined predictions. This is particularly important when data are limited, noisy, or both. The above factors significantly influence the accuracy of interpolation methods and reduced-order models, including ML.
Poulinakis et al. [194] presented a study on these issues and concluded that ML and DL can be significantly more promising than interpolation under certain data sparsity and noise conditions; otherwise, ML and DL can be more inaccurate than simple interpolation. For example, DL is significantly more robust to noise than cubic splines. Therefore, DL can recover a "true" function hidden under the noise, making it a valuable tool in engineering applications. Furthermore, we know that increasing the amount of data will increase the accuracy of DL up to a certain level, which depends on the application problem. Therefore, future emphasis should be placed on establishing the limitations and best practices for DL methods in different applications. The explainability of DL models remains problematic, and research should be carried out in this direction.
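The following sketch illustrates this interpolation-versus-learning comparison on sparse, noisy samples of a known function, using a cubic spline and a small MLP; it is a toy setup under assumed noise levels and does not reproduce the study in [194].

import numpy as np
from scipy.interpolate import CubicSpline
from sklearn.neural_network import MLPRegressor

# Sparse, noisy samples of an underlying "true" function.
rng = np.random.default_rng(4)
x_train = np.sort(rng.uniform(0, 1, 20))
true_fn = lambda x: np.sin(2 * np.pi * x)
y_train = true_fn(x_train) + 0.2 * rng.standard_normal(x_train.size)

# A cubic spline interpolates the noisy points exactly, noise included.
spline = CubicSpline(x_train, y_train)

# A small MLP regresses through the noise instead of interpolating it.
mlp = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
mlp.fit(x_train.reshape(-1, 1), y_train)

# Compare both against the hidden true function on a dense grid.
x_test = np.linspace(0, 1, 200)
err_spline = np.mean((spline(x_test) - true_fn(x_test)) ** 2)
err_mlp = np.mean((mlp.predict(x_test.reshape(-1, 1)) - true_fn(x_test)) ** 2)
print(f"spline MSE: {err_spline:.4f}   MLP MSE: {err_mlp:.4f}")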

6. Conclusions

As AI methods and algorithms evolve, simulation and experimental data will play increasingly important roles in practical designs, covering demanding applications in science and technology. Fluid dynamics research should take advantage of AI development but carefully explore the possibilities without hyping the potential benefits. ML and DL methods should be adapted to specific problems, and their accuracy and limitations should be investigated. The key is that AI and ML approaches should move beyond simple prediction tasks and dive deep into interpretability and causation. Their potential impact will be high only if the output abides by physical laws.
Although many of the issues discussed here apply generally to ML and DL methods, there are some specific areas where these methods can benefit fluid dynamics and areas where caution should be exercised:
1. Exploring how to accelerate numerical simulations by introducing ML.
2. Investigating the accuracy limitations of ML methods in experiments and numerical simulations.
3. Exploring ML methods when using big fluid dynamics data and when data are scarce.
4. Developing a better theoretical understanding of ML methods that will allow better explainability of the results.
5. Avoiding AI hyping such as "new equations will be discovered through AI", as first principles will drive fluid dynamics (and other physics) research.
Research in AI evolves fast, and fluid dynamics will benefit from it. Still, its applicability should follow rigorous verification and testing and allow researchers to publish not only their successes in using AI but also their failed attempts.

Author Contributions

Conceptualization, D.D. and F.S.; methodology, D.D. and F.S.; formal analysis, D.D. and F.S.; investigation, D.D. and F.S.; resources, D.D. and F.S.; writing, D.D. and F.S.; supervision, D.D. and F.S.; project administration, D.D. and F.S.; funding acquisition, D.D. and F.S.; contribution to the discussion D.D. and F.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AI      Artificial Intelligence
BF      Basis Function
CFD     Computational Fluid Dynamics
CNN     Convolutional Neural Network
DL      Deep Learning
DNN     Deep Neural Network
DMD     Dynamic Mode Decomposition
ELU     Exponential Linear Unit
FNN     Feedforward Neural Network
FFC     Feedforward Fully Connected
GANs    Generative Adversarial Networks
GCV     Generalized Cross Validation
GELU    Gaussian Error Linear Unit
GP      Genetic Programming
NN      Neural Network(s)
MARS    Multivariate Adaptive Regression Splines
ML      Machine Learning
MLP     Multi-Layer Perceptron
MSE     Mean Squared Error
PINNs   Physics-Informed Neural Networks
POD     Proper Orthogonal Decomposition
ReLU    Rectified Linear Unit
RNN     Recurrent Neural Network
SL      Shallow Learning
SNR     Signal-to-Noise Ratio
SR      Symbolic Regression
SVM     Support Vector Machine

References

  1. Pugliese, R.; Regondi, S.; Marini, R. Machine learning-based approach: Global trends, research directions, and regulatory standpoints. Data Sci. Manag. 2021, 4, 19–29. [Google Scholar] [CrossRef]
  2. Frank, M.; Drikakis, D.; Charissis, V. Machine-Learning Methods for Computational Science and Engineering. Computation 2020, 8, 15. [Google Scholar] [CrossRef] [Green Version]
  3. Brunton, S.L.; Noack, B.R.; Koumoutsakos, P. Machine Learning for Fluid Mechanics. Annu. Rev. Fluid Mech. 2020, 52, 477–508. [Google Scholar] [CrossRef] [Green Version]
  4. Singh, M.P.; Alatyar, A.M.; Berrouk, A.S.; Saeed, M. Numerical modelling of rotating packed beds used for CO2 capture processes: A review. Can. J. Chem. Eng. 2023. [Google Scholar] [CrossRef]
  5. Guo, S.; Agarwal, M.; Cooper, C.; Tian, Q.; Gao, R.X.; Guo, W.; Guo, Y. Machine learning for metal additive manufacturing: Towards a physics-informed data-driven paradigm. J. Manuf. Syst. 2022, 62, 145–163. [Google Scholar] [CrossRef]
  6. Nazemi, E.; Dinca, M.; Movafeghi, A.; Rokrok, B.; Choopan Dastjerdi, M. Estimation of volumetric water content during imbibition in porous building material using real time neutron radiography and artificial neural network. Nucl. Instrum. Methods Phys. Res. Sect. Accel. Spectrometers Detect. Assoc. Equip. 2019, 940, 344–350. [Google Scholar] [CrossRef]
  7. Leverant, C.J.; Greathouse, J.A.; Harvey, J.A.; Alam, T.M. Machine Learning Predictions of Simulated Self-Diffusion Coefficients for Bulk and Confined Pure Liquids. J. Chem. Theory Comput. 2023, 19, 3054–3062. [Google Scholar] [CrossRef]
  8. Sofos, F.; Charakopoulos, A.; Papastamatiou, K.; Karakasidis, T.E. A combined clustering/symbolic regression framework for fluid property prediction. Phys. Fluids 2022, 34, 062004. [Google Scholar] [CrossRef]
  9. Papastamatiou, K.; Sofos, F.; Karakasidis, T.E. Calculating material properties with purely data-driven methods: From clusters to symbolic expressions. In Proceedings of the 12th Hellenic Conference on Artificial Intelligence (SETN ’22), Corfu, Greece, 7–9 September 2022; Association for Computing Machinery: New York, NY, USA, 2022; pp. 1–9. [Google Scholar] [CrossRef]
  10. Amsallem, D.; Farhat, C. Interpolation Method for Adapting Reduced-Order Models and Application to Aeroelasticity. AIAA J. 2008, 46, 1803–1813. [Google Scholar] [CrossRef] [Green Version]
  11. Sofos, F.; Karakasidis, T. Nanoscale slip length prediction with machine learning tools. Sci. Rep. 2021, 11, 12520. [Google Scholar] [CrossRef]
  12. Scherl, I.; Strom, B.; Shang, J.K.; Williams, O.; Polagye, B.L.; Brunton, S.L. Robust principal component analysis for modal decomposition of corrupt fluid flows. Phys. Rev. Fluids 2020, 5, 054401. [Google Scholar] [CrossRef]
  13. Schmid, P.J. Dynamic Mode Decomposition and Its Variants. Annu. Rev. Fluid Mech. 2022, 54, 225–254. [Google Scholar] [CrossRef]
  14. Deo, I.K.; Jaiman, R. Predicting waves in fluids with deep neural network. Phys. Fluids 2022, 34, 067108. [Google Scholar] [CrossRef]
  15. M S, V.M.; Menon, V. Measuring Viscosity of Fluids: A Deep Learning Approach Using a CNN-RNN Architecture. In Proceedings of the First International Conference on AI-ML Systems, AIML Systems ’21, Bangalore, India, 21–23 October 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 1–5. [Google Scholar] [CrossRef]
  16. Stergiou, K.; Ntakolia, C.; Varytis, P.; Koumoulos, E.; Karlsson, P.; Moustakidis, S. Enhancing property prediction and process optimization in building materials through machine learning: A review. Comput. Mater. Sci. 2023, 220, 112031. [Google Scholar] [CrossRef]
  17. Hadavimoghaddam, F.; Ostadhassan, M.; Sadri, M.A.; Bondarenko, T.; Chebyshev, I.; Semnani, A. Prediction of Water Saturation from Well Log Data by Machine Learning Algorithms: Boosting and Super Learner. J. Mar. Sci. Eng. 2021, 9, 666. [Google Scholar] [CrossRef]
  18. Guo, K.; Yang, Z.; Yu, C.H.; Buehler, M.J. Artificial intelligence and machine learning in design of mechanical materials. Mater. Horiz. 2021, 8, 1153–1172. [Google Scholar] [CrossRef]
  19. Xu, H.; Zhang, D.; Zeng, J. Deep-learning of parametric partial differential equations from sparse and noisy data. Phys. Fluids 2021, 33, 037132. [Google Scholar] [CrossRef]
  20. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. In Advances in Neural Information Processing Systems; Pereira, F., Burges, C., Bottou, L., Weinberger, K., Eds.; Curran Associates, Inc.: Red Hook, NY, USA, 2012; Volume 25. [Google Scholar]
  21. Farabet, C.; Couprie, C.; Najman, L.; LeCun, Y. Learning Hierarchical Features for Scene Labeling. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 1915–1929. [Google Scholar] [CrossRef] [Green Version]
  22. Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going Deeper with Convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015. [Google Scholar]
  23. Jiang, Y.G.; Wu, Z.; Wang, J.; Xue, X.; Chang, S.F. Exploiting Feature and Class Relationships in Video Categorization with Regularized Deep Neural Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 40, 352–364. [Google Scholar] [CrossRef]
  24. Hinton, G.; Deng, L.; Yu, D.; Dahl, G.E.; Mohamed, A.r.; Jaitly, N.; Senior, A.; Vanhoucke, V.; Nguyen, P.; Sainath, T.N.; et al. Deep Neural Networks for Acoustic Modeling in Speech Recognition: The Shared Views of Four Research Groups. IEEE Signal Process. Mag. 2012, 29, 82–97. [Google Scholar] [CrossRef]
  25. Leung, M.K.K.; Xiong, H.Y.; Lee, L.J.; Frey, B.J. Deep learning of the tissue-regulated splicing code. Bioinformatics 2014, 30, i121–i129. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  26. Callaham, J.L.; Maeda, K.; Brunton, S.L. Robust flow reconstruction from limited measurements via sparse representation. Phys. Rev. Fluids 2019, 4, 103907. [Google Scholar] [CrossRef] [Green Version]
  27. Tohme, T.; Liu, D.; Youcef-Toumi, K. GSR: A Generalized Symbolic Regression Approach. arXiv 2022, arXiv:2205.15569. [Google Scholar] [CrossRef]
  28. Angelis, D.; Sofos, F.; Karakasidis, T.E. Artificial Intelligence in Physical Sciences: Symbolic Regression Trends and Perspectives. Arch. Comput. Methods Eng. 2023, 30, 3845–3865. [Google Scholar] [CrossRef] [PubMed]
  29. Rider, W.; Kamm, J.; Weirs, V. Verification, Validation, and Uncertainty Quantification for Coarse Grained Simulation. In Coarse Grained Simulation and Turbulent Mixing; Cambridge University Press: Cambridge, UK, 2016; pp. 168–189. [Google Scholar]
  30. Drikakis, D.; Kwak, D.; Kiris, C. Computational Aerodynamics: Advances and Challenges. Aeronaut. J. 2016, 120, 13–36. [Google Scholar] [CrossRef] [Green Version]
  31. Norton, T.; Sun, D.W. Computational fluid dynamics (CFD) – an effective and efficient design and analysis tool for the food industry: A review. Trends Food Sci. Technol. 2006, 17, 600–620. [Google Scholar] [CrossRef]
  32. Kobayashi, T.; Tsubokura, M. CFD Application in Automotive Industry. In Notes on Numerical Fluid Mechanics and Multidisciplinary Design; Hirschel, E.H., Krause, E., Eds.; Springer: Berlin/Heidelberg, Germany, 2009; Volume 100. [Google Scholar]
  33. Bai, K.; Fan, H.; Zhang, H.; Zhou, F.; Tao, X. Real Time Torque and Drag Analysis by Combining of Physical Model and Machine Learning Method. In Proceedings of the 10th Unconventional Resources Technology Conference, Houston, TX, USA, 20–22 June 2022. [Google Scholar] [CrossRef]
  34. Bikmukhametov, T.; Jäschke, J. First Principles and Machine Learning Virtual Flow Metering: A Literature Review. J. Pet. Sci. Eng. 2020, 184, 106487. [Google Scholar] [CrossRef]
  35. Gul, S. Machine learning applications in drilling fluid engineering: A review. In Proceedings of the ASME 2021 40th International Conference on Ocean, Offshore and Arctic Engineering, Online, 21–30 June 2021; Volume 10. [Google Scholar] [CrossRef]
  36. Belazreg, L.; Mahmood, S.; Aulia, A. Random forest algorithm for CO2 water alternating gas incremental recovery factor prediction. Int. J. Adv. Sci. Technol. 2020, 29, 168–188. [Google Scholar]
  37. Khan, A.; BinZiad, A.; Al Subaii, A. Boosting algorithm choice in predictive machine learning models for fracturing applications. In Proceedings of the SPE/IATMI Asia Pacific Oil & Gas Conference and Exhibition, Virtual, 12–14 October 2021. [Google Scholar] [CrossRef]
  38. Olukoga, T.; Feng, Y. Practical Machine-Learning Applications in Well-Drilling Operations. SPE Drill. Complet. 2021, 36, 849–867. [Google Scholar] [CrossRef]
  39. Zhong, R.; Salehi, C.; Johnson, R. Machine learning for drilling applications: A review. J. Nat. Gas Sci. Eng. 2022, 108, 104807. [Google Scholar] [CrossRef]
  40. Agwu, O.; Akpabio, J.; Ekpenyong, M.; Inyang, U.; Asuquo, D.; Eyoh, I.; Adeoye, O. A critical review of drilling mud rheological models. J. Pet. Sci. Eng. 2021, 203, 108659. [Google Scholar] [CrossRef]
  41. Anifowose, F.; Mezghani, M.; Badawood, S.; Ismail, J. Contributions of machine learning to quantitative and real-time mud gas data analysis: A critical review. Appl. Comput. Geosci. 2022, 16, 100095. [Google Scholar] [CrossRef]
  42. Arief, H.A.; Wiktorski, T.; Thomas, P.J. A survey on distributed fibre optic sensor data modelling techniques and machine learning algorithms for multiphase fluid flow estimation. Sensors 2021, 21, 2801. [Google Scholar] [CrossRef] [PubMed]
  43. Cao, W.; Zhang, W. Data-driven and physical-based identification of partial differential equations for multivariable system. Theor. Appl. Mech. Lett. 2022, 12, 100334. [Google Scholar] [CrossRef]
  44. Chen, G.; Zhu, G.; Zhu, Y. Advances in safety assessment and risk management for deepwater oil and gas exploitation. J. China Univ. Pet. (Ed. Nat. Sci.) 2019, 43, 136–145. [Google Scholar] [CrossRef]
  45. Guo, J.; Lu, Q.; He, Y. Key issues and explorations in shale gas fracturing. Nat. Gas Ind. B 2023, 10, 183–197. [Google Scholar] [CrossRef]
  46. Mata, C.; Saputelli, L.; Badmaev, D.; Zhao, W.; Mohan, R.; Gönczi, D.; Schweiger, A.; Manasipov, R.; Schweiger, G.; Krenn, L.; et al. Automated Reservoir Management Workflows to Identify Candidates and Rank Opportunities for Production Enhancement and Cost Optimization in a Giant Field in Offshore Abu Dhabi. In Proceedings of the Offshore Technology Conference, Virtual and Houston, TX, USA, 16–19 August 2021. [Google Scholar] [CrossRef]
  47. Mawlod, A.; Memon, A.; Varotsis, N.; Gaganis, V.; Anastasiadou, V.; Nighswander, J.; Al Shuaibi, M. Reducing Composition Characterization Uncertainties Through Advanced Machine Learning (ML) Techniques—Data Clustering. In Proceedings of the ADIPEC, Abu Dhabi, United Arab Emirates, 31 October–3 November 2022. [Google Scholar] [CrossRef]
  48. Mehmani, Y.; Anderson, T.; Wang, Y.; Aryana, S.; Battiato, I.; Tchelepi, H.; Kovscek, A. Striving to translate shale physics across ten orders of magnitude: What have we learned? Earth-Sci. Rev. 2021, 223, 103848. [Google Scholar] [CrossRef]
  49. Ojeda, L.; Olubode, M.; Karami, H.; Podio, T. Application of Machine Learning to Evaluate the Performances of Various Downhole Centrifugal Separator Types in Oil and Gas Production Systems. In Proceedings of the SPE Oklahoma City Oil and Gas Symposium, Oklahoma City, OK, USA, 17–19 April 2023. [Google Scholar] [CrossRef]
  50. Osarogiagbon, A.; Khan, F.; Venkatesan, R.; Gillard, P. Review and analysis of supervised machine learning algorithms for hazardous events in drilling operations. Process. Saf. Environ. Prot. 2021, 147, 367–384. [Google Scholar] [CrossRef]
  51. Syed, F.; AlShamsi, A.; Dahaghi, A.; Neghabhan, S. Application of ML & AI to model petrophysical and geomechanical properties of shale reservoirs – A systematic literature review. Petroleum 2022, 8, 158–166. [Google Scholar] [CrossRef]
  52. Dindoruk, B.; Ratnakar, R.; He, J. Review of recent advances in petroleum fluid properties and their representation. J. Nat. Gas Sci. Eng. 2020, 83, 103541. [Google Scholar] [CrossRef]
  53. Buzzicotti, M. Data reconstruction for complex flows using AI: Recent progress, obstacles, and perspectives. EPL 2023, 142, 23001. [Google Scholar] [CrossRef]
  54. Duraisamy, K. Perspectives on machine learning-augmented Reynolds-averaged and large eddy simulation models of turbulence. Phys. Rev. Fluids 2021, 6, 050504. [Google Scholar] [CrossRef]
  55. Rabault, J.; Ren, F.; Zhang, W.; Tang, H.; Xu, H. Deep reinforcement learning in fluid mechanics: A promising method for both active flow control and shape optimization. J. Hydrodyn. 2020, 32, 234–246. [Google Scholar] [CrossRef]
  56. Zhao, Y.; Xu, X. Data-driven turbulence modelling based on gene-expression programming. Lixue Xuebao/Chin. J. Theor. Appl. Mech. 2021, 53, 2640–2655. [Google Scholar] [CrossRef]
  57. Hammond, J.; Pepper, N.; Montomoli, F.; Michelassi, V. Machine Learning Methods in CFD for Turbomachinery: A Review. Int. J. Turbomach. Propuls. Power 2022, 7, 16. [Google Scholar] [CrossRef]
  58. Sharma, P.; Chung, W.T.; Akoush, B.; Ihme, M. A Review of Physics-Informed Machine Learning in Fluid Mechanics. Energies 2023, 16, 2343. [Google Scholar] [CrossRef]
  59. Vadyala, S.; Betgeri, S.; Matthews, J.; Matthews, E. A review of physics-based machine learning in civil engineering. Results Eng. 2022, 13, 100316. [Google Scholar] [CrossRef]
  60. Andrés-Pérez, E.; Paulete-Perianezez, C. On the application of surrogate regression models for aerodynamic coefficient prediction. Complex Intell. Syst. 2021, 7, 1991–2021. [Google Scholar] [CrossRef]
  61. Ma, L.; Guo, Q.; Li, X.; Xu, S.; Zhou, J.; Ye, M.; Liu, Z. Drag correlations for flow past monodisperse arrays of spheres and porous spheres based on symbolic regression: Effects of permeability. Chem. Eng. J. 2022, 445, 136653. [Google Scholar] [CrossRef]
  62. Panchigar, D.; Kar, K.; Shukla, S.; Mathew, R.; Chadha, U.; Selvaraj, S. Machine learning-based CFD simulations: A review, models, open threats, and future tactics. Neural Comput. Appl. 2022, 34, 21677–21700. [Google Scholar] [CrossRef]
  63. Panda, J. A review of pressure strain correlation modeling for Reynolds stress models. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2020, 234, 1528–1544. [Google Scholar] [CrossRef]
  64. Razdan, S.; Shah, S. Optimization of Fluid Modeling and Flow Control Processes Using Machine Learning: A Brief Review. In Advances in Mechanical Engineering and Material Science; Lecture Notes in Mechanical Engineering; Springer: Singapore, 2022; pp. 63–85. [Google Scholar] [CrossRef]
  65. Singh, S.; Podder, P.; Russo, M.; Henry, C.; Cinti, S. Tailored point-of-care biosensors for liquid biopsy in the field of oncology. Lab Chip 2022, 71, 44–61. [Google Scholar] [CrossRef]
  66. Yu, J.; Yan, C.; Guo, M. Non-intrusive reduced-order modeling for fluid problems: A brief review. Proc. Inst. Mech. Eng. Part G J. Aerosp. Eng. 2019, 233, 5896–5912. [Google Scholar] [CrossRef]
  67. Chen, Q.; Wang, Y.; Wang, H.; Yang, X. Data-driven simulation in fluids animation: A survey. Virtual Real. Intell. Hardw. 2021, 3, 87–104. [Google Scholar] [CrossRef]
  68. Gibou, F.; Hyde, D.; Fedkiw, R. Sharp interface approaches and deep learning techniques for multiphase flows. J. Comput. Phys. 2019, 380, 442–463. [Google Scholar] [CrossRef]
  69. Alhaddad, A.; Aly, H.; Gad, H.; Al-Ali, A.; Sadasivuni, K.; Cabibihan, J.J.; Malik, R. Sense and Learn: Recent Advances in Wearable Sensing and Machine Learning for Blood Glucose Monitoring and Trend-Detection. Front. Bioeng. Biotechnol. 2022, 10, 876672. [Google Scholar] [CrossRef]
  70. Gerraty, R.; Provost, A.; Li, L.; Wagner, E.; Haas, M.; Lancashire, L. Machine learning within the Parkinson’s progression markers initiative: Review of the current state of affairs. Front. Aging Neurosci. 2023, 15, 1076657. [Google Scholar] [CrossRef]
  71. He, B.; Lu, Q.; Yang, Q.; Luo, J.; Wang, Z. Taylor Genetic Programming for Symbolic Regression. arXiv 2022, arXiv:2205.09751. [Google Scholar] [CrossRef]
  72. Mir, A.; Sarwar, A. Artificial intelligence-based techniques for analysis of body cavity fluids: A review. Artif. Intell. Rev. 2021, 54, 4019–4061. [Google Scholar] [CrossRef]
  73. Moradi, H.; Al-Hourani, A.; Concilia, G.; Khoshmanesh, F.; Nezami, F.; Needham, S.; Baratchi, S.; Khoshmanesh, K. Recent developments in modeling, imaging, and monitoring of cardiovascular diseases using machine learning. Biophys. Rev. 2023, 15, 19–33. [Google Scholar] [CrossRef]
  74. Arzani, A.; Dawson, S. Data-driven cardiovascular flow modelling: Examples and opportunities. J. R. Soc. Interface 2021, 18, 20200802. [Google Scholar] [CrossRef] [PubMed]
  75. del Real Mata, C.; Jeanne, O.; Jalali, M.; Lu, Y.; Mahshid, S. Nanostructured-Based Optical Readouts Interfaced with Machine Learning for Identification of Extracellular Vesicles. Adv. Healthc. Mater. 2023, 12, 2202123. [Google Scholar] [CrossRef]
  76. Balana, C.; Castañer, S.; Carrato, C.; Moran, T.; Lopez-Paradís, A.; Domenech, M.; Hernandez, A.; Puig, J. Preoperative Diagnosis and Molecular Characterization of Gliomas with Liquid Biopsy and Radiogenomics. Front. Neurol. 2022, 13, 865171. [Google Scholar] [CrossRef] [PubMed]
  77. Benton, S.; Tesche, C.; De Cecco, C.; Duguay, T.; Schoepf, U.; Bayer, R. Noninvasive Derivation of Fractional Flow Reserve From Coronary Computed Tomographic Angiography: A Review. J. Thorac. Imaging 2018, 33, 88–96. [Google Scholar] [CrossRef]
  78. Bhattacharjee, G.; Gohil, N.; Shukla, M.; Sharma, S.; Mani, I.; Pandya, A.; Chu, D.T.; Bui, N.; Thi, Y.V.; Khambhati, K.; et al. Exploring the potential of microfluidics for next-generation drug delivery systems. OpenNano 2023, 12, 100150. [Google Scholar] [CrossRef]
  79. Bratulic, S.; Gatto, F.; Nielsen, J. The Translational Status of Cancer Liquid Biopsies. Regen. Eng. Transl. Med. 2021, 7, 312–352. [Google Scholar] [CrossRef] [Green Version]
  80. Gottlieb, E.; Samuel, M.; Bonventre, J.; Celi, L.; Mattie, H. Machine Learning for Acute Kidney Injury Prediction in the Intensive Care Unit. Adv. Chronic Kidney Dis. 2022, 29, 431–438. [Google Scholar] [CrossRef] [PubMed]
  81. Khan, S.; Manialawy, Y.; Wheeler, M.; Cox, B. Unbiased data analytic strategies to improve biomarker discovery in precision medicine. Drug Discov. Today 2019, 24, 1735–1748. [Google Scholar] [CrossRef]
  82. Ko, J.; Baldassano, S.; Loh, P.L.; Kording, K.; Litt, B.; Issadore, D. Machine learning to detect signatures of disease in liquid biopsies-a user’s guide. Lab Chip 2018, 18, 395–405. [Google Scholar] [CrossRef]
  83. Sandys, V.; Sexton, D.; O’Seaghdha, C. Artificial intelligence and digital health for volume maintenance in hemodialysis patients. Hemodial. Int. 2022, 26, 480–495. [Google Scholar] [CrossRef]
  84. Serafim, M.; dos Santos Júnior, V.; Gertrudes, J.; Maltarollo, V.; Honorio, K. Machine learning techniques applied to the drug design and discovery of new antivirals: A brief look over the past decade. Expert Opin. Drug Discov. 2021, 16, 961–975. [Google Scholar] [CrossRef] [PubMed]
  85. Sreedharan, S.; Zekry, S.; Leipsic, J.; Brown, R. Updates on Fractional Flow Reserve Derived by CT (FFRCT). Curr. Treat. Options Cardiovasc. Med. 2020, 22, 17. [Google Scholar] [CrossRef]
  86. Tasoglu, S. Toilet-based continuous health monitoring using urine. Nat. Rev. Urol. 2022, 19, 219–230. [Google Scholar] [CrossRef]
  87. Cassola, S.; Duhovic, M.; Schmidt, T.; May, D. Machine learning for polymer composites process simulation—A review. Compos. Part B Eng. 2022, 246, 110208. [Google Scholar] [CrossRef]
  88. Yu, C.; Bi, X.; Fan, Y. Deep learning for fluid velocity field estimation: A review. Ocean. Eng. 2023, 271, 113693. [Google Scholar] [CrossRef]
  89. Jirasek, F.; Hasse, H. Machine Learning of Thermophysical Properties. Fluid Phase Equilibria 2021, 549, 113206. [Google Scholar] [CrossRef]
  90. Desgranges, C.; Delhommelle, J. Towards a machine learned thermodynamics: Exploration of free energy landscapes in molecular fluids, biological systems and for gas storage and separation in metal-organic frameworks. Mol. Syst. Des. Eng. 2021, 6, 52–65. [Google Scholar] [CrossRef]
  91. Ahmadi, M.; Kumar, R.; Assad, M.; Ngo, P. Applications of machine learning methods in modeling various types of heat pipes: A review. J. Therm. Anal. Calorim. 2021, 146, 2333–2341. [Google Scholar] [CrossRef]
  92. Souayeh, B.; Bhattacharyya, S.; Hdhiri, N.; Alam, M. Heat and fluid flow analysis and ann-based prediction of a novel spring corrugated tape. Sustainability 2021, 13, 3023. [Google Scholar] [CrossRef]
  93. Yang, B.; Zhu, X.; Wei, B.; Liu, M.; Li, Y.; Lv, Z.; Wang, F. Computer Vision and Machine Learning Methods for Heat Transfer and Fluid Flow in Complex Structural Microchannels: A Review. Energies 2023, 16, 1500. [Google Scholar] [CrossRef]
  94. Koutsoukos, S.; Philippi, F.; Malaret, F.; Welton, T. A review on machine learning algorithms for the ionic liquid chemical space. Chem. Sci. 2021, 12, 6820–6843. [Google Scholar] [CrossRef] [PubMed]
  95. Upot, N.; Fazle Rabbi, K.; Khodakarami, S.; Ho, J.; Kohler Mendizabal, J.; Miljkovic, N. Advances in micro and nanoengineered surfaces for enhancing boiling and condensation heat transfer: A review. Nanoscale Adv. 2022, 5, 1232–1270. [Google Scholar] [CrossRef]
  96. Sofos, F.; Stavrogiannis, C.; Exarchou-Kouveli, K.K.; Akabua, D.; Charilas, G.; Karakasidis, T.E. Current Trends in Fluid Research in the Era of Artificial Intelligence: A Review. Fluids 2022, 7, 116. [Google Scholar] [CrossRef]
  97. Adun, H.; Wole-Osho, I.; Okonkwo, E.; Kavaz, D.; Dagbasi, M. A critical review of specific heat capacity of hybrid nanofluids for thermal energy applications. J. Mol. Liq. 2021, 340, 116890. [Google Scholar] [CrossRef]
  98. Hemmati-Sarapardeh, A.; Varamesh, A.; Nait Amar, M.; Husein, M.; Dong, M. On the evaluation of thermal conductivity of nanofluids using advanced intelligent models. Int. Commun. Heat Mass Transf. 2020, 118, 104825. [Google Scholar] [CrossRef]
  99. Gonçalves, I.; Souza, R.; Coutinho, G.; Miranda, J.; Moita, A.; Pereira, J.; Moreira, A.; Lima, R. Thermal conductivity of nanofluids: A review on prediction models, controversies and challenges. Appl. Sci. 2021, 11, 2525. [Google Scholar] [CrossRef]
  100. Ma, T.; Guo, Z.; Lin, M.; Wang, Q. Recent trends on nanofluid heat transfer machine learning research applied to renewable energy. Renew. Sustain. Energy Rev. 2021, 138, 110494. [Google Scholar] [CrossRef]
  101. Sharma, P.; Said, Z.; Kumar, A.; Nižetić, S.; Pandey, A.; Hoang, A.; Huang, Z.; Afzal, A.; Li, C.; Le, A.; et al. Recent Advances in Machine Learning Research for Nanofluid-Based Heat Transfer in Renewable Energy System. Energy Fuels 2022, 36, 6626–6658. [Google Scholar] [CrossRef]
  102. Wang, H.; Chen, X. A Comprehensive Review of Predicting the Thermophysical Properties of Nanofluids Using Machine Learning Methods. Ind. Eng. Chem. Res. 2022, 61, 14711–14730. [Google Scholar] [CrossRef]
  103. Fani, M.; Pourafshary, P.; Mostaghimi, P.; Mosavat, N. Application of microfluidics in chemical enhanced oil recovery: A review. Fuel 2022, 315, 123225. [Google Scholar] [CrossRef]
  104. Galan, E.; Zhao, H.; Wang, X.; Dai, Q.; Huck, W.; Ma, S. Intelligent Microfluidics: The Convergence of Machine Learning and Microfluidics in Materials Science and Biomedicine. Matter 2020, 3, 1893–1922. [Google Scholar] [CrossRef]
  105. Tsai, H.F.; Podder, S.; Chen, P.Y. Microsystem Advances through Integration with Artificial Intelligence. Micromachines 2023, 14, 826. [Google Scholar] [CrossRef]
  106. Gao, J.; Hu, Z.; Yang, Q.; Liang, X.; Wu, H. Fluid flow and heat transfer in microchannel heat sinks: Modelling review and recent progress. Therm. Sci. Eng. Prog. 2022, 29, 113096. [Google Scholar] [CrossRef]
  107. Srikanth, S.; Dubey, S.; Javed, A.; Goel, S. Droplet based microfluidics integrated with machine learning. Sens. Actuators A Phys. 2021, 332, 113096. [Google Scholar] [CrossRef]
  108. Zhong, J.; Riordon, J.; Wu, T.; Edwards, H.; Wheeler, A.; Pardee, K.; Aspuru-Guzik, A.; Sinton, D. When robotics met fluidics. Lab Chip 2020, 20, 709–716. [Google Scholar] [CrossRef] [PubMed]
  109. Zerouaoui, J.; Alaoui, A.; Ettaki, B.; Chakir, E. Assessing the Improvements Brought by Artificial Intelligence on the Prediction of Aerodynamic Coefficients. In Artificial Intelligence and Smart Environment; Lecture Notes in Networks and Systems; Springer: Cham, Switzerland, 2023; Volume 635, pp. 254–263. [Google Scholar] [CrossRef]
  110. Kou, J.; Zhang, W. Data-driven modeling for unsteady aerodynamics and aeroelasticity. Prog. Aerosp. Sci. 2021, 125, 100725. [Google Scholar] [CrossRef]
  111. Le Clainche, S.; Ferrer, E.; Gibson, S.; Cross, E.; Parente, A.; Vinuesa, R. Improving aircraft performance using machine learning: A review. Aerosp. Sci. Technol. 2023, 138, 108354. [Google Scholar] [CrossRef]
  112. Li, J.; Du, X.; Martins, J. Machine learning in aerodynamic shape optimization. Prog. Aerosp. Sci. 2022, 134, 100849. [Google Scholar] [CrossRef]
  113. Zhu, H.; Yang, B.; Zhang, Q.; Pan, L.; Sun, S. Wind engineering for high-rise buildings: A review. Wind. Struct. Int. J. 2021, 32, 249–265. [Google Scholar] [CrossRef]
  114. Tahmasebi, P.; Kamrava, S.; Bai, T.; Sahimi, M. Machine learning in geo- and environmental sciences: From small to large scale. Adv. Water Resour. 2020, 142, 103619. [Google Scholar] [CrossRef]
  115. Nallakukkala, S.; Lal, B. Modeling of seawater desalination by gas hydrate method. In Gas Hydrate in Water Treatment: Technological, Economic, and Industrial Aspects; Wiley: Hoboken, NJ, USA, 2022; pp. 77–111. [Google Scholar] [CrossRef]
  116. Irschick, D.; Christiansen, F.; Hammerschlag, N.; Martin, J.; Madsen, P.; Wyneken, J.; Brooks, A.; Gleiss, A.; Fossette, S.; Siler, C.; et al. 3D visualization processes for recreating and studying organismal form. iScience 2022, 25, 104867. [Google Scholar] [CrossRef] [PubMed]
  117. Brookfield, A.; Hansen, A.; Sullivan, P.; Czuba, J.; Kirk, M.; Li, L.; Newcomer, M.; Wilkinson, G. Predicting algal blooms: Are we overlooking groundwater? Sci. Total Environ. 2021, 769, 144442. [Google Scholar] [CrossRef] [PubMed]
  118. Li, J.; Yuan, Y.; Shen, H.B. Symbolic Expression Transformer: A Computer Vision Approach for Symbolic Regression. arXiv 2022, arXiv:2205.11798. [Google Scholar] [CrossRef]
  119. Singh, N.; Yadav, M.; Singh, V.; Padhiyar, H.; Kumar, V.; Bhatia, S.; Show, P.L. Artificial intelligence and machine learning-based monitoring and design of biological wastewater treatment systems. Bioresour. Technol. 2023, 369, 128486. [Google Scholar] [CrossRef]
  120. Umenweke, G.; Afolabi, I.; Epelle, E.; Okolie, J. Machine learning methods for modeling conventional and hydrothermal gasification of waste biomass: A review. Bioresour. Technol. Rep. 2022, 17, 100976. [Google Scholar] [CrossRef]
  121. Zaghloul, M.; Achari, G. A review of mechanistic and data-driven models of aerobic granular sludge. J. Environ. Chem. Eng. 2022, 10, 107500. [Google Scholar] [CrossRef]
  122. Chandra, D.; Vishal, V. A critical review on pore to continuum scale imaging techniques for enhanced shale gas recovery. Earth-Sci. Rev. 2021, 217, 103638. [Google Scholar] [CrossRef]
  123. Zhang, J.; Li, J.; Chen, X.; Li, Y.; Tang, W. A Spatially Coupled Data-Driven Approach for Lithology/Fluid Prediction. IEEE Trans. Geosci. Remote Sens. 2021, 59, 5526–5534. [Google Scholar] [CrossRef]
  124. Viswanathan, H.; Ajo-Franklin, J.; Birkholzer, J.; Carey, J.; Guglielmi, Y.; Hyman, J.; Karra, S.; Pyrak-Nolte, L.; Rajaram, H.; Srinivasan, G.; et al. From Fluid Flow to Coupled Processes in Fractured Rock: Recent Advances and New Frontiers. Rev. Geophys. 2022, 60, e2021RG000744. [Google Scholar] [CrossRef]
  125. Bhattacharya, S. Summarized Applications of Machine Learning in Subsurface Geosciences. In Primer on Machine Learning in Subsurface Geosciences; SpringerBriefs in Petroleum Geoscience & Engineering; Springer: Cham, Switzerland, 2021; pp. 123–165. [Google Scholar] [CrossRef]
  126. Khan, H.; Srivastav, A.; Kumar Mishra, A.; Anh Tran, T. Machine learning methods for estimating permeability of a reservoir. Int. J. Syst. Assur. Eng. Manag. 2022, 13, 2118–2131. [Google Scholar] [CrossRef]
  127. Wu, Q.; Guo, Y.; Liu, Y.; Wang, G. Review on the cavitating flow-induced vibrations. Kongqi Donglixue Xuebao/Acta Aerodyn. Sin. 2020, 38, 746–760. [Google Scholar] [CrossRef]
128. Lopes, N.; Chao, Y.; Dasarla, V.; Sullivan, N.; Ricklick, M.; Boetcher, S. Comprehensive Review of Heat Transfer Correlations of Supercritical CO2 in Straight Tubes Near the Critical Point: A Historical Perspective. J. Heat Transf. 2022, 144, 120801. [Google Scholar] [CrossRef]
  129. Mostafa, K.; Zisis, I.; Moustafa, M. Machine Learning Techniques in Structural Wind Engineering: A State-of-the-Art Review. Appl. Sci. 2022, 12, 5232. [Google Scholar] [CrossRef]
  130. Drikakis, D.; Dbouk, T. The Role of Computational Science in Wind and Solar Energy: A Critical Review. Energies 2022, 15, 9609. [Google Scholar] [CrossRef]
  131. Maksymov, I.; Huy Nguyen, B.; Suslov, S. Biomechanical Sensing Using Gas Bubbles Oscillations in Liquids and Adjacent Technologies: Theory and Practical Applications. Biosensors 2022, 12, 624. [Google Scholar] [CrossRef]
  132. Du, Y.H.; Wang, M.Y.; Yang, L.H.; Tong, L.L.; Guo, D.S.; Ji, X.J. Optimization and Scale-Up of Fermentation Processes Driven by Models. Bioengineering 2022, 9, 473. [Google Scholar] [CrossRef]
133. Zöttl, A.; Stark, H. Modeling Active Colloids: From Active Brownian Particles to Hydrodynamic and Chemical Fields. Annu. Rev. Condens. Matter Phys. 2023, 14, 109–127. [Google Scholar] [CrossRef]
  134. Zou, Y.; Chu, Z.; Guo, J.; Liu, S.; Ma, X.; Guo, J. Minimally invasive electrochemical continuous glucose monitoring sensors: Recent progress and perspective. Biosens. Bioelectron. 2023, 225, 115103. [Google Scholar] [CrossRef]
  135. Lv, H.; Zhang, S.; Sun, Q.; Chen, R.; Zhang, W. The Dynamic Models, Control Strategies and Applications for Magnetorheological Damping Systems: A Systematic Review. J. Vib. Eng. Technol. 2021, 9, 131–147. [Google Scholar] [CrossRef]
  136. Anush, C.; Yashwanth, K.; Shashank, S.; Venkat Reddy, M.; Kumar, A. Bottle Line Detection using Digital Image Processing with Machine Learning. J. Phys. Conf. Ser. 2021, 1998, 012033. [Google Scholar] [CrossRef]
  137. Akavalappil, V.; Radhakrishnan, T. Comparison of current state of control valve stiction detection and quantification techniques. Trans. Inst. Meas. Control 2022, 44, 562–579. [Google Scholar] [CrossRef]
  138. Vowinckel, B. Incorporating grain-scale processes in macroscopic sediment transport models: A review and perspectives for environmental and geophysical applications. Acta Mech. 2021, 232, 2023–2050. [Google Scholar] [CrossRef]
  139. Zhang, Y.; Xu, J.; Chang, Q.; Zhao, P.; Wang, J.; Ge, W. Numerical simulation of fluidization: Driven by challenges. Powder Technol. 2023, 414, 118092. [Google Scholar] [CrossRef]
  140. Asproulis, N.; Drikakis, D. Nanoscale materials modelling using neural networks. J. Comput. Theor. Nanosci. 2009, 6, 514–518. [Google Scholar] [CrossRef]
  141. Asproulis, N.; Drikakis, D. An artificial neural network-based multiscale method for hybrid atomistic-continuum simulations. Microfluid. Nanofluid. 2013, 15, 559–574. [Google Scholar] [CrossRef]
  142. Alzubaidi, L.; Bai, J.; Al-Sabaawi, A.; Santamaría, J.; Albahri, A.; Al-dabbagh, B.; Fadhel, M.; Manoufali, M.; Zhang, J.; Al-Timemy, A.; et al. A survey on deep learning tools dealing with data scarcity: Definitions, challenges, solutions, tips, and applications. J. Big Data 2023, 10, 46. [Google Scholar] [CrossRef]
  143. Goh, G.D.; Sing, S.L.; Yeong, W.Y. A review on machine learning in 3D printing: Applications, potential, and challenges. Artif. Intell. Rev. 2021, 54, 63–94. [Google Scholar] [CrossRef]
  144. Bhattacharya, S.; Verma, M.K.; Bhattacharya, A. Predictions of Reynolds and Nusselt numbers in turbulent convection using machine-learning models. Phys. Fluids 2022, 34, 025102. [Google Scholar] [CrossRef]
  145. Md, A.Q.; Kulkarni, S.; Joshua, C.J.; Vaichole, T.; Mohan, S.; Iwendi, C. Enhanced Preprocessing Approach Using Ensemble Machine Learning Algorithms for Detecting Liver Disease. Biomedicines 2023, 11, 581. [Google Scholar] [CrossRef]
  146. Mahdaviara, M.; Sharifi, M.; Bakhshian, S.; Shokri, N. Prediction of spontaneous imbibition in porous media using deep and ensemble learning techniques. Fuel 2022, 329, 125349. [Google Scholar] [CrossRef]
  147. Rahil, M.; Anoop, B.N.; Girish, G.N.; Kothari, A.R.; Koolagudi, S.G.; Rajan, J. A Deep Ensemble Learning-Based CNN Architecture for Multiclass Retinal Fluid Segmentation in OCT Images. IEEE Access 2023, 11, 17241–17251. [Google Scholar] [CrossRef]
  148. Cai, S.; Mao, Z.; Wang, Z.; Yin, M.; Karniadakis, G.E. Physics-informed neural networks (PINNs) for fluid mechanics: A review. Acta Mech. Sin. 2021, 37, 1727–1738. [Google Scholar] [CrossRef]
  149. Raissi, M.; Karniadakis, G.E. Hidden physics models: Machine learning of nonlinear partial differential equations. J. Comput. Phys. 2018, 357, 125–141. [Google Scholar] [CrossRef] [Green Version]
  150. Raissi, M.; Wang, Z.; Triantafyllou, M.S.; Karniadakis, G.E. Deep learning of vortex-induced vibrations. J. Fluid Mech. 2019, 861, 119–137. [Google Scholar] [CrossRef] [Green Version]
  151. Agostini, L. Exploration and prediction of fluid dynamical systems using auto-encoder technology. Phys. Fluids 2020, 32, 067103. [Google Scholar] [CrossRef]
  152. Thornber, B.; Mosedale, A.; Drikakis, D. On the implicit large eddy simulations of homogeneous decaying turbulence. J. Comput. Phys. 2007, 226, 1902–1929. [Google Scholar] [CrossRef]
  153. Jiménez, J. Near-wall turbulence. Phys. Fluids 2013, 25, 101302. [Google Scholar] [CrossRef] [Green Version]
  154. Drikakis, D. Advances in turbulent flow computations using high-resolution methods. Prog. Aerosp. Sci. 2003, 39, 405–424. [Google Scholar] [CrossRef]
  155. Kobayashi, H.; Matsumoto, E.; Fukushima, N.; Tanahashi, M.; Miyauchi, T. Statistical properties of the local structure of homogeneous isotropic turbulence and turbulent channel flows. J. Turbul. 2011, 12, N12. [Google Scholar] [CrossRef]
  156. Duraisamy, K.; Iaccarino, G.; Xiao, H. Turbulence modeling in the age of data. Annu. Rev. Fluid Mech. 2019, 51, 357–377. [Google Scholar] [CrossRef] [Green Version]
  157. Giralt, F.; Arenas, A.; Ferre-Gine, J.; Rallo, R.; Kopp, G. The simulation and interpretation of free turbulence with a cognitive neural system. Phys. Fluids 2000, 12, 1826–1835. [Google Scholar] [CrossRef] [Green Version]
  158. Milano, M.; Koumoutsakos, P. Neural network modeling for near wall turbulent flow. J. Comput. Phys. 2002, 182, 1–26. [Google Scholar] [CrossRef] [Green Version]
  159. Chang, F.J.; Yang, H.C.; Lu, J.Y.; Hong, J.H. Neural network modelling for mean velocity and turbulence intensities of steep channel flows. Hydrol. Process. Int. J. 2008, 22, 265–274. [Google Scholar] [CrossRef]
  160. McConkey, R.; Yee, E.; Lien, F.S. A curated dataset for data-driven turbulence modelling. Sci. Data 2021, 8, 255. [Google Scholar] [CrossRef] [PubMed]
  161. Bonnet, F.; Mazari, A.J.; Cinnella, P.; Gallinari, P. AirfRANS: High Fidelity Computational Fluid Dynamics Dataset for Approximating Reynolds-Averaged Navier-Stokes Solutions. arXiv 2023, arXiv:2212.07564. [Google Scholar]
  162. Ribeiro, M.D.; Rehman, A.; Ahmed, S.; Dengel, A. DeepCFD: Efficient Steady-State Laminar Flow Approximation with Deep Convolutional Neural Networks. arXiv 2021, arXiv:2004.08826. [Google Scholar]
  163. Xiao, H.; Wu, J.L.; Wang, J.X.; Sun, R.; Roy, C. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach. J. Comput. Phys. 2016, 324, 115–136. [Google Scholar] [CrossRef] [Green Version]
  164. Tracey, B.D.; Duraisamy, K.; Alonso, J.J. A machine learning strategy to assist turbulence model development. In Proceedings of the 53rd AIAA Aerospace Sciences Meeting, Kissimmee, FL, USA, 5–9 January 2015; p. 1287. [Google Scholar]
  165. Ling, J.; Kurzawski, A.; Templeton, J. Reynolds averaged turbulence modelling using deep neural networks with embedded invariance. J. Fluid Mech. 2016, 807, 155–166. [Google Scholar] [CrossRef]
  166. Kutz, J.N. Deep learning in fluid dynamics. J. Fluid Mech. 2017, 814, 1–4. [Google Scholar] [CrossRef] [Green Version]
  167. Cheung, S.H.; Oliver, T.A.; Prudencio, E.E.; Prudhomme, S.; Moser, R.D. Bayesian uncertainty analysis with applications to turbulence modeling. Reliab. Eng. Syst. Saf. 2011, 96, 1137–1149. [Google Scholar] [CrossRef]
  168. Edeling, W.; Cinnella, P.; Dwight, R.P.; Bijl, H. Bayesian estimates of parameter variability in the k–ε turbulence model. J. Comput. Phys. 2014, 258, 73–94. [Google Scholar] [CrossRef] [Green Version]
  169. Zhang, Z.J.; Duraisamy, K. Machine learning methods for data-driven turbulence modeling. In Proceedings of the 22nd AIAA Computational Fluid Dynamics Conference, Dallas, TX, USA, 22–26 June 2015; p. 2460. [Google Scholar]
  170. Duraisamy, K.; Zhang, Z.J.; Singh, A.P. New approaches in turbulence and transition modeling using data-driven techniques. In Proceedings of the 53rd AIAA Aerospace Sciences Meeting, Kissimmee, FL, USA, 5–9 January 2015; p. 1284. [Google Scholar]
  171. Parish, E.J.; Duraisamy, K. A paradigm for data-driven predictive modeling using field inversion and machine learning. J. Comput. Phys. 2016, 305, 758–774. [Google Scholar] [CrossRef] [Green Version]
  172. Geneva, N.; Zabaras, N. Quantifying model form uncertainty in Reynolds-averaged turbulence models with Bayesian deep neural networks. J. Comput. Phys. 2019, 383, 125–147. [Google Scholar] [CrossRef] [Green Version]
  173. Sarghini, F.; De Felice, G.; Santini, S. Neural networks based subgrid scale modeling in large eddy simulations. Comput. Fluids 2003, 32, 97–108. [Google Scholar] [CrossRef]
  174. Moreau, A.; Teytaud, O.; Bertoglio, J.P. Optimal estimation for large-eddy simulation of turbulence and application to the analysis of subgrid models. Phys. Fluids 2006, 18, 105101. [Google Scholar] [CrossRef] [Green Version]
  175. Beck, A.D.; Flad, D.G.; Munz, C.D. Deep neural networks for data-driven turbulence models. arXiv 2018, arXiv:1806.04482. [Google Scholar]
  176. Maulik, R.; San, O.; Rasheed, A.; Vedula, P. Subgrid modelling for two-dimensional turbulence using neural networks. J. Fluid Mech. 2019, 858, 122–144. [Google Scholar] [CrossRef] [Green Version]
  177. Bao, T.; Chen, S.; Johnson, T.T.; Givi, P.; Sammak, S.; Jia, X. Physics guided neural networks for spatio-temporal super-resolution of turbulent flows. In Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence, PMLR, Eindhoven, The Netherlands, 1–5 August 2022; pp. 118–128, ISSN 2640-3498. [Google Scholar]
  178. Fukami, K.; Fukagata, K.; Taira, K. Super-resolution reconstruction of turbulent flows with machine learning. J. Fluid Mech. 2019, 870, 106–120. [Google Scholar] [CrossRef] [Green Version]
179. Fukami, K.; Hasegawa, K.; Nakamura, T.; Morimoto, M.; Fukagata, K. Model Order Reduction with Neural Networks: Application to Laminar and Turbulent Flows. SN Comput. Sci. 2021, 2, 467. [Google Scholar] [CrossRef]
  180. Fukami, K.; Fukagata, K.; Taira, K. Super-resolution analysis via machine learning: A survey for fluid flows. Theor. Comput. Fluid Dyn. 2023. [Google Scholar] [CrossRef]
  181. Khan, M.I.H.; Sablani, S.S.; Nayak, R.; Gu, Y. Machine learning-based modeling in food processing applications: State of the art. Compr. Rev. Food Sci. Food Saf. 2022, 21, 1409–1438. [Google Scholar] [CrossRef] [PubMed]
  182. Marcato, A.; Boccardo, G.; Marchisio, D. From Computational Fluid Dynamics to Structure Interpretation via Neural Networks: An Application to Flow and Transport in Porous Media. Ind. Eng. Chem. Res. 2022, 61, 8530–8541. [Google Scholar] [CrossRef]
  183. Dinesh Kumar, M.; Ameer Ahammad, N.; Raju, C.; Yook, S.J.; Shah, N.A.; Tag, S.M. Response surface methodology optimization of dynamical solutions of Lie group analysis for nonlinear radiated magnetized unsteady wedge: Machine learning approach (gradient descent). Alex. Eng. J. 2023, 74, 29–50. [Google Scholar] [CrossRef]
  184. Priyadharshini, P.; Archana, M.V.; Ahammad, N.A.; Raju, C.; Yook, S.J.; Shah, N.A. Gradient descent machine learning regression for MHD flow: Metallurgy process. Int. Commun. Heat Mass Transf. 2022, 138, 106307. [Google Scholar] [CrossRef]
  185. Deng, Z.; He, C.; Liu, Y.; Kim, K.C. Super-resolution reconstruction of turbulent velocity fields using a generative adversarial network-based artificial intelligence framework. Phys. Fluids 2019, 31, 125111. [Google Scholar] [CrossRef]
  186. Yousif, M.Z.; Yu, L.; Lim, H.C. Super-resolution reconstruction of turbulent flow fields at various Reynolds numbers based on generative adversarial networks. Phys. Fluids 2022, 34, 015130. [Google Scholar] [CrossRef]
  187. Brenner, M.P.; Eldredge, J.D.; Freund, J.B. Perspective on machine learning for advancing fluid mechanics. Phys. Rev. Fluids 2019, 4, 100501. [Google Scholar] [CrossRef]
  188. Dubey, V.; Sharma, A.K.; Pimenov, D.Y. Prediction of Surface Roughness Using Machine Learning Approach in MQL Turning of AISI 304 Steel by Varying Nanoparticle Size in the Cutting Fluid. Lubricants 2022, 10, 81. [Google Scholar] [CrossRef]
189. Cuomo, S.; Di Cola, V.S.; Giampaolo, F.; Rozza, G.; Raissi, M.; Piccialli, F. Scientific Machine Learning Through Physics-Informed Neural Networks: Where we are and What's Next. J. Sci. Comput. 2022, 92, 88. [Google Scholar] [CrossRef]
  190. Schran, C.; Thiemann, F.L.; Rowe, P.; Muller, E.A.; Marsalek, O.; Michaelides, A. Machine learning potentials for complex aqueous systems made simple. Proc. Natl. Acad. Sci. USA 2021, 118, e2110077118. [Google Scholar] [CrossRef]
  191. Alam, T.M.; Allers, J.P.; Leverant, C.J.; Harvey, J.A. Symbolic regression development of empirical equations for diffusion in Lennard-Jones fluids. J. Chem. Phys. 2022, 157, 014503. [Google Scholar] [CrossRef] [PubMed]
  192. L’Heureux, A.; Grolinger, K.; Elyamany, H.F.; Capretz, M.A.M. Machine Learning with Big Data: Challenges and Approaches. IEEE Access 2017, 5, 7776–7797. [Google Scholar] [CrossRef]
  193. Vinuesa, R.; Brunton, S.L. The Potential of Machine Learning to Enhance Computational Fluid Dynamics. arXiv 2021, arXiv:2110.02085. [Google Scholar]
  194. Poulinakis, K.; Drikakis, D.; Kokkinakis, I.W.; Spottswood, S.M. Machine-Learning Methods on Noisy and Sparse Data. Mathematics 2023, 11, 236. [Google Scholar] [CrossRef]
Figure 1. Publications and reviews on machine and deep learning from 2017 to 2022.
Figure 2. (a) Field of application and (b) algorithms employed in review articles on ML and fluid mechanics, from 2017 to 2023.
Figure 3. Machine learning challenges in fluid mechanics.