Review

Machine Learning Advances in High-Entropy Alloys: A Mini-Review

1 State Key Laboratory of Low-Dimensional Quantum Physics, Department of Physics, Tsinghua University, Beijing 100084, China
2 Frontier Science Center for Quantum Information, Beijing 100084, China
* Author to whom correspondence should be addressed.
Entropy 2024, 26(12), 1119; https://doi.org/10.3390/e26121119
Submission received: 11 November 2024 / Revised: 19 December 2024 / Accepted: 20 December 2024 / Published: 20 December 2024

Abstract

The efficacy of machine learning has increased dramatically over the past decade, and its use to predict and design materials has become a pivotal tool for accelerating materials development. High-entropy alloys are particularly intriguing candidates for demonstrating the potency of machine learning due to their superior mechanical properties, vast compositional space, and intricate chemical interactions. This review examines the general process of developing machine learning models. Advances and new machine learning algorithms in the field of high-entropy alloys are presented at each step of that process. These advances build both on improvements in computer algorithms and on physical representations that capture the unique ordering properties of high-entropy alloys. We also present results from generative models, data augmentation, and transfer learning in high-entropy alloys, and conclude with a summary of the challenges still faced when applying machine learning to high-entropy alloys.

1. Introduction

The traditional alloying strategy is to add small amounts of performance-enhancing elements to a primary metal. The discovery of high-entropy alloys (HEAs) by Yeh and Cantor in 2004, however, introduced a different alloying strategy [1,2]. Since then, HEAs have attracted widespread research interest, not only for their excellent mechanical properties but also for their unique magnetic, electrical, and chemical properties compared to conventional alloys [3,4,5,6,7,8]. These alloys are named HEAs because of their high configurational entropy, which is thought to be responsible for their stability [1,9]. HEAs are also often referred to as multicomponent alloys, multi-principal-element alloys, compositionally complex alloys, or complex concentrated alloys [10], because the role of configurational entropy is not always as important as originally envisioned [9,11,12]. This alloy synthesis strategy of combining multiple principal elements opens promising avenues for investigating the vast unexplored chemical compositional space beyond the corners of compositional maps. Inspired by the HEA concept, Senkov et al. proposed refractory high-entropy alloys (RHEAs), which consist of several principal refractory metal elements, including W, Mo, Ta, V, Nb, etc. [13]. As a branch of HEAs, RHEAs not only inherit the core properties of HEAs but also exhibit advantageous properties at high temperatures, including enhanced strength [14,15], corrosion resistance [16,17,18], and oxidation resistance [19,20], owing to their refractory elemental composition with high melting points. RHEAs are therefore recognized as structural materials with potential applications in the aerospace, nuclear reactor, automotive, and other industries. Another appealing aspect of HEAs is that the varying occupancy of lattice positions by multiple principal elements produces varying degrees of randomness and order, usually classified as short-range order (SRO) and long-range order (LRO) [21,22]. The degree of order and disorder in HEAs can be tuned to enhance their mechanical properties [23,24,25]. HEAs have become one of the most exciting research directions in materials science, and branches of HEAs with unprecedented properties continue to emerge [26,27,28].
While the incorporation of multiple principal elements offers substantial prospects for alloy design, it concurrently presents considerable challenges for theoretical modeling and simulation. First, experimental methods usually require multipurpose equipment and considerable human effort. Surveying even a small fraction of the HEA space experimentally takes a long time, and discovering HEAs by trial-and-error experiments alone is almost impossible given the extremely large compositional space. Even when surrogate models such as phase formation rules [29] and ductility criteria [30] are relied upon to guide effective exploration of the vast compositional space, these empirical rules are difficult to apply to HEAs because of the wide variety of chemistries involved. Second, the construction of empirical atomic interaction models is hindered by the considerable number of chemical interactions involved. The number of interactions can reasonably be expected to increase exponentially with the introduction of higher-order interactions [22]. This renders traditional cluster expansion methods [31] and the embedded atom method [32] susceptible to overfitting. Finally, although computational methods such as density functional theory (DFT) [33], molecular dynamics (MD) simulation [34], and phase diagram calculation (CALPHAD) [35] have been successfully used to explore HEAs, their high computational cost, time consumption, and uncertainty have severely hindered their application to HEAs.
Over the past decade, machine learning has seen rapid growth in its performance as an artificial intelligence (AI) tool [36]. This development has had a profound impact on numerous fields within computer science, including computer vision and natural language processing [37]. Furthermore, machine learning provides novel possibilities for non-computer scientific fields, including protein structure prediction [38], medical image analysis [39], particle signal detection [40], and astrophysical analysis [41]. Machine learning has also made significant advances in materials science. ML models can compute interatomic potentials much faster than DFT while closely reproducing DFT results [42,43,44]. These applications break through the original limitations of computational speed and computer memory for DFT, allowing researchers to simulate systems containing millions of atoms with near-DFT accuracy [45,46]. This capability is sufficient to simulate the properties of HEAs at the nanoscale and is particularly important for exploring the complex long-range structure that underpins the excellent mechanical properties of HEAs [47,48]. In addition to simulations, machine learning methods provide viable solutions for predicting material properties and screening potential candidate materials [49]. The intricate, non-linear relationships between structure and property are difficult to uncover by human observation alone, whereas the strength of machine learning lies precisely in its capacity to capture such patterns. With the help of machine learning, researchers can explore the huge range of possible materials more quickly. Machine learning has emerged as a promising tool for addressing the challenges inherent to the theoretical modeling of HEAs [50] and has been successful in physical property prediction [51,52], atomistic simulations [53,54,55], phase classification [56], and material design [57,58,59].
This review focuses on the application of machine learning to HEAs. It begins by outlining data collection, the selection of appropriate descriptors, the development of algorithms, and the subsequent analysis of performance, presented in the order in which machine learning models are developed. Each section provides a detailed account of distinctive advancements in machine learning for HEAs, such as descriptors or algorithmic structures tailored to HEAs. Some of these advances are inspired by the high entropy or long-range structural randomness of HEAs and reflect properties unique to these alloys. In addition, we present several other special machine learning methods, including generative models, data augmentation, and transfer learning, and discuss their applications to HEAs. These algorithms significantly enhance machine learning performance. The relative strengths and weaknesses of different machine learning methods are compared. The final section discusses the challenges associated with machine learning methods in HEAs and, based on these challenges, offers insights into future directions.

2. General Model Process

Although machine learning is a versatile and powerful tool, a single machine learning model typically applies only to a specific problem. This section describes the general process for developing a machine learning model, with a particular focus on recent advancements in the field of HEAs.

2.1. Data Collection

The first step in developing a machine learning model is to collect datasets. The dataset should comprise the targets of the problem and target-related, readily accessible inputs that will be available when the finished model is used. DFT is a commonly employed method for generating datasets and offers high generalizability. A DFT-calculated dataset can be employed not only to fit interatomic potentials for MD simulations but also to directly fit the energy, magnetic moment, and other properties of HEAs. Tran et al. calculated the formation energies and magnetic moments of hundreds of alloys in the FeCoNiCrMn/Pd system [60], providing a dataset that can be used for the development of machine learning models. It is important that DFT data used for the same model are computed with the same methods and parameters, in particular the pseudopotentials and energy cutoffs. This ensures that energy differences between the data realistically reflect the effects of atomic arrangements and makes it easier for the model to learn the relationships. The size and distribution of the dataset are important for model performance, because machine learning models are always stronger at interpolation than at extrapolation. To accelerate the DFT calculation of datasets, Liu et al. developed the effective pair interaction (EPI) model [61], an Ising-like model that retains only effective pair interactions and neglects higher-order interactions [53]. The EPI model can significantly accelerate the generation of DFT-quality data, and a machine learning model trained on an EPI dataset is an effective way to accelerate MD simulations of HEAs [54]. For the prediction of material properties, it is also possible to download datasets directly from computational databases, including the Open Quantum Materials Database of Northwestern University [62], the Materials Project of Lawrence Berkeley National Laboratory [63], AFLOW [64], Materials Cloud [65], and so on.
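To make the EPI idea concrete, the following minimal sketch evaluates an Ising-like pairwise configurational energy on a toy lattice. The pair interaction matrix and the four-site chain are illustrative placeholders, not fitted EPI coefficients or the implementation of Ref. [61].

```python
import numpy as np

# Hypothetical nearest-neighbor pair interactions V[a][b] (eV) for a
# 4-element alloy; real EPI coefficients would be fitted to DFT energies.
elements = ["Nb", "Mo", "Ta", "W"]
rng = np.random.default_rng(0)
V = rng.normal(0.0, 0.05, size=(4, 4))
V = (V + V.T) / 2  # pair interactions are symmetric

def epi_energy(occupancy, bonds):
    """Ising-like configurational energy: sum of effective pair
    interactions over nearest-neighbor bonds, with no higher-order terms."""
    return sum(V[occupancy[i], occupancy[j]] for i, j in bonds)

# Toy 4-site periodic chain standing in for a real lattice.
occupancy = rng.integers(0, 4, size=4)          # element index per site
bonds = [(0, 1), (1, 2), (2, 3), (3, 0)]
print([elements[k] for k in occupancy])
print(f"EPI energy: {epi_energy(occupancy, bonds):.4f} eV")
```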
Experimentation is also an important method of acquiring a dataset. As with DFT calculations, the dataset should be obtained under the same experimental conditions. Several experimental databases exist, such as the Pauling File [66], the High Throughput Experimental Materials database [67], and the Materials Experiment and Analysis Database [68], and some high-performance machine learning models of high-entropy solid solutions or alloys are built on them [51,69]. There are also databases focusing on high-entropy alloys. For example, Borg et al. developed a database containing the mechanical properties and phases of 1545 high-entropy alloys [70,71,72], Gao et al. developed a database containing 1252 solid solutions and intermetallic compounds [73], and the dataset of Kube et al. contains 2425 quinary alloys [74]. Feng et al. constructed a machine learning model for high-entropy alloy phase classification [75] using the databases of Gao et al. [73] and Kube et al. [74] and evaluated its performance on both, which again validates the importance of a sufficient amount of data for machine learning performance. Even with the use of transfer learning, a technique that improves performance on small datasets, the classification accuracy on the database with 355 entries [73] is still lower than that on the database with 2425 entries [74].

2.2. Descriptor Selection

Descriptors are the inputs to machine learning models. The essence of machine learning is to fit the relationship between descriptors and predictions, so it is important to find descriptors that correlate strongly with the prediction targets. To facilitate material development, elemental types and ratios are often used; in machine learning, elemental types are typically represented by one-hot encoding [76] or element embeddings [77]. For the prediction of a specific material property, it is necessary to identify descriptors associated with that property. To predict the hardness of HEAs, for example, the melting temperature of the alloy is an appropriate descriptor, given that it is an indirect measure of metallic bond strength [52]. Some empirical descriptors, such as the mixing enthalpy and mixing entropy, are also often used [78]. A well-established method of descriptor selection is to filter from a vast descriptor space. Zhang et al. used a genetic algorithm to screen 9 models and 70 descriptors [79]. Eventually, a support vector machine with four descriptors (the average atomic number, the difference in electronegativities, the difference in covalent radii, and the boiling temperature) proved the best model-descriptor combination for predicting the crystal structure of HEAs, with an accuracy of more than 90% [79].
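As a minimal example of computing an empirical descriptor of this kind, the sketch below evaluates the ideal configurational mixing entropy, ΔS_mix = −R Σ c_i ln c_i; the composition is illustrative.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(fractions):
    """Ideal configurational mixing entropy: -R * sum(c_i * ln(c_i))."""
    c = np.asarray(fractions, dtype=float)
    c = c[c > 0] / c.sum()  # drop absent elements, normalize to 1
    return -R * np.sum(c * np.log(c))

# Equiatomic quinary alloy (e.g., CoCrFeMnNi): ΔS_mix = R ln 5 ≈ 1.61 R.
print(f"{mixing_entropy([0.2] * 5) / R:.3f} R")
```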
The sure independence screening and sparsifying operator (SISSO) is a data-driven approach that combines symbolic regression and compressed sensing to construct descriptors of target attributes; it finds optimal descriptors through operations between low-dimensional descriptors [80]. Shang et al. complete the phase classification of HEAs using SISSO with only two descriptors [81]. These descriptors also carry physical meaning, making it possible to draw accurate, physically interpretable 2D phase diagrams (Figure 1) [81].
Based on the SRO in HEAs, Liu et al. propose a descriptor built on Voronoi analysis and Shannon entropy (VASE) [82]. Voronoi analysis is employed as the basis for introducing a Shannon entropy that directly represents information about the spatial arrangement of atoms, so VASE responds effectively to the disordered atomic occupancies in HEAs. The coefficient of determination (R²) for predicting the formation energy of the FeCoNiAlTiCu system is 0.918. VASE is thus an efficient representation based on the unique atomic arrangements in HEAs. It is worth mentioning that the model chosen by Liu et al. for VASE is a Light Gradient Boosting Machine [83], which shows that, with strongly correlated descriptors, even relatively simple models can achieve good performance.
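A hedged sketch of the entropy part of such a descriptor is shown below: it computes the Shannon entropy of the chemical composition of one site's Voronoi neighbors. The neighbor list is a stand-in; a real implementation would obtain it from a Voronoi tessellation of the structure, as in Ref. [82].

```python
import numpy as np
from collections import Counter

def neighbor_shannon_entropy(neighbor_species):
    """Shannon entropy (bits) of the chemical composition of one site's
    Voronoi neighbors; a VASE-style measure of local chemical disorder."""
    counts = np.array(list(Counter(neighbor_species).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# A site with 12 Voronoi neighbors (FCC-like coordination), maximally mixed:
neighbors = ["Fe", "Co", "Ni", "Al", "Ti", "Cu"] * 2
print(f"{neighbor_shannon_entropy(neighbors):.3f} bits")  # log2(6) ≈ 2.585
```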

2.3. Model Selection and Development

The efficacy of machine learning algorithms varies according to their specific characteristics and inherent limitations, so it is essential to select an algorithm suited to the task objectives and the size of the dataset. Machine learning algorithms can be divided into two main groups. One group comprises simple regression models and ensemble learning models, including the support vector machine (SVM) [84], classification and regression tree (CART) [85], k-nearest neighbors (KNN) [86], etc. These classical algorithms are relatively simple, easier to understand, and make it easier for researchers to find patterns within the model. The other group is deep learning algorithms, represented by artificial neural networks (ANNs) [37]. An ANN consists of multiple functional layers, each comprising multiple neurons. The simplest ANN is the multi-layer perceptron, a feed-forward network with fully connected input, hidden, and output layers. Other frequently used neural networks include convolutional neural networks (CNNs) [87], recurrent neural networks (RNNs) [88], and the Transformer [89], which underlies large language models. In contrast to the first group, deep learning models can exhibit high performance but are typically too complex to be readily comprehended. Cheng et al. use multiple machine learning algorithms to predict the hardness of HEAs [90]; the root mean square errors (RMSEs) on the test set of each model are shown in Table 1. In their work, the ANN outperforms SVM and KNN, illustrating the strength of deep learning. Nevertheless, simple machine learning models may outperform deep neural networks when the dataset size is limited [91], because complex networks are prone to overfitting on relatively small datasets.
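As an illustration of such a comparison (with synthetic data standing in for a real hardness dataset, not the setup of Ref. [90]), the sketch below trains an SVM regressor and a small neural network with scikit-learn and reports test RMSEs.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Synthetic stand-in: descriptor matrix X -> hardness-like target y.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))
y = X @ rng.normal(size=6) + 0.3 * np.sin(3 * X[:, 0]) + rng.normal(0, 0.1, 300)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "SVM": make_pipeline(StandardScaler(), SVR(C=10.0)),
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(64, 64),
                                      max_iter=2000, random_state=0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: test RMSE = {rmse:.3f}")
```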
Graph neural networks (GNNs) have been extensively developed and have demonstrated remarkable performance in a multitude of materials science tasks [92]. Graphical representations are highly versatile in capturing the local chemical environment, which underlies the success of GNNs in materials science. In the graph representations used by GNNs, atoms are encoded as nodes and chemical bonds as edges, and a GNN can additionally incorporate physical information such as charge and spin. Many successful GNN-based machine learning models exist for materials science, such as SchNet [93], CGCNN [94], and MEGNet [95]. The capacity of GNNs to represent crystals enables these models to accurately characterize the material properties of a diverse array of systems, attaining state-of-the-art performance. Ghouchan et al. propose a graph-based KNN method for predicting phases in HEAs [96]. In their work, an HEA interaction network is defined in which each HEA material is represented as a node on the graph. The edges of the graph represent the correlation between materials, with a stronger correlation indicating a higher degree of similarity; this correlation is obtained by calculating the similarity between material descriptors. KNN then selects the neighbors of each compound in the network and predicts the phase of the target compound using correlation as the voting weight. Part of the HEA interaction network can be seen in Figure 2. Each node represents a distinct material; nodes of the same color are deemed highly similar, and materials of the same color usually belong to the same phase. This model demonstrates well the differences and similarities between HEA materials [96].
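A minimal sketch of similarity-weighted phase voting of this kind is given below; the similarity values and phase labels are invented for illustration and do not reproduce the network construction of Ref. [96].

```python
import numpy as np

def weighted_knn_phase(sim_to_query, phases, k=3):
    """Predict a phase by similarity-weighted voting over the k alloys
    most similar to the query compound."""
    top = np.argsort(sim_to_query)[-k:]  # indices of the k nearest neighbors
    votes = {}
    for i in top:
        votes[phases[i]] = votes.get(phases[i], 0.0) + sim_to_query[i]
    return max(votes, key=votes.get)

phases = ["BCC", "FCC", "BCC", "IM", "FCC", "BCC"]  # labeled alloys
sim = np.array([0.9, 0.2, 0.8, 0.1, 0.3, 0.7])      # similarity to the query
print(weighted_knn_phase(sim, phases))               # -> BCC
```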
Because HEAs lack LRO, their properties are influenced not only by the local environment of the atoms but also by long-range disorder. At the same time, not all target properties are strongly influenced by the local atomic structure. Therefore, a machine learning model that can respond to long-range properties is important for HEAs. Wang et al. develop the elemental convolution graph neural network (ECNet) [97], which enables element-wise features to serve as both intermediate and final descriptors, facilitating the extraction of knowledge about both atomic information and crystal structures; these features can be updated through the process of learning target material properties. The workflow of ECNet is shown in Figure 3: the atomic local information obtained from the graph network is converted into global elemental properties by elemental convolution to obtain long-range average properties. Wang et al. predict the formation free energies and magnetic moments of HEAs using ECNet and plot ternary diagrams in Figure 4. These findings advance the physical understanding of HEAs and reflect another role of machine learning: overcoming the difficulty of a near-infinite compositional space in material screening, especially in high-dimensional materials such as HEAs.
The effect of LRO on high-entropy alloys is considered in the work of Zhang et al. [98] The long-range structure of HEAs is a disordered combination of different local environments. Zhang et al. develop an aggregation module on top of a GNN model to reflect long-range structures: the GNN learns representations of the local environments, which capture the SRO of HEAs, and the aggregation module randomly combines these representations into global representations of HEAs. The mean absolute errors in the prediction of the bulk modulus and Young's modulus of HEAs with the aggregation module are below 8 GPa and 11 GPa, respectively [98].
The interatomic potential is a function describing the dependence of the potential energy on the atomic positions, which is crucial for molecular dynamics simulations. Recently, machine learning potentials have become a favorable tool for molecular dynamics simulations of complex materials [99]. In comparison with traditional potentials, machine learning potentials exhibit two principal characteristics. First, they adopt a data-driven methodology comprising training, validation, and testing phases, utilizing datasets derived from first-principles calculations, which is beneficial for accuracy. Second, they assume a flexible, rather than fixed, functional form, facilitating systematic improvement in accuracy [4]. Consequently, high-quality machine learning potentials can describe the interatomic potentials of complex systems with a minimal number of parameters and achieve accuracy comparable to quantum mechanical methods such as DFT [100]. There are already many proven machine learning potentials, such as the Gaussian approximation potential [101], the spectral neighbor analysis potential [102], the Behler–Parrinello neural-network potential [103], the deep potential [104], the atomic cluster expansion [105], the moment tensor potential [106], and the neuroevolution potential [107,108]. In the Gaussian approximation potential, the total energy is predicted by Gaussian process regression, which measures the degree of similarity to reference atomic environments; different kernel functions can be chosen depending on the reference environments. Constructing machine learning potentials that respect physical symmetries to improve prediction accuracy is also common for HEAs [109]; examples include the EPI mentioned in Section 2.1 [53,54], the low-rank potential [110,111], the Gaussian approximation potential [112], and the moment tensor potential [113]. Li et al. use the spectral neighbor analysis potential to study the Peierls stress for both screw and edge dislocations in equiatomic NbMoTaW HEAs [48] and find strong evidence of Nb segregation to the grain boundaries of the NbMoTaW alloy in machine learning potential-assisted MD simulations. These machine learning potentials can also serve as descriptors for property prediction: Pandey et al. predict the hardness and modulus of MoNbTaTiW alloys using moment tensor potential-based machine learning [114]. Song et al. develop a neuroevolution potential model covering 16 elemental metals and their alloys, achieving transfer from unary and binary materials to multi-component alloys [115]; their model predicts the formation energies of multi-component alloys and is used to study plasticity and primary radiation damage in MoTaVW alloys.
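The regression step behind a Gaussian-approximation-style potential can be sketched as below: kernel regression from environment descriptors to energies, with predictive uncertainty. The descriptors and energies are synthetic, and a real GAP operates per atomic environment with physically motivated kernels such as SOAP rather than a plain RBF kernel.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic stand-in: each structure is summarized by a 4D descriptor,
# and the target is its (toy) DFT energy in eV.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(80, 4))
E = np.sum(X**2, axis=1) + rng.normal(0, 0.01, size=80)

# Similarity to the reference environments is measured by the kernel.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-4))
gpr.fit(X, E)

X_new = rng.uniform(-1, 1, size=(3, 4))
E_pred, E_std = gpr.predict(X_new, return_std=True)
for e, s in zip(E_pred, E_std):
    print(f"E = {e:.3f} +/- {s:.3f} eV")  # prediction with uncertainty
```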
Another approach to constructing machine learning potentials is to predict interatomic potentials directly from symmetric atomic structure inputs. A GNN represents the atomic structure as an undirected graph that directly satisfies translational invariance and rotational covariance, which makes GNNs a powerful tool for predicting machine learning potentials; the same characteristics that let GNNs predict material properties from crystal structures make them well suited to assisting MD simulations. Park et al. use a GNN to predict interatomic forces directly from material structures, and the predicted forces are used in MD simulations to obtain new structures [42]. Wu et al. use a GNN to predict the interatomic potential, obtaining MD-simulated lattices with less than 1% error [116]. This approach eliminates the need to hand-design material features that satisfy translational invariance and rotational covariance, and it speeds up calculations.
Machine learning potentials can be used to quantify and understand SRO in complex materials [117]. There are millions of possible SRO configurations in a quinary alloy, and it is difficult to traverse them with DFT, so machine learning potential-assisted MD simulation becomes a powerful tool for studying SRO in complex materials. Chen et al. use the neuroevolution potential to study SRO in GeSn alloys [118]; with machine learning potentials extending the accessible scale of the study, they find the coexistence of two types of SRO in GeSn alloys.

2.4. Performance Analysis

To ensure an objective and accurate evaluation, the test samples must be representative and uncorrelated with the training samples; otherwise, scores may be artificially high. For classification tasks, commonly employed metrics include recall, precision, the receiver operating characteristic curve, and the area under that curve. For regression tasks, the RMSE and the R² score are frequently used [119].
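A minimal sketch of computing these metrics with scikit-learn is given below; the predictions and labels are invented for illustration.

```python
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score, roc_auc_score

# Regression: RMSE and R^2 from true vs. predicted values.
y_true = np.array([2.1, 3.4, 1.8, 4.0])
y_pred = np.array([2.0, 3.6, 1.7, 3.8])
rmse = mean_squared_error(y_true, y_pred) ** 0.5
print(f"RMSE = {rmse:.3f}, R2 = {r2_score(y_true, y_pred):.3f}")

# Classification: area under the ROC curve from predicted probabilities.
labels = np.array([0, 1, 1, 0, 1])
probs = np.array([0.2, 0.9, 0.7, 0.4, 0.6])
print(f"AUC = {roc_auc_score(labels, probs):.3f}")
```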
Although well-trained ML models can make accurate predictions, in many instances the interpretability of the model is also a crucial consideration, since interpretable predictions can offer insights into the physics. Pei et al. analyze the contributions of various descriptors to the ductility of Mg alloys by machine learning and show that the two mechanisms, dislocation nucleation and dislocation cross-slip, are always strongly correlated, rather than only one being dominant, as is often assumed [69]. Pei et al. also propose a machine-learning-based solid solution formation rule that contains important features such as the bulk modulus [51], which is not considered in the earlier Hume-Rothery rules [120]. SHapley Additive exPlanations (SHAP) is another effective method for machine learning interpretation [121]: it provides physical understanding by quantifying the importance and contribution of each descriptor.
The interpretability of deep neural networks is often hindered by their non-linearity and high dimensionality. At present, one of the most established methods for visualizing high-dimensional data is the t-distributed stochastic neighbor embedding (t-SNE) algorithm [122], which has been successfully applied to visualize phase classification in HEAs [55]. The t-SNE algorithm maps high-dimensional data into two- or three-dimensional space while ensuring that mapped data points and their neighbors maintain spatial relationships analogous to those in the original space, providing a straightforward visualization of classification problems. If the data are well separated in the low-dimensional space, the model can readily distinguish between them; conversely, if the data remain clustered together, the model exhibits reduced classification accuracy. Lee et al. use t-SNE to examine how classification accuracy develops across network layers [123]. Their network for high-entropy alloy phase classification has five hidden layers. After the first three hidden layers, the differences between the data are not yet well discriminated, the points in the t-SNE plot remain clustered together, and the classification accuracy is low. After all five hidden layers, the classification accuracy reaches 93.17%, and the points in the t-SNE plot are well separated (Figure 5) [123].
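A minimal sketch of this kind of visualization is shown below, with random activations standing in for the hidden-layer outputs of a phase classifier.

```python
import numpy as np
from sklearn.manifold import TSNE

# Stand-in hidden-layer activations: two loose clusters of 32-dim vectors.
rng = np.random.default_rng(0)
acts = np.vstack([rng.normal(0, 1, (100, 32)),
                  rng.normal(3, 1, (100, 32))])

# Map to 2D; well-separated clusters suggest the layer has learned
# phase-discriminative features.
emb = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(acts)
print(emb.shape)  # (200, 2), ready for a scatter plot colored by phase
```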
Lee et al. also develop an interpretable machine learning model for phase classification of HEAs based on the contributions of descriptors [124]. In their breakdown (BD) approach, the model prediction for a single observation is decomposed into contributions attributable to the different input variables, and the contributions of descriptors to different phases are separated to obtain a local contribution map. For example, the mixing entropy of NbTaTiV has a negative contribution (see Figure 6a), while the mean_MeltingT, mean_NValence, and mean_NsValence variables are particularly significant, exerting the greatest influence on the formation of the BCC phase. Tracking the probability of each phase predicted by the model in the NbTaTiV system as a single descriptor is varied helps to reveal the dependence of predictions on specific descriptors, as shown in Figure 6b.
In addition to interpretability, the analysis of results should also cover generalization ability and overfitting, particularly for small datasets. Five-fold cross-validation is a commonly used method for model validation [125]. A good machine learning algorithm should show little difference in accuracy across the five models; this guards against particular data splits causing falsely high performance with low generalization. Victor et al. propose a cross-validation method for classification and regression models to determine SISSO descriptors, avoiding the overfitting problem of traditional SISSO algorithms in HEAs and successfully improving performance on the test set [126].
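A minimal sketch of five-fold cross-validation with scikit-learn is shown below; the data and model are placeholders, and the spread of the per-fold scores is the quantity of interest.

```python
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                       # toy descriptor matrix
y = X[:, 0] - 2 * X[:, 1] + rng.normal(0, 0.1, 200)

cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(RandomForestRegressor(random_state=0), X, y,
                         cv=cv, scoring="r2")
# A small spread across folds indicates the score is not an artifact
# of one particular train/test split.
print(f"R2 per fold: {np.round(scores, 3)}, mean = {scores.mean():.3f}")
```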
Experimental validation is the most effective test of machine learning performance. Li et al. validate 18 HEAs against experimental reports [127]; the phases of 13 of these HEAs are consistent with the machine learning results. The experimental results of Chen et al. are consistent with machine learning predictions of eutectic alloys [128]: all of the experimentally synthesized HEAs are the intended eutectic alloys.

3. Special Machine Learning Algorithms

Some machine learning algorithms differ in function or process from general machine learning algorithms. For instance, generative models serve a purpose complementary to that of predictive models. Other special algorithms, such as data augmentation and transfer learning, can enhance the predictive performance of models. We describe each of them in this section.

3.1. Generative Models

Applying a predictive machine learning model requires inputting elemental ratios or structural information to obtain material properties. As the number of possible combinations increases exponentially with the range of elemental species, globally screening material properties can become a large and complex task. To overcome these problems, generative machine learning models have been proposed [129]. In contrast to predictive models, generative models receive the desired material properties and provide combinations directly from the potential material space. Such models can identify and extract hidden patterns from complex databases and use them to generate novel combinations of structures that satisfy specific attribute requirements, without additional human input [130,131]. The most commonly used generative models include the Generative Adversarial Network (GAN) [132], the Variational Autoencoder (VAE) [133], flow-based models [134], diffusion models [135], and so on.
Generative models have been successfully used for HEAs. Li et al. used a GAN to generate single-phase HEAs and obtained 188 potential materials with densities below 8 g/cm³ and prices below USD 12/g [127]. A VAE model for generating complex eutectic alloys is developed by Chen et al. [128], whose algorithm is illustrated in Figure 7. The encoder network maps the input components and descriptors into a two-dimensional latent space, where the eutectic and non-eutectic components separate into two distinct groups. The decoder network then generates combinations from the corresponding groups in the latent space. The quality of the generated data is also evaluated in conjunction with an ANN model for predicting eutectic alloys. The generated candidate eutectic compositions, ternary alloy combinations, were synthesized and observed under a scanning electron microscope. The experimental results demonstrate that the VAE is effective in generating potential eutectic alloys.
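A minimal VAE sketch with a two-dimensional latent space, echoing the encoder/decoder scheme of Figure 7, is given below. The input dimension, architecture, and data are illustrative placeholders, not the model of Ref. [128].

```python
import torch
import torch.nn as nn

class AlloyVAE(nn.Module):
    def __init__(self, n_features=16, latent_dim=2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU())
        self.mu = nn.Linear(32, latent_dim)       # mean of q(z|x)
        self.logvar = nn.Linear(32, latent_dim)   # log-variance of q(z|x)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, n_features))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparam.
        return self.decoder(z), mu, logvar

def vae_loss(x, x_rec, mu, logvar):
    # Reconstruction error + KL divergence to the standard normal prior.
    rec = ((x - x_rec) ** 2).sum(dim=1).mean()
    kl = (-0.5 * (1 + logvar - mu**2 - logvar.exp()).sum(dim=1)).mean()
    return rec + kl

vae = AlloyVAE()
x = torch.randn(64, 16)            # stand-in composition/descriptor vectors
x_rec, mu, logvar = vae(x)
print(f"loss: {vae_loss(x, x_rec, mu, logvar).item():.3f}")

# Generation: decode samples drawn from the latent space.
new_candidates = vae.decoder(torch.randn(5, 2))
print(new_candidates.shape)        # 5 generated candidate vectors
```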

3.2. Data Augmentation

Notwithstanding the rapid evolution of methodologies and computational resources, DFT computation remains costly and inefficient. As a result, datasets for developing materials machine learning models are typically limited to fewer than 10,000 entries, with larger datasets being computationally prohibitive; beyond that scale, the computational-cost advantage of machine learning models is lost. Furthermore, the potential material space for multi-component materials such as HEAs is vast [58], and the limited dataset size has become a significant challenge impeding model accuracy.
Data augmentation increases the number of samples in a dataset by processing the original data and adding data containing new information. Ye et al. added Gaussian noise to the original data and fed these noisy data into the model during training as a form of data augmentation [136]. This approach achieved good results in predicting the hardness of HEAs. Ye et al. also show the effect of the amount of augmented data and the amount of noise on model performance, as given in Table 2. The data-augmented model performs well on the validation set, which the model has never seen before, suggesting that data augmentation improves performance away from the training set and thereby the generalization of the model. The model with a moderate level of noise performs best, while too much augmented data degrades performance.
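A minimal sketch of Gaussian-noise augmentation of this kind follows; the replication factor and noise level are illustrative, not the settings of Ref. [136].

```python
import numpy as np

def augment_with_noise(X, y, n_copies=3, sigma=0.02, seed=0):
    """Replicate each sample n_copies times with Gaussian noise added to
    the descriptors, keeping the target value unchanged."""
    rng = np.random.default_rng(seed)
    X_rep = np.repeat(X, n_copies, axis=0)
    y_rep = np.repeat(y, n_copies, axis=0)
    X_noisy = X_rep + rng.normal(0.0, sigma, size=X_rep.shape)
    return np.vstack([X, X_noisy]), np.concatenate([y, y_rep])

X = np.random.default_rng(1).normal(size=(50, 6))  # toy descriptor matrix
y = X[:, 0] + 0.5 * X[:, 1]                        # toy hardness target
X_aug, y_aug = augment_with_noise(X, y)
print(X.shape, "->", X_aug.shape)                  # (50, 6) -> (200, 6)
```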
Generative models such as GANs can also be used for data augmentation. Chen et al. develop a data-augmented model for classifying phases in HEAs using data generated by a GAN model [137]; its classification accuracy reaches 96.08%.

3.3. Transfer Learning

Transfer learning is a valuable machine learning technique that facilitates the sharing of knowledge between related domain models [138] and can also address the problem of insufficient datasets. Transfer learning typically pre-trains a network on a large, easily accessible dataset, after which part of the network structure and parameters is kept fixed and the model is fine-tuned on a small dataset. Wang et al. reduce the mean absolute error for predicting the magnetic moments of HEAs from 0.197 μB/atom to 0.091 μB/atom by pre-training the model on binary and ternary alloys [97].
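A minimal sketch of this freeze-and-fine-tune pattern in PyTorch is shown below; the architecture, data, and training settings are placeholders rather than the models of Refs. [97] or [75].

```python
import torch
import torch.nn as nn

# Small MLP whose first two blocks act as a (nominally pretrained)
# feature extractor; in practice it would be trained on a large
# binary/ternary-alloy dataset first.
model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

# Transfer: freeze the feature-extractor parameters ...
for layer in list(model.children())[:4]:
    for p in layer.parameters():
        p.requires_grad = False

# ... and fine-tune only the final head on the small HEA dataset.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3)
loss_fn = nn.MSELoss()

X_small = torch.randn(32, 16)  # stand-in HEA descriptors
y_small = torch.randn(32, 1)   # stand-in magnetic moments
for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X_small), y_small)
    loss.backward()
    optimizer.step()
print(f"fine-tuning loss: {loss.item():.4f}")
```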
Feng et al. propose a transfer learning algorithm based on a CNN [75]; the process is shown in Figure 8. They regard the convolutional layers of the CNN as a feature extractor capable of extracting generic, transferable elemental feature representations from diverse materials. They train the feature extractor on 228,676 compounds, followed by a phase classification model trained on two HEA datasets, achieving classification accuracies of 0.93 and 0.939, respectively. This shows that the feature extractor captures generic elemental features well and can help machine learning models on small datasets achieve better performance [75].

4. Challenges and Future Directions

Today, machine learning still faces many challenges in the field of high-entropy alloys. Past applications of machine learning to high-entropy alloys have predominantly concentrated on phase classification and the prediction of mechanical properties. Using machine learning to capture more underlying physical properties, such as formation energy and magnetic moments, is one future direction. Unlike the mechanical properties, these underlying physical properties vary dramatically with elemental composition and are strongly correlated with the structure of the alloy. Including more underlying physical information, such as the translational and rotational symmetries of crystal structures, in machine learning descriptors is a key ingredient for successfully predicting properties such as formation energies. However, introducing structure means that more calculations are required to prepare the training set, so balancing the cost of developing algorithms against their accuracy is also a problem that requires attention. At present, the idea of using machine learning to capture underlying physical information is being successfully applied to assist molecular dynamics simulations: the accuracy of predicted interatomic potentials is improved by constructing machine learning potentials that satisfy translational invariance and rotational covariance, or by using graph neural networks that satisfy these symmetries directly [42].
Another challenge is obtaining high-quality datasets. Both experiments and simulations for high-entropy alloys are time-consuming, and experimental or simulation data from different researchers are often obtained under different conditions and difficult to calibrate against one another. As a result, datasets for high-entropy alloys are often small, and because of the large compositional space of high-entropy alloys, small datasets cause more problems than in other materials. Although we show in Section 3 that generative models and data augmentation are used to improve prediction accuracy with small datasets, these methods do not fully solve the problems posed by small datasets. In the future development of machine learning models, constructing uniform and accurate datasets is important. In addition, the mechanical properties of high-entropy alloys are affected by defects, dislocations, and so on, so more materials containing defects and dislocations should be included in machine learning datasets.
The third problem is the reliability of generative models. Generative models such as GANs often rely on the similarity of the original data to generate new data. Currently, generative models are mainly applied to generate single-phase high-entropy alloys, because high-entropy alloys of the same phase tend to have similar structures. However, the relationship between the structure and properties of materials often does not follow similarity: identical properties may correspond to completely different structures, and similar structures may differ greatly in properties. Therefore, compositions obtained from generative models are usually used for data augmentation to improve prediction accuracy [137]. Meanwhile, the intricate and fluctuating nature of high-entropy alloys makes it challenging for generative models to directly meet requisite mechanical properties or magnetic moments. To satisfy the need for inverse material design from properties to structures, using generative models to obtain structures that satisfy performance requirements is a major future challenge.

5. Conclusions

The synergy between machine learning and materials science has fostered the emergence of a new, exciting, and rapidly growing field. High-entropy alloys are particularly interesting candidates for demonstrating the power of machine learning due to their superior mechanical properties, vast compositional space, and complex chemical interactions. Models applicable to high-entropy alloys continue to be proposed at every stage of the machine learning process. Some works improve model performance from a data or algorithmic perspective; others focus on the unique properties of high-entropy alloys, exploring representations of their short-range and long-range order and seeking more physically faithful descriptions. These efforts have greatly improved the effectiveness of machine learning for high-entropy alloys, and the unique large-scale screening capability of machine learning is providing new impetus for their development. In addition, interpretable machine learning models for high-entropy alloys have progressed, opening up new directions. Generative models can provide candidate element proportions and structures, greatly reducing the difficulty of predictive-model searches in high-dimensional spaces. Data augmentation and transfer learning offer novel avenues for enhancing prediction performance for high-entropy alloys with limited data. Overall, machine learning is well developed in the field of high-entropy alloys, but many exciting and challenging problems remain to be solved.

Author Contributions

J.N. and Y.S. wrote the text of the review. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the National Natural Science Foundation of China (grant no. 92270104).

Acknowledgments

Figure 1 is reprinted from Ref. [81] with permission of Royal Society of Chemistry (license ID: 1542452-1). Figure 2 is reprinted from Ref. [96] under the Creative Commons Attribution license (https://creativecommons.org/licenses/by/4.0/ accessed on 30 November 2024). Figure 3 and Figure 4 are reprinted from Ref. [97] under the Creative Commons Attribution license. Figure 5 is reprinted from Ref. [123] with permission of Elsevier (license ID: 5901841487027). Figure 6 is reprinted from Ref. [124] under the Creative Commons Attribution license. Figure 7 is reprinted from Ref. [128] under the Creative Commons Attribution-NonCommercial-NoDerivatives license (http://creativecommons.org/licenses/by-nc-nd/4.0/ accessed on 30 November 2024). Figure 8 is reprinted from Ref. [75] with permission of Elsevier (license ID: 5901870412321).

Conflicts of Interest

The authors declare no competing interests.

Abbreviations

The following abbreviations are used in this manuscript:
HEA  High-entropy alloy
RHEA  Refractory high-entropy alloy
SRO  Short-range order
LRO  Long-range order
DFT  Density functional theory
MD  Molecular dynamics
CALPHAD  Phase diagram calculation
AI  Artificial intelligence
EPI  Effective pair interaction
SISSO  Sure independence screening and sparsifying operator
AM  Amorphous
IM  Intermetallic
SS  Solid solution
BCC  Body-centered cubic
FCC  Face-centered cubic
VASE  Voronoi analysis and Shannon entropy
R²  Coefficient of determination
SVM  Support vector machine
CART  Classification and regression tree
KNN  k-nearest neighbor
ANN  Artificial neural network
CNN  Convolutional neural network
RNN  Recurrent neural network
RMSE  Root mean square error
GNN  Graph neural network
ECNet  Elemental convolution graph neural network
SHAP  SHapley Additive exPlanations
BD  Breakdown
t-SNE  t-distributed stochastic neighbor embedding
GAN  Generative adversarial network
VAE  Variational autoencoder

References

  1. Yeh, J.W.; Chen, S.K.; Lin, S.J.; Gan, J.Y.; Chin, T.S.; Shun, T.T.; Tsau, C.H.; Chang, S.Y. Nanostructured high-entropy alloys with multiple principal elements: Novel alloy design concepts and outcomes. Adv. Eng. Mater. 2004, 6, 299–303. [Google Scholar] [CrossRef]
  2. Cantor, B.; Chang, I.; Knight, P.; Vincent, A. Microstructural development in equiatomic multicomponent alloys. Mater. Sci. Eng. A 2004, 375, 213–218. [Google Scholar] [CrossRef]
  3. Liu, X.; Xu, P.; Zhao, J.; Lu, W.; Li, M.; Wang, G. Material machine learning for alloys: Applications, challenges and perspectives. J. Alloys Compd. 2022, 921, 165984. [Google Scholar] [CrossRef]
  4. Kumar, A.; Singh, A.; Suhane, A. A critical review on mechanically alloyed high entropy alloys: Processing challenges and properties. Mater. Res. Express 2022, 9, 052001. [Google Scholar] [CrossRef]
  5. Chen, C.; Zhang, H.; Fan, Y.; Wei, R.; Zhang, W.; Wang, T.; Zhang, T.; Wu, K.; Li, F.; Guan, S.; et al. Improvement of corrosion resistance and magnetic properties of FeCoNiAl0.2Si0.2 high entropy alloy via rapid-solidification. Intermetallics 2020, 122, 106778. [Google Scholar] [CrossRef]
  6. Kai, W.; Li, C.; Cheng, F.; Chu, K.; Huang, R.; Tsay, L.; Kai, J. Air-oxidation of FeCoNiCr-based quinary high-entropy alloys at 700–900 C. Corros. Sci. 2017, 121, 116–125. [Google Scholar] [CrossRef]
  7. Pu, G.; Lin, L.; Ang, R.; Zhang, K.; Liu, B.; Liu, B.; Peng, T.; Liu, S.; Li, Q. Outstanding radiation tolerance and mechanical behavior in ultra-fine nanocrystalline Al1.5CoCrFeNi high entropy alloy films under He ion irradiation. Appl. Surf. Sci. 2020, 516, 146129. [Google Scholar] [CrossRef]
  8. Lin, Y.; Yang, T.; Lang, L.; Shan, C.; Deng, H.; Hu, W.; Gao, F. Enhanced radiation tolerance of the Ni-Co-Cr-Fe high-entropy alloy as revealed from primary damage. Acta Mater. 2020, 196, 133–143. [Google Scholar] [CrossRef]
  9. George, E.P.; Raabe, D.; Ritchie, R.O. High-entropy alloys. Nat. Rev. Mater. 2019, 4, 515–534. [Google Scholar] [CrossRef]
  10. Cantor, B. Multicomponent high-entropy Cantor alloys. Prog. Mater. Sci. 2021, 120, 100754. [Google Scholar] [CrossRef]
  11. Otto, F.; Yang, Y.; Bei, H.; George, E.P. Relative effects of enthalpy and entropy on the phase stability of equiatomic high-entropy alloys. Acta Mater. 2013, 61, 2628–2638. [Google Scholar] [CrossRef]
  12. Ma, D.; Grabowski, B.; Körmann, F.; Neugebauer, J.; Raabe, D. Ab initio thermodynamics of the CoCrFeMnNi high entropy alloy: Importance of entropy contributions beyond the configurational one. Acta Mater. 2015, 100, 90–97. [Google Scholar] [CrossRef]
  13. Senkov, O.; Wilks, G.; Miracle, D.; Chuang, C.; Liaw, P. Refractory high-entropy alloys. Intermetallics 2010, 18, 1758–1765. [Google Scholar] [CrossRef]
  14. Senkov, O.N.; Wilks, G.B.; Scott, J.M.; Miracle, D.B. Mechanical properties of Nb25Mo25Ta225W25 and V20Nb20Mo20Ta20W20 refractory high entropy alloys. Intermetallics 2011, 19, 698–706. [Google Scholar] [CrossRef]
  15. Guo, N.; Wang, L.; Luo, L.; Li, X.; Chen, R.; Su, Y.; Guo, J.; Fu, H. Hot deformation characteristics and dynamic recrystallization of the MoNbHfZrTi refractory high-entropy alloy. Mater. Sci. Eng. A 2016, 651, 698–707. [Google Scholar] [CrossRef]
  16. Shi, Y.; Yang, B.; Xie, X.; Brechtl, J.; Dahmen, K.A.; Liaw, P.K. Corrosion of AlxCoCrFeNi high-entropy alloys: Al-content and potential scan-rate dependent pitting behavior. Corros. Sci. 2017, 119, 33–45. [Google Scholar] [CrossRef]
  17. Rodriguez, A.A.; Tylczak, J.H.; Gao, M.C.; Jablonski, P.D.; Detrois, M.; Ziomek-Moroz, M.; Hawk, J.A. Effect of molybdenum on the corrosion behavior of high-entropy alloys CoCrFeNi2 and CoCrFeNi2Mo0.25 under sodium chloride aqueous conditions. Adv. Mater. Sci. Eng. 2018, 2018, 3016304. [Google Scholar] [CrossRef]
  18. Sarkar, S.; Sarswat, P.K.; Free, M.L. Elevated temperature corrosion resistance of additive manufactured single phase AlCoFeNiTiV0.9Sm0.1 and AlCoFeNiV0.9Sm0.1 HEAs in a simulated syngas atmosphere. Addit. Manuf. 2019, 30, 100902. [Google Scholar] [CrossRef]
  19. Gorr, B.; Mueller, F.; Christ, H.J.; Mueller, T.; Chen, H.; Kauffmann, A.; Heilmaier, M. High temperature oxidation behavior of an equimolar refractory metal-based alloy 20Nb20Mo20Cr20Ti20Al with and without Si addition. J. Alloys Compd. 2016, 688, 468–477. [Google Scholar] [CrossRef]
  20. Gorr, B.; Schellert, S.; Müller, F.; Christ, H.J.; Kauffmann, A.; Heilmaier, M. Current status of research on the oxidation behavior of refractory high entropy alloys. Adv. Eng. Mater. 2021, 23, 2001047. [Google Scholar] [CrossRef]
  21. Singh, P.; Smirnov, A.V.; Johnson, D.D. Atomic short-range order and incipient long-range order in high-entropy alloys. Phys. Rev. B 2015, 91, 224204. [Google Scholar] [CrossRef]
  22. Widom, M. Modeling the structure and thermodynamics of high-entropy alloys. J. Mater. Res. 2018, 33, 2881–2898. [Google Scholar] [CrossRef]
  23. Oh, H.S.; Kim, S.J.; Odbadrakh, K.; Ryu, W.H.; Yoon, K.N.; Mu, S.; Körmann, F.; Ikeda, Y.; Tasan, C.C.; Raabe, D.; et al. Engineering atomic-level complexity in high-entropy and complex concentrated alloys. Nat. Commun. 2019, 10, 2090. [Google Scholar] [CrossRef]
  24. Hu, R.; Jin, S.; Sha, G. Application of atom probe tomography in understanding high entropy alloys: 3D local chemical compositions in atomic scale analysis. Prog. Mater. Sci. 2022, 123, 100854. [Google Scholar] [CrossRef]
  25. Pei, Z.; Li, R.; Gao, M.C.; Stocks, G.M. Statistics of the NiCoCr medium-entropy alloy: Novel aspects of an old puzzle. npj Comput. Mater. 2020, 6, 122. [Google Scholar] [CrossRef]
  26. George, E.P.; Curtin, W.A.; Tasan, C.C. High entropy alloys: A focused review of mechanical properties and deformation mechanisms. Acta Mater. 2020, 188, 435–474. [Google Scholar] [CrossRef]
  27. Gludovatz, B.; Hohenwarter, A.; Catoor, D.; Chang, E.H.; George, E.P.; Ritchie, R.O. A fracture-resistant high-entropy alloy for cryogenic applications. Science 2014, 345, 1153–1158. [Google Scholar] [CrossRef]
  28. Li, Z.; Pradeep, K.G.; Deng, Y.; Raabe, D.; Tasan, C.C. Metastable high-entropy dual-phase alloys overcome the strength–ductility trade-off. Nature 2016, 534, 227–230. [Google Scholar] [CrossRef]
  29. Zhang, Y.; Zhou, Y.J.; Lin, J.P.; Chen, G.L.; Liaw, P.K. Solid-solution phase formation rules for multi-component alloys. Adv. Eng. Mater. 2008, 10, 534–538. [Google Scholar] [CrossRef]
  30. Mak, E.; Yin, B.; Curtin, W. A ductility criterion for bcc high entropy alloys. J. Mech. Phys. Solids. 2021, 152, 104389. [Google Scholar] [CrossRef]
  31. Sanchez, J.M.; Ducastelle, F.; Gratias, D. Generalized cluster description of multicomponent systems. Phys. A 1984, 128, 334–350. [Google Scholar] [CrossRef]
  32. Daw, M.S.; Baskes, M.I. Embedded-atom method: Derivation and application to impurities, surfaces, and other defects in metals. Phys. Rev. B 1984, 29, 6443. [Google Scholar] [CrossRef]
  33. Kohn, W.; Sham, L.J. Self-consistent equations including exchange and correlation effects. Phys. Rev. 1965, 140, A1133. [Google Scholar] [CrossRef]
  34. Hollingsworth, S.A.; Dror, R.O. Molecular dynamics simulation for all. Neuron 2018, 99, 1129–1143. [Google Scholar] [CrossRef] [PubMed]
  35. Chang, Y.A.; Chen, S.; Zhang, F.; Yan, X.; Xie, F.; Schmid-Fetzer, R.; Oates, W.A. Phase diagram calculation: Past, present and future. Prog. Mater. Sci. 2004, 49, 313–345. [Google Scholar] [CrossRef]
  36. Xie, J.; Su, Y.; Zhang, D.; Feng, Q. A vision of materials genome engineering in China. Engineering 2022, 10, 10–12. [Google Scholar] [CrossRef]
  37. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
  38. Senior, A.W.; Evans, R.; Jumper, J.; Kirkpatrick, J.; Sifre, L.; Green, T.; Qin, C.; Žídek, A.; Nelson, A.W.; Bridgland, A.; et al. Improved protein structure prediction using potentials from deep learning. Nature 2020, 577, 706–710. [Google Scholar] [CrossRef]
  39. Litjens, G.; Kooi, T.; Bejnordi, B.E.; Setio, A.A.A.; Ciompi, F.; Ghafoorian, M.; Van Der Laak, J.A.; Van Ginneken, B.; Sánchez, C.I. A survey on deep learning in medical image analysis. Med. Image Anal. 2017, 42, 60–88. [Google Scholar] [CrossRef]
  40. Radovic, A.; Williams, M.; Rousseau, D.; Kagan, M.; Bonacorsi, D.; Himmel, A.; Aurisano, A.; Terao, K.; Wongjirad, T. Machine learning at the energy and intensity frontiers of particle physics. Nature 2018, 560, 41–48. [Google Scholar] [CrossRef]
  41. Hezaveh, Y.D.; Levasseur, L.P.; Marshall, P.J. Fast automated analysis of strong gravitational lenses with convolutional neural networks. Nature 2017, 548, 555–557. [Google Scholar] [CrossRef] [PubMed]
  42. Park, C.W.; Kornbluth, M.; Vandermause, J.; Wolverton, C.; Kozinsky, B.; Mailoa, J.P. Accurate and scalable graph neural network force field and molecular dynamics with direct force architecture. npj Comput. Mater. 2021, 7, 73. [Google Scholar] [CrossRef]
  43. Nyshadham, C.; Rupp, M.; Bekker, B.; Shapeev, A.V.; Mueller, T.; Rosenbrock, C.W.; Csányi, G.; Wingate, D.W.; Hart, G.L. Machine-learned multi-system surrogate models for materials prediction. npj Comput. Mater. 2019, 5, 51. [Google Scholar] [CrossRef]
  44. Rosenbrock, C.W.; Gubaev, K.; Shapeev, A.V.; Pártay, L.B.; Bernstein, N.; Csányi, G.; Hart, G.L. Machine-learned interatomic potentials for alloys and alloy phase diagrams. npj Comput. Mater. 2021, 7, 24. [Google Scholar] [CrossRef]
  45. Jia, W.; Wang, H.; Chen, M.; Lu, D.; Lin, L.; Car, R.; Weinan, E.; Zhang, L. Pushing the limit of molecular dynamics with ab initio accuracy to 100 million atoms with machine learning. In Proceedings of the SC20: International Conference for High Performance Computing, Networking, Storage and Analysis, Virtual, 9–19 November 2020; IEEE: New York, NY, USA, 2020; pp. 1–14. [Google Scholar]
  46. Deringer, V.L.; Bernstein, N.; Csányi, G.; Ben Mahmoud, C.; Ceriotti, M.; Wilson, M.; Drabold, D.A.; Elliott, S.R. Origins of structural and electronic transitions in disordered silicon. Nature 2021, 589, 59–64. [Google Scholar] [CrossRef]
  47. Yin, S.; Zuo, Y.; Abu-Odeh, A.; Zheng, H.; Li, X.G.; Ding, J.; Ong, S.P.; Asta, M.; Ritchie, R.O. Atomistic simulations of dislocation mobility in refractory high-entropy alloys and the effect of chemical short-range order. Nat. Commun. 2021, 12, 4873. [Google Scholar] [CrossRef]
  48. Li, X.G.; Chen, C.; Zheng, H.; Zuo, Y.; Ong, S.P. Complex strengthening mechanisms in the NbMoTaW multi-principal element alloy. npj Comput. Mater. 2020, 6, 70. [Google Scholar] [CrossRef]
  49. Schmidt, J.; Marques, M.R.; Botti, S.; Marques, M.A. Recent advances and applications of machine learning in solid-state materials science. npj Comput. Mater 2019, 5, 83. [Google Scholar] [CrossRef]
  50. Hart, G.L.; Mueller, T.; Toher, C.; Curtarolo, S. Machine learning for alloys. Nat. Rev. Mater. 2021, 6, 730–755. [Google Scholar] [CrossRef]
  51. Pei, Z.; Yin, J.; Hawk, J.A.; Alman, D.E.; Gao, M.C. Machine-learning informed prediction of high-entropy solid solution formation: Beyond the Hume-Rothery rules. npj Comput. Mater. 2020, 6, 50. [Google Scholar] [CrossRef]
  52. Rickman, J.; Chan, H.; Harmer, M.; Smeltzer, J.; Marvel, C.; Roy, A.; Balasubramanian, G. Materials informatics for the screening of multi-principal elements and high-entropy alloys. Nat. Commun. 2019, 10, 2618. [Google Scholar] [CrossRef]
  53. Zhang, J.; Liu, X.; Bi, S.; Yin, J.; Zhang, G.; Eisenbach, M. Robust data-driven approach for predicting the configurational energy of high entropy alloys. Mater. Design. 2020, 185, 108247. [Google Scholar] [CrossRef]
  54. Liu, X.; Zhang, J.; Yin, J.; Bi, S.; Eisenbach, M.; Wang, Y. Monte Carlo simulation of order-disorder transition in refractory high entropy alloys: A data-driven approach. Comput. Mater. Sci. 2021, 187, 110135. [Google Scholar] [CrossRef]
  55. Yin, J.; Pei, Z.; Gao, M.C. Neural network-based order parameter for phase transitions and its applications in high-entropy alloys. Nat. Comput. Sci. 2021, 1, 686–693. [Google Scholar] [CrossRef] [PubMed]
  56. Yan, Y.; Lu, D.; Wang, K. Accelerated discovery of single-phase refractory high entropy alloys assisted by machine learning. Comput. Mater. Sci. 2021, 199, 110723. [Google Scholar] [CrossRef]
  57. Ha, M.Q.; Nguyen, D.N.; Nguyen, V.C.; Nagata, T.; Chikyow, T.; Kino, H.; Miyake, T.; Denœux, T.; Huynh, V.N.; Dam, H.C. Evidence-based recommender system for high-entropy alloys. Nat. Comput. Sci. 2021, 1, 470–478. [Google Scholar] [CrossRef] [PubMed]
  58. Singh, R.; Sharma, A.; Singh, P.; Balasubramanian, G.; Johnson, D.D. Accelerating computational modeling and design of high-entropy alloys. Nat. Comput. Sci. 2021, 1, 54–61. [Google Scholar] [CrossRef] [PubMed]
  59. Rao, Z.; Tung, P.Y.; Xie, R.; Wei, Y.; Zhang, H.; Ferrari, A.; Klaver, T.; Körmann, F.; Sukumar, P.T.; Kwiatkowski da Silva, A.; et al. Machine learning–enabled high-entropy alloy discovery. Science 2022, 378, 78–85. [Google Scholar] [CrossRef]
  60. Tran, N.D.; Saengdeejing, A.; Suzuki, K.; Miura, H.; Chen, Y. Stability and thermodynamics properties of CrFeNiCoMn/Pd high entropy alloys from first principles. J. Phase Equilib. Diffus. 2021, 42, 606–616. [Google Scholar] [CrossRef]
  61. Liu, X.; Zhang, J.; Eisenbach, M.; Wang, Y. Machine learning modeling of high entropy alloy: The role of short-range order. arXiv 2019, arXiv:1906.02889. [Google Scholar]
  62. Saal, J.E.; Kirklin, S.; Aykol, M.; Meredig, B.; Wolverton, C. Materials design and discovery with high-throughput density functional theory: The open quantum materials database (OQMD). JOM 2013, 65, 1501–1509. [Google Scholar] [CrossRef]
  63. Jain, A.; Ong, S.P.; Hautier, G.; Chen, W.; Richards, W.D.; Dacek, S.; Cholia, S.; Gunter, D.; Skinner, D.; Ceder, G.; et al. Commentary: The Materials Project: A materials genome approach to accelerating materials innovation. APL Mater. 2013, 1, 011002. [Google Scholar] [CrossRef]
  64. Curtarolo, S.; Setyawan, W.; Hart, G.L.; Jahnatek, M.; Chepulskii, R.V.; Taylor, R.H.; Wang, S.; Xue, J.; Yang, K.; Levy, O.; et al. AFLOW: An automatic framework for high-throughput materials discovery. Comput. Mater. Sci. 2012, 58, 218–226. [Google Scholar] [CrossRef]
  65. Talirz, L.; Kumbhar, S.; Passaro, E.; Yakutovich, A.V.; Granata, V.; Gargiulo, F.; Borelli, M.; Uhrin, M.; Huber, S.P.; Zoupanos, S.; et al. Materials Cloud, a platform for open computational science. Sci. Data 2020, 7, 299. [Google Scholar] [CrossRef]
66. Villars, P.; Berndt, M.; Brandenburg, K.; Cenzual, K.; Daams, J.; Hulliger, F.; Massalski, T.; Okamoto, H.; Osaki, K.; Prince, A.; et al. The Pauling File. J. Alloys Compd. 2004, 367, 293–297. [Google Scholar] [CrossRef]
  67. Zakutayev, A.; Wunder, N.; Schwarting, M.; Perkins, J.D.; White, R.; Munch, K.; Tumas, W.; Phillips, C. An open experimental database for exploring inorganic materials. Sci. Data 2018, 5, 180053. [Google Scholar] [CrossRef]
  68. Soedarmadji, E.; Stein, H.S.; Suram, S.K.; Guevarra, D.; Gregoire, J.M. Tracking materials science data lineage to manage millions of materials experiments and analyses. npj Comput. Mater. 2019, 5, 79. [Google Scholar] [CrossRef]
  69. Pei, Z.; Yin, J. Machine learning as a contributor to physics: Understanding Mg alloys. Mater. Design 2019, 172, 107759. [Google Scholar] [CrossRef]
  70. Borg, C.K.; Frey, C.; Moh, J.; Pollock, T.M.; Gorsse, S.; Miracle, D.B.; Senkov, O.N.; Meredig, B.; Saal, J.E. Expanded dataset of mechanical properties and observed phases of multi-principal element alloys. Sci. Data 2020, 7, 430. [Google Scholar] [CrossRef]
  71. Couzinié, J.P.; Senkov, O.; Miracle, D.; Dirras, G. Comprehensive data compilation on the mechanical properties of refractory high-entropy alloys. Data Brief 2018, 21, 1622–1641. [Google Scholar] [CrossRef]
  72. Gorsse, S.; Nguyen, M.; Senkov, O.N.; Miracle, D.B. Database on the mechanical properties of high entropy alloys and complex concentrated alloys. Data Brief 2018, 21, 2664–2678. [Google Scholar] [CrossRef] [PubMed]
  73. Gao, M.C.; Zhang, C.; Gao, P.; Zhang, F.; Ouyang, L.; Widom, M.; Hawk, J.A. Thermodynamics of concentrated solid solution alloys. Curr. Opin. Solid State Mater. Sci. 2017, 21, 238–251. [Google Scholar] [CrossRef]
  74. Kube, S.A.; Sohn, S.; Uhl, D.; Datye, A.; Mehta, A.; Schroers, J. Phase selection motifs in High Entropy Alloys revealed through combinatorial methods: Large atomic size difference favors BCC over FCC. Acta Mater. 2019, 166, 677–686. [Google Scholar] [CrossRef]
  75. Feng, S.; Zhou, H.; Dong, H. Application of deep transfer learning to predicting crystal structures of inorganic substances. Comput. Mater. Sci. 2021, 195, 110476. [Google Scholar] [CrossRef]
  76. Zhang, J.; Cai, C.; Kim, G.; Wang, Y.; Chen, W. Composition design of high-entropy alloys with deep sets learning. npj Comput. Mater. 2022, 8, 89. [Google Scholar] [CrossRef]
  77. Dai, B.; Shen, X.; Wang, J. Embedding learning. J. Am. Stat. Assoc. 2022, 117, 307–319. [Google Scholar] [CrossRef]
  78. Roy, A.; Balasubramanian, G. Predictive descriptors in machine learning and data-enabled explorations of high-entropy alloys. Comput. Mater. Sci. 2021, 193, 110381. [Google Scholar] [CrossRef]
  79. Zhang, Y.; Wen, C.; Wang, C.; Antonov, S.; Xue, D.; Bai, Y.; Su, Y. Phase prediction in high entropy alloys with a rational selection of materials descriptors and machine learning models. Acta Mater. 2020, 185, 528–539. [Google Scholar] [CrossRef]
  80. Ouyang, R.; Curtarolo, S.; Ahmetcik, E.; Scheffler, M.; Ghiringhelli, L.M. SISSO: A compressed-sensing method for identifying the best low-dimensional descriptor in an immensity of offered candidates. Phys. Rev. Mater. 2018, 2, 083802. [Google Scholar] [CrossRef]
  81. Zhao, S.; Yuan, R.; Liao, W.; Zhao, Y.; Wang, J.; Li, J.; Lookman, T. Descriptors for phase prediction of high entropy alloys using interpretable machine learning. J. Mater. Chem. A 2024, 12, 2807–2819. [Google Scholar] [CrossRef]
  82. Liu, J.; Wang, P.; Luan, J.; Chen, J.; Cai, P.; Chen, J.; Lu, X.; Fan, Y.; Yu, Z.; Chou, K. VASE: A High-Entropy Alloy Short-Range Order Structural Descriptor for Machine Learning. J. Chem. Theory Comput. 2024. [Google Scholar] [CrossRef] [PubMed]
83. Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T.Y. LightGBM: A highly efficient gradient boosting decision tree. Proc. Adv. Neural Inf. Process. Syst. 2017, 30, 52. [Google Scholar]
84. Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
85. Loh, W.Y. Classification and regression trees. WIREs Data Min. Knowl. Discov. 2011, 1, 14–23. [Google Scholar] [CrossRef]
  86. Peterson, L.E. K-nearest neighbor. Scholarpedia 2009, 4, 1883. [Google Scholar] [CrossRef]
87. Goodfellow, I.; Bengio, Y.; Courville, A. Convolutional networks. In Deep Learning; MIT Press: Cambridge, MA, USA, 2016; pp. 330–372. [Google Scholar]
  88. Schuster, M.; Paliwal, K.K. Bidirectional recurrent neural networks. IEEE Trans. Signal Process. 1997, 45, 2673–2681. [Google Scholar] [CrossRef]
89. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. In Advances in Neural Information Processing Systems; MIT Press: Cambridge, MA, USA, 2017; pp. 5998–6008. [Google Scholar]
  90. Wen, C.; Zhang, Y.; Wang, C.; Xue, D.; Bai, Y.; Antonov, S.; Dai, L.; Lookman, T.; Su, Y. Machine learning assisted design of high entropy alloys with desired property. Acta Mater. 2019, 170, 109–117. [Google Scholar] [CrossRef]
  91. Mehta, P.; Bukov, M.; Wang, C.H.; Day, A.G.; Richardson, C.; Fisher, C.K.; Schwab, D.J. A high-bias, low-variance introduction to machine learning for physicists. Phys. Rep. 2019, 810, 1–124. [Google Scholar] [CrossRef]
  92. Reiser, P.; Neubert, M.; Eberhard, A.; Torresi, L.; Zhou, C.; Shao, C.; Metni, H.; van Hoesel, C.; Schopmans, H.; Sommer, T.; et al. Graph neural networks for materials science and chemistry. Commun. Mater. 2022, 3, 93. [Google Scholar] [CrossRef] [PubMed]
  93. Schütt, K.T.; Sauceda, H.E.; Kindermans, P.J.; Tkatchenko, A.; Müller, K.R. Schnet–a deep learning architecture for molecules and materials. J. Chem. Phys. 2018, 148, 241722. [Google Scholar] [CrossRef] [PubMed]
  94. Xie, T.; Grossman, J.C. Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties. Phys. Rev. Lett. 2018, 120, 145301. [Google Scholar] [CrossRef]
  95. Chen, C.; Ye, W.; Zuo, Y.; Zheng, C.; Ong, S.P. Graph networks as a universal machine learning framework for molecules and crystals. Chem. Mater. 2019, 31, 3564–3572. [Google Scholar] [CrossRef]
  96. Ghouchan Nezhad Noor Nia, R.; Jalali, M.; Houshmand, M. A Graph-Based k-Nearest Neighbor (KNN) Approach for Predicting Phases in High-Entropy Alloys. Appl. Sci. 2022, 12, 8021. [Google Scholar] [CrossRef]
  97. Wang, X.; Tran, N.D.; Zeng, S.; Hou, C.; Chen, Y.; Ni, J. Element-wise representations with ECNet for material property prediction and applications in high-entropy alloys. npj Comput. Mater. 2022, 8, 253. [Google Scholar] [CrossRef]
  98. Zhang, H.; Huang, R.; Chen, J.; Rondinelli, J.M.; Chen, W. Do Graph Neural Networks Work for High Entropy Alloys? arXiv 2024, arXiv:2408.16337. [Google Scholar]
  99. Dong, H.; Shi, Y.; Ying, P.; Xu, K.; Liang, T.; Wang, Y.; Zeng, Z.; Wu, X.; Zhou, W.; Xiong, S.; et al. Molecular dynamics simulations of heat transport using machine-learned potentials: A mini-review and tutorial on GPUMD with neuroevolution potentials. J. Appl. Phys. 2024, 135, 161101. [Google Scholar] [CrossRef]
  100. Behler, J. Constructing high-dimensional neural network potentials: A tutorial review. Int. J. Quantum Chem. 2015, 115, 1032–1050. [Google Scholar] [CrossRef]
  101. Bartók, A.P.; Payne, M.C.; Kondor, R.; Csányi, G. Gaussian approximation potentials: The accuracy of quantum mechanics, without the electrons. Phys. Rev. Lett. 2010, 104, 136403. [Google Scholar] [CrossRef]
  102. Thompson, A.P.; Swiler, L.P.; Trott, C.R.; Foiles, S.M.; Tucker, G.J. Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials. J. Comput. Phys. 2015, 285, 316–330. [Google Scholar] [CrossRef]
  103. Behler, J.; Parrinello, M. Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys. Rev. Lett. 2007, 98, 146401. [Google Scholar] [CrossRef] [PubMed]
  104. Wang, H.; Zhang, L.; Han, J.; Weinan, E. DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics. Comput. Phys. Commun. 2018, 228, 178–184. [Google Scholar] [CrossRef]
  105. Drautz, R. Atomic cluster expansion for accurate and transferable interatomic potentials. Phys. Rev. B 2019, 99, 014104. [Google Scholar] [CrossRef]
  106. Shapeev, A.V. Moment tensor potentials: A class of systematically improvable interatomic potentials. Multiscale Model. Sim. 2016, 14, 1153–1173. [Google Scholar] [CrossRef]
  107. Fan, Z.; Zeng, Z.; Zhang, C.; Wang, Y.; Song, K.; Dong, H.; Chen, Y.; Ala-Nissila, T. Neuroevolution machine learning potentials: Combining high accuracy and low cost in atomistic simulations and application to heat transport. Phys. Rev. B 2021, 104, 104309. [Google Scholar] [CrossRef]
  108. Fan, Z.; Wang, Y.; Ying, P.; Song, K.; Wang, J.; Wang, Y.; Zeng, Z.; Xu, K.; Lindgren, E.; Rahm, J.M.; et al. GPUMD: A package for constructing accurate machine-learned potentials and performing highly efficient atomistic simulations. J. Chem. Phys. 2022, 157, 114801. [Google Scholar] [CrossRef]
109. Mirzoev, A.; Gelchinski, B.; Rempel, A. Neural network prediction of interatomic interaction in multielement substances and high-entropy alloys: A review. Dokl. Phys. Chem. 2022, 504, 51–77. [Google Scholar]
  110. Kostiuchenko, T.; Körmann, F.; Neugebauer, J.; Shapeev, A. Impact of lattice relaxations on phase transitions in a high-entropy alloy studied by machine-learning potentials. npj Comput. Mater. 2019, 5, 55. [Google Scholar] [CrossRef]
  111. Körmann, F.; Kostiuchenko, T.; Shapeev, A.; Neugebauer, J. B2 ordering in body-centered-cubic AlNbTiV refractory high-entropy alloys. Phys. Rev. Mater. 2021, 5, 053803. [Google Scholar] [CrossRef]
  112. Byggmästar, J.; Nordlund, K.; Djurabekova, F. Modeling refractory high-entropy alloys with efficient machine-learned interatomic potentials: Defects and segregation. Phys. Rev. B 2021, 104, 104101. [Google Scholar] [CrossRef]
  113. Gubaev, K.; Ikeda, Y.; Tasnádi, F.; Neugebauer, J.; Shapeev, A.V.; Grabowski, B.; Körmann, F. Finite-temperature interplay of structural stability, chemical complexity, and elastic properties of bcc multicomponent alloys from ab initio trained machine-learning potentials. Phys. Rev. Mater. 2021, 5, 073801. [Google Scholar] [CrossRef]
  114. Pandey, A.; Gigax, J.; Pokharel, R. Machine learning interatomic potential for high-throughput screening of high-entropy alloys. JOM 2022, 74, 2908–2920. [Google Scholar] [CrossRef]
  115. Song, K.; Zhao, R.; Liu, J.; Wang, Y.; Lindgren, E.; Wang, Y.; Chen, S.; Xu, K.; Liang, T.; Ying, P.; et al. General-purpose machine-learned potential for 16 elemental metals and their alloys. Nat. Commun. 2024, 15, 10208. [Google Scholar] [CrossRef]
  116. Wu, L.; Li, T. A machine learning interatomic potential for high entropy alloys. J. Mech. Phys. Solids 2024, 187, 105639. [Google Scholar] [CrossRef]
  117. Ferrari, A.; Körmann, F.; Asta, M.; Neugebauer, J. Simulating short-range order in compositionally complex materials. Nat. Comput. Sci. 2023, 3, 221–229. [Google Scholar] [CrossRef] [PubMed]
  118. Chen, S.; Jin, X.; Zhao, W.; Li, T. Intricate short-range order in GeSn alloys revealed by atomistic simulations with highly accurate and efficient machine-learning potentials. Phys. Rev. Mater. 2024, 8, 043805. [Google Scholar] [CrossRef]
  119. Jordan, M.I.; Mitchell, T.M. Machine learning: Trends, perspectives, and prospects. Science 2015, 349, 255–260. [Google Scholar] [CrossRef]
  120. Mizutani, U. The Hume-Rothery rules for structurally complex alloy phases. In Surface Properties and Engineering of Complex Intermetallics; World Scientific: Singapore, 2010; pp. 323–399. [Google Scholar]
121. Lundberg, S.M.; Lee, S.I. A unified approach to interpreting model predictions. arXiv 2017, arXiv:1705.07874. [Google Scholar]
122. Van der Maaten, L.; Hinton, G. Visualizing data using t-SNE. J. Mach. Learn. Res. 2008, 9, 2579–2605. [Google Scholar]
  123. Lee, S.Y.; Byeon, S.; Kim, H.S.; Jin, H.; Lee, S. Deep learning-based phase prediction of high-entropy alloys: Optimization, generation, and explanation. Mater. Design 2021, 197, 109260. [Google Scholar] [CrossRef]
  124. Lee, K.; Ayyasamy, M.V.; Delsa, P.; Hartnett, T.Q.; Balachandran, P.V. Phase classification of multi-principal element alloys via interpretable machine learning. npj Comput. Mater. 2022, 8, 25. [Google Scholar] [CrossRef]
  125. Wong, T.T.; Yang, N.Y. Dependency analysis of accuracy estimates in k-fold cross validation. IEEE Trans. Knowl. Data Eng. 2017, 29, 2417–2427. [Google Scholar] [CrossRef]
  126. Oh, S.H.V.; Yoo, S.H.; Jang, W. Small dataset machine-learning approach for efficient design space exploration: Engineering ZnTe-based high-entropy alloys for water splitting. npj Comput. Mater. 2024, 10, 166. [Google Scholar] [CrossRef]
  127. Li, Z.; Nash, W.; O’Brien, S.; Qiu, Y.; Gupta, R.; Birbilis, N. cardiGAN: A generative adversarial network model for design and discovery of multi principal element alloys. J. Mater. Sci. Technol. 2022, 125, 81–96. [Google Scholar] [CrossRef]
  128. Chen, Z.; Shang, Y.; Liu, X.; Yang, Y. Accelerated discovery of eutectic compositionally complex alloys by generative machine learning. npj Comput. Mater. 2024, 10, 204. [Google Scholar] [CrossRef]
  129. Harshvardhan, G.; Gourisaria, M.K.; Pandey, M.; Rautaray, S.S. A comprehensive survey and analysis of generative models in machine learning. Comput. Sci. Rev. 2020, 38, 100285. [Google Scholar]
  130. Fuhr, A.S.; Sumpter, B.G. Deep generative models for materials discovery and machine learning-accelerated innovation. Front. Mater. 2022, 9, 865270. [Google Scholar] [CrossRef]
  131. Zhou, Z.; Shang, Y.; Liu, X.; Yang, Y. A generative deep learning framework for inverse design of compositionally complex bulk metallic glasses. npj Comput. Mater. 2023, 9, 15. [Google Scholar] [CrossRef]
  132. Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial networks. Commun. ACM 2020, 63, 139–144. [Google Scholar] [CrossRef]
  133. Pinheiro Cinelli, L.; Araújo Marins, M.; Barros da Silva, E.A.; Lima Netto, S. Variational autoencoder. In Variational Methods for Machine Learning with Applications to Deep Networks; Springer: Berlin/Heidelberg, Germany, 2021; pp. 111–149. [Google Scholar]
  134. Rezende, D.; Mohamed, S. Variational inference with normalizing flows. In Proceedings of the International Conference on Machine Learning, Lille, France, 6–11 July 2015; PMLR: New York, NY, USA, 2015; pp. 1530–1538. [Google Scholar]
135. Yang, L.; Zhang, Z.; Song, Y.; Hong, S.; Xu, R.; Zhao, Y.; Zhang, W.; Cui, B.; Yang, M.H. Diffusion models: A comprehensive survey of methods and applications. ACM Comput. Surv. 2023, 56, 1–39. [Google Scholar] [CrossRef]
  136. Ye, Y.; Li, Y.; Ouyang, R.; Zhang, Z.; Tang, Y.; Bai, S. Improving machine learning based phase and hardness prediction of high-entropy alloys by using Gaussian noise augmented data. Comput. Mater. Sci. 2023, 223, 112140. [Google Scholar] [CrossRef]
  137. Chen, C.; Zhou, H.; Long, W.; Wang, G.; Ren, J. Phase prediction for high-entropy alloys using generative adversarial network and active learning based on small datasets. Sci. China Technol. Sci. 2023, 66, 3615–3627. [Google Scholar] [CrossRef]
  138. Pan, S.J.; Yang, Q. A survey on transfer learning. IEEE Trans. Knowl. Data Eng. 2009, 22, 1345–1359. [Google Scholar] [CrossRef]
Figure 1. HEA phase classification using SISSO descriptors as coordinates in Ref. [81]: (a) classification of crystalline and amorphous (AM) alloys; (b) classification of intermetallic (IM) and solid solution (SS); (c) classification of single-phase (BCC or FCC) and multi-phase (BCC and FCC); (d) classification of BCC and FCC.
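To make the idea concrete, the following minimal sketch (not the authors' code; the descriptor values and phase labels are synthetic placeholders for the SISSO-selected formulas of Ref. [81]) fits a linear classifier that separates two phases in a two-dimensional descriptor plane:

```python
# Minimal sketch: two scalar descriptors as coordinates for a linear phase
# classifier. All data below are synthetic, not values from Ref. [81].
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                  # synthetic (d1, d2) descriptor pairs
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic labels, e.g., 0 = FCC, 1 = BCC

clf = LogisticRegression().fit(X, y)
print("training accuracy:", clf.score(X, y))
# The fitted boundary w1*d1 + w2*d2 + b = 0 is the line separating the two
# phase regions in the descriptor plane, as in the panels of Figure 1.
print("coefficients:", clf.coef_, "intercept:", clf.intercept_)
```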
Figure 2. Correlation and similarity between HEA materials obtained using the HEA interaction network in Ref. [96].
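A hedged sketch of the underlying graph construction, assuming only that each alloy is connected to its k most similar neighbors by composition (the composition vectors below are random placeholders, not the network of Ref. [96]):

```python
# Sketch: build a k-nearest-neighbor similarity graph over alloys from the
# cosine similarity of their composition vectors. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
compositions = rng.random((6, 5))                    # 6 alloys, 5 elemental fractions
compositions /= compositions.sum(axis=1, keepdims=True)

norms = np.linalg.norm(compositions, axis=1)
sim = compositions @ compositions.T / np.outer(norms, norms)
np.fill_diagonal(sim, -np.inf)                       # exclude self-similarity

k = 2
neighbors = np.argsort(sim, axis=1)[:, -k:]          # k most similar alloys per node
for i, nbrs in enumerate(neighbors):
    print(f"alloy {i} -> neighbors {nbrs.tolist()}")
```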
Figure 3. The framework of the ECNet model in Ref. [97]: The embedding layer encodes the initial inputs derived from the atomic numbers. The interaction block applies a series of neural networks to transform the crystal structures into atomic attributes. The elemental convolution operation averages the atom-wise features grouped by atomic element type.
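The elemental convolution step in the caption reduces to a grouped mean. A minimal sketch, assuming per-atom feature vectors and an element-type index per atom (shapes are illustrative, not the actual ECNet implementation):

```python
# Sketch of elemental convolution: average atom-wise features by element type.
import torch

n_atoms, n_features = 8, 4
atom_features = torch.randn(n_atoms, n_features)     # per-atom feature vectors
atom_types = torch.tensor([0, 0, 1, 1, 1, 2, 2, 2])  # element index of each atom

element_features = []
for t in atom_types.unique():
    # Mean of the features of all atoms belonging to element type t
    element_features.append(atom_features[atom_types == t].mean(dim=0))
element_features = torch.stack(element_features)     # (n_element_types, n_features)
print(element_features.shape)                        # torch.Size([3, 4])
```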
Figure 4. Ternary diagrams predicted by the ECNet in Ref. [97]: (a) formation free energies in the FeCoMn system; (b) magnetic moments in the FeCoMn system; (c) formation free energies in the FeCoPd system; (d) magnetic moments in the FeCoPd system. Areas I, II, and III represent alloy stability from high to low.
Figure 5. Feature visualization using the t-SNE algorithm in Ref. [123]. The model in Ref. [123] contains five hidden layers. The original model inputs and the outputs of each of the five hidden layers are projected into two-dimensional distributions using the t-SNE algorithm. Point color denotes the alloy phase: blue, solid solution; red, mixed solid solution and intermetallic; green, intermetallic; yellow, amorphous.
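A minimal sketch of this kind of visualization, with random activations standing in for the real hidden-layer outputs of Ref. [123]:

```python
# Sketch: project a hidden-layer activation matrix to 2D with t-SNE and
# color the points by phase label. Activations and labels are synthetic.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(2)
hidden_activations = rng.normal(size=(300, 64))   # 300 alloys, 64-d hidden layer
phase_labels = rng.integers(0, 4, size=300)       # e.g., SS / SS+IM / IM / AM

embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(
    hidden_activations
)
print(embedding.shape)  # (300, 2); scatter-plot these points, colored by phase_labels
```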
Figure 6. Analysis of model interpretation in Ref. [124]: (a) the contributions of descriptors to the BCC phase in the NbTaTiV system; (b) phase predictions as a single descriptor is varied. Line colors denote phase information: blue, mixed phases; violet, AM; cyan, FCC; orange, BCC + FCC; light blue, HCP; red, BCC; green, IM.
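A hedged sketch of per-descriptor contribution analysis in this spirit; permutation importance is used here as a simple stand-in for the interpretability method of Ref. [124], with synthetic descriptors and labels:

```python
# Sketch: rank descriptors by permutation importance for a phase classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 5))                # 5 hypothetical descriptors
y = (X[:, 0] - X[:, 2] > 0).astype(int)      # synthetic BCC / non-BCC label

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
# Larger mean importance = the phase prediction depends more on that descriptor
for i, imp in enumerate(result.importances_mean):
    print(f"descriptor {i}: importance = {imp:.3f}")
```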
Figure 7. Algorithm process and generated results in Ref. [128]: a VAE and an ANN work together to generate effective eutectic alloy candidates; candidate eutectic compositions produced by machine learning are then verified experimentally.
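A minimal sketch of the VAE half of such a framework, assuming alloy compositions are represented as vectors of elemental fractions (architecture sizes are illustrative, not those of Ref. [128]):

```python
# Sketch: a variational autoencoder over composition vectors. Sampling the
# latent space and decoding yields new candidate compositions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CompositionVAE(nn.Module):
    def __init__(self, n_elements=8, latent_dim=2):
        super().__init__()
        self.encoder = nn.Linear(n_elements, 16)
        self.mu = nn.Linear(16, latent_dim)
        self.logvar = nn.Linear(16, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 16), nn.ReLU(), nn.Linear(16, n_elements)
        )

    def forward(self, x):
        h = F.relu(self.encoder(x))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        # Softmax keeps decoded compositions positive and summing to one
        return F.softmax(self.decoder(z), dim=-1), mu, logvar

vae = CompositionVAE()
x = torch.rand(4, 8)
x = x / x.sum(dim=1, keepdim=True)         # 4 alloys, 8 elemental fractions
recon, mu, logvar = vae(x)
kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
loss = F.mse_loss(recon, x) + 1e-3 * kl    # reconstruction + KL regularizer
print(float(loss))
```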
Figure 8. Algorithm process in Ref. [75]: (a) mapping the chemical formula of a material to a two-dimensional representation based on the structure of the periodic table; (b) network structure of the model, comprising a transferable feature extractor and a separately trained regressor or classifier.
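A minimal sketch of this transfer strategy, assuming a periodic-table-shaped input of 9 rows by 18 columns (layer sizes are placeholders, not the architecture of Ref. [75]): the pretrained extractor is frozen and only a new task head is trained.

```python
# Sketch: freeze a pretrained feature extractor, train a fresh head.
import torch
import torch.nn as nn

feature_extractor = nn.Sequential(      # stands in for the pretrained extractor
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.Flatten()
)
for p in feature_extractor.parameters():
    p.requires_grad = False             # freeze the transferred weights

head = nn.Linear(8 * 9 * 18, 1)         # separately trained regressor head
model = nn.Sequential(feature_extractor, head)

x = torch.rand(4, 1, 9, 18)             # batch of 4 periodic-table "images"
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss = nn.functional.mse_loss(model(x), torch.rand(4, 1))
loss.backward()
optimizer.step()                        # only the head's weights change
print(float(loss))
```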
Table 1. RMSE for predicting the hardness of HEAs on the test set for different models.

Model     SVM    KNN    ANN
RMSE *    82     69     65

* RMSE data are from Ref. [90].
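A hedged sketch of how such a comparison can be run, with synthetic features and hardness values standing in for the dataset of Ref. [90]:

```python
# Sketch: compare SVM, KNN, and ANN regressors by test-set RMSE.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 6))                                      # 6 hypothetical descriptors
y = 100 * X[:, 0] - 50 * X[:, 1] + rng.normal(scale=20, size=300)  # synthetic "hardness"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "SVM": SVR(),
    "KNN": KNeighborsRegressor(),
    "ANN": MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: RMSE = {rmse:.1f}")
```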
Table 2. RMSE for predicting the hardness of HEAs on the test set and validation set with Gaussian noise as data augmentation. The test set is split from the original dataset; the validation set contains data outside the original dataset that the model has never seen.

Augmented Data                           RMSE in Test Set *    RMSE in Validation Set *
Raw Data                                 58.1                  44.4
2 × Raw Data                             42.8                  40.5
Low noise enhanced                       42.8                  40.1
Middle noise enhanced                    43.2                  39.6
High noise enhanced                      43.7                  40.0
3 × Raw Data                             39.0                  41.5
Low + middle noise enhanced              39.8                  41.0
Low + high noise enhanced                40.0                  40.7
Middle + high noise enhanced             40.9                  40.5
4 × Raw Data                             30.2                  43.1
Low + middle + high noise enhanced       31.1                  41.5

* RMSE data are from Ref. [136].
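A minimal sketch of Gaussian-noise augmentation of this kind, assuming tabular descriptors and hardness labels (the noise scales and regressor are illustrative choices, not those of Ref. [136]):

```python
# Sketch: enlarge the training set with noise-perturbed copies of the
# feature rows (labels kept), then compare test RMSE across noise levels.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 6))
y = 80 * X[:, 0] + rng.normal(scale=10, size=200)     # synthetic "hardness"
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def augment(X, y, sigma, rng):
    """Return the data plus one noisy copy: features perturbed, labels kept."""
    X_noisy = X + rng.normal(scale=sigma, size=X.shape)
    return np.vstack([X, X_noisy]), np.concatenate([y, y])

for sigma in (0.01, 0.05, 0.1):                       # low / middle / high noise
    X_aug, y_aug = augment(X_tr, y_tr, sigma, rng)
    model = GradientBoostingRegressor(random_state=0).fit(X_aug, y_aug)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"sigma={sigma}: test RMSE = {rmse:.1f}")
```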