Review

The AI-Driven Transformation in New Materials Manufacturing and the Development of Intelligent Sports

1 College of Physical Education, Chongqing University, Chongqing 401331, China
2 School of Chemistry and Chemical Engineering, Chongqing University, Chongqing 401331, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(10), 5667; https://doi.org/10.3390/app15105667
Submission received: 14 April 2025 / Revised: 14 May 2025 / Accepted: 15 May 2025 / Published: 19 May 2025

Abstract

The advancement of materials science has had a profound, even revolutionary, impact on sports. Materials are used in the sports field, equipment, and sportswear, each with distinct functionality and safety requirements. Additionally, diverse sport-related data require physical devices for collection, analysis, and storage, which can be crucial in athlete selection, performance assessment, strategy planning, and training optimization. Artificial intelligence, with its strong cognitive abilities, learning capacity, large-scale data processing, and adaptability, can effectively enhance efficiency, reduce errors, and lower costs. The integration of advanced materials and artificial intelligence (AI) has significantly enhanced the efficiency and precision of research and development in sports-related technologies, while also facilitating the innovation of training methodologies through intelligent data analytics. This convergence has initiated a transformative phase in the digitalization of the sports industry. Anchored in both theoretical analysis and practical implementation, this study seeks to construct a systematic cognitive framework that elucidates the interrelationship between material science and AI technologies. The aim is to assist sports professionals in understanding and leveraging this technological shift to support strategic decision-making and to foster sustainable, high-quality development within the field.

1. The Evolution of Sports: A Synergy of Historical Development and Technological Innovation

1.1. The Evolution of Sports from a Technology-Driven Perspective: From Material Innovation to Intelligent Integration

As an important manifestation of the development of human civilization, the evolution of sports has always been closely linked to improvements in social productivity and technology [1]. Fundamentally, the changing forms of sport reflect the profound shifts in technological conditions and modes of production of each historical period.
In this historical process, advances in materials science and technology have long been a key force driving the development of sports. Thanks to continuous innovation in materials, sports equipment has not only achieved performance breakthroughs but also become markedly safer, easier to use, and more affordable. This all-round progress provides a solid foundation both for the professionalization of sports and for their popularization among the public.
Carbon fiber materials are widely used in high-performance sports equipment (such as bicycles and tennis rackets) because they optimize the stiffness-to-mass ratio, thereby raising competitive performance. In racing, F1 cars employ high-strength composite materials that preserve a lightweight design while improving crash resistance, enhancing performance and safety at the same time [2]. In 2019, Kipchoge broke the two-hour marathon barrier wearing Nike Vaporfly 4% running shoes, whose ZoomX foam midsole provides roughly 85% energy return [3]. However, their high price deters ordinary runners. Through TPU-PEBA hybrid foaming technology, Li-Ning has substantially reduced the price while maintaining a similar rebound rate, making professional-grade running shoes genuinely accessible to the public.
Second, the application of new materials has forced the system of competition rules to continually adapt to changes in equipment, achieving a dynamic coordination between rules and technology. Swimsuit technology is a case in point: modern swimsuits use highly elastic polymers and superhydrophobic coatings to ensure a snug fit, reduce drag, and enhance performance, the famous "sharkskin" swimsuit being a prime example [4]. This phenomenon of "technical doping" directly prompted FINA to introduce new regulations strictly limiting the material thickness and body coverage of swimsuits.
In terms of industrialization, advances in materials science and technology have significantly lowered the barrier to professional equipment. Sports brands such as Decathlon have popularized affordable, high-performance equipment through industrial chain integration. Meanwhile, the unified equipment standards required by international events such as the Olympic Games and the World Cup play a crucial role in ensuring fair competition; these standards not only safeguard fairness but also set reasonable boundaries for technological innovation.
AI is not only used in material research and development but can also be deeply embedded in sports training systems, forming a dual-driven pattern of “equipment intelligence–training intelligence”, and promoting sports technology to enter a new stage of integration and systematization.
In 2024, Ma et al. [5] successfully developed an AI-powered coaching system for table tennis beginners, combining computer vision and deep learning to analyze players’ movements in real time. The system detects technical errors with 73% accuracy for arm postures and 82% for racket angles, offering three key benefits: it standardizes training to improve efficiency, provides instant feedback for faster skill correction, and serves as a low-cost solution to make table tennis more accessible. This innovation demonstrates AI’s expanding role in sports development. Meanwhile, in consumer fitness, smart platforms like Peloton use personalized algorithms to deliver real-time workout feedback, revolutionizing traditional exercise by eliminating time and space constraints. These systems create a new interactive sports ecosystem that seamlessly connects users, devices, and training environments.
Overall, the coordinated development of artificial intelligence and materials science not only continues the historical logic of technology-driven evolution of sports but also promotes sports to enter a new era of “intelligent collaboration”. In this process, the technical system, organizational model, and value structure are being profoundly reshaped, laying a solid foundation for the continuous innovation of sports and the expansion of its social functions.

1.2. The Evolution of Sports from a Historical Development Perspective: From the Stone Age to the Intelligent Age

In the historical process of sports development, every major breakthrough in materials technology has almost always been accompanied by profound sports changes. As shown in Figure 1 and Figure 2, from ancient times to the present, the evolution of materials has not only changed the forms and rules of sports, but also systematically promoted the evolution of sports systems, culture and industries [6,7].
In primitive society (around 10,000 BC), humans not only trained basic survival skills such as hunting and defense through game-like activities such as throwing stone balls, but also displayed the first rudiments of organized sporting behavior. With the formation of tribal structures and the development of religious rituals, activities rooted in survival needs, such as javelin throwing, archery, and running, gradually transformed into ritualized and competitive practices, including sword dances and collective military training within the tribe. These early forms of sport laid the cultural and structural foundations for the later emergence of team sports, the establishment of rules, and the first rudimentary sports venues.
Entering the Bronze Age [8] (around 5000 BC), the emergence of bronze smelting significantly enhanced the strength and durability of equipment, and sports gear gradually took on its first specialized forms. The extensive use of military equipment such as bronze swords and bronze arrowheads not only improved training outcomes but also gave sports competitions an early degree of standardization and entertainment value. The bronze javelins and discuses widely used in the ancient Greek Olympic Games are important examples of how material progress promoted the institutionalization of sports [9]. During this period, the connection between sports and the military grew ever closer, and the organization of competitive activities and the competitive spirit strengthened steadily.
By the Iron Age (roughly from the 3rd century BC to the 15th century AD), the spread of iron smelting technology brought a dual leap in material properties and sporting practicality. Iron weapons and protective gear not only increased the strength and precision of sports equipment but also pushed sports gear toward standardization and professionalization. In ancient Rome, gladiators used steel weapons and armor in the Colosseum, marking a high level of equipment professionalization in confrontational competition. Meanwhile, chivalric culture gradually emerged in Europe [10], driving the development of sports such as equestrianism and fencing. In East Asia, ironware promoted the popularization and evolution of traditional sports such as wrestling and martial arts. The use of steel materials not only enhanced the safety of competitive sports but also enabled the mass production and standardization of sports equipment, laying a solid foundation for the fairness and entertainment value of events.
Entering the 19th century, as the Industrial Revolution progressed, materials technology developed at an unprecedented pace. The Bessemer converter and open-hearth steelmaking enabled large-scale steel production and triggered a leap in the construction of modern sports infrastructure. The large-scale application of steel structures significantly enhanced the load-bearing capacity and safety of sports venues, making it possible to host major events. For instance, the Crystal Palace built for the 1851 London World's Fair was the first building to adopt a structural form combining cast iron and glass, profoundly influencing later sports architecture. Meanwhile, the invention of vulcanized rubber [11] in 1839 promoted the wide application of highly elastic materials in ball games, boosted the popularization of football, basketball, and other sports, and facilitated the standardization of the relevant competition rules. The emergence of reinforced concrete laid the structural foundation for the modernization of sports venues in the 20th century and ushered in the era of spectator sports and sports industrialization.
The 20th century witnessed a transformative evolution in synthetic materials, marked by the emergence of three major material systems: alloys, inorganic materials, and polymers, which together laid the technological foundation for the modern sports industry. In the domain of metallic materials, the widespread adoption of aluminum and titanium alloys has significantly improved the performance of sports equipment such as golf clubs and tennis rackets. Among inorganic materials, advanced ceramics—with their exceptional hardness and wear resistance—have become essential in the production of durable components and protective gear. Polymer materials have demonstrated even greater versatility, encompassing high-performance polymer fibers that enhance sportswear functionality, polyurethane rubber that has become the standard for athletic track surfaces, and composite materials that achieve an optimal balance between strength and weight. These developments have directly propelled the modernization of professional sports equipment, exemplified by innovations in kayaking and paragliding technologies. A particularly notable milestone is the advancement of semiconductor materials, which has enabled the development of intelligent chip technologies, forming the core infrastructure for the digital transformation of sports. These progressive breakthroughs in materials science have collectively constructed a comprehensive technological system that continues to drive innovation in sports equipment, performance monitoring, and training methodologies.
Entering the 21st century, the deep integration of materials science and information technology has led sports into a new stage of intelligent and precise development. As shown in Table 1, the emergence of functional sensing materials has enabled the combination of kinematics and informatics. Smart materials, bionic materials [12], self-healing materials [13], and neural interface technologies [14] are steadily expanding the boundaries of sports materials. Smart clothing can monitor athletes' heart rate, body temperature, and muscle fatigue in real time [15]. Smart footwear can perform gait analysis to help optimize training strategies [16]. Bionic materials are widely used in sports rehabilitation and human augmentation. The application of 3D printing technology [17] not only significantly improves the efficiency of equipment manufacturing but also enables highly personalized customization, allowing each athlete to obtain equipment tailored to their own physiological characteristics.

2. The Predicament of Material Research and Development and Driving Forces Behind the Rise of Artificial Intelligence

Since the 20th century, the field of materials science has experienced revolutionary breakthroughs, and innovative achievements represented by synthetic materials technology continue to emerge. This technological transition has given birth to a series of new materials that combine versatility with high performance and has established four basic systems of modern material classification: carbon fiber composites, high-molecular polymers, multi-component alloys, and advanced ceramic materials [18]. As shown in Table 2, these four classes of materials differ significantly at the microstructural level; structural parameters such as covalent bond configuration, lattice arrangement, and phase transition characteristics directly determine each material's distinctive advantages and limitations in key performance indicators such as mechanical strength, thermal stability, and corrosion resistance. Together, the four basic materials can be used to manufacture a wide variety of equipment, some of which is listed in Table 3 below.
Although the four traditional basic materials each have their own performance characteristics, to meet the high-performance requirements of modern sports equipment (such as tennis rackets needing to be both lightweight and high-strength, and the midsoles of running shoes needing to achieve a balance between energy feedback and shock absorption), it is often necessary to optimize material performance through complex means such as multiphase interface regulation and nanostructure design.
The performance of materials essentially depends on their multi-level structural characteristics. Take carbon fiber composites as an example [19]. Their outstanding strength-to-weight ratio and fatigue resistance make them key materials in advanced manufacturing, yet minor differences in the manufacturing process can significantly affect the final performance. Unidirectional (UD) layups maximize axial stiffness through the parallel arrangement of fibers and are widely used in linear load-bearing components such as bicycle frames, but their pronounced anisotropy makes them prone to delamination failure. Bidirectional (BD) weaves adopt an orthogonal fiber network that balances mechanical properties along the X and Y axes; their excellent torsional stiffness and interwoven structure effectively disperse impact energy and suppress crack propagation under dynamic loads, making them an ideal choice for sports equipment such as tennis rackets. Multi-directional braiding extends the reinforcing phase into the Z-axis through 3D interweaving; it significantly increases interlaminar shear strength (up to several times that of UD layups) and suits complex geometries such as aerospace connectors, but it also markedly raises manufacturing cost and cycle time. Similarly, the mechanical properties of metallic alloys are closely tied to their fine structure [20]. The differences between the two typical crystal structures, face-centered cubic (fcc) and body-centered cubic (bcc), are particularly significant. In the fcc structure, atoms occupy the corners and face centers of the cube and there are 12 slip systems, giving excellent ductility and low-temperature toughness; fcc alloys are therefore widely used in automotive panels, cryogenic storage tanks, and biomedical implants. In the bcc structure, atoms sit at the corners and the body center of the cube; bcc alloys offer higher strength and wear resistance but poorer low-temperature ductility, making them better suited to heavy-duty bearings, armor plates, and high-strength fasteners. These examples demonstrate that precise regulation of a material's microstructure is the key to optimizing its performance.
Precise control of material properties across scales is the core scientific challenge currently faced. The macroscopic properties of materials are influenced by the multi-scale synergy of electronic structure (quantum scale), crystal structure (atomic scale), and microstructure (mesoscopic scale). To achieve precise regulation of material properties, it is necessary to establish a full-chain control capability ranging from electronic structure calculation, crystal defect engineering to microstructure design, and ensure the mutual compatibility and collaborative optimization of mechanisms at various scales.
However, this process faces fundamental difficulties. From the standpoint of physics, the complex electronic structure of material systems creates theoretical bottlenecks for precise description. In most cases the Schrödinger equation has no analytical solution, and the computational cost of solving it numerically grows exponentially with system size, far exceeding the capacity of existing computing resources. Researchers must therefore adopt various approximations to handle behavior at different scales. Although this partially alleviates the efficiency problem, it inevitably introduces a loss of accuracy, and the resulting errors accumulate and amplify throughout multi-scale simulations. As the size of the system under study grows, computational complexity and time cost rise sharply and nonlinearly, posing enormous challenges for traditional computational methods applied to practical problems.
It is worth noting that artificial intelligence technology provides a revolutionary new idea for solving this predicament: through the nonlinear mapping ability of deep neural networks, cross-scale correlation models can be efficiently established. Machine learning algorithms can extract feature patterns from massive data, significantly reducing the computational complexity of solving complex equations. This data-driven research paradigm not only holds the potential to break through the trade-off dilemma between accuracy and efficiency in traditional methods, but also enables truly precise cross-scale regulation, opening up a brand-new technical path for material research and development and promoting the transformation of materials science towards an intelligent design paradigm.
For instance, researchers have employed ensemble-based and automated machine learning algorithms to predict the mechanical properties of 3D-printed foam structures, such as tensile and bending strength. These methods help optimize material formulations and improve polymer performance for applications such as sports shoes and rackets [21]. Another study used Convolutional Neural Networks (CNNs) to predict the stress distribution in composite materials, demonstrating the potential of deep learning to accelerate stress analysis. With its powerful capacity for handling high-dimensional data, AI can build structure–activity relationship models from massive sets of material feature parameters, providing a rapid route to the inverse design of complex material systems [22].
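To make this data-driven route concrete, the following is a minimal, hedged sketch of how an ensemble regressor could map printing and formulation parameters to a mechanical property such as tensile strength; the feature names, value ranges, and data are hypothetical placeholders rather than the setup of ref. [21].

```python
# Minimal sketch: ensemble regression from hypothetical 3D-printing/formulation
# parameters to a mechanical property. All data below are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical descriptors: foam density (g/cm^3), cell size (um),
# TPU fraction (0-1), print temperature (deg C)
X = rng.uniform([0.2, 50, 0.0, 190], [0.8, 500, 1.0, 230], size=(200, 4))
# Synthetic target standing in for measured tensile strength (MPa)
y = 5 * X[:, 0] + 0.002 * (500 - X[:, 1]) + 3 * X[:, 2] + rng.normal(0, 0.2, 200)

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f} +/- {scores.std():.2f}")

# Once trained on real measurements, the same model can rank candidate
# formulations before any specimen is printed and tested.
model.fit(X, y)
candidate = np.array([[0.45, 120, 0.6, 215]])
print("predicted tensile strength (MPa):", model.predict(candidate)[0])
```

In practice the synthetic arrays would be replaced by measured descriptors and test results, and feature importances from the fitted forest can indicate which processing parameters dominate the target property.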

2.1. From Traditional Trial and Error to Intelligent Design: The Transformation of the AI Materials Research and Development Paradigm Driven by Computational Materials Science

Computational Materials Science (CMS) is a discipline that studies the structure, properties and evolution of materials by means of computer simulation, numerical calculation and data analysis. It combines theories and techniques of physics, chemistry, materials science, and computer science to predict and optimize material properties and drive the efficient design and development of new materials. In different research scenarios, computational materials science involves different spatial scales (from atomic and nano to macroscopic structures) and time scales (from femtoseconds to seconds and beyond) to adapt to the research needs of different materials problems.
The development of materials research is a progressive history from accumulated experience to theoretical breakthroughs and then to technological revolution. In the early days, materials research relied heavily on an experiment-driven approach, exploring the properties and structure of materials through trial and error and observation. From the 17th century onward, with the establishment of foundational theories such as Newtonian mechanics and, later, thermodynamics, materials research entered a theoretical stage in which scientists could explain the physical and chemical behavior of materials through mathematical models, providing a preliminary theoretical basis for material design. In the 1950s, the advent of computer science revolutionized the paradigm of materials research. Computer simulation and numerical computing enabled scientists to predict material properties in a virtual environment, optimize structures, and accelerate the discovery and invention of new materials. The breakthroughs of this stage not only greatly shortened the research and development cycle but also reduced experimental costs, promoting the rapid development of materials science. In the 21st century, the rise of AI and large-scale high-throughput technologies has injected new vitality into materials research. By analyzing massive amounts of data with machine learning algorithms, AI can quickly identify complex relationships between material properties and structures, and even predict the properties of unknown materials. High-throughput technologies significantly improve the efficiency of material screening and optimization through parallel experiments and automated workflows. Together, these technologies further accelerate the materials R&D process and significantly reduce its cost, making the commercial application of new materials faster and more widespread [23], as shown in Figure 3a.
The advent of computers has made it possible to calculate material properties from physical theory and thereby address real-world chemical challenges. High-throughput computational screening followed by experimental validation has become a mainstream approach in material design. Compared with traditional trial-and-error methods, which are time-consuming, costly, and poorly suited to complex material challenges, computational methods effectively shorten the R&D cycle. Materials research and development relies on the synergistic collaboration of experiment, theory, and computation, which together form the “Iron Triangle” (Figure 3b) that drives scientific discovery and technological advancement. Theoretical research establishes models and laws, providing a framework for experimental design and computational simulation. Experiments validate theories and address unknowns by collecting data and observing phenomena. Computational methods use numerical simulation, virtual experiments, and parameter optimization to accelerate material screening and property prediction. This integrated approach significantly reduces trial-and-error costs and enhances R&D efficiency.
Consequently, computational materials science emerged as a discipline utilizing computational methods to explore the structure, properties, and evolution of materials. Its development has progressed from theoretical modeling and numerical simulations to the emergence of data-driven intelligent research, providing a powerful tool for discovering and optimizing new materials [24,25].
Figure 3. (a) Four scientific paradigms: empirical, theoretical, computational, and data- or AI-driven [26]. Copyright 2024 American Chemical Society. (b) Experiment-theory-computation collaborative innovation model for high-performance material development [27].

2.2. The Evolution of Computational Materials Science

In the 1950s and 1960s, the foundations of computational materials science began to take shape, primarily leveraging the theoretical frameworks of quantum mechanics and statistical mechanics. Density Functional Theory (DFT) [28], introduced in the 1960s, allowed researchers to simulate material properties at the electronic structure level. This breakthrough significantly advanced materials science, enabling computers to predict the physical and chemical properties of materials without relying solely on experimental methods. Simultaneously, Molecular Dynamics (MD) simulation [29] methods were developed in the 1950s to explore interactions between atoms and molecules. Despite their high computational demands, these methods provided crucial insights into the microscopic behavior of materials.
With advancements in computing power, materials scientists began utilizing High-Throughput Computing (HTC) to accelerate the discovery of new materials. In the 1980s and 1990s, the advent of supercomputers enabled the simultaneous computation of multiple material systems, streamlining the material screening process. After 2000, the Materials Genome Initiative (MGI) marked a significant milestone in computational materials science [30]. Launched by the United States in 2011, MGI aimed to standardize, share, and efficiently calculate material data, thereby accelerating the R&D cycle for new materials. During this period, large-scale computational databases such as Materials Project, AFLOW [31], and OQMD [32] were established, laying a robust data foundation for the integration of AI in materials science.

2.3. Barriers to the Development of Computational Materials Science

Although computational materials science represents a significant advancement over traditional trial-and-error methods, it still faces challenges in practical applications. These challenges are primarily rooted in computational limitations rather than theoretical constraints. For most common material systems, the fundamental interactions between particles are well understood. Quantum mechanics and relativity, established in the early 20th century, laid the foundation for modern physics and provided precise equations to describe the behavior of microscopic matter. Examples include the Schrödinger equation for electron behavior, the Einstein field equations for gravity [33], and the Navier–Stokes equations for fluid dynamics. Additionally, statistical mechanics bridges the behavior of microscopic particles with macroscopic properties, enabling the analysis of phase transitions and non-equilibrium processes in complex materials. These theoretical methods effectively capture the essential principles governing material systems.
As Paul Dirac observed, “The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble.” Indeed, solving these equations under complex boundary conditions and nonlinear behavior is extremely challenging, and analytical solutions are typically obtainable only under idealized conditions. As computing technology has advanced, numerical methods have become essential tools for addressing practical problems. They enable the application of theoretical models to real-world scenarios through approximate solutions, particularly for complex systems and long-timescale simulations.
However, the current technical bottleneck primarily arises from limitations in computing resources and the inefficiency of algorithms. Developing more efficient computational models, enhancing accuracy, and reducing costs have emerged as critical frontiers in materials research. The key challenge lies in reducing computational complexity and difficulty without significantly sacrificing accuracy. Fortunately, the advent of AI [34] has made this vision increasingly achievable. AI-driven algorithms offer promising solutions to optimize computational efficiency and accuracy, paving the way for the next breakthroughs in computational materials science.

3. AI Transforms the Development of Computational Materials Science

3.1. The Connection Between AI and Materials

As a data-driven science and technology, AI can uncover the laws hidden in high-dimensional data and use them to solve complex problems. Materials science involves large amounts of multi-dimensional data, such as the relationships among composition, structure, processing, and properties, which are often too complicated for traditional methods to exploit effectively. Pioneering developments in materials science, however, established the essential tools and data infrastructure that enabled the integration of AI into materials research. In 2011, the United States launched the Materials Genome Initiative (MGI) to accelerate the R&D cycle for new materials, reducing it from the traditional 10–20 years to just 2–5 years. The initiative focuses on developing advanced computational and experimental tools, building open-access databases, and integrating big data, computational simulation, and high-throughput experiments. These efforts have significantly accelerated material design and optimization.
These initiatives accelerated the digital transformation of materials R&D and laid the groundwork for the deep integration of AI and materials science. As databases expand, high-quality experimental data accumulate, and AI algorithms become more accurate at predicting complex material behavior, materials R&D is evolving from traditional trial and error toward an intelligent, data-driven approach.
Additionally, advanced computing toolkits and specialized databases in material science have accelerated AI applications in materials research (see Table 4 and Table 5). This synergy has deepened the integration of AI and materials science, providing robust support for the intelligent discovery and optimization of new materials.
The deep application of AI technology is reshaping the materials science research system, creating a “data + algorithm + experiment” innovation paradigm. By integrating AI with computational simulation and experimental data, materials research and development has leapt from an experience-driven to a data-driven paradigm, significantly improving the efficiency and accuracy of materials innovation.
At the data level, by analyzing the high-throughput data generated by materials genome projects and quantum chemistry simulations, AI mines the structure–property relationships of materials in depth, improving the efficiency of predicting and discovering new materials by orders of magnitude. The technological breakthrough extends to inverse design: AI builds cross-scale models linking atomic structure to macroscopic properties and can work backward from target performance to the optimal material formulation, overcoming the limitations of traditional trial-and-error methods. In complex engineering scenarios, AI also shows multi-threaded optimization capabilities: on the R&D side, it combines automated experimental platforms with active learning algorithms to select experimental paths intelligently; on the application side, multi-objective optimization algorithms balance conflicting indicators such as strength, cost, and environmental impact, providing a safe and reliable route for developing special-purpose materials. This whole-chain intelligent transformation is reshaping the innovation ecosystem from basic research to industrial application.

3.2. AI in Materials Science: Current Applications and Prospects

In addition to improvements in database systems, the development of AI model frameworks has significantly advanced the field of materials AI. Since 2014, the number of publications related to the integration of AI and materials has been steadily increasing. As shown in Figure 4, the field of materials AI saw sharp growth around 2020, following the introduction of AI frameworks such as TensorFlow v0.1 [39] in 2015 and PyTorch v0.1 [40] in 2016. These frameworks have substantially lowered the barrier for researchers to quickly build and train AI models, allowing them to focus on addressing key challenges in materials science. This has greatly accelerated the development of AI models for materials.
The number of papers published peaked in 2021 and has gradually decreased each year since, signaling that the research has shifted from a rapid growth phase to a more quality-focused phase. Notably, the number of articles combining Graph Neural Networks (GNNs) [41] with materials research has continued to increase. This is because the structure of GNNs aligns naturally with molecular structure in materials science, and their favorable physical inductive biases give them an advantage in large-scale models, especially in complex, data-rich scenarios.
In 2024, the Nobel Prizes in Physics and Chemistry recognized the transformative influence of AI on scientific research. The Physics Prize was awarded to John J. Hopfield and Geoffrey E. Hinton for their pioneering contributions to Artificial Neural Networks and machine learning. In 1982, Hopfield introduced the Hopfield network, an associative memory model capable of storing and reconstructing information. Hinton, known as the “father of deep learning”, significantly enhanced machines’ ability to autonomously identify data features, laying the foundation for modern AI. Their groundbreaking work not only advanced human understanding of cognition but also revolutionized various scientific fields by enabling complex data analysis and pattern recognition.
Similarly, the Chemistry Prize honored David Baker, Demis Hassabis, and John M. Jumper for their innovative contributions to protein research. Baker pioneered a new direction in protein engineering by designing novel proteins through computational methods. Under the leadership of Hassabis and Jumper, the development of the AlphaFold2 [42] model achieved unprecedented accuracy in predicting protein 3D structures, solving a challenge that had perplexed scientists for decades. This breakthrough has revolutionized fields such as drug discovery, vaccine design, and biotechnology, driving rapid advancements in chemical and biological research.
These two Nobel Prizes underscore the pivotal role of AI in advancing scientific frontiers. They highlight how AI-driven models and computational tools are reshaping traditional research paradigms. Looking ahead, AI is expected to have an even more profound impact across multidisciplinary fields, ushering in a new era of innovation in areas such as materials science, medicine, and quantum computing.

3.3. The Model Form That Combines Materials Science and AI

With the rapid development of the Materials Genome Initiative (MGI) and computational materials science, the field of materials science has accumulated a large amount of high-quality data, providing a solid foundation for data-driven materials research. At the same time, advances in machine learning (ML) and neural networks (NNs) have enabled material modeling to evolve from early statistical methods to more advanced deep learning methods, greatly improving the ability to describe atomic interactions and electronic effects and enhancing model generalization across different systems.
Based on their methods and application scenarios, these models can be categorized as follows: (1) data-driven machine learning prediction models, which analyze experimental and computational data to predict material performance and characteristics, enabling rapid screening of candidate materials; (2) machine learning potentials, which use machine learning to construct interatomic potential functions, enabling high-precision molecular dynamics simulations at low computational cost; and (3) graph neural network-based material modeling, which represents atoms and bonds as graph structures and is suitable for performance prediction and design optimization of complex material systems. Each class of model has its own characteristics and functions (see Table 6), and together they provide diverse solutions for materials science research, driving innovation from basic research to industrial applications.

3.3.1. Machine Learning and Neural Network Data-Driven Models in Materials Science

In 2019, Schleder et al. [27] reviewed the application and development of methods ranging from density functional theory (DFT) to machine learning (ML) in materials science, as shown in Figure 5. With the advancement of experimental and computational methods, a vast amount of data has accumulated in the materials field, and how to use this data effectively to guide the design of new materials has become an important challenge. The article first reviews the central position of DFT in the calculation of material properties, emphasizing its high accuracy and wide applicability. It then introduces the advantages of high-throughput (HT) computing for generating structural and property data, which provides a solid foundation for subsequent machine learning modeling. The key to machine-learning-based prediction lies in constructing a mathematical model that learns the relationship between the structure and properties of materials. First, feature engineering transforms structural information into numerical features suitable for modeling, such as atomic type, bond length, charge distribution, and band gap. Appropriate algorithms (such as support vector machines, random forests, or neural networks) are then trained on existing data to establish the mapping between structure and property. For data with known properties, supervised learning is used to optimize the model; in unsupervised learning, the inherent patterns of the data are discovered through clustering or dimensionality reduction, which is useful for material classification and the discovery of latent properties. The trained model can then rapidly predict the properties of new materials, greatly reducing reliance on first-principles calculations and significantly improving the efficiency of screening and design. Especially when dealing with high-dimensional, complex material data, machine learning demonstrates powerful processing capabilities, providing a new technical path for accelerating material discovery.
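As an illustration of the workflow just described (descriptors, algorithm selection, supervised training), the sketch below compares the three algorithm families named above on a synthetic structure-property task; the descriptors and "band gap" targets are invented stand-ins, not data from the cited review.

```python
# Hedged sketch of the supervised structure-property workflow:
# numerical descriptors -> model training -> property prediction.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
# Hypothetical features: mean electronegativity, mean atomic radius,
# average bond length, valence electrons per atom
X = rng.normal(size=(500, 4))
y = 1.5 + 0.8 * X[:, 0] - 0.4 * X[:, 2] + 0.1 * rng.normal(size=500)  # "band gap" (eV)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)

models = {
    "SVR": make_pipeline(StandardScaler(), SVR(C=10.0)),
    "Random Forest": RandomForestRegressor(n_estimators=200, random_state=1),
    "Neural Network": make_pipeline(StandardScaler(),
                                    MLPRegressor(hidden_layer_sizes=(64, 64),
                                                 max_iter=2000, random_state=1)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mae = mean_absolute_error(y_te, model.predict(X_te))
    print(f"{name}: test MAE = {mae:.3f} eV")
```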
The prediction of molecular properties relies on molecular information extracted from computational materials science data. By establishing a mapping between structural descriptors and macroscopic molecular properties, and training machine learning models to obtain the best predictive model, properties can be predicted and optimized efficiently. In 2024, Sinha et al. [46] explored the application of machine learning (ML) to accelerating material discovery, as shown in Figure 6, with a particular focus on electronic band structure data. The study drew on the Materials Project database, collecting electronic band structure data for 63,588 materials and classifying and clustering them with multiple ML algorithms. Before modeling, the researchers carried out systematic data preprocessing, including feature selection, feature engineering, and noise reduction, to improve the model's prediction accuracy and computational efficiency. Three unsupervised learning algorithms (K-means, AGNES, and BIRCH) were then used to cluster the material data and mine similarities among materials, thereby accelerating the discovery of new materials. The results show that clustering based on electronic band-structure features outperforms models that use only scalar attributes such as the band gap or Fermi level, with the AGNES algorithm giving the best clustering results in most experiments. This research provides a systematic and effective methodology for analyzing and predicting the electronic properties of materials with machine learning and is expected to play a key role in future material design and performance-oriented material screening.
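The clustering step can be illustrated with a short, hedged sketch using the three algorithms named in the study; the feature matrix below is a random placeholder standing in for band-structure-derived descriptors, not the Materials Project data analyzed by Sinha et al. [46].

```python
# Unsupervised clustering of (placeholder) band-structure feature vectors with
# the three algorithms mentioned above: K-means, AGNES, and BIRCH.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans, AgglomerativeClustering, Birch
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(2)
# e.g., band gap, Fermi level, band-edge energies at selected k-points, ...
features = rng.normal(size=(1000, 8))
X = StandardScaler().fit_transform(features)

algorithms = {
    "K-means": KMeans(n_clusters=5, n_init=10, random_state=2),
    "AGNES":   AgglomerativeClustering(n_clusters=5),  # agglomerative nesting
    "BIRCH":   Birch(n_clusters=5),
}
for name, algo in algorithms.items():
    labels = algo.fit_predict(X)
    # Silhouette score gives a quick, label-free measure of cluster quality
    print(f"{name}: silhouette = {silhouette_score(X, labels):.3f}")
```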
The second type of data source is experimental. In-depth study of the relationship between materials characterization and macroscopic performance has promoted the development of data-driven material design methods. In 2021, as shown in Figure 7, Pei et al. [47] proposed a hybrid neural network method that integrates a variational autoencoder (VAE) with a regression model to solve the problem of microstructure identification in martensitic steel, and successfully achieved the inverse design of alloys containing as many as 20 elements. This work broke through the bottlenecks of traditional methods in microstructure identification and composition design. Specifically, the method first uses a VAE to encode microstructure images into a 64-dimensional latent space, achieving efficient compression and representation of complex microscopic features. A weighted regression model (with a weight ratio of 100:1) then guides the distribution of the latent space so that it better reflects the latent correlation between material composition and properties. Finally, 2D kernel principal component analysis (2D kernel PCA) maps the latent space back to specific material compositions, realizing inverse design from microstructure to composition. Compared with traditional trial-and-error approaches, this technique offers significant advantages: using only hundreds of microstructure images as training data, it still achieves high-precision prediction and design, greatly improving data efficiency. The method has been successfully applied to the intelligent design of 9–12% Cr martensitic steel and, owing to its good scalability, can be extended to complex multi-component systems such as high-entropy alloys.
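A simplified sketch of this hybrid idea is given below: a variational autoencoder compresses a microstructure image into a 64-dimensional latent vector, and a regression head on the latent space, weighted in the joint loss, ties the representation to a target property. The image size, layer widths, property target, and exact weighting are illustrative assumptions, not the published architecture of Pei et al. [47].

```python
# Didactic PyTorch sketch of a VAE with a property-regression head on the latent space.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MicrostructureVAE(nn.Module):
    def __init__(self, latent_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(                                  # 1 x 64 x 64 input image
            nn.Conv2d(1, 16, 4, stride=2, padding=1), nn.ReLU(),       # -> 16 x 32 x 32
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.ReLU(),      # -> 32 x 16 x 16
            nn.Flatten(),
        )
        self.fc_mu = nn.Linear(32 * 16 * 16, latent_dim)
        self.fc_logvar = nn.Linear(32 * 16 * 16, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )
        self.regressor = nn.Linear(latent_dim, 1)  # latent vector -> target property

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)        # reparameterization
        return self.decoder(z), self.regressor(z), mu, logvar

def loss_fn(x, x_rec, y, y_pred, mu, logvar, w_rec=100.0, w_reg=1.0):
    # Joint loss: reconstruction + KL divergence, plus a regression term with an
    # (assumed) weighting inspired by the 100:1 ratio mentioned above.
    rec = F.mse_loss(x_rec, x)
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    reg = F.mse_loss(y_pred.squeeze(-1), y)
    return w_rec * rec + kld + w_reg * reg

# Usage sketch with random tensors standing in for micrograph patches and properties
model = MicrostructureVAE()
x, y = torch.rand(8, 1, 64, 64), torch.rand(8)
x_rec, y_pred, mu, logvar = model(x)
print("joint loss:", loss_fn(x, x_rec, y, y_pred, mu, logvar).item())
```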
Because experimental materials data are costly to obtain, model training on small databases requires efficient algorithms with strong generalization ability. Although traditional machine learning methods have been widely used in materials computation, their relatively simple model structures often make it difficult to fully capture the high-dimensional, complex behavior of molecular systems. In 2022, research by Chan et al. [48] showed that these traditional methods have significant limitations in obtaining accurate target-specific data. Kernel-based methods such as kernel ridge regression (KRR) improve predictive performance to some extent by combining kernel techniques with regression models. In recent years, the breakthrough of the attention mechanism has provided a new way to address this problem: the technique automatically focuses on key data features and exhibits excellent generalization, which suits the coexistence of local and global interactions in material molecules. For example, in 2022 Wang et al. [49] developed the CrabNet model, which innovatively applies the self-attention mechanism from natural language processing to materials science and effectively learns element embedding vectors. Meanwhile, the mat2vec model, based on word embedding technology, achieves high-quality compound prediction by capturing the contextual relationships between elements. These models have been rigorously tested and have demonstrated excellent predictive performance. Building on this, in 2023 Du et al. [50] proposed an innovative fusion-strategy model that accurately captures complex molecular interactions by deeply integrating a global feature extractor (LSTM/GRU), a local feature extractor (DCNN), and a novel CECV (Coupled Enhanced Context Vectorization) feedback mechanism. This breakthrough not only overcomes the limitations of traditional rule-based systems but also establishes a new research paradigm for efficient learning from small-sample materials data.

3.3.2. Machine Learning Potentials in Materials Science

Classical Molecular Dynamics (CMD) and Ab Initio Molecular Dynamics (AIMD) [51] are two widely used simulation methods for modeling molecular motion. These simulations explore how molecules “climb” the Potential Energy Surface (PES), as shown in Figure 8, which represents the total energy of a molecular system as a function of the relative positions of its nuclei. CMD relies on Newtonian mechanics with relatively large time steps and lower computational complexity, enabling fast simulations of macromolecular dynamics at the millisecond scale, such as protein folding. However, CMD lacks the precision needed for electronic behavior analysis. In contrast, AIMD utilizes first-principles calculations like Density Functional Theory (DFT) to simulate electronic structures with high accuracy. While AIMD can model detailed chemical reaction mechanisms, it requires intensive computations and short time steps, making it less suitable for large-scale systems or long-term simulations. The complementary strengths and limitations of CMD and AIMD highlight the need for a simulation approach that balances computational speed and accuracy.
Machine Learning Potentials (MLPs) address this need by leveraging machine learning techniques to construct high-precision, low-cost potential energy surface models. MLPs utilize neural networks and deep learning to capture the local chemical environment of atoms, accurately fitting the relationships between atomic forces and energies. This approach reduces computational complexity while enhancing the generalization capability for complex systems, thus, expanding the applicability of molecular simulations.
In 2007, Behler and Parrinello [53] pioneered the high-dimensional neural network potential, an early form of machine learning potential, training neural networks on first-principles data and successfully simulating systems such as CO on Ni surfaces and H2 on Si surfaces. This work not only validated the precision and efficiency of MLPs but also introduced two core concepts, the local environment approximation and symmetry-function descriptors, laying a methodological foundation for subsequent research.
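In simplified notation (ours, not the original paper's), the central ansatz of such neural network potentials is a sum of atomic energy contributions, each a learned function of a symmetry-function descriptor of the local environment, with forces obtained as analytic gradients:

```latex
% Sketch of the neural network potential ansatz: G_i is the vector of symmetry
% functions describing the local environment of atom i, and E_i is a neural network.
\begin{align}
  E_{\mathrm{total}} &= \sum_{i=1}^{N} E_i\!\left(\mathbf{G}_i\right), \\
  \mathbf{F}_j &= -\nabla_{\mathbf{r}_j} E_{\mathrm{total}}
     = -\sum_{i=1}^{N} \frac{\partial E_i}{\partial \mathbf{G}_i}
       \cdot \frac{\partial \mathbf{G}_i}{\partial \mathbf{r}_j}.
\end{align}
```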
Within this theoretical framework, in 2010 Bartók et al. [54] developed the Gaussian Approximation Potential (GAP), which for the first time explicitly decomposed interatomic interactions into many-body terms: distance dependence was fitted through two-body descriptors, while kernel-based summation and normalization techniques were innovatively adopted to handle three-body and higher-order terms, achieving robust regression over changes in atomic arrangement.
To further improve the description of complex material behavior, in 2015 Thompson et al. [55] developed the Spectral Neighbor Analysis Potential (SNAP) model: assuming a linear relationship between the atomic energy and the bispectrum components of the local neighbor density, weighted least-squares regression is used to fit large-scale quantum mechanical data. SNAP has demonstrated excellent performance in predicting screw dislocation migration in bcc tantalum, revealing for the first time the microscopic mechanism of plastic deformation under shear stress.
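In similarly simplified notation of our own, the SNAP ansatz just described amounts to a linear model in the bispectrum components, fitted by weighted least squares to quantum-mechanical reference data:

```latex
% Sketch of the SNAP ansatz: B_k^{(i)} are bispectrum components of the neighbor
% density of atom i; y_s collects reference energies and forces of training
% configuration s with weights w_s.
\begin{align}
  E_i^{\mathrm{SNAP}} &= \beta_0 + \sum_{k} \beta_k\, B_k^{(i)}, \\
  \boldsymbol{\beta} &= \arg\min_{\boldsymbol{\beta}}
     \sum_{s} w_s \left\| y_s^{\mathrm{ref}} - y_s^{\mathrm{SNAP}}(\boldsymbol{\beta}) \right\|^2.
\end{align}
```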
Deep learning and molecular dynamics were deeply integrated in a 2018 breakthrough: Wang et al. [56] developed DeePMD-kit, which enables end-to-end training through TensorFlow, with automated descriptor calculation and force-field evaluation modules that deliver DFT-comparable accuracy at high computational efficiency while supporting million-atom-scale simulations. This open-source tool has greatly lowered the barrier to applying MLPs and promoted standardization in the field.
To overcome the limitations of traditional empirical force fields, Kovacs et al. [57] employed the Atomic Cluster Expansion (ACE) framework in 2021 to construct linear force fields from symmetric polynomials, achieving an order of magnitude higher accuracy than traditional methods in scenarios such as high-temperature dynamics and bond ionization energies. ACE's general design has quickly made it a new benchmark for modeling complex chemical systems.
In recent years, MLPs have increasingly been extended to specific material systems. In 2024, Hedman et al. [58] used DeePMD to reveal the atomic mechanisms of nucleation and defect formation during carbon nanotube growth, combining DFT-level accuracy with speeds approaching classical molecular dynamics. Based on the ACE framework, McCandler et al. [59] developed a dedicated potential for Aun(SCH)m clusters, which provides a new research tool for nanocatalysis thanks to its long-range simulation capability.
These breakthroughs demonstrate that the accuracy of MLP force fields is rapidly approaching the precision of traditional DFT calculations, paving the way for large-scale simulations of multi-atom systems and uncovering previously unknown mechanisms. The continuous integration of deep learning and MD heralds a new era in materials science, empowering researchers to explore uncharted territories in chemical and physical phenomena.
In 2024, Hedman et al. [58] reported simulations of the carbon nanotube (CNT) growth process using machine learning force fields (MLFFs), as shown in Figure 9, and developed a new MLFF, DeepCNT-22. To construct the DeepCNT-22 dataset, the researchers generated initial structures in several ways, including molecular dynamics (MD) simulations driven by density functional tight binding (DFTB), random perturbations of structures, and carbon allotrope structures extracted from the GAP-20 database. Starting from this initial dataset, the study introduced a variant of active learning to refine the data: by training an ensemble of MLFF models and monitoring the disagreement in their predicted forces during simulations of single-walled carbon nanotube (SWCNT) growth, the researchers identified structures that appeared during growth but were not adequately represented in the training data. With this MLFF, they carried out molecular dynamics simulations of the CNT growth mechanism, exploring the entire process from initial nucleation to tube growth, as well as defect formation and repair. The study finds that CNT growth is not a continuous, steady spiral process but an intermittent one governed by the highly dynamic behavior and configurational entropy of the tube-catalyst interface. It further reveals the random formation of defects during growth and their self-healing mechanism, and analyzes how factors such as growth rate and temperature affect defect behavior. Overall, this work provides a more detailed atomic-scale understanding of CNT growth, offers a theoretical basis and new ideas for controlling and optimizing CNT synthesis, and provides a reference paradigm for future simulation studies of related material systems.
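The ensemble-disagreement selection step described above can be sketched as follows; the array shapes, deviation bounds, and random inputs are illustrative assumptions rather than the actual DeepCNT-22 settings.

```python
# Hedged sketch of committee-based active learning: structures whose maximum
# per-atom force disagreement across an ensemble of MLFF models falls in an
# "informative but not unphysical" window are flagged for labelling.
import numpy as np

def select_for_labelling(committee_forces, lower=0.1, upper=1.0):
    """committee_forces: array of shape (n_models, n_structures, n_atoms, 3)."""
    force_std = committee_forces.std(axis=0)              # (n_structures, n_atoms, 3)
    per_atom_dev = np.linalg.norm(force_std, axis=-1)     # (n_structures, n_atoms)
    max_dev = per_atom_dev.max(axis=-1)                   # (n_structures,)
    return np.where((max_dev > lower) & (max_dev < upper))[0]

# Usage with random numbers standing in for the predictions of 4 committee models
rng = np.random.default_rng(3)
forces = rng.normal(scale=0.3, size=(4, 100, 60, 3))      # eV/Angstrom
picked = select_for_labelling(forces)
print(f"{picked.size} of 100 candidate structures selected for labelling")
```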

3.3.3. Graph Neural Network Based on Molecular Structure

GNNs exhibit an intrinsic compatibility with molecular structure analysis, making them uniquely suited for predicting material properties. This natural alignment stems from the inherent similarity between molecular configurations and graph representations. Atoms (e.g., carbon, hydrogen, oxygen) are encoded as nodes with features such as electronegativity and valence electrons, while chemical bonds (single, double, or triple) form edges characterized by bond type, strength, and geometric parameters like bond length, as shown in Figure 10. This graph-based framework mirrors the structural essence of molecules, allowing GNNs to model atomic interactions with exceptional precision. GNNs leverage message-passing mechanisms to aggregate neighborhood information, emulating how molecular properties emerge from atomic interactions. They capture both local molecular features—such as the reactivity of functional groups (e.g., hydroxyl radicals)—and global structural patterns, including ring systems or polymer chains that influence solubility. Through hierarchical aggregation across network layers, GNNs holistically characterize molecular behavior. Critically, their permutation invariance ensures predictions remain unaffected by atom ordering, inherently preserving molecular symmetry without requiring data augmentation. This structural fidelity, combined with computational efficiency, enables GNNs to outperform traditional methods in property prediction tasks, particularly for complex molecular systems where quantum simulations are computationally prohibitive.
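To ground this description, the following is a didactic sketch (plain PyTorch, not a production GNN library) of a single message-passing layer: each atom aggregates messages from its bonded neighbours, updates its feature vector, and a sum over the final node features yields a permutation-invariant molecular representation.

```python
# Minimal message-passing layer over an atom/bond graph.
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    def __init__(self, node_dim, edge_dim):
        super().__init__()
        self.message = nn.Sequential(nn.Linear(2 * node_dim + edge_dim, node_dim), nn.ReLU())
        self.update = nn.GRUCell(node_dim, node_dim)

    def forward(self, h, edge_index, edge_attr):
        # h: (n_atoms, node_dim); edge_index: (2, n_bonds); edge_attr: (n_bonds, edge_dim)
        src, dst = edge_index
        m = self.message(torch.cat([h[src], h[dst], edge_attr], dim=-1))
        agg = torch.zeros_like(h).index_add_(0, dst, m)    # sum messages arriving at each atom
        return self.update(agg, h)                         # GRU-style node update

# Toy "molecule": 4 atoms, 3 bonds (stored in both directions), random features
h = torch.randn(4, 16)                                     # node features (e.g., element embedding)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]])
edge_attr = torch.randn(edge_index.shape[1], 4)            # bond features (type, length, ...)

layer = MessagePassingLayer(16, 4)
h = layer(h, edge_index, edge_attr)
graph_repr = h.sum(dim=0)                                  # order-invariant readout
print(graph_repr.shape)
```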
The development of GNN architectures has been driven by advances in modeling capability. In 2017, Gilmer et al. [61] introduced Message-Passing Neural Networks (MPNNs), which gained prominence in material property prediction through their message-passing framework; however, early models fell short in capturing multi-body interactions such as dihedral angles.
In 2018, Schütt et al. introduced SchNet [62], a deep learning model specifically designed for molecular and material property prediction using 3D Graph Neural Networks (3D-GNNs). It uses continuous-filter convolution (cfconv) layers to process atomic coordinates, accurately modeling 3D atomic interactions; this significantly improves the characterization of local interactions while efficiently resolving many-body atomic effects, enabling accurate modeling of molecular and crystal structural features. SchNet has demonstrated excellent performance in predicting various material properties, especially formation energies.
The model’s ability to generate local chemical potential revealed enhanced interpretability for chemical concepts such as bond saturation and aromaticity, surpassing the Deep Tensor Neural Network in these tasks. In MD applications, SchNet achieved state-of-the-art results on the MD17 dataset, accurately predicting combined energy and force profiles with 30% faster convergence than the GAP-based GDML framework. Notably, its simulation of C20 fullerene dynamics reduced computational time by 50% while maintaining quantum-mechanical precision, validating its efficiency in complex molecular systems. By integrating spatial awareness with adaptive feature learning, SchNet established a new paradigm for data-driven materials modeling, bridging the gap between computational efficiency and quantum-level accuracy.
The introduction of the Transformer architecture and attention mechanisms in 2017 revolutionized feature extraction in neural networks, and attention aligns particularly well with the localized chemical environments inherent to molecular systems. This breakthrough propelled graph attention mechanisms to the forefront of GNN development.
In 2019, Yun et al. [63] reported Graph Transformer Networks, which pioneered the integration of Transformer-style self-attention into GNNs, significantly enhancing graph representation capabilities by capturing both short-range atomic interactions and long-range structural dependencies. The model demonstrated superior performance in predicting electronic, mechanical, and thermal properties across diverse material systems, achieving 15–25% higher accuracy than conventional GNNs in alloy phase stability predictions.
Building on this foundation, Louis et al. developed GATGNN [64] in 2020, a deep graph network incorporating dual-scale attention for crystal property prediction. The architecture combines local attention layers and global attention layers. Local attention layers focus on atomic-scale environments, such as coordination numbers and bond angles, to precisely capture local information around each atom. Global attention layers perform weighted aggregation of the atomic environment vectors to construct crystal-level representations, providing a comprehensive reflection of the overall crystal structure. This hierarchical approach addresses the limitation of traditional GNNs that overemphasize local interactions, enabling GATGNN to model complex atomic relationships in polycrystalline systems and metastable phases with 20–30% improved prediction consistency. In benchmark tests on perovskite stability and metal–organic framework porosity, GATGNN outperformed baseline models, reducing mean absolute errors to <0.08 eV and <1.5 Å³, respectively. The framework’s ability to balance local precision with global context makes it particularly effective for multicomponent systems where cooperative effects dominate material behavior, such as high-entropy alloys and layered heterostructures.
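The global-attention pooling idea used to build crystal-level representations can be reduced to a few lines: each atomic environment vector receives a learned weight, and the crystal embedding is the weighted sum. This is a simplified sketch of the mechanism only, not the full GATGNN architecture of Louis et al. [64].

```python
import torch
import torch.nn as nn

class GlobalAttentionPool(nn.Module):
    """Weighted aggregation of atom embeddings into a single crystal-level vector."""
    def __init__(self, feat_dim: int):
        super().__init__()
        self.score_fn = nn.Linear(feat_dim, 1)   # learns one attention score per atom

    def forward(self, h):
        # h: (n_atoms, feat_dim) atomic environment vectors from the local attention layers
        weights = torch.softmax(self.score_fn(h), dim=0)   # (n_atoms, 1), sums to one
        return (weights * h).sum(dim=0)                    # (feat_dim,) crystal representation
```

A property head (for example, a small MLP) then maps this pooled vector to the target quantity, such as formation energy or porosity.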
Beyond property prediction for inorganic materials, GNNs also play an important role in the large-scale screening and structure generation of drug molecules. In 2021, Lai et al. [65] used a graph recurrent neural network model for drug molecular structure generation, namely MGRNN (Molecular Graph Recurrent Neural Network). The model generates nodes and edges iteratively and optimizes the training process through a breadth-first generation strategy. It performs well on datasets such as ZINC250k, ChEMBL, and MOSES, which contain on the order of millions of molecules.
GNNs can not only model graphs whose nodes are individual atoms, but can also take atomic aggregates as graph nodes for learning and prediction. They are applicable not only to the prediction of single molecular structures but also to large systems such as crystalline and polymeric materials. In 2024, Korolev and Mitrofanov [66] used atomic-level GNNs to predict the properties of network (reticular) materials and proposed a new material representation method that passes messages on coarse-grained crystal graphs, aiming to overcome the limitations of existing approaches. The prediction performance and energy efficiency of neural networks built on different material representations were compared, including models based on composition and on crystal structure. The results show that the coarse-grained crystal Graph Neural Network reduces computational cost while maintaining accuracy, making it a strong competitor to atomic-level algorithms.
In 2023, Dong et al. [67] proposed a self-learning-input GNN framework, named Self-Learning Input GNN (SLI-GNN), for the unified prediction of crystal and molecular properties. A dynamic embedding layer was designed to update the input features as the neural network iterates, and an Infomax mechanism was introduced to maximize the average mutual information between local and global features. SLI-GNN can achieve the desired prediction accuracy with fewer inputs and more Message Passing Neural Network (MPNN) layers. Model evaluations on the Materials Project and QM9 datasets verified that the overall performance of SLI-GNN is comparable to that of previously reported GNNs. The SLI-GNN framework exhibits excellent performance in material property prediction and thus promises to accelerate the discovery of new materials.
In addition to structure prediction, GNNs can also be used to predict various material properties. In 2024, Zhao and Li [68] proposed a new method to study and simulate material interface diffusion using GNNs. Experimental and simulation data on diffusion coefficients, concentration gradients, and other relevant parameters from different material systems were collected and pre-processed, and the key features affecting interface diffusion were extracted. Subsequently, a GNN model for diffusion problems was constructed, with a graph representation used to capture the atomic structure of the materials. The model architecture includes multiple graph convolution layers for feature aggregation and update, as well as optional graph attention layers for capturing complex relationships between atoms. The GNN model was trained and validated on the pre-processed data to achieve accurate prediction of diffusion coefficients, diffusion rates, concentration distributions, and potential diffusion paths.

3.3.4. The Differences and Connections Among the Three Models

These three types of models play significant roles in macroscopic performance prediction, microscopic behavior modeling, and multi-scale structure–performance correlation, respectively, constituting the key technical system of intelligent material design. First, data-driven prediction models integrate experimental data and first-principles calculation results to establish the mapping between material composition, structure, and physical properties, enabling high-throughput screening and rapid performance evaluation. Their core lies in sound feature engineering, which usually relies on domain experts' understanding of structure–property relationships; consequently, the accuracy and interpretability of such models depend strongly on the quality and representativeness of the input variables.
Second, machine-learned interatomic potentials (MLIPs) are mainly used to model the interatomic potential energy surface of material systems. Compared with traditional classical potentials, MLIPs can significantly expand the spatial and temporal scales of molecular dynamics simulation while approaching quantum-mechanical precision, enabling the modeling of systems with millions of atoms or more. This method is applicable to the study of microscopic dynamic processes such as phase transitions, diffusion, and defect evolution, making up for the deficiencies of first-principles calculations in computational scale and efficiency; it is a key link connecting the atomic scale with macroscopic behavior.
Third, material modeling methods based on Graph Neural Networks have developed rapidly in recent years. A GNN naturally represents atoms (as nodes) and the chemical bonds between them (as edges) in a graph structure, and can embed physical constraints such as bond length, bond angle, and symmetry, thereby enhancing the physical consistency and generalization ability of the model. GNNs perform exceptionally well on complex material systems, such as multi-component alloys, polymer materials, and interface structures, and can automatically learn descriptors with clear physical and chemical significance for predicting the multi-scale properties of materials. However, GNN models usually require a large amount of high-quality training data, and the computational costs of training and inference are relatively high.
Overall, these three types of machine learning models are not substitutes for one another; rather, they constitute a collaborative system for intelligent material modeling. Data-driven models excel at macroscopic performance prediction and are suitable for material screening and optimization. MLIPs focus on microscopic atomic dynamics simulation, providing theoretical support for a deeper understanding of material behavior. GNNs take graph-based structural learning as their core and attempt to establish an efficient and natural bridge between the microstructure and the macroscopic properties of materials.

4. The Application of Artificial Intelligence in the Design of Basic Materials for Sports

At present, AI stands at the forefront of materials research and development; its powerful data processing, pattern recognition, and prediction capabilities are changing the paradigm of traditional materials research. By taking full advantage of AI tools and methods, researchers can significantly accelerate the development of a wide range of materials, from high-performance structural materials to functional smart materials. In the field of sports, AI-driven materials research and development is promoting innovation in sports equipment and apparel. The following subsections introduce several mainstream material classes in the sports industry and their applications.

4.1. Piezoelectric Materials

Piezoelectric materials are a class of functional materials capable of converting mechanical energy into electrical energy and vice versa. When subjected to external forces, piezoelectric materials generate electric charges; conversely, when an electric field is applied, the materials undergo deformation. This unique property makes piezoelectric materials widely used for detecting motion states in sports activities.
In 2023, Jing et al. [69] employed a machine learning (ML) strategy to systematically explore the performance of aluminum nitride-based piezoelectric materials with varying dopant concentrations and compositions. The predicted piezoelectric strain coefficient (d33) closely matched experimental values for scandium (Sc), magnesium–titanium (MgTi), and magnesium–zirconium (MgZr) doped compounds. Notably, Sc0.5Al0.5N exhibited exceptional piezoelectric properties, with d33 values reaching 202 pC/N. These findings not only reveal the relationship between piezoelectric properties and material structure but also provide a theoretical basis for designing high-performance piezoelectric materials.
The properties of piezoelectric materials can be predicted not only by classical ML methods but also, efficiently, by neural networks such as Artificial Neural Networks (ANNs). By emulating the learning mechanism of the human brain, NNs can extract complex nonlinear relationships from large amounts of data to accurately predict key performance indicators such as the piezoelectric coefficient and elastic modulus. Compared with traditional machine learning methods, NNs show stronger generalization ability and prediction accuracy when dealing with high-dimensional data and complex structure–performance relationships.
Singh et al. [70] introduced an ANN method to evaluate the performance of 1–3 type piezoelectric composites, achieving prediction accuracy up to 99.999%. Compared to the traditional finite element method (FEM), the ANN model significantly reduces computational cost and time while maintaining accuracy, making it highly effective for designing hybrid piezoelectric composites.
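The general workflow behind such ANN-based property prediction can be sketched with scikit-learn as below. The two input descriptors (fiber volume fraction and matrix stiffness) and the synthetic target trend are placeholders chosen for illustration; they are not the dataset or network of Singh et al. [70].

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Placeholder data: [fiber volume fraction, matrix stiffness (GPa)] -> effective d33 (pC/N)
X = rng.uniform([0.05, 1.0], [0.95, 5.0], size=(500, 2))
y = 400 * X[:, 0] / (1 + 0.2 * X[:, 1]) + rng.normal(0, 5, 500)   # synthetic trend plus noise

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("Held-out R^2:", model.score(X_test, y_test))
```

Once trained on finite element or experimental data, such a surrogate can evaluate thousands of candidate composite designs in seconds, which is the source of the reported savings relative to repeated FEM runs.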

4.2. Polymer Materials

Polymer materials are widely used in the field of sports, mainly owing to their excellent characteristics such as being lightweight, flexible, and durable. For example, high-performance polyurethanes and polyester fibers are used to create breathable, moisture-absorbing, and sweat-wicking sportswear that enhances athlete comfort. Carbon fiber reinforced polymers are used to make lightweight, high-strength tennis rackets, bicycle frames, and other equipment that significantly improves athletic performance.
In 2024, Gao et al. [26] advanced ML-assisted design for polymer materials by analyzing structure–composition-performance data. This approach uncovers structure–performance relationships to screen polymers meeting target requirements. Challenges such as scarce polymer databases and complex multi-scale structure–performance correlations persist. To address these challenges, transfer learning and new methods like polymer fingerprint maps and cross-linking descriptors were introduced, leading to a multi-fidelity learning framework that integrates multi-source heterogeneous data.
In 2024, Zhao et al. [71] proposed a machine learning-assisted multi-scale modeling strategy for carbon fiber reinforced polymers (CFRPs), which are known for their light weight and high strength, as shown in Figure 11. This strategy effectively predicts CFRP mechanical properties by combining ML models with molecular dynamics simulations. The predicted values align well with experimental data, offering a cost-effective solution for optimizing CFRP design.
Furthermore, NNs can outperform traditional machine learning models. In 2023, Yuan et al. [72] predicted the mechanical properties of nitrile rubber by leveraging an ANN model optimized with a self-adjusting particle swarm algorithm; compared with two traditional learning methods, it achieved accuracy improvements of 56.5% and 26.5%, respectively.

4.3. Ceramic Materials

Ceramic materials are applied in the field of sports mainly because of their high hardness, wear resistance, and high-temperature resistance; typical examples are wear parts for high-performance sports equipment, such as zirconia ceramic ski edges and bicycle bearings.
In 2023, Qian et al. [73] proposed a fatigue life analysis method for composite materials based on Artificial Neural Networks. The method takes material parameters and load parameters as the network input and fatigue life as the output, constructing a model of the relationship between the input parameters and fatigue life. Three models for predicting the fatigue of 2D braided ceramic matrix composites were established under small-sample conditions. The results show that the Elman Neural Network (ENN) and the Convolutional Neural Network (CNN) maintain high prediction accuracy across different training data scales; although their prediction errors gradually increase as the training sample size decreases, they demonstrate good generalization ability, and their predictions agree well with experimental values. By contrast, the prediction accuracy of the Generalized Regression Neural Network (GRNN) consistently failed to meet the requirements of engineering applications.
In 2024, Rocha et al. [74] combined machine learning with optimization procedures to fine-tune the electrical properties of lead-free (1 − x)Na0.5Bi0.5TiO3–xCaTiO3 piezoelectric ceramics. A comprehensive set of dielectric measurements was used to train ML models that accurately predict the dielectric constant (ε) and dielectric loss (tan δ) as functions of Ca2+ concentration (x% Ca), temperature, and frequency.
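A sketch of this kind of composition–temperature–frequency surrogate is given below, assuming a tabular dataset with one row per dielectric measurement; gradient boosting is used here purely for illustration and is not necessarily the model employed by Rocha et al. [74]. The file name and column names are hypothetical.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical measurement table: Ca content, temperature, frequency, measured permittivity
df = pd.read_csv("dielectric_measurements.csv")   # columns: x_ca, temperature_C, frequency_Hz, epsilon
X = df[["x_ca", "temperature_C", "frequency_Hz"]]
y = df["epsilon"]

model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
print("Mean cross-validated R^2:", cross_val_score(model, X, y, cv=5, scoring="r2").mean())

# The fitted surrogate can then be scanned over composition to locate promising Ca concentrations.
model.fit(X, y)
```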
NNs show remarkable results in predicting the lifetime and properties of complex ceramic materials. The lifetime and properties of ceramics are usually affected by many factors, including microstructure, composition, preparation process, and service environment, and the relationships among these factors are often complex and nonlinear. Through their powerful data-learning and pattern-recognition abilities, neural networks can extract these complex relationships from large amounts of experimental and field data to accurately predict the fatigue life, wear performance, and failure mechanisms of ceramic materials.

4.4. Metallic Materials

Metallic materials are widely used in the field of sports, mainly owing to their high strength, durability, and processability. For example, aluminum alloys and high-strength steel are commonly used in the manufacture of golf clubs, bicycle frames, and fitness equipment, providing lightweight, high-strength properties. Because of their excellent corrosion resistance and biocompatibility, titanium alloys are used to manufacture high-performance sports equipment.
In 2024, Morand et al. [75] leveraged Machine Learning (ML) to address the intricate relationship between metal processing, structure, and performance. The study focused on two primary challenges: material design, which involves identifying optimal microstructures for desired performance, and process design, aimed at determining the most effective processing path to achieve these microstructures. By targeting multiple microstructures with similar behaviors, the model effectively guided the manufacturing process to achieve optimal performance.
In 2022, Hung et al. [76] introduced a chemistry-encoded Convolutional Neural Network (CNN) to predict the CO2 adsorption characteristics of metal–organic frameworks (MOFs). The model was trained on approximately 10,000 MOF structures simulated at the molecular level. The CNN demonstrated high prediction accuracy, showcasing its capability to accelerate the discovery and design of MOF materials with enhanced gas adsorption properties.

4.5. Composite Materials

Composite materials are widely used in the field of sports, mainly due to their excellent characteristics such as high strength, lightweight and designability, which integrate the advantages of multiple materials. For example, carbon fiber composites are used to make bicycle frames, tennis rackets and golf clubs, both reducing weight and improving performance.
In 2022, Bishara et al. [77] reviewed recent advancements in multiscale modeling and simulation using machine learning. The review emphasized applications in composite material homogenization, defect mechanics modeling, and material design. These ML-driven approaches are poised to revolutionize traditional multiscale modeling methods, offering more accurate and efficient solutions for complex material systems.
In 2024, Wang et al. [78] reported research on modeling the composite molding process with integrated machine learning methods, as shown in Figure 12. The team innovatively adopted the image-based model PixelRNN to model and simulate the complex physical phenomenon of mold filling in composite molding. The results show that the fully trained and validated PixelRNN meta-model achieves a prediction accuracy of 97.35% when using only 50% of the training dataset, at a computational cost of only 50% of that of traditional numerical simulation methods. This research not only validates the value of machine learning in composite molding, but also reveals the advantages of modeling based on image input data (rather than traditional numerical data) in this field, providing an important technical reference for subsequent research.

5. Multifunctional Material Integrated Devices Enhance Sports Data Collection and AI Analysis

With the advancement of materials science, integrated devices that combine multiple basic functional materials into multifunctional systems have provided a brand-new solution for sports data collection, owing to their excellent mechanical adaptability, high sensitivity, and customizability. These devices can efficiently convert multi-dimensional information generated during movement, such as mechanical and physiological signals, into digital signals, and achieve in-depth data analysis and intelligent feedback through artificial intelligence (AI) technology, thus demonstrating great potential in fields such as athlete training and health management.
With the help of modern data collection technology, we can not only collect the physiological and sports data of athletes, but also extract key performance parameters from intelligent sports equipment.
In 2025, Wang et al. [79] reported a smart insole for real-time visualization and analysis of foot pressure and gait, as shown in Figure 13. The insole is equipped with 22 pressure sensors, enabling spatially resolved pressure mapping. It also integrates a Support Vector Machine model that accurately identifies eight motion states, including static (e.g., sitting and standing postures) and dynamic (e.g., walking, running, and squatting) activities. In addition, it can detect conditions related to abnormal foot pressure and support fatigue prediction, exercise optimization, and customized fitness training.
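A minimal sketch of this kind of pressure-based motion classification is shown below, assuming pre-extracted feature vectors from the 22 pressure channels and corresponding activity labels; the file names, feature choices, and SVM hyperparameters are placeholders rather than the implementation of Wang et al. [79].

```python
import numpy as np
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: per-window statistics of the 22 pressure channels, with activity labels
X = np.load("insole_pressure_features.npy")   # shape (n_windows, n_features), hypothetical file
y = np.load("motion_state_labels.npy")        # e.g., "sitting", "standing", "walking", "running"

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```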
In 2024, Yuan et al. [80] reported a self-powered and intelligent badminton racket (SIBR) that achieves multi-functional, real-time, and convenient sports monitoring, as shown in Figure 14. A customized electrode was constructed by silver-paste coating, and a triboelectric sensing array was formed on the racket strings to monitor the hitting position. Meanwhile, a flexible piezoelectric film embedded in the grip tape recognizes the grip posture. These sensing arrays directly convert mechanical signals into electrical signals. By processing the collected multi-channel data through machine learning, the study achieved 95% accuracy in identifying the hitting position. The SIBR provides a powerful reference for badminton training and opens a new path for badminton monitoring.

6. AI and Sports Data

6.1. Application of AI in Sports Data Analysis

In this AI-driven era, AI, with its highly intelligent data analysis capabilities, is revolutionizing the way sports training is conducted. By analyzing the vast amounts of data generated by sports, AI can provide athletes with scientific training programs. Intelligent materials and sensors can collect real-time movement data, such as heart rate, speed, power, and posture, providing the basis for AI analysis. Based on these data, AI can identify an athlete's strengths and weaknesses, optimize training schedules, reduce sports injuries, and enhance competitive performance. For example, smart running shoes and wearables help personalize running training programs by recording stride length, stride frequency, and energy expenditure, and smart tennis rackets provide technical improvement suggestions by analyzing swing force and angle. AI-driven, data-based training programs not only improve training efficiency but also provide more accurate decision support for athletes and coaches, promoting sports training into a new era of intelligence and science.
By converting the various motion states in sports into electrical signals, denoising these signals, and transmitting them to computers for analysis, and by combining big data and artificial intelligence technology, tactics and motion states can be analyzed effectively. At the same time, by combining the individual data of each athlete into a large dataset, AI learning and big data analysis can support tactical formulation, training effect prediction, and other functions. The introduction of this technology has clearly pushed sports training towards a new data-driven model.
For example, in 2021, van Dijk et al. [81] reported a machine learning-based algorithm that uses inertial measurement units (IMUs) to obtain accurate estimates of body segment orientation. The experimental results show that the estimation accuracy of instantaneous trunk inclination can be significantly improved in sports settings, such as wheelchair sports, where IMUs are otherwise difficult to use. The algorithm therefore has important application value for evaluating body segment orientation in IMU-challenging situations.
In 2022, Havlucu et al. [82] reported the detection of athletes' psychological states through wearable devices and machine learning techniques. A combination of coach observation and machine learning was used: inertial measurement unit data were collected from tennis players, and a recurrent neural network was trained to predict the psychological state labels given by the coach. The experimental results show that this method can accurately predict the optimal psychological state, providing new ideas and technical support for the field of sports.
In 2024, Yang et al. [83] reported a comprehensive analytical framework based on a BP neural network model to analyze the factors affecting the performance of elite speed skaters. Eight elite speed skaters from the Chinese national team were selected, and data from a total of 403 races between 2013 and 2022 were analyzed. The results showed that the altitude of the ice rink had a positive impact on the performance of all athletes, and that a higher competition frequency also helped to improve performance. These findings provide important guidance for speed skaters and coaches, helping to optimize training and competition strategies and ultimately improve competitive level.
In 2024, Munoz-Macho et al. [84] reported a scoping review showing that AI can be applied to performance improvement, healthcare, technical and tactical support, talent identification, game prediction, business growth, and other aspects of competitive sports; football accounted for 67% of the included studies. Using various AI methods (such as decision trees, AdaBoost/XGBoost, and NNs), the training status and health of 2823 professional athletes were evaluated. The results provide practical guidance for researchers, practitioners, and policymakers exploring the dynamics of these complex systems in the future.
Musat et al. [85] reviewed and analyzed the transformative role of AI in predicting and preventing sports injuries across various disciplines in the last ten years. Various ML methods, such as Random Forests (RFs), Convolutional Neural Networks (CNNs), and Artificial Neural Networks (ANNs), have been used to analyze complex datasets, detect patterns, and generate predictive insights that enhance injury prevention strategies.

6.2. AI in Sports Data: Privacy, Data Integrity, and Anti-Tampering Risks

The application of artificial intelligence in sports data analysis is profoundly transforming the operational mode of the sports industry. However, this transformation has also brought many challenges, especially in terms of privacy and security, data integrity and tamper-proofing. First, the issue of privacy is particularly prominent. Sensitive information such as biometric data, personal performance indicators and location information generated by athletes through wearable devices is at risk of being abused or leaked. Secondly, data integrity is also a key issue that urgently needs to be addressed in sports data analysis. Sensor failures, data transmission errors, and technical bottlenecks in multi-source data fusion can all lead to the distortion of the basic data for analysis. Thirdly, the risk of malicious tampering should not be ignored either. In the highly competitive sports environment, some stakeholders may tamper with data for commercial or competitive purposes, affecting the authenticity and credibility of the analysis results.
To effectively deal with the above-mentioned threats, sports organizations should establish a multi-level and systematic data security protection system. Specific measures include adopting high-strength encryption technology to ensure the security of data transmission and storage, setting up an unalterable audit trail mechanism, and using AI algorithms to identify the authenticity of data to enhance the overall data management capability.
At the technical level, in 2024, Liu et al. [86] proposed PSDFPALSH, a privacy protection method based on enhanced Locality-Sensitive Hashing (LSH), providing a new solution for sports data analysis. The method converts high-dimensional sensitive data into low-dimensional privacy-preserving indexes, effectively protecting data privacy while maintaining high analysis efficiency and data utilization. Compared with traditional hashing techniques such as MinHash and SimHash, PSDFPALSH performs better in terms of mean absolute error, root mean square error, and computational efficiency. The method comprises three stages: converting athlete performance data and fan interaction data into hash indexes that comply with privacy norms; identifying approximate nearest-neighbor relationships based on these indexes; and finally carrying out predictive analysis to provide personalized suggestions. The method not only optimizes the design of the LSH function but also constructs the athlete privacy sequence through a multiple-hashing mechanism, effectively avoiding the risk of data leakage.
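The underlying principle of locality-sensitive hashing, publishing a short similarity-preserving index instead of the raw high-dimensional record, can be illustrated with a random-hyperplane (SimHash-style) sketch; this shows only the general idea and is not the PSDFPALSH algorithm itself.

```python
import numpy as np

class RandomHyperplaneLSH:
    """Map high-dimensional feature vectors to short binary signatures that preserve similarity."""
    def __init__(self, input_dim: int, n_bits: int = 32, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.planes = rng.normal(size=(n_bits, input_dim))   # one random hyperplane per signature bit

    def signature(self, x: np.ndarray) -> np.ndarray:
        # Only this compact signature is shared; the raw athlete/fan data stays local
        return (self.planes @ x > 0).astype(np.uint8)

    @staticmethod
    def hamming_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return 1.0 - float(np.mean(a != b))   # similar records yield similar signatures

lsh = RandomHyperplaneLSH(input_dim=128)
a, b = np.random.rand(128), np.random.rand(128)
print(lsh.hamming_similarity(lsh.signature(a), lsh.signature(b)))
```

Approximate nearest neighbors can then be found by comparing signatures, which enables cloud-side recommendation without exposing the original sensitive features.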
At the institutional and ethical level, Hegi et al. [87] reported a 2023 systematic review that sorted the ethical issues of AI applications in sports into four core topics: fairness and bias, transparency and explainability, privacy and data ethics, and the division of responsibility. The research indicates that AI may exacerbate inequality in sports and that algorithmic black boxes reduce the traceability and credibility of decisions; meanwhile, the ethical use and protection of athletes' and users' data urgently needs the support of a sound institutional framework. In addition, developers and users need to clearly define their respective responsibility boundaries to promote the sustainable development of AI in sports. The study adopted the PCC framework and the PRISMA-ScR checklist for systematic retrieval and analysis, ensuring comprehensiveness and representativeness. In the future, ethical standards applicable to different sports subfields should be formulated, and a unified AI regulatory mechanism should be established globally, with particular attention to applications in amateur sports, the improvement of dataset diversity, and the integration of ethical AI into sports governance structures.
In conclusion, PSDFPALSH provides a technical-level privacy solution for sports data analysis, enabling efficient prediction and personalized recommendations on cloud platforms while preserving user data privacy. To ensure data quality, standardized data verification mechanisms should be established and AI models capable of automatically detecting abnormal data should be developed to enhance the credibility of sports data.

7. The Application of Dual Drivers of Materials and AI in Sports Events

The collaborative innovation of materials science and artificial intelligence (AI) is profoundly transforming the field of sports. New materials provide fundamental support for the collection of sports data, while AI, through its powerful data processing and analysis capabilities, promotes the development of competitive sports towards greater efficiency, intelligence, and safety.
In the high jump event, Mascia et al. [88] reported an AI-integrated training system in 2023. The researchers used gyroscope sensors to collect dynamic data of athletes during jumps, extracted key biomechanical features with Lasso regularization, and achieved precise prediction of jump heights with a multi-layer perceptron neural network. This system significantly reduces the cost of traditional high-speed camera systems while providing a scientific basis for coaches to formulate personalized training plans.
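The two-stage pipeline described above, Lasso-based feature selection followed by a multi-layer perceptron regressor, can be sketched as follows with scikit-learn; the data files and feature dimensions are hypothetical placeholders, not the dataset of Mascia et al. [88].

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical arrays: gyroscope-derived biomechanical features and measured jump heights (cm)
X = np.load("gyro_features.npy")        # shape (n_jumps, n_features)
y = np.load("jump_heights_cm.npy")

# Stage 1: Lasso regularization keeps only the features with non-zero coefficients
lasso = make_pipeline(StandardScaler(), LassoCV(cv=5)).fit(X, y)
selected = np.flatnonzero(lasso.named_steps["lassocv"].coef_)

# Stage 2: a multi-layer perceptron is trained on the selected features only
mlp = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=0))
mlp.fit(X[:, selected], y)
```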
In running, breakthroughs in smart textile materials have brought revolutionary progress to sports monitoring. In 2022, Young et al. [89] achieved real-time monitoring of key indicators such as heart rate, muscle activity, and body temperature by integrating flexible sensors into sportswear through printing, coating, or weaving. Combined with AI gait analysis based on an inertial measurement unit, the system can accurately evaluate biomechanical parameters such as foot inversion and ground contact time within the speed range of 8 to 12 km/h, providing data support for running posture optimization and sports injury prevention. However, at running speeds above 14 km/h the system's accuracy declines, indicating a direction for future optimization.
In wheelchair competitive sports, the introduction of AI technology also shows great potential. Van Dijk et al. [81] developed an extended Madgwick filter algorithm that can accurately estimate the trunk inclination of athletes and is applicable to athletes of different skill levels and wheelchair types. Combined with the kinematic data of the wheelchair, the system can calculate key parameters such as the athlete's center-of-mass displacement, power output, and energy consumption, providing a quantitative basis for optimizing training programs, significantly improving training efficiency, and reducing the risk of sports injuries.
In the field of Australian rules football, Goodin et al. [90] conducted a study in 2021 aimed at identifying head and body impact events during matches. They developed a machine learning-based classifier and used customized instrumented mouthguards in combination with video verification to collect data, covering 21,348 records from 64 elite athletes in 119 matches of the Australian Football League (AFL) and AFL Women's (AFLW). The study used techniques such as low-pass filtering and cross-validation to process the data, and used the F1 score, true positives (TP), and true negatives (TN) as evaluation indicators. The classifier showed an accuracy of more than 95% on most data subsets. The study also adopted the SHAP (SHapley Additive exPlanations) method to explain the feature weights of the model, providing theoretical support for the design of sports equipment and injury prevention.
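The use of SHAP to attribute classifier decisions to individual features can be sketched as follows with synthetic placeholder data; the three features and their generating rule are illustrative assumptions, not the AFL study's feature set or model.

```python
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic placeholder features per mouthguard event (e.g., peak linear acceleration,
# peak angular velocity, event duration) with video-verified impact labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)

# SHAP assigns each feature an additive contribution to every individual prediction
explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X_test)
print("Mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0))
```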
In conclusion, the deep integration of materials science and AI is reshaping sports training and competition models in unprecedented ways. From data collection and motion analysis, to injury prevention and equipment optimization, this dual-driven mechanism is driving sports technology towards a smarter new era.

8. Conclusions and Future Prospects

The integration of materials science and AI has significantly advanced the development of sports, offering transformative applications and future opportunities. This review emphasizes the role of innovative materials in driving progress in sports from the historical perspective of the relationship between materials and sports, highlighting their contribution to the transformation of sports equipment and the innovation of sports disciplines. Most importantly, the advent of AI has had a profound impact on both the materials and sports sectors, accelerating materials innovation and giving rise to the concept of smart sports, as shown in Figure 15. We may boldly speculate that AI will provide a pivotal opportunity for the next revolution in sports. AI-driven sensing technologies can achieve precise data collection and analysis, facilitating smart talent selection, training optimization, and tactical decision-making. Furthermore, AI has expedited the development of new materials, optimized data processing, and improved the accuracy of sports scenario simulations, driving the advancement of sports technology towards higher efficiency and scalability.
The introduction of AI technology has fostered innovative training models in sport science; however, it also raises concerns regarding privacy and security, data integrity, and the risk of malicious tampering. These issues can result in flawed training methods, inaccurate tactical guidance, and even risks to the physical health and safety of athletes. The growing reliance on AI introduces further risks: over-dependence on AI systems may reduce the involvement of human expertise, diminishing the role of coaches and sports scientists, and the adoption of new technologies may meet resistance in practice. Fully understanding and addressing these challenges is crucial to maximizing the benefits of AI while minimizing its associated risks.

Author Contributions

Conceptualization, F.W. and J.L.; methodology, F.W. and S.J.; validation, F.W., S.J. and J.L.; formal analysis, F.W., S.J. and J.L.; investigation, F.W. and J.L.; resources, F.W. and J.L.; data curation, S.J. and F.W.; writing—original draft preparation, S.J.; writing—review and editing, F.W., S.J. and J.L.; supervision, F.W. and J.L.; project administration, F.W.; funding acquisition, F.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the MOE (Ministry of Education in China) project of Humanities and Social Sciences grant number 20YJC890027.

Acknowledgments

The authors thank the anonymous reviewers for their very useful guidance and suggestions to improve the structure of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Delaney, T.; Madigan, T. The Sociology of Sports: An introduction; McFarland: Jefferson, NC, USA, 2021. [Google Scholar]
  2. Savage, G. Formula 1 Composites Engineering. Eng. Fail. Anal. 2010, 17, 92–115. [Google Scholar] [CrossRef]
  3. Arderiu, A.; De Fondeville, R. Influence of advanced footwear technology on sub-2 hour marathon and other top running performances. J. Quant. Anal. Sports 2022, 18, 73–86. [Google Scholar] [CrossRef]
  4. Morales, A.T.; Tamayo Fajardo, J.A.; González-García, H. High-Speed Swimsuits and Their Historical Development in Competitive Swimming. Front. Psychol. 2019, 10, 2639. [Google Scholar] [CrossRef]
  5. Ma, W.; Liu, Y.; Yi, Q.; Liu, X.; Xing, W.; Zhao, R.; Liu, H.; Li, R. Table tennis coaching system based on a multimodal large language model with a table tennis knowledge base. PLoS ONE 2025, 20, e0317839. [Google Scholar] [CrossRef]
  6. Rowe, D. Sport, Culture and the Media: The Unruly Trinity, 1st ed.; Lü, P., Translator; Tsinghua University Press: Beijing, China, 2013. [Google Scholar]
  7. Memmert, D. (Ed.) Sports Technology: Technologies, Fields of Application, Sports Equipment and Materials for Sport; Springer: Berlin/Heidelberg, Germany, 2024. [Google Scholar] [CrossRef]
  8. Amzallag, N. From Metallurgy to Bronze Age Civilizations: The Synthetic Theory. Am. J. Archaeol. 2009, 113, 497–519. [Google Scholar] [CrossRef]
  9. Kidd, B. Onward to the Olympics: Historical Perspectives on the Olympic Games. Univ. Tor. Q. 2009, 78, 342–344. [Google Scholar] [CrossRef]
  10. Bennett, J.M.; Hollister, C.W. Medieval Europe: A Short History; McGraw-Hill: New York, NY, USA, 2006. [Google Scholar]
  11. Flory, P.J. Network Structure and the Elastic Properties of Vulcanized Rubber. Chem. Rev. 1944, 35, 51–75. [Google Scholar] [CrossRef]
  12. Wang, S.; Yu, S. Advances in Biomimetic Materials. Small Methods 2024, 8, 2301487. [Google Scholar] [CrossRef]
  13. An, S.; Yoon, S.S.; Lee, M.W. Self-Healing Structural Materials. Polymers 2021, 13, 2297. [Google Scholar] [CrossRef]
  14. Hong, J.-W.; Yoon, C.; Jo, K.; Won, J.H.; Park, S. Recent advances in recording and modulation technologies for next-generation neural interfaces. iScience 2021, 24, 103550. [Google Scholar] [CrossRef]
  15. Yang, K.; McErlain-Naylor, S.A.; Isaia, B.; Callaway, A.; Beeby, S. E-Textiles for Sports and Fitness Sensing: Current State, Challenges, and Future Opportunities. Sensors 2024, 24, 1058. [Google Scholar] [CrossRef] [PubMed]
  16. Drăgulinescu, A.; Drăgulinescu, A.-M.; Zincă, G.; Bucur, D.; Feieș, V.; Neagu, D.-M. Smart Socks and In-Shoe Systems: State-of-the-Art for Two Popular Technologies for Foot Motion Analysis, Sports, and Medical Applications. Sensors 2020, 20, 4316. [Google Scholar] [CrossRef] [PubMed]
  17. He, Q.; Zeng, Y.; Jiang, L.; Wang, Z.; Lu, G.; Kang, H.; Li, P.; Bethers, B.; Feng, S.; Sun, L.; et al. Growing recyclable and healable piezoelectric composites in 3D printed bioinspired structure for protective wearable sensor. Nat. Commun. 2023, 14, 6477. [Google Scholar] [CrossRef] [PubMed]
  18. Callister, W.D.; Rethwisch, D.G. Fundamentals of Materials Science and Engineering; Wiley: London, UK, 2000. [Google Scholar]
  19. Bello, S.A. Carbon-Fiber Composites: Development, Structure, Properties, and Applications. In Handbook of Nanomaterials and Nanocomposites for Energy and Environmental Applications; Kharissova, O.V., Torres-Martínez, L.M., Kharisov, B.I., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 63–84. [Google Scholar] [CrossRef]
  20. Ogura, M.; Fukushima, T.; Zeller, R.; Dederichs, P.H. Structure of the high-entropy alloy AlxCrFeCoNi: Fcc versus bcc. J. Alloys Compd. 2017, 715, 454–459. [Google Scholar] [CrossRef]
  21. Neelam, R.; Kulkarni, S.A.; Bharath, H.S.; Powar, S.; Doddamani, M. Mechanical response of additively manufactured foam: A machine learning approach. Results Eng. 2022, 16, 100801. [Google Scholar] [CrossRef]
  22. Bhaduri, A.; Gupta, A.; Graham-Brady, L. Stress field prediction in fiber-reinforced composite materials using a deep learning approach. Compos. Part B Eng. 2022, 238, 109879. [Google Scholar] [CrossRef]
  23. LeSar, R. Introduction to Computational Materials Science: Fundamentals to Applications; Cambridge University Press: Cambridge, UK, 2013. [Google Scholar]
  24. Louie, S.G.; Chan, Y.-H.; Da Jornada, F.H.; Li, Z.; Qiu, D.Y. Discovering and understanding materials through computation. Nat. Mater. 2021, 20, 728–735. [Google Scholar] [CrossRef]
  25. Ohno, K.; Esfarjani, K.; Kawazoe, Y. Computational Materials Science: From Ab Initio to Monte Carlo Methods, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2018. [Google Scholar] [CrossRef]
  26. Gao, L.; Lin, J.; Wang, L.; Du, L. Machine Learning-Assisted Design of Advanced Polymeric Materials. Acc. Mater. Res. 2024, 5, 571–584. [Google Scholar] [CrossRef]
  27. Schleder, G.R.; Padilha, A.C.M.; Acosta, C.M.; Costa, M.; Fazzio, A. From DFT to machine learning: Recent approaches to materials science—A review. J. Phys. Mater. 2019, 2, 032001. [Google Scholar] [CrossRef]
  28. Kohn, W.; Sham, L.J. Self-Consistent Equations Including Exchange and Correlation Effects. Phys. Rev. 1965, 140, A1133–A1138. [Google Scholar] [CrossRef]
  29. Rapaport, D.C. The Art of Molecular Dynamics Simulation; Cambridge University Press: Cambridge, UK, 2004. [Google Scholar]
  30. De Pablo, J.J.; Jackson, N.E.; Webb, M.A.; Chen, L.-Q.; Moore, J.E.; Morgan, D.; Jacobs, R.; Pollock, T.; Schlom, D.G.; Toberer, E.S.; et al. New frontiers for the materials genome initiative. NPJ Comput. Mater. 2019, 5, 41. [Google Scholar] [CrossRef]
  31. Curtarolo, S.; Setyawan, W.; Hart, G.L.; Jahnatek, M.; Chepulskii, R.V.; Taylor, R.H.; Wang, S.; Xue, J.; Yang, K.; Levy, O.; et al. AFLOW: An automatic framework for high-throughput materials discovery. Comput. Mater. Sci. 2012, 58, 218–226. [Google Scholar] [CrossRef]
  32. Saal, J.E.; Kirklin, S.; Aykol, M.; Meredig, B.; Wolverton, C. Materials design and discovery with high-throughput density functional theory: The open quantum materials database (OQMD). JOM 2013, 65, 1501–1509. [Google Scholar] [CrossRef]
  33. Levine, I.N.; Busch, D.H.; Shull, H. Quantum Chemistry; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2009. [Google Scholar]
  34. Winston, P.H. Artificial Intelligence; Addison-Wesley Longman Publishing Co., Inc.: Boston, MA, USA, 1984. [Google Scholar]
  35. Hjorth Larsen, A.; Jørgen Mortensen, J.; Blomqvist, J.; Castelli, I.E.; Christensen, R.; Dułak, M.; Friis, J.; Groves, M.N.; Hammer, B.; Hargus, C.; et al. The atomic simulation environment—A Python library for working with atoms. J. Phys. Condens. Matter 2017, 29, 273002. [Google Scholar] [CrossRef]
  36. Ong, S.P.; Richards, W.D.; Jain, A.; Hautier, G.; Kocher, M.; Cholia, S.; Gunter, D.; Chevrier, V.L.; Persson, K.A.; Ceder, G. Python Materials Genomics (pymatgen): A robust, open-source python library for materials analysis. Comput. Mater. Sci. 2013, 68, 314–319. [Google Scholar] [CrossRef]
  37. Bento, A.P.; Hersey, A.; Félix, E.; Landrum, G.; Gaulton, A.; Atkinson, F.; Bellis, L.J.; De Veij, M.; Leach, A.R. An open source chemical structure curation pipeline using RDKit. J. Cheminform. 2020, 12, 51. [Google Scholar] [CrossRef]
  38. Linstrom, P.J.; Mallard, W.G. The NIST Chemistry WebBook: A Chemical Data Resource on the Internet. J. Chem. Eng. Data 2001, 46, 1059–1063. [Google Scholar] [CrossRef]
  39. Abadi, M.; Barham, P.; Chen, J.; Chen, Z.; Davis, A.; Dean, J.; Devin, M.; Ghemawat, S.; Irving, G.; Isard, M.; et al. TensorFlow: A System for Large-Scale Machine Learning. In Proceedings of the 12th USENIX conference on Operating Systems Design and Implementation, Savannah, GA, USA, 2–4 November 2016. [Google Scholar]
  40. Paszke, A.; Gross, S.; Massa, F.; Lerer, A.; Bradbury, J.; Chanan, G.; Killeen, T.; Lin, Z.; Gimelshein, N.; Antiga, L.; et al. PyTorch: An Imperative Style, High-Performance Deep Learning Library. In Proceedings of the Advances in Neural Information Processing Systems 32, Vancouver Convention Centre, Vancouver, BC, Canada, 8–14 December 2019. [Google Scholar]
  41. Wu, Z.; Pan, S.; Chen, F.; Long, G.; Zhang, C.; Yu, P.S. A Comprehensive Survey on Graph Neural Networks. IEEE Trans. Neural Netw. Learn. Syst. 2021, 32, 4–24. [Google Scholar] [CrossRef]
  42. Skolnick, J.; Gao, M.; Zhou, H.; Singh, S. AlphaFold 2: Why It Works and Its Implications for Understanding the Relationships of Protein Sequence, Structure, and Function. J. Chem. Inf. Model. 2021, 61, 4827–4831. [Google Scholar] [CrossRef]
  43. Butler, K.T.; Davies, D.W.; Cartwright, H.; Isayev, O.; Walsh, A. Machine learning for molecular and materials science. Nature 2018, 559, 547–555. [Google Scholar] [CrossRef]
  44. Sha, W.; Edwards, K.L. The use of artificial neural networks in materials science based research. Mater. Des. 2007, 28, 1747–1752. [Google Scholar] [CrossRef]
  45. Behler, J. Perspective: Machine learning potentials for atomistic simulations. J. Chem. Phys. 2016, 145, 170901. [Google Scholar] [CrossRef] [PubMed]
  46. Sinha, P.; Joshi, A.; Dey, R.; Misra, S. Machine-Learning-Assisted Materials Discovery from Electronic Band Structure. J. Chem. Inf. Model. 2024, 64, 8404–8413. [Google Scholar] [CrossRef] [PubMed]
  47. Pei, Z.; Rozman, K.A.; Doğan, Ö.N.; Wen, Y.; Gao, N.; Holm, E.A.; Hawk, J.A.; Alman, D.E.; Gao, M.C. Machine-Learning Microstructure for Inverse Material Design. Adv. Sci. 2021, 8, 2101207. [Google Scholar] [CrossRef]
  48. Chan, C.H.; Sun, M.; Huang, B. Application of machine learning for advanced material prediction and design. EcoMat 2022, 4, e12194. [Google Scholar] [CrossRef]
  49. Wang, A.Y.-T.; Mahmoud, M.S.; Czasny, M.; Gurlo, A. CrabNet for Explainable Deep Learning in Materials Science: Bridging the Gap Between Academia and Industry. Integr. Mater. Manuf. Innov. 2022, 11, 41–56. [Google Scholar] [CrossRef]
  50. Du, H.; Hui, J.; Zhang, L.; Wang, H. Rational Design of Deep Learning Networks Based on a Fusion Strategy for Improved Material Property Predictions. J. Chem. Theory Comput. 2024, 20, 6756–6771. [Google Scholar] [CrossRef]
  51. Kresse, G.; Hafner, J. Ab initio molecular dynamics for liquid metals. Phys. Rev. B 1993, 47, 558–561. [Google Scholar] [CrossRef]
  52. Kang, P.-L.; Shang, C.; Liu, Z.-P. Large-Scale Atomic Simulation via Machine Learning Potentials Constructed by Global Potential Energy Surface Exploration. Acc. Chem. Res. 2020, 53, 2119–2129. [Google Scholar] [CrossRef]
  53. Behler, J.; Parrinello, M. Generalized Neural-Network Representation of High-Dimensional Potential-Energy Surfaces. Phys. Rev. Lett. 2007, 98, 146401. [Google Scholar] [CrossRef]
  54. Bartók, A.P.; Payne, M.C.; Kondor, R.; Csányi, G. Gaussian Approximation Potentials: The Accuracy of Quantum Mechanics, without the Electrons. Phys. Rev. Lett. 2010, 104, 136403. [Google Scholar] [CrossRef] [PubMed]
  55. Thompson, A.P.; Swiler, L.P.; Trott, C.R.; Foiles, S.M.; Tucker, G.J. Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials. J. Comput. Phys. 2015, 285, 316–330. [Google Scholar] [CrossRef]
  56. Wang, H.; Zhang, L.; Han, J.; E, W. DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics. Comput. Phys. Commun. 2018, 228, 178–184. [Google Scholar] [CrossRef]
  57. Kovács, D.P.; Oord, C.V.D.; Kucera, J.; Allen, A.E.A.; Cole, D.J.; Ortner, C.; Csányi, G. Linear Atomic Cluster Expansion Force Fields for Organic Molecules: Beyond RMSE. J. Chem. Theory Comput. 2021, 17, 7696–7711. [Google Scholar] [CrossRef]
  58. Hedman, D.; McLean, B.; Bichara, C.; Maruyama, S.; Larsson, J.A.; Ding, F. Dynamics of growing carbon nanotube interfaces probed by machine learning-enabled molecular simulations. Nat. Commun. 2024, 15, 4076. [Google Scholar] [CrossRef]
  59. McCandler, C.A.; Pihlajamäki, A.; Malola, S.; Häkkinen, H.; Persson, K.A. Gold–Thiolate Nanocluster Dynamics and Intercluster Reactions Enabled by a Machine Learned Interatomic Potential. ACS Nano 2024, 18, 19014–19023. [Google Scholar] [CrossRef]
  60. Chen, Z.; Li, D.; Liu, M.; Liu, J. Graph neural networks with molecular segmentation for property prediction and structure–property relationship discovery. Comput. Chem. Eng. 2023, 179, 108403. [Google Scholar] [CrossRef]
  61. Gilmer, J.; Schoenholz, S.S.; Riley, P.F.; Vinyals, O.; Dahl, G.E. Neural Message Passing for Quantum Chemistry. In Proceedings of the 34th International Conference on Machine Learning, Sydney, NSW, Australia, 6–11 August 2017. [Google Scholar]
  62. Schütt, K.T.; Sauceda, H.E.; Kindermans, P.-J.; Tkatchenko, A.; Müller, K.-R. SchNet–A deep learning architecture for molecules and materials. J. Chem. Phys. 2018, 148, 241722. [Google Scholar] [CrossRef]
  63. Yun, S.; Jeong, M.; Kim, R.; Kang, J.; Kim, H.J. Graph transformer networks. In Advances in Neural Information Processing Systems, Proceedings of the 2019 Conference on Neural Information Processing Systems, Vancouver, BC, Canada, 8–14 December 2019; Neural Information Processing Systems Foundation, Inc. (NeurIPS): San Diego, CA, USA, 2019. [Google Scholar]
  64. Louis, S.-Y.; Zhao, Y.; Nasiri, A.; Wang, X.; Song, Y.; Liu, F.; Hu, J. Graph convolutional neural networks with global attention for improved materials property prediction. Phys. Chem. Chem. Phys. 2020, 22, 18141–18148. [Google Scholar] [CrossRef]
  65. Lai, X.; Yang, P.; Wang, K.; Yang, Q.; Yu, D. MGRNN: Structure Generation of Molecules Based on Graph Recurrent Neural Networks. Mol. Inform. 2021, 40, 2100091. [Google Scholar] [CrossRef]
  66. Korolev, V.; Mitrofanov, A. Coarse-Grained Crystal Graph Neural Networks for Reticular Materials Design. J. Chem. Inf. Model. 2024, 64, 1919–1931. [Google Scholar] [CrossRef] [PubMed]
  67. Dong, Z.; Feng, J.; Ji, Y.; Li, Y. SLI-GNN: A Self-Learning-Input Graph Neural Network for Predicting Crystal and Molecular Properties. J. Phys. Chem. A 2023, 127, 5921–5929. [Google Scholar] [CrossRef]
  68. Zhao, Z.; Li, H.-F. Investigating Material Interface Diffusion Phenomena through Graph Neural Networks in Applied Materials. ACS Appl. Mater. Interfaces 2024, 16, 53153–53162. [Google Scholar] [CrossRef]
  69. Jing, H.; Guan, C.; Yang, Y.; Zhu, H. Machine learning-assisted design of AlN-based high-performance piezoelectric materials. J. Mater. Chem. A 2023, 11, 14840–14849. [Google Scholar] [CrossRef]
  70. Singh, K.; Adhikari, J.; Roscow, J. Prediction of the electromechanical properties of a piezoelectric composite material through the artificial neural network. Mater. Today Commun. 2024, 38, 108288. [Google Scholar] [CrossRef]
  71. Zhao, G.; Xu, T.; Fu, X.; Zhao, W.; Wang, L.; Lin, J.; Hu, Y.; Du, L. Machine-learning-assisted multiscale modeling strategy for predicting mechanical properties of carbon fiber reinforced polymers. Compos. Sci. Technol. 2024, 248, 110455. [Google Scholar] [CrossRef]
  72. Yuan, Z. Predicting mechanical behaviors of rubber materials with artificial neural networks. Int. J. Mech. Sci. 2023, 249, 108265. [Google Scholar] [CrossRef]
  73. Qian, H.; Zheng, J.; Wang, Y.; Jiang, D. Fatigue Life Prediction Method of Ceramic Matrix Composites Based on Artificial Neural Network. Appl. Compos. Mater. 2023, 30, 1251–1268. [Google Scholar] [CrossRef]
  74. Rocha, H.R.O.; Roukos, R.; Abou Dargham, S.; Romanos, J.; Chaumont, D.; Silva, J.A.L.; Wörtche, H. Optimizing a machine learning design of dielectric properties in lead-free piezoelectric ceramics. Mater. Des. 2024, 243, 113053. [Google Scholar] [CrossRef]
  75. Morand, L.; Iraki, T.; Dornheim, J.; Sandfeld, S.; Link, N.; Helm, D. Machine learning for structure-guided materials and process design. Mater. Des. 2024, 248, 113453. [Google Scholar] [CrossRef]
  76. Hung, T.-H.; Xu, Z.-X.; Kang, D.-Y.; Lin, L.-C. Chemistry-Encoded Convolutional Neural Networks for Predicting Gaseous Adsorption in Porous Materials. J. Phys. Chem. C 2022, 126, 2813–2822. [Google Scholar] [CrossRef]
  77. Bishara, D.; Xie, Y.; Liu, W.K.; Li, S. A State-of-the-Art Review on Machine Learning-Based Multiscale Modeling, Simulation, Homogenization and Design of Materials. Arch. Comput. Methods Eng. 2023, 30, 191–222. [Google Scholar] [CrossRef]
  78. Wang, Y.; Xu, S.; Bwar, K.H.; Eisenbart, B.; Lu, G.; Belaadi, A.; Fox, B.; Chai, B.X. Application of machine learning for composite moulding process modelling. Compos. Commun. 2024, 48, 101960. [Google Scholar] [CrossRef]
  79. Wang, Q.; Guan, H.; Wang, C.; Lei, P.; Sheng, H.; Bi, H.; Hu, J.; Guo, C.; Mao, Y.; Yuan, J.; et al. A wireless, self-powered smart insole for gait monitoring and recognition via nonlinear synergistic pressure sensing. Sci. Adv. 2025, 11, eadu1598. [Google Scholar] [CrossRef]
  80. Yuan, J.; Xue, J.; Liu, M.; Wu, L.; Cheng, J.; Qu, X.; Yu, D.; Wang, E.; Fan, Z.; Liu, Z.; et al. Self-powered intelligent badminton racket for machine learning-enhanced real-time training monitoring. Nano Energy 2024, 132, 110377. [Google Scholar] [CrossRef]
  81. Van Dijk, M.P.; Kok, M.; Berger, M.A.M.; Hoozemans, M.J.M.; Veeger, D.H.E.J. Machine Learning to Improve Orientation Estimation in Sports Situations Challenging for Inertial Sensor Use. Front. Sports Act. Living 2021, 3, 670263. [Google Scholar] [CrossRef]
  82. Havlucu, H.; Akgun, B.; Eskenazi, T.; Coskun, A.; Ozcan, O. Toward Detecting the Zone of Elite Tennis Players Through Wearable Technology. Front. Sports Act. Living 2022, 4, 939641. [Google Scholar] [CrossRef]
  83. Yang, Z.; Ke, P.; Zhang, Y.; Du, F.; Hong, P. Quantitative analysis of the dominant external factors influencing elite speed Skaters’ performance using BP neural network. Front. Sports Act. Living 2024, 6, 1227785. [Google Scholar] [CrossRef]
  84. Munoz-Macho, A.A.; Domínguez-Morales, M.J.; Sevillano-Ramos, J.L. Performance and healthcare analysis in elite sports teams using artificial intelligence: A scoping review. Front. Sports Act. Living 2024, 6, 1383723. [Google Scholar] [CrossRef]
  85. Musat, C.L.; Mereuta, C.; Nechita, A.; Tutunaru, D.; Voipan, A.E.; Voipan, D.; Mereuta, E.; Gurau, T.V.; Gurău, G.; Nechita, L.C. Diagnostic Applications of AI in Sports: A Comprehensive Review of Injury Risk Prediction Methods. Diagnostics 2024, 14, 2516. [Google Scholar] [CrossRef]
  86. Liu, P.; Li, X.; Zang, B.; Diao, G. Privacy-preserving sports data fusion and prediction with smart devices in distributed environment. J. Cloud Comput. 2024, 13, 106. [Google Scholar] [CrossRef]
  87. Hegi, H.; Heitz, J.; Kredel, R. Sensor-based augmented visual feedback for coordination training in healthy adults: A scoping review. Front. Sports Act. Living 2023, 5, 1145247. [Google Scholar] [CrossRef] [PubMed]
  88. Mascia, G.; De Lazzari, B.; Camomilla, V. Machine learning aided jump height estimate democratization through smartphone measures. Front. Sports Act. Living 2023, 5, 1112739. [Google Scholar] [CrossRef] [PubMed]
  89. Young, F.; Mason, R.; Wall, C.; Morris, R.; Stuart, S.; Godfrey, A. Examination of a foot mounted IMU-based methodology for a running gait assessment. Front. Sports Act. Living 2022, 4, 956889. [Google Scholar] [CrossRef]
  90. Goodin, P.; Gardner, A.J.; Dokani, N.; Nizette, B.; Ahmadizadeh, S.; Edwards, S.; Iverson, G.L. Development of a Machine-Learning-Based Classifier for the Identification of Head and Body Impacts in Elite Level Australian Rules Football Players. Front. Sports Act. Living 2021, 3, 725245. [Google Scholar] [CrossRef]
Figure 1. Materials and the historical development stages of sports.
Figure 2. Major breakthroughs in technology have driven revolutionary development in sports competitions.
Figure 4. The number of publications on machine learning materials, deep learning materials, neural network materials, and graph neural network materials over the last 10 years (Google Scholar).
Figure 5. Principle of the data-driven machine learning prediction model [27].
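To make the data-driven prediction principle in Figure 5 concrete, a minimal sketch is given below, assuming the scikit-learn library and an entirely synthetic table of material descriptors; the column meanings, data, and target relation are illustrative inventions, not values from the cited work.

```python
# Minimal sketch of a data-driven property-prediction model.
# The descriptor table below is synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Hypothetical descriptors: density, elastic modulus, filler fraction.
X = rng.uniform([0.9, 1.0, 0.0], [2.0, 300.0, 0.6], size=(200, 3))
# Hypothetical target: energy return (%), generated from a made-up relation plus noise.
y = 60 + 0.08 * X[:, 1] + 25 * X[:, 2] - 5 * X[:, 0] + rng.normal(0, 1.5, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("R^2 on held-out data:", round(r2_score(y_test, model.predict(X_test)), 3))
```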
Figure 6. (a) Establishment of a database for the target property. (b) Feature engineering of the molecular structure characteristics. (c) Preprocessing of the material data with unsupervised learning. (d) Training and performance evaluation of a machine learning model [46]. Copyright 2024 American Chemical Society.
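The four stages in Figure 6 can be mirrored in a short workflow. The sketch below assumes RDKit and scikit-learn are available; the SMILES strings, descriptor choice, and target values are hypothetical placeholders rather than data from the cited study.

```python
# Sketch of the Figure 6 workflow: (a) database, (b) feature engineering,
# (c) unsupervised preprocessing, (d) supervised training and evaluation.
# SMILES strings, descriptors, and target values are hypothetical placeholders.
import numpy as np
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# (a) A toy "database": molecules paired with a made-up target property.
database = {"CCO": 1.2, "CCC": 0.8, "c1ccccc1": 2.5, "CC(=O)O": 1.9,
            "CCN": 1.1, "CCCC": 0.7, "c1ccccc1O": 2.8, "CC(C)O": 1.0}

# (b) Feature engineering: a few simple RDKit descriptors per molecule.
def featurize(smiles: str) -> list:
    mol = Chem.MolFromSmiles(smiles)
    return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol), Descriptors.TPSA(mol)]

X = np.array([featurize(s) for s in database])
y = np.array(list(database.values()))

# (c) Unsupervised preprocessing (scaling + PCA) feeding (d) a regression model.
model = make_pipeline(StandardScaler(), PCA(n_components=2), Ridge(alpha=1.0))
scores = cross_val_score(model, X, y, cv=4, scoring="r2")
print("Cross-validated R^2:", scores.round(2))
```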
Figure 7. (a) The workflow from collecting image data and training the model to predicting new alloy compositions. (b) Training of the sampling model from microscopic experimental image data [47]. (c) Schematic diagram of the machine learning model, which consists of three sub-models: an encoder, a decoder, and a regression model. The output of the regression model is essentially the alloy concentration. During training, the image serves as both input and label, with the aim of minimizing the difference between the input and output images; the latent space is further reduced in dimensionality through kernel principal component analysis. The fine structure of the hidden layers (batch normalization, convolutional, dropout, and activation layers, among others) is not shown.
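The three-part architecture described in the Figure 7 caption can be sketched as follows. This is a highly simplified PyTorch stand-in, not the authors' model: the layer sizes, image shape, and training data are invented for illustration.

```python
# Simplified sketch of an encoder-decoder network with a regression head,
# mirroring the three sub-models described in Figure 7. All shapes and data are invented.
import torch
import torch.nn as nn

class AutoencoderRegressor(nn.Module):
    def __init__(self, latent_dim: int = 8):
        super().__init__()
        # Encoder: 1x32x32 image -> latent vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),   # -> 8x16x16
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),  # -> 16x8x8
            nn.Flatten(), nn.Linear(16 * 8 * 8, latent_dim),
        )
        # Decoder: latent vector -> reconstructed image.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 16 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (16, 8, 8)),
            nn.ConvTranspose2d(16, 8, 2, stride=2), nn.ReLU(),    # -> 8x16x16
            nn.ConvTranspose2d(8, 1, 2, stride=2),                # -> 1x32x32
        )
        # Regression head: latent vector -> predicted alloy concentration.
        self.regressor = nn.Sequential(nn.Linear(latent_dim, 16), nn.ReLU(), nn.Linear(16, 1))

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), self.regressor(z)

# Toy training step: the image is both input and reconstruction target (as in the caption),
# while the regression head is supervised with a synthetic concentration label.
model = AutoencoderRegressor()
images = torch.rand(4, 1, 32, 32)
concentrations = torch.rand(4, 1)
recon, pred = model(images)
loss = nn.functional.mse_loss(recon, images) + nn.functional.mse_loss(pred, concentrations)
loss.backward()
print("Combined loss:", float(loss))
```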
Figure 8. Machine learning potentials (MLPs) learn potential energies from quantum mechanical calculations or experimental data, enabling high-precision molecular dynamics simulations (e.g., of material formation reactions) at low computational cost [52]. Copyright 2020 American Chemical Society.
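In the simplest case, a machine learning potential amounts to regressing reference energies onto structural descriptors. The sketch below is a drastic simplification of the MLPs reviewed in [52]: the "reference method" is an analytic Lennard-Jones curve standing in for quantum mechanical calculations, and the descriptor is a single interatomic distance.

```python
# Toy illustration of the machine-learning-potential idea: fit a surrogate energy model
# to "reference" energies. The reference here is an analytic Lennard-Jones pair energy
# standing in for expensive quantum mechanical data; real MLPs use atomic-environment descriptors.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def reference_energy(r: np.ndarray) -> np.ndarray:
    """Stand-in for a quantum mechanical calculation: a Lennard-Jones pair energy."""
    return 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)

# Training set: interatomic distances and their reference energies.
r_train = np.linspace(0.95, 2.5, 40).reshape(-1, 1)
e_train = reference_energy(r_train).ravel()

# Fit a kernel ridge regression "potential" to the reference data.
mlp = KernelRidge(kernel="rbf", alpha=1e-6, gamma=10.0)
mlp.fit(r_train, e_train)

# Query the surrogate at new distances, as a molecular dynamics code would do at each step.
r_test = np.array([[1.12], [1.5], [2.0]])
print("Predicted energies:", mlp.predict(r_test).round(3))
print("Reference energies:", reference_energy(r_test).ravel().round(3))
```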
Figure 9. Machine learning force fields demonstrate outstanding accuracy in simulating the growth of carbon nanotubes at the atomic scale, enabling efficient nanosecond-scale simulations while accurately predicting the chirality of the resulting nanotubes [58].
Figure 10. Graph Neural Network-based material modeling incorporates intrinsic molecular structure information. (a) A molecular segmentation method divides molecules into clusters, with atoms as nodes and chemical bonds as edges, and converts them into graph data. (b) Atomic nodes exchange information within each molecular cluster, learning the characteristics of functional groups to form a comprehensive representation of the global structure. (c) Through message passing over the molecular segments and mask quantization, structural information that would otherwise be lost is retained, improving the interpretability of the model [60].
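The graph construction step in Figure 10a (atoms as nodes, bonds as edges) can be reproduced with a few lines of RDKit. The sketch below uses ethanol as an arbitrary example molecule and a deliberately minimal choice of node features; it builds the node list and edge index that a GNN framework would then consume.

```python
# Sketch of converting a molecule into graph data (atoms as nodes, bonds as edges),
# as in Figure 10a. Ethanol is an arbitrary example; the node features are minimal.
from rdkit import Chem

def mol_to_graph(smiles: str):
    mol = Chem.MolFromSmiles(smiles)
    # Node features: atomic number and degree of each atom.
    nodes = [(atom.GetAtomicNum(), atom.GetDegree()) for atom in mol.GetAtoms()]
    # Edges: each chemical bond contributes an undirected edge (stored in both directions).
    edges = []
    for bond in mol.GetBonds():
        i, j = bond.GetBeginAtomIdx(), bond.GetEndAtomIdx()
        edges.extend([(i, j), (j, i)])
    return nodes, edges

nodes, edges = mol_to_graph("CCO")  # ethanol
print("Node features (atomic number, degree):", nodes)
print("Edge index pairs:", edges)
```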
Figure 11. Flowchart of machine learning-assisted multi-scale modeling: (a) Establishing a database of material properties computed by molecular dynamics simulation. (b) Predicting material performance by combining the database with machine learning algorithms [71].
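A compressed view of this two-stage pipeline is sketched below, with stage (a) replaced by a toy analytic "simulation" for brevity and stage (b) by a gradient-boosting regressor; all quantities and functional forms are invented for illustration.

```python
# Compressed sketch of Figure 11: (a) build a property database from simulations,
# (b) train a machine learning model on it. The "simulation" here is an analytic stand-in.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)

def simulate_property(fiber_fraction: float, temperature: float) -> float:
    """Stand-in for a molecular dynamics run returning, e.g., an elastic modulus."""
    return 3.0 + 40.0 * fiber_fraction - 0.01 * temperature + rng.normal(0, 0.2)

# (a) Database of simulated conditions and resulting properties.
conditions = rng.uniform([0.1, 250.0], [0.7, 400.0], size=(100, 2))
properties = np.array([simulate_property(f, t) for f, t in conditions])

# (b) Train a surrogate model on the database and query it at a new condition.
surrogate = GradientBoostingRegressor(random_state=0).fit(conditions, properties)
print("Predicted modulus at (0.5, 300 K):", surrogate.predict([[0.5, 300.0]]).round(2))
```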
Figure 12. Resin flow data in fiber-reinforced composites were obtained through numerical simulation, and the machine learning model PixelRNN was then retrained and validated to predict different mold filling patterns. Preform permeability profile and part dimensions of the (a) dashboard panel and (b) B-pillar. An example of composite mould filling progression and its corresponding colour-coded timescale for the (c) dashboard panel and (d) B-pillar. (e) Schematic diagram depicting the operation of the PixelRNN during model training and validation [78].
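As a loose illustration of the sequence-prediction idea behind this workflow (not the PixelRNN architecture of [78]), the sketch below uses a plain LSTM over flattened frames, with synthetic binary masks standing in for mold-filling snapshots.

```python
# Loose illustration of predicting the next mold-filling frame from previous frames.
# This is NOT the PixelRNN of the cited work; it is a plain LSTM over flattened
# synthetic 8x8 binary frames representing a filling front advancing column by column.
import torch
import torch.nn as nn

SIZE = 8  # each frame is SIZE x SIZE pixels

def filling_sequence(steps: int) -> torch.Tensor:
    """Synthetic sequence: at step t, the first t+1 columns of the mold are filled."""
    frames = torch.zeros(steps, SIZE, SIZE)
    for t in range(steps):
        frames[t, :, : t + 1] = 1.0
    return frames.view(steps, SIZE * SIZE)

class NextFramePredictor(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=SIZE * SIZE, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, SIZE * SIZE)

    def forward(self, frames):                 # frames: (batch, time, SIZE*SIZE)
        out, _ = self.lstm(frames)
        return torch.sigmoid(self.head(out))   # predicted next frame at each time step

model = NextFramePredictor()
seq = filling_sequence(SIZE).unsqueeze(0)      # (1, 8, 64)
inputs, targets = seq[:, :-1], seq[:, 1:]      # predict frame t+1 from frames up to t
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):                           # short training loop on the toy sequence
    optimizer.zero_grad()
    loss = nn.functional.binary_cross_entropy(model(inputs), targets)
    loss.backward()
    optimizer.step()
print("Final training loss:", round(float(loss), 4))
```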
Figure 13. (A) Overall structure of the smart insole system. (B) Schematic diagram of the pressure-sensing mechanism: the nonlinear components of the sensor's mechanical and electrical responses cancel each other out, producing a linear sensing response. (C) Workflow of the smart insole system: pressure data collection, visualization of the plantar pressure distribution on mobile terminals, and data classification based on an SVM learning model. (D) Photographs of the smart insole system and the core sensing layer in their initial, folded, twisted, and stretched states [79].
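The SVM classification step in panel (C) can be sketched with scikit-learn as follows; the pressure feature vectors and activity labels are synthetic placeholders, not data from the cited insole.

```python
# Sketch of the SVM classification step of the smart insole workflow (Figure 13C).
# Pressure features and activity labels are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)

def synthetic_pressure(label: int, n: int) -> np.ndarray:
    """Fake 6-zone plantar pressure readings for three activities (0=stand, 1=walk, 2=run)."""
    base = np.array([[1, 1, 1, 1, 1, 1], [2, 1, 3, 1, 2, 1], [4, 2, 5, 2, 4, 2]])[label]
    return base + rng.normal(0, 0.4, size=(n, 6))

X = np.vstack([synthetic_pressure(label, 60) for label in range(3)])
y = np.repeat([0, 1, 2], 60)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("Activity classification accuracy:", round(clf.score(X_test, y_test), 3))
```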
Figure 14. An intelligent badminton racket capable of sensing and collecting information such as the impact position of the shuttlecock [80].
Figure 15. The innovation of materials technology has significantly promoted the evolution of sports equipment. With the deep integration of materials theory and artificial intelligence, various material descriptors and molecular characterization techniques have emerged, greatly promoting the research and development process of new materials. This advancement in materials science not only enhances the performance of sports equipment but also promotes breakthroughs in sports data collection and sensing technologies, laying a solid foundation for the digital transformation of sports.
Table 1. The principles of material function in signal acquisition.

Material Type | Principle | Application
Piezoelectric Materials | Generate charge variations under mechanical force | Monitoring foot pressure distribution and dynamic motion state during running
Piezoresistive Materials | Resistance changes with mechanical strain | Monitoring joint angles and subtle muscle activity changes
Magneto-resistive Materials | Magnetic materials respond to external magnetic field variations | Capturing displacement and direction in motion, commonly used in motion trajectory tracking
Deformation-sensitive Materials | Utilize strain effects and elastic recovery forces | Real-time monitoring of joint bending and dynamic deformation; widely used in joint activity monitoring, smart clothing, and high-precision detection of complex dynamic behaviors
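To make the piezoresistive principle in Table 1 concrete, the short sketch below converts a relative resistance change into strain via the standard relation ΔR/R = GF · ε; the gauge factor and resistance values are illustrative assumptions, not measured data.

```python
# Minimal sketch: converting a piezoresistive sensor reading into strain.
# The gauge factor and resistance values below are illustrative assumptions.

def strain_from_resistance(r_baseline: float, r_measured: float, gauge_factor: float) -> float:
    """Estimate strain from the relative resistance change, using dR/R = GF * strain."""
    delta_r = r_measured - r_baseline
    return (delta_r / r_baseline) / gauge_factor

# Example: a flexible strain gauge with GF = 2.0 whose resistance rises from 350 to 353.5 ohms.
strain = strain_from_resistance(r_baseline=350.0, r_measured=353.5, gauge_factor=2.0)
print(f"Estimated strain: {strain:.4f}")  # ~0.0050, i.e., 0.5% elongation
```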
Table 2. Material classification.

Type of Material | Advantages | Disadvantages
Carbon fiber | High strength and low weight; good corrosion resistance; good fatigue properties | Brittle and prone to breakage; high cost; difficult to process; difficult to recycle
Polymer materials | Lightweight and highly malleable; easy to form; good chemical resistance; good elasticity and comfort | Poor thermal stability, prone to deformation or decomposition; low mechanical strength, unable to withstand large loads; aging problems
Alloy materials | High strength and corrosion resistance; excellent processability; good high-temperature resistance | High density, unsuitable for lightweight designs; high cost; some alloys become brittle at low temperatures
Ceramic materials | High hardness and wear resistance; high-temperature stability; corrosion resistance and insulation | Brittle and prone to breakage; difficult to process; prone to cracking; heavy
Table 3. Sports equipment and their material classifications.

Sports Equipment | Primary Materials | Material Category | Key Applications
Basketball/Soccer/Volleyball | Rubber, PU/PVC leather | Polymer materials | Outer layer: PU synthetic leather; inner bladder: butyl rubber
Badminton Racket | Carbon fiber, titanium alloy | Composite + metallic | Frame: carbon fiber composite; shaft: titanium alloy or carbon fiber
Tennis Racket | Carbon fiber, Kevlar | Composite materials | Main body: carbon fiber-reinforced epoxy resin; some include Kevlar for toughness
Golf Club | Titanium alloy, carbon fiber | Metallic + composite | Clubhead: titanium alloy; shaft: carbon fiber composite
Bicycle Frame | Aluminum/carbon fiber/titanium | Metallic + composite | Entry-level: aluminum alloy; racing: carbon fiber; high-end: titanium alloy
Treadmill Belt | Rubber + nylon fiber | Polymer + composite | Surface: anti-slip rubber; base layer: nylon fiber reinforcement
Swimming Goggles | Polycarbonate (PC), silicone | Polymer materials | Lens: PC; seal: silicone
Skis | Wood + fiberglass + polyethylene | Composite materials | Core: wood; reinforcement: fiberglass; base: ultra-high-molecular-weight polyethylene
Dumbbells/Barbells | Cast iron, steel | Metallic materials | Main body: cast iron (chrome-plated); bar: chromium-molybdenum steel
Climbing Rope | Nylon, polyester | Polymer materials | Core: braided nylon fibers; outer sheath: polyester
Table Tennis Ball | Celluloid/ABS plastic | Polymer materials | Professional: celluloid; training: ABS plastic
Ice Skates | Stainless steel + carbon steel | Metallic materials | Blade: high-carbon stainless steel; holder: alloy steel
Sports Protective Gear (knee pads, etc.) | EVA foam + nylon fabric | Polymer + composite | Cushioning: EVA foam; outer layer: nylon/PU-coated fabric
Baseball Bat | Aluminum alloy/maple wood/composite | Metallic + natural + composite | Professional: aluminum alloy; traditional: maple wood; advanced: carbon fiber + fiberglass composite
Climbing Carabiners | Aluminum alloy | Metallic materials | Aerospace-grade aluminum alloy
Table 4. Common computational toolkits.

Toolkit | Description
ASE (Atomic Simulation Environment) [35] | Widely used for atomistic simulations, supporting various quantum chemistry and molecular dynamics engines
Pymatgen [36] | Powerful tool for materials science, mainly for crystal structure analysis, electronic structure processing, and data generation
RDKit [37] | Open-source toolkit for cheminformatics and molecular modeling, widely used for molecule manipulation, reaction simulation, and property prediction
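As a brief illustration of how these toolkits are typically used together, the minimal sketch below assumes local installations of ASE, Pymatgen, and RDKit; the copper crystal, the built-in EMT calculator, and the caffeine molecule are arbitrary illustrative choices.

```python
# Minimal sketch combining the three toolkits listed above.
# The chosen crystal (fcc Cu) and molecule (caffeine SMILES) are arbitrary examples.
from ase.build import bulk
from ase.calculators.emt import EMT
from pymatgen.core import Lattice, Structure
from rdkit import Chem
from rdkit.Chem import Descriptors

# ASE: build a copper crystal and evaluate its energy with the built-in EMT calculator.
atoms = bulk("Cu", "fcc", a=3.6)
atoms.calc = EMT()
print("EMT energy (eV):", atoms.get_potential_energy())

# Pymatgen: describe the same lattice as a Structure object for structure analysis.
structure = Structure(Lattice.cubic(3.6), ["Cu"] * 4,
                      [[0, 0, 0], [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]])
print("Pymatgen formula:", structure.composition.reduced_formula)

# RDKit: parse a small molecule and compute a simple descriptor.
mol = Chem.MolFromSmiles("CN1C=NC2=C1C(=O)N(C(=O)N2C)C")  # caffeine
print("Molecular weight:", Descriptors.MolWt(mol))
```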
Table 5. Databases for materials science.

Database | Description
OQMD | Open quantum materials database focused on DFT calculations
Materials Project | Provides computational and experimental materials data
CCDC | Database of small-molecule crystal structures
PubChem | Largest chemical molecule database, with properties and bioactivity data
NIST Chemistry [38] | Provides thermodynamic and spectral data from NIST
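As an example of programmatic access to such databases, the sketch below queries PubChem's public PUG REST interface for a single property; the compound name and requested property are illustrative choices, and error handling is kept minimal for brevity.

```python
# Minimal sketch: querying the PubChem PUG REST API for a molecular property.
# The compound ("aspirin") and requested property are illustrative choices.
import requests

def pubchem_molecular_weight(compound_name: str) -> float:
    """Fetch the molecular weight of a compound by name from PubChem."""
    url = (
        "https://pubchem.ncbi.nlm.nih.gov/rest/pug/compound/name/"
        f"{compound_name}/property/MolecularWeight/JSON"
    )
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    data = response.json()
    return float(data["PropertyTable"]["Properties"][0]["MolecularWeight"])

if __name__ == "__main__":
    print("Aspirin molecular weight:", pubchem_molecular_weight("aspirin"))
```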
Table 6. Types and methods of the AI models.

Model Form | Content | Features | Disadvantages
Data-driven Machine Learning Prediction Model [43,44] | Trains machine learning models on large-scale experimental and computational data to establish a mapping between material structure and properties | Fast prediction of material properties; efficient handling of complex, multi-dimensional data to discover latent patterns | Strong dependence on high-quality data; data availability and balance directly affect model performance
Machine Learning Potentials [45] | Constructs potential functions (machine learning potentials, MLPs) through machine learning techniques to replace traditional quantum mechanical methods for simulating atomic interactions | Suitable for large-scale molecular dynamics simulations involving electron transfer and chemical bond breaking; applicable to studying reaction mechanisms and dynamic evolution in material systems at the thousand- to ten-thousand-atom scale | Requires a large amount of high-precision computational data (e.g., DFT data) for training; model transferability is limited, and cross-system predictive capability needs improvement
Graph Neural Network-Based Material Modeling [41] | Graph Neural Networks (GNNs) directly model the molecular or crystal structure of a material (usually represented as a graph) to predict its chemical and physical properties | Incorporates structural information from the material into the prediction model, helping to accurately describe complex chemical bonds and interactions; offers better physical meaning and interpretability than conventional neural networks | High computational resource requirements for constructing and optimizing graphs
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
