Article

A Novel Fabric Strain Sensor Array with Hybrid Deep Learning for Accurate Knee Movement Recognition

1 School of Future Technology, South China University of Technology, Guangzhou 511422, China
2 Research Institute for Intelligent Wearable Systems, The Hong Kong Polytechnic University, Hong Kong 999077, China
3 School of Fashion and Textiles, The Hong Kong Polytechnic University, Hong Kong 999077, China
4 School of Textile Science and Engineering, Wuyi University, Jiangmen 529020, China
* Author to whom correspondence should be addressed.
Micromachines 2026, 17(1), 56; https://doi.org/10.3390/mi17010056
Submission received: 1 December 2025 / Revised: 29 December 2025 / Accepted: 29 December 2025 / Published: 30 December 2025
(This article belongs to the Special Issue Wearable Biosensors: From Materials to Systems)

Abstract

This paper presents a novel lightweight fabric strain sensor array designed for comprehensive knee joint monitoring. The sensor system features a two-layer design incorporating eight strategically positioned sensing elements, enabling effective spatial mapping of strain distribution across the knee during movement. Compared with simpler sensor layouts, this configuration offers advantages in capturing complex multi-axis kinematics (flexion/extension, rotation) and localized tissue deformation. To evaluate the system, ten subjects performed three distinct activities (seated leg raise, standing, walking), generating resistance data from the sensors. A hybrid deep learning model (CNN + BiLSTM + Attention) processed these data and achieved a classification accuracy of 95%. This accuracy is attributed to the model’s ability to extract spatial-temporal features and leverage long-term dependencies within the time-series sensor data. Furthermore, channel attention analysis within the deep learning model identified sensors 2, 4, and 6 as major contributors to classification performance. The results demonstrate the feasibility of the proposed fabric sensor array for accurately recognizing fundamental knee movements. Despite the limited diversity of the tested postures, this system holds significant promise for future applications in rehabilitation monitoring, sports science analytics, and personalized healthcare within the medical and athletic domains.

1. Introduction

Wearable electronic sensors have revolutionized the fields of healthcare [1], sports science [2], and personalized medicine [3] by enabling continuous monitoring of physiological and biomechanical parameters. Among these, knee joint monitoring is crucial for rehabilitation support, injury prevention, and chronic disease management, as it offers precise measurement of movement angles, forces, and load distribution [4]. Knee joint monitoring systems use wearable sensors to capture detailed kinematic and kinetic information during daily activities, enabling accurate assessment of gait patterns and joint stability. When combined with modern soft materials, these devices offer comfortable long-term use, real-time feedback, and personalized rehabilitation or injury-risk predictions.
Early studies in knee joint monitoring deployed traditional rigid sensors, such as Inertial Measurement Units (IMUs) [5] and optical systems [6]. However, these face limitations in comfort, seamless integration, and long-term durability, often leading to motion artifacts and environmental interference that compromise measurement accuracy [7]. IMU sensors, in particular, are rigid and offer little comfort. The camera-based systems commonly used for human motion capture combine optical imaging data with computer modeling analysis. They can monitor movement with high precision but are confined to laboratory settings, as they cannot be worn and the systems are cumbersome [8]. Other non-textile flexible sensors face their own challenges in knee joint monitoring: hydrogel strain sensors [9,10] struggle with garment integration and are non-washable; skin-adhesive patches and epidermal electronics [11,12] may cause skin irritation with prolonged wear; and thin-film strain gauges [13] have limited strain measurement ranges. Moreover, flexible sensors often require frequent recalibration, and their performance can be influenced by factors such as temperature, humidity, sweat, and mechanical wear, limiting their consistency over time [14]. For instance, a review on fabric-based sensors highlights that “accuracy, reliability, calibration, durability, interference, cost, and compatibility” are the key challenges that need to be overcome [15].
Recently, fabric-based sensors have emerged as a promising alternative due to their flexibility, comfort, and adaptable ergonomic integration into clothing [5], making them ideal for long-term knee joint monitoring. For example, Shyr et al. developed a textile-based wearable sensing device for monitoring the flexion angles of elbow and knee movements via a strain-sensitive textile placed over the joint region [6]. More recently, Galli et al. presented a fully textile capacitive sensing system for knee angle monitoring, achieving strong correlation (R2 up to ~0.99) with optical motion capture in preliminary tests [8]. These works collectively show that fabric-based sensors conform to the previously stated wearability advantages (flexibility, garment integration, comfort) and that they have been preliminarily evaluated in knee joint or general joint motion monitoring.
Despite the above advances, fabric-based sensors for knee monitoring still face significant limitations: calibration drift and hysteresis remain concerns, particularly under repeated stretching and washing cycles [9]; sensor placement and movement relative to the skin/garment (slippage) can degrade accuracy; the strain or angle range may still be limited for high-intensity or complex movements (e.g., pivoting, lateral cut) and integration into truly everyday wearable garments (with laundering, sweat, multi-user variability) is still nascent [8].
In short, fabric-based strain sensors are a promising solution for knee joint motion monitoring, but they remain limited in their current form. These sensors provide valuable data on strain and angle changes at specific points of the knee, yet they are often constrained by their lack of spatial coverage [8]. Most studies [16] on fabric strain sensors place a single sensor over the knee, which fails to provide a comprehensive view of the strain distribution across the entire joint surface. While multiple sensors arranged in an array could offer a more complete picture, this approach introduces complexities such as managing the intricate wiring and signal processing of multiple sensors [8,16,17].
The dual-layer FSS (Flexible Strain Sensor) array [18] developed previously in our research group, however, represents a breakthrough, incorporating built-in wiring within the array itself, which simplifies the setup and offers an advanced solution for continuous, multi-point monitoring of knee movement. The flexible printed fabric strain sensors offer distinct advantages, rendering them suitable for both clinical and sports applications. These sensors can be readily integrated into clothing or wearable devices, offering lightweight properties alongside a substantial strain measurement range. They enable long-term monitoring of joint movements, such as the dynamic changes in the knee during exercise. Their suitability for mass production and cost-effectiveness enhance their potential for everyday applications, presenting an exciting new avenue for future research and practical deployment.
However, despite their great potential, these fabric-based sensors still face challenges regarding their ability to provide precise, real-time feedback and accurate long-term monitoring without the support of advanced data processing techniques. Without algorithms to process sensor data effectively, fabric-based sensors lack the ability to offer real-time analytics, predictive insights, or detailed movement analysis, which limits their applicability in clinical and sports environments. It is worth noting that the integration of artificial intelligence (AI) and machine learning (ML) algorithms has shown great potential to overcome these limitations [1]. Across various fields, AI has already enhanced sensor-based technologies [19], improving the accuracy of heart rate monitoring [20], gait analysis [21], and injury prediction [22]. For example, deep learning models like Convolutional Neural Networks (CNNs) and Long Short-Term Memory networks (LSTMs) have been successfully applied in gait analysis to detect steps, classify walking patterns, and even predict fall risks [23]. Similarly, edge computing is being leveraged to process sensor data locally on wearable devices, providing real-time feedback without relying on cloud computing [24]. These AI and data processing technologies can enhance the capabilities of fabric-based strain sensors by enabling precise strain distribution mapping across the knee joint, improving the accuracy of motion analysis, and providing immediate feedback during activities. Furthermore, by analyzing long-term sensor data, AI can offer predictive insights into joint health and injury risks, making it a valuable tool for both rehabilitation and athletic performance [25].
Therefore, the combination of FSS arrays with AI-driven algorithms will not only address the hardware limitations of fabric-based sensors [16,26,27,28,29,30,31,32,33,34,35,36,37,38,39] but will also significantly improve their functionality [4,40,41,42,43,44,45,46], providing more accurate, dynamic, and actionable data [47,48,49,50,51,52] for knee joint monitoring [53,54,55,56,57,58,59,60,61] in real-world, long-term applications. Furthermore, the research into the application of wearable sensors and ferroelectric nanogenerators in the fields of sports [62] and medicine [63] is continuously expanding. Such exploration serves as a guarantee for obtaining more accurate and reliable data [64] in practical engineering applications.
In this regard, this paper introduces a novel lightweight fabric strain sensor array with a two-layer design and eight strategically placed sensing elements for comprehensive knee joint monitoring, enabling effective spatial mapping of strain distribution during movement. A hybrid deep learning model (CNN + BiLSTM + Attention) achieved 95% accuracy by extracting spatial-temporal features and leveraging long-term dependencies in the sensor data, with channel attention analysis highlighting sensors 2, 4, and 6 as key contributors. Despite the limited diversity of tested postures, the system demonstrates promise for rehabilitation monitoring, sports analytics, and personalized healthcare applications.

2. Materials and Methods

2.1. Design and Fabrication of Knee Fabric Sensor

Traditional knee sensors typically combine multiple sensor units of different types, whereas the knee sensing system we designed requires only two layers, i.e., an upper and a lower layer, to form a fabric strain sensor array. For specifics on the sensor preparation process, refer to our previous works [18,65]. The design of the knee fabric sensor is shown in Figure 1. The sensor array includes 8 strain sensing elements arranged in two columns.
The first layer functions as the strain sensing element, typically composed of a conductive composite coated onto a fabric substrate, which directly transduces mechanical deformation into measurable electrical signals. The substrate layer, constructed from an elastic and breathable knitted fabric, ensures conformal contact with the knee surface, thereby maintaining user comfort during prolonged wear. The second layer is made from the same conductive material, electrically connecting rows of strain sensing elements. This two-layer configuration not only simplifies the fabrication process compared to conventional multi-unit sensor arrays but also enhances mechanical flexibility and facile integration with wearable garments. As a result, the proposed knee fabric sensor achieves a lightweight, low-profile design capable of reliable and repeatable performance. The configuration also minimizes interfacial mismatches and assembly steps while facilitating large-area production through scalable textile manufacturing techniques such as screen printing, dip coating, or electrospinning. In addition, by eliminating the need for rigid sensor modules or multiple heterogeneous sensing units, the device achieves a thinner profile, lighter weight, and higher integration with wearable garments. Such a streamlined structure is particularly advantageous for dynamic knee monitoring in sports, rehabilitation, and daily activity tracking, where unobtrusiveness, repeatability, and durability are essential for both user compliance and measurement reliability.
The materials and fabrication process of this fabric strain sensor followed our previous work [18,65], and the details are presented as follows. The conductive composite was prepared by uniformly mixing carbon black (3.0 g), room-temperature vulcanizing silicone rubber (30.3 g), and silicone oil (45.0 g) in a plastic beaker, followed by stirring in a blender at 400 rpm for 30 min. The substrate material is a plain-knit fabric composed of 70% polyester fiber and 30% Lycra fiber. To form the conductive pattern, the composite was screen-printed onto both sides of the substrate, and vertical interconnect access (vias) were punched at designated positions and filled with the composite to establish electrical connections between the two sides. The final fabric sensor array was obtained after curing in an oven at 100 °C for 1 h. The mixer used in sensor manufacturing was purchased from Shanghai Lichen Instrument Technology Co., Ltd. (Shanghai, China). The printing machine used in sensor manufacturing was purchased from Dongguan Deliou Precision Equipment Co., Ltd. (Dongguan, China).
The sensor comprises a single continuous conductive pattern printed in one step, which integrates two functional regions: a sensing part and a connection part. The sensing part is a meandering trace with an effective length of 221 mm and a width of 2 mm, designed to achieve a high aspect ratio (>110) and thus a high initial resistance for strain sensing. The connection part consists of wider rectangular areas (20 mm × 20 mm) with an aspect ratio of 1, providing a low-resistance path for reliable electrical interfacing.
Notably, the sensor array is made from a uniform composite material, in which a high loading of carbon black particles leads to the formation of aggregates that constitute the conductive network within the silicone elastomer matrix. Electrons can tunnel through the junctions between adjacent carbon black aggregates when the inter-aggregate distance is sufficiently small. Under strain deformation, this distance increases, resulting in a reduction in the electron tunneling probability. When a readout circuit is connected across the electrodes, this phenomenon manifests as a change in electrical resistance.
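The tunneling mechanism described above can be sketched with a toy model: junction resistance grows roughly exponentially with the inter-aggregate gap, so a strain that widens the gap increases the measured resistance. All parameter values below (base resistance, decay constant, linear gap–strain coupling) are illustrative assumptions, not fitted material properties.

```python
import math

def tunneling_resistance(gap_nm, r0=1.0, decay_per_nm=2.0):
    """Toy model: junction resistance grows exponentially with the
    inter-aggregate gap (simplified tunneling relation).
    r0 and decay_per_nm are illustrative, not fitted, parameters."""
    return r0 * math.exp(decay_per_nm * gap_nm)

def relative_resistance_change(strain, gap0_nm=1.0, decay_per_nm=2.0):
    """Assume the inter-aggregate gap grows linearly with applied strain:
    gap = gap0 * (1 + strain). Returns dR/R0 of a single junction."""
    r_initial = tunneling_resistance(gap0_nm, decay_per_nm=decay_per_nm)
    r_strained = tunneling_resistance(gap0_nm * (1.0 + strain),
                                      decay_per_nm=decay_per_nm)
    return (r_strained - r_initial) / r_initial

# Resistance rises monotonically with strain, as the text describes.
for eps in (0.0, 0.1, 0.2):
    print(f"strain={eps:.1f}  dR/R0={relative_resistance_change(eps):.3f}")
```

Under this assumed model the response is monotonic in strain, which is the qualitative behavior needed for resistive strain readout.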
Considering the size of the knee and the actual wearing experience, the design of 8 sensors can enable more accurate monitoring and recognition of lower-limb movement postures. To address its complex multi-axis movements (primarily flexion/extension, accompanied by internal/external rotation and adduction/abduction), the sensor array is designed uniaxially along the distal direction to be able to precisely capture joint angle changes as well as to monitor patellar trajectory and tibiofemoral joint contact force distribution. Considering the critical role of ligaments (such as the anterior cruciate ligament (ACL), posterior cruciate ligament (PCL), medial collateral ligament (MCL), and lateral collateral ligament (LCL)) in stabilizing the joint and transmitting loads, strain sensors are integrated to non-invasively assess their tension state.
In our design, both the sensing and connecting sections are fabricated using the same carbon black/silicone elastomer (CB/SE) composite material via a single-step screen printing process. The key to ensuring the connecting section does not interfere with the strain response lies in the specifically designed aspect ratio (length/width) which was chosen meticulously.
The initial resistance R0 of the conductive tracks follows the geometric resistance relation R ∝ L/W, where L and W denote the length and width of the conductive film. The sensing section employs a serpentine structure (effective length = 221 mm, width = 2 mm, aspect ratio > 110), while the connection section features a low aspect ratio design (length = width = 20 mm, aspect ratio = 1). This results in Rs0 ≫ Rc0, meaning the initial resistance of the sensing section dominates the total resistance. Consequently, under strain, the relative resistance change ΔR/R0 is contributed primarily by the sensing section, while the connection section functions mainly as a low-resistance interconnect. The resistance change in the connection part can therefore be reasonably neglected in the overall resistance change.
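As a quick check of the Rs0 ≫ Rc0 argument, the R ∝ L/W relation can be evaluated for the two geometries stated above (assuming, for illustration only, a unit sheet resistance and uniform film thickness):

```python
def track_resistance(length_mm, width_mm, sheet_resistance_ohm=1.0):
    """R = R_sheet * (L / W): resistance of a printed track of uniform
    thickness, counted in 'squares' times an assumed sheet resistance."""
    return sheet_resistance_ohm * (length_mm / width_mm)

# Geometries from the text: serpentine sensing trace vs. square pad.
r_sense = track_resistance(221, 2)   # aspect ratio > 110
r_conn = track_resistance(20, 20)    # aspect ratio = 1

# The sensing section dominates the series resistance by roughly two
# orders of magnitude, so the interconnect contribution is negligible.
print(r_sense / r_conn)
```

Whatever the actual sheet resistance of the CB/SE composite, the ratio depends only on geometry, which is why the aspect-ratio design alone guarantees the dominance of the sensing section.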
The eight sensors in the first layer are divided into four rows, with two sensors in each row, and they are numbered: Sensor 1, Sensor 2, and so on, as illustrated in Figure 1. The design of the second layer is intended to electrically connect the sensors in the first layer. An array of eight sensors can simultaneously measure the magnitude of strain at each position, enabling more detailed measurements within the covered surface area. In this way, the system can generate a detailed spatial strain map. This mapping capability is crucial for obtaining comprehensive bio-mechanical characteristics of joint movement, tissue deformation, or applied force, revealing subtle differences that single-point sensors may overlook. Additionally, a key aspect of our analysis is to investigate the relative contribution of each sensor to the overall feature set. In subsequent studies, we will present a detailed feature importance analysis. This analysis will quantify and rank the signal importance of sensors 1 to 8 in characterizing the monitored phenomenon (e.g., specific joint angles, gait phases, or pathological states). Mapping this feature importance distribution is of critical significance: it empirically validates the functional advantages of our specific fabric sensor design and layout strategy. By identifying which sensors provide the most discriminative or reliable information, we can demonstrate the effectiveness of the array configuration in capturing relevant bio-mechanical features and may provide a basis for optimizing sensor placement in future knee joint monitoring applications.

2.2. Data Collection and Experimentation

Ten male subjects were recruited for the test. Before participating in the test, the subjects provided their informed consent. The test included three parts, in which the subjects sat and lifted their legs, stood, and walked. Table 1 presents the height, weight, and age information of the subjects.
In total, this test produced 30 experimental trials per movement category, providing a diverse dataset that captures inter-individual variations in knee joint dynamics as well as intra-individual variability across repeated sessions. This intra-individual variability may stem from factors such as fatigue, environmental conditions, or subtle postural differences, and it helps illustrate the knee joint’s behavioral characteristics under varying conditions. By integrating both inter- and intra-individual variability, the dataset is suitable for training artificial intelligence models that require both cross-user generalization capabilities and sensitivity to fluctuations within an individual’s repeated performance.
As shown in Figure 2, the experimental process was as follows: when a subject first put on the knee pad, the location of the fabric strain sensor was adjusted to make sure the knee was fully covered. In addition, because the periphery of the sensor array was sewn onto the elastic fabric kneepad before the experiment, the two ends of the sensor array maintained their relative positions to the thigh and the calf, respectively.
Supplementary instructions for each movement are as follows. Seated leg raise (kick): the movement is performed within a range starting from a 90° knee bend and extending outward to approximately 180° (i.e., nearly fully straightened with slight hyperextension). Each participant performs 5 repetitions. The rhythm for each repetition is approximately 2 s to lift the leg and approximately 2 s to return to the starting position (each complete cycle takes about 4 s), with a 15 s rest between sets. If the subject feels discomfort, the angle or number of repetitions may be reduced. Standing task: participants maintain a natural standing posture for 1 min (with only minor adjustments to position, avoiding movement as much as possible) to assess sustained weight-bearing and postural control. Walking task: continuous walking for 1 min. For quantification and comparison with subsequent results, normal walking speed is estimated at approximately 1.2 m/s (about 72 m/min), corresponding to a step frequency of approximately 100 steps/min.
After each posture test, a CSV (Comma-Separated Values) file of the 8 sensor resistance values is generated and saved before proceeding to the next posture, until the data collection for all three actions is complete. To ensure consistency, each action is performed five times by every tester, and the corresponding sensor outputs are systematically recorded. The collected data are then organized into structured datasets, labeled according to action category, tester, and trial sequence, thereby facilitating subsequent data preprocessing, model training, and performance evaluation.
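A minimal sketch of how such trial files might be parsed and labeled is shown below. The column names (`sensor_1` … `sensor_8`) and the label encoding are hypothetical, since the paper does not specify the CSV schema; only Python’s standard library is used.

```python
import csv
import io

LABELS = {"kick": 0, "stand": 1, "walk": 2}  # hypothetical encoding

def parse_trial(csv_text, label):
    """Parse one trial's CSV (8 resistance columns per sample row) into
    a (samples, label) pair. Column names are assumed, not from the paper."""
    reader = csv.DictReader(io.StringIO(csv_text))
    samples = [[float(row[f"sensor_{i}"]) for i in range(1, 9)]
               for row in reader]
    return samples, LABELS[label]

# Two synthetic sample rows standing in for real resistance readings.
header = ",".join(f"sensor_{i}" for i in range(1, 9))
rows = ["100.0,101.2,99.8,100.5,98.9,100.1,99.5,100.7",
        "102.3,103.0,100.1,101.8,99.2,100.9,99.8,101.5"]
samples, y = parse_trial(header + "\n" + "\n".join(rows), "kick")
print(len(samples), len(samples[0]), y)  # 2 8 0
```

In practice, iterating such a parser over the files of all testers and trials yields the labeled dataset used for preprocessing and model training.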

2.3. Posture Examples and Corresponding Resistive Line Diagram Results

During the experiment, we found that the most significant resistance change occurred during the seated leg raise, followed by walking and then standing. As illustrated in Figure 3, the three distinct movements correspond to three separate resistance variation curves. Each action exhibits corresponding differences in the amplitude, rate of change, and waveform characteristics of the resistance response, which align with the actual movement sequences; this provides direct evidence of the sensor’s accuracy and reliability in motion recognition. By comparing the morphological characteristics of the curves, the different actions can be clearly distinguished. This demonstrates that the sensor not only captures strain signals generated by joint movements in real time but also maintains highly consistent response characteristics across repeated experiments, further validating its stability and repeatability in motion monitoring and pattern recognition.
To ensure consistency of sensor attachment during the study, we employed a controlled experimental procedure. The fabric strain sensor array was carefully positioned on each subject’s knee and adjusted to ensure complete coverage of the knee joint. To maintain consistent sensor placement, the array was sewn onto an elastic fabric knee pad, ensuring that the upper and lower boundaries of the array remained fixed relative to the thigh and calf, respectively, during movements. Data from each trial was systematically recorded, with repeated testing ensuring that any discrepancies in sensor data were due to the movement itself, rather than sensor placement issues. Furthermore, we plan to continue evaluating the sensor’s long-term attachment stability in future tests.
Furthermore, we standardized the donning process, conducting visual and tactile checks and initializing standardized experimental postures before each experiment. When each subject wore the knee brace, we ensured that: the center hole of the knee brace was aligned with the center of the patella; the upper edge of the knee brace was aligned with the lower part of the thigh, and the lower edge was aligned with the upper part of the calf; and the longitudinal axis of the sensor array was substantially parallel to the flexion-extension axis of the knee joint.

2.4. Posture Triple Classification Model and Its Prediction

We processed the data in two stages: first 3 individual files, and then all 30 files in each of the three categories. A random forest algorithm was used to construct multiple decision trees whose results are combined for categorization of the different CSV files (one CSV file was generated for each detected action to record the resistance value of each sensor), which were categorized as sitting and lifting the leg, standing, and walking. Feature extraction, model training, model prediction, and model evaluation were then carried out.
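The paper does not list the exact features extracted for the random forest; one common choice for this kind of time-series data is simple per-channel statistics, sketched below (mean, standard deviation, and range for each of the 8 sensors, giving 24 features per trial):

```python
from statistics import mean, stdev

def extract_features(trial):
    """trial: list of sample rows, each with 8 resistance readings.
    Returns mean / standard deviation / range per sensor channel
    (24 features total). A hypothetical feature set for illustration."""
    feats = []
    for ch in range(8):
        series = [row[ch] for row in trial]
        feats += [mean(series), stdev(series), max(series) - min(series)]
    return feats

# Three synthetic sample rows: channel ch holds readings ch+1, ch+2, ch+3.
trial = [[ch + t for ch in range(1, 9)] for t in range(3)]
features = extract_features(trial)
print(len(features))  # 24
```

Feature vectors of this form can then be fed to any standard classifier, such as scikit-learn’s `RandomForestClassifier`.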
The CNN (Convolutional Neural Network) + BiLSTM (Bidirectional Long Short-Term Memory network) + Attention hybrid deep learning model architecture combines the local feature extraction capabilities of the CNN, the time series modeling capabilities of the BiLSTM, and the feature selection capabilities of the attention mechanism. It is specifically designed for time series sensor data classification tasks.
As shown in Figure 4, the CNN part extracts local spatial-temporal features, the BiLSTM part captures long-term temporal dependencies, and the Attention part dynamically focuses on key time steps (such as key signal segments). This architecture balances feature extraction capabilities (CNN), temporal modeling capabilities (BiLSTM), and feature selection capabilities (Attention). By integrating these three components into a unified framework, the model achieves a more comprehensive representation of the sensor data, effectively bridging the gap between low-level signal fluctuations and high-level motion patterns. The CNN ensures that subtle variations in resistance signals caused by localized joint movements are preserved, while the BiLSTM leverages sequential dependencies to model the continuity of motion across time. The Attention mechanism further refines this process by weighting the most informative temporal segments, thereby filtering out redundant or noisy inputs. As a result, the architecture not only improves recognition accuracy under standard testing conditions but also demonstrates enhanced robustness against motion artifacts, signal drift, and inter-subject variability.
This hybrid architecture, combining CNN, BiLSTM, and Attention mechanisms, offers a holistic approach to sensor data analysis by efficiently capturing both the spatial and temporal dynamics of knee joint movement. The CNN component is particularly adept at extracting localized features from the raw sensor signals, enabling it to detect minute but significant variations in sensor resistance that correspond to small joint movements, such as flexion or extension. This step ensures that the fine-grained details of motion are preserved and utilized in the subsequent analysis.
The BiLSTM layer, on the other hand, excels at capturing the temporal dependencies between consecutive sensor readings, allowing the model to model the continuity of movement over time. By maintaining memory of past observations, the BiLSTM can learn long-range dependencies within the data, which is crucial for understanding cyclic or repetitive motion patterns, such as walking or squatting, where each phase of movement influences the next.
The Attention mechanism enhances this process by focusing the model’s computational resources on the most relevant segments of the input data, dynamically assigning higher weights to time steps that contain critical features for classification. The integration of these three components in a unified framework enables the model to handle complex knee motion data with high precision.
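To make the attention step concrete, the following minimal sketch shows attention pooling over a sequence of hidden states: raw relevance scores (produced by a learned layer in the real model, but supplied directly here for illustration) are normalized with a softmax and used to form a weighted context vector. This is a simplified stand-in for the attention module described above, not its actual implementation.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of raw scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, scores):
    """hidden_states: T x D sequence features (e.g., BiLSTM outputs);
    scores: T raw relevance scores. Returns the attention-weighted
    context vector and the normalized weights."""
    weights = softmax(scores)
    dim = len(hidden_states[0])
    context = [sum(w * h[d] for w, h in zip(weights, hidden_states))
               for d in range(dim)]
    return context, weights

# Three time steps; the last one carries a much higher relevance score,
# so the context vector is pulled toward its hidden state.
context, weights = attention_pool([[1.0, 0.0], [0.0, 1.0], [10.0, 10.0]],
                                  [0.0, 0.0, 5.0])
print([round(w, 3) for w in weights], [round(c, 2) for c in context])
```

The example shows how a high-scoring time step dominates the pooled representation, which is the mechanism by which the model "focuses" on critical signal segments.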

3. Results

3.1. Results After Processing 3 CSV Files with a Random Forest Algorithm

The results generated using the random forest algorithm are as follows:
The model evaluation, including precision, recall, and F1 score, is shown in Figure 5a, and the confusion matrix is shown in Figure 5b. The confusion matrix for the three different postures shows that the model performs best on the posture of sitting and lifting the leg. When new data arrive, the posture can be predicted from the features of the 8 sensors with the trained Random Forest classifier, as shown in Figure 5c; for the example shown, the new data are predicted with high probability to be sitting and lifting the leg.

3.2. Results for All 30 CSV Files with Deep Learning Model

When using CNN + BiLSTM + Attention for the same prediction, the accuracy rate was 95%. This improvement is partly due to differences in data volume and partly due to differences in model architecture complexity. The visualized results under the deep learning model accordingly show improved performance.
As shown in Figure 6, the three images reveal the advantages of deep learning models from different perspectives. The observed improvement in accuracy highlights the significant impact of architectural optimization and data scale on model performance. Convolutional neural networks (CNNs) [10] are renowned for their ability to effectively extract spatial features from input data, particularly in image and sequence-related tasks [11]. By integrating bidirectional long short-term memory (BiLSTM) units, the model gains the ability to capture temporal dependencies and sequence patterns that pure CNN models may overlook. This is particularly important in scenarios where data has time series or ordered features. Additionally, the introduction of attention mechanisms enables the model to dynamically focus on the most relevant parts of the input sequence, assigning higher weights to important features during the learning process. This combination enhances the model’s ability to learn complex patterns and relationships in the data.
Differences in the amount of data between models play a key role in driving performance improvements. Larger datasets typically provide models with more diverse and rich learning information, helping to reduce overfitting and improve generalization capabilities on unseen data. When data is limited, even the most complex model architectures may struggle to achieve optimal performance due to insufficient training samples. Conversely, increasing the quantity of high-quality training data enables models to better fit the underlying data distribution, leading to more robust prediction results. Optimization strategies and training protocols also influence final accuracy. Techniques such as learning rate scheduling, batch normalization, dropout, and early stopping significantly impact model convergence and generalization capabilities [12]. Reasonably adjusting these hyperparameters typically enhances model stability and prevents it from getting stuck in local optima or overfitting noise. Additionally, the choice of loss functions and evaluation metrics directly influences training priorities and reported performance outcomes.
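Early stopping, one of the techniques mentioned above, can be expressed framework-free as a simple patience rule over the validation-loss history (an illustrative sketch, not the training setup used in this work):

```python
def early_stopping_epoch(val_losses, patience=3, min_delta=0.0):
    """Return the epoch index at which training would halt: stop once the
    best validation loss has not improved by more than min_delta for
    `patience` consecutive epochs. Illustrative, framework-free sketch."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best - min_delta:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses) - 1  # never stalled: train to the end

# Loss improves for three epochs, then stalls; training stops at epoch 5.
print(early_stopping_epoch([1.0, 0.8, 0.7, 0.71, 0.72, 0.73]))  # 5
```

Deep learning frameworks provide equivalent built-in callbacks (e.g., Keras’s `EarlyStopping` with `patience` and `min_delta` parameters), so this logic rarely needs to be hand-written in practice.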
In addition to data volume and standard optimization techniques, data augmentation and synthetic data generation have become key strategies for improving model robustness, especially in areas where collecting large-scale, high-quality datasets is costly or impractical [13]. Augmentation methods—such as random cropping, rotation, scaling, time jittering (for time series signals), and noise injection—help models generalize to different scenarios [14]. The dataset in this paper is not large in terms of data volume. In the case of large-scale high-quality datasets, more considerations must be taken into account during training.
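Two of the augmentation methods mentioned above for time-series signals, noise injection (jitter) and amplitude scaling, can be sketched in a few lines; the `sigma` and `factor` values are illustrative, not tuned for this dataset:

```python
import random

def jitter(signal, sigma=0.5, seed=0):
    """Noise injection: add Gaussian noise to a 1-D resistance series.
    A fixed seed keeps the sketch reproducible."""
    rng = random.Random(seed)
    return [x + rng.gauss(0.0, sigma) for x in signal]

def scale(signal, factor=1.05):
    """Amplitude scaling, mimicking gain variation between wearings."""
    return [x * factor for x in signal]

original = [100.0, 101.5, 99.8, 100.2]
augmented = scale(jitter(original), factor=1.02)
print(len(augmented) == len(original))  # True
```

Applying such transforms to each sensor channel multiplies the effective number of training sequences without altering the class labels, which is particularly useful given the modest dataset size noted above.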
Combining convolutional neural networks (CNNs), bidirectional long short-term memory networks (Bi-LSTMs), and attention mechanisms represents a powerful approach to modeling complex data with spatial and temporal features. When combined with sufficient data volume and carefully designed training strategies, these advanced architectures demonstrate clear performance advantages over simpler models. This progress highlights the ongoing evolution of deep learning research, with hybrid models continuously pushing the boundaries of predictive accuracy and application scope. Future research could explore integrating modules such as transformers, graph neural networks, or domain-specific feature engineering to further enhance model capabilities. Additionally, expanding dataset size and diversifying data sources holds promise for further improvements in generalization ability and robustness.
The confusion matrix in Figure 6 shows that the model recognizes dynamic behaviors well, while the static behaviors are mutually confused to a small degree, consistent with the 2D feature distribution in the figure. The 3D visualization further reveals the spatial structure: static behaviors separate into layers along the Z-axis, indicating that the bidirectional LSTM successfully captures posture-height features. Collectively, these visualizations demonstrate that (1) the residual network exhibits high discriminative power in extracting motion features; (2) static-behavior classification performance is constrained by the continuity of action transitions; and (3) the attention mechanism effectively focuses on key time frames (e.g., the vertical posture-maintenance phase of standing).
We also conducted comparative experiments pitting a pure CNN [66] and a pure LSTM [67] against our deep learning architecture. As given in Table 2, our architecture outperformed both the CNN and the LSTM in accuracy, and its recall and F1 score were also strong, making it a worthwhile architecture for the three-class posture recognition task.
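The accuracy, recall, and F1 values compared in Table 2 are standard confusion-matrix statistics. The following self-contained sketch shows how such macro-averaged metrics are computed; the example confusion matrix is illustrative, not the actual results of this study.

```python
import numpy as np

def macro_metrics(cm):
    """Accuracy, macro recall, and macro F1 from a confusion matrix
    (rows = true class, columns = predicted class)."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)                      # correct predictions per class
    support = cm.sum(axis=1)              # true samples per class
    predicted = cm.sum(axis=0)            # predictions per class
    accuracy = tp.sum() / cm.sum()
    recall = tp / np.maximum(support, 1e-12)
    precision = tp / np.maximum(predicted, 1e-12)
    f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
    return accuracy, recall.mean(), f1.mean()

# Hypothetical 3-class confusion matrix (10 test samples per class).
cm = [[9, 1, 0],
      [0, 10, 0],
      [0, 2, 8]]
acc, rec, f1 = macro_metrics(cm)
```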

3.3. Results on the Effect of Sensor Importance Under the Channel Attention Mechanism

The channel attention mechanism handles structured numerical data by weighting the importance of the features from multiple sensors in a channel attention layer; the classification layer then maps the weighted features to probabilities for the three categories. In this way, the model learns sensor importance automatically: the attention weights reveal which sensors matter most for the current classification task. As shown in Figure 7, sensors 2, 4, and 6 receive the highest average attention weights within the eight-sensor array, because the upper three rows of sensing elements sit directly on the knee.
These results indicate that such sensors play a key role in the behavior recognition task. This adaptive sensor-importance learning mechanism not only verifies that different sensor units contribute differently to behavioral features but also provides a data-driven basis for optimizing sensor layout in wearable devices: in resource-constrained scenarios, high-weight sensors can be prioritized to maintain classification performance. Moreover, the mechanism enhances the interpretability of the recognition model by revealing the relationship between specific sensor placements and their biomechanical relevance during different motions. Such insights facilitate targeted hardware design, where redundant sensors can be removed without significantly sacrificing accuracy, and open avenues for personalized sensor-deployment strategies tailored to individual gait characteristics or rehabilitation needs.
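The channel attention idea described above can be sketched in a few lines of NumPy. This is a simplified squeeze-and-gate illustration with random (untrained) weight matrices `W1` and `W2`, not the trained model: each sensor channel is squeezed to a summary statistic, passed through a small bottleneck gate, and the resulting softmax weights both rescale the input and expose per-sensor importance.

```python
import numpy as np

def channel_attention(x, W1, W2):
    """Channel attention over sensor channels.

    x: (time_steps, channels) window of resistance readings.
    Returns the reweighted signals and the per-sensor attention weights.
    """
    squeeze = x.mean(axis=0)               # (C,) one summary value per sensor
    hidden = np.maximum(squeeze @ W1, 0)   # ReLU bottleneck
    logits = hidden @ W2                   # (C,) unnormalized importance scores
    logits = logits - logits.max()         # numerical stability
    alpha = np.exp(logits) / np.exp(logits).sum()
    return x * alpha, alpha                # broadcast weights across time

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 8))    # 200 time steps from the 8-sensor array
W1 = rng.normal(size=(8, 4))     # illustrative bottleneck to 4 units
W2 = rng.normal(size=(4, 8))
reweighted, alpha = channel_attention(x, W1, W2)
```

Averaging `alpha` over many windows yields the per-sensor importance profile of the kind plotted in Figure 7.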

4. Conclusions

This paper presented a novel lightweight two-layer fabric strain sensor array tailored for comprehensive knee joint monitoring. The work offers three innovations: (1) an eight-sensor design that covers a larger contact area than a single sensor or a 2 × 2 array; (2) an experimental protocol spanning three distinct postures for three-class posture classification; and (3) the use of two machine learning algorithms to improve the final results.
Based on these innovations, the proposed system achieved an encouraging balance between sensor hardware simplicity and analytical capability, providing a scalable framework for wearable motion monitoring. Firstly, the denser eight-sensor configuration enables more precise capture of strain distribution patterns in the knee region, thereby enriching the feature set and improving classification accuracy, particularly in distinguishing subtle posture differences. Secondly, the inclusion of three distinct postures in the experimental protocol lays the groundwork for future expansion to more complex motion libraries, supporting richer activity recognition tasks. Thirdly, the dual-algorithm approach not only provides comparative insights into model performance but also opens the door to hybrid or integrated models that leverage the strengths of different classifiers to enhance robustness. Therefore, this fabric sensing system holds significant promise for future applications in rehabilitation monitoring, sports science analytics, and personalized healthcare within the medical and athletic domains.
Despite current limitations in posture diversity and in generalization to unconstrained real-world movements, targeted applications in rehabilitation monitoring and movement-performance optimization may still offer immediate deployment opportunities. Nonetheless, these limitations underscore the need for further research to expand the range of monitored activities and enhance the system's robustness in diverse scenarios. For example, future work should include more detailed experiments, such as investigating the effect of different speeds of the same movement [68], restricted movements [69], and age differences [70]. Future research could also integrate additional sensor modalities, expand the action datasets, and implement real-time feedback systems to fully unlock the potential of knee-worn fabric strain sensors. By addressing these limitations and exploring new avenues, we anticipate that this technology will play a pivotal role in advancing personalized healthcare and sports training.

Author Contributions

Conceptualization, T.C. and F.W.; Methodology, T.C., X.C. and F.W.; Fabrication and Evaluation, T.C.; Writing, review and editing, T.C., X.C. and F.W. All authors have read and agreed to the published version of the manuscript.

Funding

The work was partially supported by “High-level Talent Start-up Scheme” (Grant No. 2021AL034) from Wuyi University, “the Fundamental Research Funds for the Central Universities” (Grant No. 2024ZYGXZR077), and “Scientific Research Innovation Capability Support Project for Young Faculty” (Grant No. ZYGXQNJSKYCXNLZCXM-H8). T.C. acknowledges a postgraduate scholarship received from South China University of Technology.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. Data are not publicly available due to privacy and ethical reasons.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Ricotti, V.; Kadirvelu, B.; Selby, V.; Festenstein, R.; Mercuri, E.; Voit, T.; Faisal, A.A. Wearable full-body motion tracking of activities of daily living predicts disease trajectory in Duchenne muscular dystrophy. Nat. Med. 2023, 29, 95–103. [Google Scholar] [CrossRef]
  2. Seçkin, A.C.; Ates, B.; Seçkin, M. Review on Wearable Technology in Sports: Concepts, Challenges and Opportunities. Appl. Sci. 2023, 13, 10399. [Google Scholar] [CrossRef]
  3. Chu, M.; Nguyen, T.; Pandey, V.; Zhou, Y.X.; Pham, H.N.; Bar-Yoseph, R.; Radom-Aizik, S.; Jain, R.; Cooper, D.M.; Khine, M. Respiration rate and volume measurements using wearable strain sensors. npj Digit. Med. 2019, 2, 8. [Google Scholar] [CrossRef]
  4. Zhu, X.J.; Liu, W.; Qian, J.Y.; Zhuang, H.R.; Zhang, W.D.; Cao, J.; Zhang, G.A.; Yang, Y.J.; Cai, Y.; Shi, Y.C.; et al. A Novel Wearable Knee Joint Angle Monitoring Sensor Utilizing Mach-Zehnder Interferometer with Triple-Clad Fiber. IEEE Sens. J. 2024, 24, 2785–2791. [Google Scholar] [CrossRef]
  5. Harindranath, A.; Arora, M. Effect of sensor noise characteristics and calibration errors on the choice of IMU-sensor fusion algorithms. Sens. Actuators A-Phys. 2024, 379, 115850. [Google Scholar] [CrossRef]
  6. Jha, R.; Mishra, P.; Kumar, S. Advancements in optical fiber-based wearable sensors for smart health monitoring. Biosens. Bioelectron. 2024, 254, 116232. [Google Scholar] [CrossRef]
  7. Kim, D.W.; Kim, S.W.; Lee, G.; Yoon, J.; Kim, S.; Hong, J.H.; Jo, S.C.; Jeong, U. Fabrication of practical deformable displays: Advances and challenges. Light Sci. Appl. 2023, 12, 61. [Google Scholar] [CrossRef]
  8. Huang, X.X.; Xue, Y.N.; Ren, S.Y.; Wang, F. Sensor-Based Wearable Systems for Monitoring Human Motion and Posture: A Review. Sensors 2023, 23, 9047. [Google Scholar] [CrossRef]
  9. Xu, L.G.; Huang, Z.K.; Deng, Z.S.; Du, Z.K.; Sun, T.L.; Guo, Z.H.; Yue, K. A Transparent, Highly Stretchable, Solvent-Resistant, Recyclable Multifunctional Ionogel with Underwater Self-Healing and Adhesion for Reliable Strain Sensors. Adv. Mater. 2021, 33, 2105306. [Google Scholar] [CrossRef]
  10. Li, T.L.; Wang, Q.A.; Cao, Z.C.; Zhu, J.L.; Wang, N.; Li, R.; Meng, W.; Liu, Q.; Yu, S.F.; Liao, X.Q.; et al. Nerve-Inspired Optical Waveguide Stretchable Sensor Fusing Wireless Transmission and AI Enabling Smart Tele-Healthcare. Adv. Sci. 2025, 12, e2410395. [Google Scholar] [CrossRef]
  11. Kim, D.H.; Lu, N.S.; Ma, R.; Kim, Y.S.; Kim, R.H.; Wang, S.D.; Wu, J.; Won, S.M.; Tao, H.; Islam, A.; et al. Epidermal Electronics. Science 2011, 333, 838–843. [Google Scholar] [CrossRef]
  12. Lu, N.S.; Lu, C.; Yang, S.X.; Rogers, J. Highly Sensitive Skin-Mountable Strain Gauges Based Entirely on Elastomers. Adv. Funct. Mater. 2012, 22, 4044–4050. [Google Scholar] [CrossRef]
  13. Dong, H.X.; Wang, Z.H.; Yang, C.; Chang, Y.Y.; Wang, Y.D.; Li, Z.; Deng, Y.G.; He, Z.Z. Liquid Metal-Based Flexible Sensing and Wireless Charging System for Smart Tire Strain Monitoring. IEEE Sens. J. 2024, 24, 1304–1312. [Google Scholar] [CrossRef]
  14. Luo, Y.F.; Abidian, M.R.; Ahn, J.H.; Akinwande, D.; Andrews, A.M.; Antonietti, M.; Bao, Z.N.; Berggren, M.; Berkey, C.A.; Bettinger, C.J.; et al. Technology Roadmap for Flexible Sensors. ACS Nano 2023, 17, 5211–5295. [Google Scholar] [CrossRef]
  15. Wang, P.W.; Ma, X.H.; Lin, Z.Q.; Chen, F.; Chen, Z.J.; Hu, H.; Xu, H.L.; Zhang, X.Y.; Shi, Y.Q.; Huang, Q.Y.; et al. Well-defined in-textile photolithography towards permeable textile electronics. Nat. Commun. 2024, 15, 887. [Google Scholar] [CrossRef]
  16. Wang, J.L.; Lu, C.H.; Zhang, K. Textile-Based Strain Sensor for Human Motion Detection. Energy Environ. Mater. 2020, 3, 80–100. [Google Scholar] [CrossRef]
  17. Liu, L.; Liang, X.H.; Wan, X.Q.; Kuang, X.J.; Zhang, Z.F.; Jiang, G.M.; Dong, Z.J.; Chen, C.Y.; Cong, H.L.; He, H.J. A Review on Knitted Flexible Strain Sensors for Human Activity Monitoring. Adv. Mater. Technol. 2023, 8, 2300820. [Google Scholar] [CrossRef]
  18. Chen, X.B.; Zhang, Z.L.; Shu, L.; Tao, X.M.; Xu, X.M. A novel double-sided fabric strain sensor array fabricated with a facile and cost-effective process. Sens. Actuators A-Phys. 2024, 370, 115208. [Google Scholar] [CrossRef]
  19. Borges, F.; Reis, P.R.; Pereira, D. A Comparison of Security and its Performance for Key Agreements in Post-Quantum Cryptography. IEEE Access 2020, 8, 142413–142422. [Google Scholar] [CrossRef]
  20. Huang, X.J.; Yuan, Y.L.; Liu, J.M.; He, J.; Shi, Y.X.; Gao, S.; Wu, J.; Xu, X.J.; Zhang, H.Q.; Li, P.; et al. AI-enhanced flexible ECG patch for accurate heart disease diagnosis, optimal wear positioning, and interactive medical consultation. Natl. Sci. Rev. 2025, 12, nwaf425. [Google Scholar] [CrossRef]
  21. Saboor, A.; Kask, T.; Kuusik, A.; Alam, M.M.; Le Moullec, Y.; Niazi, I.K.; Zoha, A.; Ahmad, R. Latest Research Trends in Gait Analysis Using Wearable Sensors and Machine Learning: A Systematic Review. IEEE Access 2020, 8, 167830–167864. [Google Scholar] [CrossRef]
  22. Reis, F.J.J.; Alaiti, R.K.; Vallio, C.S.; Hespanhol, L. Artificial intelligence and Machine Learning approaches in sports: Concepts, applications, challenges, and future perspectives. Braz. J. Phys. Ther. 2024, 28, 101083. [Google Scholar] [CrossRef]
  23. Otamendi, J.; Zubizarreta, A.; Portillo, E. Machine learning-based gait anomaly detection using a sensorized tip: An individualized approach. Neural Comput. Appl. 2023, 35, 17443–17459. [Google Scholar] [CrossRef]
  24. Uddin, M.Z. A wearable sensor-based activity prediction system to facilitate edge computing in smart healthcare system. J. Parallel Distrib. Comput. 2019, 123, 46–53. [Google Scholar] [CrossRef]
  25. Musat, C.L.; Mereuta, C.; Nechita, A.; Tutunaru, D.; Voipan, A.E.; Voipan, D.; Mereuta, E.; Gurau, T.V.; Gurau, G.; Nechita, L.C. Diagnostic Applications of AI in Sports: A Comprehensive Review of Injury Risk Prediction Methods. Diagnostics 2024, 14, 2516. [Google Scholar] [CrossRef]
  26. Shyr, T.W.; Shie, J.W.; Jiang, C.H.; Li, J.J. A Textile-Based Wearable Sensing Device Designed for Monitoring the Flexion Angle of Elbow and Knee Movements. Sensors 2014, 14, 4050–4059. [Google Scholar] [CrossRef]
  27. Galli, V.; Ahmadizadeh, C.; Kunz, R.; Menon, C. Textile-Based Body Capacitive Sensing for Knee Angle Monitoring. Sensors 2023, 23, 9657. [Google Scholar] [CrossRef]
  28. Isaia, C.; McMaster, S.; McNally, D. The effect of washing on the electrical performance of knitted textile strain sensors for quantifying joint motion. J. Ind. Text. 2022, 51, 8528S–8548S. [Google Scholar] [CrossRef]
  29. Yamashita, R.; Nishio, M.; Do, R.K.G.; Togashi, K. Convolutional neural networks: An overview and application in radiology. Insights Into Imaging 2018, 9, 611–629. [Google Scholar] [CrossRef]
  30. Zhao, X.; Wang, L.M.; Zhang, Y.F.; Han, X.M.; Deveci, M.; Parmar, M. A review of convolutional neural networks in computer vision. Artif. Intell. Rev. 2024, 57, 99. [Google Scholar] [CrossRef]
  31. Liñán-Cembrano, G.; Lourenco, N.; Horta, N.; de la Rosa, J.M. Design Automation of Analog and Mixed-Signal Circuits Using Neural Networks—A Tutorial Brief. IEEE Trans. Circuits Syst. II Express Briefs 2024, 71, 1677–1682. [Google Scholar] [CrossRef]
  32. Cui, C.L.; Yao, J.Y.; Xia, H. Data Augmentation: A Multi-Perspective Survey on Data, Methods, and Applications. CMC Comput. Mater. Contin. 2025, 85, 4275–4306. [Google Scholar] [CrossRef]
  33. Iwana, B.K.; Uchida, S. An empirical survey of data augmentation for time series classification with neural networks. PLoS ONE 2021, 16, e0254841. [Google Scholar] [CrossRef]
  34. Bai, Y.Z.; Zhou, Y.L.; Wu, X.Y.; Yin, M.F.; Yin, L.T.; Qu, S.Y.; Zhang, F.; Li, K.; Huang, Y.A. Flexible Strain Sensors with Ultra-High Sensitivity and Wide Range Enabled by Crack-Modulated Electrical Pathways. Nano-Micro Lett. 2025, 17, 64. [Google Scholar] [CrossRef]
  35. Chen, Z.H.; Lin, W.S.; Zhang, C.R.; Xu, Y.J.; Wei, C.; Hu, H.Q.; Liao, X.Q.; Chen, Z. Multifunctional and Reconfigurable Electronic Fabrics Assisted by Artificial Intelligence for Human Augmentation. Adv. Fiber Mater. 2024, 6, 229–242. [Google Scholar] [CrossRef]
  36. Fan, W.J.; Li, C.; Li, X.S.; Tian, H. Highly Sensitive Fabric Sensors Based on Scale-like Wool Fiber for Multifunctional Health Monitoring. ACS Appl. Mater. Interfaces 2023, 15, 28806–28816. [Google Scholar] [CrossRef]
  37. Bergmann, J.H.M.; Anastasova-Ivanova, S.; Spulber, I.; Gulati, V.; Georgiou, P.; McGregor, A. An Attachable Clothing Sensor System for Measuring Knee Joint Angles. IEEE Sens. J. 2013, 13, 4090–4097. [Google Scholar] [CrossRef]
  38. Papi, E.; Spulber, I.; Kotti, M.; Georgiou, P.; McGregor, A.H. Smart Sensing System for Combined Activity Classification and Estimation of Knee Range of Motion. IEEE Sens. J. 2015, 15, 5535–5544. [Google Scholar] [CrossRef]
  39. Xiang, H.; Li, Y.F.; Liao, Q.L.; Xia, L.; Wu, X.D.; Zhou, H.; Li, C.M.; Fan, X. Recent Advances in Smart Fabric-Type Wearable Electronics toward Comfortable Wearing. Energies 2024, 17, 2627. [Google Scholar] [CrossRef]
  40. Teague, C.N.; Hersek, S.; Töreyin, H.; Millard-Stafford, M.L.; Jones, M.L.; Kogler, G.F.; Sawka, M.N.; Inan, O.T. Novel Methods for Sensing Acoustical Emissions from the Knee for Wearable Joint Health Assessment. IEEE Trans. Biomed. Eng. 2016, 63, 1581–1590. [Google Scholar] [CrossRef]
  41. Faisal, A.; Majumder, S.; Scott, R.; Mondal, T.; Cowan, D.; Deen, M.J. A Simple, Low-Cost Multi-Sensor-Based Smart Wearable Knee Monitoring System. IEEE Sens. J. 2021, 21, 8253–8266. [Google Scholar] [CrossRef]
  42. Huang, F.; Zhu, Y.T.; Shi, L.; Teo, M.Y.; Kandasamy, S.; Aw, K. Capacitive stretch sensors for knee motion and muscle activity tracking for gait analysis. Sens. Actuators A-Phys. 2025, 390. [Google Scholar] [CrossRef]
  43. Papi, E.; Bo, Y.N.; McGregor, A.H. A flexible wearable sensor for knee flexion assessment during gait. Gait Posture 2018, 62, 480–483. [Google Scholar] [CrossRef]
  44. Abro, Z.A.; Zhang, Y.F.; Chen, N.L.; Hong, C.Y.; Lakho, R.A.; Halepoto, H. A novel flex sensor-based flexible smart garment for monitoring body postures. J. Ind. Text. 2019, 49, 262–274. [Google Scholar] [CrossRef]
  45. Yuan, J.F.; Zhang, Y.Z.; Wei, C.S.; Zhu, R. A Fully Self-Powered Wearable Leg Movement Sensing System for Human Health Monitoring. Adv. Sci. 2023, 10, e2303114. [Google Scholar] [CrossRef]
  46. Munro, B.J.; Campbell, T.E.; Wallace, G.G.; Steele, J.R. The intelligent knee sleeve: A wearable biofeedback device. Sens. Actuators B-Chem. 2008, 131, 541–547. [Google Scholar] [CrossRef]
  47. Chhoeum, V.; Kim, Y.; Min, S.D. Estimation of Knee Joint Angle Using Textile Capacitive Sensor and Artificial Neural Network Implementing with Three Shoe Types at Two Gait Speeds: A Preliminary Investigation. Sensors 2021, 21, 5484. [Google Scholar] [CrossRef]
  48. Akita, J.; Shinmura, T.; Sakurazawa, S.; Yanagihara, K.; Kunita, M.; Toda, M.; Iwata, K. Wearable electromyography measurement system using cable-free network system on conductive fabric. Artif. Intell. Med. 2008, 42, 99–108. [Google Scholar] [CrossRef]
  49. Zhang, Y.Z.; Caccese, J.B.; Kiourti, A. Wearable Loop Sensor for Bilateral Knee Flexion Monitoring. Sensors 2024, 24, 1549. [Google Scholar] [CrossRef]
  50. Xie, W.Q.; He, M.; Zheng, S.Y.; Li, H.Z.; Jin, H.F.; Ji, B.Z.; Yang, G.; Li, Y.S. Clinical application research of intelligent monitoring system for knee rehabilitation: A randomized controlled trial. J. Orthop. Surg. Res. 2024, 19, 477. [Google Scholar] [CrossRef]
  51. Tedesco, S.; Torre, O.M.; Belcastro, M.; Torchia, P.; Alfieri, D.; Khokhlova, L.; Komaris, S.D.; O’Flynn, B. Design of a Multi-Sensors Wearable Platform for Remote Monitoring of Knee Rehabilitation. IEEE Access 2022, 10, 98309–98328. [Google Scholar] [CrossRef]
  52. Rivera, B.; Cano, C.; Luis, I.; Elias, D.A. A 3D-Printed Knee Wearable Goniometer with a Mobile-App Interface for Measuring Range of Motion and Monitoring Activities. Sensors 2022, 22, 763. [Google Scholar] [CrossRef]
  53. Ramkumar, P.N.; Haeberle, H.S.; Ramanathan, D.; Cantrell, W.A.; Navarro, S.M.; Mont, M.A.; Bloomfield, M.; Patterson, B.M. Remote Patient Monitoring Using Mobile Health for Total Knee Arthroplasty: Validation of a Wearable and Machine Learning-Based Surveillance Platform. J. Arthroplast. 2019, 34, 2253–2259. [Google Scholar] [CrossRef]
  54. Huang, Y.P.; Liu, Y.Y.; Hsu, W.H.; Lai, L.J.; Lee, M.S. Monitoring and Assessment of Rehabilitation Progress on Range of Motion After Total Knee Replacement by Sensor-Based System. Sensors 2020, 20, 1703. [Google Scholar] [CrossRef]
  55. Neumann-Langen, M.V.; Ochs, B.G.; Lützner, J.; Postler, A.; Kirschberg, J.; Sehat, K.; Selig, M.; Grupp, T.M. Musculoskeletal Rehabilitation: New Perspectives in Postoperative Care Following Total Knee Arthroplasty Using an External Motion Sensor and a Smartphone Application for Remote Monitoring. J. Clin. Med. 2023, 12, 7163. [Google Scholar] [CrossRef]
  56. Bell, K.M.; Onyeukwu, C.; McClincy, M.P.; Allen, M.; Bechard, L.; Mukherjee, A.; Hartman, R.A.; Smith, C.; Lynch, A.D.; Irrgang, J.J. Verification of a Portable Motion Tracking System for Remote Management of Physical Rehabilitation of the Knee. Sensors 2019, 19, 1021. [Google Scholar] [CrossRef]
  57. Allseits, E.; Kim, K.J.; Bennett, C.; Gailey, R.; Gaunaurd, I.; Agrawal, V. A Novel Method for Estimating Knee Angle Using Two Leg-Mounted Gyroscopes for Continuous Monitoring with Mobile Health Devices. Sensors 2018, 18, 2759. [Google Scholar] [CrossRef]
  58. Ma, Z.T.; Fang, L.; Fang, C.; Chen, F.; Xing, S.P.; Chai, B.; Zheng, Z.J.; Wang, S.J. Intelligent wearable system design for personalized knee motion and swelling monitoring in osteoarthritis care. Cell Rep. Phys. Sci. 2025, 6, 102438. [Google Scholar] [CrossRef]
  59. Lou, N.; Diao, Y.A.; Chen, Q.Q.; Ning, Y.K.; Li, G.Q.; Liang, S.Y.; Li, G.L.; Zhao, G.R. A Portable Wearable Inertial System for Rehabilitation Monitoring and Evaluation of Patients with Total Knee Replacement. Front. Neurorobotics 2022, 16, 836184. [Google Scholar] [CrossRef]
  60. El Fezazi, M.; Achmamad, A.; Jbari, A.; Jilbab, A. IoT-Based System Using IMU Sensor Fusion for Knee Telerehabilitation Monitoring. IEEE Sens. J. 2025, 25, 11906–11914. [Google Scholar] [CrossRef]
  61. Babar, M.; Tariq, M.U.; Qureshi, B.; Ullah, Z.; Arif, F.; Khan, Z. An Efficient and Hybrid Deep Learning-Driven Model to Enhance Security and Performance of Healthcare Internet of Things. IEEE Access 2025, 13, 22931–22945. [Google Scholar] [CrossRef]
  62. Wang, Y.; Cai, X.; Guo, Y.; Chen, Z.; Cao, Y.; Du, W.; Xia, T.; Sepulveda, N.; Li, W. Self-powered highly stretchable ferroelectret nanogenerator towards intelligent sports. Nano Trends 2024, 8, 100053. [Google Scholar] [CrossRef]
  63. Cai, X.; Han, X.; Xie, J.; Cao, Y.; Li, W. Exploring the application of ferroelectret nanogenerators in medical engineering. FlexMat 2025, 2, 204–224. [Google Scholar] [CrossRef]
  64. Shi, Z.Y.; Meng, L.X.; Shi, X.L.; Li, H.P.; Zhang, J.Z.; Sun, Q.Q.; Liu, X.Y.; Chen, J.Z.; Liu, S.R. Morphological Engineering of Sensing Materials for Flexible Pressure Sensors and Artificial Intelligence Applications. Nano-Micro Lett. 2022, 14, 141. [Google Scholar] [CrossRef]
  65. Chen, X.B.; Wang, F.; Shu, L.; Tao, X.M.; Wei, L.; Xu, X.M.; Zeng, Q.; Huang, G.Z. A Single-material-printed, Low-cost design for a Carbon-based fabric strain sensor. Mater. Des. 2022, 221, 110926. [Google Scholar] [CrossRef]
  66. Zhao, L.; Zhang, Z.L. A improved pooling method for convolutional neural networks. Sci. Rep. 2024, 14, 1589. [Google Scholar] [CrossRef]
  67. Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
  68. Takamido, R.; Yokoyama, K.; Yamamoto, Y. Hitting movement patterns organized by different pitching movement speeds as advanced kinematic information. Hum. Mov. Sci. 2022, 81, 102908. [Google Scholar] [CrossRef]
  69. Wang, Z.M.; Liu, Y.J.; Duan, Y.F.; Li, X.C.; Zhang, X.R.; Ji, J.M.; Dong, E.R.; Zhang, Y.Y. USTC FLICAR: A sensors fusion dataset of LiDAR-inertial-camera for heavy-duty autonomous aerial work robots. Int. J. Robot. Res. 2023, 42, 1015–1047. [Google Scholar] [CrossRef]
  70. Renner, K.; Queen, R. Detection of age and gender differences in walking using mobile wearable sensors. Gait Posture 2021, 87, 59–64. [Google Scholar] [CrossRef]
Figure 1. Schematic of the fabric strain sensor (FSS) and its three-layer structure. The interconnect layer on the second layer is electrically connected to the sensors on the first layer via vertical vias. The strain sensing mechanism is mainly the tunneling effect of the CB/SE conductive composite materials. The same material was used to fill the small via holes as a paste.
Figure 2. Typical images during the test. Lifting the leg (right picture) and bending the knee (left picture) while sitting with a fabric sensor knee pad attached.
Figure 3. Examples of 3 postures and corresponding resistance line graphs.
Figure 4. Convolutional neural network (CNN) + Bidirectional long short-term memory network (BiLSTM) + attention mechanism deep learning model, specially designed for three types of posture and movement recognition.
Figure 5. (a) Model evaluation metrics: comparison of specific values of precision, recall, and F1 score in three poses; (b) Confusion matrix results for three different poses; (c) An example of the results predicted by the new data.
Figure 6. (a) Confusion matrix results after training; (b) T-SNE 2D feature visualization; (c) T-SNE 3D feature visualization.
Figure 7. Average attention weight vs. distribution on individual sensors.
Table 1. Height, weight, and age information of the ten test subjects.

No.           1    2    3    4    5    6    7    8    9    10
Age           23   25   22   27   25   24   26   23   23   23
Height (cm)   183  182  173  174  167  192  178  177  174  174
Weight (kg)   76   95   80   75   74   86   77   70   67   70
Table 2. Comparison of experimental results for CNN, LSTM, and CNN + BiLSTM + Attention.

Models        CNN    LSTM   CNN + BiLSTM + Attention
Accuracy      0.79   0.67   0.95
Recall        0.77   0.66   0.94
F1 score      0.77   0.65   0.96