Article

Feature Fusion Graph Consecutive‑Attention Network for Skeleton‑Based Tennis Action Recognition

by Pawel Powroznik, Maria Skublewska-Paszkowska, Krzysztof Dziedzic * and Marcin Barszcz
Department of Computer Science, Lublin University of Technology, Nadbystrzycka 36B, 20-618 Lublin, Poland
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(10), 5320; https://doi.org/10.3390/app15105320
Submission received: 10 April 2025 / Revised: 2 May 2025 / Accepted: 8 May 2025 / Published: 9 May 2025

Abstract

Human action recognition has become a key direction in computer vision. Deep learning models, particularly when combined with sensor data fusion, can significantly enhance various applications by learning complex patterns and relationships from diverse data streams. This study therefore proposes a new model, the Feature Fusion Graph Consecutive-Attention Network (FFGCAN), to improve the classification of the main tennis strokes: forehand, backhand, volley forehand, and volley backhand. The proposed network comprises seven basic blocks combined with two types of module, an Adaptive Consecutive Attention Module and a Graph Self-Attention Module, which extract joint information at different scales from the motion capture data. By focusing on the relevant components, the model deepens the network's understanding of tennis motion data and yields a richer representation. Moreover, the FFGCAN fuses motion capture data to generate a channel-specific topology map for each output channel, reflecting how joints are connected while the tennis player is moving. The proposed solution was verified on three well-known motion capture datasets, THETIS, Tennis-Mocap, and 3DTennisDS, each containing tennis movements in various formats. A series of experiments was performed, with the data divided into training (70%), validation (15%), and testing (15%) subsets; testing was repeated over five trials. The FFGCAN model achieved very high accuracy, precision, recall, and F1-score, outperforming networks commonly applied to action recognition, such as the Spatial-Temporal Graph Convolutional Network and its modifications. The proposed model demonstrated excellent tennis movement prediction ability.
Keywords: feature fusion graph consecutive-attention network; tennis movement recognition; THETIS; Tennis-Mocap; 3DTennisDS
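The full paper is still in early access, so the internals of the attention modules are not reproduced here. Purely as an illustration of the general technique the abstract names, a single graph self-attention step over skeleton joints might look like the following NumPy sketch. The joint count (25), feature dimensions, random weights, and the additive adjacency bias are all hypothetical choices, not details taken from the FFGCAN paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_self_attention(X, Wq, Wk, Wv, A):
    # X: (J, C) joint features for one frame; A: (J, J) skeleton adjacency bias.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # pairwise joint affinities
    attn = softmax(scores + A, axis=-1)       # bias attention toward bone links
    return attn @ V                           # attention-refined joint features

rng = np.random.default_rng(0)
J, C, D = 25, 3, 8                            # 25 joints, xyz channels, head dim
X = rng.standard_normal((J, C))
Wq, Wk, Wv = (rng.standard_normal((C, D)) for _ in range(3))
A = np.eye(J)                                 # placeholder adjacency (self-loops only)
out = graph_self_attention(X, Wq, Wk, Wv, A)
print(out.shape)                              # one output vector per joint
```

In a trained network the weights would be learned and the adjacency would encode the actual skeleton topology (or, as in the abstract's channel-specific topology maps, a learned per-channel graph); this sketch only shows the shape of the computation.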

