This is an early access version; the complete PDF, HTML, and XML versions will be available soon.
Open Access Article
Video Frame Interpolation for Extreme Motion Scenes Based on Dual Alignment and Region-Adaptive Interaction
by Xin Ning 1,*, Jiantao Qu 1, Junyi Duan 2, Kun Yang 3 and Youdong Ding 1,4,*
1 Shanghai Film Academy, Shanghai University, Shanghai 200072, China
2 Data and Target Engineering, Information Engineering University, Zhengzhou 450000, China
3 College of Mechanical Engineering, Taiyuan University of Technology, Taiyuan 030024, China
4 Shanghai Engineering Research Center of Motion Picture Special Effects, Shanghai 200072, China
* Authors to whom correspondence should be addressed.
Symmetry 2025, 17(12), 2097; https://doi.org/10.3390/sym17122097
Submission received: 1 November 2025 / Revised: 28 November 2025 / Accepted: 3 December 2025 / Published: 6 December 2025
Abstract
Video frame interpolation in ultra-high-definition extreme motion scenes remains highly challenging due to large displacements, nonlinear motion, and occlusions that disrupt spatio-temporal symmetry. To address this, this study proposes a frame interpolation method for extreme motion scenes based on dual alignment and region-adaptive interaction, approached from the perspectives of cross-frame localization and adaptive reconstruction. Specifically, we design a two-stage motion information alignment strategy that obtains two types of motion information via optical flow estimation and offset estimation and progressively guides reference pixels toward accurate long-range cross-frame localization, mitigating the structural misalignment caused by limited receptive fields while alleviating the spatio-temporal asymmetry caused by inconsistent inter-frame motion speed and direction. Building on this, we introduce a region-adaptive interaction module that automatically adapts motion representations to different regions through cross-frame interaction and leverages distinct attention pathways to accurately capture both global context and local high-frequency motion details. This achieves dynamic feature fusion tailored to regional characteristics, significantly enhancing the model's ability to perceive overall structure and texture details in extreme motion scenarios. In addition, a motion compensation module explicitly captures pixel-level motion relationships by constructing a global correlation matrix, compensating for the localization errors of the dual alignment module in extreme motion or occluded regions. The experimental results demonstrate that the proposed method achieves excellent overall performance in ultra-high-definition extreme motion scenes, with a PSNR improvement of 0.05 dB over state-of-the-art methods. In multi-frame interpolation tasks, it achieves an average PSNR gain of 0.31 dB, demonstrating strong cross-scene interpolation capability.
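To make the two-stage alignment idea concrete, the sketch below shows one plausible reading of it in PyTorch: a coarse optical-flow warp of the reference features, followed by a learned residual offset that refines the sampling positions. This is a minimal sketch, not the authors' implementation; the names (`flow_warp`, `DualAlignment`, `offset_head`) and the single-offset refinement are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def flow_warp(feat, flow):
    """Backward-warp a feature map (b, c, h, w) by a dense flow field (b, 2, h, w)."""
    b, _, h, w = feat.shape
    # Build a pixel-coordinate grid and displace it by the flow.
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    grid = torch.stack((xs, ys), dim=-1).float().to(feat.device)   # (h, w, 2)
    grid = grid.unsqueeze(0) + flow.permute(0, 2, 3, 1)            # add (u, v)
    # Normalize to the [-1, 1] range expected by grid_sample.
    gx = 2.0 * grid[..., 0] / max(w - 1, 1) - 1.0
    gy = 2.0 * grid[..., 1] / max(h - 1, 1) - 1.0
    return F.grid_sample(feat, torch.stack((gx, gy), dim=-1), align_corners=True)

class DualAlignment(nn.Module):
    """Stage 1: coarse optical-flow warping. Stage 2: residual offset refinement."""
    def __init__(self, channels=64):
        super().__init__()
        # Hypothetical head predicting a residual offset from the coarsely aligned pair.
        self.offset_head = nn.Conv2d(2 * channels, 2, kernel_size=3, padding=1)

    def forward(self, ref_feat, tgt_feat, flow):
        coarse = flow_warp(ref_feat, flow)                          # stage 1
        offset = self.offset_head(torch.cat((coarse, tgt_feat), dim=1))
        return flow_warp(ref_feat, flow + offset)                   # stage 2

# Usage on dummy tensors:
ref, tgt = torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32)
aligned = DualAlignment(64)(ref, tgt, torch.zeros(1, 2, 32, 32))    # (1, 64, 32, 32)
```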
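Similarly, the global correlation matrix in the motion compensation module presumably relates every pixel of one frame's features to every pixel of the other's, so that occluded or fast-moving regions can be matched without a local window. A minimal sketch of that computation, assuming cosine similarity and soft aggregation (both are assumptions, not details from the paper):

```python
import torch
import torch.nn.functional as F

def global_correlation(feat0, feat1):
    """All-pairs cosine similarity between pixels of two feature maps."""
    f0 = F.normalize(feat0.flatten(2).transpose(1, 2), dim=-1)      # (b, hw, c)
    f1 = F.normalize(feat1.flatten(2), dim=1)                       # (b, c, hw)
    return torch.bmm(f0, f1)                                        # (b, hw, hw)

def compensate(feat0, feat1, temperature=0.1):
    """Gather feat1 content for every feat0 pixel via soft global matching."""
    b, c, h, w = feat1.shape
    attn = (global_correlation(feat0, feat1) / temperature).softmax(dim=-1)
    gathered = torch.bmm(attn, feat1.flatten(2).transpose(1, 2))    # (b, hw, c)
    return gathered.transpose(1, 2).reshape(b, c, h, w)
```

Note that such a matrix costs O((hw)²) memory, which is why global-correlation designs are typically applied at a coarse feature scale rather than at full ultra-high-definition resolution.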
Share and Cite
MDPI and ACS Style
Ning, X.; Qu, J.; Duan, J.; Yang, K.; Ding, Y. Video Frame Interpolation for Extreme Motion Scenes Based on Dual Alignment and Region-Adaptive Interaction. Symmetry 2025, 17, 2097. https://doi.org/10.3390/sym17122097
AMA Style
Ning X, Qu J, Duan J, Yang K, Ding Y. Video Frame Interpolation for Extreme Motion Scenes Based on Dual Alignment and Region-Adaptive Interaction. Symmetry. 2025; 17(12):2097. https://doi.org/10.3390/sym17122097
Chicago/Turabian Style
Ning, Xin, Jiantao Qu, Junyi Duan, Kun Yang, and Youdong Ding. 2025. "Video Frame Interpolation for Extreme Motion Scenes Based on Dual Alignment and Region-Adaptive Interaction." Symmetry 17, no. 12: 2097. https://doi.org/10.3390/sym17122097
APA Style
Ning, X., Qu, J., Duan, J., Yang, K., & Ding, Y. (2025). Video Frame Interpolation for Extreme Motion Scenes Based on Dual Alignment and Region-Adaptive Interaction. Symmetry, 17(12), 2097. https://doi.org/10.3390/sym17122097
Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.