Application of Digital Twin Technology in Synthetic Aperture Radar Ground Moving Target Intelligent Detection System
Abstract
1. Introduction
- This paper designs an overall framework for SAR moving target intelligent detection based on the integration of DT and artificial intelligence technology. In particular, according to the characteristics of the measured data obtained from a real SAR system, a high-fidelity virtual replica of the actual physical system is constructed in digital space. A neural network model is then trained on this basis to obtain an intelligent detector that achieves robust detection of moving targets in the related application scenarios.
- By virtue of the SAR imaging algorithm, the target motion, position, and SCR (signal-to-clutter ratio) parameters used during echo synthesis are fully traversed to construct high-fidelity SAR twin datasets. These datasets reflect the characteristics of the moving targets and the preset scene in the real system and significantly improve the detection performance and generalization ability of the neural network model in practical application scenarios (a sketch of this parameter traversal is given after this list).
- The effectiveness of the proposed framework is verified on measured data obtained by real radar sensors. The experimental results indicate that the DT-based framework for SAR ground moving target intelligent detection can achieve robust target detection and synchronous state perception in real test scenarios without the need for complex field experiments and is applicable to several SAR systems.
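As a concrete illustration of the parameter traversal referred to above, the following minimal Python sketch enumerates a grid of motion, position, and SCR values. The ranges shown are illustrative placeholders rather than the grids actually used in the paper; each resulting configuration would correspond to one synthesized echo in the virtual replica.

```python
import itertools

# Illustrative (assumed) traversal ranges; the paper's actual grids are not reproduced here.
radial_velocities = [0.5, 1.0, 1.5]        # m/s, target motion parameters
range_positions   = [200, 400, 600]        # range placement of the target (pixels)
azimuth_positions = [100, 300, 500]        # azimuth placement of the target (pixels)
scr_values_db     = [12, 14, 16, 18, 20]   # signal-to-clutter ratios (dB)

# One twin-image configuration per combination of motion, position, and SCR.
twin_configs = [
    {"velocity": v, "position": (rg, az), "scr_db": scr}
    for v, (rg, az), scr in itertools.product(
        radial_velocities,
        itertools.product(range_positions, azimuth_positions),
        scr_values_db,
    )
]
print(len(twin_configs), "twin-image configurations")  # 3 * 9 * 5 = 135
```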
2. Methods
2.1. Physical SAR System
2.2. Virtual Replica of SAR-GMTI
1. Target motion parameter traversal
2. Target position traversal
3. SCR traversal (a sketch of SCR-controlled target embedding follows this list)
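For the SCR traversal, one generic way to control the signal-to-clutter ratio is to scale a simulated target response before superimposing it on a clutter background, using the standard definition SCR = 10·log10(P_target / P_clutter). The sketch below is a hypothetical image-domain illustration of this idea; the helper embed_target_at_scr is not from the paper, and the authors' echo-domain synthesis procedure is not reproduced here.

```python
import numpy as np

def embed_target_at_scr(clutter_img: np.ndarray,
                        target_chip: np.ndarray,
                        row: int, col: int,
                        scr_db: float) -> np.ndarray:
    """Insert a simulated target response into a clutter background so that the
    resulting signal-to-clutter ratio (in dB) equals scr_db. Generic sketch only."""
    out = clutter_img.astype(np.float64).copy()
    h, w = target_chip.shape
    patch = out[row:row + h, col:col + w]

    clutter_power = np.mean(patch ** 2)                      # local clutter power
    target_power = np.mean(target_chip.astype(np.float64) ** 2)
    # Scale so that 10*log10(scaled target power / clutter power) == scr_db
    scale = np.sqrt(clutter_power * 10 ** (scr_db / 10) / target_power)

    out[row:row + h, col:col + w] += scale * target_chip
    return out

# Example: a 16x16 point-like response embedded in simulated clutter at 16 dB SCR
clutter = np.abs(np.random.randn(512, 512))
chip = np.zeros((16, 16)); chip[8, 8] = 1.0
img = embed_target_at_scr(clutter, chip, row=256, col=256, scr_db=16.0)
```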
2.3. SAR Ground Moving Target Detection Based on Deep Learning
3. Implementation Details
3.1. MiniSAR Measured System
3.2. Construction of MiniSAR Twin Datasets
3.3. SAR Moving Target Intelligent Detection Based on Faster-RCNN
- Feature extraction: To begin with, the feature extraction network generates the feature maps used by the subsequent RPN through convolution and pooling operations. Since SAR images are grayscale and the targets to be recognized are relatively small, ResNet50 was selected as the backbone network of Faster-RCNN, and an FPN (Feature Pyramid Network) was further introduced to improve the detection performance of the model. In particular, ResNet50 effectively balances network depth against model performance: its residual connections allow the network to learn richer semantic information while avoiding vanishing and exploding gradients.
- Multi-scale feature fusion: In the process of feature extraction, the shallow layers contain less semantic information but precise target positions, while the deep layers contain richer semantic information but only rough locations. A conventional CNN (convolutional neural network) adopts a bottom-up feature extraction strategy and only utilizes the features of the last layer, which causes a sharp decline in detection performance on small objects. By comparison, the FPN adopts a bidirectional feature extraction strategy and achieves feature fusion through lateral connections, which fully exploits low-level high-resolution features together with high-level semantic information for prediction. Without increasing the computational load, a multi-scale feature pyramid is constructed by simply modifying the network connections, realizing robust detection of multi-scale targets.
- Region proposal: The RPN directly processes the feature maps output by the backbone feature extraction network and generates a series of rectangular region proposals through sliding windows (preset anchors). The convolution results are mapped to feature vectors and then fed into fully connected layers for classification and regression. On the one hand, the likelihood of each category is calculated by the SoftMax function, and a binary classification is performed on the anchors (target or background). On the other hand, the coordinates of the bounding boxes that contain targets are predicted and output through the regression layer.
- Classification and regression: The feature vectors generated from the proposals output by the RPN are classified and located by fully connected layers. By virtue of the SoftMax function, the target category within each proposal is further classified, and the coordinates of the bounding boxes are refined for the regions that contain targets. A minimal configuration sketch of such a detector is given after this list.
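A detector with this structure can be assembled from torchvision's Faster R-CNN components. The following is a minimal configuration sketch, assuming torchvision ≥ 0.13; the smaller-than-default anchor sizes are an illustrative choice for small SAR targets and do not reproduce the authors' exact training setup.

```python
import torch
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.backbone_utils import resnet_fpn_backbone
from torchvision.models.detection.rpn import AnchorGenerator

# ResNet50 + FPN backbone (weights=None: train from scratch, e.g., on the twin dataset)
backbone = resnet_fpn_backbone(backbone_name="resnet50", weights=None)

# Smaller anchors than the COCO defaults (illustrative choice for small SAR targets);
# one size tuple per FPN level, shared aspect ratios.
anchor_gen = AnchorGenerator(
    sizes=((8,), (16,), (32,), (64,), (128,)),
    aspect_ratios=((0.5, 1.0, 2.0),) * 5,
)

# Two classes: background + moving target
model = FasterRCNN(backbone, num_classes=2, rpn_anchor_generator=anchor_gen)

model.eval()
with torch.no_grad():
    # A single-channel SAR chip is replicated to 3 channels to match the backbone stem
    dummy = torch.rand(3, 512, 512)
    detections = model([dummy])
    print(detections[0]["boxes"].shape, detections[0]["scores"].shape)
```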
4. Experiment Results
4.1. Construction Results of Digital Twin Data
4.2. Validation of DT-Based SAR-GMTI Framework
4.3. Comparative Experiment Results and Analysis
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Parameter | Symbol | Value |
|---|---|---|
| Platform velocity | v | 6 m/s |
| Carrier frequency |  | 9.7 GHz |
| Sample frequency |  | 0.54 GHz |
| Orbit height | H | 289 m |
| Pulse repetition frequency |  | 1000 Hz |
| Range resolution |  | 0.33 m |
| Azimuth resolution |  | 0.25 m |
| Parameter | Symbol | Value |
|---|---|---|
| Orbit height | H | 289 m |
| Scene center distance |  | 649 m |
| Carrier frequency |  | 9.7 GHz |
| Range modulation frequency |  | GHz/s |
| Pulse repetition frequency |  | 1000 Hz |
| Platform velocity | v | 6 m/s |
| Orbit radius | r | 581 m |
| Sample frequency |  | 0.54 GHz |
| Motion Parameter (, ) | Target Position (Range, Azimuth) | SCR (dB) | Correctly Detected Targets | False Alarms | Missed Targets |
|---|---|---|---|---|---|
| = 1.5, = 0.006 | (981, 899) | 16 | 183 | 241 | 496 |
| = 1.0, = 0.004 | (981, 899) | 16 | 264 | 179 | 415 |
| = 0.5, = 0.002 | (981, 899) | 16 | 307 | 154 | 372 |
| = 1.0, = 0.004 | = 30 | 16 | 381 | 126 | 298 |
| = 1.0, = 0.004 | = 20 | 16 | 436 | 93 | 243 |
| = 1.0, = 0.004 | = 10 | 16 | 472 | 71 | 207 |
| = 1.0, = 0.004 | = 20 | = 1 | 514 | 48 | 165 |
| = 1.0, = 0.004 | = 20 | = 0.5 | 553 | 32 | 126 |
| = 0.5, = 0.002 | = 10 | = 0.5 | 618 | 17 | 61 |
| SCR (dB) | Correctly Detected Targets | False Alarms | Missed Targets | Precision |
|---|---|---|---|---|
| 12 | 243 | 172 | 436 | 0.586 |
| 14 | 312 | 141 | 367 | 0.688 |
| 16 | 357 | 115 | 322 | 0.756 |
| 18 | 396 | 92 | 283 | 0.811 |
| 20 | 423 | 76 | 256 | 0.815 |
| Methods | Correctly Detected Targets | False Alarms | Missed Targets | Recall Rate | Precision |
|---|---|---|---|---|---|
| CFAR () | 634 | 1582 | 45 | 0.934 | 0.286 |
| CFAR () | 508 | 1243 | 171 | 0.748 | 0.290 |
| CFAR () | 372 | 889 | 307 | 0.548 | 0.295 |
| Ours | 618 | 17 | 61 | 0.910 | 0.973 |
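For reference, the Recall Rate and Precision columns follow from the detection counts under the standard definitions recall = TP/(TP + FN) and precision = TP/(TP + FP); the short check below reproduces the values of the "Ours" row.

```python
# Precision/recall from detection counts, assuming the standard definitions
# recall = TP / (TP + FN) and precision = TP / (TP + FP).
def detection_metrics(tp: int, fp: int, fn: int):
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    return recall, precision

# "Ours" row of the comparison table: 618 correct, 17 false alarms, 61 missed
recall, precision = detection_metrics(tp=618, fp=17, fn=61)
print(f"recall={recall:.3f}, precision={precision:.3f}")  # recall=0.910, precision=0.973
```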