Measurement Techniques for Highly Dynamic and Weak Space Targets Using Event Cameras
Abstract
1. Introduction
- For weak space targets’ event stream data, we improve the conventional spatiotemporal correlation filter. Specifically, the event stream is first segmented into pseudo-image frames at fixed time intervals; then, based on the trajectory characteristics of the space targets, a circular local sliding window is employed to determine the spatial neighborhood of the events; finally, noise within the event neighborhoods in the pseudo-image frames is assessed based on the event density—either within a single neighborhood or across multiple neighborhoods—thereby achieving effective denoising of the event stream data for weak space targets (Section 2);
- To extract space targets from the event stream, we introduce a density-based DBSCAN clustering algorithm to achieve subpixel target centroid extraction. For certain ultrahigh-dynamic scenarios—where rapid and nonuniform camera motions lead to target information distortion and data discontinuity—we utilize real-time motion data from an inertial measurement unit (IMU) to construct a camera motion model that infers the camera’s position and orientation at any given moment, thereby enabling the precise compensation and correction of trajectory distortions arising during motion (Section 3);
- We developed a ground-based simulation system to verify the applicability of our proposed method with actual cameras and in real-world scenarios (Section 4).
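As a concrete illustration of the second contribution, the sketch below shows sub-pixel centroid extraction via density-based clustering. It is a minimal, self-contained DBSCAN implementation rather than the authors' exact pipeline; the function names, the parameter values (`eps`, `min_pts`), and the plain mean-of-coordinates centroid are illustrative assumptions.

```python
import numpy as np

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point (-1 = noise)."""
    n = len(points)
    labels = np.full(n, -1)
    # brute-force pairwise distances (fine for the event counts in one frame)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        if visited[i]:
            continue
        visited[i] = True
        neigh = list(np.flatnonzero(d[i] <= eps))
        if len(neigh) < min_pts:
            continue  # provisional noise; may later be claimed as a border point
        labels[i] = cluster
        seeds = neigh
        while seeds:
            j = seeds.pop()
            if not visited[j]:
                visited[j] = True
                jn = np.flatnonzero(d[j] <= eps)
                if len(jn) >= min_pts:      # j is a core point: expand from it
                    seeds.extend(jn)
            if labels[j] == -1:
                labels[j] = cluster
        cluster += 1
    return labels

def centroids(points, labels):
    """Sub-pixel centroid of each cluster: mean of member event coordinates."""
    return {c: points[labels == c].mean(axis=0) for c in set(labels) if c != -1}
```

With two tight clusters of events and one isolated noise event, the noise event stays labeled -1 while each cluster yields a fractional-pixel centroid.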
2. Denoising of Weak High-Dynamic Targets
2.1. Methodology
2.2. Optimization
- Define a set of radii R = {r_1, r_2, …, r_K}, where r_1 < r_2 < … < r_K. For each event e_i in the pseudo-image frame, we define its neighborhood set N_k(e_i) at radius r_k as the set of events whose pixel coordinates lie within a distance r_k of e_i;
- At each radius r_k, the local neighborhood density ρ_k(e_i) = |N_k(e_i)| is the number of events within the kth-scale neighborhood that are sufficiently close to e_i;
- To integrate the density information across multiple scales, we introduce a multiscale fusion function D(e_i) that combines the densities ρ_k(e_i) obtained at the different radii. Only those events that maintain a certain density across all the scales are recognized as target events, thereby effectively eliminating false detections caused by local noise aggregation;
- As in the single-scale filtering approach, a final decision threshold is applied to D(e_i) to determine whether an event e_i is retained (denoted as 1) or discarded (denoted as 0).
- Hierarchical screening: First, a rapid preliminary screening is performed using the smallest radius to eliminate the vast majority of the sparse noise; only the events retained from this initial screening are then subjected to a more refined density evaluation using the larger radii;
- Parallel/neighborhood accumulation: When computing the density across various scales, an octree spatial data structure is employed, enabling a single query to return the aggregation count for multiple radii, thereby reducing redundant scanning.
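The multiscale screening above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a brute-force distance matrix instead of the octree acceleration, and it assumes an "AND across scales" fusion (an event must meet the density threshold at every radius), which is one plausible reading of the fusion function; the radii and thresholds are placeholders.

```python
import numpy as np

def multiscale_denoise(events, radii, thresholds):
    """events: (N, 2) pixel coordinates of events in one pseudo-image frame.
    An event is kept only if its circular-neighborhood event count reaches
    the threshold at every radius (min-across-scales fusion, an assumption).
    Returns a boolean keep-mask over the events."""
    pts = np.asarray(events, float)
    # pairwise distances; an octree would avoid this O(N^2) scan
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    keep = np.ones(len(pts), dtype=bool)
    for r, t in zip(radii, thresholds):
        counts = (d <= r).sum(axis=1) - 1   # exclude the event itself
        # hierarchical screening: events rejected at a small radius are
        # never re-admitted at larger radii (the AND below enforces this)
        keep &= counts >= t
    return keep
```

A dense clump of target events survives both scales, while an isolated noise event fails already at the smallest radius.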
3. Extraction of Weak High-Dynamic Targets
- Initializing the Particle Set: The first step in particle filtering is initializing a set of particles, each representing a possible camera state described by motion parameters such as position, velocity, and orientation. These particles can be initialized using the IMU’s initial data or drawn randomly from a prior motion model. The state of each particle is represented as x = (p, v, q), where p is the position, v is the velocity, and q is the orientation;
- Particle Prediction (Based on IMU Data): At each time step k, each particle predicts the current state from the previous state and the IMU’s measured acceleration a_k and angular velocity ω_k. This prediction is executed through the motion model.
- Particle Update (Based on Event Data and Observations): The core of particle filtering is updating the particles’ weights using the event camera’s observational data. Each particle’s weight is proportional to how closely its predicted state matches the actual observations. For each particle i, the match with the event camera data is computed to determine the particle weight w_i; particles with higher match values receive higher weights.
- Particle Resampling: Particle filtering resamples the particles based on their weights, creating a new particle set. This eliminates low-weight particles and focuses on those that better match the actual observational data, improving accuracy and robustness;
- State Estimation: The optimal estimate at the current time is obtained as the weighted average of the particle states, x̂ = Σ_i w_i · x_i; that is, particle filtering computes the camera’s final motion state by averaging the states of all the particles, weighted by their respective weights;
- Trajectory Correction and Compensation: Using the optimal estimated state, the trajectory of star points captured by the event camera can be corrected. Particle filtering compensates for lost or distorted event points, corrects deviations in the camera’s trajectory, and restores the original target event’s trajectory.
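The steps above can be sketched as a toy particle filter. For brevity this version is one-dimensional (position and velocity only, no orientation), drives the prediction with a scalar IMU acceleration, and assumes the event data have already been reduced to one position observation per step; the particle count and noise parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(imu_accel, observations, dt, n=500, sigma_a=0.5, sigma_z=1.0):
    """1-D sketch of the five steps: state per particle = (position, velocity).
    imu_accel[k] drives the prediction; observations[k] is a position fix
    assumed to be derived from the event stream."""
    # 1. initialize particles around the prior state
    x = rng.normal(0.0, 1.0, (n, 2))          # columns: position, velocity
    estimates = []
    for a, z in zip(imu_accel, observations):
        # 2. predict: integrate the IMU acceleration with process noise
        x[:, 1] += (a + rng.normal(0.0, sigma_a, n)) * dt
        x[:, 0] += x[:, 1] * dt
        # 3. update: Gaussian likelihood of the event-based observation
        w = np.exp(-0.5 * ((x[:, 0] - z) / sigma_z) ** 2)
        w /= w.sum()
        # 4. resample: keep high-weight particles, drop low-weight ones
        x = x[rng.choice(n, n, p=w)]
        # 5. state estimate: mean of the (now uniformly weighted) particles
        estimates.append(x[:, 0].mean())
    return np.array(estimates)
```

Step 6 (trajectory correction) would then re-map the recorded event coordinates through the estimated camera poses, which is omitted here.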
4. Experiments and Results
4.1. Experimental Setup
- Temporal Interval (Δt): Event cameras typically operate at microsecond temporal resolutions. Conventional spatiotemporal denoising methods applied at these resolutions risk event loss during high-dynamic imaging. To mitigate this, our algorithm integrates events into pseudo-image frames, using intervals Δt calibrated against typical star tracker exposure durations (e.g., 5 ms, 10 ms, and 20 ms). This ensures that sufficient space target events are captured per frame while maintaining the high-temporal-resolution advantage;
- Neighborhood Parameters (r, R) and Density Threshold: These values were determined empirically through statistical analysis of experimental data across 4–20°/s angular velocities, with scenario-adaptive adjustments: stronger motion dynamics (which reduce the target energy) call for larger radii r and lower density thresholds, while elevated stray-light conditions call for smaller radii r and higher thresholds. Spatial distribution analysis under varying angular velocities yielded the following statistically optimal parameters: neighborhood radius set R = {3, 5, 7, 9, 11, …} pixels and a density threshold of 25–100 events (equivalent to 5 × 5–10 × 10 pixel areas), based on high-dynamic star-imaging characteristics; target events are identified when neighborhood event counts exceed these thresholds.
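A minimal sketch of the pseudo-frame accumulation described under the temporal-interval parameter might look as follows; the (t, x, y) event layout, the function name, and the microsecond units are assumptions, not the authors' interface.

```python
import numpy as np

def to_pseudo_frames(events, dt_us, shape):
    """Bin an event stream into pseudo-image frames of duration dt_us.
    events: array of (t_us, x, y) rows; shape: (H, W) sensor size.
    Each frame counts events per pixel, trading a little temporal
    resolution for enough target events per frame to filter on."""
    ev = np.asarray(events)
    t0 = ev[:, 0].min()
    frame_idx = ((ev[:, 0] - t0) // dt_us).astype(int)
    frames = np.zeros((frame_idx.max() + 1, *shape), dtype=np.int32)
    # scatter-add one count per event at (frame, row=y, col=x)
    np.add.at(frames, (frame_idx, ev[:, 2].astype(int), ev[:, 1].astype(int)), 1)
    return frames
```

With a 5 ms interval (dt_us = 5000), three events at 0 µs, 4000 µs, and 6000 µs fall into two frames, the first pixel accumulating two counts.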
4.2. Experimental Results and Analysis
5. Verification
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
Share and Cite
Liu, H.; Sun, T.; Tian, Y.; Wu, S.; Xing, F.; Wang, H.; Wang, X.; Zhang, Z.; Yang, K.; Ren, G. Measurement Techniques for Highly Dynamic and Weak Space Targets Using Event Cameras. Sensors 2025, 25, 4366. https://doi.org/10.3390/s25144366