Article

Optical FBG Sensor-Based System for Low-Flying UAV Detection and Localization

1
Institute of Photonics, Electronics and Telecommunications, Riga Technical University, 12 Azenes Street, LV-1048 Riga, Latvia
2
Department of Radio Engineering and Information Security, Yuriy Fedkovych Chernivtsi National University, 58000 Chernivtsi, Ukraine
3
Fiber Optical Sensor Research Group (RTU FiberSens), Riga Technical University, Azenes Street 12, LV-1048 Riga, Latvia
*
Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(21), 11690; https://doi.org/10.3390/app152111690
Submission received: 29 September 2025 / Revised: 24 October 2025 / Accepted: 30 October 2025 / Published: 31 October 2025

Abstract

With the recent increase in the threat posed by unmanned aerial vehicles (UAVs) operating in environments where conventional detection systems such as radar, optical, or acoustic detection are impractical, growing attention is being paid to methods for detecting low-flying UAVs with a small radar cross-section (RCS). The most commonly used detection methods are radar detection, which is susceptible to electromagnetic (EM) interference, and optical detection, which is susceptible to weather conditions and line-of-sight constraints. This research aims to demonstrate the possibility of using passive optical fiber Bragg gratings (FBGs) as a sensing element array for low-flying UAV detection and localization. The principle is as follows: an optical signal propagating through an optical fiber is modulated by the FBGs' response to the air pressure generated by a low-flying (even hovering) UAV. As a result, a small target such as the DJI Avata drone can be detected and tracked by determining intensity surges. In this paper, the experimental setup of the proposed FBG-based UAV detection system, measurement results, and methods for analyzing UAV-caused downwash are presented. High-speed data reading and processing were achieved for low-flying drones in the possible presence of EM clutter. The proposed system correctly identified an overpassing UAV's flight height around 85 percent of the time and its location around 87 percent of the time, on average. The key advantages of the proposed approach are its comparatively straightforward implementation and its ability to detect low-flying targets in the presence of EM clutter.

1. Introduction

The use of UAVs, more commonly known as drones, is only becoming more widespread. Drones can serve civil and military purposes such as scouting, performing deliveries, taking measurements, etc. [1,2]. All of these use cases require dependable and accurate methods of UAV detection. Popular drone recognition methods use optical and acoustic sensors and radars [3], and incorporate sensor fusion techniques to mitigate the shortcomings of any individual detection method [4].
Acoustic detection methods provide a relatively cheap means of detection, are unaffected by vehicle size and lighting conditions, and can therefore be used around the clock. They also do not require the UAV to be in the line of sight for successful detection. These systems use highly sensitive sensors, such as omnidirectional microphones, that pick up sounds caused by the operation of drones [3,5,6,7,8]. Typically, noise caused by the UAV's propeller blades is used to detect the aircraft, but other airborne sounds made by the drone's movement through the air can be used as well. However, because acoustic UAV detection is a relatively cheap and simple method that does not rely on sophisticated hardware, its accuracy is highly susceptible to noise pollution caused by wind and other vehicles. The environment also plays a significant role in acoustic detection, as it directly impacts how the sound waves travel and echo across the terrain, and therefore the distance at which a UAV can be detected [9]. The usable range of an acoustic UAV detection system in ideal weather conditions is presumed to be around a couple of hundred meters, depending on the environment.
Distributed acoustic sensing (DAS) has emerged as a variation of conventional acoustic detection methods for UAV detection with the key advantage of overcoming the limited range of conventional microphone sensor arrays [7,8] by using optical fibers [10,11,12]. DAS employs fiber-optic acoustic sensors placed over a wide area, which can be interrogated from a central location, thereby enabling centralized data collection and processing, overcoming synchronization issues present with wide-scale deployment of microphone arrays [10]. Such an approach allows for drone localization and tracking by correlating audio signals received by several acoustic sensors; however, this approach is still susceptible to noise pollution.
Some of these problems can be mitigated by using an optical drone recognition method. Optical detection uses advanced camera systems that are designed for small to medium-sized object recognition [13,14]. The use of visual recognition means that optical detection systems are not impacted by sound pollution. Depending on the object size and camera resolution, the useful range is also significantly increased, ranging from a couple of hundred meters to several kilometers [3,13]. Unlike acoustic detection, optical drone recognition requires the object to be directly in the line of sight for successful operation. These systems are impacted by lighting conditions in the environment. Weather conditions such as rain, storms, snow, etc., play a significant role in the accuracy and operational range of such systems. Most optical detection systems cannot be used during the night and in low-light conditions. Environmental background can also increase or decrease the ability to distinguish a flying object [15]. Light detection and ranging (LiDAR) technology is being explored as a method for UAV detection, particularly in combination with acoustic sensing, for enhanced target detection and classification [16]. LiDAR systems can provide accurate 3D localization of the target while remaining immune to radio frequency (RF) interference; however, like all optical detection systems, they are highly susceptible to weather conditions.
However, by far, the most popular method in unmanned aerial vehicle detection is the use of radar technology [17,18]. Although not immune, radar technology mitigates the extreme sensitivity to weather conditions shown by the previously discussed methods. It does not require daylight or quiet conditions to work. It also has the longest effective range out of all the reviewed detection methods. Nevertheless, heavy rain and thunderstorms can impact the effective range of radar technology, although not to the same extent as in acoustic or optical recognition methods. One of the biggest drawbacks of using radar is its inability to detect low-flying targets, since its performance is directly impacted by the target's distance from the ground [3,14]. Another disadvantage of radar technology is that all targets must be in the line of sight and cannot be obscured by other objects. For example, the effectiveness of radar technology is greatly diminished in urban or woodland areas, where there are typically many objects that can interact with EM wave propagation and obscure a UAV. Some of these problems can be solved by using even higher frequency radars in the millimeter wavelength range, but this comes with its own problems, as shorter wavelengths are more susceptible to weather conditions, especially moisture and rain [3,14,15,19,20].
Contemporary counter unmanned aerial system (C-UAS) approaches trend towards sensor fusion techniques [4,21,22] and the integration of artificial intelligence (AI) to aid in target classification [4]. One approach is to integrate radar systems with cameras, which allows for a multi-layered architecture for UAV detection and classification [23,24]. While the radar part of the system provides information such as the bearing, range, and RCS of the target, the optical part, i.e., the camera, serves to classify the targets and allows for greater accuracy, in particular when it comes to differentiating between small UAVs and birds. In a similar manner, UAV-emitted RF communication signal detection combined with acoustic detection can mitigate the noise sensitivity of a purely acoustic detection system as well as aid in target classification [25]. Combining radar, optical, and acoustic C-UAS methods in a single system and training a convolutional neural network (CNN) on the radar echoes, captured images, and audio signals can significantly increase target classification accuracy [26].
UAV-caused downwash measurements can provide significant improvements in terms of low-flying object detection and effectiveness in object-dense areas such as woods, cities, etc. Downwash detection based on optical communications technologies such as the FBG can provide benefits similar to passive radar technology, thereby mitigating various disruptive environmental factors at a fraction of the cost [27,28]. Additionally, the possibility of reusing existing telecommunications infrastructure for this purpose can lower costs even further and allow for fast, wide-scale deployment of the proposed system. FBGs have a wide range of uses in medicine [29] and in the measurement of strain [30], humidity [31], and temperature [32], among many others [33,34]. FBGs can operate over long distances, making them highly sought after for large-scale sensing networks [35]. Additionally, advanced nanostructures [36,37] show great promise in enhancing the light coupling efficiency and, consequently, the sensitivity of the FBG sensor network. While the possible benefits are clear, they do not yet outweigh the drawbacks, such as increased manufacturing costs and complexity, and the general maturity of this technology. As one of the key advantages of the proposed system is the reuse of existing fiber technology, this is a significant consideration. Integrating FBG sensors in roadways for vehicle detection and classification [38,39] has generated interest in utilizing similar techniques to classify passing drones. Classification of passing drones is also a relevant topic when looking into targeted radar systems [40], where this novel UAV detection system can be paired with others, such as radio-over-fiber (ROF)-based radar systems, to create a multifaceted detection system. The straightforward separation of FBGs allows for distinct readings from each section, enabling an array of sensors to deliver a comprehensive analysis of disturbances caused by passing objects [38].
The FBG-based drone localization and tracking technique shares several similarities with the DAS approach, namely the use of multiple passive sensors distributed along an optical fiber that detect air pressure changes; in the case of DAS, however, the pressure changes detected by the sensors are significantly smaller, requiring high sensitivity. Both approaches offer immunity to electromagnetic interference (EMI). While DAS allows for the localization of drones over a wider area by using the time-of-arrival difference of the acoustic signals [11], the FBG-based approach allows for precise localization of the drone at the point where it overflies the sensor array and is immune to noise pollution. These features are useful when such a system is deployed with the goal of detecting intrusions in a critical area. The DAS approach requires more complex signal processing and more costly interrogator hardware and sensors than the FBG-based approach. The DAS and FBG-based approaches also differ in the manner of deployment and can be complementary to each other, enabling a sensor fusion-based approach to drone tracking and intrusion detection.
In this research, we propose a system that uses FBGs to evaluate the downwash effects directly beneath the drone, using the FBG sensor network strain readings to calculate the airflow speed along the z-axis, and we aim to experimentally demonstrate the feasibility of the proposed system. The system relies on surrounding the perimeter of the area of interest with an optical fiber equipped with FBG sensors whose reflection spectrum changes with mechanical deformation. The key advantage is that such fibers can detect objects flying as low as 1 m, which is significantly lower than most advanced radar systems. Furthermore, it is also a relatively cheap solution compared to the previously discussed methods, as the manufacturing costs of fiber optic cables are notably smaller than those of cameras, microphones, or radars used in drone detection. The sensors are not significantly impacted by weather conditions such as wind, rain, and snow, as the specific signature of a passing UAV can be easily distinguished. A significant advantage of FBG is that it is not affected by electromagnetic interference, which greatly increases its reliability in areas with EM clutter.
This paper is structured as follows: Section 2 summarizes the methods of analyzing UAV-caused downwash, Section 3 presents the experimental setup of the proposed UAV detection system, and Section 4 presents the results of the experimental validation of the system. Finally, Section 5 provides conclusions about the presented research.

2. Methods for the Analysis of UAV Downwash

UAV propellers generate complex patterns of air vortices, downwash flow, and other air disruptions that need to be accounted for when drones are used for precision applications [41,42,43]. Air disruption is caused in all directions, but the most notable and studied is the disruption below the drone in the z-direction. This region represents the largest disruption area, since the propellers' pitch is specifically designed to generate downward thrust; it is also particularly critical, as it can disrupt pesticide spray in agricultural applications and can be detected by UAV detection systems [41,42,43,44]. For this reason, many simulation tools provide the ability to assess these airflow fields and give insight into the potential disruptions a UAV can cause around it [43,45,46]. Most methods of analyzing air movement around the drone involve numerical simulations through Computational Fluid Dynamics (CFD) [43,45,46]. A widely used approach is the lattice Boltzmann method of fluid simulation, which provides insights into general downwash patterns and wingtip vortices [41,42].
Limited methods are available to provide practical insights into expected UAV downwash flow, which is essential for those seeking to evaluate the impact of a specific drone on the environment below. Practical evaluation of the intricacies of drone-induced turbulence is limited to using anemometers [41,44] or relying on costly and complex wind tunnel rotor experiments, which result in reduced induced turbulence. Basic fluid dynamics principles indicate that the thrust generated by a propeller is directly proportional to its surface area. From this, we can conclude that the induced turbulence will be proportional to the square of the propeller radius, although this turbulence can manifest as vorticity, downwash airflow, and other components. For this reason, a thorough and detailed study would necessitate the use of simulation tools, e.g., Ansys Fluent [43,45,46]. However, creating the mesh and importing drone characteristics can be quite challenging, as each model has unique specifications, such as propeller guard design, propeller speed, and rotor pitch, which collectively increase the computational resources required for precise simulations [41,42]. This is why a more straightforward and practical method for downwash analysis is essential. A summary of these publications, as well as the current research, is presented in Table 1.
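To illustrate the scaling argument above, classical actuator-disk (momentum) theory relates the mean induced (downwash) velocity below a hovering rotor to its thrust and disk area. This is a textbook approximation included here for orientation only, not a result taken from the cited studies:

```latex
% Actuator-disk (momentum theory) estimate of the mean induced velocity in hover
% T: per-rotor thrust, \rho: air density, A = \pi r^2: disk area of a rotor of radius r
T = 2 \rho A v_i^{2}, \qquad A = \pi r^{2}
\quad\Longrightarrow\quad
v_i = \sqrt{\frac{T}{2 \rho \pi r^{2}}}
```

Assuming, purely for illustration, a takeoff mass of roughly 0.41 kg for the DJI Avata (manufacturer's approximate figure, an assumption here) shared equally by its four rotors of 0.037 m radius (half of the 0.074 m diameter stated later in this paper), each rotor must produce about 1 N of thrust in hover, giving an induced velocity on the order of 10 m/s just below the rotor disk; the near-ground downwash actually reaching the sensors is, of course, lower and more structured.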

3. Experimental Setup

The experimental setup block diagram is depicted in Figure 1, and captured photographs of the experiment setup can be seen in Figure 2a. The setup includes the fiber that is placed in a plastic trench to minimize any side airflow induced by the drone or other external sources of air movement that could affect the strain on the fiber. The fiber is securely fixed between each sensor, creating 10 independent sections, each 10 cm long, where each section is equipped with one sensor. The integrated FBGs in this fiber optical network feature a full width at half-maximum (FWHM) of less than 0.5 nm, light reflectivity exceeding 95%, and a suppression ratio of side-lobes (SRSL) greater than 15 dB. The reflected wavelength spacing between the sensors is 5 nm, with an initial reflected wavelength in the range of 1520 to 1565 nm. The raw wavelength shift data from the deformed FBG sensors can be converted into a strain value using Formula (1):
$$\Delta\epsilon = \frac{\dfrac{\lambda_{\mathrm{act}} - \lambda_0}{\lambda_0} - B \cdot (T_{\mathrm{act}} - T_0)}{A},$$
where $\Delta\epsilon$ is the strain shift, $\lambda_0$ is the initial reflected Bragg wavelength, $\lambda_{\mathrm{act}}$ is the actual (measured) Bragg wavelength, $T_0$ is the initial temperature, $T_{\mathrm{act}}$ is the actual temperature, and $A$ and $B$ are the calibration coefficients.
The calibration coefficients for the used FBGs are $A = 7.758423 \times 10^{-7}~\mu\epsilon^{-1}$ and $B = 5.892923 \times 10^{-6}~^{\circ}\mathrm{C}^{-1}$. It is worth noting that for temperature changes below 0.1 °C, which was the case in our scenario, the second term becomes negligible, while the first term never is. Regardless, temperature compensation was performed during the experiments.
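As a minimal sketch of how Formula (1) could be applied to raw interrogator output, the following Python function converts measured Bragg wavelengths to strain using the calibration coefficients stated above; the function name and the array-based interface are illustrative assumptions, not part of the measurement device's software:

```python
import numpy as np

# Calibration coefficients reported for the FBGs used in this work
A = 7.758423e-7   # strain coefficient, 1/microstrain
B = 5.892923e-6   # temperature coefficient, 1/degC

def wavelength_to_strain(lambda_act, lambda_0, T_act=0.0, T_0=0.0):
    """Convert measured Bragg wavelengths to strain shift, per Formula (1).

    lambda_act, lambda_0 : measured and initial reflected wavelengths (same units, e.g. nm)
    T_act, T_0           : actual and initial temperatures (degC); the temperature term
                           is negligible for drift below 0.1 degC, as noted above.
    Returns the strain shift in microstrain.
    """
    lambda_act = np.asarray(lambda_act, dtype=float)
    rel_shift = (lambda_act - lambda_0) / lambda_0
    return (rel_shift - B * (T_act - T_0)) / A

# Example: a 10 pm shift on an FBG centered at 1550 nm, no temperature drift
print(wavelength_to_strain(1550.010, 1550.0))   # ~8.3 microstrain
```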
The measurement device, an SCN-80 S-line Scan 800 from Sylex (Bratislava, Slovakia), which operates in the wavelength range of 1510–1590 nm and has a scanning frequency of 5 kHz, consists of a superluminescent light emitting diode (SLED) that feeds an optical circulator (OC) and transfers light to the FBG sensors via a single-mode optical fiber (SMF). The measurement device performs 150 readings of the reflected signal from each FBG sensor every second and is fully controlled from a personal computer (PC), which allows saving the raw data after it has passed the optical spectrometer, with a listed resolution of 1 pm and a recorded precision of 1 fm, and the digital signal processor (DSP).
The DJI Avata (Shenzhen, China) drone, with a rotor diameter of 0.074 m and 5 blades per rotor, was used as the object under test, hovering and flying over the fiber in accordance with the outlined squares shown in Figure 2a. Each unit square measures 30 by 30 cm, which is comparable to the drone size. Each square corresponds to the coordinate system values (x, y), where x and y range from 0 to 8 in steps of 1, as depicted in Figure 2b. The straight red line on the coordinate system represents the fiber section with ten equidistantly distributed FBG sensors (two sensors per unit square), placed exactly in the middle of the flight area.
During flights, the UAV's location in the xy-plane was noted using the DJI Avata's integrated downward-facing camera, and its location along the z-axis was noted using the integrated height measurement system.

4. Results

In the first stage of the experiments, the ability of the FBG sensor array to analyze UAV aerial downwash was assessed. The DJI Avata UAV was flown over the sensor array at four different flight heights to collect relevant FBG strain data. The heights were 30, 60, 90, and 120 cm above the setup. Prior to this, each sensor was tested with an airflow module, which induced specific airflow velocities at a set distance from it. This was done to establish what strain value corresponds to a specific airflow velocity value for each sensor. It is necessary to mention that all raw measurement data were processed with a Savitzky–Golay-type third-order arithmetic filter with a 101-point sampling frame to ensure that sudden peaks and anomalies would be smoothed out, essentially providing anti-aliasing for our input signal. It was found that this combination of frame length and filter degree yielded the best fidelity in the measurements described below. A comparison between unfiltered and filtered data is shown in Figure 3. The strain data from the overpassing UAV flights were interpolated to airflow velocity values, and the result is visualized in Figure 4. This visualization was made using cubic interpolation between the four different flight data sets to provide a comprehensive view of how the high-pressure region dissipates below the UAV.
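The smoothing step described above can be reproduced with a standard Savitzky–Golay implementation; the short sketch below assumes SciPy is available and uses placeholder random data in place of the recorded wavelength shifts, with the stated third-order polynomial and 101-point frame:

```python
import numpy as np
from scipy.signal import savgol_filter

# Placeholder for raw readings, shape (n_samples, n_sensors): e.g. 10 s of data
# at 150 readings/s from the 10-sensor array. Real data would come from the interrogator.
wavelength_shift = np.random.randn(1500, 10) * 1e-3   # nm, random placeholder values

# Third-order Savitzky-Golay filter with a 101-point frame, applied along the time axis
smoothed = savgol_filter(wavelength_shift, window_length=101, polyorder=3, axis=0)
```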
As visible in the figure, when viewed as a 2D slice, the UAV creates an area of high-intensity downwash directly below it, followed by two distinct pockets of airflow further down. A low-pressure air pocket forms directly below and a small distance away from the hull of the UAV because there are two rows of propellers and, therefore, two sets of vortices below them. At higher intensities, these vortices appear to merge, while at lower intensities, they can be distinguished. This phenomenon has also been noted by other authors [41,42]. From this, it can be concluded that it is not always possible to localize a UAV by simply identifying the most deformed sensor and deducing that the object passed the system at that point. By analyzing the downwash data of different flights passing over the FBG sensor array, an effective method for UAV localization can be achieved.
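A sketch of the strain-to-airflow conversion implied above is given below: each sensor's calibration points (strain recorded at known airflow velocities produced by the airflow module) are interpolated, and the smoothed flight strain is mapped through that curve. The calibration values and variable names are placeholders for illustration, not measured data:

```python
import numpy as np
from scipy.interpolate import interp1d

# Hypothetical calibration for one sensor: strain (microstrain) observed at known
# airflow velocities (m/s) from the airflow module. Values are placeholders only.
cal_strain   = np.array([0.0, 2.0, 5.0, 9.0, 14.0])
cal_velocity = np.array([0.0, 1.0, 2.0, 3.0, 4.0])

# Cubic interpolation of the calibration curve, extrapolating beyond its endpoints
strain_to_velocity = interp1d(cal_strain, cal_velocity, kind="cubic",
                              bounds_error=False, fill_value="extrapolate")

# Map a smoothed strain trace from one overflight to estimated z-axis airflow velocity
flight_strain = np.array([0.5, 3.1, 7.8, 12.0, 6.4, 1.2])
print(strain_to_velocity(flight_strain))
```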
The experimental study included two sets of measurements designed to give insight into the ability of the proposed FBG sensor array to correctly identify the location of an overpassing UAV. The first set of measurements served as a tool for height analysis, which allows evaluating the viability of UAV flight height detection and possible extension to object differentiation. Four flights were performed at the heights of 30, 60, 90, and 120 cm above the sensor array. The induced wavelength shift on the FBG sensors in the baseline measurements can be seen in Figure 5.
From these measurements, we can both analyze the aerial downwash of the UAV and use them as a baseline against which other flights are compared in order to detect the flight height of a drone passing over the sensor array. Due to random air vortices and the possibility of some sensors reacting differently from others, it would not be possible to accurately assess the location or height of the UAV crossing using only the raw sensor data and choosing the most deformed sensor as an indicator of location. For higher accuracy predictions, the cosine similarity principle [47,48], which is a way of comparing the trends of two vector arrays, was used. This form of pattern recognition was chosen because it is relatively simple and requires a small initial data set, which is essential in this type of experiment. To do this, the absolute peak wavelength shift across all sensors within a disturbed period was located, and its timestamp was noted. Five wavelength shift values were taken before and after this timestamp, at intervals of 6.7 ms each, to form the increase and decrease profile of the wavelength shift of this most disturbed sensor, resulting in an 11 × 1 matrix. The wavelength shift values of all other sensors at those specific timestamps were taken to form an 11 × 10 matrix, which represents that specific crossing. Multiple iterations at each flight height were taken to form an averaged 11 × 10 wavelength shift matrix of the increase and decrease. Combining all four heights, a 4 × 11 × 10 base matrix was formed. Employing the same concept, the random flights were also converted into 11 × 10 value matrices, with each element corresponding to a strain value from the FBG sensors. The 11 arrays from the random matrix were compared to the 11 arrays from the set of four base matrices using the cosine similarity principle shown in Formula (2), and the best match was determined. During the cosine similarity comparison, all arrays were normalized to their own peak values so that, when comparing them, the most significant factor would be the overall trend of the array and not its absolute peak.
$$\cos\Phi = \frac{A(i,j,:) \cdot B(j,:)}{\lVert A(i,j,:) \rVert \, \lVert B(j,:) \rVert},$$
where $\cos\Phi$ is the cosine product, $A(i,j,:)$ is the corresponding base array from the base matrix, and $B(j,:)$ is the test array. The numerator is the dot product of these arrays, while the denominator is the product of their Euclidean norms.
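The matching step can be summarized in a short sketch: an 11-sample profile around the peak is built for every sensor, each array is normalized to its own peak as described above, and the test crossing is compared against the stored base matrices with the cosine measure of Formula (2). The array shapes follow the description above, but the function names and the choice to average the per-array similarities before picking the best match are illustrative assumptions:

```python
import numpy as np

def crossing_profile(shift, peak_idx, half_window=5):
    """Build the 11 x 10 profile matrix for one crossing.

    shift    : smoothed wavelength-shift array of shape (n_samples, 10 sensors)
    peak_idx : time index of the absolute peak over all sensors
    """
    return shift[peak_idx - half_window : peak_idx + half_window + 1, :]

def cosine_similarity(a, b):
    """Formula (2): cosine of the angle between two profile arrays."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_matching_base(base, test):
    """Pick the stored base matrix most similar to a test crossing.

    base : shape (n_bases, 11, 10), averaged profiles (e.g. the four flight heights)
    test : shape (11, 10), profile of the crossing under test
    """
    scores = []
    for i in range(base.shape[0]):
        sims = []
        for j in range(test.shape[0]):                 # the 11 time-sample arrays
            # Normalize each array to its own peak, mirroring the described processing
            # (the cosine measure itself is scale-invariant, so this is a formality)
            a = base[i, j, :] / np.max(np.abs(base[i, j, :]))
            b = test[j, :] / np.max(np.abs(test[j, :]))
            sims.append(cosine_similarity(a, b))
        scores.append(np.mean(sims))                   # aggregate over the 11 arrays
    return int(np.argmax(scores))                      # index of the best match

# Example with placeholder data: four base matrices and one test crossing
rng = np.random.default_rng(0)
base = rng.random((4, 11, 10))
test = base[2] + 0.05 * rng.random((11, 10))           # should match base index 2
print(best_matching_base(base, test))
```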
Using this principle and comparing against the ground truth recorded at the time of the flights, it was concluded that flights at 30 cm were correctly identified by this FBG sensor system and algorithm 90 percent of the time, flights at 90 and 120 cm around 85 percent of the time, and flights at 60 cm 80 percent of the time. This is likely due to the fact that, as visible in Figure 5, the 90 and 120 cm flights have more distinct pockets of heightened air pressure, while at the 60 cm height, the high peak in the middle can be mistaken for the central peak visible at the 30 cm flight height.
To examine this FBG sensor setup's ability to localize the UAV in the x-y plane, an initial set of configuration flights was conducted over columns 1 through 7 of the grid visible in Figure 2b. The purpose of these flights was to establish what the FBG strain reading responses could look like when the UAV passes through these specific sections. A relatively constant speed of 1 m/s and height of 1 m were maintained during these measurements. After a set of baseline measurements was obtained, random flights were conducted perpendicular to the sensor array, and the sections that were passed over were visually noted for further analysis and to assess how well this FBG sensor array could locate overpassing UAVs. The processed results of these measurements can be seen in Figure 6.
Across all measurements, the correct section was detected 81 percent of the time when using the cosine similarity principle. However, it is worth noting that this overall figure is lowered by the flights over the first section, which were not directly over the sensor array and were identified with only 70 percent accuracy. Flights over sections 2, 3, and 6 were correctly identified around 95 percent of the time, while sections 4, 5, and 7 were correctly identified in around 85 percent of measurements.
When considering only the flights that took place directly over the sensor array, this system was able to identify the location and narrow it down to a 30 cm long stretch of the area in 90 percent of cases. The results obtained provide proof of this concept and demonstrate the ability of an FBG array to localize an overpassing UAV. Looking ahead, this system would greatly benefit from a more comprehensive airflow data collection setup, enabling a more thorough accumulation of relevant flight data and ensuring higher accuracy in the discrimination of different flight heights and locations.

5. Conclusions

In this paper, a novel approach for the detection and localization of a small target, for example, the DJI Avata drone, using an optical FBG-based sensor array is presented. The presented system possesses a number of advantages over conventional detection systems, for example, radar. Conventional radar, which uses only EM waves for target detection, exhibits problems with the detection of small targets (targets with a small RCS value), detection in the presence of large clutter, detection of low-flying targets, and operation under conditions of electromagnetic and other interference (clouds, trees, radar jamming, etc.). Meanwhile, the suggested system uses the mechanical nature of the air pressure changes exerted on the optical FBG sensor array by the target.
The analysis is based on the time-frame estimation of the reflected signal intensity, followed by the detection and tracking of its maximum. The SCN-80 S-line Scan 800 from Sylex, being the core component of the sensor data processing part, supports high-speed data reading and processing at 150 readings per second, which is sufficient for detecting slow drones such as the DJI Avata.
The system successfully detected high-frequency aerial disturbances in the z-direction beneath a perpendicularly flying UAV. The UAV-induced strain on the FBG network decreased substantially with increasing flight height. By completing a series of test flights and compiling a standard flight matrix, it was possible to identify an overpassing UAV's flight height correctly around 85 percent of the time and its location around 87 percent of the time.
The achieved results validate the use of FBG strain sensors for analyzing UAV-induced downwash as an alternative to complex systems and a proof of concept that FBGs can be used as small-scale devices in drone localization. The presented system prototype has a high potential to serve as an augmentation to existing radar and other UAV detection systems in a sensor fusion approach as well as a standalone system for the detection of low-flying targets in environments where the deployment of conventional radar, optical, and acoustic detection systems is impractical.
Our future work includes testing of the proposed system in various weather conditions such as rain, snow, and wind, as well as determining the detection and localization capabilities with various UAVs in different flight modes. Additionally, we aim to explore the proposed system’s impact on UAV localization when integrated with a frequency-modulated continuous wave (FMCW) radar system.

Author Contributions

Conceptualization, I.M., D.O. and T.S.; methodology, I.M. and D.O.; software, M.K. and V.T.; validation, D.A., N.K. and O.N.; formal analysis, J.B. and S.M.; investigation, I.M., D.O., D.A., N.K., O.N., R.K.Z., A.K. and P.E.S.; resources, J.B., V.B. and T.S.; data curation, R.K.Z., M.K., V.T. and A.S.; writing—original draft preparation, I.M. and D.O.; writing—review and editing, A.S., A.I., S.M. and T.S.; visualization, M.K. and V.T.; supervision, A.I., V.B. and T.S.; project administration, V.B.; funding acquisition, I.M. and T.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research has been supported by the EU Recovery and Resilience Facility within the Project No 5.2.1.1.i.0/2/24/I/CFLA/003 “Implementation of consolidation and management changes at Riga Technical University, Liepaja University, Rezekne Academy of Technology, Latvian Maritime Academy and Liepaja Maritime College for the progress towards excellence in higher education, science and innovation” academic career doctoral grant (ID 1010) and by the National Research Foundation of Ukraine (grant No 2023.04/0150).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article. The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AI	artificial intelligence
CNN	convolutional neural network
C-UAS	counter unmanned aerial systems
DAS	distributed acoustic sensing
DSP	digital signal processor
EM	electromagnetic
EMI	electromagnetic interference
FBG	fiber Bragg grating
FMCW	frequency-modulated continuous wave
FWHM	full width at half-maximum
LiDAR	light detection and ranging
OC	optical circulator
PC	personal computer
RCS	radar cross-section
RF	radio frequency
ROF	radio-over-fiber
SLED	superluminescent light emitting diode
SMF	single mode optical fiber
SRSL	suppression ratio of side-lobes
UAV	unmanned aerial vehicle

References

  1. Kiss, B.; Ballagi, Á.; Kuczmann, M. Overview Study of the Applications of Unmanned Aerial Vehicles in the Transportation Sector. Eng. Proc. 2024, 79, 11. [Google Scholar] [CrossRef]
  2. Sivakumar, M.; Tyj, N.M. A Literature Survey of Unmanned Aerial Vehicle Usage for Civil Applications. J. Aerosp. Technol. Manag. 2021, 13, e4021. [Google Scholar] [CrossRef]
  3. Khan, M.A.; Menouar, H.; Eldeeb, A.; Abu-Dayya, A.; Salim, F.D. On the Detection of Unauthorized Drones—Techniques and Future Perspectives: A Review. IEEE Sensors J. 2022, 22, 11439–11455. [Google Scholar] [CrossRef]
  4. Semenyuk, V.; Kurmashev, I.; Lupidi, A.; Alyoshin, D.; Kurmasheva, L.; Cantelli-Forti, A. Advances in UAV detection: Integrating multi-sensor systems and AI for enhanced accuracy and efficiency. Int. J. Crit. Infrastruct. Prot. 2025, 49, 100744. [Google Scholar] [CrossRef]
  5. Mandal, S.; Chen, L.; Alaparthy, V.; Cummings, M.L. Acoustic Detection of Drones through Real-time Audio Attribute Prediction. In Proceedings of the AIAA Scitech 2020 Forum, Orlando, FL, USA, 6–10 January 2020. [Google Scholar] [CrossRef]
  6. Akbal, E.; Akbal, A.; Dogan, S.; Tuncer, T. An automated accurate sound-based amateur drone detection method based on skinny pattern. Digit. Signal Process. 2023, 136, 104012. [Google Scholar] [CrossRef]
  7. Lim, J.; Joo, J.; Kim, S.C. Performance Enhancement of Drone Acoustic Source Localization Through Distributed Microphone Arrays. Sensors 2025, 25, 1928. [Google Scholar] [CrossRef]
  8. Liu, L.; Sun, B.; Li, J.; Ma, R.; Li, G.; Zhang, L. Localization of UAVs Using Acoustic Signals Collected by Distributed Acoustic-Electric Sensors. In Proceedings of the 2025 IEEE 15th International Conference on Signal Processing, Communications and Computing (ICSPCC), Hong Kong, 18–21 July 2025; pp. 1–5. [Google Scholar] [CrossRef]
  9. Tejera-Berengue, D.; Zhu-Zhou, F.; Utrilla-Manso, M.; Gil-Pita, R.; Rosa-Zurera, M. Analysis of Distance and Environmental Impact on UAV Acoustic Detection. Electronics 2024, 13, 643. [Google Scholar] [CrossRef]
  10. Fang, J.; Li, Y.; Ji, P.N.; Wang, T. Drone Detection and Localization Using Enhanced Fiber-Optic Acoustic Sensor and Distributed Acoustic Sensing Technology. J. Light. Technol. 2023, 41, 822–831. [Google Scholar] [CrossRef]
  11. Chen, J.; Li, H.; Ai, K.; Shi, Z.; Xiao, X.; Yan, Z.; Liu, D.; Ping Shum, P.; Sun, Q. Low-Altitude UAV Surveillance System via Highly Sensitive Distributed Acoustic Sensing. IEEE Sensors J. 2024, 24, 32237–32246. [Google Scholar] [CrossRef]
  12. Fang, J.; Li, Y.; Ji, P.N.; Wang, T. Remote Drone Detection and Localization with Fiber-Optic Microphones and Distributed Acoustic Sensing. In Proceedings of the 2022 Optical Fiber Communications Conference and Exhibition (OFC), San Diego, CA, USA, 6–10 March 2022; pp. 1–3. [Google Scholar]
  13. Takano, H.; Nakahara, M.; Suzuoki, K.; Nakayama, Y.; Hisano, D. 300-Meter Long-Range Optical Camera Communication on RGB-LED-Equipped Drone and Object-Detecting Camera. IEEE Access 2022, 10, 55073–55080. [Google Scholar] [CrossRef]
  14. Zitar, R.A.; Al-Betar, M.; Ryalat, M.; Kassaymeh, S. A review of UAV Visual Detection and Tracking Methods. arXiv 2023, arXiv:2306.05089. [Google Scholar] [CrossRef]
  15. Liu, Z.; An, P.; Yang, Y.; Qiu, S.; Liu, Q.; Xu, X. Vision-Based Drone Detection in Complex Environments: A Survey. Drones 2024, 8, 643. [Google Scholar] [CrossRef]
  16. Seidaliyeva, U.; Ilipbayeva, L.; Utebayeva, D.; Smailov, N.; Matson, E.T.; Tashtay, Y.; Turumbetov, M.; Sabibolda, A. LiDAR Technology for UAV Detection: From Fundamentals and Operational Principles to Advanced Detection and Classification Techniques. Sensors 2025, 25, 2757. [Google Scholar] [CrossRef]
  17. Gong, J.; Yan, J.; Li, D. Radar Challenges and Solutions for Drone Detection. In Proceedings of the 2025 26th International Radar Symposium (IRS), Hamburg, Germany, 21–23 May 2025; pp. 1–8. [Google Scholar] [CrossRef]
  18. Larrat, M.; Sales, C. Classification of Flying Drones Using Millimeter-Wave Radar: Comparative Analysis of Algorithms Under Noisy Conditions. Sensors 2025, 25, 721. [Google Scholar] [CrossRef]
  19. Khawaja, W.; Ezuma, M.; Semkin, V.; Erden, F.; Ozdemir, O.; Guvenc, I. A Survey on Detection, Classification, and Tracking of UAVs Using Radar and Communications Systems. IEEE Commun. Surv. Tutor. 2025. [Google Scholar] [CrossRef]
  20. Gong, J.; Yan, J.; Kong, D.; Li, D. Introduction to Drone Detection Radar with Emphasis on Automatic Target Recognition (ATR) technology. arXiv 2023, arXiv:2307.10326. [Google Scholar] [CrossRef]
  21. Deng, M.; Ma, Y.; Wang, C. Design of Airborne Target Detection System with Fusion of Multi-Sensors. In Proceedings of the 2025 5th International Conference on Artificial Intelligence and Industrial Technology Applications (AIITA), Xi’an, China, 28–30 March 2025; pp. 330–334. [Google Scholar]
  22. He, T.; Hou, J.; Chen, D. Multimodal UAV Target Detection Method Based on Acousto-Optical Hybridization. Drones 2025, 9, 627. [Google Scholar] [CrossRef]
  23. Mehta, V.; Dadboud, F.; Bolic, M.; Mantegh, I. A Deep Learning Approach for Drone Detection and Classification Using Radar and Camera Sensor Fusion. In Proceedings of the 2023 IEEE Sensors Applications Symposium (SAS), Ottawa, ON, Canada, 18–20 July 2023; pp. 1–6. [Google Scholar] [CrossRef]
  24. Jajaga, E.; Rushiti, V.; Ramadani, B.; Pavleski, D.; Cantelli-Forti, A.; Stojkovska, B.; Petrovska, O. An Image-Based Classification Module for Data Fusion Anti-drone System. In Proceedings of the International Conference on Image Analysis and Processing—ICIAP 2022 Workshops; Mazzeo, P.L., Frontoni, E., Sclaroff, S., Distante, C., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 422–433. [Google Scholar]
  25. Frid, A.; Ben-Shimol, Y.; Manor, E.; Greenberg, S. Drones Detection Using a Fusion of RF and Acoustic Features and Deep Neural Networks. Sensors 2024, 24, 2427. [Google Scholar] [CrossRef]
  26. Lee, H.; Han, S.; Byeon, J.I.; Han, S.; Myung, R.; Joung, J.; Choi, J. CNN-Based UAV Detection and Classification Using Sensor Fusion. IEEE Access 2023, 11, 68791–68808. [Google Scholar] [CrossRef]
  27. Alhussein, A.N.D.; Qaid, M.R.T.M.; Agliullin, T.; Valeev, B.; Morozov, O.; Sakhabutdinov, A. Fiber Bragg Grating Sensors: Design, Applications, and Comparison with Other Sensing Technologies. Sensors 2025, 25, 2289. [Google Scholar] [CrossRef]
  28. Tang, Z.; Ma, H.; Qu, Y.; Mao, X. UAV Detection with Passive Radar: Algorithms, Applications, and Challenges. Drones 2025, 9, 76. [Google Scholar] [CrossRef]
  29. Akinyemi, T.O.; Omisore, O.M.; Lu, G.; Wang, L. Toward a Fiber Bragg Grating-Based Two-Dimensional Force Sensor for Robot-Assisted Cardiac Interventions. IEEE Sensors Lett. 2022, 6, 5000104. [Google Scholar] [CrossRef]
  30. He, X.L.; Wang, D.H.; Wang, X.B.; Xia, Q.; Li, W.C.; Liu, Y.; Wang, Z.Q.; Yuan, L.B. A Cascade Fiber Optic Sensors for Simultaneous Measurement of Strain and Temperature. IEEE Sensors Lett. 2019, 3, 3502304. [Google Scholar] [CrossRef]
  31. Mihailov, S.J.; Ding, H.; Szabo, K.; Hnatovsky, C.; Walker, R.B.; Lu, P.; De Silva, M. High Sensitivity Fiber Bragg Grating Humidity Sensors Made With Through-the-Coating Femtosecond Laser Writing and Polyimide Coating Thickening. IEEE Sensors Lett. 2024, 8, 5000404. [Google Scholar] [CrossRef]
  32. Jelbuldina, M.; Korganbayev, S.; Seidagaliyeva, Z.; Sovetov, S.; Tuganbekov, T.; Tosi, D. Fiber Bragg Grating Sensor for Temperature Monitoring During HIFU Ablation of Ex Vivo Breast Fibroadenoma. IEEE Sensors Lett. 2019, 3, 5000404. [Google Scholar] [CrossRef]
  33. Chourasia, R.K.; Katti, A. Bragg Fibers: From Optical Properties to Applications; Springer Nature: Cham, Switzerland, 2024. [Google Scholar] [CrossRef]
  34. Kok, S.P.; Go, Y.I.; Wang, X.; Wong, M.L.D. Advances in Fiber Bragg Grating (FBG) Sensing: A Review of Conventional and New Approaches and Novel Sensing Materials in Harsh and Emerging Industrial Sensing. IEEE Sensors J. 2024, 24, 29485–29505. [Google Scholar] [CrossRef]
  35. Braunfelds, J.; Haritonovs, E.; Senkans, U.; Kurbatska, I.; Murans, I.; Porins, J.; Spolitis, S. Designing of Fiber Bragg Gratings for Long-Distance Optical Fiber Sensing Networks. Model. Simul. Eng. 2022, 2022, 8331485. [Google Scholar] [CrossRef]
  36. Vaiano, P.; Carotenuto, B.; Pisco, M.; Ricciardi, A.; Quero, G.; Consales, M.; Crescitelli, A.; Esposito, E.; Cusano, A. Lab on Fiber Technology for biological sensing applications. Laser Photonics Rev. 2016, 10, 922–961. [Google Scholar] [CrossRef]
  37. Yermakov, O.; Zeisberger, M.; Schneidewind, H.; Kim, J.; Bogdanov, A.; Kivshar, Y.; Schmidt, M.A. Advanced fiber in-coupling through nanoprinted axially symmetric structures. Appl. Phys. Rev. 2023, 10, 011401. [Google Scholar] [CrossRef]
  38. Braunfelds, J.; Senkans, U.; Skels, P.; Janeliukstis, R.; Porins, J.; Spolitis, S.; Bobrovs, V. Road Pavement Structural Health Monitoring by Embedded Fiber-Bragg-Grating-Based Optical Sensors. Sensors 2022, 22, 4581. [Google Scholar] [CrossRef]
  39. Senkans, U.; Silkans, N.; Spolitis, S.; Braunfelds, J. Comprehensive Analysis of FBG and Distributed Rayleigh, Brillouin, and Raman Optical Sensor-Based Solutions for Road Infrastructure Monitoring Applications. Sensors 2025, 25, 5283. [Google Scholar] [CrossRef]
  40. Kozlov, V.; Kharchevskii, A.; Rebenshtok, E.; Bobrovs, V.; Salgals, T.; Ginzburg, P. Universal Software Only Radar with All Waveforms Simultaneously on a Single Platform. Remote Sens. 2024, 16, 1999. [Google Scholar] [CrossRef]
  41. Zhang, H.; Qi, L.; Wu, Y.; Musiu, E.M.; Cheng, Z.; Wang, P. Numerical simulation of airflow field from a six–rotor plant protection drone using lattice Boltzmann method. Biosyst. Eng. 2020, 197, 336–351. [Google Scholar] [CrossRef]
  42. Wen, S.; Han, J.; Ning, Z.; Lan, Y.; Yin, X.; Zhang, J.; Ge, Y. Numerical analysis and validation of spray distributions disturbed by quad-rotor drone wake at different flight speeds. Comput. Electron. Agric. 2019, 166, 105036. [Google Scholar] [CrossRef]
  43. Halim, M.N.A.; Fung, K.V.; Marwah, O.M.F.; Rahim, M.Z.; Saleh, S.J.M.; Hassan, S. CFD simulation on airflow behavior of quadcopter fertilizing drone for pineapple plantation. AIP Conf. Proc. 2023, 2530, 040009. [Google Scholar] [CrossRef]
  44. Shouji, C.; Alidoost Dafsari, R.; Yu, S.H.; Choi, Y.; Lee, J. Mean and turbulent flow characteristics of downwash air flow generated by a single rotor blade in agricultural drones. Comput. Electron. Agric. 2021, 190, 106471. [Google Scholar] [CrossRef]
  45. Ghirardelli, M.; Kral, S.T.; Müller, N.C.; Hann, R.; Cheynet, E.; Reuder, J. Flow Structure around a Multicopter Drone: A Computational Fluid Dynamics Analysis for Sensor Placement Considerations. Drones 2023, 7, 467. [Google Scholar] [CrossRef]
  46. Parra, P.H.G.; Angulo, M.V.D.; Gaona, G.E.E. CFD Analysis of two and four blades for multirotor Unmanned Aerial Vehicle. In Proceedings of the 2018 IEEE 2nd Colombian Conference on Robotics and Automation (CCRA), Barranquilla, Colombia, 1–3 November 2018; pp. 1–6. [Google Scholar] [CrossRef]
  47. Lahitani, A.R.; Permanasari, A.E.; Setiawan, N.A. Cosine similarity to determine similarity measure: Study case in online essay assessment. In Proceedings of the 2016 4th International Conference on Cyber and IT Service Management, Bandung, Indonesia, 26–27 April 2016; pp. 1–6. [Google Scholar] [CrossRef]
  48. Zhang, R.; Xu, Z.; Gou, X. ELECTRE II Method Based on the Cosine Similarity to Evaluate the Performance of Financial Logistics Enterprises under Double Hierarchy Hesitant Fuzzy Linguistic Environment. Fuzzy Optim. Decis. Mak. 2023, 22, 23–49. [Google Scholar] [CrossRef]
Figure 1. Block diagram of the experiment setup.
Figure 2. Overview of the experiment setup, including (a) captured picture with the FBG sensor array, and (b) flight area coordinate system, where the red line indicates the fiber placement.
Figure 3. The intensity map of a test flight, (a) without filtering, (b) filtered with a Savitzky–Golay-type third-order arithmetic filter.
Figure 4. UAV-induced airflow in the z-axis direction when crossing the FBG array.
Figure 5. The intensity map of four test flights corresponding to flights perpendicular to the fiber at (a) 120 cm, (b) 90 cm, (c) 60 cm, and (d) 30 cm.
Figure 6. The intensity map of four test flights corresponding to flights perpendicular to the fiber at column 3 from the grid designation.
Table 1. Summary of research on UAV downwash.
Ref. | Year | Type of UAV | Rotor ø, m | Type of Assessment Environment | Intended Application | Result
[41] | 2020 | Custom six-rotor drone | 0.381 | Lattice Boltzmann fluid simulation and indoor anemometer measurement | Improvement of plant protection in agricultural drones | Confirmation that LBM simulations are within 95% of the practically measured airflow components
[42] | 2019 | M234-AT type four-rotor drone | 0.766 | Lattice Boltzmann fluid simulation and wind tunnel experiment | Accurate positioning of the pesticide spray nozzle | Airflow vortices and downwash velocities analyzed, and optimal spray parameters confirmed
[44] | 2021 | Single rotor blade of Xrotor Pro X8 CCW HOBBY-WING | 0.740 | Indoor setup with anemometer at different points below the rotor | Better understanding of downwash generated by a single rotor in agricultural drones | Distributions of different velocity components found, as well as vorticity and turbulence intensities evaluated
[43] | 2022 | Computer-generated four-rotor PINEXRI-20 equivalent | 0.737 | Ansys Fluent software for mesh analysis | Optimization of spray system design and location for agriculture | Downwash velocity distributions found and reviewed
[45] | 2023 | Foxtech D130 X8 four-rotor drone | 0.710 | Ansys Fluent software and open-air measurements | Drone turbulence study to counter propeller flow interference in wind measurements | Airflow velocity components analyzed in CFD; open-air measurements used to find the best case for accurate wind measurements
[46] | 2018 | CAD-generated six-rotor drone | N/A | Ansys Fluent software for mesh analysis | Identify areas of high turbulence below the drone | Four- and two-blade configurations compared; vortex, helicity, and turbulence evaluated
This paper | 2025 | DJI Avata | 0.074 | Indoor setup with FBG sensors integrated in the optical fiber | Detect and localize drones passing over the FBG sensor array | Measured drone-induced downwash; achieved localization of the drone passing over the FBG sensor array, as well as flight altitude detection with accuracy up to 90 percent