Article

Fusion Framework of Remote Sensing and Electromagnetic Scattering Features of Drones for Monitoring Freighters

School of Aeronautic Science and Engineering, Beihang University, Beijing 100191, China
* Author to whom correspondence should be addressed.
Drones 2026, 10(1), 74; https://doi.org/10.3390/drones10010074
Submission received: 22 December 2025 / Revised: 15 January 2026 / Accepted: 21 January 2026 / Published: 22 January 2026

Highlights

What are the main findings?
  • The quadcopter drone exhibits strong dynamic electromagnetic scattering characteristics but appears only faintly in remote sensing grayscale images.
  • The peak and average RCS values of the example UAV are higher than those of the quadcopter, making the UAV easier to distinguish in remote sensing grayscale images.
What are the implications of the main findings?
  • Grayscale imaging technology can be prioritized for capturing UAVs, vessels, and freighters.
  • Radar detection combined with remote sensing technology can be used to detect unmanned aerial vehicles with significant dynamic RCS features.

Abstract

Certain types of unmanned aerial vehicles (UAVs) are both convenient platforms for remote sensing observation and low-altitude targets that are themselves monitored by other devices. To study the remote sensing grayscale and radar cross-section (RCS) features of example drones, we present a fusion framework based on remote sensing imaging and electromagnetic scattering calculations. The results indicate that the quadcopter drone shows weak visual effects in remote sensing grayscale images while exhibiting strong dynamic electromagnetic scattering features, with fluctuations that can exceed 29.6815 dBm2. The average and peak RCS of the example UAV are higher than those of the quadcopter in the given cases. The example freighter exhibits the most intuitive grayscale features and the largest RCS mean under the given observation conditions, with a peak of 51.6186 dBm2. Compared to the UAV, the small boat with a sharp bow design has similar dimensions while exhibiting lower RCS features and intuitive remote sensing grayscale. Under cross-scale conditions, grayscale imaging is beneficial for monitoring UAVs, freighters, and other nearby boats. Dynamic RCS features and grayscale local magnification are suitable for locating and recognizing drones. The established approach is effective in learning the remote sensing grayscale and electromagnetic scattering features of drones used for observing freighters.

1. Introduction

Vertical takeoff and landing rotorcraft-type drones equipped with remote sensing equipment can perform regional monitoring of landslides, ship operations, ground activity, and low-altitude aircraft [1,2,3]. Unilateral detection methods exhibit limitations in complex environments, such as ground clutter, nighttime, and heat island effects, and the comprehensive learning of remote sensing and electromagnetic scattering features of unmanned aerial vehicles (UAVs) is attracting increasing attention.
Because they are relatively low-cost and can be deployed on flexible schedules, unmanned aerial vehicle platforms allow remote sensing technology to play a major role in ground and low-altitude data collection [4]. UAV remote sensing provides technological potential for oil spill detection at sea [5]. A dynamic adaptive path-planning algorithm can ensure the safe navigation of unmanned vessels [6]. Drone remote sensing technology is used to determine key forest health indicators and several vegetation indices collected on-site [7]. The normalized vegetation index is used in remote sensing analysis of multispectral views of farmland [8]. Drone platforms are used to estimate the quantity and quality parameters of silage grassland [9]. Efficiently identifying objects with similar appearances in remote sensing images has always been a key challenge [10]. Similar objects in the coverage area of high-resolution images represent targets for extraction [11]. The emergence of variant aircraft has made it possible for two different targets to show similar results in the same remote sensing image, while the quadcopter has more complex electromagnetic scattering features [12]. Ships and ocean obstacles can have adverse effects on drone remote sensing imagery [13]. Microwave passive remote sensing has advantages such as all-weather capability, penetration depth, and low power consumption [14]. Lidar and drones are used to record the movement of clay and debris flows in coastal valleys [15]. Submeter-level detection capability provides technical support for the automatic recognition of ship targets [16]. Reducing the radar cross-section (RCS) of airborne radar, remote sensing sensors, and other equipment is an important part of aircraft stealth [17]. The electromagnetic scattering characteristics of ships can be extracted by drones equipped with remote sensors.
Conventional quadcopter drones have a cross or X-shaped planar layout. In contrast, the Y-shaped quadrotor can achieve low scattering levels at certain azimuth angles [18]; this type of fuselage has a smaller RCS in both forward and aft directions compared to a rectangular body. The hybrid energy integrated electric propulsion system enhances the high endurance capability of UAV platforms [19]. The high-sensitivity ultraviolet imaging sensor has a detection range of more than 10 km for air-to-air missiles [20]. High-resolution optical images may confuse the background with ships, and the sparse and uneven distribution of ships can also affect the generation of positive samples [21]. Synthetic aperture radar (SAR) can provide high-quality data results for ocean surveillance [22]. Artificial intelligence analysis has proven efficient in drone remote sensing [23]; the electromagnetic scattering features of curved surfaces have been calculated and analyzed [24]. Surface boats, low-altitude drones, and carrier-based aircraft are all targets of maritime surveillance [25,26]. The backpack exhaust port design of the drone blocks the downward infrared radiation from the nozzle, and this raised body part can become a positive sample for remote sensing imaging [27]. A ship detection framework combining multi-level feature cross-layer network fusion and a bidirectional recurrent neural network has been established [28]. A tilt rotor cabin is designed with different electromagnetic scattering characteristics under three deformation modes [29]. The pitch variation of the blades, the movement of the tail rotor, and the rotation of the main rotor all lead to complex dynamic electromagnetic scattering features [30]. A hierarchical exploration method based on mixed viewpoint generation has been established for finite-field UAV [31]. Numerical simulation experiments have been conducted to validate the optical remote sensing imaging system based on compressive sensing [32]. 
Key modules such as drone image acquisition, data transmission, and pest and disease detection have been integrated [33]. Optical remote sensing maps of water bodies in arid and semi-arid regions with a resolution of 1 km have been drawn [34]. Drones equipped with various sensors can detect or remotely image ships but also become observation objects themselves for high-altitude platforms.
As established earlier, a single detection method can show limitations in complex environments such as at nighttime or with background clutter. Certain drone and UAV platforms can become monitoring objects for high-altitude equipment while they themselves monitor ships. Whether the rotors, which cause the dynamic electromagnetic scattering characteristics of quadcopter drones, also produce prominent features in remote sensing grayscale images remains an open question. Optical remote sensing technology has high spatial resolution but is greatly limited by weather conditions. Microwave remote sensing can effectively penetrate atmospheric obstacles such as clouds, rain, and fog, but has relatively low spatial resolution and is susceptible to speckle effects. Therefore, in this study, we attempted to establish a fusion framework to analyze the remote sensing grayscale and RCS of example drones used for monitoring ships. This study has academic significance and engineering value in relation to the joint detection and monitoring of multiple platforms.
This paper is organized as follows: the research methodology is described in Section 2, the drone models are established in Section 3, and the relevant results are discussed in Section 4. Finally, the paper is summarized.

2. Approaches

Figure 1 shows a sketch illustrating our fusion framework for the remote sensing and electromagnetic scattering features of drones, where xfyfzf refers to a coordinate system fixed on the freighter, and xuyuzu stands for a coordinate system fixed on the UAV. Different drones carrying sensors can conduct remote sensing monitoring of cargo ships and unidentified boats, and the spatial positions of the drones can change, allowing mutual monitoring. High-altitude remote sensing airships can monitor drones and ships, with xsyszs being a coordinate system fixed on the airship. Ground-based radar vehicles or radar stations on other platforms can move to designated locations to detect relevant targets.
For the grayscale images obtained using remote sensing sensors, the blended information can be represented in the following form:
Brs = [b1, b2, b3]
where Brs is the blended information matrix and bi is the matrix element. In a Cartesian coordinate system, a combination of coordinate values and grayscale can represent the matrix elements:
b1 = x, b2 = y, b3 = ζ
where x and y represent coordinates and ζ is the grayscale value.
For the electromagnetic scattering features of the object, the blending learning information can be expressed as follows:
Bes = [e1, e2, e3]
e1 = α, e2 = β, e3 = σ
where Bes is another information matrix, ei is the matrix element, α is the azimuth, β is the elevation angle, and σ is the RCS. The calculation of RCS is beneficial for analyzing the stealth characteristics of cross-size targets at a given observation angle.
For a bright spot target appearing in remote sensing observation areas, the key azimuth evaluation of radar detection can be carried out as follows:
e1ras = arccos[(Xras − b1bs) / √((Xras − b1bs)² + (Yras − b2bs)²)]
αrs = arcsin[b2bs / √(b1bs² + b2bs²)]
where e1ras is the azimuth of the radar station, and b1bs and b2bs are coordinates of the bright spot target in grayscale figure coordinates. αrs is the azimuth determined based on the bright spot in the grayscale image. In the default state, the xy axis of the bright spot target itself is considered a translation of the horizontal coordinate system of the grayscale image. Xras and Yras are the coordinates of the radar station within the extended range of the grayscale image. The relative positional relationship in grayscale images can be transformed into sensitive orientations in the local coordinate system of freighters or UAVs.
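As a minimal numerical illustration, the two azimuth expressions above can be evaluated as follows; the function names and the convention that the bright-spot and radar-station coordinates share the extended grayscale-image frame are our assumptions:

```python
import math

def radar_azimuth(x_ras, y_ras, x_bs, y_bs):
    # e1ras: azimuth of the radar station relative to the bright-spot target,
    # as arccos of the x-offset over the planar distance (in degrees).
    dx = x_ras - x_bs
    dy = y_ras - y_bs
    return math.degrees(math.acos(dx / math.hypot(dx, dy)))

def brightspot_azimuth(x_bs, y_bs):
    # alpha_rs: azimuth of the bright spot in the grayscale image, as arcsin
    # of its y-coordinate over its distance from the image origin (degrees).
    return math.degrees(math.asin(y_bs / math.hypot(x_bs, y_bs)))
```

For example, a radar station lying directly along the positive x-direction from the bright spot gives radar_azimuth(...) = 0°, and a bright spot on the diagonal x = y gives brightspot_azimuth(...) = 45°.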
Taking into account both electromagnetic scattering and remote sensing characteristics, the complete blending information matrix of the target can be obtained as follows:
B = [Brs; Bes]
where B is the complete information matrix. Remote sensing observations can provide the cargo ship with the orientation and grayscale information of other boats in the local coordinate system. More connections between RCS and grayscale multi-physics fields can be seen in Section 2.1 and Section 2.2.
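For concreteness, the stacking of the two information blocks into the complete matrix B can be sketched as follows; the sample values are illustrative only:

```python
import numpy as np

# One remote sensing sample: [x coordinate (m), y coordinate (m), grayscale]
B_rs = np.array([[35.5, -19.2, -12.0]])
# One electromagnetic scattering sample: [azimuth (deg), elevation (deg), RCS (dBm^2)]
B_es = np.array([[25.0, 5.0, -8.6]])
# Complete blending information matrix B = [B_rs; B_es]
B = np.vstack([B_rs, B_es])
```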

2.1. Remote Sensing Imaging

Drones or remote sensing airships can perform remote sensing imaging of targets within the observation area. For the transformation of remote sensing grayscale,
ζt = (ζt,max − ζt,min) / (ζ0,max − ζ0,min) · (ζ0 − ζ0,min) + ζt,min
where ζ 0 is the original grayscale and ζ t is the grayscale after linear transformation. For the translation transformation of the grayscale image,
(X, Y, ζ) = (x + ΔX, y + ΔY, ζ)
where X and Y are the transformed coordinates, while x and y are the coordinates before the transformation, and ζ is grayscale. For a water background, the grayscale needs to be kept at an extremely low level to highlight the target sample:
ζb = Kb1·ζb0 + Kb2·ζt,min
where ζb is the background grayscale and Kb1 and Kb2 are transformation coefficients. As presented in reference [3], the projection difference can be expressed as follows:
dh = √(RP² − Δh² + 2HΔh) − RP
where dh is the projection difference, H is the height of the sensor, ∆h is the height of point P, and RP is the distance from the sensor to point P. When the heights of two drones are close,
[x2, y2, z2]ᵀ = [x1, y1, 0]ᵀ + [k1(Ld + d1), k2(Wd + d2), (z2/|z2 − z1|)·(Hd + d3)]ᵀ
Ld = max(Ld1, Ld2), Wd = max(Wd1, Wd2), Hd = max(Hd1, Hd2)
where x, y, and z represent the coordinates of the center point of the drone. k1 and k2 are coefficients, and di is the preset safety gap. Ld, Wd, and Hd are the three geometric dimensions of length, width, and height of the drone, and numerical subscripts are used to distinguish different drones. When the horizontal positions of two drones approach each other,
[x2, y2, z2]ᵀ = [0, 0, z1]ᵀ + [((x2 − x1)/|x2 − x1|)·(Ld + d1), ((y2 − y1)/|y2 − y1|)·(Wd + d2), k3(Hd + d3)]ᵀ
For the example boat in the local coordinate system, the bow direction can be preset as follows:
Mve^z(mve) = [cos Ave  −sin Ave  0; sin Ave  cos Ave  0; 0  0  1] × Mve(mve)
where Mve is the grid coordinate matrix of the boat model, mve is the boat model, and Ave is the pointing angle of the ship. More calculations related to posture transformation can be found in references [35,36]. The grayscale value is converted from the surface electromagnetic scattering characteristics:
ζ0(j) = (σu(j) − σun) / (σum − σun) · ζa
where j is the facet serial number and ζa is the grayscale amplitude range. The grayscale range can be transformed and set according to actual needs. The subscript u represents non-unitization, and the additional subscripts m and n represent the maximum and minimum, respectively. More information on remote sensing modeling can be found in reference [3]. Unlike optical imagers for monitoring water bodies [37], the microwave remote sensing modeling method used here considers the absorption of incident waves by water bodies, resulting in an extremely low-grayscale water background.
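A compact sketch of the two grayscale mappings above, the linear transformation of the grayscale range and the conversion of facet RCS to grayscale, might look as follows (the function names are ours; the operations are vectorized with NumPy):

```python
import numpy as np

def rcs_to_grayscale(sigma_u, zeta_a):
    # Map non-unitized facet RCS values sigma_u onto a grayscale range of
    # amplitude zeta_a, normalizing by the min/max of the facet RCS.
    s_min, s_max = sigma_u.min(), sigma_u.max()
    return (sigma_u - s_min) / (s_max - s_min) * zeta_a

def linear_stretch(zeta0, t_min, t_max):
    # Linear grayscale transformation from the original range to [t_min, t_max].
    z_min, z_max = zeta0.min(), zeta0.max()
    return (t_max - t_min) / (z_max - z_min) * (zeta0 - z_min) + t_min
```

With zeta_a chosen negative (as in the grayscale values quoted in Section 4), the brightest facets map to 0 and the weakest to the full negative amplitude.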
This remote sensing imaging method has been validated, as shown in Figure 2, with the comparison sample being the aircraft shown in Figure 22 of reference [3] where Fr is radar wave frequency. The observation angle settings in both cases are the same: α = 279° and β = 83°. It can be seen that the upper surface of the UAV wing is generally bright gray (grayscale value from −15 to −6), which is similar to the wing features of the comparative case. The orientation of the nose and tail wing features can also be obtained. The grayscale of the leading edge of the vertical tail is lower than that of the fuselage, which is also reflected in the comparative case. The dark gray details at the tail imply the transformation of surface electromagnetic scattering characteristics into grayscale in three-dimensional space. This imaging method established for observing the grayscale features of targets is feasible.

2.2. Electromagnetic Scattering Calculation

For the quadcopter configuration, the model of the drone can be represented as follows:
md(t) = [mr1(t), mr2(t), mr3(t), mr4(t), mf(t)], t = 0
where md is the drone model, mr is the rotor model, mf is the fuselage model, and t is the time. In the local coordinate system of a drone, the dynamic updating of the rotor can be represented as follows:
Mr1xy(mr1, t=0) = [x(mr1, t=0) − Xr1, y(mr1, t=0) − Yr1]
Mr1z(mr1, t) = [cos Ar,r1(t)  −sin Ar,r1(t)  0; sin Ar,r1(t)  cos Ar,r1(t)  0; 0  0  1] × Mr1xy(mr1, t=0)
where Ar is the rotor rotation angle, with additional subscripts denoting the various rotors. Mr is the grid matrix of the rotor model, with additional superscripts denoting different transformation operations. Xr1 is the distance from the rotation axis of rotor 1 to the yz plane, and Yr1 is the distance from the rotation axis of rotor 1 to the xz plane, noting that the body remains level by default. For more information on the transformation operations, please refer to references [18,30]. After updating all rotor models, the dynamic model of the drone can be updated to
Md(md(t)) = [Mr1z(mr1(t)), Mr2z(mr2(t)), Mr3z(mr3(t)), Mr4z(mr4(t)), Mf(mf(t=0))]
where Md and Mf are the grid matrices of the drone model and fuselage, respectively. Based on the assumption of isotropic scattering, the radar cross-section of a target is defined as the ratio of the energy flux density scattered toward the radar receiving antenna to the energy flux density incident on the target:
σ = lim(R→∞) 4πR²·(Ss/Si) = lim(R→∞) 4πR²·(|Es|²/|Ei|²)
where Ss refers to the energy flux density of the target scattered wave at the radar receiving antenna. R represents the distance from the target to the radar antenna. Si is the energy flux density of the incident wave at the target. Es represents the electric field strength of the target scattered wave at the receiving antenna. Ei is the electric field strength of the incident wave at the target. For more information related to panel RCS calculation, please refer to references [12,29]. After taking into account the contributions of facet scattering and edge diffraction, the RCS of the target can be expressed as follows:
σ = | Σi=1..NF(t) √σFi + Σj=1..NE(t) √σEj |²
where subscript E represents the contribution of the edge, with F representing facet contribution. NE is the number of edges, with NF representing the number of facets. RCS is solved under far-field conditions, and the incident wave is a plane wave. The RCS analysis under the change of ship attitude can be obtained from reference [36].
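The rotor update described above, shifting the rotor grid so that its rotation axis passes through the origin and then rotating about the z-axis, can be sketched as follows; translating the grid back to the hub position afterwards is our assumption, so that the rotor grids reassemble into the drone model:

```python
import numpy as np

def rotate_rotor(grid, x_axis, y_axis, angle_deg):
    # grid: (N, 3) array of rotor vertex coordinates in the drone frame.
    # (x_axis, y_axis): horizontal position of the rotor's rotation axis.
    a = np.radians(angle_deg)
    rot_z = np.array([[np.cos(a), -np.sin(a), 0.0],
                      [np.sin(a),  np.cos(a), 0.0],
                      [0.0,        0.0,       1.0]])
    shifted = grid - np.array([x_axis, y_axis, 0.0])   # move axis to origin
    rotated = shifted @ rot_z.T                        # rotate about z
    return rotated + np.array([x_axis, y_axis, 0.0])   # restore hub position (assumed)

# Example: one blade-tip vertex of rotor 1, with axis at (Xr1, Yr1) = (0.2, 0.2) m
tip = np.array([[0.3, 0.2, 0.05]])
moved = rotate_rotor(tip, 0.2, 0.2, 90.0)
```

Applying this to each of the four rotor grids at every time step, and keeping the fuselage grid fixed, reproduces the dynamic model assembly described above.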
The surface scattering characteristics of curved surfaces can be visualized as follows:
Cs(i) = Dc·Bcn + Dc·(σf(i) − σfn)/(σfm − σfn)·(Bcm − Bcn)
where Cs is the customized scattering characteristics intensity and i is the serial number. Bcm and Bcn are the upper and lower boundaries of the color window, respectively. Dc is the control factor of color depth. σf refers to the facet RCS. σfm and σfn represent the maximum and minimum of the facet RCS, respectively.
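A sketch of this color-window mapping, with our own function name, is:

```python
import numpy as np

def scattering_color(sigma_f, b_cn, b_cm, d_c=1.0):
    # Map facet RCS values sigma_f into the color window [b_cn, b_cm],
    # scaled by the color-depth control factor d_c.
    s_min, s_max = sigma_f.min(), sigma_f.max()
    return d_c * b_cn + d_c * (sigma_f - s_min) / (s_max - s_min) * (b_cm - b_cn)
```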
The established RCS calculation method has been validated, as shown in Figure 3, where Frh denotes the radar wave frequency with horizontal polarization. The RCS mean of the blue line is 10.1399 dBm2, which is 0.188 dBm2 lower than that of the red dashed curve, where the comparative data come from the output results of FEKO [29,30] under the same conditions. The two curves are similar in peak, shape, and mean; the maximum RCS of the blue curve is 30.97 dBm2, occurring at α = 90.75°. The established electromagnetic scattering calculation method is therefore accurate for evaluating the RCS of the target.
Another verification of the electromagnetic scattering calculation, on the example boat target, is shown in Figure 4, where the RCS mean of the red line is 6.6703 dBm2. The RCS mean of the green line is 6.8154 dBm2, giving an absolute error of 0.1451 dBm2 and a relative error of 2.17%. The two curves are similar in shape and peak value; the maximum RCS of the red curve is 33.55 dBm2, which appears at α = 179.75°. These results indicate that the presented electromagnetic scattering calculation method is effective.
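Finally, the coherent combination of facet and edge contributions in the RCS expression above can be illustrated as follows, treating the individual contributions as complex scattering amplitudes whose squared magnitudes are σFi and σEj (an assumption of ours; the paper's solver details follow its cited references [12,29]):

```python
import numpy as np

def total_rcs_dbsm(facet_amps, edge_amps):
    # Coherent sum of complex facet and edge scattering amplitudes;
    # the squared magnitude of the sum is the total RCS in m^2,
    # converted here to dBm^2 for comparison with the curves in the text.
    total = np.sum(facet_amps) + np.sum(edge_amps)
    return 10.0 * np.log10(np.abs(total) ** 2)
```

Two equal in-phase unit contributions combine to about 6.02 dBm2, while opposite-phase contributions cancel, which is why dynamic rotor motion produces the strong RCS fluctuations discussed in Section 4.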

3. Drone Model Establishment

A model of drone 1 was established, as shown in Figure 5, where Dr1 is the diameter of rotor 1. This drone configuration adopts a typical symmetrical design: the fuselage is symmetrical about the xz plane, rotors 1 and 2 are symmetrical about the xz plane, and rotors 3 and 4 are symmetrical about the xz plane. The rotation speed of the four rotors is the same in the default state, and the rotation direction is set according to the markings in the figure.
The main dimensions of drone 1 are shown in Table 1, where Nr1 represents the number of rotors. Drone 2 and drone 1 adopt the same exterior design and size distribution, and the orientation of the blades is also the same in the initial state. In the top view, rotor 1 rotates counterclockwise. The overall fuselage layout is low and wide at the front and high and narrow at the back, with four support arms facing outward and below to connect various rotors.
The model of this UAV is presented in Figure 6, where Lu, Wu, and Hu represent the length, width, and height of the aircraft, respectively. Wht represents the wingspan of the horizontal tail wing, and Luf is the length of the front fuselage.
This UAV layout includes high-aspect-ratio wings and a narrow fuselage, with the size distribution shown in Table 2. The leading edge of the wing is parallel to the yz plane, and both the vertical and horizontal tail fins have a swept design. This UAV can be equipped with remote sensing sensors to monitor ships and drones, as well as radar to detect these targets.
Two ship models are established, as shown in Figure 7, where Lfr, Wfr, and Hfr represent, respectively, the length, width, and height of the freighter. The superstructure of the cargo ship includes the bow compartment, several cargo hold ceilings, and the stern. Lve is the length of the small boat. These two ship types represent objects of observation for drones and radar stations.
The main dimensions of the ship are shown in Table 3, with Wve being the width of the boat. It should be noted that the boat target is set as a fishing boat, pirate ship, or other unidentified boat that may appear around the freighter.
The grid of drone 1 is shown in Figure 8, with high-fidelity unstructured grid technology applied. The small-sized areas of the support arm and the fuselage are strictly inspected to maintain grid quality. Partial lifting measures have been implemented to improve the mesh quality of the fuselage edge, convex surface, blades, and hub.
The mesh of the UAV model is generated as shown in Figure 9, with high-precision unstructured mesh technology adopted. The vertical tail, horizontal tail, wing leading edge, trailing edge, and fuselage edge are all strictly inspected to ensure grid quality.

4. Results and Discussion

4.1. Feature Analysis in Drones

The remote sensing grayscale image of drones 1 and 2 obtained from the high-altitude platform is shown in Figure 10, where it can be observed that the bow of the freighter points at an angle of 18° to the x-axis. The y coordinate of the center point of drone 1 is 25 m, and the x coordinate of the center point of drone 2 is −50 m. After local magnification, the heads of both drones point along the positive x-axis direction, where rotor 1 of drone 1 has rotated 28° and that of drone 2 has rotated 95°. At x = 35.5 m, the upper surface of the front fuselage of drone 1 appears as a bright light gray feature (grayscale about −12). At y = −19.2 m, a large amount of gray (grayscale around −18) and a small amount of dark gray (grayscale near −115) features appear at the stern of the cargo ship. Upper structures such as the tail tower and cargo hold ceilings of the ship can be identified. The good display of cross-scale multiple targets in a three-dimensional observation field relies on local magnification observations from multiple angles. The details of drone 2 show dark gray features on its side and top protrusions. In the local reference frame of drone 1, the cargo ship appears within the azimuth range of −152°~−95°. The extracted grayscale image is a partial representation of four-dimensional data.
According to the blending method presented earlier, the observation radar station is set up in the far-field area on the right side of the drone for independent observation. The dynamic RCS of drone 1 at key azimuths in the local coordinate system is shown in Figure 11, where θ is the depression angle of the fuselage and wr1 is the rotational angular velocity of rotor 1. The RCS curve for α = 25° fluctuates between −18.6 dBm2 and approximately −1.2866 dBm2, with a mean of −8.596 dBm2, as shown in Table 4. At t = 0.0011 s, the RCS of the red curve reaches its maximum value and then fluctuates continuously until it reaches its minimum at t = 0.0214 s. When α = 27°, the blue curve shows more numerous and smaller minima, reducing the mean to −9.8871 dBm2, with the minimum falling to −29.95 dBm2 at t = 0.0122 s. In the case of α = 30°, the increased contribution of panel scattering on the side of the fuselage significantly raises the mean RCS, with the maximum peak reaching 3.4035 dBm2. Although the blades are smaller than the fuselage or support arms, the rotors can cause RCS fluctuations exceeding 25 dBm2. Consistent with the earlier remote sensing analysis, the fuselage has more distinct grayscale features than the propeller blades.
The surface scattering features of drone 1 are provided in Figure 12, where one blade of rotor 2 is light orange (about −58) and the other two blades are orange (around −46) because of the larger angle of electromagnetic wave incidence on the surface. The rotor rotated 16°, and the forward tilt of the fuselage caused the surface of the outer two blades of rotor 3 to appear light orange. Despite the small angle tilt design on the nose surface, these panels still cannot effectively deflect electromagnetic waves in non-threatening directions, resulting in these areas appearing red (about −8.5). The upper surface of the support arm is generally in a state of alternation among red, orange, and a small amount of yellow (about −72). Due to the obstruction of the fuselage, there are some blue (near −170) characteristic areas on the side and tail of the body.

4.2. UAV Feature Analysis

The remote sensing observations of this UAV and cargo ship are independently conducted, as shown in Figure 13, with drone 2 not appearing within the given range. This UAV target appears at x = −92 m, with the nose pointing at an angle of 20° to the x-axis of the grayscale image; the wings, fuselage, and tail can all be clearly identified. At y = −3 m, the UAV wing surface is distributed with a significant amount of white (grayscale about −7) and a small amount of light gray (grayscale close to −22). Because of the presence of the elevation angle, there is no dark gray color appearing on the side of the drone’s top protrusion. Due to its larger size and more conventional surfaces, this UAV is more easily noticed than drone 1. After local magnification, more features are revealed at the tail of drone 1, indicating the presence of a nose-down angle, with the rotor rotated 38°. In the local coordinate system of the freighter, the drone appears within the azimuth range of 51°~72°, but the dynamic characteristics of the drone’s RCS are extremely significant. For cross-scale targets and complex surfaces, a wide range of grayscale settings is necessary. At y = 7.5 m, a deep gray area (grayscale near −155) appears on the side of the freighter stern, with the bow of the cargo ship pointing at a 30° angle to the x-axis. Because of the increase in elevation angle, the original multiple dark gray features (grayscale around −120) of the cargo ship have significantly decreased. The time cost of high-precision detail analysis is shown in Appendix A Table A1.
The RCS comparison between this UAV and drone 1 is shown in Figure 14, with the drone transient taken at t = 0.02 s. Due to the combined effect of the fuselage side, vertical tail, and wing outer end, the RCS curve of the UAV forms a significant lateral peak, reaching 31.3046 dBm2 at α = 88.5°, where the mean is 9.2840 dBm2, as presented in Table 5. When α = 43°, the RCS curve of the UAV reaches its minimum value of −58.33 dBm2. For the RCS curve of drone 1, the mean is as low as −0.0212 dBm2 and the peak appears at α = 138.3°, with the rotor having rotated 72° and θ = 0°. Because the body of drone 1 has a simple polyhedral design and a smaller size, its average level is much lower than that of the UAV. Although this freighter is huge, the inclined design of the exterior facades of the hull and superstructure keeps its average index from being much higher than that of the UAV. The towering stern of the cargo ship contributes significantly to the forward and aft RCS and, to a certain extent, to the lateral peaks; the RCS reaches 34.57 dBm2 when α = 90°.
When the elevation angle increases to 5°, the RCS of this UAV is as shown in Figure 15, where the mean is 7.8307 dBm2. At α = 85.5°, the UAV RCS curve reaches its maximum value of 30.2469 dBm2. Due to the increase in radar wave frequency, the mirror reflection effect of the body of the polyhedral drone 1 is enhanced, resulting in a peak value of 16.2082 dBm2 at α = 85.5°, where the mean is −1.1442 dBm2. At the current transient time, the rotor of drone 1 has rotated 110°, and the minimum of the RCS curve is −35.03 dBm2, appearing at α = 83.25°. The superstructure of the freighter includes a large number of polyhedral designs, and the bow surface is a conventional elliptical multi-section configuration, which generates more facet contributions at higher incident wave frequencies; the average RCS increases to 15.9962 dBm2, and the peak reaches 45.1612 dBm2. Within the azimuth range of 68.75°~87.75°, the RCS curve of the UAV is significantly higher than the other two curves.
As presented earlier, within the azimuth range of 6.25°~29.5°, the RCS curve of the UAV is significantly lower than the other two, while the electromagnetic scattering characteristics on its surface require further attention. As shown in Figure 16, large areas of deep red (about −15) and red (near −22) features appear on the leading edge of the nose and vertical tail. Most of the illuminated areas on the fuselage are orange (around −55), with a small amount of yellow (close to −68) and red. Due to the use of an inverted design for the horizontal tail, there are differences in the scattering characteristics of the surfaces of the two horizontal tail fins, with a large number of blue areas (approximately −172) on the upper surface of one tail fin. Because of the obstruction of the front half of the wing, a large area of blue features appears on the upper surface near the 3/5 chord line, while there are yellow features and a few green features near the trailing edge of the wing. As an important tool for entering the battlefield, UAVs have lower surface scattering characteristics and smaller RCS, which can improve their survivability and comprehensive combat effectiveness.

4.3. Feature Analysis for Freighter

The grayscale analysis of the ship targets is shown in Figure 17, where the sample boat appears at y = 51.65 m. At x = −70.28 m, the exterior facade of the freighter stern shows a darker gray color (grayscale close to −173), with the tail deck being gray (grayscale about −65) with a small amount of bright gray (grayscale near −28). In the middle of the freighter, seven cargo hold ceilings can be identified; the angle between the bow pointing and the x-axis is 12°. At x = −36.22 m, the surface of the boat is gray and bright gray (grayscale approximately −25), with the bow pointing at an angle of 36° to the x-axis. The UAV target appears in front of the cargo ship at x = 76.31 m, with the nose pointing at a 15° angle to the x-axis. After local magnification observation, a roll angle is visible on the body of drone 1, where the rotor has rotated 87°. At x = −63.76 m, the exterior facade of a building at the stern of the cargo ship shows a darker gray feature. Due to the huge deck area and the differently shaped superstructure, the grayscale characteristics of this freighter remain evident under these observation conditions. As the observed object, and considering the navigation safety of cargo ships, a higher freighter grayscale is preferable; that is, the closer its surface grayscale value is to 0 within the given range, the easier it is to recognize at ultra-long distances. To highlight the details of the various parts of the cargo ship surface, more grayscale levels or a wider grayscale range is preferable.
The RCS comparison of different ships is shown in Figure 18, where the mean RCS of the blue line is 20.3747 dBm2. The RCS curve of the freighter shows three large peaks exceeding 32 dBm2, located at α = 0°, α = 90°, and α = 180°. Within the azimuth ranges of 0°~17.75° and 159.8°~180°, the blue curve is significantly higher than the other three curves. When α = 120°, the freighter RCS curve reaches its minimum of −41.53 dBm2, after which the curve fluctuates violently with increasing azimuth until reaching the maximum value of 47.3988 dBm2. The RCS mean of the boat curve is as low as −6.6201 dBm2, with a peak of 11.0371 dBm2. Despite having a slender body compared to the boat, this UAV has an average RCS of 7.1488 dBm2, with a peak RCS of 28.6187 dBm2. Under the current transient conditions, the rotor of drone 1 has rotated 29°, and the average RCS of the green curve is −0.6377 dBm2, with a peak of 15.4825 dBm2. Although the example boat has larger external dimensions and more pronounced grayscale visual effects than drone 1, it exhibits lower RCS mean and peak values. If the rotor position changes slightly at the given moment [35], the local RCS distribution of drone 1 will change accordingly.
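For reference, aggregate indicators such as the means and peaks quoted here are conventionally computed by averaging the sweep in linear m2 and converting back to dBm2; averaging the dB values directly gives a different, lower number, and the paper does not state which convention it uses. A minimal sketch of the linear-domain convention:

```python
import numpy as np

def rcs_stats_dbm2(rcs_dbm2):
    """Mean and peak of an RCS azimuth sweep given in dBm2.

    The peak is taken directly in dBm2; the mean is computed in linear
    m2 and then converted back to dBm2.
    """
    rcs_dbm2 = np.asarray(rcs_dbm2, dtype=float)
    linear = 10.0 ** (rcs_dbm2 / 10.0)           # dBm2 -> m2
    mean_dbm2 = 10.0 * np.log10(linear.mean())   # m2 -> dBm2
    return mean_dbm2, rcs_dbm2.max()
```

Because the linear average is dominated by the strongest lobes, a curve with a few sharp peaks (such as the freighter's) can have a mean far above its median level.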
The surface scattering characteristics of the freighter are provided in Figure 19, where the symmetry plane of the cargo ship has been moved to y = −30 m. A large number of red (near −12 dBm2) and deep red (around −6 dBm2) features are distributed on the side of the tail tower. The deck is mainly characterized by green (about −65 dBm2) with a small amount of yellow (close to −43 dBm2). The elliptical surface of the bow presents a gradient from blue (approximately −108 dBm2) to red. The sides of the cargo hold ceilings feature orange (about −32 dBm2) and a small amount of yellow. The illuminated area on the side of the ship is mainly orange and orange–yellow, with a small amount of red. Some red and yellow features appear on the sides of the remaining superstructures. Within the current observation range, although the scattering level of the deck is moderate, its grayscale imaging is still prominent.

4.4. Comprehensive Analysis

The synthesis analysis of the grayscale characteristics of the drones and ships is shown in Figure 20, where the bow of the freighter points at an angle of 22° to the x-axis. Drone 1 appears behind the cargo ship at x = −36.19 m, with its rotor rotated 57°. The fuselage and support arms of drone 1 can be distinguished; the front body is bright gray (grayscale close to −11) and the rear body is gray (grayscale about −25), owing to the darker gray on the convex side. The boat appears behind drone 1 at x = −56.22 m, with the bow pointing at an angle of 50° to the x-axis. Because of its simple appearance and sloping facade, the RCS level of the boat is relatively low, while its grayscale effect is obvious. At x = −18.52 m, the main grayscale distribution of the UAV can be captured, with the nose pointing at an angle of 16° to the x-axis. After local magnification, drone 2, appearing at y = −52.73 m, shows a nose-down posture, with its rotor rotated 101°. At x = 36.63 m, there are some dark gray areas (grayscale near −165) on the surface of the freighter tower, whereas most areas of the deck are characterized by gray (grayscale around −38) and bright gray. Certain drones can monitor cargo ships but can also become objects of high-altitude remote sensing monitoring themselves. The boat stands in for various unknown vessels, such as cruise ships, small merchant ships, or pirate boats. The established imaging method can be used to analyze the grayscale of cross-scale targets such as drones and ships.
The RCS comparison of the two types of drones is presented in Figure 21, where the mean RCS of the blue curve is 9.2826 dBm2. At α = 89.75°, the RCS curve of drone 1 reaches its maximum value of 30.6055 dBm2. Although this UAV has a large size, its mean RCS is as low as 5.4942 dBm2, with a maximum peak of 26.481 dBm2 appearing at α = 81°. Due to the large deck area and the scattering contribution of the superstructures, the average RCS of the freighter curve is the highest among the four curves; the RCS mean is as high as 23.6017 dBm2, and the maximum peak reaches 51.6186 dBm2. In the case of the boat, the red line lies generally lower than the other three curves; the RCS mean is as low as −6.1848 dBm2, and the peak is 16.1694 dBm2. Within the azimuth range of 113.5°~136.25°, the blue curve is significantly higher than the other three; the rotor has rotated 110°, and the rear fuselage side, rotor hub, and top protrusion provide the main scattering contributions. Compared with large cargo ships, both the RCS mean and the RCS over a wide azimuth range of this boat are small, which poses a challenge for actual radar detection. High-altitude grayscale imaging can capture such small targets.
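The transient rotor angles quoted with each figure follow from the constant rotor rate wr1 = 62.8319 rad/s (10 rev/s) and the sampling time. A sketch, assuming a zero initial azimuth; the starting rotor position for each case is not given in the paper, so some cases may include an additional offset:

```python
import math

def rotor_azimuth_deg(w_rad_s: float, t_s: float, theta0_deg: float = 0.0) -> float:
    """Instantaneous azimuth (degrees, wrapped to [0, 360)) of a rotor
    spinning at constant rate w_rad_s, sampled at time t_s.

    theta0_deg is the unknown starting azimuth, assumed zero here.
    """
    return (theta0_deg + math.degrees(w_rad_s * t_s)) % 360.0
```

With the Figure 18 parameters (wr1 = 62.8319 rad/s, t = 0.0081 s) this gives about 29.2°, consistent with the 29° quoted in the text; at 10 rev/s the rotor completes a full revolution every 0.1 s, which is why the RCS curves are so strongly time-dependent.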
The remote sensing grayscale of the drones and freighter is provided in Figure 22, where the UAV's nose points at an angle of 20° to the x-axis. At y = −31.68 m, the grayscale features of drone 1 can be recognized; the front fuselage appears as a bright gray feature (grayscale about −12), and the support arms are gray (grayscale approximately −28), with the rotor rotated 26°. The UAV appears at x = 85.65 m, and the fuselage, wings, and tail can all be clearly identified, with the rear side of the vertical tail being dark gray (grayscale near −165); the angle between the nose direction and the x-axis is 20°. With its numerous superstructures and huge deck, the freighter is still the most prominent grayscale target; most of the cargo hold ceilings and decks are light gray, with some dark gray (grayscale close to −158) on the facade of the tail tower. In addition, the bow of this cargo ship points at an angle of 16° to the x-axis. Drone 2 appears at x = −53.53 m, and based on the projected width, it can be inferred that the fuselage has a rolling attitude, with the rotor rotated 98°. Vessel 1 appears behind the UAV at x = −37.3 m; the deck of the small boat is mainly bright gray, the side of the roof is gray (grayscale close to −52), and the angle between the bow direction and the x-axis is 33°. Boat 2 appears next to drone 2 at y = −17.93 m, with the ship's symmetry plane at an angle of 22° to the x-axis; the upper surface of the ship is characterized by light gray and gray features. In actual situations, small-angle attitude shaking changes the inclination angle of the ship [36], and the predicted impact on remote sensing grayscale images is small. However, the grayscale characteristics of small boats undergoing large-angle shaking will undergo pronounced transitions.
In real-world maritime surveillance scenarios, the basic framework can remain unchanged when extended to other unmanned aerial vehicles and ships, and radar wave frequencies can be selected from common bands such as L and X.
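When selecting a band, a key driver is electrical size: how many wavelengths the target spans. A quick check under assumed representative dimensions (a ~13 m UAV) and standard IEEE band frequencies; the specific sizes and band centers here are illustrative, not values fixed by the paper:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def electrical_size(length_m: float, freq_hz: float) -> float:
    """Target extent expressed in radar wavelengths (length / lambda)."""
    return length_m * freq_hz / C

# An assumed ~13 m UAV spans roughly 65 wavelengths at 1.5 GHz (L band)
# but roughly 434 at 10 GHz (X band), so high-frequency asymptotic
# solvers such as physical optics remain appropriate at these sizes.
```

The same check scales directly to the ~59 m freighter or the ~10 m boat when extending the framework to other targets and bands.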
Overall, under the current observation conditions, ship targets have clearer and more distinguishable grayscale features than aircraft targets, and the RCS-related indicators of the freighter are at the highest level. At similar sizes, the small boat has more prominent remote sensing grayscale but lower RCS levels than the UAV. Drones present the least conspicuous grayscale features but exhibit RCS values that exceed those of the freighter at some important azimuth angles.

5. Conclusions

Based on the established fusion framework, the remote sensing grayscale of the drones and freighter was obtained, and the relevant electromagnetic scattering features were analyzed. From the results, the following conclusions can be drawn:
(1)
A quadcopter drone with a conventional layout has a weak intuitive effect in remote sensing grayscale images but exhibits extremely strong dynamic electromagnetic scattering characteristics, with the front fuselage providing both the main remote sensing grayscale contribution and specular scattering. The RCS fluctuations caused by the rotors can exceed 25 dBm2, while in terms of cross-scale multi-physics, the remote sensing grayscale of blades or rotors is extremely weak. A limited output view cannot capture all the information of cross-scale targets in a very large space.
(2)
Under the given conditions, this UAV has higher RCS peak and mean values than the quadcopter, and the fuselage, wings, and orientation of the nose are clearer and more distinguishable in remote sensing grayscale images, whereas a boat of a similar size to the UAV exhibits intuitive grayscale and lower RCS indicators. Grayscale imaging techniques can supplement local information for incomplete images generated by actual sensors.
(3)
The upper facade and bow direction of the ship are clearly identifiable, while the huge deck and numerous superstructures make this freighter display the most prominent grayscale features and the highest RCS indices under the various comparison conditions, with the maximum peak reaching 51.6186 dBm2. The established fusion framework is effective for analyzing the RCS and remote sensing grayscale of cross-scale targets across multiple physical fields.
In future research, the fusion of five-dimensional information needs to be strengthened, and field remote sensing and RCS tests of cross-scale targets are worth carrying out.

Author Contributions

Conceptualization and methodology, Z.Z. and J.H.; software and validation, Z.Z.; formal analysis and investigation, Z.Z. and J.H.; writing—original draft preparation, Z.Z.; writing—review and editing, Z.Z. and J.H.; visualization, Z.Z.; supervision, J.H.; funding acquisition, Z.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by a project funded by the China Postdoctoral Science Foundation (Grant Nos. BX20200035, 2020M680005).

Data Availability Statement

The original contributions presented in the study are included in the article, and further inquiries can be directed to the corresponding author.

Acknowledgments

The authors thank the editor and reviewers for their valuable suggestions.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

The detailed displays rely on high-fidelity modeling and scattering characteristic calculations, as summarized in Table A1. This framework can be extended to more ships and drones, although high-precision grayscale detail display consumes more time.
Table A1. The main steps and time cost of high-precision detail analysis.
Main step: Modeling | Mesh | Initialize calculation | Spatial construction | Scattering calculation | Grayscale conversion
Time cost: 5.2 h | 3.3 h | 1.8 min | 6 min | 2.7 min | 7 min

References

  1. Liu, Z.; Zou, Y.; Hu, Z.; Xue, H.; Li, M.; Rao, B. Research on Multi-Modal Fusion Detection Method for Low-Slow-Small UAVs Based on Deep Learning. Drones 2025, 9, 852. [Google Scholar] [CrossRef]
  2. Gade, S.A.; Madolli, M.J.; García-Caparrós, P.; Ullah, H.; Cha-Um, S.; Datta, A.; Himanshu, S.K. Advancements in UAV remote sensing for agricultural yield estimation: A systematic comprehensive review of platforms, sensors, and data analytics. Remote Sens. Appl. Soc. Environ. 2025, 37, 101418. [Google Scholar] [CrossRef]
  3. Zhou, Z. Comprehensive Discussion on Remote Sensing Modeling and Dynamic Electromagnetic Scattering for Aircraft with Speed Brake Deflection. Remote Sens. 2025, 17, 1706. [Google Scholar] [CrossRef]
  4. Mathews, A.J.; Singh, K.K.; Cummings, A.R.; Rogers, S.R. Fundamental practices for drone remote sensing research across disciplines. Drone Syst. Appl. 2023, 11, 1–22. [Google Scholar] [CrossRef]
  5. Asadzadeh, S.; de Oliveira, W.J.; de Souza Filho, C.R. UAV-based remote sensing for the petroleum industry and environmental monitoring: State-of-the-art and perspectives. J. Pet. Sci. Eng. 2022, 208, 109633. [Google Scholar] [CrossRef]
  6. Peng, J.; Zhao, X.; Zhao, Q. Dynamic Path Planning Method for Unmanned Surface Vessels in Complex Traffic Conditions of Island Reefs Waters. Drones 2024, 8, 620. [Google Scholar] [CrossRef]
  7. Wavrek, M.T.; Carr, E.; Jean-Philippe, S.; McKinney, M.L. Drone remote sensing in urban forest management: A case study. Urban For. Urban Green. 2023, 86, 127978. [Google Scholar] [CrossRef]
  8. Meivel, S.; Maheswari, S. Remote sensing analysis of agricultural drone. J. Indian Soc. Remote Sens. 2021, 49, 689–701. [Google Scholar] [CrossRef]
  9. Karila, K.; Alves Oliveira, R.; Ek, J.; Kaivosoja, J.; Koivumäki, N.; Korhonen, P.; Niemeläinen, O.; Nyholm, L.; Näsi, R.; Pölönen, I.; et al. Estimating Grass Sward Quality and Quantity Parameters Using Drone Remote Sensing with Deep Neural Networks. Remote Sens. 2022, 14, 2692. [Google Scholar] [CrossRef]
  10. Li, B.; Hu, X. Effective distributed convolutional neural network architecture for remote sensing images target classification with a pre-training approach. J. Syst. Eng. Electron. 2019, 30, 238–244. [Google Scholar] [CrossRef]
  11. Emerson, C.; Bommersbach, B.; Nachman, B.; Anemone, R. An Object-Oriented Approach to Extracting Productive Fossil Localities from Remotely Sensed Imagery. Remote Sens. 2015, 7, 16555–16570. [Google Scholar] [CrossRef]
  12. Zhou, Z.; Huang, J. V-shaped deformation quadrotor radar cross-section analysis. Proc. Inst. Mech. Eng. Part G J. Aerosp. Eng. 2025, 239, 9544100251328443. [Google Scholar] [CrossRef]
  13. Kieu, H.T.; Yeong, Y.S.; Trinh, H.L.; Law, A.W.-K. Enhancing Turbidity Predictions in Coastal Environments by Removing Obstructions from Unmanned Aerial Vehicle Multispectral Imagery Using Inpainting Techniques. Drones 2024, 8, 555. [Google Scholar] [CrossRef]
  14. Gruner, K.; Keydel, W.; Suss, H. Application Possibilities of Passive Remote-Sensing Systems in the Millimeter-Wave Region. IEEE Trans. Geosci. Remote Sens. 2007, GE-21, 376–382. [Google Scholar] [CrossRef]
  15. Horacio, J.; Muñoz-Narciso, E.; Trenhaile, A.S.; Pérez-Alberti, A. Remote sensing monitoring of a coastal-valley earthflow in northwestern Galicia, Spain. Catena 2019, 178, 276–287. [Google Scholar] [CrossRef]
  16. Shuai, T.; Sun, K.; Shi, B.; Pérez-Alberti, A. A ship target automatic recognition method for sub-meter remote sensing images. In Proceedings of the 2016 4th International Workshop on Earth Observation and Remote Sensing Applications (EORSA), Guangzhou, China, 4–6 July 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 153–156. [Google Scholar]
  17. Liang, H.S. Stealth technology for radar onboard next generation fighter. Mod. Radar 2018, 40, 11–14. [Google Scholar]
  18. Zhou, Z.; Huang, J. Y-type quadrotor radar cross-section analysis. Aircr. Eng. Aerosp. Technol. 2023, 95, 535–545. [Google Scholar] [CrossRef]
  19. Ismail, N.; Mohd Kamal, N.L.; Norhashim, N.; Abdul Hamid, S.; Sahwee, Z.; Ahmad Shah, S. Electric Propulsion and Hybrid Energy Systems for Solar-Powered UAVs: Recent Advances and Challenges. Drones 2025, 9, 846. [Google Scholar] [CrossRef]
  20. Chen, L.; Duan, P.F.; Yuan, C. Research on development status and key technology of stealth air-to-air missiles. Aero Weapon. 2022, 29, 14–21. [Google Scholar]
  21. Ren, Z.; Tang, Y.; He, Z.; Tian, L.; Yang, Y.; Zhang, W. Ship detection in high-resolution optical remote sensing images aided by saliency information. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5623616. [Google Scholar] [CrossRef]
  22. Zhang, T.; Zhang, X.; Liu, C.; Shi, J.; Wei, S.; Ahmad, I.; Zhan, X.; Zhou, Y.; Pan, D.; Li, J.; et al. Balance learning for ship detection from synthetic aperture radar remote sensing imagery. ISPRS J. Photogramm. Remote Sens. 2021, 182, 190–207. [Google Scholar] [CrossRef]
  23. Keerthinathan, P.; Sandino, J.; Mahendren, S.; Uthayasooriyan, A.; Galvez, J.; Hamilton, G.; Gonzalez, F. Advancing Real-Time Aerial Wildfire Detection Through Plume Recognition and Knowledge Distillation. Drones 2025, 9, 827. [Google Scholar] [CrossRef]
  24. Sui, M.; Xu, X.J. Electromagnetic scattering calculation of complex structure using iterative physical optics based on curved surface patches. Chin. J. Radio Sci. 2012, 27, 892–896. [Google Scholar]
  25. Zhou, Z.; Huang, J. An optimization model of parameter matching for aircraft catapult launch. Chin. J. Aeronaut. 2020, 33, 191–204. [Google Scholar] [CrossRef]
  26. Chen, J.; Chen, K.; Chen, H.; Li, W.; Zou, Z.; Shi, Z. Contrastive learning for fine-grained ship classification in remote sensing images. IEEE Trans. Geosci. Remote Sens. 2022, 60, 4707916. [Google Scholar] [CrossRef]
  27. Yin, P.; Jia, G.W.; Yang, X.X. Research on the development of foreign military uav stealth design. Aerodyn. Missile J. 2021, 12, 69–74. [Google Scholar]
  28. Li, X.; Li, Z.; Lv, S.; Cao, J.; Pan, M.; Ma, Q.; Yu, H. Ship detection of optical remote sensing image in multiple scenes. Int. J. Remote Sens. 2022, 43, 5709–5737. [Google Scholar] [CrossRef]
  29. Zhou, Z.; Huang, J. Study of RCS characteristics of tilt-rotor aircraft based on dynamic calculation approach. Chin. J. Aeronaut. 2022, 35, 426–437. [Google Scholar] [CrossRef]
  30. Zhou, Z.; Huang, J. Numerical investigations on radar cross-section of helicopter rotor with varying blade pitch. Aerosp. Sci. Technol. 2022, 123, 107452. [Google Scholar] [CrossRef]
  31. Ye, Y.; Wang, X.; Gou, G.; Zhang, H.; Li, H.; Sui, H. Autonomous Exploration-Oriented UAV Approach for Real-Time Spatial Mapping in Unknown Environments. Drones 2025, 9, 844. [Google Scholar] [CrossRef]
  32. Xiao, S.; Zhang, Y.; Chang, X. Ship detection based on compressive sensing measurements of optical remote sensing scenes. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 8632–8649. [Google Scholar] [CrossRef]
  33. Li, W.Q.; Han, X.X.; Lin, Z.B.; Rahman, A. Enhanced pest and disease detection in agriculture using deep learning-enabled drones. Acadlore Trans. Mach. Learn. 2024, 3, 1–10. [Google Scholar] [CrossRef]
  34. Haas, E.M.; Bartholome, E.; Combal, B. Time series analysis of optical remote sensing data for the mapping of temporary surface water bodies in sub-Saharan western Africa. J. Hydrol. 2009, 370, 52–63. [Google Scholar] [CrossRef]
  35. Zhou, Z.; Huang, J. X-Band Radar Cross-Section of Tandem Helicopter Based on Dynamic Analysis Approach. Sensors 2021, 21, 271. [Google Scholar] [CrossRef] [PubMed]
  36. Zhou, Z.; Huang, J. Dynamic Scattering Approach for Solving the Radar Cross-Section of the Warship under Complex Motion Conditions. Photonics 2020, 7, 64. [Google Scholar] [CrossRef]
  37. Yigit Avdan, Z.; Kaplan, G.; Goncu, S.; Avdan, U. Monitoring the Water Quality of Small Water Bodies Using High-Resolution Remote Sensing Data. ISPRS Int. J. Geo-Inf. 2019, 8, 553. [Google Scholar] [CrossRef]
Figure 1. Sketch illustrating the fusion framework.
Figure 2. Grayscale comparison verification for the UAV target, where α = 279°, β = 83°, and Fr = 7 GHz.
Figure 3. RCS calculation validation for the UAV target, where Frh = 6 GHz and β = 0°.
Figure 4. RCS calculation verification for the boat target, where Frh = 8 GHz and β = 0°.
Figure 5. Model of drone 1 in the local coordinates.
Figure 6. Model of the UAV in the local coordinates.
Figure 7. Model of the ship sample in the local coordinates.
Figure 8. Grid of the drone 1 model.
Figure 9. Mesh of the UAV model.
Figure 10. Remote sensing grayscale of the drones, where α = 66°, β = 33°, and Fr = 7 GHz.
Figure 11. Dynamic RCS of drone 1, where wr1 = 62.8319 rad/s, θ = 0°, β = 10°, and Frh = 6 GHz.
Figure 12. Surface electromagnetic scattering features of drone 1, where wr1 = 62.8319 rad/s, θ = 6°, α = 25°, β = 20°, Fr = 7 GHz, and RCS unit: dBm2.
Figure 13. Remote sensing grayscale of the UAV, where α = 16°, Fr = 7 GHz, and β = 50°.
Figure 14. Comparison of RCS between the UAV and drone 1 (wr1 = 62.8319 rad/s, t = 0.02 s), where Frh = 8 GHz and β = 2°.
Figure 15. RCS comparison between the UAV and drone 1 (wr1 = 62.8319 rad/s, t = 0.0306 s); Frh = 9 GHz, and β = 5°.
Figure 16. Surface electromagnetic scattering features of the UAV, where α = 27°, β = 5°, Fr = 10 GHz, and RCS unit: dBm2.
Figure 17. Remote sensing grayscale of the freighter, where Fr = 7 GHz, α = 97°, and β = 65°.
Figure 18. RCS characteristics of the freighter, where Frh = 8 GHz and β = 7°. For drone 1, wr1 = 62.8319 rad/s and t = 0.0081 s.
Figure 19. Surface electromagnetic scattering features of the freighter, where α = 80°, β = 10°, Fr = 10 GHz, and RCS unit: dBm2.
Figure 20. Grayscale of the observation area, where Fr = 7 GHz, α = 136°, and β = 70°.
Figure 21. RCS features, where Frh = 9 GHz and β = 10°. For drone 1, wr1 = 62.8319 rad/s, and t = 0.0217 s.
Figure 22. Remote sensing grayscale of the observation area, where Fr = 8 GHz, α = 158°, and β = 81°.
Table 1. The dimensional parameters of drone 1.
Parameter: Dr1 (m) | Ld1 (m) | Wd1 (m) | Hd1 (m) | Nr1
Value: 2.2 | 6.2 | 5.01 | 0.95 | 4
Table 2. The main dimensions of this UAV.
Parameter: Luf (m) | Lu (m) | Wu (m) | Hu (m) | Wht
Value: 5.36 | 13.27 | 13.2 | 2.45 | 4.2
Table 3. The dimensional parameters of the ship.
Parameter: Lfr (m) | Hfr (m) | Wfr (m) | Lve (m) | Wve (m)
Value: 59.2 | 8.9 | 9.82 | 10 | 3.47
Table 4. Drone 1 dynamic RCS indicators, where β = 10°, Frh = 6 GHz, wr1 = 62.8319 rad/s, and θ = 0°.
Azimuth: 25° | 27° | 30°
RCS mean (dBm2): −8.596 | −9.8871 | −7.7052
RCS peak (dBm2): −1.2866 | −0.2659 | 3.4035
Table 5. RCS indicators, where wr1 = 62.8319 rad/s, t = 0.02 s, Frh = 8 GHz, and β = 2°.
Target: UAV | Drone 1 | Freighter
RCS mean (dBm2): 9.2840 | −0.0212 | 9.6581
RCS peak (dBm2): 31.3046 | 14.4620 | 34.5696
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
