Article

The Design and Evaluation of a Direction Sensor System Using Color Marker Patterns Onboard Small Fixed-Wing UAVs in a Wireless Relay System

Graduate School of Engineering, Muroran Institute of Technology, Muroran 050-8585, Japan
* Author to whom correspondence should be addressed.
Aerospace 2025, 12(3), 216; https://doi.org/10.3390/aerospace12030216
Submission received: 17 January 2025 / Revised: 28 February 2025 / Accepted: 6 March 2025 / Published: 7 March 2025
(This article belongs to the Special Issue UAV System Modelling Design and Simulation)

Abstract

Among the many applications of unmanned aerial vehicles (UAVs), wireless relay systems are one of the most promising. Specifically, a small fixed-wing UAV is well suited to establishing such a system promptly. In the system, an antenna pointing control system directs an onboard antenna toward a ground station in order to form and maintain a communication link between the UAV and the ground station. In this paper, we propose a sensor system that detects the direction of the ground station from the UAV by using color marker patterns for the antenna pointing control system. The sensor detects the difference between the antenna pointing direction and the ground station direction. The sensor is characterized by the use of both the color information of multiple color markers and color marker pattern matching. These enable the detection of distant, low-resolution markers, high marker detection accuracy, and robust marker detection against motion blur. In this paper, we describe the detailed algorithm of the sensor and evaluate its performance using a prototype sensor system. Experimental evaluation results showed that the proposed method had a minimum detectable drawing size of 10.2 pixels, a motion blur tolerance of 0.0175, and a detection accuracy error of less than 0.12 deg. This performance indicates that the method has a minimum detectable drawing size that is half that of the ArUco marker (a common AR marker), is 15.9 times more tolerant of motion blur than the ArUco marker, and has a detection accuracy error twice that of the ArUco marker. The color markers in the proposed method can be placed farther away or be smaller in size than ArUco markers, and they can be detected by the onboard camera even if the aircraft’s attitude changes significantly. The proposed method using color marker patterns has the potential to improve the operational flexibility of radio relay systems utilizing UAVs and is expected to be further developed in the future.

1. Introduction

Unmanned aerial vehicles (UAVs) have made significant progress since their inception, driven by continuous technological innovations in navigation and networking technologies. Wireless relay systems are among the most promising applications of UAVs. Extensive simulations and experiments on UAV-based communications have been conducted [1,2,3]. Among them, there are numerous studies on securing communication infrastructure during disasters by utilizing the mobility, flexibility, and pilotless flight capabilities of UAVs [4,5]. The authors of the present paper also aim to use UAVs for video transmission during disasters. Specifically, we envision using a small fixed-wing UAV that remains airborne while circling, as shown in Figure 1, as a wireless relay system for video transmission. The purpose of this system is to be deployed in disaster areas where network infrastructure has been disrupted in order to establish communication lines.
To link an antenna onboard the UAV with a ground station, it is imperative that the onboard antenna is accurately controlled to point toward the direction of the ground station. To achieve this, a highly accurate control system must be configured. Accordingly, the control system requires a direction detection system for the ground station, which is necessary to measure the deviation in the antenna angles from the ground station direction and provide feedback to the control system.
For a ground station direction detection system, the authors consider an approach to detect the direction of ground stations using image data obtained from cameras mounted on small fixed-wing UAVs. The most reliable method for detecting ground stations from image data is to place a distinctive marker on the ground station. Through the accurate recognition of the marker and detection of its position, the direction of the ground station can be determined with high precision. Furthermore, when operating a wireless relay system, the marker needs to be placed only on the ground station, which offers the advantage of not impairing the flexibility of the UAV. On the other hand, the detection performance of markers is highly dependent on the operating environment. Two factors can affect marker detection performance: first, the resolution of the marker, which can hinder detection due to the smaller size of the marker and the reduced number of pixels representing it at a distance; and second, motion blur, which occurs when a UAV is in flight and its posture is constantly changing. This motion blur distorts the markers, which can interfere with marker detection.
There are also methods for detecting the direction of ground stations without using markers or cameras. One example is a radio frequency (RF) sensor that directs the antenna of a geostationary communications satellite toward the ground station [6]. Since RF sensors only need to detect beacon signals, they can detect ground stations relatively easily, even from flying UAVs. However, because RF sensors are composed of multiple power feed systems, high-precision RF sensors are heavy and consume significant amounts of power. Furthermore, a beacon transmitter must be installed at the ground station for the RF sensor to detect its direction, which reduces the flexibility of UAV operations.
Considering the trade-offs of each approach, the increase in weight and power consumption associated with the higher accuracy of RF sensors presents a major disadvantage for small flight systems. Therefore, as part of a wireless relay system for small UAVs, it is essential to develop a sensor system that detects the direction of the ground station using a camera and a marker, while being resilient to the effects of the operating environment. A key factor is the marker detection method’s ability to tolerate the influences of the operating environment. The sensor must meet the following three performance requirements:
  • Detection of low-resolution, long-range markers;
  • Accuracy of marker detection;
  • Robust marker detection against motion blur.
Among these, the accuracy of detecting the direction angle toward the ground station is the most important for a highly accurate antenna control system. Moreover, motion blur, which is proportional to the angular velocity generated by the UAV’s motion, occurs in the acquired image. This motion blur can interfere with marker detection.
Conventionally, augmented reality (AR) markers are used for position detection. AR markers are detected by using the contour information inherent to the shape of the marker. However, for a wireless relay system in a UAV operating environment, contour information may degrade due to low resolution and motion blur, leading to poor detection performance. Therefore, a marker detection method that does not rely on contour shape is necessary.
We have adopted color markers that utilize color information as a marker and avoid the use of contour information. The rationale behind this approach is that the degradation of color information is minimal in motion blur and low-resolution markers. However, color information alone is not enough to distinguish markers from objects or noise that has similar colors to those of the markers. For example, if a red color marker is used, color information alone would not be sufficient to distinguish the marker from a leaf of the same red color. To solve this issue, we propose a new method to detect only multiple color markers using two types of information: the predicted multiple color marker placement information and the color information. Based on the methods presented above, this study presents an image processing-based direction detection sensor that can be applied in a small fixed-wing UAV.
This paper is organized as follows: Section 2 gives a brief overview of conventional direction detection technologies that have been or can be applied to ground station direction detection. Section 3 describes the proposed method in detail, and Section 4 presents the performance requirements of the proposed method. Section 5 presents the experimental results and analysis, and finally, Section 6 concludes by discussing the validity of the proposed method.

2. Conventional Direction Detection Technology

There are many direction detection technologies, and some of them can be applied to wireless relay systems, albeit with some issues. As for the direction detection of ground stations, there are two methods: one without markers and the other with markers.

2.1. Non-Marker-Based Approaches

Non-marker-based approaches frequently utilize the strength of the signal transmitted from the ground station as a means of detecting the direction of the ground station. Though these have been highly reliable, they require heavier equipment for more accurate detection and a corresponding signal transmitter on the ground station side.
The most straightforward method is to detect the strength of the signal received from the ground station [7]. The signal detection process utilizes the Automatic Gain Control (AGC) signal of a digital receiving tuner to measure the strength of the signal received by the antenna.
Radio frequency (RF) sensors quantify the angular deviation between the direction of the beacon wave transmitted from a ground station and the sensor’s main axis. RF sensors are primarily employed for antenna control in satellite communications [7]. RF sensors comprising multiple power feed systems determine the direction of the beacon wave by calculating the difference in the received power for each feed system. Consequently, high-precision RF sensors incur an increase in weight and therefore cannot be mounted on small fixed-wing UAVs.
The approach that uses signal strength to detect the ground station direction requires a device that transmits signals to the ground station. The installation of such a signal transmitter affects the cost of building a radio relay system and may cause operational feasibility problems. In addition, the installation of signal transmitters that require power may compromise the independence of the video transmission system using small fixed-wing UAVs. In particular, the signal transmitter may not function in certain environments, such as during disasters. In addition, this approach carries the risk of signal interference. Especially in congested wireless environments, such as urban areas, the accuracy of signal detection may be compromised by surrounding wireless systems. Given these concerns, the non-marker-based approach risks compromising the operational flexibility of small fixed-wing UAVs.

2.2. Marker-Based Approaches

A marker-based approach detects the direction of a ground station indirectly, by placing markers on the ground station. There are two methods for marker detection: one using contour information and others using alternative cues.
An AR marker is detected by acquiring contour information through image processing. AR markers yield fewer false detections because of the use of contour information, and the position of a marker can be determined more precisely. For instance, an ArUco marker relies on the thresholding of image contours and polygon extraction to detect markers from input images. Conversely, if the contours are blurred due to low resolution or motion, marker detection performance is significantly degraded. In fact, it has been observed that the detection performance of standard ArUco markers degrades as motion blur increases in intensity [8]. A number of studies address this issue. For instance, a method using deep learning (DL) has been proposed for detecting markers that are missed due to motion blur [9]. DL is a machine learning technique employed for object detection in the field of image processing. Another approach utilizes circular markers for the detection of unclear marker images in underwater environments [10].
Since AR markers are detected and tracked as objects within an image rather than as markers, object tracking, an image processing technique for monitoring specific objects in a degraded image, can be employed. Object tracking concerns following moving objects, such as cars, through an image sequence. One example of object tracking is optical flow, which can estimate the motion vector of an object, such as a marker. Indeed, Daniel D. Doyle et al. demonstrated that a camera attached to a two-axis gimbal can track a small UAV using optical flow [11].

3. Proposal of Ground Station Direction Detection Method Using Color Markers

3.1. Color Marker Detection Algorithm

Blob detection, an image processing method for detecting color markers, employs only color information, tends to be subject to noise, and has a high probability of false detection. Consequently, the blob detection method is inadequate for accurately detecting only the color markers installed at ground stations. To solve this issue, we employ the pattern information of multiple color markers in addition to the color information possessed by these color markers. Assume that multiple color markers are installed and their relative positions are known. We can detect only the color markers by matching the known position coordinates of the markers with some of the coordinate points obtained by the blob detection process.
Figure 2 provides an overview of the proposed method. The underlying assumption is that a reference color marker (RCM) is placed as a center marker at a ground station, with color markers placed around the RCM to assist in RCM detection. The color information from the blob detection process is compared with the color marker placement information predicted on the image plane, and the RCM that best matches the prediction is detected. The detection of RCMs located at ground stations requires three processes:
  • Blob coordinate detection;
  • Prediction of the marker drawing position in an image;
  • RCM detection.
The subsequent sections provide a detailed description of each process.

3.1.1. Blob Coordinate Detection

The blob detection process binarizes a specified color gamut in an acquired image and calculates the center of gravity of the detected blobs. Before binarization, the luminance histogram of the image is equalized to improve contrast and mitigate the influence of backlighting and reflections.
The coordinates of the center of gravity of each blob after binarization are calculated as follows. Assuming that the coordinates of the white pixels in the image are $(x, y)$ and the pixel value is $I(x, y)$, the area $A$ of a blob is expressed by Equation (1).

$$A = \sum_{x}\sum_{y} I(x, y) \quad (1)$$

Furthermore, the coordinates of the center of gravity of a blob, $(x_g, y_g)$, are given by Equations (2) and (3).

$$x_g = \frac{1}{A}\sum_{x}\sum_{y} x \cdot I(x, y) \quad (2)$$

$$y_g = \frac{1}{A}\sum_{x}\sum_{y} y \cdot I(x, y) \quad (3)$$

This coordinate is called the blob coordinate $B_n(x_{g_n}, y_{g_n})$.
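The centroid computation of Equations (1)–(3) can be sketched in a few lines of NumPy (the function name is illustrative, and the binary mask is assumed to be the output of the binarization step):

```python
import numpy as np

def blob_centroid(mask):
    """Area and centre of gravity of a binary blob, Equations (1)-(3).

    mask: 2-D array whose entries are the pixel values I(x, y)
    (1 for white pixels, 0 otherwise).
    Returns (A, (x_g, y_g)).
    """
    ys, xs = np.nonzero(mask)       # coordinates (x, y) of white pixels
    A = mask.sum()                  # Eq. (1): A = sum_x sum_y I(x, y)
    x_g = xs.sum() / A              # Eq. (2)
    y_g = ys.sum() / A              # Eq. (3)
    return A, (x_g, y_g)
```

In practice this is applied once per connected component returned by the blob detection step.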

3.1.2. Predicting the Marker Position in an Image

As shown in Figure 3, the camera position, located at $T = (T_{x_s}, T_{y_s}, T_{z_s})^{T}$ from the origin $O_s$ of the space-fixed coordinate system $O_s$-$X_s Y_s Z_s$, is set as the origin $O_c$. Define a camera coordinate system $O_c$-$X_c Y_c Z_c$ rotated by the camera attitude angles $(\psi, \theta, \phi)$, expressed using Euler angles in this spatial coordinate system. When multiple color markers are expressed in the camera coordinate system, the coordinate points of the markers on the image plane are deformed by expansion, rotation, and translation. We compute each of these deformed marker coordinates. The camera-frame coordinates $M_{c\_i}$ of each marker $i$ are determined by Equation (4), and the predicted image coordinates $M_i(x_i, y_i)$ by Equation (5).

$$M_{c\_i} = \begin{pmatrix} x_{c\_i} \\ y_{c\_i} \\ z_{c\_i} \end{pmatrix} = R \times \left( M_{s\_i} - T \right) \quad (4)$$

where $R$ is the rotation matrix obtained from the camera attitude angles $(\psi, \theta, \phi)$. The marker coordinates in the camera coordinate system obtained from Equation (4) can be projected onto the 2-D image plane by the perspective projection transformation in Equation (5).

$$s_i \begin{pmatrix} x_i \\ y_i \\ 1 \end{pmatrix} = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x_{c\_i} \\ y_{c\_i} \\ z_{c\_i} \end{pmatrix} = K \times M_{c\_i} \quad (5)$$

The matrix $K$ in Equation (5) represents the camera intrinsic matrix, $f_x, f_y$ are the camera focal lengths, and $c_x, c_y$ are the image center coordinates. The coordinates $(x_i, y_i)$ in Equation (5) are $M_i(x_i, y_i)$, the $i$-th marker coordinates predicted in the image plane.
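Equations (4) and (5) can be sketched as follows (Python/NumPy; the function name is illustrative, and reading Equation (4) as the rotation applied to the translated point, $R(M_{s\_i} - T)$, is our assumption rather than the authors' code):

```python
import numpy as np

def predict_marker_coords(markers_s, T, R, K):
    """Predict marker drawing positions on the image plane, Eqs. (4)-(5).

    markers_s: (N, 3) marker coordinates M_s_i in the space-fixed frame.
    T:         (3,) camera position in the space-fixed frame.
    R:         3x3 rotation matrix from the camera attitude (psi, theta, phi).
    K:         3x3 intrinsic matrix [[fx, 0, cx], [0, fy, cy], [0, 0, 1]].
    Returns (N, 2) predicted pixel coordinates M_i(x_i, y_i).
    """
    M_c = (R @ (markers_s - T).T).T    # Eq. (4): camera-frame coordinates
    uvw = (K @ M_c.T).T                # Eq. (5): s_i * (x_i, y_i, 1)
    return uvw[:, :2] / uvw[:, 2:3]    # divide out the scale factor s_i
```

This mirrors the standard pinhole projection used by OpenCV's calibration model.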

3.1.3. RCM Detection

The blob coordinates $B_n(x_{g_n}, y_{g_n})$ that best match the pattern of the predicted marker coordinate points $M_i(x_i, y_i)$ are found by matching the blob coordinates in the acquired image against the predicted marker coordinate points in the image plane calculated so far. For example, consider the case of nine color markers, each placed at equal intervals with the RCM at the center, as shown in Figure 4. Consider a graph as a data structure representing the relationship between the markers. The data structure representing the connections between adjacent color markers is an undirected graph $G = (V, E)$, represented by a finite set $V$ of marker coordinates and a finite set $E$ of vectors connecting them.
We focus on each blob coordinate $B_n$ obtained in Section 3.1.1 in turn, assume that $B_n$ is the RCM, compare the predicted marker placement with the surrounding blob coordinate placement, and compute the error. The sum $L_{n,0}$ of the errors over the entire graph is calculated recursively, assuming coordinates $V_1 = B_n$ in Equation (6).

$$L_{n,i} = \min_{B_m} \left\| B_m - \left( V_i + (M_{i+1} - M_i) \right) \right\| + L_{n,i+1} \quad (6)$$

where $B_m$ ranges over the set of blob coordinates excluding the blob coordinate of interest $B_n$, and $M_{i+1} - M_i$ is a vector indicating the coordinate direction of the next node, calculated from the marker coordinate points obtained in the initial value calculation of the marker coordinate point set. Note that $L_{n,i+1} = 0$ when $i \geq 9$.
The above calculations are applied to all blob coordinates, and the blob coordinate with the smallest sum of errors in the entire graph is identified to be the RCM.
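The matching step can be approximated with a short nearest-neighbour sketch (Python/NumPy; the flat accumulation below is a simplification of the recursion in Equation (6), and the function name is illustrative, not the authors' implementation):

```python
import numpy as np

def detect_rcm(blobs, predicted):
    """Pick the blob that best matches the predicted marker pattern.

    Each candidate blob B_n is treated as the RCM; the predicted offsets
    of the remaining markers are added to it, and the nearest-neighbour
    error over the whole pattern is accumulated, approximating Eq. (6).

    blobs:     (N, 2) blob coordinates from blob detection.
    predicted: (9, 2) predicted marker coordinates, RCM first.
    Returns the index of the blob identified as the RCM.
    """
    offsets = predicted[1:] - predicted[0]     # pattern relative to the RCM
    best, best_err = -1, np.inf
    for n, b in enumerate(blobs):
        others = np.delete(blobs, n, axis=0)   # B_m: blobs except B_n
        err = 0.0
        for off in offsets:
            target = b + off                   # expected marker position
            err += np.min(np.linalg.norm(others - target, axis=1))
        if err < best_err:
            best, best_err = n, err
    return best
```

A noise blob far from the pattern accumulates a large error and is rejected, which is how the method tolerates same-colored objects around the markers.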

3.1.4. Operation of the Proposed Method

The operation results of the method described in Section 3.1.1, Section 3.1.2 and Section 3.1.3, up to the detection of the RCM from captured images, are shown in Figure 5. The image processing required for RCM detection was implemented using OpenCV 4.9.0 [12]. Figure 5a is the image captured by the camera, and Figure 5b is the result of blob detection processing on the image in Figure 5a. The detected blob coordinates are indicated by black dots. The blue dots in Figure 5c show the predicted positions of the markers in the image. The result of RCM detection based on the information in Figure 5b,c is the green dot in Figure 5d. As shown in Figure 5d, the RCM is detected accurately even when there are objects of the same color around the color markers. In a single frame, the processing time for Figure 5a–d was 0.062 s; in other words, the proposed method can process color marker detection at 16.13 frames per second.

3.2. Calculation of Ground Station Direction

The ground station direction is calculated from the marker coordinates detected in the image. The angles between the camera lens optical axis and the reference marker are calculated using the center coordinates $D$ of the RCM in the image, the image size $I$, the image sensor size $S$, and the focal length $L$. Equation (7) gives the elevation component and Equation (8) gives the azimuth component of the ground station direction. The subscripts $x$ and $y$ represent the directions in the image plane: $x$ is the horizontal direction, and $y$ is the vertical direction.

$$\theta = \tan^{-1}\left( \frac{D_y \times S_y / I_y}{L} \right) \quad (7)$$

$$\psi = \tan^{-1}\left( \frac{D_x \times S_x / I_x}{L} \right) \quad (8)$$
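Equations (7) and (8) translate directly into code (Python; the function name is illustrative, and $D$ is taken to be measured from the image center):

```python
import math

def ground_station_direction(D, I, S, L):
    """Elevation and azimuth of the RCM, Equations (7) and (8).

    D: (D_x, D_y) RCM coordinates in the image, from the image centre [pixel].
    I: (I_x, I_y) image size [pixel].
    S: (S_x, S_y) image sensor size [mm].
    L: focal length [mm].
    Returns (theta, psi) in degrees.
    """
    theta = math.degrees(math.atan((D[1] * S[1] / I[1]) / L))  # Eq. (7), elevation
    psi = math.degrees(math.atan((D[0] * S[0] / I[0]) / L))    # Eq. (8), azimuth
    return theta, psi
```

The factor $S/I$ converts pixel offsets into physical offsets on the sensor before the angle is taken against the focal length.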

4. Target Performance

We determined the required performance of the ground station detection system, which is assumed to be installed in a radio relay system using a small fixed-wing UAV. Table 1 shows the parameters of the cameras assumed to be used in the operation.

4.1. Detection Performance for Low Resolution

We set a performance target for detecting markers that appear small in the image because they are far away, i.e., low-resolution markers. The relationship between the size of a marker, $M$, and its drawing size in the image, $D$, is shown in Figure 6 and expressed in Equation (9).

$$D = \frac{M \times L}{r} \times \frac{I}{S} \quad (9)$$

where $r$ is the distance to the marker, $L$ is the focal length of the camera, $I$ is the size of the image, and $S$ is the size of the image sensor (obtained by multiplying the image size by the pixel size).
Considering the flight time, we assume a case in which markers are detected from a small fixed-wing UAV circling and staying at an altitude of 100 m and a speed of 25 m/s at a maximum distance of 2000 m. The marker size is tentatively set to be 2 m, which is large but realistic for operation. The marker sizes for the surface distance between the ground and the UAV are shown in Figure 7.
Figure 7 shows that the marker drawing size will be 14.5 [pixels] at the assumed surface distance of 2000 m. As an example, if the turning orbit of the small fixed-wing UAV deviates by about 10 [m], the marker drawing size changes by only about 0.05 [pixels]. Therefore, without considering a margin, the detection performance required of the proposed method for low resolution is the ability to detect a marker drawn at 14.5 [pixels].
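Equation (9) reduces to a one-line helper (Python; the numeric values in the check below are hypothetical camera parameters, since Table 1 is not reproduced in the text):

```python
def drawing_size(M, r, L, I, S):
    """Marker drawing size in the image, Equation (9).

    M: marker size [m], r: surface distance to the marker [m],
    L: focal length [mm], I: image size [pixel], S: sensor size [mm].
    Returns the drawing size D [pixel].
    """
    return (M * L / r) * (I / S)
```

As expected from Equation (9), halving the distance $r$ doubles the drawing size $D$.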

4.2. Marker Direction Detection Accuracy

The antenna mounted on the wireless relay system has an antenna diameter of 30 [cm], a frequency of 5.8 GHz, and a beamwidth of 11.854 [deg.]. The antenna diameter was tentatively determined to be a realistic size for a small UAV of 2 to 3 m in size. The antenna frequency was selected to be one of the frequency bands for image transmission using UAVs, as issued by the Ministry of Internal Affairs and Communications of Japan. Based on the above, the antenna pointing accuracy was set to ±0.29 [deg.], which is 1/20 of the beamwidth [13].
To achieve the determined antenna pointing accuracy, the required marker direction detection accuracy for the ground station detection system is set to be 0.14 [deg.] with a margin of 1/2 of the antenna pointing accuracy.

4.3. Detection Performance of Markers Against Motion Blur

Motion blur is caused by the angular velocity applied to a camera during the exposure time while the shutter is open. The strength of motion blur depends on the angular velocity applied to the camera, the viewing angle, and the exposure time, and can be modeled using these three variables. Its strength $R$ is expressed in Equation (10).

$$R = \frac{\omega \times E}{\theta} \quad (10)$$

where $\omega$ is the angular velocity applied to the camera [deg./s], $E$ is the exposure time [s], and $\theta$ is the viewing angle [deg.].
Using Equation (10), we found that the intensity of motion blur R becomes stronger as the angular velocity applied to the camera increases. The angular velocity applied when the wireless relay system using a small fixed-wing UAV directs its antenna to the ground station should be calculated, and the marker detection performance against motion blur should be determined based on this value.
MATLAB (Ver. 9.4) and Simulink (Ver. 9.1) simulations were performed to determine the angular velocity. Geometrical parameters and the speed of the UAV are indicated in Figure 8.
The simulation results of the error angular velocity between the direction of the antenna and the direction of the ground station are shown in Figure 9. This error angular velocity is the same as the angular velocity applied to the camera in detecting the RCM.
The simulation results in Figure 9 show that the average absolute value of the elevation angular velocity is 13.44 [deg./s] and the average absolute value of the azimuth angular velocity is 15.40 [deg./s]. Finally, the angular velocity ω applied to the camera is calculated from Equation (11) as
$$\omega = \sqrt{13.44^2 + 15.40^2} = 20.44 \ [\mathrm{deg./s}] \quad (11)$$
Based on the above, the required marker detection performance against motion blur, obtained by substituting Equation (11) into Equation (10), is 0.0120 [-].
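The computation of Equations (10) and (11) can be sketched as follows (Python; the exposure time E = 0.003 s and viewing angle θ = 5.1 deg. of the operational camera are our assumptions, chosen to be consistent with the stated target of 0.0120, since Table 1 is not reproduced in the text):

```python
import math

def blur_strength(omega, E, theta):
    """Motion blur strength R = omega * E / theta, Equation (10).

    omega: angular velocity applied to the camera [deg./s]
    E:     exposure time [s]
    theta: viewing angle of the camera [deg.]
    """
    return omega * E / theta

# Equation (11): combine the simulated elevation and azimuth components.
omega = math.hypot(13.44, 15.40)              # = 20.44 [deg./s]

# Assumed camera parameters (E, theta); reproduces the 0.0120 [-] target.
R_target = blur_strength(omega, E=0.003, theta=5.1)
```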

4.4. Summary of Target Performance

The target performance calculated in Section 4.1, Section 4.2 and Section 4.3 is summarized in Table 2.

5. Experiments

To confirm the validity of the proposed method and evaluate whether it meets the target performance, experiments corresponding to the following three target criteria were conducted: (1) the minimum marker size, (2) the marker detection accuracy, and (3) the maximum motion blur.

5.1. Configuration of Experimental System

Table 3 enumerates the apparatus utilized in the experiment. A Raspberry Pi 4B (manufacturer: Raspberry Pi Ltd., Cambridge, UK) was employed as the computing device for image processing and the calculation of RCM orientation. The camera used to capture images was a V2 Camera Module with a 62.2 × 48.8 [deg.] field of view and a focal length of 3.04 [mm].
The color markers utilized in the experiment were arranged according to the pattern depicted in Figure 4. Each marker was a square with a side length of 26 mm, and each marker was positioned at a distance of 26 mm from its neighbor. The camera attitude angle and camera coordinates required for the proposed method were estimated using the AR markers. There are two reasons for this. Firstly, GPS signals did not reach the room. Secondly, the position error of the inertial navigation system was 3σ = ±3 m, which was too large compared to the scale of the room. In this experiment, we used ArUco markers [14], a type of AR marker, to estimate the camera posture.
In the experiments conducted to evaluate the marker detection accuracy of the proposed method and the maximum motion blur under which the proposed method could still detect markers, a high-precision rotary table was employed to apply known angles or angular velocities to the camera. The high-precision rotary table could measure the drive angle accumulated from startup and rotate at any specified angular velocity.

5.2. Experiment to Evaluate Minimum Detectable Marker Size

Experiments were conducted to evaluate the detection performance of the RCM for small, low-resolution markers in images. In the experiment, the image size of the color markers was varied by changing the distance between the camera and the color markers. RCM detection was performed 100 times for each image size, and the detection rate was measured. The experimental system is depicted in Figure 10.
Figure 11 shows the proportion of RCMs identified by the proposed method for each color marker drawing size. It confirms that the detection rate of the proposed method declines markedly for markers smaller than 10.2 pixels.
Additional experiments were conducted to verify the effect of inaccuracies in attitude estimation with the ArUco markers used in the experiments on the proposed method. The experimental setup and configuration were the same as in Figure 10, but RCM detection was performed with noise added to the AR marker attitude estimation results. The noise was based on the value from the inertial navigation system, and an error of ±0.5 degrees with respect to the attitude angle and ±40 mm with respect to the spatial coordinate system was added. The results of the additional experiment are shown in Figure 12.
Figure 12 shows that the minimum detectable marker size remained unchanged at about 10 pixels in the additional experiment. In other words, it was confirmed that deviation in the attitude angle used in RCM detection can be tolerated to some extent. As shown in Table 2, the proposed method is required to detect markers as small as 14.5 pixels. Since its minimum detectable size of about 10 pixels is below this requirement, the proposed method attains the required performance.

5.3. Experiments to Evaluate Marker Detection Accuracy

The objective of this evaluation was to ascertain whether the proposed method was capable of detecting the RCM and whether the accuracy of the RCM direction calculated from the RCM coordinates met the requisite performance standards. In the experiment, the ground station detection system was installed in such a way that the RCM was centered within the image. Furthermore, the ground station detection system was rotated using a high-precision rotary table that was capable of measuring the rotation angle. The discrepancy between the initial detection angle of the experimental system and the angle obtained after high-precision rotary table operation, denoted as θ s , was compared with the drive angle of the high-precision rotary table, θ t . The experimental system is illustrated in Figure 13.
The relationship between the angle $\theta_s$ obtained from the ground station detection system and the drive angle $\theta_t$ measured from the high-precision rotary table is shown by the red markers in Figure 14. The experimentally obtained angle $\theta_s$ is considered to include a scale factor $A$ relative to the actual drive angle $\theta_a$. Therefore, to obtain higher detection accuracy, it was corrected using Equation (12).

$$\theta_a = \theta_s / A \quad (12)$$
The deviation $\Delta\theta$ between the corrected detection angle $\theta_a$ and the drive angle $\theta_t$ is shown in Figure 15. The measurement error of the ground station detection system in the experimental system was 0.62 [deg.]. However, the 26.5 [deg.] field-of-view angle of the camera used in the experiment differs from the 5.1 [deg.] field of view of the camera used to calculate the target performance. Therefore, to compare the target performance with the experimental results, the scales were adjusted using Equation (13).

$$\Delta\theta = 0.62 \times 5.1 / 26.5 = 0.12 \ [\mathrm{deg.}] \quad (13)$$
As shown in Equation (13), the proposed method had a marker direction detection accuracy of 0.12 [deg.]. The proposed method was required to have an accuracy better than 0.14 [deg.]. Therefore, the proposed method met the required performance standard.

5.4. Experiment to Evaluate Maximum Detectable Motion Blur

The objective of this experiment was to ascertain whether the RCM can be detected from a small fixed-wing UAV in flight under the anticipated motion blur. As outlined in Section 4.3, the degree of motion blur increases in proportion to the angular velocity applied to the camera. In the experiments, we used a high-precision rotary table capable of maintaining a constant angular velocity to simulate the angular velocity exerted by a small fixed-wing UAV on the ground station detection system. The ground station detection system was positioned on the high-precision rotary table. The maximum angular velocity at which the ground station detection system could still detect the RCM was determined, and the marker detection performance in the presence of motion blur was evaluated using Equation (10). The experimental system is illustrated in Figure 16. The color markers were installed with a drawing size of 15 pixels, taking the operating environment into account.
Figure 17 shows the RCM detection probability versus the angular velocity applied in the experiment. Based on the results shown in Figure 17, all frames in which the RCM was drawn were detectable up to an angular velocity of 155 [deg./s], and half of the frames were undetectable at angular velocities of 160 [deg./s] or higher. Therefore, the proposed method was capable of detecting the RCM up to an angular velocity of 155 [deg./s]. Based on these results, the strength of motion blur was calculated using Equation (10). The calculation is shown in Equation (14).
R = ω × E / θ = (155 × 0.003) / 26.5 = 0.0175 [-]
The calculation results show that the marker detection performance of the proposed method against motion blur is 0.0175 [-], which exceeds the required performance of 0.0120 [-].
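As a sanity check on Equation (14), the motion blur intensity R = ωE/θ can be computed as follows (the function name is illustrative, not from the paper):

```python
# Motion blur intensity from Equation (10): R = omega * E / theta_FoV.
# Experimental values: omega = 155 [deg./s], E = 0.003 [s], FoV = 26.5 [deg.].
# The function name is illustrative, not from the paper.

def motion_blur_intensity(omega_deg_s: float, exposure_s: float, fov_deg: float) -> float:
    """Dimensionless motion blur intensity R = omega * E / theta_FoV."""
    return omega_deg_s * exposure_s / fov_deg

R = motion_blur_intensity(155, 0.003, 26.5)
print(round(R, 4))  # → 0.0175, which exceeds the 0.0120 requirement
```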
Based on the experimental results shown in Figure 17, additional experiments were conducted to confirm the effect of the variation in exposure time, as indicated in Equation (14), on motion blur tolerance. In these experiments, the same configuration of equipment was used to measure motion blur resistance when the camera exposure time was set to 0.006 [s] and 0.009 [s]. The relationship between the maximum angular velocity at which the RCM could be detected and each exposure time is shown in Figure 18a, and the motion blur intensity at each time is summarized in Figure 18b.
In Figure 18, the blue markers represent the results of the experiment shown in Figure 17, while the orange markers represent the results of the additional experiment. The results of the additional experiment indicate that the RCM can be detected up to a maximum angular velocity of 90 [deg./s] when the exposure time is 0.006 [s] and up to 60 [deg./s] when the exposure time is 0.009 [s]. This shows that as the exposure time increases, the maximum angular velocity at which the RCM can be detected decreases, narrowing the sensor’s applicable range. On the other hand, the motion blur intensity at each exposure time setting remained consistent at 0.0204 [-] in both cases. In other words, the motion blur intensity generated in the image at each exposure time setting was the same, indicating that motion blur tolerance does not change with exposure time variation in the proposed method.
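The invariance observed above follows directly from Equation (10): the product ωE is the same (0.54 [deg.]) in both additional experiments, so the intensity is identical. A quick check, with an illustrative helper function:

```python
# Equation (10) predicts that motion blur intensity depends only on the
# product omega * E; both additional experiments give omega * E = 0.54 [deg.].
# The helper function name is illustrative, not from the paper.

def motion_blur_intensity(omega_deg_s: float, exposure_s: float, fov_deg: float) -> float:
    return omega_deg_s * exposure_s / fov_deg

FOV_DEG = 26.5  # experimental camera FoV (Table 3)
cases = [(90, 0.006), (60, 0.009)]  # (max detectable angular velocity, exposure time)
intensities = [round(motion_blur_intensity(w, e, FOV_DEG), 4) for w, e in cases]
print(intensities)  # → [0.0204, 0.0204]
```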

5.5. Summary of Performance

The performance requirements of the ground station detection system, as presented in Section 4, are compared with the results of the three experiments in Section 5.2, Section 5.3 and Section 5.4 and summarized in Table 4. For comparison with the performance of the proposed method, the performance of AR markers obtained using a similar experimental system is also presented in Table 4.
As shown in Table 4, the proposed method achieved the required performance in all three performance categories.
In terms of detection performance for low-resolution markers, the proposed method was able to detect RCMs down to a marker drawing size of 10.2 [pixels]. This corresponds to detecting a 2 [m] diameter color marker from an altitude of 100 [m] at a ground distance of 2839 [m]. Since the expected operational range is 2000 [m], the sensor system has a sufficient margin. The minimum detectable drawing size is half that of AR markers.
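The 2839 [m] figure can be reproduced from the camera parameters in Table 1 with a pinhole-camera model. The sketch below is our reconstruction under that assumption; the exact geometric model used in the paper may differ slightly:

```python
import math

# Camera parameters from Table 1 and an assumed 2 m marker at 100 m altitude.
FOCAL_MM = 50.0       # focal length
PIXEL_MM = 0.00345    # pixel size
MARKER_MM = 2000.0    # 2 m diameter color marker
ALT_M = 100.0         # altitude above ground

def ground_distance_m(draw_size_px: float) -> float:
    """Ground distance at which the marker is drawn at draw_size_px pixels.

    Pinhole model: draw_size_px = FOCAL_MM * MARKER_MM / (slant_mm * PIXEL_MM).
    """
    slant_m = FOCAL_MM * MARKER_MM / (draw_size_px * PIXEL_MM) / 1000.0
    return math.sqrt(slant_m ** 2 - ALT_M ** 2)

print(round(ground_distance_m(10.2)))  # ≈ 2840, consistent with the ~2839 [m] in the text
```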
The proposed method also has a marker direction detection accuracy of 0.12 [deg.], which is 0.02 [deg.] more accurate than the required accuracy. It can therefore be used for antenna control in the range of ±0.29 [deg.]. However, when compared to AR markers, the detection accuracy was found to be inferior, entailing twice as much error.
The proposed method’s marker detection performance against motion blur was 0.0175 [-]. This performance is applicable to angular velocities of up to 29.8 [deg./s] in the assumed operational environment.
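The 29.8 [deg./s] figure follows from inverting Equation (10) with the measured tolerance and the operational camera parameters from Table 1 (FoV 5.1 [deg.], exposure 0.003 [s]); a minimal check, with an illustrative function name:

```python
# Inverting Equation (10): omega_max = R * theta_FoV / E, using the measured
# blur tolerance R = 0.0175 and the operational camera from Table 1.
# The function name is illustrative, not from the paper.

def max_angular_velocity(R: float, fov_deg: float, exposure_s: float) -> float:
    """Maximum tolerable angular velocity [deg./s] for blur intensity R."""
    return R * fov_deg / exposure_s

print(max_angular_velocity(0.0175, 5.1, 0.003))  # ≈ 29.8 [deg./s]
```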

5.6. Limitations of the Proposed Method

It has been observed that the proposed method fails to detect RCMs under several conditions. The main reasons for this are motion blur, changes in ambient light, and the partial obstruction of the markers.
First, regarding motion blur, according to the performance evaluation results shown in Table 4, the proposed method fails to detect RCMs when the motion blur intensity exceeds 0.0175. This is because motion blur alters the color of the color markers, making color discrimination difficult. Figure 19 shows an image in which RCM detection fails due to motion blur.
Second, RCM detection may fail due to the discoloration of the color markers caused by changes in ambient light. Specifically, when lighting is inadequate, such as at night, the color of the markers changes due to insufficient ambient light, which leads to RCM detection failure. In Figure 20, RCM detection fails because the color marker at the predicted location (circled in blue) is too dark.
Finally, RCM detection may fail when a portion of the color marker is completely obscured. As shown in Figure 21, if part of the color marker is missing, the detection program determines that the error exceeds the tolerance during the matching process and concludes that RCM detection has failed.
These causes of detection failure can be categorized according to two main factors: first, environments where the color markers become discolored, and second, environments where the color markers are not visible in the image. Discoloration can be counteracted to some extent by adjusting the color gamut applied to the markers. However, in environments where the color markers are completely invisible, there are physical limitations on the system. For example, although the placement pattern of the color markers in the proposed method offers a certain degree of flexibility, depending on the camera’s field of view (FoV) and the distance from the markers, the markers may fall outside the FoV. In such cases, optimizing the camera’s FoV and the placement of markers is required to prevent poor system performance.
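The FoV constraint discussed above can be illustrated with a simple visibility check. This is a hypothetical sketch of the geometric condition, not part of the proposed system:

```python
# Hypothetical visibility check: a marker can only appear in the image if the
# angle between the camera's optical axis and the marker bearing is within
# half the camera's field of view. All names here are illustrative.

def marker_in_fov(camera_dir_deg: float, marker_bearing_deg: float, fov_deg: float) -> bool:
    """Return True if the marker bearing falls inside the camera's FoV."""
    # Wrap the angular difference into [-180, 180) before comparing.
    diff = (marker_bearing_deg - camera_dir_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

# With the 26.5 [deg.] experimental camera, a marker 10 [deg.] off-axis is
# visible, but one 20 [deg.] off-axis is not.
print(marker_in_fov(0, 10, 26.5))  # → True
print(marker_in_fov(0, 20, 26.5))  # → False
```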

6. Conclusions

In this paper, a system that detects multiple color markers by image processing and determines the direction of a ground station from the location information of the markers was proposed, and its performance was evaluated for a wireless relay system using a small fixed-wing UAV for image transmission. Although color markers are, by their detection principle, robust to low resolution and motion blur, they are prone to false detection in noisy natural images, so simple color detection cannot isolate the color markers installed at a ground station. In this study, we proposed a method that detects only those markers by calculating the placement pattern of the markers in the image from the relative positions and coordinates of the markers and the camera, and matching this pattern against the captured image, and we used it to construct a ground station detection system. Based on the assumed operational environment of the wireless relay system, we determined the performance requirements for low-resolution detection, marker direction detection accuracy, and motion blur tolerance, and evaluated whether the constructed ground station detection system achieved them. The results were as follows: (1) the system was able to detect markers at 1.4 times lower resolution than the required performance standard; (2) the system was able to detect the marker direction 0.02 [deg.] more accurately than the required performance standard; and (3) the system's motion blur tolerance of 0.0175 [-] exceeded the required 0.0120 [-]. These results enabled us to construct a basic ground station detection system for establishing a wireless relay system using a small fixed-wing UAV.
Although the proposed method has achieved good results in ground experiments, its performance under flight conditions has not yet been verified. In outdoor flight, changes in ambient light may cause the discoloration of the acquired images, and during circling flight, the wing may hide the color markers due to the bank angle. Additionally, the camera may be subjected to unexpected angular velocities due to aircraft engine vibration. Furthermore, the placement pattern of the color markers depends on the camera’s field of view (FoV) and the distance from the markers, which could cause the markers to move out of the FoV and fail to be detected. If RCMs cannot be detected due to these factors, a method is needed either to correct the image using deep learning (DL) or to detect RCMs directly using DL.
Moreover, there is room for improvement in the onboard computation of small fixed-wing UAVs. For example, dedicated hardware such as an Application-Specific Integrated Circuit (ASIC) could be utilized to reduce the power consumption of the sensor system, extend the UAV’s flight time, and improve its operability.
In future research, it will be important to prioritize flight experiments to verify system performance in real-world environments. Additionally, evaluating system robustness in various flight environments and making adjustments to mitigate potential problems related to visibility and environmental conditions will be critical for future success.

Author Contributions

Conceptualization, K.H.; methodology, K.H. and M.U.; software, K.H.; validation, K.H.; resources, M.U.; writing—original draft preparation, K.H.; writing—review and editing, M.U.; supervision, M.U. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available as they are to be used in future research.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Shah, K.; Madiha, F.; Usman, N.; Farasat, A. Performance Evaluation of Next-Generation Wireless (5G) UAV Relay. Wirel. Pers. Commun. 2020, 113, 945–960.
2. Ouyang, J.; Zhuang, Y.; Lin, M.; Liu, J. Optimization of beamforming and path planning for UAV-assisted wireless relay networks. Chin. J. Aeronaut. 2014, 27, 313–320.
3. Li, R.; Zhang, G.; Chen, Y. IRS-assisted UAV wireless powered communication network for sustainable federated learning. Phys. Commun. 2024, 67, 102504.
4. Zakria, Q.; Fahim, U.; Hafiz, M.; Fadi, A. Addressing disasters in smart cities through UAVs path planning and 5G communications: A systematic review. Comput. Commun. 2021, 168, 114–135.
5. Waleed, E.; Arslan, A.; Aliza, M.; Mohamed, I. Energy-efficient task scheduling and physiological assessment in disaster management using UAV-assisted networks. Comput. Commun. 2020, 155, 150–157.
6. Yoichi, K.; Kenji, U.; Kazuo, N.; Eihiko, H. Antenna Pointing Control System for Satellite Use. Jpn. Soc. Aeronaut. Space Sci. 1988, 36, 73–78. (In Japanese)
7. Mulla, A.; Vasambekar, P. Overview on the development and applications of antenna control systems. Annu. Rev. Control 2016, 41, 47–57.
8. Francisco, R.; Rafael, M.; Rafael, M. Tracking fiducial markers with discriminative correlation filters. Image Vis. Comput. 2021, 107, 104094.
9. Mostafa, E.; Cecilia, S.; Arpad, K. A Novel Marker Detection System for People with Visual Impairment Using the Improved Tiny-YOLOv3 Model. Comput. Methods Programs Biomed. 2021, 205, 106112.
10. Jost, W.; Sangam, C.; Thomas, S. Robust marker detection and identification using deep learning in underwater images for close range photogrammetry. ISPRS Open J. Photogramm. Remote Sens. 2024, 13, 100072.
11. Daniel, D.; Alan, J.; Jonathan, B. Optical flow background estimation for real-time pan/tilt camera object tracking. Measurement 2014, 48, 195–207.
12. OpenCV—Releases. Available online: https://opencv.org/releases/ (accessed on 4 December 2024).
13. Yoichi, K.; Hiroshi, H.; Masazumi, U.; Hiroshi, T. Design and Characteristics of the High-accuracy On-board Antenna Pointing Control System. Jpn. Soc. Aeronaut. Space Sci. 1991, 39, 373–378. (In Japanese)
14. Detection of ArUco Markers. Available online: https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html (accessed on 4 December 2024).
Figure 1. Wireless relay system using a small fixed-wing UAV.
Figure 2. Process to detect RCM.
Figure 3. Relationship between the space-fixed coordinate system O_s, the camera coordinate system O_c, and the marker coordinates.
Figure 4. An undirected graph consisting of 9 markers and 12 lines connecting them.
Figure 5. Process of detecting RCM from acquired images: (a) captured image; (b) blob detection; (c) marker position prediction process in the image; (d) RCM detection.
Figure 6. Relationship between the size of the marker M and the size D of the image.
Figure 7. Relationship between the marker size D in the image and surface distance.
Figure 8. Usage of the UAV in the wireless relay system.
Figure 9. Simulation results of angular velocity applied to the antenna: (a) elevation direction; (b) azimuth direction.
Figure 10. Experimental system for minimum marker size.
Figure 11. Results of RCM detection rate by rendering size.
Figure 12. RCM detection rate when errors are applied to ArUco markers.
Figure 13. Experimental system to evaluate the minimum marker drawing size at which the RCM can be detected.
Figure 14. Relationship between the angular difference in the detected angle θ_s and the actual drive angle θ_t. The blue markers represent the raw data output from the sensor, while the red markers represent the results after correction with A = 1.607.
Figure 15. Result of deviation vs. drive angle.
Figure 16. Experimental system to evaluate the maximum motion blur at which the RCM can be detected.
Figure 17. Result of RCM detection rate per angular velocity.
Figure 18. Effect of the proposed method on motion blur resistance with varying exposure time: (a) maximum angular velocity detectable by RCM vs. exposure time; (b) motion blur tolerance vs. exposure time. The blue markers represent the experimental results from Figure 17, while the orange markers represent the results from additional experiments.
Figure 19. Example of RCM detection failure due to motion blur.
Figure 20. Example of RCM detection failure due to changes in ambient light.
Figure 21. Example of RCM detection failure due to some color markers being hidden (lower right color marker in image not visible).
Table 1. Camera parameters assumed for wireless relay system.

| Item | Value |
| --- | --- |
| Focal length | 50 [mm] |
| Pixel size | 0.00345 [mm] |
| Image size | 1280 × 720 [pixel] |
| Field of view | 5.1 [deg.] |
| Exposure time | 0.003 [s] |
Table 2. Three target performance values.

| Detection Performance for Low Resolution | Marker Direction Detection Accuracy | Detection Performance of Markers Against Motion Blur |
| --- | --- | --- |
| Smaller than 14.5 [pixel] | Smaller than 0.14 [deg.] | Greater than 0.0120 [-] |
Table 3. Experimental conditions (equipment used; method selected).

| Item | Value |
| --- | --- |
| Computer | Raspberry Pi 4B |
| Camera | V2 Camera Module |
| Camera: field of view (azimuth) | 26.5 [deg.] |
| Camera: focal length | 3.04 [mm] |
| Camera: picture size | 1280 × 720 [pixel] |
| Camera: exposure time | 0.003 [s] |
| Color markers: print size | 26 × 26 [mm] |
| High-precision rotary table: angular resolution | 0.0057 [deg.] |
| Method of acquiring relative orientation between camera and marker | Pose estimation with AR markers |
Table 4. Summary of performance.

| Item | Detection Performance for Low Resolution | Marker Direction Detection Accuracy | Detection Performance of Markers Against Motion Blur |
| --- | --- | --- | --- |
| Performance requirements | Smaller than 14.5 [pixel] | Smaller than 0.14 [deg.] | Greater than 0.0120 [-] |
| Proposed method (color markers) | 10.2 [pixel] | 0.12 [deg.] (converted value) | 0.0175 [-] |
| AR marker | 20.0 [pixel] | 0.06 [deg.] | 0.0011 [-] |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Hirai, K.; Ueba, M. The Design and Evaluation of a Direction Sensor System Using Color Marker Patterns Onboard Small Fixed-Wing UAVs in a Wireless Relay System. Aerospace 2025, 12, 216. https://doi.org/10.3390/aerospace12030216