Article

A GPS-Free Bridge Inspection Method Tailored to Bridge Terrain with High Positioning Stability

1 Graduate Institute of Communication Engineering, National Taiwan University, Taipei 10617, Taiwan
2 Department of Civil Engineering, National Taiwan University, Taipei 10617, Taiwan
3 Department of Electrical Engineering, National Taiwan University, Taipei 10617, Taiwan
* Author to whom correspondence should be addressed.
Drones 2025, 9(10), 678; https://doi.org/10.3390/drones9100678
Submission received: 31 August 2025 / Revised: 20 September 2025 / Accepted: 22 September 2025 / Published: 28 September 2025


Highlights

This study proposes an inspection system suitable for various bridge terrains and validates it through practical experiments on real bridge structures. Its key innovation is a UWB network with multiple anchors that enables precise positioning of a filming drone beneath the bridge under inspection, even in the absence of GPS signals.
What are the main findings?
  • The system uses a handover mechanism to prevent electromagnetic interference between anchors and ensure accurate positioning by quickly controlling anchor switches in distinct zones. This is suitable for bridges hundreds of meters long, using dozens of UWB anchors, but with a total of no more than six assigned anchor IDs.
  • The positioning algorithm uses an enhanced two-stage method that adapts to the terrain under the bridge, reducing the elevation error to about one-tenth of that of the original two-stage method and about half that of the Taylor series method, and improving the UAV’s position accuracy to 0.2–0.5 m.
What is the implication of the main finding?
  • By combining a bipartite graph with a vertex coloring analogy, the number of anchors and anchor IDs can be optimized, extending the length of bridges inspectable by this method to several kilometers.
  • The positioning results of the enhanced two-stage method are robust across various terrains under the bridge. Combined with further analysis, the anchor configuration can be optimized and the positioning accuracy well controlled.

Abstract

With the development of drone technology in recent years, many studies have discussed how to leverage drones equipped with sensors and cameras to conduct inspections under bridges. To address positioning challenges caused by the lack of GPS signals under bridges, triangulation with on-site pre-installed Ultra-Wideband (UWB) sensors has been used extensively to determine drone locations. However, the practical hurdles of deploying anchors under bridges are often overlooked, including variable terrain and the potential electromagnetic interference that arises when a large number of UWB sensors are deployed. This study introduces a handover mechanism to address long-distance positioning challenges and an enhanced two-stage algorithm that improves suitability for bridge terrain with higher positioning stability. By integrating these concepts, a practical bridge inspection system was devised, and realistic under-bridge experiments were conducted to validate the method’s efficacy in real-world settings.

1. Introduction

Bridges are vital pieces of infrastructure critical to people’s daily lives, but their components inevitably degrade over time. Without proper maintenance and repairs, the consequences can be severe. For example, a bridge railing collapse onto a Tennessee interstate in 2019 [1] had a significant impact on the lives of residents. Therefore, regular inspections are imperative. However, current practice still relies heavily on manual visual inspection, so each inspection requires substantial manpower and time. Developing a more efficient and cost-effective inspection method is therefore of paramount importance.
Unmanned Aerial Vehicles (UAVs), also known as drones, have emerged as highly promising tools thanks to their high maneuverability. In recent years, researchers have dedicated significant efforts to developing automatic bridge inspection systems using UAVs [2,3]. These frameworks allow an inspection path to be preset so that the drone can autonomously navigate along a designated route using GNSS (Global Navigation Satellite System). However, some components of a bridge, such as piers, supports, and abutments, are located underneath the bridge, where unavailable GPS signals may prevent these methods from being applied. While techniques like SOP (Signals of Opportunity) [4,5,6] have been used to determine a drone’s position in GPS-denied areas, they are typically deployed in urban environments where multiple signals, such as LTE, Wi-Fi, and 5G, are readily available. No work has yet applied this technology to bridge inspection, possibly because bridge sites do not receive such a diverse set of signals.
To enable drones to operate in GPS-denied environments, earlier studies proposed adding sensors to drones, such as optical flow sensors [7,8] or upward-facing two-dimensional (2-D) laser range finders [9], so that the drone can fly under the bridge by measuring the distance between the underside of the bridge and the drone. Nonetheless, this approach only automates vertical flight; horizontal control still requires a pilot. Subsequent studies attempted to utilize positioning algorithms with auxiliary sensors, such as Inertial Measurement Units (IMUs), visual sensors, ultrasonic sensors, and Ultra-Wideband (UWB) devices, to achieve fully autonomous flight.
Depending on the type of sensor utilized, these approaches fall into two categories: sensors built into the drone and external sensors mounted on it. An Inertial Navigation System (INS) [10] is a classical example of the former, using the drone’s IMU to measure angular velocity and linear acceleration; these measurements are then integrated to estimate the drone’s position and heading from the starting point [11]. However, an INS accumulates errors over time, eventually leading to severe estimation errors, so this method is typically employed only for short periods when the GPS signal is unavailable. Although subsequent studies attempted to reduce accumulated errors through Digital Elevation Maps (DEMs) [12], building a DEM can be time-consuming, especially for large or high-resolution datasets.
Other studies mount external sensors on UAVs for positioning, such as visual cameras and beacon-based sensors. In vision-based positioning, a common approach is visual odometry (VO) [13], which tracks the position of a monocular camera from an initial local reference frame [14]. However, a monocular camera only provides a 2-D projection of the scene, making it difficult to accurately estimate the drone’s position due to the lack of scale information. To address this limitation, researchers have turned to RGB-D [15] or stereo cameras [16]. While these cameras can measure depth for more accurate positioning, their accuracy decreases with increasing distance from the target [17,18]. To solve this problem, the concept of global attitude estimation has emerged, e.g., SLAM systems [19,20,21], which reduce the impact of accumulated errors by establishing an environment map and continuously updating the target attitude relative to the map [22]. However, SLAM requires substantial memory and computing power, which makes implementation on bridges challenging.
Beacon-based sensors, such as ultrasonic range sensors and UWB, are also widely used for positioning. When using this method, anchors must first be deployed in the area to be localized. The tag on the drone can then obtain distance information from the anchors and use a localization algorithm to estimate the location. For example, Ali et al. [23] successfully used ultrasonic range sensors to estimate the global position of a drone. However, the effective communication distance of ultrasonic waves is usually only about ten meters, which limits their use on long bridges.
The effective communication distance of UWB is usually larger than that of ultrasonic sensors, about 50–60 m. It is also used for various indoor or outdoor positioning problems [24,25,26,27], but still faces the limitation of short communication range when applied to bridge inspection. Although this problem can be solved by deploying a large number of UWB anchors under the bridge, the increase in the number of anchors will cause electromagnetic interference between them, making it difficult for the tag to identify which anchors sent the message. Recently, a UWB handover system has been proposed to alleviate these problems [28]. It uses a bipartite graph and a greedy algorithm to transform the problem into a vertex coloring problem to solve the challenge caused by the long area and numerous beams and columns under the bridge.
In addition, most previous studies utilizing UWB for positioning did not consider the terrain where the anchors are deployed, typically conducting positioning experiments on flat ground [24,25,26,29]. Although Nguyen et al. [27] placed anchors at different heights, the experimental area remained relatively small (within a 6 m by 6 m square), which is not representative of real-world applications like bridge inspection. Although trilateration is a classical problem, previous studies such as [30,31] employed the least squares method and the law of cosines to determine the solutions, and positioning accuracy, especially in height, is highly sensitive to anchor placement. Chen et al. [25] proposed using the Taylor series algorithm [30] to enhance height accuracy, but it requires relatively more time due to iterations. Other methods combine an IMU with a Kalman filter [27] or use a barometer to enhance vertical position accuracy [26]; however, relying on these sensors for an extended duration may lead to accumulated errors. Recently, an enhanced UWB positioning algorithm was proposed to address these difficulties [32]. The algorithm employs a two-stage singular value decomposition (SVD) to reduce positioning errors caused by tilted anchor configurations. Furthermore, optimal anchor placement strategies were explored, enabling more precise outlier detection and robust performance in bridge inspection environments.
Hence, it is crucial to develop a method that adapts to the terrain under the bridge and allows UAVs to be positioned with high accuracy for long periods of time. However, previous studies have not addressed anchor deployment and terrain tilt in real bridges. To overcome these limitations, this study proposes innovative ideas to address this problem, with major contributions outlined below:
  • Presents an inspection system tailored to various bridge terrains and conducts practical experiments on real bridge structures.
  • Applies a handover mechanism to prevent electromagnetic interference among anchors, and ensures accurate positioning by quickly controlling anchor switches in distinct areas.
  • Utilizes an enhanced two-stage method that adapts to the terrain under the bridge, which reduces the error in height by about ten times compared with the original two-stage method and about half that of the Taylor series method.
The remainder of this paper is organized as follows. Section 2 provides an overview of the entire bridge inspection process, detailing its key components and methods. Section 3 describes the handover concept, a key mechanism enabling the simultaneous use of multiple UWB anchors, and summarizes the enhanced localization based on the two-stage method. Section 4 presents the experimental results obtained from inspections under the bridge. Finally, the paper summarizes the findings and insights from this study.

2. Framework of Bridge Inspection

This section provides an overview of the entire bridge inspection process, from data acquisition by drones to the establishment of an inspection platform, as depicted in Figure 1.
Firstly, a quick survey of the selected area is conducted to determine whether it lacks GNSS signals; these signals are often missing when inspections are conducted under the bridge. If GNSS signals are missing, a UWB network is deployed in the area to enable drones to fly and capture images of crucial components, such as the main beams and bridge deck panels. An AI model is then employed to detect defects such as “cracks”, “spalling”, and “exposed rebar” in the captured images. Subsequently, the images are processed using geometric extraction methods in combination with rating criteria specified by regulations. This process yields the corresponding DER&U evaluation integers, namely the Degree of deterioration, the Extent of deterioration, and the Relevancy, which are used to assess the severity of the structural safety impact and to propose a classification of the Urgency of repair. Ultimately, the results are summarized on the inspection platform.

2.1. Data Acquisition Using UAV

To enable the drone to autonomously navigate along a predefined path for capturing the appearance of the bridge structure, this study employs the open-source software Mission Planner to generate a flight path that is readable for the drone. Figure 2 illustrates the planning process for capturing UAV images along the predefined path. Throughout the path planning phase, it is imperative to ensure that the captured images adhere to subsequent evaluation criteria, encompassing scale, geometric transformation, and coordinate data. To fulfill the prerequisites of bridge inspection, this study captures images at specific points, including the bridge deck, both sides of the bridge, and beneath the structure.
Moreover, due to the absence of GNSS signals under the bridge, a UWB network must be deployed to enable UAVs to capture crucial components such as beam webs, transverse diaphragms, and supports. As shown in Figure 3, the UWB network consists of a UWB module (also called the tag) in the drone under test (DUT) and multiple UWB modules in the anchors. The hardware layer and software stack of the DUT and the anchors can be found in Figure 3 of [28]. Some anchors are located outside the bridge, and their precise coordinates can be obtained using RTK (real-time kinematic) technology; other anchors may be located under the bridge, where GPS is unavailable. A UWB tag sensor is installed on the drone to receive distance data from the anchors. Leveraging the method proposed in this study (Section 3) and the open-source toolkit MAVProxy enables the drone to localize itself in GPS-denied areas.

2.2. Automatic Detection of Damaged Structures

To accurately identify damaged parts of the bridge, this study employs Mask R-CNN [33] as the training model. This instance segmentation model not only detects a target object but also delineates it at the pixel level, which helps quantify regions of bridge deterioration (e.g., the width and length of a crack). For training data, we collected images from an open-source dataset [34] and data provided by Taiwan CECI Engineering Consultants Inc., covering various bridge types, including vehicular bridges, pedestrian footbridges, and river crossings. This study mainly assesses problems such as concrete cracks, damage, and exposed steel bars, as shown in Figure 4. After filtering, approximately 800 images remained and were chosen as training data; these were divided into training and validation sets at a ratio of 9:1.
During the training phase, we adopted a schedule in which the learning rate increases as training iterations progress. After numerous experiments, the optimal parameters for the model were determined to be a batch size of 4, a learning rate of 0.0002, and 80 epochs.
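The paper does not specify the training framework. As a minimal sketch, assuming the widely used Detectron2 implementation of Mask R-CNN (the dataset names and iteration arithmetic below are hypothetical), the reported hyperparameters would map onto the configuration as follows:

```python
# Minimal sketch: training Mask R-CNN with the reported hyperparameters,
# assuming the Detectron2 implementation (not specified in the paper).
# "bridge_train"/"bridge_val" are hypothetical dataset names that would need
# to be registered beforehand (e.g., via register_coco_instances).
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultTrainer

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.DATASETS.TRAIN = ("bridge_train",)
cfg.DATASETS.TEST = ("bridge_val",)
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 3   # cracks, spalling, exposed rebar
cfg.SOLVER.IMS_PER_BATCH = 4          # batch size 4 (reported)
cfg.SOLVER.BASE_LR = 0.0002           # learning rate (reported)
# ~720 training images / batch 4 = 180 iterations per epoch; 80 epochs (reported)
cfg.SOLVER.MAX_ITER = 180 * 80

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```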

2.3. Detection Results to the Inspection Checklists

In the previous subsection, we employed AI deep learning techniques to identify bridge components and defects. The identification results are then further analyzed according to the DER&U values obtained through the rating criteria, which allows us to assess the severity of the structural damage and propose an urgency classification. Based on this classification, maintenance personnel prioritize inspections of higher-severity locations and develop a repair and reinforcement plan. In addition, because bridges differ in shape, size, color, location, etc., applying this method requires collaboration with professional engineers to adjust the defined rules or add exception criteria for special situations.

2.4. Build a Detection Management System

By integrating images captured by drones at points of interest with the identification results from AI models, we can construct a comprehensive three-dimensional (3D) bridge management system (see Figure 5). This system not only showcases the 3D model of the bridge and its associated facilities but also promptly displays cracks or concrete spalling on the structure, allowing maintenance personnel to quickly grasp the bridge condition and reducing their burden. The 3D models shown in this management system can be obtained from the UAV photos using commercial photogrammetry software such as Pix4D. They can also be constructed by digitalizing the bridge’s engineering drawings with software such as SketchUp.

3. Positioning by Anchors Handover and SVD-Enhanced Method

3.1. Statement of the Handover Problem

In view of the problems caused by the large number of anchors under bridges, the idea of “handover” is proposed: precise positioning is achieved by switching anchors in different areas on and off in a timely manner, ensuring that the tag can accurately identify which anchor ID is sending each message.

3.1.1. Handover Mechanism

A communication network is established between the tag in the DUT and the anchors, allowing each anchor to be controlled according to the tag’s position at any given time. Each UWB tag and anchor is equipped with a Raspberry Pi. All Raspberry Pis operate on the same wireless local area network (WLAN), enabling the tag to communicate with each anchor using the UDP protocol. In addition, each anchor is equipped with a relay that can be switched on and off as needed.
The flow chart of the handover process is shown in Figure 6. To begin with, the entire area is divided into non-overlapping zones, each containing a group of at least 4 anchors for DUT positioning. Each group of anchors in a specific zone is assigned a predefined character. Anchors located at the intersection of two zones are identified by two characters, and so on. However, each anchor can only be assigned one ID number, and the IDs of anchors in the same group must be different. The characters and IDs of all anchors together with their coordinates are stored in the tag in a table format so that the tag can employ the data in the table to locate the DUT.
When the DUT changes zones, handover occurs. The tag broadcasts the corresponding character to all anchors based on its location (determined by the SVD-enhanced two-stage method described later). If the Raspberry Pi on an anchor receives a character corresponding to its zone, the control relay turns on the UWB module; otherwise, the module is turned off. This on-off mechanism ensures that the anchors in the appropriate group are activated to locate the drone, while the others are deactivated.
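As a minimal sketch of this on-off mechanism (assuming Python on the Raspberry Pis; the UDP port and relay GPIO pin are hypothetical), the tag broadcasts its zone character over UDP and each anchor toggles its relay accordingly:

```python
# Minimal sketch of the handover on-off mechanism. Assumptions: Python on
# the Raspberry Pis; the UDP port number and relay GPIO pin are hypothetical.
import socket

ZONE_PORT = 5005  # hypothetical UDP port shared by tag and anchors

def tag_broadcast_zone(zone_char: str) -> None:
    """Tag side: broadcast the current zone character to all anchors."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(zone_char.encode(), ("255.255.255.255", ZONE_PORT))
    sock.close()

def anchor_listen(my_zone_chars: set[str]) -> None:
    """Anchor side: drive the relay so the UWB module is powered only
    while the broadcast character matches one of this anchor's zones."""
    import RPi.GPIO as GPIO      # available on Raspberry Pi OS
    RELAY_PIN = 17               # hypothetical GPIO pin driving the relay
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(RELAY_PIN, GPIO.OUT)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", ZONE_PORT))
    while True:
        data, _ = sock.recvfrom(16)
        zone = data.decode()
        # Anchors at a zone intersection hold several characters.
        GPIO.output(RELAY_PIN, GPIO.HIGH if zone in my_zone_chars else GPIO.LOW)
```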
An optimization algorithm for anchor deployment and ID assignment has been developed to adapt the UWB handover system to bridge inspection [28]. It addresses three main issues: (1) occlusion by the many beams and columns of the bridge, given that the effective range of UWB exceeds 40 m according to actual measurements; (2) deployment cost, reduced by minimizing the number of anchors while keeping the area under the bridge covered by UWB signals; and (3) the limited number of anchor IDs that can be assigned at one time, since different anchors in the same group must have different IDs to avoid ranging interference.
More specifically, the occlusion problem can be solved by selecting more anchors for each zone, for example, five anchors per zone. The algorithm, which combines a bipartite graph with a vertex coloring analogy (see the sketch below), is easy to apply, at the cost of a slight increase in the total number of anchors. Since the positioning algorithm discussed in Section 3.2 relies on four distance measurements to locate the correct position, the additional measurement provides redundancy. An algorithm capable of identifying outlier measurements caused by obstruction was proposed in [32]; therefore, even in the presence of occlusion, the four most reliable distance measurements can be selected to find the solution.
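As an illustrative sketch of the ID-assignment idea (the zone layout below is hypothetical, echoing the experiment in Section 3.1.2), anchors sharing a zone form a conflict graph, and a greedy vertex coloring assigns IDs so that no two anchors in the same zone collide:

```python
# Minimal sketch of the ID-assignment idea from [28]: anchors that share a
# zone must receive distinct IDs, which is a vertex coloring problem on the
# conflict graph. The zone/anchor layout below is a hypothetical example.
from itertools import count

def assign_ids(zones: dict[str, list[str]]) -> dict[str, int]:
    """Greedily color the conflict graph: two anchors conflict if they
    appear in the same zone. Returns anchor -> ID (color)."""
    conflicts: dict[str, set[str]] = {}
    for members in zones.values():
        for a in members:
            conflicts.setdefault(a, set()).update(m for m in members if m != a)
    ids: dict[str, int] = {}
    # Color higher-degree anchors first (a common greedy heuristic).
    for a in sorted(conflicts, key=lambda a: -len(conflicts[a])):
        used = {ids[n] for n in conflicts[a] if n in ids}
        ids[a] = next(i for i in count(1) if i not in used)
    return ids

# Hypothetical layout echoing Section 3.1.2: 8 anchors, 3 zones, 4 IDs suffice.
zones = {"A": ["a1", "a2", "a3", "a4"],
         "B": ["a3", "a4", "a5", "a6"],
         "C": ["a5", "a6", "a7", "a8"]}
print(assign_ids(zones))  # greedy coloring uses at most 4 distinct IDs here
```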

3.1.2. Experiment for Handover

The anchors are deployed within a 10.47 m × 3.22 m area, as shown in Figure 7. The area is divided into three zones, each of which is covered by four anchors for positioning. Although there are eight anchors, they need only four assigned IDs (labeled ID 1, 2, 3, and 4) to avoid duplicate IDs within any zone. When the tag resides in a certain zone, the anchors within that zone are activated, while the remaining anchors are deactivated.
During the experiment, the experimenter, carrying the tag, walked along a straight path from coordinates (0, 1.61) to (11.44, 1.61). The handover mechanism was activated according to the tag’s position, and the positioning results were recorded throughout for subsequent analysis. The results, shown in Figure 8, demonstrate continuous positioning throughout the handover process. The horizontal axis represents time, and the vertical axes represent the x and y coordinates in meters. The red lines denote the reference route of the DUT, the black dots denote the estimated positioning solutions, and the black dashed lines mark the handover time points. Comparing the estimated positions with the actual route yields per-axis positioning errors with an RMSE of 0.176 m and 0.102 m for the x and y coordinates, respectively.
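A minimal sketch of this per-axis RMSE evaluation (the file name and the uniform-walking-speed assumption are hypothetical):

```python
# Minimal sketch of the per-axis RMSE computation used to evaluate the
# handover experiment (file name and uniform-speed assumption hypothetical).
import numpy as np

est = np.loadtxt("handover_positions.csv", delimiter=",")  # columns: x, y
true_y = 1.61                                  # constant y of the walked path
true_x = np.linspace(0.0, 11.44, len(est))     # assumes uniform walking speed

rmse_x = np.sqrt(np.mean((est[:, 0] - true_x) ** 2))
rmse_y = np.sqrt(np.mean((est[:, 1] - true_y) ** 2))
print(f"RMSE x = {rmse_x:.3f} m, y = {rmse_y:.3f} m")  # reported: 0.176, 0.102
```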

3.2. SVD-Enhanced Positioning in Slant Terrains

A two-stage method was originally introduced to locate the position of a drone using a UWB device [25]. In the first stage, trilateration is used to determine the x and y coordinates, which are then substituted into a predefined cost function to obtain the z coordinate. In real-world scenarios such as bridge inspection, uneven terrain limits the placement of anchors to accessible heights, usually near human working levels. When anchors are placed at different heights, the prediction accuracy drops significantly, mainly due to a large condition number.

3.2.1. Enhanced Two-Stage Algorithm

To reduce positioning errors caused by tilted anchor configurations, an enhanced UWB positioning algorithm based on SVD was developed [32]. If the anchors are nearly coplanar on sloping terrain, their unequal altitudes seriously contaminate the accuracy of the horizontal coordinates, resulting in inaccurate positioning of the DUT. To overcome this difficulty, the anchor coordinate system is rotated into a new coordinate system in which the anchors lie dominantly in a horizontal plane with minimum vertical span. The two-stage algorithm is then applied in the new coordinate system, finding the horizontal coordinates of the tag in the first stage and the vertical height in the second. Finally, the new coordinates are rotated back to the original coordinate system to obtain the tag position, following the theoretical formulation in Section II.B of [32].
In a positioning scenario with $N = 4$ or more anchors, let the DUT be at point $T(\hat{x}, \hat{y}, \hat{z})$ and the anchors at $A_i(x_i, y_i, z_i)$ for $i = 1, \ldots, N$. Choosing one anchor, say $A_N$, as the reference, the anchor configuration can be characterized by a matrix $\mathbf{A}$ whose rows are the vectors $A_i - A_N$ for $i = 1, \ldots, N-1$. Applying SVD decomposes $\mathbf{A}$ into the form $\mathbf{U}\boldsymbol{\Sigma}\mathbf{V}^T$, where $\boldsymbol{\Sigma}$ is a diagonal matrix of singular values. Let the solution for the DUT be $\mathbf{w} = T - A_N$. The transformation $\mathbf{p} = \mathbf{V}^T\mathbf{w}$ shifts all computations to the $\mathbf{V}$ domain, so the $x'$ and $y'$ coordinates of the solution $\mathbf{p} = (x', y', z')$ are no longer adversely affected by the large condition number. The second stage of the two-stage method [25] is then performed to obtain the expected value of $z'$ given the accurate $x'$ and $y'$. Finally, the solution is recovered as $\mathbf{w} = \mathbf{V}\mathbf{p}$.
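A minimal numpy sketch of this procedure, assuming a simple range-residual cost and a grid search for the second stage (the exact second-stage cost function of [25,32] may differ):

```python
# Minimal sketch of the SVD-enhanced two-stage positioning. Assumption: a
# simple range-residual cost with grid search stands in for the second-stage
# cost function of [25,32], which may differ in detail.
import numpy as np

def svd_two_stage(anchors: np.ndarray, d: np.ndarray) -> np.ndarray:
    """anchors: (N, 3) coordinates, d: (N,) measured ranges. Returns (3,)."""
    ref = anchors[-1]                     # A_N as the reference anchor
    A = anchors[:-1] - ref                # rows are A_i - A_N
    # Linearized trilateration: 2 A w = b, with w = T - A_N.
    b = d[-1] ** 2 - d[:-1] ** 2 + np.sum(A ** 2, axis=1)
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    # Stage 1: solve the two well-conditioned components in the V domain.
    q = U.T @ (b / 2.0)
    p = np.zeros(3)
    p[:2] = q[:2] / S[:2]                 # skip the smallest singular value
    # Stage 2: 1-D search for the ill-conditioned component p[2] that best
    # matches the measured ranges.
    def cost(p3: float) -> float:
        w = Vt.T @ np.array([p[0], p[1], p3])
        return np.sum((np.linalg.norm(anchors - (ref + w), axis=1) - d) ** 2)
    grid = np.linspace(-20.0, 20.0, 2001)  # hypothetical search range (m)
    p[2] = grid[np.argmin([cost(g) for g in grid])]
    return ref + Vt.T @ p                  # rotate back: w = V p, T = w + A_N
```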

3.2.2. Experiment for Positioning

A set of UWB anchors with IDs 1–4 is located within a 27 m × 4.83 m area. To mimic the bridge’s downhill terrain, anchors 1 and 2 are placed at a lower elevation and anchors 3 and 4 at a higher elevation, with a tilt angle of $\theta = 3.4^\circ$ (see Figure 9). In Figure 9, the red points correspond to anchors 1–4; the experimenter, carrying a tag, walked along the blue rectangular path from P1 through P4, P3, and P2, and back to P1. The tag receives signals from the anchors at a sampling rate of 10 Hz. The SVD is performed in a single step without additional stopping rules or regularization thresholds; as a result, the computation time per positioning result is only 0.078 ms [32]. Through the enhanced two-stage method, the tag position can then be determined.
The effect of the large condition number on positioning results is also investigated. At a tilt angle of $\theta = 3.4^\circ$, the condition number is 606 and the smallest singular value is $\lambda_3 = 0.0634$. This small value indicates high sensitivity to measurement errors, which can significantly affect 2D positioning results. Figure 10 compares the 2D positioning results of the original and enhanced two-stage methods. Using SVD significantly improves x-y positioning accuracy and reduces the error amplification caused by the high condition number.
Table 1 compares the root-mean-square error (RMSE) on the x-, y-, and z-axes, the condition number, and the CPU time per point for the original two-stage method, the SVD-enhanced two-stage method, and the Taylor series algorithm. The condition number drops from 606 to 6.52, roughly two orders of magnitude. The largest RMSE occurs on the x-axis, decreasing from 1.0191 m to 0.0485 m, and the z-axis RMSE decreases from 1.358 m to 0.521 m, while comparable accuracy is maintained on the y-axis. The SVD method also achieves the lowest computational cost of 0.078 ms, validating its efficiency and robustness under tilted-plane conditions. For comparison, the Taylor series algorithm was implemented using the original two-stage result as the initial guess to ensure its best performance. These real-world results highlight the effectiveness of the SVD-enhanced two-stage method in maintaining high positioning accuracy even when the anchors are not placed on a flat surface.
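To illustrate how the condition number and smallest singular value follow from an anchor layout, here is a minimal sketch (the coordinates are hypothetical, so the values will differ from the measured 606 and 0.0634):

```python
# Minimal sketch: condition number and smallest singular value of a tilted
# anchor configuration (hypothetical coordinates, tilt of about 3.4 degrees).
import numpy as np

anchors = np.array([[0.0,  0.0,  0.0],   # anchors 1-2 at the lower elevation
                    [0.0,  4.83, 0.0],
                    [27.0, 4.83, 1.6],   # anchors 3-4 raised ~27*tan(3.4 deg)
                    [27.0, 0.0,  1.6]])
A = anchors[:-1] - anchors[-1]           # rows A_i - A_N, as in Section 3.2.1
S = np.linalg.svd(A, compute_uv=False)
print("condition number:", S[0] / S[-1], " smallest singular value:", S[-1])
```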

4. Bridge Inspection Experiment

4.1. Experiment Settings

4.1.1. Hardware

The hardware is shown in Figure 11. The UAV carries a three-axis gimbal mounted primarily facing upward to fulfill the inspection requirements beneath the bridge. It is also equipped with a video transmission module (transmitter and receiver) to facilitate long-distance wireless transmission of the aerial footage. The model offers wind resistance equivalent to Beaufort scale 4, making it suitable for the sudden strong winds that may occur under a bridge.
The UWB used in this experiment supports IEEE 802.15.4a and offers multiple channels. We primarily operate within the channel range of 4.25 GHz to 4.75 GHz, with a packet transmission rate of 50 Hz. In this experiment, UWB anchors are deployed outside or under the bridge. First, the longitude and latitude coordinates of the anchors outside the bridge are measured using RTK technology. With the above information, the coordinates of other anchors under the bridge can be obtained using the positioning algorithm (see Figure 12).

4.1.2. Software

The drone’s flight control system uses a Pixhawk 2.4 running the open-source firmware ArduPilot Copter 4.2. This setup lets us leverage a range of packages for various tasks in this experiment. For instance, the GPSInput module of MAVProxy version 1.8 can inject simulated GPS signals to achieve drone positioning, and tools like Mission Planner 1.3.80 can set flight paths and monitor the drone’s flight in real time.
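As a minimal sketch of this positioning injection (assuming pymavlink; the connection string and accuracy figures are hypothetical, and the paper itself uses MAVProxy’s GPSInput module for the same purpose), a UWB-derived position can be forwarded to the flight controller as a GPS_INPUT MAVLink message:

```python
# Minimal sketch: feeding a UWB-derived position to ArduPilot as a GPS_INPUT
# MAVLink message via pymavlink. The connection string and accuracy values
# are hypothetical placeholders.
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection("udpout:127.0.0.1:14550")  # hypothetical

IGNORE = (mavutil.mavlink.GPS_INPUT_IGNORE_FLAG_VEL_HORIZ
          | mavutil.mavlink.GPS_INPUT_IGNORE_FLAG_VEL_VERT
          | mavutil.mavlink.GPS_INPUT_IGNORE_FLAG_SPEED_ACCURACY)

def send_uwb_fix(lat_deg: float, lon_deg: float, alt_m: float) -> None:
    """Convert a UWB position (already transformed to WGS-84) into a
    simulated 3D GPS fix that the flight controller can fuse."""
    master.mav.gps_input_send(
        int(time.time() * 1e6),  # time_usec
        0,                       # gps_id
        IGNORE,                  # ignore the velocity fields
        0, 0,                    # GPS week time (unused here)
        3,                       # fix_type: 3 = 3D fix
        int(lat_deg * 1e7),      # latitude  [degE7]
        int(lon_deg * 1e7),      # longitude [degE7]
        alt_m,                   # altitude  [m]
        1.0, 1.0,                # hdop, vdop
        0.0, 0.0, 0.0,           # vn, ve, vd [m/s] (ignored)
        0.0,                     # speed accuracy (ignored)
        0.3, 0.5,                # horiz./vert. accuracy [m] (cf. Section 3.2)
        10)                      # satellites_visible (dummy)
```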

4.2. Selection of Validation Bridges

This study selected two bridges for experimental validation: a small bridge, Bridge A, located in a mountainous area, and a river-crossing bridge, Bridge B, in an urban area.
These two bridges were selected based on specific considerations. According to the Taiwan Ministry of Transportation’s “National Bridge Basic Information Table,” Bridge A requires drone inspections because its minimum underpass height is 1.2 m, making it difficult for large equipment to enter. Bridge B, which plays a vital role as a transportation link between downtown Taipei and its suburbs, spans the Jingmei River, measuring 166 m in length and 29.5 m in width; its piers stand in the stream, making traditional bridge inspection difficult. Figure 12 and Figure 13 describe the UWB network deployment for Bridges A and B, respectively.
The two validation bridges represent distinct deployment environments, as summarized in Table 2. Bridge A is a compact structure in a mountainous setting where seven UWB anchors were installed, including two placed beneath the bridge to maintain UAV stability during low-altitude flights under the bridge. In contrast, Bridge B is a long-span urban bridge spanning 166 m over the river. To ensure continuous coverage across the wide water gaps and distant piers, 27 UWB anchors were distributed along the bridge deck and temporarily installed on both riverbanks. This contrast demonstrates the adaptability of the proposed UWB-RTK UAV positioning system to both confined and extended bridge environments.

4.3. Image Acquisition

For drone image acquisition, this study uses Mission Planner to generate flight paths compatible with the drone and saves them in the waypoint file format. This planning involves pre-defining the drone’s shooting path for key bridge components and setting parameters such as altitude, flight speed, and dwell time at waypoints, ensuring that the flight trajectory aligns with site requirements. Figure 14 illustrates the drone’s flight path around and under a bridge as an example. The actual flight path, shown by the purple line in the figure, agrees well with the path defined in Mission Planner.
Drone route planning involves three main steps, as shown in Figure 15. First, as in (1), switch the Mission Planner screen to the “Flight Plan” page and designate the return position (H). By default, the drone will autonomously return to this position after completing its mission. Next, as in (2), select the waypoint locations to chart the route. Once the location of each waypoint is confirmed, proceed to set the dwell seconds and altitude (Alt) for each point. In this case, the route required the drone to pause for 5 s at each point, and the altitude was set to 2.5 m. The altitude specified here is relative to the initial position. Finally, as in (3), upload the planned route to the drone to perform the mission.
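As an illustrative sketch (the coordinates below are hypothetical), such a route is saved in Mission Planner’s plain-text QGC WPL format. Each line lists the waypoint index, a current-waypoint flag, the coordinate frame (3 = altitude relative to home), the command (16 = MAV_CMD_NAV_WAYPOINT), four parameters (the first being the dwell time in seconds), latitude, longitude, altitude, and an autocontinue flag; line 0 is the home position:

```
QGC WPL 110
0	1	0	16	0	0	0	0	25.0000000	121.5000000	0.000000	1
1	0	3	16	5	0	0	0	25.0001000	121.5000000	2.500000	1
2	0	3	16	5	0	0	0	25.0002000	121.5000000	2.500000	1
3	0	3	16	5	0	0	0	25.0003000	121.5000000	2.500000	1
```

Here each waypoint carries the 5 s dwell of this mission in its first parameter and the 2.5 m relative altitude in the altitude column.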
To issue commands for capturing images through the Mission Planner and embed the coordinate information provided by UWB into the image, this study employed an on-screen display (OSD) module and a serial bus (SBUS) signal conversion module to facilitate the UAV imaging process (see Figure 16). This process is primarily divided into two parts: transmitting flight control information to the camera module (red section) and issuing flight control instructions for image capture (blue section).
Finally, the drone performed the shooting task according to the pre-defined paths and obtained images of each bridge component. Figure 17 shows images captured through the pre-planned flight paths. Coordinate information is embedded in each image, allowing the results to be subsequently mapped to actual locations.
Figure 18 displays the results of bridge inspection using deep learning models for the collected bridge images. The model automatically identifies the locations and extent of bridge damage, providing objective and precise detection results that reduce subjectivity in manual inspections. The model’s overall precision and recall were 0.832 and 0.945, respectively, with an F1 score of 0.885, indicating robust performance across all types of damage.
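As a quick arithmetic check, the reported F1 score follows directly from the stated precision and recall:

```python
# Verify the reported F1 score from the stated precision and recall.
precision, recall = 0.832, 0.945
f1 = 2 * precision * recall / (precision + recall)
print(f"F1 = {f1:.3f}")  # F1 = 0.885, matching the reported value
```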
It should be noted that the experiments in this study were performed in good weather. This reflects the fact that structural inspections are typically conducted in good weather to ensure clear imagery and personnel safety. Adverse conditions (e.g., windy or rainy weather, heavy traffic, and/or radio interference) are certainly issues of concern, but they are not the main emphasis of this study and are thus excluded.

5. Conclusions and Discussion

The structural safety of bridges is a critical public concern, yet traditional visual inspection methods are labor-intensive and time-consuming. This study is dedicated to the research and development of key technologies for bridge inspection. For data collection, a comprehensive UAV route planning process enabled the collection of images of several key bridge components. Establishing a UWB network addresses the difficulty of collecting images from under bridges, a significant step toward autonomous image collection. The study deployed UWB environments on Bridges A and B, successfully improving the UAV’s position accuracy to 0.2–0.5 m. Through route planning, images of key bridge components previously difficult for UAVs to capture, including supports and beam webs, were collected automatically.
Another core aspect of this project is UAV image processing and analysis management. The bridge inspection framework, based on deep learning and computer vision, assists engineers in conducting bridge inspections quickly and efficiently. The AI recognition process is divided into three modules: image localization, degradation identification, and damage assessment. It can effectively assess problems such as concrete cracks, damage, and exposed steel bars. Compared to previous research on degradation identification, this study focuses more on establishing a comprehensive bridge inspection framework to assist engineers in patrol inspections. Inspection of two practical bridges shows that UAVs reveal more deterioration areas than visual inspections; by covering a wider range of inspection angles, UAVs provide on-site personnel with a more comprehensive basis for bridge repairs.
Still, this study does not consider certain real-world scenarios, such as beams and columns beneath bridges that may block or diffract UWB signals during UAV flight, affecting positioning accuracy. Detecting UWB signal outliers is therefore an important consideration for future bridge inspection methods. For long-span bridges, determining optimal anchor locations under the bridge while ensuring precise positioning is also essential. Future work will integrate multipath detection and mitigation for beam/column reflections and explore adaptive anchor layouts and simulation studies for long-span bridges.
Another important issue for long-span bridges is estimating the total number of anchors required for inspection. A rough estimate is possible since UWB has an effective measurement range of about 40 m; a conservative layout places an anchor every 20 m. For example, Bridge B measures 166 × 29.5 m², so the estimate is 3 × 9 = 27 anchors, since 29.5/20 + 1 ≈ 3 and 166/20 + 1 ≈ 9. The bipartite graph algorithm proposed in [28] can help optimize this layout. In fact, applying it to the same bridge, with the layout in this study as the initial arrangement, reduces the 27 anchors to 14, a reduction factor of about 0.5. Scaling this estimate, a long bridge of, say, 2 km length and 60 m width would require roughly 4 × 60 × 0.5 = 120 anchors.
Overall, there may be other factors not discussed here that need to be considered in real-world scenarios. However, by addressing the issues raised, bridge inspection technology can advance further.

Author Contributions

Conceptualization, J.-Y.H. and R.-B.W.; methodology, R.-B.W.; software, J.-H.B., C.-R.H. and J.-Y.H.; validation, J.-Y.H. and R.-B.W.; formal analysis, J.-H.B. and C.-R.H.; investigation, R.-B.W.; resources, R.-B.W.; data curation, J.-H.B. and C.-R.H.; writing—original draft preparation, J.-H.B. and C.-R.H.; writing—review and editing, R.-B.W.; visualization, J.-H.B. and J.-Y.H.; supervision, J.-Y.H. and R.-B.W.; project administration, R.-B.W.; funding acquisition, R.-B.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded jointly by the Ministry of Science and Technology, Taiwan, under grant MOST 110-2221-E-002-172 and the Xin Tai Asset Management Co., Ltd., Taipei, Taiwan.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

J.-H. Bai thanks P.-H. Wang and J.-L. Tsai for providing the experimental data for this paper.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Vera, A. Concrete Bridge Railing Collapses onto Tennessee Interstate, Injuring One Person. CNN, 3 April 2019. [Google Scholar]
  2. Lin, J.J.; Ibrahim, A.; Sarwade, S.; Golparvar-Fard, M. Bridge inspection with aerial robots: Automating the entire pipeline of visual data capture, 3D mapping, defect detection, analysis, and reporting. J. Comput. Civ. Eng. 2021, 35, 04020064. [Google Scholar] [CrossRef]
  3. Morgenthal, G.; Hallermann, N.; Kersten, J.; Taraben, J.; Debus, P.; Helmrich, M.; Rodehorst, V. Framework for automated UAS-based structural condition assessment of bridges. Autom. Constr. 2019, 97, 77–95. [Google Scholar] [CrossRef]
  4. Yang, Y.; Khalife, J.; Morales, J.J.; Kassas, M. UAV waypoint opportunistic navigation in GNSS-denied environments. IEEE Trans. Aerosp. Electron. Syst. 2021, 58, 663–678. [Google Scholar] [CrossRef]
  5. Khalife, J.; Kassas, Z.M. On the achievability of submeter-accurate UAV navigation with cellular signals exploiting loose network synchronization. IEEE Trans. Aerosp. Electron. Syst. 2022, 58, 4261–4278. [Google Scholar] [CrossRef]
  6. Khalife, J.; Kassas, Z.M. Opportunistic UAV navigation with carrier phase measurements from asynchronous cellular signals. IEEE Trans. Aerosp. Electron. Syst. 2019, 56, 3285–3301. [Google Scholar] [CrossRef]
  7. Tomiczek, A.P.; Bridge, J.A.; Ifju, P.G.; Whitley, T.J.; Tripp, C.S.; Ortega, A.E.; Poelstra, J.J.; Gonzalez, S.A. Small unmanned aerial vehicle (sUAV) inspections in GPS denied area beneath bridges. Struct. Congr. 2018, 205–216. [Google Scholar] [CrossRef]
  8. Whitley, T.; Tomiczek, A.; Tripp, C.; Ortega, A.; Mennu, M.; Bridge, J.; Ifju, P. Design of a small unmanned aircraft system for bridge inspections. J. Sens. 2020, 20, 5358. [Google Scholar] [CrossRef]
  9. Abiko, S.; Sakamoto, S.Y.; Hasegawa, T.; Shimaji, N. Development of constant altitude flight system using two dimensional laser range finder with mirrors. In Proceedings of the 2017 IEEE International Conference on Advanced Intelligent Mechatronics (AIM), Munich, Germany, 3–7 July 2017. [Google Scholar]
  10. Petritoli, E.; Leccese, F.; Leccisi, M. Inertial navigation systems for UAV: Uncertainty and error measurements. In Proceedings of the 2019 IEEE 5th International Workshop on Metrology for AeroSpace (MetroAeroSpace), Torino, Italy, 19–21 June 2019; pp. 1–5. [Google Scholar]
  11. Scaramuzza, D.; Zhang, Z. Visual-inertial odometry of aerial robots. arXiv 2019, arXiv:1906.03289. [Google Scholar] [CrossRef]
  12. Zhang, J.; Wu, Y.; Liu, W.; Chen, X. Novel approach to position and orientation estimation in vision-based UAV navigation. IEEE Trans. Aerosp. Electron. Syst. 2010, 46, 687–700. [Google Scholar] [CrossRef]
  13. Balamurugan, G.; Valarmathi, J.; Naidu, V.P.S. Survey on UAV navigation in GPS denied environments. In Proceedings of the 2016 International conference on Signal Processing, Communication, Power and Embedded System (SCOPES), Odisha, India, 3–5 October 2016. [Google Scholar]
  14. Mansur, S.; Habib, M.; Pratama, G.N.P.; Cahyadi, I.A.; Ardiyanto, I. Real time monocular visual odometry using optical flow: Study on navigation of quadrotors UAV. In Proceedings of the 3rd International Conference on Science and Technology—Computer (ICST), Yogyakarta, Indonesia, 11–12 July 2017. [Google Scholar]
  15. El Bouazzaoui, I.; Florez, S.A.R.; El Ouardi, A. Enhancing RGB-d SLAM performances considering sensor specifications for indoor localization. IEEE Sens. J. 2021, 22, 4970–4977. [Google Scholar] [CrossRef]
  16. Warren, M.; Corke, P.; Upcroft, B. Long-range stereo visual odometry for extended altitude flight of unmanned aerial vehicles. Int. J. Robot. 2016, 35, 381–403. [Google Scholar] [CrossRef]
  17. Shan, M.; Bi, Y.; Qin, H.; Li, J.; Gao, Z.; Lin, F.; Chen, B.M. A brief survey of visual odometry for micro aerial vehicles. In Proceedings of the 42nd Annual Conference of the IEEE Industrial Electronics Society, Florence, Italy, 23–26 October 2016. [Google Scholar]
  18. Thai, V.P.; Zhong, W.; Pham, T.; Alam, S.; Duong, V. Detection, tracking and classification of aircraft and drones in digital towers using machine learning on motion patterns. In Proceedings of the 2019 Integrated Communications, Navigation and Surveillance Conference (ICNS), Herndon, VA, USA, 9–11 April 2019. [Google Scholar]
  19. Campos, C.; Elvira, R.; Rodríguez, J.J.G.; Montiel, J.M.; Tardós, J.D. Orb-slam3: An accurate open-source library for visual, visual–inertial, and multimap slam. IEEE Trans. Robot. 2021, 37, 1874–1890. [Google Scholar] [CrossRef]
  20. Qin, T.; Li, P.; Shen, S. Vins-mono: A robust and versatile monocular visual-inertial state estimator. IEEE Trans. Robot. 2018, 34, 1004–1020. [Google Scholar] [CrossRef]
  21. Bryson, M.; Sukkarieh, S. Observability analysis and active control for airborne SLAM. IEEE Trans. Aerosp. Electron. Syst. 2008, 44, 261–280. [Google Scholar] [CrossRef]
  22. Alkendi, Y.; Seneviratne, L.; Zweiri, Y. State of the art in vision-based localization techniques for autonomous navigation systems. IEEE Access 2021, 9, 76847–76874. [Google Scholar] [CrossRef]
  23. Ali, R.; Kang, D.; Suh, G.; Cha, Y.J. Real-time multiple damage mapping using autonomous UAV and deep faster region-based neural networks for GPS-denied structures. Autom. Constr. 2021, 130, 103831. [Google Scholar] [CrossRef]
  24. Jiang, S.; Wu, Y.; Zhang, J. Bridge coating inspection based on two-stage automatic method and collision-tolerant unmanned aerial system. Autom. Constr. 2023, 146, 104685. [Google Scholar] [CrossRef]
  25. Chen, Y.-E.; Liew, H.-H.; Chao, J.-C.; Wu, R.-B. Decimeter-accuracy positioning for drones using two-stage trilateration in a GPS-denied environment. IEEE Internet Things J. 2022, 10, 8319–8326. [Google Scholar] [CrossRef]
  26. Si, M.; Wang, Y.; Zhou, N.; Seow, C.; Siljak, H. A hybrid indoor altimetry based on barometer and UWB. J. Sens. 2023, 23, 4180. [Google Scholar] [CrossRef]
  27. Nguyen, T.M.; Zaini, A.H.; Guo, K.; Xie, L. An ultra-wideband-based multi-UAV localization system in GPS-denied environments. In Proceedings of the International Micro Air Vehicle Conference and Competition 2016, Beijing, China, 17–21 October 2016. [Google Scholar]
  28. Wang, P.-H.; Wu, R.-B. An ultra-wideband handover system for GPS-free bridge inspection using drones. Sensors 2025, 25, 1923. [Google Scholar] [CrossRef]
  29. Wang, Y.; Li, X. The IMU/UWB fusion positioning algorithm based on a particle filter. ISPRS Int. J. Geo-Inf. 2017, 6, 235. [Google Scholar] [CrossRef]
  30. Foy, W.H. Position-location solutions by Taylor-series estimation. IEEE Trans. Aerosp. Electron. Syst. 1976, 2, 187–194. [Google Scholar] [CrossRef]
  31. Chan, Y.T.; Ho, K.C. A simple and efficient estimator for hyperbolic location. IEEE Trans. Signal Process. 1994, 42, 1905–1915. [Google Scholar] [CrossRef]
  32. Tsai, C.-L.; Wu, R.-B. Enhanced UAV localization and outlier detection using SVD-enhanced UWB for bridge inspections. IEEE Internet Things J. 2025, 12, 33111–33119. [Google Scholar] [CrossRef]
  33. He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. In Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 2961–2969. [Google Scholar] [CrossRef]
  34. Neuhold, G.; Ollmann, T.; Bulo, S.R.; Kontschieder, P. The mapillary vistas dataset for semantic understanding of street scenes. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017. [Google Scholar]
Figure 1. Bridge inspection framework. The colored boxes denote the major action steps.
Figure 2. Schematic diagram of UAV flight paths, where the UAV positions are indicated and arrows denote the camera viewing directions.
Figure 3. UWBs are deployed to form a network consisting of a UWB tag in DUT and multiple UWB modules in anchors. The DUT with UWB tag under the bridge receives UWB signals from anchors to determine its accurate location. Red lines denote UWB signals, and blue lines indicate GPS-RTK signals.
Figure 4. Main problems to be assessed in bridge inspection, with different colors indicating distinct degradation categories.
Figure 5. Three-dimensional bridge management system.
Figure 6. Flow chart of the handover process, including the DUT and an anchor in one of the zones. Once the anchor receives the character from the DUT that is defined in this zone, all anchors of this zone will be activated.
Figure 7. Experimental configuration diagram. The DUT moves in the direction of the dashed arrow. While it is in Zone_A, anchors that have corresponding character A are turned on, while others off.
Figure 8. The x-y coordinates of the positioning results. The DUT moves along the x direction. The estimated position (x, y) over time, obtained from UWB measurements with handover, is compared to the reference route (red lines). The arrows point to the corresponding vertical axes.
Figure 9. Anchor positions and walking path on tilted plane.
Figure 10. Comparison of original and SVD two-stage methods for 2D positioning at θ = 3.4°.
Figure 11. Hardware used in the experiment. (a) UAV, and (b) UWB sensor.
Figure 12. Anchor coordinates G1–G5 are determined via VRS-RTK, while G6–G7 coordinates are obtained using a positioning algorithm based on G1–G5.
Figure 13. The 166 m long bridge is deployed with a total of 27 UWB anchors along its span.
Figure 14. (a) Drone’s flight trajectory and (b) Pre-defining path via Mission Planner. The purple line represents the drone’s actual flight path, which closely matches the planned route.
Figure 15. Illustration of a task comprising six waypoints. (1) displays the Flight Plan page panel, (2) lists the coordinates and dwell time of the waypoints, and (3) displays the extraction and recording of waypoints into the Mission Planner.
Figure 16. The framework for embedding coordinates into UAV images. The blue and red arrows represent the flow of control commands via SBUS module and information transmission via on-screen display (OSD) module, respectively, from the Pixhawk to the camera.
Figure 17. Images captured by UAV for the inspection of Bridge B.
Figure 18. AI detection results for the bridge (cracks in blue; spalling in red; exposed rebar in green).
Table 1. Comparison of RMSE (x-, y- and z-axis), condition number, and CPU time using original and SVD two-stage methods and Taylor series algorithm.

Method | RMSE x (m) | RMSE y (m) | RMSE z (m) | Condition Number | Per-Point CPU Time (ms)
Two-stage (original) | 1.0191 | 0.2003 | 1.358 | 606 | 0.083
Two-stage (SVD) | 0.0485 | 0.1894 | 0.521 | 6.52 | 0.078
Taylor-series algorithm | 0.0820 | 0.1521 | 0.3128 | NA | 0.16
Table 2. Summary of Validation Bridges and UWB Deployment.

Location (Case) | Type | Dimensions (m²) | Number of UWB Anchors | Achieved Accuracy | Remarks
Bridge A | Small bridge | 30 × 10 | 7 (G1–G7) | High (sub-meter level) | 5 anchors placed around the perimeter (GPS-based), plus 2 beneath the bridge to enhance UAV stability during under-bridge flights; G5–G6 only 9 m apart due to site constraints.
Bridge B | Long-span bridge (urban area) | 166 × 29.5 | 27 | High (sub-meter level) | Anchors distributed along the entire span; additional temporary anchors placed on riverbanks to maintain network stability across large water gaps (>40 m between piers).