# A Multiple Sensors Platform Method for Power Line Inspection Based on a Large Unmanned Helicopter


## Abstract


## 1. Introduction

## 2. The Composition of the Multiple Sensors Platform

## 3. Planning of Flight Paths and Tasks of the Sensors

#### 3.1. Waypoints Planning of the Unmanned Helicopter

#### 3.1.1. Referential Waypoints Planning

#### 3.1.2. Optimization of Waypoints

- (a) For any segment $\overline{{R}_{i}{R}_{i+1}}$, disperse it into $n$ temporary points according to a fixed step length.
- (b) Starting from ${R}_{i}$, take each temporary point $p$ as the center and ${D}_{3}$ as the radius to draw a circle. Based on the DSM data, estimate whether the distance between point $p$ and every ground point within the circle is greater than ${D}_{3}$. If it is, continue with the following steps; if not, add an auxiliary waypoint ${R}_{p}$ at point $p$. The height of ${R}_{p}$ should make the distance between ${R}_{p}$ and all ground points within the circle greater than ${D}_{3}$. When point $p$ is too close to an existing waypoint, that waypoint can be raised instead so that the height of point $p$ meets the requirement, without adding a new auxiliary waypoint.
- (c) Taking point $p$ as the center and ${D}_{1}$ as the radius, draw a second circle and estimate whether any non-target power lines fall into it, as shown in Figure 3.
- (d) If non-target power lines fall into the new circle and cross axle b, and the distance between point $p$ and the power lines is ${r}^{\prime}$, add an auxiliary waypoint ${R}_{p}$ obtained by moving $p$ outward by ${D}_{1}-{r}^{\prime}$.
- (e) If power lines fall into this circle and cross axle a, estimate whether the vertical distance between point $p$ and the power line is greater than ${D}_{2}$. If it is, nothing needs to be done; if not, add an auxiliary waypoint ${R}_{p}$ whose $\left(x,y\right)$ is the intersection point between the power lines and axle a, and whose $z$ is the height of the power lines plus ${D}_{2}$. Similarly, when point $p$ is too close to an existing waypoint, that waypoint can be raised directly so that the safe distance between point $p$ and the lines meets the requirement.
- (f) Repeat Steps b–e to complete the optimization of all waypoints. Note that, once a new auxiliary waypoint is added in Steps b, d, or e, the dispersed temporary points must be recalculated from the last waypoint before Steps b–e are repeated.
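The terrain-clearance check in Step b can be sketched as follows. This is a minimal sketch, not the authors' implementation: `dsm_height` is a hypothetical stand-in for a real DSM query returning the maximum ground elevation within the ${D}_{3}$-radius circle, and the step length and coordinate conventions are assumptions.

```python
import numpy as np

def optimize_segment(r_i, r_next, dsm_height, d3, step=1.0):
    """Insert auxiliary waypoints along segment R_i -> R_i+1 so that every
    temporary point keeps at least d3 clearance above the terrain (Step b).

    dsm_height(x, y) is a hypothetical DSM query: it returns the maximum
    ground elevation within the d3-radius circle around (x, y).
    """
    r_i, r_next = np.asarray(r_i, float), np.asarray(r_next, float)
    length = np.linalg.norm(r_next - r_i)
    n = max(int(length // step), 1)          # number of temporary points
    waypoints = [r_i]
    for k in range(1, n):
        p = r_i + (r_next - r_i) * (k / n)   # temporary point on the segment
        ground = dsm_height(p[0], p[1])
        if p[2] - ground < d3:               # clearance violated
            # auxiliary waypoint at the (x, y) of p, raised to restore clearance
            waypoints.append(np.array([p[0], p[1], ground + d3]))
    waypoints.append(r_next)
    return waypoints
```

In the full procedure, adding a waypoint would trigger a re-dispersal from the previous waypoint (Step f); the sketch above shows only a single pass.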

#### 3.1.3. Operating Parameters of LiDAR

#### 3.2. Planning of Tasks of the Sensors

- (a) According to the number of insulators on the right side of tower ${T}_{i+1}$, we obtain the number $M$ of task points for the long-focus camera. The centric coordinates of the insulators are used as target positions, and the shooting order is from bottom to top and from left to right, as shown in Figure 5b. The insulators on the other side of the towers are inspected through bilateral flying.
- (b) To ensure that there is enough time for implementation, set the minimum distance between any two task points as $ts$. Multiplying by $M$ gives $R{L}_{2}=ts\times M$, the flight distance of the unmanned helicopter while the long-focus camera is tracking the insulators. The start point ${R}_{c}$ of the long-focus camera task points can be obtained from $R{L}_{2}$, and the spacing between adjacent task points is $ts$.
- (c) Calculate the image width $W$ from the FOV of the short-focus camera and the distance between the camera and the power lines; here the distance ${D}_{1}+w/2$ mentioned in Section 3.1 is adopted. Starting from ${T}_{i}$, with overlap $q$, the number of photos the short-focus camera needs to take along $\overline{{T}_{i}{T}_{i+1}}$ in the horizontal direction is $N=TL/\left(\left(1-q\right)\times W\right)+1$. The centric position of each image is used as the target position for tracking.
- (d) Plan $N$ uniformly spaced task points for the short-focus camera on the flight segment $\overline{{R}_{i}{R}_{c}}$, whose length is $R{L}_{1}$. However, when $RL<\left(N+M\right)\times ts$, all the task points of the short-focus camera are completed starting from ${R}_{i}$ with a separation distance of $ts$. The task points of the long-focus camera are shifted back in order, and any task points that exceed the segment are all set at the position of ${R}_{i+1}$. Meanwhile, the unmanned helicopter is set to hover at ${R}_{i+1}$, with the hovering duration determined by the number of exceeded tasks and the minimum time needed to implement each task.
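The counting in Steps b–d can be sketched as below, using the symbols from the text. The footprint formula $W=2d\tan(\mathrm{FOV}/2)$ and the rounding up of $N$ are assumptions not stated in the source.

```python
import math

def plan_tasks(TL, RL, fov_h_deg, distance, q, ts, M):
    """Sketch of the task-point counting in Steps b-d.

    Returns (N, RL2, hover): the short-focus photo count, the long-focus
    tracking distance, and whether the helicopter must hover at R_i+1
    because the segment is too short for all task points.
    """
    # assumed footprint formula: ground width covered by one image
    W = 2 * distance * math.tan(math.radians(fov_h_deg) / 2)
    N = math.ceil(TL / ((1 - q) * W)) + 1   # photos with overlap q (rounded up)
    RL2 = ts * M                            # long-focus tracking distance
    hover = RL < (N + M) * ts               # not enough segment length
    return N, RL2, hover
```

For example, with $TL=300$ m, $q=0.3$, a 54° horizontal FOV and a 45 m stand-off, this yields $N=11$ short-focus task points.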

## 4. Automatic Target Tracking

#### 4.1. Real-time Correction of Installation Errors

#### 4.2. Distance Control

#### 4.3. Attitude Control

- (a) Motion compensation: If the time-delay of the camera shooting is $t$, then before calculating the heading and pitch angles between the photographing center of the camera and the target, the distance the unmanned helicopter flies forward within $t$ must be taken into consideration. The current coordinates of the photographing center of the cameras are $\left({x}_{0},{y}_{0},{z}_{0}\right)$, the current speeds provided by the POS are $\left({v}_{x},{v}_{y},{v}_{z}\right)$, and after time $t$ the photographing center of the cameras reaches $\left({x}_{1},{y}_{1},{z}_{1}\right)$, which can be obtained from Equation (8):$$\left[\begin{array}{c}{x}_{1}\\ {y}_{1}\\ {z}_{1}\end{array}\right]={R}_{l}^{c}\left({\alpha}^{\prime},{\beta}^{\prime},{\gamma}^{\prime}\right)\left[\begin{array}{c}{v}_{x}t\\ {v}_{y}t\\ {v}_{z}t\end{array}\right]+\left[\begin{array}{c}\mathrm{\Delta}x\\ \mathrm{\Delta}y\\ \mathrm{\Delta}z\end{array}\right]+\left[\begin{array}{c}{x}_{0}\\ {y}_{0}\\ {z}_{0}\end{array}\right]$$In the formula above, ${R}_{l}^{c}\left({\alpha}^{\prime},{\beta}^{\prime},{\gamma}^{\prime}\right)$ is the rotation matrix from the IMU coordinate system to the camera coordinate system, and $\left(\mathrm{\Delta}x,\mathrm{\Delta}y,\mathrm{\Delta}z\right)$ is the eccentric component between the IMU center and the camera center.
- (b) Angle calculation: The coordinates of the photographing center of the camera are $\left({x}_{1},{y}_{1},{z}_{1}\right)$, and the coordinates of the target are $\left({x}_{2},{y}_{2},{z}_{2}\right)$. The heading and pitch can be calculated according to Equations (9) and (10):$$\mathrm{heading}=\left\{\begin{array}{ll}\pi /2-\epsilon & \left({x}_{2}>{x}_{1}\right)\\ 3\pi /2-\epsilon & \left({x}_{2}<{x}_{1}\right)\\ 0 & \left({x}_{2}={x}_{1},{y}_{2}>{y}_{1}\right)\\ \pi & \left({x}_{2}={x}_{1},{y}_{2}<{y}_{1}\right)\end{array}\right.$$$$\mathrm{pitch}={\mathrm{sin}}^{-1}\left(\left({z}_{1}-{z}_{2}\right)/dist\right)$$where$$k=\left({y}_{2}-{y}_{1}\right)/\left({x}_{2}-{x}_{1}\right),\phantom{\rule{1em}{0ex}}\epsilon ={\mathrm{tan}}^{-1}\left(k\right),\phantom{\rule{1em}{0ex}}dist=\sqrt{{\left({x}_{2}-{x}_{1}\right)}^{2}+{\left({y}_{2}-{y}_{1}\right)}^{2}+{\left({z}_{1}-{z}_{2}\right)}^{2}}$$
- (c) Attitude control: Send angle adjustment instructions to the stabilized platform. An angle threshold is set to estimate in real time whether the attitude of the platform meets the threshold requirement. If it does, photographing instructions are sent to the cameras; if not, Steps a–c are repeated until the requirement is met.
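Steps a and b can be sketched in code as follows. This is an illustrative sketch, not the authors' implementation: the heading is taken clockwise from the $y$ (north) axis, and all function and variable names are ours.

```python
import math
import numpy as np

def compensate_motion(p0, v, t, R_l_c, ecc):
    """Equation (8): predicted photographing-center position after the
    shooting delay t. R_l_c rotates the IMU frame into the camera frame;
    ecc is the eccentric (lever-arm) offset between IMU and camera centers."""
    p0, v, ecc = (np.asarray(a, float) for a in (p0, v, ecc))
    return np.asarray(R_l_c, float) @ (v * t) + ecc + p0

def heading_pitch(p1, p2):
    """Equations (9)-(10): heading in [0, 2*pi), measured clockwise from
    the y (north) axis, and pitch from camera p1 toward target p2."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dz = p1[2] - p2[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dx > 0:
        heading = math.pi / 2 - math.atan(dy / dx)   # epsilon = atan(k)
    elif dx < 0:
        heading = 3 * math.pi / 2 - math.atan(dy / dx)
    else:                                            # target due north/south
        heading = 0.0 if dy > 0 else math.pi
    return heading, math.asin(dz / dist)
```

In the control loop of Step c, these two functions would be re-evaluated on each iteration until the platform attitude error falls below the threshold.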

## 5. Experiments and Analyses

## 6. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest


**Figure 1.** Appearance of the multiple sensors platform: (**a**) overall picture of the multiple sensors platform; (**b**) appearance of the sensor pod.

**Figure 3.** The interference of non-target power lines: (**a**) power lines in parallel; (**b**) power lines crossing over.

**Figure 5.** The principle of task planning: (**a**) the layout of task points for the two cameras; (**b**) the shooting order of insulators for the long-focus camera.

**Figure 9.** The planned flight path and tasks, and the true flight path; the elevation in this area is color-coded: (**a**) the planned flight path; (**b**) the planned task points; (**c**) a zoomed image of the task points at tower No. 320; (**d**) the true flight path during power line inspection.

**Figure 10.** Offset between the image center and the target: (**a**) offset in the horizontal direction; (**b**) offset in the vertical direction.

**Figure 11.** The data acquired by the sensors during power line inspection: (**a**) the LiDAR point cloud, where the compass on the left displays elevation and the compass on the right displays azimuth; (**b**) the thermal infrared data, where H, L, M and C represent the maximum temperature, minimum temperature, temperature at the image center, and temperature at the crosshair position, respectively; (**c**) the photons acquired by the ultraviolet camera, superimposed on the optical image; (**d**) the image of the tower acquired by the short-focus camera; (**e**) a zoomed image of an insulator acquired by the long-focus camera.

**Figure 12.** Two anomalies on the power line: (**a**) broken insulator on tower No. 319; (**b**) a torsional damper fallen off at tower No. 327.

| Sensor Name | Product Model | Focal Length (mm) | FOV ($H\times V$) | Data Type |
|---|---|---|---|---|
| LiDAR | Riegl VZ-400 | - | - | Point cloud |
| Thermal camera | Customized | 100 | $9^{\circ}\times 6^{\circ}$ | Video |
| Ultraviolet camera | Customized | - | $9^{\circ}\times 6.75^{\circ}$ | Video |
| Short-focus camera | Canon 5D Mark II | 35 | $54^{\circ}\times 38^{\circ}$ | Image |
| Long-focus camera | Canon 5D Mark II | 180 | $11^{\circ}\times 7.6^{\circ}$ | Image |

| Parameter | Value |
|---|---|
| Scanning speed | 40 lines/s |
| Pulse repetition rate | 300 kHz |
| Starting scanning angle | $40^{\circ}$ |
| Terminating scanning angle | $105^{\circ}$ |

| Parameter | Value |
|---|---|
| Safe distance to power line in the horizontal direction | 30 m |
| Safe distance to power line in the vertical direction | 40 m |
| Safe distance of the unmanned helicopter | 50 m |
| The distance $S$ | 50 m |
| Image overlap | 30% |
| Minimum distance between task points | 5 m |
| Minimum time per task | 2 s |

| Sensor | Planned Task Number | Image Number | Success Rate |
|---|---|---|---|
| Short-focus camera | 163 | 163 | 100% |
| Long-focus camera | 78 | 73 | 93.5% |
| Total | 241 | 236 | 98% |

| Direction | Mean (m) | Standard Deviation (m) |
|---|---|---|
| Horizontal | −0.05 | 0.71 |
| Vertical | 0.04 | 0.69 |

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Xie, X.; Liu, Z.; Xu, C.; Zhang, Y.
A Multiple Sensors Platform Method for Power Line Inspection Based on a Large Unmanned Helicopter. *Sensors* **2017**, *17*, 1222.
https://doi.org/10.3390/s17061222
