An Embedded Platform for Positioning and Obstacle Detection for Small Unmanned Aerial Vehicles



Introduction
Flexibility, safety, customizability, high mobility and increasingly low costs, together with efforts in the fields of innovative materials, energy storage, sensors, imaging devices, electronics and computer science, have resulted in the last decade in a phenomenal development of consumer-grade and professional-grade semi-autonomous (Remotely Piloted Aircraft Systems, RPAS) or autonomous aerial vehicles (Unmanned Aircraft Systems, UAS), as well as remotely operated vehicles for ground and sea applications (terrestrial rovers, unmanned ships, underwater drones) [1][2][3]. Research on UASs has developed very rapidly due to the wide number of military and civilian applications [4][5][6][7], and to the new challenges presented by different mission scenarios and flight profiles, such as low-elevation flights, hovering, high-dynamic maneuvers, site revisiting, etc. In particular, rotary-wing UASs have been shown to offer advantages in terms of easier landing and takeoff, improved maneuverability, and capabilities for infrastructure inspection or the monitoring of small areas. Therefore, much effort is being put into raising the level of autonomy of unmanned vehicles, devising strategies for low-level autonomous flight control, positioning and environment perception, as well as high-level path planning, navigation and obstacle detection, with vision-based techniques or low-cost sensor arrangements (inertial, sonic, imaging, active radars, etc.) [8][9][10][11][12], all of which has motivated the development of specialized control architectures and solutions [13,14]. The implementation of a UAV intelligent system requires autonomous navigation, control and manoeuvring algorithms in dynamic environments (for autonomous or remotely piloted modes), in order to achieve satisfactory performance in the framework of the selected mission [15][16][17][18].
This work, extending the preliminary conceptual design recently presented in a congress paper [19], deals with the development of a UAS with a Positioning, field mapping, Obstacle Detection and Avoiding (PODA) embedded system [20], exploiting lightweight, low-cost and fast-response sensors. The chosen sensors are a 10-Degrees-of-Freedom (DoF) Inertial Measurement Unit (IMU) and a Light Detection and Ranging (LiDAR) sensor. LiDAR sensors, first used in remote sensing applications, have been proposed for the effective assessment of landing zones for small helicopters [21], but recent advances in LiDAR technology have allowed UAV users to carry reliable, small, low-price sensors [22].
In this investigation, the LiDAR is used for measuring the distance from the ground, whereas the IMU, using a combination of accelerometers, gyroscopes and magnetometers, allows us to estimate the platform velocity, attitude and gravitational forces [23][24][25]. The platform is modeled as a linear dynamic system perturbed by white Gaussian noise, with measurements linearly related to the state but corrupted by additive white Gaussian noise. With these assumptions, an optimal estimation (in the sense of minimum mean squared estimation error) of the system state variables from corrupted sensor readings (acceleration measurements) is achieved by real-time Kalman filtering, implemented on a microcontroller (Arduino Mega 2560, Arduino Srl, Monza, Italy). With respect to the other experimental configurations previously developed by the authors [26,27], the additional information provided by the LiDAR and IMU gives more accurate attitude estimations and position evolution.
A block diagram of the PODA system is depicted in Figure 1, in which data acquired from the sensors are managed by the Arduino board, which in turn hosts the algorithms and the functions shown in the blocks. The estimates of position, p̂ = (x, y, z)^T, attitude, â = (φ, θ, ψ)^T (roll, pitch and yaw angles), and distance, d̂, are used for obstacle detection and flight planning. These capabilities are needed for typical UAS applications, such as general visual inspection (GVI) [28], Detect-and-Avoid (DAA) [29] and autonomous landing [30]. The content of the paper is as follows. After this Introduction, Section 2 describes the commercial quadcopter (Parrot AR.Drone 2.0) used in this study, together with the PODA setup (sensors, microcontroller, and mathematical models for accelerometer calibration and system state estimation). Section 3 is devoted to experimental results and system validation, while conclusions and ideas for further work are outlined in Section 4.

Unmanned Aerial Vehicle
The commercial UAV selected for this research is the Parrot AR.Drone 2.0, shown in Figure 2, a small quad-rotor of the Micro UAV (MUAV, weight 0.1-1 kg, length 0.1-1 m) class. Since its inception, this fully electric quadcopter, originally designed for augmented reality games, has caught the attention of several universities and research groups as a platform for educational and robotic research [30][31][32], due to its carbon-fiber support structure which consolidates modern aeronautic technology, a rigid design and easy maintainability into a very versatile vehicle. The AR.Drone 2.0 has four high-efficiency propellers installed on direct-current brushless motors (14.5-W power absorption, 28,500 rev/min), which allow the drone to attain speeds of over 5 m/s. The vehicle is equipped with a control computer based on the 1-GHz ARM Cortex A8 processor, with up to 1 GB of RAM and a software interface provided by the manufacturer, which allows communication with the drone via standard Wi-Fi networks (b, g or n). The aircraft is controllable using a PC, a tablet or a smartphone, and can also be operated outdoors during calm weather conditions (little or no wind). Two 1500-mAh high-density lithium polymer batteries can provide up to 36 min of flight [33]. Both the control computer and the sensor suite of the drone (a 10-DoF IMU, an altitude ultrasound sensor, a bottom camera for measuring the ground speed) can be bypassed, allowing the user to build a customized programmable control board. The total mass without payload is 420 g with the external frame (indoor hull) and 380 g with the internal frame (outdoor hull). The outdoor hull configuration was used during the experimental activities, saving about 40 g (the weight of the cover) for the PODA system.

LiDAR Sensor
The small, low-power, low-cost Garmin LiDAR-Lite v3 uses pulse trains of near-infrared laser signals (905-nm nominal wavelength, 10-kHz to 20-kHz pulse repetition frequency) to measure distances by calculating the round-trip time delay of the light pulses reflected by a target. Exclusive signal processing algorithms are used to achieve high sensitivity, speed and accuracy [34]; among them, the sensor implements a receiver bias correction procedure which takes changing ambient light levels into account and optimizes sensitivity. The device runs at 5 Vdc, with a typical current absorption of 135 mA in continuous operation and a total laser peak power of 1.3 W, and it has a two-wire I²C-compatible serial interface [35]. It supports data transfer at up to a 400-kHz bus clock (I²C Fast Mode) and can be connected to an I²C bus as a slave device, under the control of an I²C master device. The sensor range is 0-40 m with a target reflecting 70% of the incident laser light, and the beam diameter at the laser aperture is 12 mm × 2 mm. The sensor characterization was performed by measuring the distance from an obstacle in the range 30-180 cm, in 5-cm steps. Data were collected at a 4-Hz sampling frequency, and each static distance measurement was acquired for 2 min (480 samples per measurement) to allow for the estimation of residual sensor bias and measurement variance as a function of the distance from the object.
Data were transferred to a PC via a simple terminal application for exchanging data through USB connections (CoolTerm [36]) and subsequently post-processed in the Matlab® (R2019a, The MathWorks, Inc., Natick, MA, USA) environment, evaluating the mean and standard deviation for each "station" (31 stations; 14,880 samples in total). Figure 4 shows the averaged LiDAR measurements in the above-mentioned distance range, and in Figure 5 the variances for each measurement are plotted. Typical values of the measurement uncertainty were found to be in the range of 1.2-2.2 cm (as shown in Figure 5, the LiDAR measurement variance σ_L² is in the range 1.5 cm² ≤ σ_L² ≤ 4.7 cm²). The average residual bias, which was removed in the experimental campaigns described in Section 3, was less than 9 cm, in good agreement with the sensor specifications [34].
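The per-station bias and variance estimation described above (mean, residual bias and sample variance over repeated static readings) can be sketched as follows; the readings below are illustrative placeholders, not the actual campaign data:

```python
import statistics

def station_stats(samples, true_distance_cm):
    """Estimate mean, residual bias and sample variance from repeated
    static LiDAR readings at one characterization station."""
    mean = statistics.mean(samples)
    var = statistics.variance(samples)   # sample variance, cm^2
    bias = mean - true_distance_cm       # residual bias, cm
    return mean, bias, var

# Illustrative readings (cm) at a nominal 50-cm station
readings = [51.2, 50.8, 51.5, 50.9, 51.1, 51.3, 50.7, 51.0]
mean, bias, var = station_stats(readings, 50.0)
```

In the actual characterization, each station would contribute 480 such samples (2 min at 4 Hz), and the per-station means and variances would then be plotted against the nominal distance, as in Figures 4 and 5.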
The relative position of the platform was estimated from the three components of the acceleration vector by means of numerical double integration (integration of the velocity, i.e., of the integral of the acceleration) [42]. Figure 6 shows a sample of accelerometer data acquired by placing the sensor in a horizontal plane (x and y axes), therefore detecting gravity along the z axis. The high sensitivity of acceleration measurements to platform vibrations, together with biases (offsets) and drifts, can severely degrade the double-integrated position estimates. Other error sources are scale factor mismatch and misalignments between the accelerometer sensing axes and the platform body axes. The relationship between the raw measurements a_x^r, a_y^r and a_z^r and the actual acceleration components is modeled as

a^r = M S a + b,   (1)

where M is the misalignment matrix, S contains the scale factors, and b collects the offsets. Equation (1) can be rearranged as

[a_x  a_y  a_z] = [a_x^r  a_y^r  a_z^r  1] · A,   (2)

which is in the form a = w · A, with w = [a_x^r  a_y^r  a_z^r  1]^T. The 12 unknown calibration parameters are the elements of the 4 × 3 matrix A. The least-squares solution gives for A the following expression:

A = (W^T W)^{-1} W^T Y,   (3)

where W stacks the vectors w^T collected during the calibration acquisitions and Y the corresponding known acceleration components. To estimate the parameters involved in Equation (2), a calibration technique based on six stationary acquisitions [43] was implemented and run as an Arduino sketch. Setting two axes in a horizontal plane, the third axis was oriented in a +1 g and a −1 g field, minimizing the cross-axis sensitivity effect. We neglected changes in gravity due to changes in altitude and latitude, and assumed for g the "local" value at a 45-degree latitude, g_45 = 9.8066 m/s² [44]. As far as the ITG-3250 gyro is concerned, bias calibration (that is, zero output with a stationary gyro) was performed using the standard functions (zeroCalibrate and getXYZ) included in the libraries hosted by the Arduino IDE (Integrated Development Environment) [45].
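The six-position least-squares calibration can be sketched as follows; the bias and scale-factor values are invented for illustration, and NumPy's `lstsq` stands in for the normal-equations solution A = (WᵀW)⁻¹WᵀY run on the microcontroller:

```python
import numpy as np

def calibrate_accelerometer(raw, ref):
    """Solve a = w . A in the least-squares sense (Equations (2)-(3)).
    raw : (n, 3) raw accelerometer readings a^r
    ref : (n, 3) known reference accelerations (e.g. +/-1 g along each axis)
    Returns the (4, 3) calibration matrix A (12 parameters)."""
    W = np.hstack([raw, np.ones((raw.shape[0], 1))])  # rows w = [ax^r, ay^r, az^r, 1]
    A, *_ = np.linalg.lstsq(W, ref, rcond=None)       # least-squares solution for A
    return A

# Six stationary orientations: each axis aligned with +/-1 g
g = 9.8066
ref = np.array([[ g, 0, 0], [-g, 0, 0], [0,  g, 0],
                [0, -g, 0], [0, 0,  g], [0, 0, -g]], dtype=float)

# Synthetic raw readings with an invented per-axis bias and scale factor
bias = np.array([0.1, -0.05, 0.2])
scale = np.array([1.02, 0.98, 1.01])
raw = ref / scale + bias

A = calibrate_accelerometer(raw, ref)
corrected = np.hstack([raw, np.ones((6, 1))]) @ A     # apply Equation (2)
```

Since the synthetic error model is affine, the recovered A reproduces the reference accelerations essentially exactly; with real, noisy data the fit is only approximate, and more than six acquisitions can be averaged per orientation.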

Wi-Fi Module
To provide wireless connectivity between the PODA platform and the PC-based ground station, the 3.3-V, 2.4-GHz Wi-Fi module ESP8266EX (Espressif Systems, Inc., Shanghai, China) was selected (Figure 7). This programmable, user-friendly, low-cost module can work both as an access point and as a station, fetching data from the microcontroller via an I²C interface and uploading them to the remote station, and can be used for a whole host of applications, from home automation to mobile devices, wearable electronics, the Internet of Things (IoT) and Wi-Fi positioning system beacons. The module implements TCP/IP and the full 802.11b/g/n WLAN MAC protocol, with an average operating current of 80 mA, integrating antenna switches, a power amplifier and a low-noise receive amplifier, together with filters and power management subsystems. Interfacing with external sensors and other devices is achieved via on-chip SRAM (at least 512 kB) and a 32-bit RISC processor (L106 Diamond series, Tensilica, Inc., San Jose, CA, USA) [46,47].

Microcontroller and Assembly
The Arduino Mega 2560 board acquires data from the IMU and the LiDAR, performs signal conditioning (a simple 1-D Kalman low-pass filtering for noise reduction of each acceleration component and of the LiDAR measurements), and sends them to a PC laptop. Post-processing and mapping of the area that surrounds the UAV are performed in the Matlab® environment. Figure 8 shows the data acquisition architecture: sensor data are gathered and managed through digital input pins, with a standard I²C connection. Data were transferred to the PC-based ground station via a USB communication port during the laboratory tests with the prototype not installed on the drone, and via the ESP8266EX Wi-Fi module during the data acquisition campaigns. Figure 9 shows the PODA prototypical version mounted onboard the quadcopter. Preliminary static tests were set up for accelerometer calibration, estimating the parameter matrix A of Equation (3), and to determine measurement noise and accelerometer bias. The LiDAR measured distances along the z axis, whereas the x-y position and the velocity components were estimated through Kalman filtering of the accelerometer data, as explained below.
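The 1-D Kalman low-pass pre-filter used for signal conditioning can be sketched as a scalar filter with a random-constant state model; the noise variances q and r below are illustrative placeholders, not the values tuned onboard:

```python
class Kalman1D:
    """Scalar Kalman filter used as a low-pass smoother
    (random-constant state model, as in the 1-D pre-filtering step)."""
    def __init__(self, q, r, x0=0.0, p0=1.0):
        self.q, self.r = q, r    # process / measurement noise variances
        self.x, self.p = x0, p0  # state estimate and its variance
    def update(self, z):
        self.p += self.q                 # predict
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # correct with measurement z
        self.p *= (1.0 - k)              # variance update
        return self.x

# Smooth a noisy, nominally constant LiDAR distance stream (illustrative, cm)
readings = [50.8, 49.1, 50.5, 49.7, 50.2, 49.9, 50.4, 49.8]
kf = Kalman1D(q=1e-4, r=2.0, x0=readings[0])
smoothed = [kf.update(z) for z in readings]
```

With a very small q, the filter behaves like a heavily smoothing low-pass; increasing q makes it track fast changes (e.g., descents between hovering stations) more closely.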

Platform State Estimation
A minimum-mean-square-error (MMSE) estimation of the six-element platform state vector x = [x, y, z, v_x, v_y, v_z]^T (position and velocity components) is implemented using Kalman filtering, typically described as an iterative prediction-update-correction strategy [48]. The measurement noise v and the system process noise w are assumed to be zero-mean and Gaussian-distributed. The PODA platform is modeled as a discrete linear dynamic system:

x_k = F_k x_{k−1} + B_k u_{k−1} + w_k,   (4)

where x_k is the value of x at time t_k = kΔt, F_k is the state transition matrix, containing the (generally time-dependent) coefficients of the state terms in the state dynamics, B_k is the input matrix, relating the state to the inputs u_{k−1}, and w_k is the process noise vector, with zero mean and covariance matrix Q_k. The three calibrated and pre-filtered IMU-derived acceleration measurements (u_k = [a_x, a_y, a_z]^T) are modeled as inputs to the system. The measurement model is:

z_k = H_k x_k + v_k,   (5)

where z_k is the k-th measurement vector, H_k is the observation matrix, and v_k is the measurement noise vector, with zero mean and covariance matrix R_k, all of appropriate dimensions and generally dependent on t_k. The simple models used for the state transition matrix and the input matrix of Equation (4), and for the observation matrix of Equation (5), are, respectively:

F_k = [ I₃  Δt·I₃ ; 0₃  I₃ ],   B_k = [ (Δt²/2)·I₃ ; Δt·I₃ ],   H_k = I₆,   (6)

where Δt is the sampling time, I₃ and I₆ are the 3 × 3 and 6 × 6 identity matrices, and 0₃ is the 3 × 3 zero matrix. The predictor stage is as follows:

x̂_{k|k−1} = F_k x̂_{k−1} + B_k u_{k−1},   (7)
P_{k|k−1} = F_k P_{k−1} F_k^T + Q_k.   (8)

In Equation (7), x̂_{k|k−1} is the predicted state (i.e., the state at discrete time t_k), given the previous state estimate x̂_{k−1}, evaluated at time t_{k−1}. For simplicity, the 6 × 6 matrix Q_k has been set to 0, ∀k (no process noise). Equation (8) predicts the state error covariance matrix P_{k|k−1} at t_k, which allows us to calculate the Kalman gain:

K_k = P_{k|k−1} H_k^T (H_k P_{k|k−1} H_k^T + R_k)^{−1}.   (9)

In the corrector stage, the new estimate is updated from the current measurement, the old estimate and the value of K_k:

x̂_k = x̂_{k|k−1} + K_k (z_k − H_k x̂_{k|k−1}),   (10)

where z_k − H_k x̂_{k|k−1} is the innovation, i.e., the measurement pre-fit residual. The state error covariance is updated as follows:

P_k = (I − K_k H_k) P_{k|k−1},   (11)

where I is the 6 × 6 identity matrix. The value of R_k, estimated from static and dynamic test measurements (see Table 1), was assumed constant. The KF, implemented in the Arduino IDE, ran in real time during the data acquisition sessions, using as input the x-, y- and z-components of u_k. Post-processed numerical double integration of the pre-filtered calibrated acceleration components gave the platform position, which was in good agreement with the real-time KF-derived position.
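The predictor and corrector stages described above can be sketched in a few lines; the matrices follow the constant-acceleration kinematic model of the text, while the measurement covariance R and the sample vectors are illustrative placeholders:

```python
import numpy as np

dt = 0.1                                    # sampling time, s (10-Hz rate)
I3, Z3 = np.eye(3), np.zeros((3, 3))

F = np.block([[I3, dt * I3], [Z3, I3]])     # state transition matrix
B = np.vstack([0.5 * dt**2 * I3, dt * I3])  # input matrix for accelerations
H = np.eye(6)                               # observation matrix
Q = np.zeros((6, 6))                        # no process noise, as in the text
R = np.eye(6) * 0.01                        # illustrative measurement covariance

def kf_step(x, P, u, z):
    """One predict/correct iteration of the 6-state Kalman filter."""
    x_pred = F @ x + B @ u                                   # state prediction
    P_pred = F @ P @ F.T + Q                                 # covariance prediction
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)                    # correction
    P_new = (np.eye(6) - K @ H) @ P_pred                     # covariance update
    return x_new, P_new

x, P = np.zeros(6), np.eye(6)
u = np.array([0.0, 0.0, 0.2])                     # pre-filtered accelerations
z = np.array([0.0, 0.0, 0.001, 0.0, 0.0, 0.02])  # illustrative measurement
x, P = kf_step(x, P, u, z)
```

On the microcontroller, the same cycle would run once per 0.1-s sampling interval, with R fixed to the covariance estimated from the static and dynamic tests.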

Experimental Results and System Validation
During the experimental sessions (drone equipped with PODA), data were collected in 2-min acquisitions at a 10-Hz sampling frequency (i.e., one measurement every Δt = 0.1 s, 1200 samples per experiment). A first validation of the PODA system was performed by installing the platform on the drone and acquiring data during a 2-min landing procedure from a height of 150 cm (initial position [0 cm, 0 cm, 150 cm]) above ground level, hovering for about 20 s at intermediate stations (120 cm, 90 cm, 60 cm and 30 cm). The LiDAR pointed towards the ground and gathered vertical distances, whereas the landing point was located at [100 cm, 50 cm, 0 cm]. The distance between the starting and the landing points was 190 cm.
The 10-DoF IMU collected x-y positioning data (by double integration of a_x and a_y, after removing bias and measurement noise) and attitude (roll and pitch angles, by integration of the rate-gyro measurements, after removing bias and drift). Double integration of the z-component of the acceleration was used for comparison and fusion with the distance data acquired by the LiDAR, to attain an estimate ẑ of the z position during the descent. A simple data fusion technique [49], assuming the optimal (minimum-variance) estimate ẑ to be a linear function of the LiDAR data (z_L) and of the double-integrated accelerometer measurements (z_a), with variances σ_L² and σ_a², respectively, was implemented to get ẑ:

ẑ = (σ_a² z_L + σ_L² z_a) / (σ_L² + σ_a²) = z_a + [σ_a² / (σ_L² + σ_a²)] (z_L − z_a).   (13)

The rightmost expression of Equation (13) shows that the data fusion methodology is equivalent to a simple 1-D KF with F = 1, R = σ_L² and H = 1, z_L being the measurement and z_a being the "state". Figure 10a shows LiDAR-derived distances (raw and filtered measurements) during the indoor landing simulation, with 20 s of hovering at each of the stations previously mentioned and a 2-s descent to the successive station. In Figure 10b the attitude history (pitch and roll angles), recorded by the calibrated IMU's rate gyro, is shown. A small offset (on the order of ±0.2 m/s²), due to small misalignment errors, is noticeable on both a_x and a_y. The offline calibration procedure previously described (Section 2.3) allowed us to compensate for this bias and remove drift in the double-integration procedure. Figure 12 compares the true landing path to the path obtained from raw LiDAR and IMU data, and from filtered observations. The estimated trajectory (filtered data) shows a reduction of the drift in the x and y positions due to the double-integrated acceleration measurements ("Raw data" in the figure). The z position has been obtained by data fusion (Equation (13)). Ground truth data, necessary for assessing the quality of the estimated trajectory, were collected by measuring, during each hovering interval (approximately 20 s), the distance from the ground and the xyz position with a laser range finder (Sndway® ST-100, ±2-mm accuracy, built by Dongguan Senwey Electronic Co., Guangdong, China). The measured stations are indicated by circles on the "Real data" trajectory.
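The minimum-variance combination of Equation (13) reduces to an inverse-variance weighted average of the two height estimates; a sketch with illustrative heights and variances:

```python
def fuse(z_lidar, var_lidar, z_accel, var_accel):
    """Minimum-variance fusion of two estimates of the same height
    (Equation (13)): equivalent to a 1-D Kalman update with the
    accelerometer-derived value as the 'state' and the LiDAR value
    as the measurement."""
    k = var_accel / (var_lidar + var_accel)                # Kalman-like gain
    z = z_accel + k * (z_lidar - z_accel)                  # fused estimate
    var = var_lidar * var_accel / (var_lidar + var_accel)  # fused variance
    return z, var

# Illustrative values: LiDAR height 120.5 cm (variance 1.0 cm^2),
# double-integrated accelerometer height 119.0 cm (variance 4.0 cm^2)
z_hat, var_hat = fuse(120.5, 1.0, 119.0, 4.0)
```

Note that the fused variance σ_L²σ_a²/(σ_L² + σ_a²) is always smaller than either input variance, which is why the fusion step improves on the LiDAR-only height estimate.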
The PODA platform was subsequently tested in indoor and outdoor data acquisition campaigns. Figure 13 shows raw and estimated platform positions during a takeoff (data acquisition began at zero height), followed by a climb up to 150 cm. In Figure 14, a complete flight (3 min) with random maneuvers is shown, with data acquired and downlinked to the ground station from just after takeoff to a few seconds before landing. Figure 15 shows data acquired during a landing phase, from 150 cm to zero height. Finally, a GVI mission was simulated, using the PODA system onboard the drone during three indoor flights around an aluminum plate (dimensions 1300 mm × 1500 mm × 3 mm) located at a 400-mm height. Results are plotted in Figure 16. Table 1 shows measurement uncertainties evaluated at three hovering stations of the simulated landing (120 cm, 90 cm and 30 cm, respectively). Variances of raw and filtered LiDAR measurements, of the LiDAR-IMU distance from the ground and of the acceleration components (a_x, a_y) are reported. The filtered data show up to 55% improved accuracy. For example, the variance of the LiDAR-derived distance at 120 cm decreased from 1.69 cm² (before filtering) to 1.00 cm² after filtering. The data fusion strategy adopted for height estimation gives even better accuracy (from 1.00 cm² to 0.40 cm², a 60% improvement).

Conclusions and Future Work
The main objective of this investigation is the development and validation of an embedded system for Positioning, field mapping, Obstacle Detection and Avoidance (PODA) for a UAV. Lightweight, low-cost sensors (LiDAR and 10-DoF IMU), a Wi-Fi module for data downlink to a ground station (PC or tablet) and a programmable microcontroller (Arduino Mega 2560) with a standard I²C interface are the main components of the PODA platform, which has shown adaptability, sustainability and cost reduction. The platform is tailored to fit different aerial vehicles, and its functionality is suitable for different applications that require positioning, field mapping and obstacle detection. The sensors' calibration procedure and the platform's state estimation technique have been described, together with a real-time pre-filtering (signal conditioning) technique for measurement noise mitigation. A simple data fusion methodology has been applied to the distance from the ground measured by the LiDAR and the altitude derived by double integration of the vertical component of the acceleration. The algorithms were tested both in hardware-in-the-loop simulations and on an actual UAV. Experimental results from indoor and outdoor campaigns and different scenarios (landing and inspection flights) with the PODA platform onboard a commercial quadrotor (Parrot AR.Drone 2.0) have shown that double integration of the calibrated acceleration data, fused with raw LiDAR measurements, reduces uncertainties to the centimetre level. This feature can be effective for maneuvering the drone in critical conditions, detecting obstacles or intruders, and performing precise hovering and landing procedures. As expected, the Kalman-based estimation of the platform state vector significantly reduces the uncertainties regarding the vehicle's position and trajectory.
Future research activity will focus on developing fully automatic obstacle detection within the capabilities of the PODA system, and on analyzing the influence of changes in the angle to the target (i.e., platform attitude) on the accuracy of LiDAR distance measurements. More complex indoor and outdoor conditions will allow us to validate the algorithms in different scenarios, and to explore the influence of different external factors on system performance. In order to design controllers and implement a DAA strategy, which is one of the functionalities foreseen for the PODA system, a Simulink model of the quadcopter dynamics is currently being developed. Further capabilities of the PODA platform, such as object and/or pattern recognition for safe landing area identification, are being studied for future conceptual improvements, focusing on strategies that would require little computational effort and a minimal amount of hardware. A promising approach could be a "learned sensing" strategy, relying on machine learning techniques [50]. Multisensor data fusion could also be improved by using methodologies that are useful in the context of non-Gaussian conditions and unknown or complicated cross-correlations between sensors, such as the arithmetic average (AA) fusion rule [51,52]. Finally, improved estimation techniques, such as the extended KF to account for nonlinearities, the unscented KF or particle filtering, will be tested for their capacity to provide autonomous flight and collision avoidance capability with a reliable and cheap sensor suite, while still keeping the computational cost low and reducing maintenance and development costs.

Figure 1. Conceptual block diagram of the Positioning, field mapping, Obstacle Detection and Avoiding (PODA) subsystem.

Figure 3 shows typical LiDAR-derived measurements, from an obstacle at a nominal distance of 50 cm. Data were collected in static conditions for 120 s, at a 4-Hz sampling rate (0.25-s sampling time). During the tests, a real-time noise-removal procedure developed by the authors [19] was performed using a simple 1-D low-pass Kalman filter (KF).

Figure 4. LiDAR distance measurements after bias removal. Each dot represents the average of 480 measurements at a 4-Hz sampling frequency, i.e., a 120-s acquisition time per distance value.

Figure 6. Raw accelerometer data (static test, no filtering). The nonzero x- and y-components are due to non-optimal alignment between the accelerometer axes and the body reference system.

Figure 8. Electrical interface of the data acquisition and Wi-Fi communication system.

Figure 9. PODA platform (prototypical version) mounted onboard the quadcopter. Each square of the ruler on the left side of the image has a 1-cm side length.

Figure 10. (a) Landing with intermediate hovering stations: raw and filtered LiDAR distances from the ground. (b) Raw and filtered calibrated attitude data (pitch and roll).

Figure 11a,b show the raw and filtered x- and y-components of the measured acceleration, respectively. The high-frequency components have been attenuated by the pre-filtering.

Figure 13. Takeoff and climb to a 150-cm height: raw data and KF-based position estimation.

Figure 15. Landing from a 150-cm height to the ground.
In Equations (1) and (2), M = (m_ij) is the 3 × 3 misalignment matrix, s_x, s_y and s_z are the scale factors, and b_x, b_y and b_z are the offsets; a = [a_x a_y a_z]^T, with T denoting transposition, is the actual acceleration vector (calibrated and normalized, i.e., with √(a_x² + a_y² + a_z²) = 1 in a static 1-g field).

Table 1. Variances of raw and filtered LiDAR and IMU data (x- and y-components of accelerations, and double-integrated z-component, fused with LiDAR data).