Methodology for Indoor Positioning and Landing of an Unmanned Aerial Vehicle in a Smart Manufacturing Plant for Light Part Delivery

Abstract: Unmanned aerial vehicles (UAVs) are expanding their use in many areas, including last-mile distribution. In this research, a UAV is used to deliver light parts to workstation operators within a manufacturing plant, where GPS is not a valid solution for indoor positioning. A generic localization solution is designed to provide navigation using RFID received signal strength measures and sonar values. A system-on-chip computer is onboarded with two missions: first, to compute positioning and provide communication with the backend software; second, to provide an artificial vision system that cooperates with the UAV's navigation to perform landing procedures. An Industrial Internet of Things solution is defined for workstations to allow wireless mesh communication between the logistics vehicle and the backend software. The design is corroborated through experiments that validate the planned solutions.


Introduction and State of the Art
Unmanned aerial vehicles (UAVs) have expanded from military and domestic entertainment uses into distribution centers, warehouse inventory counting, and supply-chain logistics [1,2]. UAVs have high potential for parcel delivery in civil applications [3], with speed as a key capability, as shown in several pilot tests (Amazon, Google), although adoption intention is still uncertain because of many factors [4,5]. In the logistics area, the feasibility of these flying vehicles has also been established [6]. To achieve a certain expected delivery time from depots to customers, the infrastructure level is usually determined by simulation [7], and task assignments and routes have to be calculated [8]. These routes must accommodate the physical constraints (locations of goods, delivery points), as in the vehicle routing problem [9]. UAVs have shown capabilities for delivering a six-kilogram payload over sixteen-kilometer distances [5], and both the demand and the benefits in metropolitan areas have been estimated [10]. In urban and rural areas, UAVs have shown an even lower environmental impact than traditional delivery systems [11]. Their capabilities also include coordination with trucks for last-mile distribution [12][13][14][15][16][17], profitable under many scenarios [18], especially under a modular design [19]. Much important research has focused on the context of urgent goods [20], especially urgent medicine transportation [21], or situations where roads pose challenges for urgent medical supplies [22,23]. Other uses, such as a life-ring delivery system, are also present [24]. Of special interest in industry are the few studies related to the evaluation, design, simulation, and modelling of a drone fleet for transporting materials in a manufacturing plant [25][26][27][28]. To the best of our knowledge, this is an area where research has not yet focused deeply, whilst it represents a promising alternative to traditional systems.
Indoor delivery performed using UAV does not need important dedicated facilities and maintenance costs that conveyor belts and similar solutions require.
With regard to a UAV's autonomous landing, computer vision has been covered in the literature through two main approaches: (a) the detection of the natural environment (using line-feature detection of natural scenes [29] or natural landmarks [30]); (b) the use of artificial markers, where an element with a specific image pattern is placed in the landing region to be discovered and provide positioning and orientation (the traditional "H-shape" [31], square-shaped markers [32], especially ARTag [33], AprilTag [34], and ArUco [35][36][37]). Indoor environments, especially manufacturing plants, require artificial markers to be deployed to allow pattern recognition and support landing. In our research, a conventional camera is selected to create a simple and affordable solution, and the recognition algorithm is simplified to reduce computing needs and permit onboarding the system. For the workers' safety, and a proper use of the plant's volume, a relatively large distance between the flying height and the landing spot is considered; this represents a challenge, as the computer vision system has to cope with long-distance recognition of the landing pad and short-range accuracy to allow a small landing table. ArUco markers, synthetic square markers made of a wide black border and an inner binary matrix that determines their unique identifier within a dictionary, have been successfully used for object tracking [38] and landing purposes [39].
Other research has focused on providing computer-vision autonomous landing using system-on-chip devices like the Raspberry Pi, given their light weight and computing capabilities. In [40], the focus is on simplifying the image processing algorithm so that the Raspberry Pi 3 can handle the computer vision task and flight commanding; nevertheless, the use of floor features cannot provide the combination of long- and short-range recognition needed. Other studies, like [41] or [42], also solve autonomous landing based on computer vision with Raspberry Pi computers, but again do not address the challenge of identifying the landing pad while keeping accuracy on a small landing spot.
Finally, the aerial vehicle also needs a communication method to receive navigation commands, report its location, and send telemetry information to the back end. Reporting its location is key for two reasons: (a) as a means of finding the vehicle in case it does not reach its destination; (b) to provide the capability of more than one UAV delivering parts in the same plant, so that common aeronautic traffic control mechanisms can be implemented.
Consequently, three main goals have been defined for this research:

1. Provide an affordable indoor localization method for an unmanned aerial vehicle that delivers light parts inside a manufacturing plant.
2. Provide an autonomous landing system based on computer vision and affordable equipment that can be onboarded, generalizable to any manufacturing plant.
3. Provide an affordable wireless solution within a standard manufacturing plant for UAV messages (location, basic telemetry, and simple commands) to be exchanged with the backend software that manages the delivery flights for these internal operations.

Two elements of the design address the main technical challenges:

1. The positioning system: the use of RFID in conjunction with sonar readings from a different point of view (using the RSS values to obtain a coarse location, and the sonar to provide fine trajectory tracking, aided by a specific corridor design).
2. The landing system: the use of an onboarded camera that locates the right point to begin descent within the flying corridor and controls how the UAV descends (combining a circle shape for long-range control with a special arrangement of four ArUco markers that provides accurate short-range control).
In conclusion, the novelty of this work lies in the combined use of RFID RSS values and sonar readings through a navigation algorithm, the onboarding of a system-on-chip computer and camera to perform positioning and landing, and the use of a mesh network for wireless communication within a manufacturing plant, in the scope of a UAV that delivers light parts to workstation operators. This paper is organized as follows: Section 2 depicts the environment where the research is developed; the localization algorithm is described, and the flying corridors are defined; flight control and the manufacturing plant abstraction are also illustrated; computer vision is described, including the vertical descent; communication with the back-end software is designed, and the materials used are indicated. Section 3 shows the laboratory assessments performed to test the design; the positioning system is evaluated to review its localization capabilities, and the computer-vision landing system is evaluated through a series of tests. Section 4 summarizes the conclusions and possible future lines of work.

Environment
In this research, an unmanned aerial vehicle is used to deliver light manufacturing parts inside a plant without human intervention. For test purposes, a quadcopter with a cuvette underneath for transporting small, light parts within the plant was used. Table 1 depicts the most significant characteristics of the UAV. Although the proposed solution is aimed to be generic, a 190 × 100 m manufacturing plant was used for design purposes, as displayed in Figure 1. In the top right corner is the UAV wait zone, where the vehicle stands by until it is required for a delivery. Next to it, four team leaders prepare pallets containing the materials to be assembled at the assembler workstations; manufacturing orders are received by the team leader, who prepares the pallets with all the necessary parts; those pallets are taken to the assembly workstations by the feeders team, so that assemblers can focus on assembling. Team leaders will also fetch from the warehouse extra parts that might exceptionally be required by assemblers in case there is a wrong/missing part; those "urgently needed" parts are the ones delivered by the UAV directly to the assembler who raised the manufacturing incident.
Operators work in an assembly area arranged as a matrix of three columns by eleven rows. Every assembler has three working tables, where the left-most one is a landing table for the UAV to deliver the parts for that workstation. The UAV must navigate until it finds the landing table of the assembler who raised the incident, perform a vertical landing, stop its propellers, wait until the operator confirms that the part has been taken from its cuvette, perform a vertical take-off to its cruise height, and return to the UAV wait zone.
The flight must be accomplished through a security flight corridor along the perimeter of the plant. It consists of an aluminum "L" (0.9 m wide, 0.75 m tall) attached to the wall of the plant, as displayed in Figure 2. Under the "L" confined passage, a protection net is installed to prevent any accident in case there is a problem with the UAV's flight. The net is attached to the "L" and to the wall, with openings in the vertical of the landing tables.
Abstracting the path to every workstation as a combination of a perimetral approach and a transversal translation suits the case of manufacturing plants where the workstations are aligned, distributed in rows and columns. Whenever the floor layout shows scattered workstations that are not aligned, the number of transversal corridors could potentially be too high. The maximum recommended distance from the center of a landing table to the rectilinear line that its neighbors form is the width of the flying corridor.

Distance Calculation
RFID positioning has been studied in depth throughout the literature, and its challenges for obtaining an accurate distance have already been stated (as described in the Introduction). The proposal to overcome those issues and provide UAV localization is based on the idea of RSS measurement aided by a positioning technique (bilateration), sonar readings, and a proper navigation algorithm.

RSS Measurement
RSS measurement is obtained by onboarding an RFID reader module and a UHF (ultra-high frequency) antenna to detect, during the UAV's progress, a series of passive tags strategically placed in the "corridor" reserved for its flight. The position of the tags is known, and is key to obtaining better results in terms of reducing interference between different tags and reducing the multipath effect due to reflections. Figure 3 shows a plan view of the proposed method.

The UAV flies through the perimeter corridor, finding two tags on its course. The tags are placed left and right to avoid reflections; that arrangement, and a proper distance between them, allows the vehicle to receive wave fronts from the two following tags on its progression. RFID tags are placed on 10 × 10 × 10 cm 3D-printed PETG (poly-ethylene terephthalate glycol) wall anchors. The tag is glued on the front, so that it has a direct line of sight with the onboarded reader. On the right side, the attachment to the wall is made with double-sided tape. On the back, a stainless-steel sheet coated with a radio wave propagation prevention paint (HSF54) is attached, in order to prevent readings from tags already surpassed. The confined flying "L" structure is also coated, to avoid reflections and allow the reader to receive only direct readings from the tags.

In free space, the power received at the antenna is attenuated as defined by the propagation laws. Attenuation is a function of the distance to the emitter, and is explained by the Friis law:

Pr = (Pt · Gt · Gr · λ²) / ((4π)² · dⁿ · L)

where Pr is the power received at the antenna of our locator; Pt is the power emitted; Gr is the gain of the antenna receiving the signal; Gt is the gain of the antenna emitting the radio frequency wave; λ is the wavelength of the electromagnetic wave; d is the distance, in direct vision, between both antennas; n is an experimental variable whose value depends on parameters such as the difference in height between the antennas versus the wavelengths involved, and the transmission medium itself (given that the tags are placed practically at the same height as the drone's flight line, and that the medium is always air, it is common to use a value of 2 for n); L reflects the losses of the emitter-receiver set that are not determined by the wave's own propagation in the medium.

The Friis equation can be expressed in logarithmic power terms, as is most common when dealing with signal theory:

Pr (dBm) = Pt (dBm) + Gt (dBi) + Gr (dBi) + 20 log10(λ / (4π)) − 10 n log10(d) − L (dB)

Passive tags using UHF were selected, and 871.228 MHz (within the 860-960 MHz range for Gen2 tags) was the operating frequency; the corresponding wavelength for this electromagnetic radiation is λ = 0.3443 m.
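As an illustration, the logarithmic form of the model can be inverted to estimate the tag distance from a measured RSS value. The sketch below assumes illustrative values for the transmit power, antenna gains, and losses, which are not the paper's calibration:

```python
import math

def distance_from_rss(p_r_dbm, p_t_dbm=30.0, g_t_dbi=6.0, g_r_dbi=2.0,
                      wavelength=0.3443, n=2.0, losses_db=2.0):
    """Invert the logarithmic Friis model to estimate tag distance in meters.

    Model: Pr = Pt + Gt + Gr + 20*log10(lambda/(4*pi)) - 10*n*log10(d) - L
    All default parameter values are illustrative assumptions.
    """
    # Everything in the link budget that does not depend on distance:
    fixed = (p_t_dbm + g_t_dbi + g_r_dbi
             + 20 * math.log10(wavelength / (4 * math.pi)) - losses_db)
    # Solve 10*n*log10(d) = fixed - Pr for d:
    return 10 ** ((fixed - p_r_dbm) / (10 * n))
```

With n = 2, every 6 dB drop in received power roughly doubles the estimated distance, which is the behavior the navigation algorithm relies on for coarse localization.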

Bilateration
Once a theoretical measure of the distance to a specific tag is obtained, the next step is to determine the distance to the other tag within range and, based on both, perform bilateration, as seen in Figure 4.

The location in two dimensions, x and y, can be found using the circumference equation. The z dimension is not really a variable, since the flight altitude is predetermined to be constant, equal to the height of the confined corridor where the UAV flight takes place. Expressed in Cartesian coordinates, the distance d1 to a tag X1 = (x1, y1) and the distance d2 to a tag X2 = (x2, y2) are given by:

d1² = (x − x1)² + (y − y1)²
d2² = (x − x2)² + (y − y2)²

Dixon proposed a method to obtain x and y from these equations [68].
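A minimal sketch of the bilateration step, using the standard two-circle intersection rather than Dixon's exact formulation from [68]; the navigation logic would pick, of the two candidate solutions, the one lying inside the flying corridor:

```python
import math

def bilaterate(x1, y1, d1, x2, y2, d2):
    """Intersect two circles centered at the tag anchors with the
    estimated distances as radii. Returns the two candidate (x, y)
    solutions; a sketch of generic two-circle bilateration, not the
    paper's exact method.
    """
    ex, ey = x2 - x1, y2 - y1                # vector between tag anchors
    base = math.hypot(ex, ey)                # anchor separation
    ex, ey = ex / base, ey / base            # unit vector along the base line
    # Distance from tag 1 to the radical line, measured along the base line:
    a = (d1 ** 2 - d2 ** 2 + base ** 2) / (2 * base)
    # Half-chord height; clamp against noisy distances that barely overlap:
    h = math.sqrt(max(d1 ** 2 - a ** 2, 0.0))
    px, py = x1 + a * ex, y1 + a * ey        # foot of the perpendicular
    return (px - h * ey, py + h * ex), (px + h * ey, py - h * ex)
```

The `max(..., 0.0)` clamp keeps the computation stable when RSS noise makes the two estimated circles fail to intersect exactly.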

Second Measurement
UAVs are, in fact, autonomous flying robots. The use of sensors is very common in autonomous robotic systems (such as robotic arms [69] or mobile robots [70]), and therefore equipping UAVs with sensors is mandatory for the sake of positioning and collision/deadlock avoidance; combining different technologies also provides a certain redundancy in case one of the systems fails, which is a common practice in the aerospace world [71]. Following these ideas, and since RSS readings tend to deviate from the ideal model [72], sonar sensors are added to help positioning. The selected technology has long been proven in terrestrial robotic systems, measuring distances in different ranges [73]. Five sonars, corresponding to north-south-east-west and vertical (the z dimension), provide the distances to the walls inside the confined flying corridor. The vertical sonar helps maintain the right flight level, and the other four keep the distance with respect to the walls of the confined flying corridor.
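The complementary roles of the sonars can be sketched as a simple proportional correction step; the gain and the target height below are illustrative assumptions, not the values used on the vehicle:

```python
def sonar_corrections(left_m, right_m, height_m,
                      target_height_m=3.0, gain=0.5):
    """Proportional corrections from sonar readings (illustrative gains).

    The left/right sonars center the UAV in the corridor; the vertical
    sonar holds the cruise altitude. A positive lateral output pushes
    the vehicle to the right, a positive vertical output commands a climb.
    """
    # More room on the left than the right -> drifted right? No: a larger
    # left reading means the UAV sits closer to the right wall, so push left
    # is negative lateral; here positive output moves the UAV rightwards.
    lateral = gain * (left_m - right_m) / 2.0
    vertical = gain * (target_height_m - height_m)
    return lateral, vertical
```

A real controller would add filtering and rate limits, but this captures the division of labor between the lateral and vertical sonars described above.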

Navigation Algorithm
Sonar sensors working perpendicular to the forward direction of the UAV allow keeping it centered with respect to the confined corridor. Given that the vehicle's dimensions and the flying corridors are known by design, centering the UAV inside the corridor is straightforward. The flight strategy inside the plant is based on abstracting corridors and workstation landing tables (destination points) as nodes, performing navigation as shown in Figure 5:

To perform the UAV's navigation safely inside manufacturing plants, confined flying corridors with net protections are proposed. For the UAV to reach its destination carrying the load, it climbs vertically to the flying altitude and traverses the perimetral corridor (step 1). When it finds the transversal corridor corresponding to the landing table of the operator who raised the incident, it performs a turn (step 2). The next steps are finding the destination, performing a vertical landing, waiting until the operator confirms reception, climbing again to cruise altitude, and finally returning to the wait zone (steps 4 and 5).
The UAV keeps a centered flight inside the confined corridor, detecting tags during its course and deciding directions according to the tag layout stored on its Raspberry Pi computer. Every tag is defined as a node, mapping the arrangement in memory as shown in Figure 6. The algorithm is based on following the sequence of nodes through the perimetral corridor until finding tags at the crossings between the perimetral and transversal corridors, where a decision has to be taken about turning or not. In Figure 6, the destination node is on transversal corridor 2, so the UAV must turn at the second decision point, follow the transversal corridor, deliver its payload at the destination table, progress through the transversal corridor until reaching the crossing with the perimetral one, and finish its journey at the wait zone again, via the perimetral corridor.
To obtain the shortest path to a specific goal, we follow a well-known solution to this problem, similar to the all-pairs shortest-paths one; the idea is to use the shortest-path results calculated in previous stages to help determine the shortest paths in future ones [74], a performance-oriented variation of Dijkstra's algorithm. Figure 7 depicts the procedure for performing turns to (and from) transversal corridors.
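The node-based path search can be sketched with a plain Dijkstra implementation over the tag map; the graph below is a toy example with illustrative node identifiers and edge weights, not the plant's actual layout:

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over the tag-node map.

    graph maps a node ID to a list of (neighbor, edge_weight) pairs.
    Returns (total_cost, node_sequence); (inf, []) if goal is unreachable.
    """
    queue = [(0.0, start, [start])]  # (cost so far, node, path taken)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, weight in graph.get(node, []):
            if nxt not in visited:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return float("inf"), []
```

Since each tag only connects to its corridor neighbors and decision points, the graph is sparse and the search is cheap enough for the onboard Raspberry Pi.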
Over the course of most of the corridors, the UAV has line of sight with two tags at the same time. At stage 1, the UAV detects tags T113 and T114, providing its location, and has two sonars (left and right) providing accurate centering, keeping the same "x" distance with respect to both sides of the corridor; it is within the 2-sonar zone. When it progresses to stage 2, only one tag (T115) is reachable, as predicted by the node map.
When it loses the left sonar readings (the next wall is out of range), the algorithm must evaluate whether a turn is required to follow the right path towards the destination. If not, the UAV keeps flying straight, maintaining the same distance as the average value from the previous leg, using sonar readings; it is within the 1-sonar zone. Afterwards, the tag at the intersection with the transversal corridor (in this case, T116) becomes visible, announcing that the 1-sonar zone is about to finish; when the readings from the left sonar come back within range, it is confirmed that the UAV is back in a 2-sonar zone (stage 3), similar to stage 1. If, at stage 2, the algorithm requires a turn to reach a transversal corridor, a left turn is performed at the beginning of the 1-sonar zone. At that moment (stage 4), navigation is similar to that in the perimetral corridor; at the end of the transversal corridor, the next turn takes the UAV back to the perimetral corridor, to return to its wait zone.
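The zone classification that drives this logic can be sketched as follows; the maximum sonar range threshold is an illustrative assumption:

```python
def corridor_zone(left_m, right_m, max_range_m=4.0):
    """Classify the UAV's sonar zone inside the corridor.

    A reading of None (or one beyond max_range_m) means that sonar has
    lost its wall, signalling a corridor crossing. The threshold value
    is illustrative.
    """
    left_ok = left_m is not None and left_m < max_range_m
    right_ok = right_m is not None and right_m < max_range_m
    if left_ok and right_ok:
        return "2-sonar"       # both walls visible: normal centering
    if left_ok or right_ok:
        return "1-sonar"       # one wall lost: crossing, maybe turn
    return "lost"              # no walls: should not happen in a corridor
```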
While at stage 4, the UAV progresses through the transversal corridor until it reaches the tags corresponding to the landing table where its load has to be delivered (in Figure 8, S85 and S86). Passing the previous tag in the node list is the signal to reduce speed to 30%, until the opening in the net is reached and the landing table is visible from the UAV. At that moment, it descends vertically, using the vertical sonar to control the distance to the table and the visual marks on the table to center the descent, as seen in Figure 8:

Computer Vision Landing Operation
Once the UAV reaches the opening area in the flying corridor, marked by its tags, it activates the camera to begin recognition. The proposed markers on the landing table include a circle for long-range recognition and a set of four ArUco markers for short range, as depicted in Figure 9:

The dimensions of the circle allow its visibility from the flying corridor during the long-range part of the descent; when the UAV approaches the landing table, the circle is no longer visible, and the ArUco markers provide control at that stage. The following steps are defined for landing.

Finding the Opening at the Corridor to Begin Descent

At this step, in the transversal corridor, the first requirement is to determine the point where the descent must begin. While progressing through the corridor, the sonar is not operative in the forward direction, and RFID suffers from a certain error.
Fiducial markers in the landing pad of the workstation table, vertically aligned with the descent point, will be looked for and used to determine where UAV should begin descent; consequently, we need to control the x axis. The OpenCV library is used to alter the image from RGB to greyscale using discrete values from 0 to 255, and then convert it to a black and white binary image. It is morphologically transformed then to eliminate image noise and extract the contours and silhouettes of the shapes found in an opening operation (erosion and dilation processes).
The next steps determine whether the elements detected in the image correspond to a circle, as described in [75]: the aspect ratio (AR) and solidity (SL) should have values close to 1, and the extent close to π/4. The aspect ratio is the quotient between the width and the height of the shape; the solidity is the quotient between the areas of the contour and the convex hull (the convex perimeter around the points of the circle); finally, the extent is the quotient between the areas of the contour and the bounding box around it. When the circle shape is confirmed, its center is obtained, and the distance to it (x, y) is determined; given that the left and right sonars keep the UAV centered in the corridor, a distance value of x = 0 determines the descent point. The distance in pixels must be transformed to meters using the pinhole camera model [76]:

d_x = x · (D/d)

where D is the diameter of the real circle, d is the diameter in pixels, and d_x is the distance to the descent point.
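As a sketch, the three shape descriptors and the pixel-to-meter scaling can be expressed in a few lines of Python; the tolerance value and function names are illustrative assumptions, not the exact thresholds used in this work:

```python
import math

def looks_like_circle(width, height, contour_area, hull_area, box_area, tol=0.15):
    """Check the three descriptors described above against circle values."""
    aspect_ratio = width / height        # ~1 for a circle
    solidity = contour_area / hull_area  # ~1 for a convex shape
    extent = contour_area / box_area     # ~pi/4 for a circle in its bounding box
    return (abs(aspect_ratio - 1.0) < tol
            and abs(solidity - 1.0) < tol
            and abs(extent - math.pi / 4) < tol)

def pixels_to_meters(D_real, d_pixels, x_pixels):
    """Pinhole-model scaling: meters per pixel = D_real / d_pixels."""
    return x_pixels * (D_real / d_pixels)
```

For example, a perfect circle of radius 1 (width 2, height 2, contour area π, hull area π, bounding box area 4) passes all three checks, while an elongated shape fails the aspect-ratio test.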

Finding the Short-Range Markers
The UAV performs descent controlling the z variable with sonar until it gets out of the flying corridor; when the ArUco markers become visible, they are used for short-range descent, where the outer circle falls out of sight; after some tests, the sizes of the markers and the circle were chosen to keep positioning continuous. Whilst one ArUco marker can provide pose estimation for the UAV [77], in this research four are used to: (a) apply a fusion algorithm that conveniently combines their information; (b) provide another mechanism to obtain the desired landing point, as the intersection of the two diagonals through the markers' top-left corners, as shown in Figure 10. In this case, the image is transformed to greyscale, its contours are detected, and a polygonal approximation is made [78], with every item taking a binary value due to the threshold limit. The marker's unique identifier confirms that it is one of the four selected ArUco items, and consequently recognition is confirmed. The next step is to solve the well-known PnP problem (pinhole camera model), which describes the projection of a point from the 3d world coordinate system to a 2d image model [79], as seen in Figure 11.
Figure 11. PnP model procedure [80].

For a point p in the real-world coordinate system {W} centered at w0, the PnP model finds its projection on the image plane, represented by the image coordinate system {I}, through the camera coordinate system {C} (that is, PnP maps a three-dimensional scene to a two-dimensional image on an image plane). It is a three-step procedure that first transforms the coordinates of p into the camera coordinate system (using a rotation plus a translation), then projects the point p onto the image plane (using the camera focal length), and finally discretizes the coordinates in the image coordinate system (considering the size of each pixel in the CCD, the charge-coupled device of the camera) [79].
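The three projection steps can be illustrated with a minimal forward model in Python; the yaw-only rotation, focal length, and pixel size below are simplifying assumptions for illustration, not the camera calibration used in the experiments:

```python
import math

def project_point(p_world, yaw, t_cam, f=0.004, pixel_size=1e-5):
    """Project a 3d world point to pixel coordinates (pinhole model)."""
    # Step 1: world -> camera coordinates (rotation about z plus translation)
    c, s = math.cos(yaw), math.sin(yaw)
    x = c * p_world[0] + s * p_world[1] + t_cam[0]
    y = -s * p_world[0] + c * p_world[1] + t_cam[1]
    z = p_world[2] + t_cam[2]
    # Step 2: perspective projection onto the image plane (focal length f)
    u = f * x / z
    v = f * y / z
    # Step 3: discretize into pixel coordinates (CCD pixel pitch)
    return int(round(u / pixel_size)), int(round(v / pixel_size))
```

With yaw 0 and the camera 1 m away along the optical axis, the point (0.1, 0.2, 0.0) lands at pixel (40, 80) under these example parameters.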
There are several solutions in the literature for the PnP problem; the Efficient PnP (EPnP) [81] was selected because its efficiency allows running the algorithm with four ArUco markers at a time.
A linear system is generated as Mx = 0, where x is the transposed vector of unknowns and M is the matrix that combines the camera intrinsic calibration matrix, the 2d projections of the reference points, the scalar projective parameters, and the 3d coordinates of the n control points; the method simplifies the complex problem by expressing the 3d points as a weighted sum of four virtual control points [81].
Since four markers are used, fusion estimation is performed to combine the four pose estimations into a more accurate one. The problem falls within the general multisensor linearly weighted estimation fusion case, which extends the Gauss-Markov estimation to a random parameter under estimation [82]; in our case, given that m_j are the pose estimations (for j in the 1-4 interval), the unbiased estimate r' for m is:

r' = Σ_{j=1..4} w_j · m_j

where w_j are the weights to be calculated. Using Lagrange multipliers, the variance is minimized when [83]:

w_j = (1/σ_j²) / Σ_{k=1..4} (1/σ_k²)

where σ_j² is the variance of the j-th estimation.

The kinematics and dynamic model for this type of UAV has been intensely studied in the literature [84,85]. In Figure 12 a simplified representation of the quadcopter is shown, with the four thrust vectors T1-T4 generated by the electric motors. A body-fixed reference system {B}, centered on the UAV and attached to its structure, defines the attitude of the vehicle, using the roll, pitch, and yaw angles (ϕ, θ, ψ). A world reference system {W} helps define the position of the UAV as P^W = (P^W_x, P^W_y, P^W_z). Accordingly, defining g as the gravity acceleration, m as the mass of the vehicle, and f_b as the force applied to the drone in the body-fixed reference system, the dynamics can be written as [86]:

P̈^W_x = −(f_b/m)(sin ϕ sin ψ + cos ϕ sin θ cos ψ)
P̈^W_y = −(f_b/m)(−sin ϕ cos ψ + cos ϕ sin θ sin ψ)
P̈^W_z = g − (f_b/m) cos ϕ cos θ    (12)

Once the computer vision system provides the actual location (x_a, y_a, z_a) and yaw angle (ψ_a) of the destination using OpenCV, they are injected into two standard PID (proportional integral derivative) controllers: one for controlling position (considering the desired destination point), and the other for attitude control (fed by the desired yaw angle and the result of the position controller). The result is injected into the multirotor speed controller to regulate the electric motors. The PID gains were adjusted in preliminary tests to obtain a smooth trajectory.
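The fusion step above can be sketched as inverse-variance weighting, which is the standard Lagrange-multiplier solution for a minimum-variance unbiased linear combination; function and variable names are illustrative:

```python
def fuse_estimates(estimates, variances):
    """Minimum-variance linear fusion of independent unbiased estimates.

    Each estimate is weighted by its inverse variance, normalized so that
    the weights sum to 1 (keeping the fused estimate unbiased).
    """
    inv = [1.0 / v for v in variances]
    total = sum(inv)
    weights = [i / total for i in inv]
    fused = sum(w * m for w, m in zip(weights, estimates))
    return fused, weights
```

With equal variances the four pose estimations are simply averaged; a noisier marker (larger variance) contributes proportionally less to the fused pose.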

Communication
Zigbee provides an affordable communication mechanism for deploying a wireless network in the manufacturing plant, capable of delivering messages between the UAV and the backend software. A mesh network is proposed to send the UAV location to the backend, and to receive commands. In Figure 13, the proposed Zigbee node deployment is depicted. Taking advantage of the presence of operators' workstations in the plant, a Zigbee node will be deployed in each of them. Keeping visual line of sight between workstations allows messages to be forwarded along any standard manufacturing plant. In the figure, the green node represents the UAV's onboarded Zigbee node, progressing through the flying corridor. Light blue nodes represent on-the-ground Zigbee nodes, which retransmit messages. Should any node be inoperative (red node), the network itself updates routing to guarantee message delivery, using the shortest operative link at any specific time. The dark blue node (one network coordinator per PAN) manages the network and is also connected to the wired LAN (local area network); consequently, it forwards every message between the wired and wireless networks.
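The rerouting behavior described above can be sketched as a standard shortest-path computation over the operative links; this is an illustration of the routing principle (with a made-up topology), not the actual Zigbee stack:

```python
import heapq

def shortest_route(links, src, dst, failed=frozenset()):
    """Dijkstra over the operative links.

    links: {node: [(neighbor, cost), ...]}. Nodes in `failed` are skipped,
    modeling an inoperative workstation. Returns the node list, or None if
    no operative route exists.
    """
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:  # reconstruct the route back to the source
            path = [dst]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return path[::-1]
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nxt, cost in links.get(node, []):
            if nxt in failed:
                continue  # link to an inoperative node is unusable
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(heap, (nd, nxt))
    return None
```

For a toy topology A-B-D / A-C-D, the route through B is preferred while B is up; marking B as failed makes the network fall back to the route through C, mirroring the red-node scenario in Figure 13.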

Setup
The proposed method was tested in the laboratory using onboarded and on-the-ground systems with the following hardware, displayed in Tables 2 and 3. The RFID reader module is a small-sized (57 × 37 × 7 mm), low-weight (112 g), low-power-consumption (0.15 to 5 W) UHF device, with transmission capabilities up to 30 dBm. It provides two external interfaces: the antenna connector, and UART pinouts for microcontroller communication. The UART-to-USB adapter is plugged into the Raspberry Pi (on the USB side) and into the RFID reader (on the UART pins). The Winnix antenna is a compact (130 × 130 × 21 mm), lightweight (210 g) UHF ABS product with a net 4 dBi gain, which interfaces with the RFID reader module. The selected sensor is a MaxBotix 1232 I2C small (<22 mm), lightweight (6 g) ultrasonic sensor, emitting a narrow beam with one-centimeter resolution over its 20-765 cm range; it supports I2C for easy communication. The Raspberry Pi handles computer vision and communication and calculates positioning; it is connected to the flight controller via a serial connection. The NoIR camera is a lightweight compatible solution used for capturing frames during landing operations. Zigbee communication is provided by an XBee PRO module, which fits on the Zigbee USB adapter plugged into one of the four Raspberry Pi USB ports.
On the ground, every workstation is to be equipped with a Raspberry Pi computer to become a forwarding node in the mesh architecture. Every node should at least have visual line of sight with another workstation for the mesh to be able to forward messages.

Positioning System
Using the specifications given by the chosen hardware manufacturers, Equation (2) can be written with its parameter values substituted. The first experiment evaluates the value of n in Equation (2), which provides the relationship between distance and received power. An isolated tag was initially placed at a 1.5 m distance from the reader in the laboratory, with no other tags present. Power values were registered ten times, at twenty-second intervals, to provide an average value for that distance. The experiment was repeated 15 times, each time reducing the distance by 10 cm. Figure 14 represents the theoretical model, together with the average real power values from experimentation. As can be seen, the theoretical curve must be adjusted to fit the real data. Using a value of n = 2.6, a reasonable match between the fitted model and the real values is obtained, as seen in Figure 15. Consequently, the final model can be expressed as Equation (14) shows. Once a definitive model was achieved, another experiment was performed to test localization in the laboratory. Five UHF passive tags representing the perimetral corridor were located as shown in Figure 5. The reader progressed through it, while logging the calculated distance using the model. Figure 16 displays (green) the real distance measured with sonar, and the calculated distances to the five test tags. The experiment was repeated ten times and the values shown are the average.

Figure 16. Distance to tags as reader traverses perimetral corridor.
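As a sketch, the fitted log-distance model can be inverted to estimate distance from an RSS reading with the adjusted exponent n = 2.6; the reference power P0 and reference distance d0 below are illustrative values, not the calibration obtained in the experiments:

```python
def distance_from_rss(rss_dbm, p0_dbm=-40.0, d0=1.0, n=2.6):
    """Invert the log-distance path-loss model.

    The model assumes the received power falls off as
    P(d) = P0 - 10 * n * log10(d / d0), so
    d = d0 * 10 ** ((P0 - P) / (10 * n)).
    """
    return d0 * 10 ** ((p0_dbm - rss_dbm) / (10.0 * n))
```

With these example parameters, a reading equal to P0 maps back to d0 = 1 m, and every additional 26 dB of attenuation corresponds to one decade of distance.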
The figure shows how, at every moment, at least two tags provide distance readings, consequently giving the location with respect to the navigation nodes.
As expected, the distance values suffer a certain fluctuation with respect to the actual values. Figure 17 displays sonar readings (blue) versus average calculated values (orange) when using the RSS from a tag (again repeating the experiment ten times). Figure 17 displays the percentage error as well, between calculated and actual distance. Two points at small range can be considered outliers because of their extraordinary difference with respect to the actual values. Table 4 depicts those error values including all points (all), or eliminating the outliers.
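The two error figures reported in Table 4 can be computed with a small helper that evaluates mean absolute percentage error and RMSE over all points and again with outliers removed; the percentage threshold used to flag outliers here is an illustrative assumption:

```python
import math

def error_stats(actual, estimated, outlier_pct=50.0):
    """Return (MAPE all, RMSE all, MAPE w/o outliers, RMSE w/o outliers)."""
    pct = [abs(e - a) / a * 100.0 for a, e in zip(actual, estimated)]
    sq = [(e - a) ** 2 for a, e in zip(actual, estimated)]
    # Points whose percentage error exceeds the threshold are treated as outliers
    keep = [i for i, p in enumerate(pct) if p <= outlier_pct]
    mape_all = sum(pct) / len(pct)
    rmse_all = math.sqrt(sum(sq) / len(sq))
    mape_kept = sum(pct[i] for i in keep) / len(keep)
    rmse_kept = math.sqrt(sum(sq[i] for i in keep) / len(keep))
    return mape_all, rmse_all, mape_kept, rmse_kept
```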

Landing System
The landing procedure was tested on the laboratory floor using a 0.9 × 0.9 m MDF black board as the landing table on a 3 × 3 m white bed, with the proposed landing pad marker in the center of the table. It was repeated 15 times and each iteration was split into three steps. The first step is long-range landing pad detection, as shown in Figure 18. Reasonable results were obtained for the location of the landing center, as displayed in Figure 19; the circle was recognized and kept within visual range in every iteration, and for every captured frame its bounding box, ellipse approximation, and center calculation were obtained.

Figure 19. Bounding box area and detected circle.

The ArUco markers are still not within range, but the circle helps obtaining a reference from long-range distance.
The second step is to obtain positioning from both methods, for when the distance to the landing pad keeps the circle in visual range and the inner markers enter visual range as well. Once the UAV leaves the transversal corridor height, the ArUco markers become visible, as shown in Figure 20. Consistency between the two methods was checked to evaluate the discrepancy in the (x, y) coordinates given by both. The average absolute calculated errors among the 15 sets of values were 7 mm for the x coordinate and 8 mm for the y coordinate. The coordinates are represented in Figure 21, showing an acceptable discrepancy between the values.

Figure 21. Discrepancy between center coordinates evaluated via circle versus via ArUco markers.

The third step is close-range positioning, when the UAV's height is low enough to lose sight of the outer circle; pose estimation and landing point location are then performed with the ArUco markers alone. The algorithm stops the circle detector after 100 continuous frames with no circle obtained. Figure 22 displays OpenCV's coordinate reference systems for each of the markers, according to their rotation.
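The diagonal-intersection reference used as the landing point can be computed with a standard two-line intersection; the corner coordinates in the usage example below are illustrative, not measured marker positions:

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of the line through p1-p2 with the line through p3-p4.

    Points are (x, y) tuples; returns None when the diagonals are parallel
    (which would indicate a marker mis-detection).
    """
    d1x, d1y = p2[0] - p1[0], p2[1] - p1[1]
    d2x, d2y = p4[0] - p3[0], p4[1] - p3[1]
    denom = d1x * d2y - d1y * d2x  # 2d cross product of the directions
    if denom == 0:
        return None
    t = ((p3[0] - p1[0]) * d2y - (p3[1] - p1[1]) * d2x) / denom
    return p1[0] + t * d1x, p1[1] + t * d1y
```

For example, the diagonals through opposite top-left corners at (0, 0)-(2, 2) and (0, 2)-(2, 0) cross at (1, 1), the desired landing point.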
Finally, once landing was finished, the landing spot was compared to the real intersection of the diagonals through the ArUco markers' top-left corners; the landing spot was taken as the point vertically below the camera, and the distance in (x, y) coordinates was measured as displayed in Figure 23. The average absolute error was 19 mm in the x coordinate and 23 mm in the y coordinate. A certain interference with the landing table affects the last centimeters of the approach, as a ground effect, generating horizontal x and y displacements when the UAV is about to finish landing; the ground effect is expected to affect the dynamics of the operation, as already discussed in the literature [87]. Since the laboratory results are within acceptable error, this issue has not been addressed within this research.
All image processing takes place on the onboarded Raspberry Pi computer, which runs the computer vision algorithm. Figure 24 shows the computation time during the landing procedure, where four different stages can be seen. First, the camera is not active, while in the flying corridor. Second, the camera is activated in the vicinity of the opening of the corridor and searches for the long-range marker. Third, when the circle is detected, the long-range algorithm controls the descent while the short-range markers are searched for. Fourth, when the ArUco markers are found, the short-range algorithm performs the descent until landed. The computation time stays under 19 ms even in the period where the two algorithms (long and short range) are working simultaneously.
Figure 24. Computation time during landing procedure.

Conclusions and Future Work
In this research, a positioning system and a landing system are defined for an indoor light part delivery UAV within a manufacturing plant, providing a Zigbee mesh network for wireless communication with the back-end software that manages the operation. The scope of the aerial vehicle's activities is within the perimeter of the plant, where no wind-gust conditions are expected. The research has considered the drone for internal logistics operations, although other factory activities, such as automatic inventory or surveillance, could also be performed with the same positioning and landing system. The overall control of the UAV is provided by an onboarded Raspberry Pi, performing localization, computer vision for landing, and communication. The System on Chip computer also provides the nodes of the mesh network on the ground, forwarding incoming command messages for the UAV and outgoing location telemetry from the flying vehicle.
As for localization, a combination of RFID, sonar, and a proper definition of flying corridors with the operators' safety in mind is used. The conjunction of the three elements provides a solution to the well-known problems of RSS readings. Sonars provide an accurate distance to the confined flying corridor and keep the UAV centered on the right track. An improved mechanism to deploy tags, and the use of a specific coating to prevent reflections, help evade the multipath problem; consequently, the whole solution overcomes the initial lack of accuracy of the RSS positioning method, and the average error and RMSE are kept within acceptable values. The manufacturing plant layout is abstracted by representing it as a series of nodes, and an improved-performance algorithm is used to find the operator's workstation. Horizontal flight planning has been simplified as a graph of perimetral and transversal corridors, which also provides the required safety network for the operators. The nodes' graph and tag distribution are kept in the onboard Raspberry Pi computer to provide autonomous flight. Laboratory experiments showed reasonable results for keeping the UAV's location under back-end software control.
As for autonomous landing, an affordable computer vision system is designed to provide long-range and short-range localization of the landing pad and pose estimation. Descent is split into three different steps. First, finding the point where the transversal flying corridor must be left; long-range detection of a circle helps determine when the UAV should stop going forward and begin descent; descent then continues until the short-range detection mechanism, based on four ArUco markers, is within reach. Second, performing most of the descent using the short-range algorithm, while keeping the long-range one still active to double-check the destination point; fusion estimation is used to leverage the existence of more than one element and provide a better estimation; the markers' arrangement and orientation are even taken advantage of to obtain another reference (the diagonal intersection); laboratory experiments showed acceptable discrepancy between these complementary methods. Finally, the last stage uses only the ArUco markers to perform the short-range approximation to the landing table; experiments show a reasonable difference between the landing goal and the actual landing spot.
There are still many relevant fields to be further investigated and improved. First, a single UAV is considered in this research, but it is expected that, depending on the number of incidents to be attended concurrently, more than one flying vehicle will require suitable mission control and management of path intersections, as well as a system to regulate which UAV attends each raised incident. Second, the capability of the vehicle to attend more than one incident per flight; in this research one flight was considered to attend one incident, but this could be improved by delivering to more than one workstation in a single run; the payload's characteristics should be taken into consideration so that the system can decide when two parts can be delivered together. Third, the ground effect should be addressed; it was visually observed in the experiments that when the UAV is just a few centimeters from the ground, horizontal displacements occur, resulting in worse results than those achieved in the rest of the descent; the particular dynamics when the vehicle is about to perform landing should be modeled to improve landing accuracy. Fourth, the challenges that would arise when applying this methodology to a case where the UAV should perform activities both inside and outside the manufacturing plant are also an interesting continuation of this research, which focused on indoor activities.
It is expected that these next steps can be taken in the near future to extend the coverage of the research already performed.