Article

Contributions to Image Transmission in Icing Conditions on Unmanned Aerial Vehicles

by
José Enrique Rodríguez Marco
1,2,*,
Manuel Sánchez Rubio
1,
José Javier Martínez Herráiz
2,
Rafael González Armengod
1 and
Juan Carlos Plaza Del Pino
1,3
1
National Institute of Aerospace Technology, 28850 Madrid, Spain
2
Department of Computer Science, University of Alcalá, 28805 Madrid, Spain
3
National University of Education at Distance, 28015 Madrid, Spain
*
Author to whom correspondence should be addressed.
Drones 2023, 7(9), 571; https://doi.org/10.3390/drones7090571
Submission received: 31 July 2023 / Revised: 25 August 2023 / Accepted: 31 August 2023 / Published: 5 September 2023
(This article belongs to the Section Drone Communications)

Abstract
In manned aircraft, pilots usually detect icing conditions by visual cues or by means of ice detector systems. If the crew sees one of these cues or the systems detect icing conditions, they must apply the evasive procedure defined in the aircraft flight manual (AFM). In unmanned aircraft, however, there are no pilots on board and, consequently, nobody can act immediately when icing conditions occur. This article proposes new techniques for sending information to the ground that make it possible to assess aircraft performance correctly in icing conditions. To this end, three contributions have been developed for the Milano unmanned aircraft. Icing conditions are characterized quantitatively by the droplet size, the liquid water content, and the total air temperature; when these parameters lie between certain limits, ice may form on the aircraft. As a result of these contributions, at that moment, high-quality images of the wing leading edge, tail leading edge and meteorological probes will be captured and sent to the ground, making it possible for remote pilots or artificial intelligence (AI) systems to follow the appropriate procedures, avoid encounters with severe icing conditions and perform real-time decision making. Furthermore, as information security is becoming an inseparable part of data communication, a method for embedding relevant information within an image is proposed. Among the improvements included are image compression techniques and steganography methods.

1. Introduction

Unmanned aerial systems (UAS) typically consist of an airborne segment, the unmanned aerial vehicle (UAV) with an appropriate payload (PL), and a ground control station (GCS). INTA (Instituto Nacional de Técnica Aeroespacial) has gained a solid reputation over the last decades in the development of new UAS. One of the UAVs developed and integrated by INTA is the Milano [1], a Type III UAV (according to NATO classification [2,3]) that provides real-time reconnaissance [4], surveillance and target acquisition. Moreover, a wide variety of scientific payloads can be integrated into the system thanks to the modularity of the design, the weight, volume and electrical power capacity of the aircraft, and the advanced communications system implemented.
One of the future objectives of INTA is to carry out flight tests in order to study the effects of in-flight icing on UAV aerodynamic performance [5]. Some unmanned aerial vehicles [6] have data acquisition systems (DAS) with telemetry (TM) and telecommand (TC) units [7], others have flight control systems with data transmission through a radio-modem, and others, such as the Milano UAV, have both systems working at the same time [8].
Milano is a long-endurance [9] tactical unmanned aerial system used at medium altitudes, whose geometry produces a very small radar signature, which reduces the possibility of being detected during missions [10]. Moreover, the Milano UAV, made mostly in carbon-fiber composite, has a wingspan of 12.5 m, a total length of 8.5 m and a height of 2.3 m. The aircraft has an operational ceiling of 7500 m above sea level and an endurance of 20 h of flight time (see Table 1).

1.1. Objective

The main purpose of this article is to put forward various models that improve image transmission to the ground from UAVs when icing conditions [11,12] occur, allowing the remote operator or an autonomous artificial intelligence [13,14,15] to act immediately if necessary.
In order to accomplish this objective, the following specific goals are put forward:
  • Define when aircraft icing is likely to occur.
  • Develop a model to classify parameters into groups according to their importance for the prediction and sensing of ice formation.
  • Analyse initial conditions, taking into consideration bandwidth limits and the equipment used.
  • Define various models, based on theoretical estimates, that improve on the initial capabilities.

1.2. Justification

Aircraft icing can affect flight performance and safety in many ways [16,17]. Ice on an airfoil changes its properties, lessening lift and increasing drag for a given angle of attack. What’s more, ice on a propeller blade reduces the efficiency of the propeller and thrust falls off. In addition, the weight of the ice increases the aircraft weight and therefore the lift required (see Figure 1). Because for a given airspeed the angle of attack must be increased to provide the necessary lift, the critical stall speed will increase when ice is present [18].
By disrupting the airflow over the ailerons or elevator, small amounts of ice can alter the aerodynamic balance of the controls and potentially render the aircraft uncontrollable. Wing icing can lead to a decrease in the airfoil stall angle of attack, in other words, an ice-contaminated wing will stall at a lower angle of attack or higher airspeed than a clean wing. Moreover, it can disrupt the airflow over the ailerons and cause the aircraft to behave in unusual ways. The aileron may deflect without autopilot input and cause an uncommanded roll.
Tailplane icing can disrupt the airflow under the stabilizer and make the elevator less effective. The elevator may oscillate without remote pilot input and cause an uncommanded pitch change. Therefore, increasing tailplane angle of attack with ice on the tail may cause a tailplane stall, especially in flap downwash flows. What is more, in extreme cases, the yoke might snatch forward, immediately pitching the nose down [19].
In addition, icing increases the fuel consumption of the aircraft, so any solution to the problem will also be environmentally friendly [20].
Currently, there are not many previous studies that consider unmanned aircraft in icing conditions. How icing affects an unmanned aircraft’s performance, and ways to mitigate its impact on airframes, is a major area of research and development (R&D). Moreover, flight tests [21,22] on unmanned aircraft have the advantage that no human life is put at risk [23], so more severe conditions can be tested, for instance higher levels of ice formation or flight tests in types of clouds beyond those contemplated in the engineering standard [24,25] for atmospheric icing (14 CFR Appendix C to Part 25).
In this context the proposed improvements will allow remote pilots to know accurately when icing conditions take place and acquire detailed knowledge of ice formation on wing and horizontal stabiliser in real time.

1.3. Limitations

These improvements have been implemented for the Milano and can be extended to other UAVs with a radio-modem, such as the ALO [26] (Spanish acronym for Light Observation Aircraft), or with a data acquisition system, such as the SIVA (Spanish acronym for Integrated Aerial Surveillance System), both conceived as real-time reconnaissance and surveillance drones and also developed by INTA.
The instrumented aerial vehicle is controlled and monitored in real time from a ground control station either by radio-electrical line-of-sight link or by satellite link. On the one hand, line-of-sight communication has the transmitter and receiver antennas in visual contact with each other without any sort of an obstacle between them. Potential obstructions can include structures, natural features like mountains or trees, or meteorological features such as clouds and fog. Not only can the signal be lost by obstacles, but also antennas can lose effectiveness in certain maneuvers performed by the aircraft or even may result from intentional or inadvertent jamming of the signals. On the other hand, satellite link uses the Spanish network of governmental satellites (XTAR and SPAINSAT) and allows the execution of missions beyond the horizon line [27,28].
It is important to note that flight data are recorded on the onboard computer, but it is currently not possible to store data from flight test instrumentation (FTI), icing data or the images provided by the payloads. Therefore, if any obstruction between the UAV and the remote pilot or ground control station occurs during flight, this information may be lost.
Recommended Standard 232 (RS-232) uses low-speed data transmission, whose rate decreases with distance due to capacitance effects. In contrast, modern data acquisition systems [29] use bit rates of up to 12 Mbits/s and, consequently, cover flight test requirements better.

1.4. Initial Considerations

Data link is the term used to describe communications between a UAV and a GCS [30]. On the one hand, there is a downlink carrying telemetry signals and an uplink carrying telecommand signals. The telemetry data include FTI data, icing parameters and images of the wing and tailplane leading edges and of the meteorological probes. All this information is provided by the DAS. On the other hand, the Milano UAV has a radio-modem (RM) link [31], a redundant link for both telemetry and telecommand. The radio-modem uses RS-232 serial communication, and information is transmitted to the GCS in the 902–928 MHz band. It is important to highlight that this second link does not involve the DAS; its telemetry carries only flight data, that is, information from the air data system (ADS), magnetometer, inertial measurement unit (IMU) and global positioning system [32] (GPS). In addition, the Milano has a radio control (RC) link that is used if the other links fail.
Figure 2 includes a block diagram of flight test instrumentation used for the acquisition and transmission of information related to flight test.
Figure 3 illustrates the main antennas included in the Milano UAV: TM antenna, TC antenna, RM antenna, RC antenna and GPS antenna.
In addition, sensors included on the unmanned aircraft have the qualification for measuring the following parameters (see Figure 4):
It is important to outline that the Milano UAV has two underwing probes [33]. Under the left wing, there is a cloud aerosol and precipitation spectrometer [34] (CAPS). This probe measures concentration and records images of cloud particles from approximately 50 to 1600 μm in diameter. In addition, the CAPS probe measures total air temperature and cloud droplet and aerosol concentrations within the size range from 0.5 to 50 μm. Consequently, it provides the MVD (Median Volumetric Diameter), LWC (Liquid Water Content) and TAT (Total Air Temperature) parameters. Under the right wing, there is a passive cavity aerosol spectrometer probe [35] (PCASP). This probe measures aerosol particles in the 0.1 to 3 μm range. As a result, it provides the MVD parameter.
All parameters are defined either in 32-bit (4-byte) floating point according to ANSI/IEEE Standard 754 or in 16-bit (2-byte) integer format. Of the 40 parameters, thirty-seven are FTI parameters and three are icing parameters [36,37].
On the one hand, data is grouped into three categories. Firstly, those parameters with a very high sampling rate are included, for instance, accelerations and angular velocities in the X, Y and Z axis. Secondly, several measurements of angles are included (of attack, sideslip, flight path, pitch, roll and yaw) and others such as LWC and MVD. Within the last group are the parameters with the lowest sampling rate. The number of samples sent to ground does not change during all phases of flight. Therefore, data reception in the ground station is uniform and linear.
It is important to note that the MVD, LWC and TAT parameters define when aircraft icing is likely to occur [38,39]. For the icing phenomenon to happen, the ideal limits are the following: total air temperature from 0 °C to −10 °C, median volumetric diameter greater than 10 μm, and liquid water content greater than 0.5 g/m³.
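These thresholds can be expressed as a simple predicate. The sketch below is illustrative only; the function name and argument units are our own and are not part of the Milano software:

```python
def icing_likely(tat_c: float, mvd_um: float, lwc_g_m3: float) -> bool:
    """True when TAT, MVD and LWC all lie inside the ideal icing limits
    given above: 0 °C >= TAT >= -10 °C, MVD > 10 um, LWC > 0.5 g/m^3."""
    return (-10.0 <= tat_c <= 0.0) and mvd_um > 10.0 and lwc_g_m3 > 0.5

print(icing_likely(-4.0, 22.0, 0.7))   # True: all three limits are met
print(icing_likely(5.0, 22.0, 0.7))    # False: TAT is above 0 degrees C
```

A predicate like this is what would trigger the capture and transmission of the high-quality leading-edge images described later.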
Figure 5 illustrates the main sensors included in the Milano unmanned aircraft.
On the other hand, in order to observe icing in the most critical areas, cameras are integrated in the leading edges of the wing and stabilisers. Both the wing and the stabilisers are tapered, so icing will start at the wing and stabiliser tips. Therefore, four cameras are installed to monitor the last 0.5 m of each wingtip and stabiliser tip. To improve visual detection of the ice, high-contrast lines are painted, allowing a better view of the ice, especially in the case of glaze ice [40]. As the aircraft skin is white, these lines are painted matt black for greater color contrast.
Moreover, two more cameras are integrated in the lower forward fuselage to monitor ice formation on the CAPS and PCASP probes. In addition, the probes themselves can generate images of droplets passing through them. Figure 6 shows the icing instrumentation integrated in the UAV.
Figure 7 shows high contrast lines and the location of one of the cameras included in the Milano UAV.
The inter-range instrumentation group (IRIG) pulse code modulation (PCM) coding is part of the instrumentation and data acquisition system in relation to the operation of the flight test vehicle. Its role is to provide a safe transmission allowing the sending to ground, recording and monitoring of flight data in real time.
PCM encoding is the process of sampling, quantization, and encoding, to produce an encoded signal in a binary format. PCM data are transmitted one bit at a time (serial transfer) in words of 16 bits. In the case of telemetry transmission of data, serial transfer saves radio frequency (RF) spectrum bandwidth.
The mode used for telemetry from the aircraft to the ground station is synchronous transmission, and, as the name implies, data are sampled and sent in a regular, predictable and continuous stream. Each data parameter is identifiable by its own unique timeslot. Even if a parameter has the same value as the last time it was sampled, it must be transmitted regardless because that timeslot has been assigned to it. Obviously, if most of the data is static, much redundant data is transmitted. Therefore, the disadvantage of this method of data transmission is that for data changing value much slower than the sampling rate, bandwidth is wasted [41].
In accordance with the IRIG 106 standard [42], once data are sampled and digitized, the information is organized in a matrix form called PCM frame. This is made up of minor frames (also called subframe) and a major frame (also called frame): the first one is defined as the data structure in time sequence from the beginning of a minor frame synchronization pattern to the beginning of the next minor frame synchronization pattern, while the second one contains the number of minor frames required to include one sample of every parameter in the format.
Furthermore, a pseudo-random sequence of N bits (easy to recognize on reception) is included in the first words of each subframe to achieve synchronization and detection of subframes; these words are known as synchronization words (SYNC). Similarly, the subframe synchronization method identifies the major frames by including a parameter called the subframe identifier (SFID) in all minor frames, at a known position in the array [43].
In Figure 8 a frame of 128 words can be observed. This is made up of 8 subframes of 16 words.
It is important to note that:
  • S1 and S2 are SYNC words used for synchronization of the incoming data with the receiver.
  • Id0 to Id7 (column 7) are SFID words used to identify each subframe.
  • P1 to P21 are measured data:
    - P1 and P2 are supercommutated words: the signal is sampled multiple times per minor frame.
    - P3, P4 and P5 are commutated words: the signal is sampled once per minor frame.
    - P6 to P21 are subcommutated words: the signal is sampled less than once per minor frame.
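The frame layout of Figure 8 can be sketched as follows. This is an illustration only: the sync pattern and the exact word positions are assumptions for the sketch, not values taken from the IRIG 106 standard or the Milano format:

```python
# Sketch of Figure 8: 8 minor frames of 16 sixteen-bit words each,
# i.e. 128 words per major frame. SYNC values are hypothetical.
SYNC = [0xFE6B, 0x2840]   # the two SYNC words S1 and S2

def minor_frame(sfid, data_words):
    """Build one 16-word minor frame: S1, S2, then data words, with the
    SFID placed at a fixed, known position (word index 6, column 7)."""
    assert 0 <= sfid < 8 and len(data_words) == 13
    return SYNC + data_words[:4] + [sfid] + data_words[4:]

major_frame = [minor_frame(i, [0] * 13) for i in range(8)]
print(sum(len(f) for f in major_frame))   # 128 words in the major frame
print([f[6] for f in major_frame])        # SFID counts 0..7 across subframes
```

On reception, the ground station scans for the SYNC pattern to align word boundaries and then reads the SFID to know which minor frame of the major frame it is decoding.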
Data and images are transmitted in a C-band frequency granted to INTA. For that purpose, several transmitters are used to apply diversity combining techniques that combine the multiple received signals of a diversity reception device into a single improved signal [44].

2. Contributions

The minimum unit of the digital image is a pixel. The number of pixels that make up an image is a technical indicator of photographic quality and is called spatial resolution. The higher the spatial resolution of the image, the larger the file size, and consequently, the number of bits needed for binary encoding.
What is more, the computer file of an image contains the number of bits used to describe the brightness and color of each of the pixels of the image. This is what is called quantization, color depth or bit depth. For instance, a bitonal image is represented by pixels consisting of 1 bit each, which can represent two tones (typically black and white), using the values 0 for black and 1 for white or vice versa. In contrast, a color image is typically represented by a color depth from 8 to 24 bits [45]. In this case, images captured by the cameras initially use 16-bit color.
Table 2 illustrates the number of color tones available with different bit depths.
Therefore, the size of an image is determined by the number of samples that make up the image (number of pixels, np), the size in bits of each sample (color depth, cd) and the compression level (if any). To calculate the size in bits of an uncompressed photographic image, the following formula is used:
Image size = np · cd (in bits)
where np = nh · nv, nh being the number of horizontal pixels and nv the number of vertical pixels.
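The formula above is trivial to evaluate; the sketch below (function name is ours) applies it to the initial Milano camera resolution:

```python
def image_size_bits(n_h: int, n_v: int, color_depth: int) -> int:
    """Uncompressed image size: pixel count (n_h * n_v) times bits per pixel."""
    return n_h * n_v * color_depth

# Initial Milano camera image: 352 x 288 pixels at 16-bit color depth
size = image_size_bits(352, 288, 16)
print(size)                # 1622016 bits
print(size / 8 / 1024)     # 198.0 KiB per uncompressed frame
```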
Nowadays, digital systems offer some flexibility when defining image resolution and the frequency of captured images. The higher the image quality, the better the knowledge of ice formation obtained and the better the real-time decision making. In this sense, it is important to point out that the DAS included in the Milano UAV allows the acquisition of camera images in three different resolutions: 352 × 288 pixels, 704 × 288 pixels and 704 × 576 pixels. In contrast, images captured by the probes can only be acquired in two resolutions: 126 × 360 pixels and 126 × 720 pixels.
Therefore, this article aims to propose new techniques of image transmission to a ground station that provide higher quality images in real time and more information about the area studied.
The bit rate indicates the number of bits per second (bit/s) that are transmitted. To calculate the bit rate of uncompressed images, the following formula is used:
Bit rate = np · cd · nips (in bits/s)
where nips is the number of images per second.
The DAS included in the Milano UAV uses a bit rate of 12 Mbits/s and, consequently, many parameters and images can be provided. On the one hand, data from the FTI and icing parameters have low bandwidth requirements; on the other hand, images from the six cameras and from the CAPS and PCASP probes have high bandwidth requirements.
Firstly, the aircraft measures a total of 40 FTI and icing parameters. As has been mentioned, 6 of them belong to the first group, 8 to the second group and 26 to the third category. The six parameters with the greatest variation in their measurements (NH) are sampled 150 times per second (SRH). The eight parameters with a medium sampling rate (NI) are sampled 60 times per second (SRI). Lastly, the twenty-six parameters with the lowest variation in their measurements (NL) are sampled 20 times per second (SRL). Knowing that all parameters are sent in words of 16 bits, the bit rate used is:
BRdata = 16 · (NH · SRH + NI · SRI + NL · SRL) = 16 · (6 · 150 + 8 · 60 + 26 · 20) = 30,400 bits/s
Secondly, images are sent with a resolution of 352 × 288 pixels. In addition, 16 bits are used to describe the brightness and color of each of the pixels and 1 image/s is transmitted to ground. Knowing that DAS receives the signal from 6 cameras, the bit rate used is:
BRimages = 6 · np · cd · nips = 6 · 352 · 288 · 16 · 1 = 9.73 Mbits/s
Thirdly, images of supercooled water droplets from CAPS and PCASP probes are sent with a resolution of 126 × 360 pixels. In addition, 16 bits are used to describe the brightness and color of each of the pixels and 1 image/s is transmitted to ground. Knowing that DAS receives the signal from the two probes, the bit rate used is:
BRimages = 2 · np · cd · nips = 2 · 126 · 360 · 16 · 1 = 1.45 Mbits/s
As may be noted, the data bit rate is practically insignificant. Therefore, the improvements addressed in this article are based only on the bit rates of the signals provided by the six cameras and generated by the CAPS and PCASP probes.
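The initial bandwidth budget can be reproduced with a few lines of arithmetic. This sketch (helper name is ours) recomputes the three figures above and their total against the 12 Mbits/s DAS limit:

```python
def image_bitrate(n_sources, n_h, n_v, color_depth, images_per_s):
    """Bit rate = number of sources * pixels * color depth * images/s."""
    return n_sources * n_h * n_v * color_depth * images_per_s

br_data   = 16 * (6 * 150 + 8 * 60 + 26 * 20)    # FTI + icing parameters
br_cams   = image_bitrate(6, 352, 288, 16, 1)    # six icing cameras
br_probes = image_bitrate(2, 126, 360, 16, 1)    # CAPS and PCASP probes

print(br_data)                             # 30400 bits/s
print(round(br_cams / 1e6, 2))             # 9.73 Mbits/s
print(round(br_probes / 1e6, 2))           # 1.45 Mbits/s
print(round((br_data + br_cams + br_probes) / 1e6, 2))  # 11.21 Mbits/s total
```

The total of about 11.21 Mbits/s shows how little headroom the initial 16-bit configuration leaves under the 12 Mbits/s limit, which motivates the contributions below.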

2.1. First Contribution

In the first contribution, the six cameras integrated in the leading edge of the wing, in the leading edge of the stabiliser and in the forward part of the fuselage will send ice formation images using 8-bit color. In addition, the two meteorological probes will also send images about the shape of supercooled water droplets using 8-bit color. This provides significant savings in bandwidth by being able to send higher image resolution than in the initial case.
Thanks to this contribution, three options are possible. In the first case, images can be sent at the same resolution using half of the bandwidth. In the second case, camera resolution can be increased to 704 × 288 pixels and probe image resolution to 126 × 720 pixels; thus, the six cameras and the meteorological probes send higher resolution images at a frequency of 1 image per second. In the third case, camera resolution can be increased to 704 × 576 pixels at a frequency of 0.5 images per second.
Knowing that DAS receives the signal from the cameras and probes, the bit rate used in the first option is:
BR1st,1 = BRcameras,1–6 + BRprobes,7–8 = 6 · np · cd · nips + 2 · np · cd · nips = 6 · 352 · 288 · 8 · 1 + 2 · 126 · 360 · 8 · 1 = 5.59 Mbits/s
As a result, each 352 × 288 pixel image will use 0.81 Mbits/s. The total is 5.59 Mbits/s, below the theoretical limit of 12 Mbits/s.
Knowing that DAS receives the signal from the cameras and probes, the bit rate used in the second option is:
BR1st,2 = BRcameras,1–6 + BRprobes,7–8 = 6 · np · cd · nips + 2 · np · cd · nips = 6 · 704 · 288 · 8 · 1 + 2 · 126 · 720 · 8 · 1 = 11.18 Mbits/s
As a result, each 704 × 288 pixel image will use 1.62 Mbits/s. The total is 11.18 Mbits/s, below the theoretical limit of 12 Mbits/s.
Knowing that DAS receives the signal from the cameras and probes, the bit rate used in the third option is:
BR1st,3 = BRcameras,1–6 + BRprobes,7–8 = 6 · np · cd · nips + 2 · np · cd · nips = 6 · 704 · 576 · 8 · 0.5 + 2 · 126 · 720 · 8 · 1 = 11.18 Mbits/s
As a result, each 704 × 576 pixel image will use 1.62 Mbits/s. The total is 11.18 Mbits/s, below the theoretical limit of 12 Mbits/s.
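The three options of the first contribution can be compared with one small helper (the function name and argument layout are ours):

```python
def option_bitrate(cam_res, cam_ips, probe_res):
    """Total bit rate for the 6 cameras plus the 2 probes at 8-bit color.
    Probes always send 1 image/s; only the cameras vary."""
    cameras = 6 * cam_res[0] * cam_res[1] * 8 * cam_ips
    probes  = 2 * probe_res[0] * probe_res[1] * 8 * 1
    return cameras + probes

options = [
    ((352, 288), 1.0, (126, 360)),   # option 1: initial resolutions
    ((704, 288), 1.0, (126, 720)),   # option 2: higher resolution
    ((704, 576), 0.5, (126, 720)),   # option 3: highest res, 0.5 image/s
]
for cam, ips, probe in options:
    print(round(option_bitrate(cam, ips, probe) / 1e6, 2), "Mbits/s")
# All three options stay below the 12 Mbits/s limit of the DAS.
```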
Figure 9 shows a comparison between the initial case and the first contribution.

2.2. Second Contribution

In the second contribution, a field of view (FOV) modification technique is used, the FOV being defined as the maximum area of a sample that a camera can capture. This contribution affects only the six cameras, because the probes cannot change their FOV.
The idea of this contribution is that, initially, the camera uses a given FOV, and when ice formation is detected in a particular point of the wing and stabiliser tips or the CAPS and PCASP probes, the FOV is changed to a smaller one obtaining more useful information than in the initial case.
Instantaneous field of view (IFOV) is the angular projection of a single pixel of a camera’s focal plane array (FPA) detector. The field of view (FOV) is the total area across which the camera can image (see Figure 10).
Horizontal and vertical IFOV can be calculated as follows:
IFOVh (mrad) = (FOV / nh) · (3.14 / 180) · 1000
IFOVv (mrad) = (FOV / nv) · (3.14 / 180) · 1000
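These two formulas, and the conversion to millimetres used below, can be sketched directly (function names are ours; 3.14 is kept for π to match the article's arithmetic):

```python
def ifov_mrad(fov_deg: float, n_pixels: int) -> float:
    """Angular size of one pixel in milliradians."""
    return (fov_deg / n_pixels) * (3.14 / 180) * 1000

def ifov_mm(ifov: float, distance_m: float, spread: int = 1) -> float:
    """Spot size on the target in mm at the given distance;
    spread=3 accounts for the optical dispersion discussed below."""
    return (ifov / 1000) * (distance_m * 1000) * spread

# Initial case: 50-degree FOV, 352 x 288 image, target 0.2 m away
h = ifov_mrad(50, 352)
print(round(h, 2))                    # 2.48 mrad per pixel
print(round(ifov_mm(h, 0.2), 2))      # 0.5 mm measurable point (1 pixel)
print(round(ifov_mm(h, 0.2, 3), 2))   # 1.49 mm with dispersion (3 pixels)
```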
The camera lens can use a horizontal FOV between 20° and 50°. In the initial case, a FOV of 50° is used, and when ice formation is detected at a particular point of the wing or stabiliser tips, a FOV of 20° is used. For the calculations, it is assumed that this point is located 0.2 m from the camera.
In the initial case, each camera uses a FOV of 50° and images of 352 × 288 pixels. Thus, the horizontal and vertical IFOV in mrad are:
IFOVh = (50 / 352) · (3.14 / 180) · 1000 = 2.48 mrad
IFOVv = (50 / 288) · (3.14 / 180) · 1000 = 3.03 mrad
The horizontal and vertical IFOV in mm are:
IFOVh = (2.48 / 1000) · 200 = 0.50 mm
IFOVv = (3.03 / 1000) · 200 = 0.61 mm
These IFOV values are the measurable size in mm of a single pixel. In other words, the camera can resolve a point of 0.50 mm × 0.61 mm at a distance of 0.2 m. In practice, a correct value will not be obtained because of a phenomenon called optical dispersion, so the IFOV in mm is multiplied by three:
IFOVh = 0.50 · 3 = 1.5 mm
IFOVv = 0.61 · 3 = 1.83 mm
These IFOV values are the measurable size in mm of 3 pixels, which means that the camera can measure a point of 1.5 mm × 1.83 mm at a distance of 0.2 m. These values are more realistic.
At the specific moment in which ice formation is detected, a FOV of 20° is used. With the same image resolution (352 × 288 pixels), the horizontal and vertical IFOV in mrad are:
IFOVh = (20 / 352) · (3.14 / 180) · 1000 = 0.99 mrad
IFOVv = (20 / 288) · (3.14 / 180) · 1000 = 1.21 mrad
Assuming that ice formation is located 0.2 m from the camera, the horizontal and vertical IFOV in mm are:
IFOVh = (0.99 / 1000) · 200 = 0.20 mm
IFOVv = (1.21 / 1000) · 200 = 0.24 mm
In this case, the camera can resolve a point of 0.20 mm × 0.24 mm at a distance of 0.2 m. Again, because of optical dispersion, the IFOV in mm is multiplied by three:
IFOVh = 0.20 · 3 = 0.60 mm
IFOVv = 0.24 · 3 = 0.72 mm
These IFOV values are the measurable size in mm of 3 pixels, which means that the camera can measure a point of 0.60 mm × 0.72 mm at a distance of 0.2 m. These values are more realistic.

2.3. Third Contribution

In the third contribution, least significant bit (LSB) steganography is used. LSB insertion is a common and simple technique to embed information in a cover image [46,47,48]. The least significant bit of pixels of the image is replaced with data bits. When using a 16-bit image, a bit of each of the two bytes can be used.
In other words, 2 bits can be stored in each pixel. Thus, an image of 352 × 288 pixels can store a total of 202,752 bits (25,344 bytes) of embedded data, an image of 704 × 288 pixels can store 405,504 bits (50,688 bytes), and an image of 704 × 576 pixels can store 811,008 bits (101,376 bytes).
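These capacities follow directly from pixel count × 2 bits; a minimal sketch (function name is ours):

```python
def lsb_capacity_bits(n_h: int, n_v: int, bits_per_pixel: int = 2) -> int:
    """Embedding capacity when the LSB of each of the two bytes of a
    16-bit pixel is used, i.e. 2 bits per pixel."""
    return n_h * n_v * bits_per_pixel

for res in [(352, 288), (704, 288), (704, 576)]:
    bits = lsb_capacity_bits(*res)
    print(f"{res[0]}x{res[1]}: {bits} bits = {bits // 8} bytes")
```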

3. Results

Once the three contributions have been presented, their main improvements can be clearly observed in this results section. Images with different resolutions, color depths and FOVs are shown.

3.1. First Contribution

Firstly, images of the wing leading edge will be shown with three different resolutions: 352 × 288 pixels (see Figure 11), 704 × 288 pixels (see Figure 12) and 704 × 576 pixels (see Figure 13). Figure 11 uses a color depth of 16 bits while Figure 12 and Figure 13 use 8-bit color.
In addition, images measured by CAPS and PCASP probes about the shape of supercooled water droplets will be shown with two different resolutions: 126 × 360 (see Figure 14) and 126 × 720 pixels (see Figure 15). In this case, Figure 14 and Figure 15 use 8-bit color.
As illustrated in the previous figures, the higher the resolution, the more pixels there are in an image, which means it can display more visual information. As can be observed in the specific cases of CAPS and PCASP images, information of a greater number of droplets is included.
Moreover, using 8-bit color instead of 16-bit color is imperceptible to the human eye. As has been said before, this makes possible significant savings in bandwidth.

3.2. Second Contribution

Next, images captured with FOVs of 20° and 50° are shown. Figure 16 and Figure 17 were captured by the fuselage icing cameras that monitor ice formation on the CAPS probe.
As can be observed in the previous figures, changing to a smaller FOV provides more detailed information about the areas of the CAPS probe where ice forms.

3.3. Third Contribution

In the third contribution the least significant bits are used to embed information in an image. For that purpose, a grid for 4 pixels of a 16-bit image can be as follows:
[Image Drones 07 00571 i001: grid of the 16-bit values of 4 pixels before embedding]
When the number 196, whose binary representation is 11000100, is embedded into the least significant bits of this part of the image, the resulting grid is as follows:
[Image Drones 07 00571 i002: the same grid after embedding, with the changed bits underlined]
Although data are embedded into the first 8 bytes of the grid, only the 3 underlined bits needed to be changed to match the embedded message. On average, only half of the bits in an image need to be modified to hide a secret message using the maximum cover size [49,50]. Since 65,536 colors are available per pixel, hiding information with this technique produces only small changes in the intensity of the colors. If these changes are not perceived by the human eye, the message is successfully hidden. With a well-chosen image, the message can even be hidden in the second-least significant bit as well, still without a visible difference.
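The embedding and extraction steps can be sketched in a few lines. The helper names are ours, and the cover bytes are made-up values rather than the grid from the figure:

```python
def embed_byte(cover: bytes, value: int) -> bytearray:
    """Replace the LSB of each of the first 8 cover bytes with one bit
    of `value` (most significant bit first); all other bits are intact."""
    stego = bytearray(cover)
    for i in range(8):
        bit = (value >> (7 - i)) & 1
        stego[i] = (stego[i] & 0xFE) | bit   # clear the LSB, set message bit
    return stego

def extract_byte(stego: bytes) -> int:
    """Read the embedded byte back from the LSBs of the first 8 bytes."""
    value = 0
    for b in stego[:8]:
        value = (value << 1) | (b & 1)
    return value

cover = bytes(range(100, 108))     # 8 bytes = four 16-bit pixels
stego = embed_byte(cover, 196)     # 196 = 0b11000100, as in the example
print(extract_byte(stego))         # 196
print(sum(c != s for c, s in zip(cover, stego)))  # 3 bytes changed here
```

A real implementation would iterate this over the whole image buffer, up to the capacity computed earlier, and stop at an agreed end-of-message marker.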
In the above example, consecutive bytes of the image data (from the first byte to the end of the message) are used to embed the information. A slightly more secure system would have the transmitter and receiver share a secret key that specifies which pixels to change [51]. Thus, even if an adversary suspects that LSB steganography has been used, there is no way of knowing which pixels to target without the secret key. This can be very useful in aerial surveillance operations [52].

4. Discussion

The improvements developed in this paper will provide a better characterization of a UAV such as the Milano in icing conditions. The Milano UAV was originally limited to sending images of 352 × 288 pixels using 16-bit color. The new methods improve image transmission to the ground and allow the aircraft behavior to be known accurately. In addition, these contributions can be extended to other UAVs owned by the Spanish army.
The most relevant contributions resulting from the three improvements implemented are: the development of new models that allow to send to ground images with a higher resolution, a field of view modification technique that uses different FOV to obtaing more detailed knowledge about points of interest and the use of a steganography method to embed information in a cover image.
With the first contribution can be observed that lowering color depth of the image provides greater savings in bandwidth and, thus, a higher image resolution. As a result, images are made up of more pixels and more visual information can be displayed. It is important to point out that this is possible only when the difference between bit depths is imperceptible to the human eye. The second contribution allows to obtain more information about a specific area of study. Finally, the last contribution shows as the least significant bit of pixels of the image is replaced with data bits, allowing to hide information successfully.
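The bandwidth trade behind the first contribution is simple arithmetic: an uncompressed frame costs width × height × bit depth bits, so halving the depth from 16 to 8 bits exactly pays for doubling one dimension, while the 704 × 576 image needs twice the original budget. A small sketch (`image_bits` is an illustrative helper, not part of the Milano software):

```python
def image_bits(width, height, bit_depth):
    """Uncompressed size of one frame in bits."""
    return width * height * bit_depth

baseline = image_bits(352, 288, 16)   # original Milano downlink image
same_cost = image_bits(704, 288, 8)   # doubled width at half the color depth
doubled = image_bits(704, 576, 8)     # doubled width and height

# baseline == same_cost == 1,622,016 bits; doubled == 2 * baseline
```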

5. Future Research Lines

INTA seeks to improve the fundamental knowledge of the ice-accretion process on unmanned aerial vehicles, and the analysis performed in this field can be improved. One future research line is to add other modern meteorological instruments that improve the detection of ice formation on UAVs and thus yield the icing parameters more accurately.
Another line of inquiry is the development of new image compression models for unmanned aerial vehicles. Implementing better image compression algorithms could be useful, but above all it would be useful to analyse their real-time influence, bearing in mind that response times at the ground station should not exceed 100 milliseconds. It would also be interesting to display the images captured by the cameras in real time, without significant delays or losses.
Another future project is an Ansys CFD (Computational Fluid Dynamics) study to simulate and analyse in-flight icing and ice accretion and compare the results with experiments.
Finally, the study of computational algorithms for detecting and correcting errors in data reception could be a good research line.
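A minimal way to begin analysing that real-time influence is to time each compression step against the 100 ms budget. The sketch below uses zlib purely as a stand-in for whatever codec the UAV would actually carry, and `FRAME_BUDGET_S` and `compress_timed` are illustrative names, not part of any existing ground-station software:

```python
import time
import zlib

FRAME_BUDGET_S = 0.100  # ground-station response-time budget noted above

def compress_timed(raw, level=6):
    """Compress one frame and report how long the step took, in seconds."""
    start = time.perf_counter()
    packed = zlib.compress(raw, level)
    return packed, time.perf_counter() - start

# Placeholder 352 x 288 8-bit frame; real data would come from the camera.
frame = bytes(352 * 288)
packed, elapsed = compress_timed(frame)
within_budget = elapsed < FRAME_BUDGET_S  # the figure of merit to analyse
```

Sweeping `level` (or swapping in a real image codec) and logging `elapsed` per frame would show directly whether a candidate algorithm fits inside the latency budget.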

Author Contributions

Conceptualization, J.E.R.M.; Methodology, M.S.R., J.J.M.H. and R.G.A.; Validation, J.E.R.M.; Formal analysis, J.E.R.M.; Investigation, J.E.R.M.; Writing—original draft, J.E.R.M.; Visualization, J.C.P.D.P.; Supervision, M.S.R., J.J.M.H. and R.G.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors would like to express their deep thanks to the area of Flight Testing of Spanish National Institute for Aerospace Technology (INTA), especially Miguel Marco and Óscar Serrano, for their help in the instrumentation of the UAV and in the data analysis.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ALO: Avión Ligero de Observación
ADS: Air Data System
AFM: Aircraft Flight Manual
AI: Artificial Intelligence
CAPS: Cloud Aerosol and Precipitation Spectrometer
CFD: Computational Fluid Dynamics
DAS: Data Acquisition System
FTI: Flight Test Instrumentation
GCS: Ground Control Station
GPS: Global Positioning System
IMU: Inertial Measurement Unit
INTA: Instituto Nacional de Técnica Aeroespacial
IRIG: Inter-Range Instrumentation Group
LSB: Least Significant Bit
LWC: Liquid Water Content
MVD: Median Volumetric Diameter
PCASP: Passive Cavity Aerosol Spectrometer Probe
PCM: Pulse Code Modulation
PL: Payload
R&D: Research and Development
RC: Radio Control
RF: Radio Frequency
RM: Radio-modem
RS-232: Recommended Standard 232
SFID: Subframe Identifier
SIVA: Sistema Integrado de Vigilancia Aérea
SYNC: Subframe Synchronization
TC: Telecommand
TM: Telemetry
TAT: Total Air Temperature
UAV: Unmanned Aerial Vehicle
UAS: Unmanned Aerial System

References

  1. Vidal, I.; Sanchez-Aguero, V.; Valera, F.; Nogales, B.; Cabezas, J.; Vidal, C.; López, A.; González, D.; Díez, J.; Berrazueta, L.; et al. Milano: Una visión futura para un UAS táctico. VI Congreso Nacional de I+D en Defensa y Seguridad. 2018. Available online: https://e-archivo.uc3m.es/handle/10016/28959 (accessed on 30 August 2023).
  2. NATO STANAG 4670. Recommended Guidance for the Training of Designated Unmanned Aerial Vehicle Operator (DUO), 1st ed.; NATO Standardization Agency: Brussels, Belgium, 2006.
  3. NATO STANAG 4670 —ATP-3.3.7. Guidance for the Training of Unmanned Aircraft Systems (UAS) Operators, 3rd ed.; NATO Standardization Agency: Brussels, Belgium, 2014; Available online: http://everyspec.com/NATO/NATO-STANAG/SRANAG-4670_ED-3_52054/ (accessed on 30 August 2023).
  4. Berger, J.; Barkaoui, M.; Boukhtouta, A. A hybrid genetic approach for airborne sensor vehicle routing in real-time reconnaissance missions. Aerosp. Sci. Technol. 2007, 11, 317–326. [Google Scholar] [CrossRef]
  5. El-Salamony, M.; Aziz, M.A. Solar Panel Effect on Low-Speed Airfoil Aerodynamic Performance. Unmanned Syst. 2021, 9, 333–347. [Google Scholar] [CrossRef]
  6. Skorobogatov, G.; Barrado, C.; Salamí, E. Multiple UAV Systems: A Survey. Unmanned Syst. 2020, 8, 149–169. [Google Scholar] [CrossRef]
  7. Strock, O.J. Telemetry Computer Systems: The New Generation; Instrument Society of America: Pittsburgh, PA, USA, 1988. [Google Scholar]
  8. Dantsker, O.D.; Mancuso, R.; Selig, M.S.; Caccamo, M. High-Frequency Sensor Data Acquisition System (SDAC) for Flight Control and Aerodynamic Data Collection. In Proceedings of the 32nd AIAA Applied Aerodynamics Conference, Session: Aerodynamic Testing: Flight, Wind Tunnel and Numerical Correlations III, Atlanta, GA, USA, 16–20 June 2014. [Google Scholar] [CrossRef]
  9. Khoshnoud, F.; Esat, I.I.; De Silva, C.W.; Rhodes, J.D.; Kiessling, A.A.; Quadrelli, M.B. Self-Powered Solar Aerial Vehicles: Towards Infinite Endurance UAVs. Unmanned Syst. 2020, 8, 95–117. [Google Scholar] [CrossRef]
  10. Romero, S.F.; Rodríguez, P.L.; Bocanegra, D.E.; Martínez, D.P.; Cancela, M.A. Comparing Open Area Test Site and Resonant Chamber for Unmanned Aerial Vehicles’s High-Intensity Radiated Field Testing. IEEE Trans. Electromagn. Compat. 2018, 60, 1704–1711. [Google Scholar] [CrossRef]
  11. Jeck, R.K. Other ways to characterize the icing atmosphere. In Proceedings of the 32nd Aerospace Sciences Meeting & Exhibit, Reno, NV, USA, 10–13 January 1994. [Google Scholar] [CrossRef]
  12. Mingione, G.; Barocco, M.; Denti, E.; Bindi, F.G. Flight in Icing Conditions Summary. On behalf of: French DGAC. 2006. Available online: https://www.ecologie.gouv.fr/sites/default/files/Icing_flight_manual.pdf (accessed on 30 August 2023).
  13. Meyrowitz, A.L.; Blidberg, D.R.; Michelson, R.C. Autonomous Vehicles; Institute of Electrical and Electronics Engineers (IEEE): Piscataway, NJ, USA, 1996; Volume 84, pp. 1147–1164. [Google Scholar] [CrossRef]
  14. Matthew, U.O.; Kazaure, J.S.; Amaonwu, O.; Daniel, O.O.; Muhammed, I.H.; Okafor, N.U. Artificial Intelligence Autonomous Unmanned Aerial Vehicle (UAV) System for Remote Sensing in Security Surveillance. In Proceedings of the 2020 IEEE 2nd International Conference on Cyberspace (CYBER NIGERIA), Abuja, Nigeria, 23–25 February 2021. [Google Scholar] [CrossRef]
  15. Li, S.; Qin, J.; He, M.; Paoli, R. Fast Evaluation of Aircraft Icing Severity Using Machine Learning Based on XGBoost. Aerospace 2020, 7, 36. Available online: https://www.mdpi.com/2226-4310/7/4/36 (accessed on 30 August 2023). [CrossRef]
  16. Zhou, C.; Li, Y.; Zheng, W.; Wu, P.; Dong, Z. Safety Analysis for Icing during Landing Phase Based on Reachability Analysis. Math. Probl. Eng. 2018. [Google Scholar] [CrossRef]
  17. Yamazaki, M.; Jemcov, A.; Sakaue, H. A Review on the Current Status of Icing Physics and Mitigation in Aviation. Aerospace 2021, 8, 188. [Google Scholar] [CrossRef]
  18. European General Aviation Safety Team. In Flight Icing; European Union Aviation Safety Agency (EASA): Cologne, Germany, 2015. Available online: https://www.easa.europa.eu/downloads/24118/en (accessed on 30 August 2023).
  19. Cao, Y.; Tan, W.; Wu, Z. Aircraft icing: An ongoing threat to aviation safety. Aerosp. Sci. Technol. 2018, 75, 353–385. [Google Scholar] [CrossRef]
  20. Boubeta-Puig, J.; Moguel, E.; Sánchez-Figueroa, F.; Hernández, J.; Preciado, J.C. An Autonomous UAV Architecture for Remote Sensing and Intelligent Decision-making. IEEE Internet Comput. 2018, 22, 6–15. [Google Scholar] [CrossRef]
  21. Kang, K.; Prasad, J.V.R. Development and Flight Test Evaluations of an Autonomous Obstacle Avoidance System for a Rotatory-Wing UAV. Unmanned Syst. 2013, 1, 3–19. [Google Scholar] [CrossRef]
  22. Darrah, M.; Wilhelm, J.; Munasinghe, T.; Duling, K.; Yokum, S.; Sorton, E.; Rojas, J.; Wathen, M. A Flexible Genetic Algorithm System for Multi-UAVs Surveillance: Algorithm and Flight Testing. Unmanned Syst. 2015, 3, 49–62. [Google Scholar] [CrossRef]
  23. Huo, M.; Duan, H.; Ding, X. Manned Aircraft and Unmanned Aerial Vehicle Heterogeneous Formation Flight Control via Heterogeneous Pigeon Flock Consistency. Unmanned Syst. 2021, 39, 227–236. [Google Scholar] [CrossRef]
  24. Federal Aviation Requirements—Part 25: Airworthiness Standards: Transport Category Airplanes, USA. Federal Aviation Administration (FAA). Available online: https://www.law.cornell.edu/cfr/text/14/part-25 (accessed on 30 August 2023).
  25. Kohlman, D.L.; Sand, W. Aircraft ICING: Meteorology, Protective Systems, Instrumentation and Certification; The University of Kansas: Lawrence, KS, USA, 1996. [Google Scholar]
  26. Rodríguez, J.E.; Segurado, D.; Sánchez, M.; Martínez, J.J.; González, R. Contributions to data transmission at critical moments on unmanned aerial vehicles. Chin. J. Aeronaut. 2023, 36, 247–257. [Google Scholar] [CrossRef]
  27. Szabolcsi, R. UAV Operator Training—Beyond Minimum Standards. In Proceedings of the Scientific Research and Education in the Air Force: AFASES; 2016. Available online: https://www.afahc.ro/ro/afases/2016/RP/SZABOLCSI.pdf (accessed on 30 August 2023).
  28. Stevenson, J.D.; O’Young, S.; Rolland, L. Beyond Line of Sight Control of Small Unmanned Aerial Vehicles Using a Synthetic Environment to Augment First Person Video. Procedia Manuf. 2015, 3, 960–967. [Google Scholar] [CrossRef]
  29. Agrawal, P.; Prabhu, D.; Chen, Y. Development of an Integrated Data Acquisition System for a Small Flight Probe. In Proceedings of the 50th AIAA Aerospace Sciences Meeting including the New Horizons Forum and Aerospace Exposition, Nashville, TN, USA, 9–12 January 2012; pp. 1–7. [Google Scholar] [CrossRef]
  30. Smiljakovic, V.; Golubicic, Z.; Simic, D.; Obradovik, D.; Dragas, S.; Mikavica, M. Telemetry System of Light Unmanned Aerial Vehicle “Raven”. In Proceedings of the 4th International Conference on Telecommunications in Modern Satellite, Cable and Broadcasting Services, Nis, Yugoslavia, 13–15 October 1999. TELSIKS’99 (Cat. No.99EX365). [Google Scholar] [CrossRef]
  31. Recommended Standard EIA/TIA-232-F: Interface Between Data Terminal Equipment and Data Circuit: Termination Equipment Employing Serial Binary Data Interchange; Telecommunications Industry Association & Electronic Industries Alliance: Arlington, VA, USA, 1997.
  32. Zhao, L.; Wang, D.; Huang, B.; Xie, L. Distributed Filtering-Based Autonomous Navigation System of UAV. Unmanned Syst. 2015, 3, 17–34. [Google Scholar] [CrossRef]
  33. Weigel, R.; Spichtinger, P.; Mahnke, C.; Klingebiel, M. Thermodynamic correction of particle concentrations measured by underwing probes on fast flying aircraft. Atmos. Meas. Tech. 2016, 9, 5135–5162. [Google Scholar] [CrossRef]
  34. Cloud, Aerosol, and Precipitation Spectrometer with Depolarization (CAPS-DPOL). Specifications. Droplet Measurement Technologies. Available online: https://www.dropletmeasurement.com/product/cloud-aerosol-and-precipitation-spectrometer-with-depolarization/ (accessed on 30 August 2023).
  35. Passive Cavity Aerosol Spectrometer Probe (PCASP-100X). Droplet Measurement Technologies. Available online: https://www.dropletmeasurement.com/product/passive-cavity-aerosol-spectrometer-probe/ (accessed on 30 August 2023).
  36. IEEE Standard for Floating-Point Arithmetic (IEEE 754); Institute of Electrical and Electronics Engineers (IEEE): New York, NY, USA, 2019. [CrossRef]
  37. Inter-Range Instrumentation Group. Telemetry Applications Handbook; Online Publications and Standards; Inter-Range Instrumentation Group: Army White Sands Missile Range, NM, USA, 2006; Volume 119, Available online: https://apps.dtic.mil/sti/pdfs/AD1038198.pdf (accessed on 30 August 2023).
  38. García-Magariño, A. Water Droplet Deformation and Breakup in the Vicinity of the Leading Edge of an Incoming Airfoil. Ph.D. Thesis, Universidad Politécnica de Madrid, Madrid, Spain, 2016. Available online: https://oa.upm.es/44231/1/ADELAIDA_GARCIA_MAGARINO_GARCIA.pdf (accessed on 30 August 2023).
  39. Politovich, M.K.; McDonough, F.; Bernstein, B.C. Issues in Forecasting Icing Severity. In Proceedings of the 10th Conference on Aviation, Range, and Aerospace Meteorology, National Center for Atmospheric Research. Portland, OR, USA, 13–16 May 2002; Available online: https://ams.confex.com/ams/13ac10av/techprogram/paper_39172.htm (accessed on 30 August 2023).
  40. Politovich, M.K. Aircraft Icing; National Center for Atmospheric Research: Boulder, CO, USA, 2003; Available online: https://curry.eas.gatech.edu/Courses/6140/ency/Chapter5/Ency_Atmos/Aircraft_Icing.pdf (accessed on 30 August 2023).
  41. Bever, G.A. Digital Signal Conditioning for Flight Test; AGARD-AG-160; Volume 19. 1991. Available online: https://apps.dtic.mil/sti/pdfs/ADA240140.pdf (accessed on 30 August 2023).
  42. Inter-Range Instrumentation Group. Telemetry Standard; Online Publications and Standards; Inter-Range Instrumentation Group: Army White Sands Missile Range, NM, USA, 2006; Volume 106, Available online: http://www.irig106.org/docs/106-19/106-19_Telemetry_Standards.pdf (accessed on 30 August 2023).
  43. Aerospace Telemetry (IRIG 106 PCM & CHAPTER 10 intro). Dewesoft. Available online: https://training.dewesoft.com/online/course/telemetry (accessed on 30 August 2023).
  44. Harney, P.F. Diversity Techniques for Omnidirectional Telemetry Coverage of the HiMAT Research Vehicle; National Aeronautics and Space Administration (NASA): Washington, DC, USA, 1981. Available online: https://www.nasa.gov/centers/dryden/pdf/87940main_H-1133.pdf (accessed on 30 August 2023).
  45. Utray, F. Postproducción Digital. Una Perspectiva Contemporánea, 1st ed.; Capítulo 5: Codificación Digital de la Imagen; Dykinson: Madrid, Spain, 2015. [Google Scholar]
  46. Osman, B.; Yasin, A.; Omar, M. An analysis of alphabet-based techniques in text steganography. J. Telecommun. Electron. Comput. Eng. 2016, 8, 109–115. Available online: https://repo.uum.edu.my/id/eprint/20540/ (accessed on 30 August 2023).
  47. Nosrati, M.; Karimi, R.; Hariri, M. An introduction to steganography methods. World Appl. Program. 2011, 1, 191–195. Available online: https://www.researchgate.net/publication/308646775_An_introduction_to_steganography_methods (accessed on 30 August 2023).
  48. Sumathi, C.P.; Santanam, T.; Umamaheswari, G. A study of Various Steganographic Techniques Used for Information Hiding. Int. J. Comput. Sci. Eng. Surv. (IJCSES) 2013, 4. Available online: https://arxiv.org/ftp/arxiv/papers/1401/1401.5561.pdf (accessed on 30 August 2023).
  49. Krenn, J.R. Steganography and Steganalysis. 2004. Available online: http://www.krenn.nl/univ/cry/steg/article.pdf (accessed on 30 August 2023).
  50. Morkel, T.; Eloff, J.H.P.; Olivier, M.S. An Overview of Image Steganography; Information and Computer Security Architecture (ICSA) Research Group. 2005. Available online: https://www.researchgate.net/publication/220803240_An_overview_of_image_steganography (accessed on 30 August 2023).
  51. Wang, H.; Wang, S. Cyber warfare: Steganography vs. Steganalysis. Commun. ACM 2004, 47, 76–82. Available online: https://www.researchgate.net/publication/220427158_Cyber_warfare_steganography_vs_steganalysis (accessed on 30 August 2023). [CrossRef]
  52. Anderson, R.J.; Petitcolas, F.A.P. On the limits of steganography. IEEE J. Sel. Areas Commun. 1998, 16, 474–481. Available online: https://www.cl.cam.ac.uk/~rja14/Papers/jsac98-limsteg.pdf (accessed on 30 August 2023). [CrossRef]
Figure 1. Effects of icing on aircraft performance.
Figure 2. Block diagram of flight test instrumentation.
Figure 3. Antennas included in the Milano UAV.
Figure 4. Main parameters acquired by DAS in the Milano UAV.
Figure 5. Sketch of MILANO instrumentation.
Figure 6. Milano icing instrumentation.
Figure 7. Milano icing camera.
Figure 8. PCM Frame.
Figure 9. Comparison between the initial case and the first contribution.
Figure 10. FOV and IFOV for one of the Milano cameras.
Figure 11. Wing leading edge. Resolution: 352 × 288 pixels, 16-bit color.
Figure 12. Wing leading edge. Resolution: 704 × 288 pixels, 8-bit color.
Figure 13. Wing leading edge. Resolution: 704 × 576 pixels, 8-bit color.
Figure 14. (a) CAPS probe. 126 × 360 pixels. (b) PCASP probe. 126 × 360 pixels.
Figure 15. (a) CAPS probe. 126 × 720 pixels. (b) PCASP probe. 126 × 720 pixels.
Figure 16. CAPS probe. Resolution: 352 × 288 pixels, 16-bit color, FOV: 50°.
Figure 17. CAPS probe. Resolution: 352 × 288 pixels, 16-bit color, FOV: 20°.
Table 1. MILANO technical specifications.
Wingspan: 12.5 m
Total length: 8.5 m
Height: 2.3 m
Empty weight: 700 kg
Maximum payload: 150 kg
Maximum take-off weight: 950 kg
Cruise speed: 162 km/h (87 kts)
Maximum operating speed: 230 km/h (124 kts)
Operational ceiling: 7500 m
Range: 2000 km
Endurance: 20 h
Table 2. Bit depth vs. number of color tones.
Bit depth: 8 — Color tones: 256
Bit depth: 12 — Color tones: 4096
Bit depth: 16 — Color tones: 65,536
Bit depth: 24 — Color tones: 16,777,216
