Autonomous System for Wildfire and Forest Fire Early Detection and Control

Abstract: Recurring and increasingly large-scale wildfires across the globe (e.g., Southern Europe, California, Australia), as a result of worsening climate conditions with record temperatures, drought, and strong winds, present a challenge to mankind. Early fire detection is crucial for a quick reaction and effective firefighting operations, minimizing the risk to human lives as well as the destruction of assets, infrastructure, forests, and wildlife. Ground firefighting usually relies on human intervention and dangerous exposure to high temperatures and radiation levels, proving the need for mechanisms and techniques to remotely or autonomously detect and combat fire. This paper proposes an autonomous firefighting system built with a motorized water turret, narrow beam far infrared (FIR) sensors, and a micro-controller running novel algorithms and techniques. Experimental field results validated the technical approach, indicating that when a small fire front is within the field of view of the FIR sensor and within the range of the water jet, it is possible to provide an early alarm and even autonomously extinguish or delay the approaching fire front, increasing the chance for evacuation.

Author Contributions: A.T.d.A., A.P.C. and L.M.F.; methodology, A.P.C. and L.M.F.; software, L.M.F.; investigation, A.P.C. and L.M.F.; writing—original draft preparation, L.M.F.; writing—review and editing, A.T.d.A. and L.M.F.; visualization, L.M.F.; supervision, A.T.d.A. and A.P.C.; project administration, A.T.d.A.; funding


Introduction
Over the past years, wildfires have become increasingly common, larger, and deadlier. The associated costs and losses have also increased, representing billions of dollars in natural resources and property loss (p. 7, [1]). Recent climatic conditions are also to blame, because high temperatures, low precipitation, and regular strong winds favor the desiccation of vegetation, potentiating ignition and fire spread (p. 5, [2]). At the beginning of the millennium (2000), it was estimated that about 50,000 fires occur each year in the Mediterranean basin, affecting more than 600,000 ha (p. preface, [2]). According to the "Mediterranean Forestry Action Programme (FAO 1993)", between 1981 and 1997, Portugal, on average, had an area of 83,143 ha burned per year, representing 2.79% of the national forested area affected by fire (p. 8, [2]). However, in 2017, in Portugal alone, forest fires burned 442,400 ha (1,093,194 acres) of rural and urban areas [3], which corresponds to 4.8% of the country's total area. Please refer to Figure 1 for an overview of the average number of fires between 2000 and 2017 and Figure 2 for a comparison of the average burnt areas between 2008-2018 and 2019. A similar situation occurred in the state of California, where the 2018 wildfires burned 676,300 ha (1,671,214 acres) [4], caused billions of dollars in losses, and cost hundreds of lives. This recurring occurrence of cataclysmic wildfires across the globe, as a result of worsening climate conditions with record temperatures, drought, and strong winds, presents a challenge to minimize the risk to human lives as well as the destruction of assets, infrastructure, forests, and wildlife. As a result of climate change, forest fire risk is increasing globally, with areas such as Canada, Alaska, and Siberia being subject to large fires. Early fire detection is critical for firefighting operations and population evacuation with a good chance of success.
Unfortunately, early stage fire detection is rare, because monitoring and reports are usually made by humans, who rely on eyesight to detect smoke columns, which is often too late to allow the effective deployment of counter measures toward fire confinement. The general population is also at risk, in particular when a massive fire strikes without warning, rendering evacuation difficult or impossible.
On the last line of defense, current firefighting mechanisms almost always rely on direct control and handling by firefighters, who become exposed to the dangers of fire and harsh conditions such as high temperature and thermal radiation levels. Therefore, it is necessary to develop mechanisms and techniques to remotely or autonomously detect and deploy fire countermeasures anywhere.
Forest biomass (the "fuel") exchanges moisture with the air. In particular, light fuels, such as pine needles or dry vegetation, react quickly to changes in the relative humidity (RH) of the air; when the RH drops, these materials dry quickly, literally providing support for a quick fire spread. Thicker woody fuels (bark, shrubs, branches, and foliage), either green or dead (dry), also react to variations of air RH, although at a slower pace because of their larger mass. However, an extended exposure to high temperatures, low precipitation, and regular winds favors their desiccation, contributing to potential ignition and fire spread risks.
When wood is exposed to heat, some of its cellulose compound molecules break apart, promoting the release of gases such as CH2O (formaldehyde, a.k.a. methanal), which has a flash point of 50-60 °C [7] (flash point: "the lowest temperature at which a substance vaporizes into a gas, which can be ignited with the introduction of an external source of fire"). Given enough concentration of that gas, a small flame or spark may be all it takes to start a self-sustaining reaction, i.e., when these gases ignite, more heat is released, promoting the release and ignition of even more gases, eventually heating the wood to its ignition temperature of 260 °C (ignition temperature: "the lowest temperature at which a volatile material will be vaporized into a gas which ignites without the help of any external flame or ignition source"). At this point, wood's compound molecules break apart and react with oxygen, producing flames and releasing heat, light, H2O, CO2, CO, and C, among others [8,9]. Ashes are formed by components found in wood that remain solid, such as silica, potassium, and phosphorus. Fire will grow and spread as long as there is high temperature, fuel, and oxygen around it. Fire and its effects can be detected with different types of technologies and sensors, such as temperature sensors, gas sensors (CO2, CO), optical smoke detectors, and also by analyzing specific bands of the electromagnetic spectrum, such as infrared, visible, and ultra-violet.
"Regular" thermal sensors, i.e., thermistors, thermocouples, resistance thermometers, or silicon bandgap temperature sensors, despite being reliable and precise, are not useful in the detection of an actual wildfire from a distance. However, a regular temperature sensor, associated with a humidity sensor, can be used to calculate the temperature and relative humidity (RH) of the air, which in turn gives valuable information regarding the fire risks [10].
Gas sensors, for CO and CO 2 , and optical smoke detectors can also be used to detect fire, but they require concentration levels that may never be reached in the open air, or that can be strongly influenced by wind, failing early detection even if a fire is just a few meters away. Gas sensors may also lack long-term stability and are subject to cross-sensitivities and false alarms [11].
Human or automated image-based detection of fire or smoke, in the visible spectrum, is possible, even at long distances [12,13], but only after fire has passed its initial ignition stage, and it may require multiple observation points to pinpoint a fire front on the terrain.
IR detection alone is efficient in the remote detection of hot temperatures, such as fire, through smoke and even some vegetation. Narrow field-of-view sensors are ideal to pinpoint hot points at long distances in a specific direction. However, apart from the amplitude, it is difficult to differentiate a distant fire from a close warm object, or even the sun on the horizon. As IR sensors have a specific field of view, the measured temperature equals the average temperature captured in its solid angle. Hence, a small, far away, and very hot spot may provide the same radiation sum as a closer, larger, and slightly warm object. Similarly, a large fire at a distance may provide the same radiation as a closer, smaller fire. For example, as seen in Figure 4, the captured radiation of a fire with intensity S1 at distance D1 is similar to the captured radiation of a fire with intensity S2 = 4 × S1 at distance D2 = 2 × D1. In the open air, wood flames flicker randomly, and they are usually modulated from 1 to 20 Hz [14].
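The size-versus-distance ambiguity follows directly from the inverse-square law: the irradiance reaching the sensor scales as S/D², so quadrupling the source intensity while doubling the distance leaves the reading unchanged. A minimal numeric check of the Figure 4 example:

```python
def received_irradiance(source_intensity: float, distance: float) -> float:
    """Irradiance at the sensor from a point-like source (inverse-square law).

    Units are arbitrary; only the S / D**2 scaling matters here.
    """
    return source_intensity / distance ** 2

s1, d1 = 1.0, 10.0          # small fire, close
s2, d2 = 4 * s1, 2 * d1     # 4x the intensity at 2x the distance

# Both fires produce the same reading, so amplitude alone
# cannot disambiguate size from distance.
print(received_irradiance(s1, d1) == received_irradiance(s2, d2))  # True
```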
This second signature can be used to differentiate an actual fire from a hot object in the background, complementing other sensing technologies.
Ideally, optical radiation sensors used for the detection of fire should detect broad spectral IR bands and also specific wavelengths (2.9 µm, 4.3 µm, …) in order to filter background radiation and false positives.
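The 1-20 Hz flicker signature can be exploited without extra hardware by analyzing a per-pixel temperature stream in the frequency domain. The sketch below (an illustration, not the authors' implementation) samples a pixel at 64 Hz, a rate within the refresh range of FIR array sensors such as the MLX90640, and measures how much of the AC energy falls inside the flicker band using a direct DFT:

```python
import math

def band_fraction(samples, sample_rate, f_lo=1.0, f_hi=20.0):
    """Fraction of AC spectral energy inside [f_lo, f_hi] Hz.

    Uses a direct DFT over the bins of interest; fine for the short
    windows (1-2 s) relevant to flame-flicker detection.
    """
    n = len(samples)
    mean = sum(samples) / n
    ac = [s - mean for s in samples]          # drop the DC component
    total = sum(v * v for v in ac) or 1e-12   # time-domain AC energy

    band = 0.0
    k_lo = max(1, int(f_lo * n / sample_rate))
    k_hi = int(f_hi * n / sample_rate)
    for k in range(k_lo, k_hi + 1):
        re = sum(v * math.cos(2 * math.pi * k * i / n) for i, v in enumerate(ac))
        im = sum(v * math.sin(2 * math.pi * k * i / n) for i, v in enumerate(ac))
        band += 2 * (re * re + im * im) / n   # one-sided spectrum, Parseval
    return band / total

fs, n = 64.0, 128
t = [i / fs for i in range(n)]
flame = [200 + 30 * math.sin(2 * math.pi * 7 * ti) for ti in t]  # 7 Hz flicker
static = [200.0] * n                                             # steady hot object

print(band_fraction(flame, fs))   # close to 1: energy concentrated in 1-20 Hz
print(band_fraction(static, fs))  # close to 0: no flicker energy
```

A hot but steady background (sun-warmed rock, exhaust pipe) scores near zero, while a flickering flame concentrates its AC energy in the band, giving a cheap second vote alongside the amplitude threshold.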
Sensors that can be used to detect fire vary in their detection coverage area, sensitivity versus distance, complexity, cost, and, most importantly, their reliability, including stability and performance over extended periods of time. Several authors have proposed the use of wireless sensor networks for forest fire detection [15][16][17] as a possible solution to help detect the occurrence of forest fires early. One of the main difficulties is the lack of low-cost thermal image sensors that can be used in combination with wireless sensor networks. This paper proposes a low-cost autonomous firefighting system combining a motorized water turret, an advanced sensory unit with a matrix of narrow beam far infrared (FIR) sensors, and a micro-controller. This paper also presents software-defined autonomous firefighting algorithms and techniques integrated into the proposed hardware system.

Materials and Methods
The simplified hardware functional block diagram is depicted in Figure 5. A firefighter nozzle was coupled to a motorized pan and tilt unit and a solenoid valve. A low-cost compact far infrared (FIR) thermal sensor array with a resolution of 32 × 24 and a field of view (FoV) of 55° × 35° (block diagram depicted in Figure 6) built by Melexis was connected to a 32-bit micro-controller with integrated wireless communication capability, which also controlled the pan and tilt unit and the solenoid valve, providing remote wireless monitoring and control. According to Melexis, the IR sensor manufacturer: "…This infrared (IR) sensor offers a cost-effective alternative to more expensive high-end thermal cameras. This 32 × 24 pixel device is suited to safety and convenience applications that include fire prevention systems … has a −40 °C to 85 °C operational temperature range and can measure object temperatures between −40 and 300 °C. Maintaining high levels of precision across its full measurement scale, this infrared sensor delivers a typical target object temperature accuracy of ±1 °C. It also exhibits superior noise performance. Unlike microbolometer alternatives, the sensor does not need frequent re-calibration, thus ensuring continuous monitoring and lowering the system cost. The Melexis MLX90640 is supplied in a compact, 4-pin TO39 package incorporating the requisite optics…" (source: https://www.melexis.com/en/product/MLX90640/Far-Infrared-Thermal-Sensor-Array).
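Once a frame is read from the 32 × 24 array, each pixel index must be converted into an angular offset so the pan and tilt unit can be pointed at it. A minimal sketch of that mapping, assuming the 55° × 35° FoV is split evenly among the pixels (the real optics introduce some distortion toward the edges):

```python
# Sensor geometry from the MLX90640 figures quoted above.
COLS, ROWS = 32, 24
FOV_H, FOV_V = 55.0, 35.0  # degrees

def pixel_to_angles(row: int, col: int) -> tuple:
    """Map a pixel (row, col) to (pan, tilt) offsets in degrees
    relative to the sensor's optical axis.

    Assumes an even angular split per pixel; offsets are measured
    from the pixel center, so the frame center maps to ~(0, 0).
    """
    pan = (col + 0.5 - COLS / 2) * (FOV_H / COLS)
    tilt = (row + 0.5 - ROWS / 2) * (FOV_V / ROWS)
    return pan, tilt

# The central pixels straddle the optical axis (half a pixel off-center)...
print(pixel_to_angles(12, 16))
# ...while a corner pixel sits near the edge of the 55 x 35 degree FoV.
print(pixel_to_angles(0, 0))
```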
In the first experiments, the IR sensor was mounted to the side of the nozzle, in the horizontal plane of the center of the jet, and moved (pan and tilt) in solidarity with it (Figure 7). In later experiments, the IR sensor was mounted under the nozzle, near the base, in the same vertical plane as the center of the jet, at a constant tilt, panning in solidarity with the nozzle. With this system, we could remotely control the pan and tilt and activate the water jet. The cone opening of the water jet was set manually at a constant position that provided up to 15-20 m of range with 7 bar of water pressure. The parabola and range of the water jet depend on the exit angle, pressure, aperture, and flow through the nozzle. A study was made to predict the shape and range of the water jet parabola based on water pressure and exit angle; however, as the water jet can be diverted and spread by wind, these variables alone cannot be used to predict the hit point of the jet. Figure 8 depicts a preview of the parabolic trajectory of a water jet under the following assumptions: the water jet acts as a stream of water spheres with a radius of 2.5 cm and a mass of 8 g each, launched at an angle of 10° and a speed of 40 m/s. Assuming a drag coefficient of 0.5 and no dispersion or wind, we should get a horizontal range of around 17.5 m, a maximum height of 1.2 m, and a time of flight of approximately 1 s. This time of flight presents a challenge for automated control, as every action on the pan and tilt unit has a delayed effect on the water hit point. Wind also presents an additional challenge, particularly at low pressures, as the water cone can spread randomly or be diverted. Water, either in the jet or as mist in the air, also attenuates or even blocks part of the view of the IR sensor, as depicted in Figure 9.
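The quoted trajectory figures can be reproduced numerically. The sketch below integrates the motion of one water sphere under the stated parameters (radius 2.5 cm, 8 g, 40 m/s at 10°, drag coefficient 0.5) with a simple Euler scheme; it is an illustrative model under the same no-wind, no-dispersion assumptions, not the authors' simulation code:

```python
import math

def simulate_jet(v0=40.0, angle_deg=10.0, mass=0.008, radius=0.025,
                 cd=0.5, rho_air=1.225, g=9.81, dt=1e-4):
    """Euler integration of a drag-affected projectile launched from
    ground level; returns (range_m, max_height_m, flight_time_s)."""
    area = math.pi * radius ** 2
    k = 0.5 * rho_air * cd * area / mass      # drag accel = k * |v| * v
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = t = max_y = 0.0
    while y >= 0.0:                           # integrate until splashdown
        speed = math.hypot(vx, vy)
        vx -= k * speed * vx * dt
        vy -= (g + k * speed * vy) * dt
        x += vx * dt
        y += vy * dt
        t += dt
        max_y = max(max_y, y)
    return x, max_y, t

rng, apex, tof = simulate_jet()
print(f"range = {rng:.1f} m, apex = {apex:.2f} m, flight = {tof:.2f} s")
```

With these inputs the integration lands in the same ballpark as the paper's figures (range in the high teens of meters, apex near 1.2 m, flight time around 1 s), and it makes the sensitivity to exit angle and pressure easy to explore.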
This presents additional challenges but also opportunities, as it can be used to locate the fire front, as can be seen later in the discussion section.
Simulations were made to visualize the solid angles and projection matrix of the IR sensor at a distance. Figure 10 depicts the simulation of a target at 10 m hit by a cylindrical jet launched with an exit angle of 13°, covering a parabolic trajectory with no dispersion. The far matrix denotes the view per pixel, and the radial lines denote the central plane of the sensor. The simulated FoV is 35°, with the sensor mounted at the tip of the nozzle. Figure 9 repeats the simulation from the point of view of the IR sensor mounted at different positions.
Table 1 describes the IR sensor full matrix versus pixel field of view (FoV) per plane, and Table 2 describes the covered width and height as a function of distance from the IR sensor.
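The coverage values in Table 2 follow from basic trigonometry: an angular field of view θ covers a strip of width 2·d·tan(θ/2) at distance d. A short sketch of that relationship for the 55° × 35° sensor, also estimating the per-pixel footprint under the simplifying assumption of an even angular split across the 32 × 24 pixels:

```python
import math

def coverage(distance_m, fov_deg):
    """Width (or height) of the strip covered by an angular
    field of view `fov_deg` at `distance_m` meters."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg / 2.0))

FOV_H, FOV_V = 55.0, 35.0
COLS, ROWS = 32, 24

for d in (5, 10, 15, 20):
    w = coverage(d, FOV_H)
    h = coverage(d, FOV_V)
    # Near-axis approximation of the footprint of a single pixel.
    pw = coverage(d, FOV_H / COLS)
    ph = coverage(d, FOV_V / ROWS)
    print(f"{d:2d} m: frame {w:5.2f} x {h:5.2f} m, pixel {pw:.2f} x {ph:.2f} m")
```

At the 15-20 m working range of the water jet, one pixel therefore covers a patch on the order of tens of centimeters, which is consistent with the close-range fire-pinpointing results reported below.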

Results
Initial field testing was performed with a small controlled fire on a field with dry vegetation, recently cut to a height of less than 15 cm, as shown in Figure 11. This provided us with a uniform flame size and radial spread. Ignition was started at a distance of around 20 m from the prototype, and the IR sensor was able to detect and locate the fire hot spots from the instant the fire started, triggering an alarm. The pan and tilt unit, as well as the fire nozzle, were tested with a manual remote control and worked as expected, both with the water jet on and off. It was confirmed that it was possible to put out a small fire by remotely and manually controlling the pan and tilt water turret while looking only at the IR sensor "image" and not the fire itself, proving that the system could be controlled by a human operator far away from the fire front.

These initial tests demonstrated that the IR sensor could differentiate fire from the background and generate real-time usable data regarding fire location at several frames per second. The IR sensor could also detect an ignition behind tall and dense vegetation several seconds before smoke and flames were visible to the naked eye, proving that even at a low resolution, a low-cost IR matrix sensor is able to detect and pinpoint a small fire at close range. Figure 12a displays a photo from the point of view of the IR sensor of a small fire front behind a bush, around 15 m away. Figure 12b depicts the data collected in real time from the IR sensor.
Several seconds after the ignition, the IR sensor can detect peak temperatures of 260 °C, even though no flames are visible to the naked eye, only smoke.
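This per-frame detection step can be sketched in a few lines of code (the 8x8 matrix resolution, the 100 °C alarm threshold, and the function names below are illustrative assumptions, not the actual firmware):

```python
# Sketch of hot-spot detection on a low-resolution FIR matrix frame.
# The 8x8 resolution and 100 C alarm threshold are illustrative assumptions.

def hottest_pixel(frame):
    """Return ((row, col), temperature) of the hottest pixel in a frame."""
    rows, cols = len(frame), len(frame[0])
    r, c = max(((i, j) for i in range(rows) for j in range(cols)),
               key=lambda rc: frame[rc[0]][rc[1]])
    return (r, c), frame[r][c]

def fire_alarm(frame, threshold_c=100.0):
    """Trigger an alarm when any pixel exceeds the temperature threshold."""
    (row, col), temp = hottest_pixel(frame)
    return {"alarm": temp > threshold_c, "row": row, "col": col, "temp_c": temp}

# Example: a cool 25 C background with one 260 C hot spot.
frame = [[25.0] * 8 for _ in range(8)]
frame[3][5] = 260.0
print(fire_alarm(frame))  # alarm raised, hot spot located at row 3, col 5
```

At several frames per second, running this check over consecutive frames is enough both to raise the early alarm and to feed a fire location to the aiming logic.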
After activating the water jet, the attenuation of the temperature at the spot hit by the water is clearly visible in Figure 13. By adjusting the minimum and maximum thresholds of the collected data prior to image conversion, it is possible to partially discern the water jet from the background, indicating that if the temperature of the water is known, thresholds could be defined to differentiate the water from the background and analyze its real trajectory.
Since automated operation was the main objective of this work, a simple fire-tracking algorithm was developed. This algorithm autonomously tracked the hottest point within the field of view of the sensor, and when the temperature surpassed a threshold, e.g., 100 °C, the solenoid valve was activated and the water jet was oriented until the hottest point was at a predefined offset from the center of the matrix frame. These offsets are needed to relate the water hit point to the IR sensor matrix.
These offsets were defined considering a fixed-pressure water input and an estimated range of the jet versus the position of the hottest pixel area in the IR matrix. Field tests were performed with the unit installed on top of a fire rapid-response vehicle equipped with a water tank and a pump, as shown in Figure 14. This autonomous mode was partially successful, confirming an expected limitation: the offset between the actual water hit point and the hottest visible point. While this algorithm is effective at close range and with high water pressure, it is prone to fail when the jet has low pressure and is aimed at a distant position, when wind deflects or disperses the water, or when the terrain is not flat. We confirmed that, depending on the position of the IR sensor, the parabolic water jet could obfuscate some hot spots by blocking the view of the sensor, as depicted in Figure 10.
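The tracking loop described above can be sketched as follows; the trigger threshold, the calibration offset, the step sizes, and the actuator interface (`pan_tilt`, `valve`) are hypothetical stand-ins for the real hardware drivers, not the actual firmware:

```python
# Sketch of the simple fire-tracking loop: find the hottest pixel, open the
# valve above a trigger temperature, and nudge the turret until the hot spot
# sits at a calibrated offset from the frame center. All constants and the
# actuator interface are illustrative assumptions.

TRIGGER_C = 100.0      # open the solenoid valve above this temperature
AIM_OFFSET = (2, 0)    # (rows, cols) offset of water hit point vs. frame center

def _sign(x):
    return (x > 0) - (x < 0)

def track_step(frame, pan_tilt, valve):
    rows, cols = len(frame), len(frame[0])
    # Locate the hottest pixel in the current frame.
    hot = max(((r, c) for r in range(rows) for c in range(cols)),
              key=lambda rc: frame[rc[0]][rc[1]])
    temp = frame[hot[0]][hot[1]]
    if temp <= TRIGGER_C:
        valve.close()
        return
    valve.open()
    # Target position: frame center shifted by the calibration offset.
    target = (rows // 2 + AIM_OFFSET[0], cols // 2 + AIM_OFFSET[1])
    # Nudge the turret one step toward aligning the hot spot with the target.
    pan_tilt.step(tilt=_sign(target[0] - hot[0]), pan=_sign(target[1] - hot[1]))
```

Calling `track_step` once per sensor frame yields a simple closed loop: the jet converges on the hot spot as long as the calibrated offset matches the real water trajectory, which is exactly the limitation observed in the field.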
New algorithms were developed, namely one in particular that takes into account the fact that the water jet might obfuscate the view of the IR sensor. The sensor was relocated to the underside of the nozzle, and further field testing was performed at a location where the only available "fuel" was contained in two adjacent small cylindrical metal baskets with a capacity of 3500 cc, which were filled with dry forest vegetation and placed 14.5 m from the sensor. Two tests were made:
• Test A: Fire was started and no water jet was turned on.
• Test B: Fire was started and the water jet was launched with an elevation of 40° and a pressure controlled to 2 bar, so that the jet would fall just before hitting the baskets.
In test B, the water droplets were large and dispersed, falling to the ground in the 10 to 14.5 m region. Wind also caused a slight left and right oscillation of the water jet, altering its trajectory. The baskets were refilled with a similar amount of fuel between experiments. The initial measured baseline temperatures of over 100 °C can be explained by the large metal objects in the background, exposed to strong solar radiation. Figure 15 depicts the evolution of peak temperatures in both tests.
Figure 15. Sensor obfuscation test: peak temperatures vs. time. Test A: fire was started and no water jet was turned on; Test B: fire was started and the water jet had a launch elevation of 40°, with pressure controlled to 2 bar, falling just before hitting the baskets.
As can be observed in Figure 15, tests A and B display a bell curve of the peak temperatures generated by the burning of the limited fuel. Test A is used as a reference for subsequent tests. Test B displays a curve similar to that of test A, but slightly attenuated; despite the odd peaks caused by the wind displacing the water jet, the overall attenuation of the measured temperature can be explained by the water falling between the sensor and the fire, effectively absorbing or dispersing some of the radiation and slightly obfuscating the fire from the sensor.
This led to the development of a new algorithm that should methodically be able to find and eliminate fire within the range of the water jet, independently of variations in water pressure, wind deflection, or spread of the jet. As the maximum range of the jet can be estimated by using a constant water pressure and an elevation of 45°, it is possible to tilt the IR sensor in such a way that a known line of the data array collected from the IR sensor in a given frame corresponds to the jet's maximum range, hence indicating that any hot spot above this line is out of range.
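This range check can be illustrated with the ideal ballistic range formula and a calibrated matrix row; the exit speed, the calibrated row, and the assumed image orientation (lower row index imaging farther ground) are illustrative assumptions:

```python
# Illustration of the range check described above. With a constant pressure
# and a 45-degree elevation, the jet's maximum range is roughly fixed, so one
# row of the FIR matrix can be calibrated to correspond to that range.
# The exit speed, calibrated row, and image orientation are assumptions.

def max_jet_range(exit_speed_mps, g=9.81):
    """Ideal (drag-free) ballistic range of a jet launched at 45 degrees."""
    return exit_speed_mps ** 2 / g

MAX_RANGE_ROW = 2  # matrix row calibrated to the jet's maximum range

def in_range(hot_row, max_range_row=MAX_RANGE_ROW):
    """A hot spot in a row above the calibrated line lies beyond the jet's
    reach; rows at or below it are within the water hit region."""
    return hot_row >= max_range_row

print(max_jet_range(14.0))  # roughly 20 m of ideal range at 14 m/s exit speed
print(in_range(5), in_range(0))  # True False
```

In the real system, drag, wind, and droplet dispersion shorten the jet, so the calibrated row is best obtained empirically at a fixed pump pressure rather than from the drag-free formula alone.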
Currently, in the experimental tests, the sensor is placed near the base, under the water jet, panning together with the nozzle while maintaining a fixed tilt. When a fire spot is detected, the nozzle is panned toward the fire and tilted to 45° (providing the maximum range for the jet). When the jet is activated, if the fire spot is within reach, i.e., between the sensor and the water hit point, then, by tilting the jet down, an IR column (usually the center one) will eventually be clear of hot spots, because at some point in that column the sensor will only "see" the water jet above and the ground below, indicating that the fire front is beyond the water hit point. By moving the jet back up and repeating the process in adjacent columns, it becomes possible to "sweep" and eliminate fire spots within the view and range of the jet. This combination of hardware and algorithm is now patent pending, and the algorithm is also under further research and development.
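A rough sketch of this column-sweep logic follows (not the patent-pending implementation; the fire threshold, the actuator callbacks, and the step bound are illustrative assumptions):

```python
# Rough sketch of the column-sweep logic: tilt the jet down until the
# inspected matrix column is clear of hot spots (the sensor then only "sees"
# water above and ground below), then move back up and repeat on the next
# column. Threshold, callbacks, and max_steps are illustrative assumptions.

FIRE_C = 100.0  # temperature above which a pixel is treated as fire

def column_clear(frame, col, threshold_c=FIRE_C):
    """True when no pixel in the given matrix column reads fire-level heat."""
    return all(row[col] < threshold_c for row in frame)

def sweep(columns_to_check, read_frame, tilt_down, tilt_up, max_steps=20):
    """Sweep each column: lower the jet until the column clears, then
    restore maximum range and continue with the adjacent column."""
    for col in columns_to_check:
        for _ in range(max_steps):
            if column_clear(read_frame(), col):
                break       # no reachable fire left along this bearing
            tilt_down()     # walk the water hit point toward the sensor
        tilt_up()           # restore maximum range before the next column
```

Each `tilt_down` step walks the water hit point toward the sensor, so a cleared column certifies that no reachable fire remains along that bearing, which is what makes the sweep robust to pressure variation and wind deflection.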

Discussion
Robotic firefighting systems are usually designed to cope with fire detection, control, and suppression, including the monitoring of hazardous conditions such as smoke and fire location, and even search and rescue. Fixed installations usually (re)act with alarms and fire sprinklers, but mobile systems can be equipped with water or foam hoses, and some can even travel into unsafe areas. Several authors have proposed autonomous firefighting robots [18][19][20], but the focus is mostly on mobile robot navigation and the location of fire spots, namely indoors. Some disaster relief robots, such as the "Water cannon Robot" developed by Mitsubishi Heavy Industries (MHI) [21], were designed to deal with extremely hazardous and difficult to reach locations, such as fires in petrochemical plants. However, these solutions are composed of multi-robot systems which, according to the manufacturer, can move autonomously; the manufacturer does not specify whether they are fully autonomous or require an operator to control the jet of water or foam. Public announcements have been made regarding military firefighting robots, such as the "Tactical Hazardous Operations Robot" (THOR) developed for the U.S. Navy, a humanoid robot capable of using hoses, opening doors, and moving across unstable floors on a ship under operator control [22]. The Thermite RS1 and RS3 [23] are other examples of robots created for the U.S. Army, based on a small modified tank that includes a controllable nozzle and hose; these are also remotely controlled by an operator. Finally, the "Fire Ox" [24] is another example of a remotely controllable mobile robot with a built-in tank and a controllable nozzle.
Our system differs from and innovates on the above solutions because it was designed to be mounted in a fixed location, specifically to address wildfires, with the main objective of being fully autonomous (even though manual remote operation is possible). The design focused on the strengths and limitations of the motorized pan and tilt unit, the water jet range, the IR matrix sensor field of view, and possible disturbances, such as those caused by wind or by a fire front out of reach, in such a way that a known ground area can be monitored and protected. This solution does not aim to halt a massive fire front or extinguish a large forest fire crossing into an urban interface. However, the system can issue early warning alarms when a fire front is near, and in autonomous mode it can locate the fire and methodically attempt to suppress it. As part of the algorithm, when the fire is very close but still out of reach, the system activates the jet and, with a scanning motion, sprays the ground with water in order to preemptively minimize the probability of an ignition caused by a projection. If applied in a distributed manner and combined with proper terrain cleaning and distancing from trees, the system should be able to protect perimeters, mitigating the effects of fire and increasing the chance for evacuation.

Conclusions
This work led us to conclude that it is possible to develop innovative, low-cost mechanisms and techniques to remotely or autonomously detect and combat fires in a wide variety of environments. By combining a motorized pan and tilt water turret, an advanced sensory unit with a matrix of narrow beam far infrared (FIR) sensors, and a micro-controller, it is possible to detect and fight an approaching fire front, either in fully autonomous mode or in semi-autonomous mode in cooperation with firefighters. The developed solution also provides an opportunity for firefighters to minimize their exposure to harsh conditions, such as high temperatures and heat radiation levels, by avoiding manual control and handling.
If applied in a distributed manner near at-risk areas, such as forests and public and private infrastructures (villages, tourist sites, and large outdoor events) that require the protection of their surrounding perimeters, the proposed system can be used to delay the propagation of an emerging fire, mitigating its effects, increasing the chance for evacuation, and avoiding the potential disasters caused by wild forest fires.
Experimental field results in a variety of natural environments validated the technical approach, indicating that when the fire is within the field of view of the FIR sensor and within the range of the water jet, such a system can provide an early fire detection alarm and autonomously mitigate an approaching fire front, minimizing the risk to human lives as well as the destruction of assets, infrastructures, forests, and wildlife, or in a worst-case scenario, increasing the chance for safe evacuation. The algorithms are also under further R&D, as they can be improved to provide optimized fire detection and autonomous firefighting techniques using several coordinated autonomous systems. Self-learning algorithms can further enhance the performance of this fire detection and control system, thus providing more comprehensive protection and optimizing the use of water.

Patents
International Patent Request: PCT/PT2020/050026-"Autonomous Portable Firefighting System and Respective Method of Operation".