Current Trends in Wildfire Detection, Monitoring and Surveillance
Abstract
1. Introduction
2. Historical and Theoretical Background to Wildfire Detection and Surveillance
- If S is interpreted as belonging to the wildfire and E belongs to the wildfire, we have a Correct Detection or True Positive, represented by the set TP = π⁻¹(S) ∩ E.
- If S is interpreted as belonging to the wildfire and E does not belong to the wildfire, we have a False Detection or False Positive, represented by the set FP = π⁻¹(S) ∩ Eᶜ.
- If S is interpreted as not belonging to the wildfire and E belongs to the wildfire, we have a Missed Detection or False Negative, represented by the set FN = (π⁻¹(S))ᶜ ∩ E.
- If S is interpreted as not belonging to the wildfire and E does not belong to the wildfire, we have a Correct Rejection or True Negative, represented by the set TN = (π⁻¹(S))ᶜ ∩ Eᶜ. (A minimal pixel-mask sketch of these four sets is given after this list.)
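To make the four outcome sets concrete, the following is a minimal sketch (in Python, with NumPy) of how they can be computed at the pixel level from a binary detection mask and a binary ground-truth mask. The mask names and the toy 3 × 3 example are illustrative only and are not taken from any cited system.

```python
import numpy as np

def confusion_sets(predicted, ground_truth):
    """Split a pixel grid into the four detection outcomes.

    predicted    : boolean mask, True where the observer reports wildfire (plays the role of pi^-1(S))
    ground_truth : boolean mask, True where wildfire is actually present (the set E)
    Returns boolean masks for TP, FP, FN, TN.
    """
    tp = predicted & ground_truth      # correct detections
    fp = predicted & ~ground_truth     # false detections
    fn = ~predicted & ground_truth     # missed detections
    tn = ~predicted & ~ground_truth    # correct rejections
    return tp, fp, fn, tn

# Toy 3x3 example: one smoke pixel detected, one missed, one false alarm.
pred = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 0]], dtype=bool)
truth = np.array([[1, 0, 0], [0, 0, 0], [0, 1, 0]], dtype=bool)
tp, fp, fn, tn = confusion_sets(pred, truth)
print(tp.sum(), fp.sum(), fn.sum(), tn.sum())  # 1 1 1 6
```

Counting the elements of these four sets yields the TP, FP, FN, and TN values used by the evaluation measures discussed in Section 3.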
3. Wildfire Detection Evaluation, Validation and Testing
- Processing level, which focuses on time and space efficiency. A system performs better if it operates faster, requires less processing time and hardware, uses less disk space, and consumes less energy (i.e., is a “green” system).
- Detection level, which focuses on detection results. A system performs better if it achieves a high percentage of correct detections and correct non-detections, while keeping the percentage of missed detections and false detections low. This level of evaluation usually requires a ground-truth database, where a reference observer—typically a human—performs the same detection task to generate the benchmark dataset.
- Global binary evaluation of the observer on a series of test images, where the observer is treated as a simple binary classifier for each image, deciding whether wildfire is present or not, regardless of its location in the image. In this evaluation, the ground truth consists of images divided into two subsets: one with images where wildfire is present (set E) and one with images where wildfire is not present (the complement Eᶜ).
- Local evaluation at the image level, where the goal is to assess whether the observer correctly identifies the location of wildfire indicators, typically smoke in the early stages of a fire, and accurately recognizes the pixels where smoke is present in the image. For this evaluation, the ground truth images should be manually segmented, in the simplest case into two binary regions: areas where smoke is present and areas where it is not. The set E now contains the smoke pixels, while its complement Eᶜ contains all other, non-smoke pixels. Since smoke at its boundary is a semi-transparent phenomenon, fuzzy evaluation has also been proposed, where the smoke is segmented as a fuzzy set with membership degrees increasing from 0 to 1 [16].
- Global comprehensive evaluation of the observer based on local evaluation measures, where each image is first evaluated locally and the global evaluation of the observer is then derived from the per-image evaluation values. In this case, graphical representations are particularly useful.
- Precision–Recall Graph, which shows the trade-off between precision (how many of the detected smoke regions are correct) and recall (how much of the actual smoke is detected) across different decision thresholds. It is often used in situations where classes are heavily imbalanced [20]. As a single measure, Average Precision (AP) is used: AP is the area under the Precision–Recall curve for a single class (e.g., “smoke”) and, in practice, is usually computed from interpolated precision values at fixed recall thresholds (e.g., every 0.1 from 0 to 1). Another single measure connected with the Precision–Recall Graph is mean Average Precision (mAP); in single-class problems such as smoke detection, mAP averages AP over multiple IoU (Intersection-over-Union) thresholds, providing a single number that summarizes overall detection performance. (A minimal computation sketch follows after this list.)
- DER (Detection Evaluation Radar Graph), which presents several measures together in the form of a radar graph [22]. Typical measures are True Positive Rate, True Negative Rate, Accuracy, Positive Predictive Value, True Negative Accuracy, and (MCC + 1)/2.
- Gain and Lift Charts, which show visually how much the observer (classifier) is better than random guessing (the baseline) [23].
- A Confusion Matrix Heatmap, which visually represents the distribution of actual and predicted classes. In the confusion matrix (Figure 4—left), the values of TP, FP, FN, and TN are entered and color-coded to visualize classification quality.
- The Kolmogorov–Smirnov (KS) test, which graphically shows the maximum difference between the cumulative distributions of positive and negative classifications. The ROC AUC score has values between 0.5 and 1, while the KS statistic ranges from 0.0 to 1.0; therefore, some researchers suggest that the KS test is more appropriate for evaluating binary classifiers [24].
- Histogram of Predicted Scores, which compares the distributions of classifier scores for the positive and negative classes and offers an intuitive visualization of the score separation between them. Such histograms are closely related to the ROC curve: a good classifier has well-separated positive and negative histograms [25].
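As a concrete illustration of the Precision–Recall measures mentioned above, the following is a minimal sketch that computes a Precision–Recall curve and an 11-point interpolated AP from per-detection confidence scores and binary labels. The variable names and toy data are illustrative; a full object-detection mAP would additionally require IoU-based matching of predicted and ground-truth boxes, which is omitted here.

```python
import numpy as np

def precision_recall_curve(scores, labels):
    """Precision and recall at every score threshold, processed in descending score order.

    scores : detection confidences; labels : 1 = true smoke, 0 = not smoke.
    """
    order = np.argsort(-np.asarray(scores))
    labels = np.asarray(labels)[order]
    tp = np.cumsum(labels)              # true positives accumulated so far
    fp = np.cumsum(1 - labels)          # false positives accumulated so far
    precision = tp / (tp + fp)
    recall = tp / max(labels.sum(), 1)
    return precision, recall

def average_precision_11pt(precision, recall):
    """11-point interpolated AP: mean of the maximum precision at recall >= r, r = 0.0 ... 1.0."""
    ap = 0.0
    for r in np.linspace(0.0, 1.0, 11):
        mask = recall >= r
        ap += precision[mask].max() if mask.any() else 0.0
    return ap / 11.0

# Toy example: 6 detections sorted by confidence, 4 of them are real smoke.
scores = [0.95, 0.90, 0.80, 0.70, 0.60, 0.40]
labels = [1, 1, 0, 1, 0, 1]
p, r = precision_recall_curve(scores, labels)
print(round(average_precision_11pt(p, r), 3))  # ≈ 0.864 for this toy data
```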
4. Current Approaches to Wildfire Detection
4.1. Camera Types Used in Automatic Video-Based Wildfire Surveillance Systems
4.2. Detection Algorithms in Automatic Video-Based Wildfire Surveillance Systems
- Classic, traditional algorithms based on standard image processing and analysis techniques.
- Newer algorithms based on machine learning and deep learning techniques.
4.2.1. Classic Algorithms
4.2.2. Machine Learning and Deep Learning Based Algorithms
Evaluation * | Dataset | Algorithm Specifications | YOLO Type |
---|---|---|---|
AP: 71.0 F1: 70.0 | 139 video sequences from an operational wildfire surveillance system, annotated by the authors | Multichannel images based on temporal and contextual information, detection, small fire detection on low-quality images [57] | YOLOv5 2023 |
mAP: 70.9 F1: 68.0 | 21,136 images derived from public datasets and online sources, contains forest, indoor, urban, and traffic fires | Integration of separable vision transformer block in the final layer of the backbone network, depthwise and pointwise self-attention mechanism [58] | YOLOv5 2022 |
mAP: YOLOv5: 93.5 YOLOv6: 83.9 YOLOv8: 91.0 | Custom-assembled data set containing images collected from various Internet platforms and online datasets | Evaluation of different YOLO models on a diverse image dataset representing different wildfire cases [66] | YOLOv5 YOLOv6 YOLOv8 2024 |
ACC: 90.1 R (TPR): 96.1 P (PPV): 93.6 F1: 94.8 | 14,400 images collected and annotated by the authors | Reduced model size, multiscale feature extraction for small object detection, coordinated attention mechanism to focus model on regions where detections are more likely [67] | YOLOv5 2022 |
ACC: 96.0 R (TPR): 97.86 P (PPV): 94.61 F1: 95.94 | 4398 images collected and annotated by the authors | A new auto-annotation scheme proposed based on the edge detection, new wildfire dataset [68] | YOLOv5 2023 |
smoke: AP: 8.8; R(TPR): 10.0; P(PPV): 29.0 big smoke: AP: 81.0; R(TPR): 80.0; P(PPV): 91.0 | Wildfire smoke dataset [69], Boreal Forest fire data; annotated as “smoke” and “big smoke” classes | Transfer learning, freezing backbone layers. Smoke-based wildfire detection in boreal forests [56] | YOLOv5 2023 |
smoke: AP: 94.3 fire: AP: 76.5 | Aerial flame dataset collected from different YouTube videos, FLAME2 dataset [70], annotated as “smoke” and “fire” | Transfer learning applied in two phases: feature extraction and fine-tuning, varying number of frozen layers in backbone and neck of the YOLOv5 model [55] | YOLOv5 2024 |
mAP: 91.6 P (PPV): 89.7 R (TPR): 84.6 | Created from publicly available images, containing forest fire images, small fires, smoke patterns and fire spread images | Introducing lightweight modules to enhance contextual understanding with reduced computational overhead, optimized for edge deployment [60] | YOLOv11n 2025 |
mAP: 71.62 R (TPR): 96.31 P (PPV): 3.39 F1: 6.55 | Data from aerial, infrared and webcam sources, annotated as “smoke” and “fire” classes | YOLO-NAS architecture with ADAM/ADAMW optimizer, data integration from diverse sources (aerial, infrared, satellite, webcam) [71] | YOLO-NAS 2024 |
mAP: 86.8 R (TPR): 82.1 P (PPV): 87.6 F1: 84.8 | Fire and smoke dataset, 9796 images taken in various conditions, perspectives, and distances. | Hyperparameter tuning, one-factor-at-a-time analysis of individual parameter contribution to model accuracy [72] | YOLOv8 2024 |
D-Fire: mAP: YOLOv5: 79.5 YOLOv7: 80.4 YOLOv8: 79.7 WSDY mAP: YOLOv5: 95.9 YOLOv7: 96.0 YOLOv8: 98.1 | D-Fire dataset [73] 21,527 images (only fire, smoke, smoke and fire, no fire or smoke categories), WSDY dataset [74] 737 smoke images | Evaluation of different YOLO models for detecting and localizing smoke and wildfires [59] | YOLOv5 YOLOv7 YOLOv8 2024 |
mAP: 95.19 R (TPR): 87.67 P (PPV): 91.37 | Dataset containing both real and synthetic smoke images [75], synthetic smoke images generated by inserting real or simulated smoke into forest backgrounds | Substituting Convolutional kernels with Omni-Dimensional Dynamic Convolution (ODOConv) multi-dimensional attention mechanism in the backbone of YOLOv8 model [76] | YOLOv8 2024 |
mAP: 77.5 | Drone thermal imaging dataset, collected and annotated by the authors | Lightweight YOLOv8-based model for real-time wildfire detection on drone thermal images. Partially Decoupled Lightweight Convolution (PDP) and a Residual PDP block for reduced computational cost and improved feature processing efficiency [77] | YOLOv8 2025 |
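The works summarized in the table above differ in datasets and architectural details, but most follow the same practical workflow: start from pretrained YOLO weights, fine-tune on a smoke/fire dataset, and evaluate with mAP-style measures. The sketch below illustrates that workflow using the ultralytics package; the dataset file smoke.yaml, the weight file, and the training settings are hypothetical placeholders rather than the configuration of any specific cited study.

```python
# Hedged sketch: fine-tune and run a YOLO detector on a smoke dataset.
# Assumes the `ultralytics` package; `smoke.yaml` and `camera_frame.jpg` are
# hypothetical placeholders, not artifacts of any cited work.
from ultralytics import YOLO

# Start from pretrained COCO weights (transfer learning, as in several table entries).
model = YOLO("yolov8n.pt")

# Train on a dataset described by a YOLO-format YAML (image paths + class names, e.g. "smoke", "fire").
model.train(data="smoke.yaml", epochs=50, imgsz=640, freeze=10)  # freeze early backbone layers

# Evaluate: reports precision, recall, mAP@0.5 and mAP@0.5:0.95 on the validation split.
metrics = model.val()
print(metrics.box.map50, metrics.box.map)

# Inference on a new surveillance frame; keep only reasonably confident detections.
results = model.predict("camera_frame.jpg", conf=0.25)
for box in results[0].boxes:
    print(int(box.cls), float(box.conf), box.xyxy.tolist())
```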
5. Integration of Wildfire Detection in Wildfire Monitoring and Surveillance Systems
5.1. GIS-Centric Architecture and System Integration
- Data ingestion modules acquiring video-based detection alerts, meteorological data, fuel moisture estimates, and remote sensing products.
- Processing modules implementing wildfire risk index calculations and fire spread simulations.
- Visualization modules based on OpenLayers and AR platforms, providing interactive map-based and video-enhanced interfaces.
- A centralized spatial database supporting dynamic updates and synchronized access to all data layers (a minimal structural sketch of these modules is given below).
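As a rough illustration of this GIS-centric arrangement, the sketch below models the shared spatial database as a simple in-memory store into which ingestion and processing modules write geo-referenced features. All class and field names are illustrative assumptions; a production system would use an actual spatial database such as PostGIS.

```python
# Hedged sketch of the GIS-centric integration idea: independent modules write their
# outputs as geo-referenced features into one shared store, which the visualization
# layer then reads. Names and values below are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Feature:
    kind: str            # e.g. "detection_alert", "risk_index", "spread_perimeter"
    lon: float
    lat: float
    properties: dict
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

class SpatialStore:
    """Central store shared by all modules; a real system would use a spatial database."""
    def __init__(self):
        self.layers = {}  # layer name -> list of Feature objects

    def add(self, feature):
        self.layers.setdefault(feature.kind, []).append(feature)

    def layer(self, kind):
        return self.layers.get(kind, [])

store = SpatialStore()
# Ingestion module: a video-based detection alert arrives with camera id and confidence.
store.add(Feature("detection_alert", 16.44, 43.51, {"camera": "cam-07", "confidence": 0.82}))
# Processing module: a risk index computed for the same area.
store.add(Feature("risk_index", 16.44, 43.51, {"index": "high"}))
# Visualization module: read everything relevant for the map view.
for f in store.layer("detection_alert"):
    print(f.kind, f.lon, f.lat, f.properties)
```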
5.2. Integration of Fire Risk Indices
- Daily updates with full automation.
- Risk visualization via Web-GIS and public panels (AdriaFireRiskPanels).
- Specialized components for eruptive fire risk in steep terrains and fuel moisture modeling.
5.3. GIS-Based Fire Spread Simulation
- Remote execution on dedicated servers for rapid computation.
- Support for variable ignition sources, including both points and areas.
- Incorporation of wind influence adjusted for local topography.
- Optional modeling of suppression barriers or natural fire breaks.
- Generation of time-stepped perimeter outputs suitable for dynamic visualization (a toy spread-simulation sketch follows below).
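To illustrate the idea of time-stepped, wind-influenced spread outputs, the following is a toy cellular-automaton sketch. It is not the propagation model used by the systems described here; the grid, probabilities, and wind handling are illustrative assumptions only.

```python
# Toy cellular automaton: burning cells ignite neighbours with a wind-biased probability.
import numpy as np

UNBURNED, BURNING, BURNED = 0, 1, 2

def step(grid, p_spread, wind):
    """One time step of the toy spread model."""
    new = grid.copy()
    rows, cols = grid.shape
    # Four-neighbour offsets; a crude wind factor boosts spread in the wind direction.
    offsets = {(-1, 0): 1.0, (1, 0): 1.0, (0, -1): 1.0, (0, 1): 1.0}
    offsets[wind] = 2.0
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != BURNING:
                continue
            new[r, c] = BURNED
            for (dr, dc), factor in offsets.items():
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and grid[rr, cc] == UNBURNED:
                    if np.random.rand() < min(1.0, p_spread * factor):
                        new[rr, cc] = BURNING
    return new

grid = np.zeros((20, 20), dtype=int)
grid[10, 10] = BURNING                               # point ignition source
for t in range(5):                                   # time-stepped perimeters
    grid = step(grid, p_spread=0.3, wind=(0, 1))     # wind blowing toward +x
    print(f"t={t + 1}: burning cells = {(grid == BURNING).sum()}")
```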
5.4. Augmented Reality in Wildfire Surveillance
- Estimation of smoke plume dimensions and distances.
- Overlay of map features—such as infrastructure, topography, or danger zones—onto video.
- Real-time visualization of fire perimeters in the camera’s field of view (a minimal geometric sketch of such overlays follows below).
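The basic geometry behind such overlays is a mapping from geographic coordinates to image coordinates. The sketch below shows a deliberately simplified version: it computes the bearing from the camera to a map feature and maps it to a horizontal pixel column given the camera azimuth and field of view. The coordinates and camera parameters are illustrative, and a real AR overlay would also account for elevation, camera tilt, and lens distortion. The inverse mapping, from a clicked pixel back to a bearing, underlies the click-to-locate functionality described in Section 5.5.

```python
# Hedged sketch: project a geographic point into a horizontal pixel column of a camera image.
import math

def bearing_deg(cam_lat, cam_lon, lat, lon):
    """Initial bearing from the camera to the target point, in degrees from north."""
    d_lon = math.radians(lon - cam_lon)
    lat1, lat2 = math.radians(cam_lat), math.radians(lat)
    y = math.sin(d_lon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def pixel_column(target_bearing, cam_azimuth, hfov_deg, image_width):
    """Map a bearing into a pixel column; None if the point is outside the field of view."""
    offset = (target_bearing - cam_azimuth + 540.0) % 360.0 - 180.0  # -180..180 relative to view centre
    if abs(offset) > hfov_deg / 2.0:
        return None
    return int((offset / hfov_deg + 0.5) * image_width)

# Illustrative camera looking east (azimuth 90°) with a 60° field of view and a 1920-px-wide frame.
b = bearing_deg(43.508, 16.440, 43.515, 16.470)
print(pixel_column(b, cam_azimuth=90.0, hfov_deg=60.0, image_width=1920))
```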
5.5. Web-Based Interface for Integrated Wildfire Monitoring
- Monitoring of live surveillance feeds and automated detection alerts.
- Visualization of current wildfire risk levels and simulated fire spread outputs.
- Interactive tools for initiating simulations and adjusting scenario parameters.
- Display of supporting layers, such as weather conditions, terrain data, and system logs.
- Click-to-locate functionality, allowing operators to click on a video feed and instantly highlight the corresponding location on the map.
5.6. Summary and Outlook
- Modular architectures centered on GIS platforms.
- Fine-resolution, dynamically updated fire danger indices.
- Predictive fire spread simulations calibrated to local terrain and conditions.
- Augmented visual interfaces that link video surveillance with spatial context.
- Interactive web-based dashboards designed for operational use.
6. Gaps in Wildfire Monitoring and Surveillance
- limited spatial coverage,
- environmental conditions affecting visibility,
- data processing limitations, and
- difficulties integrating with other monitoring platforms, all of which hinder their overall efficiency.
7. Synthesis and Future Directions Including Emerging Technologies in Wildfire Monitoring and Surveillance
- Visible/NIR cameras: Detect smoke during the day and flame glow at night; effective for long-range smoke detection but limited in detecting heat.
- Infrared (IR) cameras: Detect heat (hotspots) and can work in low visibility; however, they are costly and less effective if the fire source is obscured, especially in summer when temperature contrast is low.
- Classic algorithms that rely on predefined visual features like color, motion, and texture to detect smoke or flames. They use handcrafted rules and often combine multiple sub-algorithms to reduce false alarms.
- Machine Learning (ML) and Deep Learning (DL) methods learn features directly from data, offering higher accuracy. CNNs, transfer learning, and attention mechanisms are widely used. These models require large, diverse datasets and computational resources. Many advanced ML methods now rely on YOLO (You Only Look Once) object detection algorithms due to their real-time efficiency. Newer YOLO variants (v5–v11) are adapted for wildfire detection using lightweight architectures, attention modules, and domain-adaptive learning. Datasets vary widely, making model comparisons difficult. A key challenge remains in detecting distant or low-visibility smoke.
- Hybrid Systems that combine classic detection based on digital image processing and analysis methods with deep learning refinement. They have proven effective, particularly in reducing false alarms and maintaining real-time capability (a minimal two-stage sketch follows below).
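As an illustration of such a hybrid arrangement, the sketch below uses a classic stage (background subtraction plus a grey, low-saturation colour filter, via OpenCV) to propose candidate regions, followed by a learned classifier that confirms them. The classify_crop function is a hypothetical stand-in for a trained CNN, and the thresholds and colour ranges are illustrative assumptions, not the configuration of any cited system.

```python
# Hedged sketch of a hybrid pipeline: classic pre-filter proposes regions, DL stage confirms.
import cv2
import numpy as np

bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=300, varThreshold=25)

def candidate_regions(frame, min_area=400):
    """Classic stage: moving, low-saturation (greyish) regions are smoke candidates."""
    motion = bg_subtractor.apply(frame)
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    greyish = cv2.inRange(hsv, (0, 0, 80), (180, 60, 255))  # low saturation, mid-to-high value
    mask = cv2.bitwise_and(motion, greyish)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

def classify_crop(crop) -> float:
    """Hypothetical DL stage: return a smoke probability for the crop (placeholder)."""
    return 0.5  # a real system would run a trained CNN here

def process_frame(frame, threshold=0.8):
    alarms = []
    for (x, y, w, h) in candidate_regions(frame):
        crop = frame[y:y + h, x:x + w]
        if classify_crop(crop) >= threshold:
            alarms.append((x, y, w, h))
    return alarms

# Example on a synthetic frame (a real deployment would read frames from the camera stream).
frame = np.zeros((480, 640, 3), dtype=np.uint8)
print(process_frame(frame))
```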
Author Contributions
Funding
Conflicts of Interest
Abbreviations
ACC | Accuracy |
AP | Average Precision |
AUC | Area Under the Curve |
CNN | Convolutional Neural Network |
FNR | False Negative Rate (Miss Rate) |
FPR | False Positive Rate (Fallout) |
F1 | F1 Score |
GPGPU | General-Purpose Graphics Processing Units |
GSConv | Ghost-Shuffle Convolution |
JACC | Jaccard (Tanimoto Similarity) |
LWIR | Long-wavelength Infrared |
mAP | mean Average Precision |
MCC | Matthews Correlation Coefficient |
MWIR | Mid-wavelength Infrared |
NIR | Near Infrared |
R | Recall (True Positive Rate) |
P | Precision (Positive Predictive Value) |
PPV | Positive Predictive Value |
ROC | Receiver Operating Characteristics |
SCAM-SCNN | Spatial and Channel Attention Modularized CNN architecture |
TNAC | True Negative Accuracy (Inverse Precision) |
TNR | True Negative Rate (Specificity, Inverse Recall) |
TPR | True Positive Rate (Sensitivity, Recall) |
UAV | Unmanned Aerial Vehicle |
YOLO | You Only Look Once algorithm |
References
- Rjoub, D.; Alsharoa, A.; Masadeh, A. Unmanned-Aircraft-System-Assisted Early Wildfire Detection with Air Quality Sensors. Electronics 2023, 12, 1239. [Google Scholar] [CrossRef]
- Ko, A.; Lee, N.M.Y.; Sham, R.P.S.; So, C.M.; Kwok, S.C.F. Intelligent Wireless Sensor Network for Wildfire Detection. WIT Trans. Ecol. Environ. 2012, 158, 137–148. [Google Scholar] [CrossRef]
- Töreyin, B.U.; Çetin, A.E. Wildfire Detection Using LMS Based Active Learning. In Proceedings of the IEEE International Conference on Acoustics Speech and Signal Processing, Taipei, Taiwan, 19–24 April 2009; p. 1461. [Google Scholar]
- Štula, M.; Krstinić, D.; Šerić, L. Intelligent Forest Fire Monitoring System. Inf. Syst. Front. 2011, 14, 725–739. [Google Scholar] [CrossRef]
- Stipaničev, D.; Šerić, L.; Braović, M.; Krstinić, D.; Jakovčević, T.; Štula, M.; Bugarić, M.; Maras, J. Vision Based Wildfire and Natural Risk Observers. In Proceedings of the 3rd International Conference on Image Processing Theory, Tools and Applications (IPTA), Istanbul, Turkey, 15–18 October 2012; pp. 37–42. [Google Scholar] [CrossRef]
- Lindsay, P.; Norman, D.A. Human Information Processing: An Introduction to Psychology; Academic Press: New York, NY, USA, 1972. [Google Scholar]
- Bennett, B.M.; Hoffman, D.D.; Prakash, C. Observer Mechanics—A Formal Theory of Perception; Public Domain; Academic Press Inc.: Cambridge, MA, USA, 1989. [Google Scholar]
- Bennett, B.M.; Hoffman, D.D.; Prakash, C.; Richman, S. Observer theory, Bayes theory, and psychophysics. In Perception as Bayesian Inference; Knill, D., Richards, W., Eds.; Cambridge University Press: Cambridge, UK, 1996; pp. 163–212. [Google Scholar]
- Stipaničev, D. Automatic Surveillance Methods. In Encyclopedia of Wildfires and Wildland-Urban Interface (WUI) Fires; Manzello, S.L., Ed.; Springer: Berlin/Heidelberg, Germany, 2019. [Google Scholar] [CrossRef]
- Bennett, B.M.; Hoffman, D.D.; Prakash, C. Perception and Evolution. In Perception and the Physical World: Psychological and Philosophical Issues in Perception; Heyer, D., Mausfeld, R., Eds.; John Wiley & Sons: Hoboken, NJ, USA, 2002; pp. 229–245. [Google Scholar]
- Yuille, A.L.; Bülthoff, H.H. Bayesian Decision Theory and Psychophysics. In Perception as Bayesian Inference; Knill, D., Richards, W., Eds.; Cambridge University Press: Cambridge, UK, 1996; pp. 163–212. [Google Scholar]
- Hecht-Nielsen, R. Cogent Confabulation. Neural Netw. 2005, 18, 111–115. [Google Scholar] [CrossRef] [PubMed]
- Hecht-Nielsen, R. The mechanism of Thought. In Proceedings of the 2006 IEEE International Joint Conference on Neural Network Proceedings, Vancouver, BC, Canada, 16–21 July 2006; pp. 419–426. [Google Scholar]
- Zhang, Y.J. A Survey of Evaluation Methods for Image Segmentation. Pattern Recognit. 1996, 29, 1335–1346. [Google Scholar] [CrossRef]
- Bodrožić, L.; Stipaničev, D.; Štula, M. Observer Network and Forest Fire Detection. Inf. Fusion 2011, 12, 160–175. [Google Scholar] [CrossRef]
- Jakovcevic, T.; Šerić, L.; Stipaničev, D.; Krstinić, D. Wildfire smoke-detection algorithms evaluation. In Proceedings of the VI International Conference on Forest Fire Research, Coimbra, Portugal, 15–18 November 2010. [Google Scholar]
- Powers, D.M.W. Evaluation: From Precision, Recall and F-measure to ROC, Informedness, Markedness and Correlation. J. Mach. Learn. Technol. 2011, 2, 37–63. [Google Scholar]
- Bown, W.C. Sensitivity and Specificity versus Precision and Recall, and Related Dilemmas. J. Classif. 2024, 41, 402–426. [Google Scholar] [CrossRef]
- Fawcett, T. An Introduction to ROC analysis. Pattern Recognit. Lett. 2006, 27, 861–874. [Google Scholar] [CrossRef]
- Saito, T.; Rehmsmeier, M. The precision-recall plot is more informative than the ROC plot when evaluating binary classifiers on imbalanced datasets. PLoS ONE 2015, 10, e0118432. [Google Scholar] [CrossRef]
- Martin, A.; Doddington, A.G.; Kamm, T.; Ordowski, M.; Przybocki, M. The DET Curve in Assessment of Detection Task Performance. In Proceedings of the Eurospeech ‘97, Rhodes, Greece, 22–25 September 1997; Volume 4, pp. 1895–1898. [Google Scholar]
- Bacevicius, M.; Paulauskaite-Taraseviciene, A. Machine Learning Algorithms for Raw and Unbalanced Intrusion Detection Data in a Multi-Class Classification Problem. Appl. Sci. 2023, 13, 7328. [Google Scholar] [CrossRef]
- Nisbet, R.; Elder, J.; Miner, G.D. Handbook of Statistical Analysis and Data Mining Applications; Academic Press: London, UK, 2009. [Google Scholar]
- Adeodato, P.J.; Melo, S.B. On the equivalence between Kolmogorov-Smirnov and ROC curve metrics for binary classification. arXiv 2016, arXiv:1606.00496. [Google Scholar] [CrossRef]
- Fernández, A.; García, S.; Galar, M.; Prati, R.C.; Krawczyk, B.; Herrera, F. Learning from Imbalanced Data Sets; Springer International Publishing: Cham, Switzerland, 2018. [Google Scholar]
- Günay, O.; Töreyin, B.U.; Köse, K.; Çetin, A.E. Entropy-Functional-Based Online Adaptive Decision Fusion Framework with Application to Wildfire Detection in Video. IEEE Trans. Image Process. 2012, 21, 2853. [Google Scholar] [CrossRef] [PubMed]
- Krstinić, D.; Stipaničev, D.; Jakovčević, T. Histogram-Based Smoke Segmentation in Forest Fire Detection System. Inf. Technol. Control. 2009, 38, 237–244. [Google Scholar]
- Günay, O.; Taşdemir, K.; Töreyin, B.U.; Çetin, A.E. Video Based Wild Fire Detection at Night. Fire Saf. J. 2009, 44, 860. [Google Scholar] [CrossRef]
- He, X.; Xu, X. Optimal Band Selection of Multispectral Sensors for Wildfire Detection. In Proceedings of the 2017 Sensor Signal Processing for Defence Conference (SSPD), London, UK, 6–7 December 2017; pp. 1–5. [Google Scholar] [CrossRef]
- Krstinic, D. Automatic Forest Fire Detection in Visible Spectra. In Proceedings of the 21st International Central European Conference on Information and Intelligent Systems, Varaždin, Croatia, 30 April 2010. [Google Scholar]
- Reig, I.B.; Serrano, A.; Vergara, L. Multisensor Network System for Wildfire Detection Using Infrared Image Processing. Sci. World J. 2013, 2013, 402196. [Google Scholar] [CrossRef] [PubMed]
- Günay, O.; Töreyin, B.U.; Köse, K.; Çetin, A.E. Online Adaptive Decision Fusion Framework Based on Entropic Projections onto Convex Sets with Application to Wildfire Detection in Video. Opt. Eng. 2011, 50, 77202. [Google Scholar] [CrossRef]
- Stipaničev, D.; Šerić, L.; Krstinić, D.; Bugarić, M.; Vuković, A. IPA Adriatic Holistic Forest Fire Protection Project—The year after. In Advances in Forest Fire Research 2018; Viegas, D.X., Ed.; ADAI: Coimbra, Portugal, 2018; pp. 88–98. [Google Scholar] [CrossRef]
- Thermography. Available online: https://en.wikipedia.org/wiki/Thermography (accessed on 20 August 2025).
- Stipaničev, D.; Štula, M.; Krstinić, D.; Šerić, L.; Jakovčević, T.; Bugarić, M. Advanced Automatic Wildfire Surveillance and Monitoring Network. In Proceedings of the VI International Conference on Forest Fire Research, Coimbra, Portugal, 15–18 November 2010. [Google Scholar]
- Ko, B.C. Wildfire Smoke Detection Using Temporospatial Features and Random Forest Classifiers. Opt. Eng. 2012, 51, 17208. [Google Scholar] [CrossRef]
- Labati, R.D.; Genovese, A.; Piuri, V.; Scotti, F. Wildfire Smoke Detection Using Computational Intelligence Techniques Enhanced With Synthetic Smoke Plume Generation. IEEE Trans. Syst. Man Cybern. Syst. 2013, 43, 1003–1012. [Google Scholar] [CrossRef]
- Ko, B.C.; Park, J.; Nam, J.-Y. Spatiotemporal Bag-of-Features for Early Wildfire Smoke Detection. Image Vis. Comput. 2013, 31, 786–795. [Google Scholar] [CrossRef]
- Mukhiddinov, M.; Abdusalomov, A.; Cho, J. A Wildfire Smoke Detection System Using Unmanned Aerial Vehicle Images Based on the Optimized YOLOv5. Sensors 2022, 22, 9384. [Google Scholar] [CrossRef]
- Jeong, M.; Park, M.; Nam, J.-Y.; Ko, B.C. Light-Weight Student LSTM for Real-Time Wildfire Smoke Detection. Sensors 2020, 20, 5508. [Google Scholar] [CrossRef]
- Tao, C.; Zhang, J.; Wang, P. Smoke Detection Based on Deep Convolutional Neural Networks. In Proceedings of the 2016 International Conference on Industrial Informatics—Computing Technology, Intelligent Technology, Industrial Information Integration (ICIICII), Wuhan, China, 3–4 December 2016. [Google Scholar] [CrossRef]
- González, A.; Zúñiga, M.; Nikulin, C.; Carvajal, G.; Cárdenas, D.; Pedraza, M.A.; Fernandez, C.A.; Munoz, R.I.; Castro, N.; Rosales, B.; et al. Accurate Fire Detection through Fully Convolutional Network. In Proceedings of the 7th Latin American Conference on Networked and Electronic Media (LACNEM 2017), Valparaiso, Chile, 6–7 November 2017; pp. 1–6. [Google Scholar] [CrossRef]
- Sharma, J.; Granmo, O.; Goodwin, M.; Fidje, J.T. Deep Convolutional Neural Networks for Fire Detection in Images. In Communications in Computer and Information Science; Springer Science+Business Media: Berlin/Heidelberg, Germany, 2017; p. 183. [Google Scholar]
- Hung, K.-M.; Chen, L.; Wu, J.-A. Wildfire Detection in Video Images Using Deep Learning and HMM for Early Fire Notification System. In Proceedings of the 12th International Congress on Advanced Applied Informatics (IIAI-AAI), Toyama, Japan, 7–11 July 2019; IEEE: New York, NY, USA, 2019. [Google Scholar]
- Zhang, A.; Zhang, A.S. Real-Time Wildfire Detection and Alerting with a Novel Machine Learning Approach. Int. J. Adv. Comput. Sci. Appl. 2022, 13, 0130801. [Google Scholar] [CrossRef]
- Almeida, J.S.; Huang, C.; Nogueira, F.G.; Bhatia, S.; de Albuquerque, V.H.C. EdgeFireSmoke: A Novel Lightweight CNN Model for Real-Time Video Fire–Smoke Detection. IEEE Trans. Ind. Inform. 2022, 18, 7889–7898. [Google Scholar] [CrossRef]
- Khan, S.; Muhammad, K.; Mumtaz, S.; Baik, S.W.; de Albuquerque, V.H.C. Energy-Efficient Deep CNN for Smoke Detection in Foggy IoT Environment. IEEE Internet Things J. 2019, 6, 9237–9245. [Google Scholar] [CrossRef]
- Zhang, Z.; Guo, Y.; Chen, G.; Xu, Z.-D. Wildfire Detection via a Dual-Channel CNN with Multi-Level Feature Fusion. Forests 2023, 14, 1499. [Google Scholar] [CrossRef]
- Shalan, A.; Walee, N.A.; Hefny, M.; Rahman, M.K. Improving Deep Machine Learning for Early Wildfire Detection from Forest Sensory Images. In Proceedings of the 5th International Conference on Artificial Intelligence, Robotics and Control (AIRC), Cairo, Egypt, 22–24 April 2024. [Google Scholar] [CrossRef]
- El-Madafri, I.; Peña, M.; Torre, N.O. Dual-Dataset Deep Learning for Improved Forest Fire Detection: A Novel Hierarchical Domain-Adaptive Learning Approach. Mathematics 2024, 12, 534. [Google Scholar] [CrossRef]
- Walee, N.A.; Shalan, A.; Kadlec, C.; Rahman, M.K. Optimizing Deep Learning Accuracy: Visibility Filtering Approach for Early Wildfire Detection in Forest Sensor Images. In Proceedings of the IEEE/ACIS 9th International Conference on Big Data, Cloud Computing, and Data Science (BCD), Kitakyushu, Japan, 16–18 July 2024; pp. 59–64. [Google Scholar] [CrossRef]
- Jegan, R.; Birajdar, G.K.; Chaudhari, S. Deep Residual Multi-resolution Features and Optimized Kernel ELM for Forest Fire Image Detection Using Imbalanced Database. Fire Technol. 2025. [Google Scholar] [CrossRef]
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. arXiv 2015, arXiv:1506.02640. [Google Scholar] [CrossRef]
- Khanam, R.; Hussain, M. YOLOv11: An Overview of the Key Architectural Enhancements. arXiv 2024, arXiv:2410.17725. [Google Scholar] [CrossRef]
- Vazquez, G.; Zhai, S.; Yang, M. Transfer Learning Enhanced Deep Learning Model for Wildfire Flame and Smoke Detection. In Proceedings of the 2024 International Conference on Smart Applications, Communications and Networking (SmartNets), Harrisonburg, VA, USA, 28–30 May 2024; pp. 1–4. [Google Scholar] [CrossRef]
- Raita-Hakola, A.; Rahkonen, S.; Suomalainen, J.; Markelin, L.; de Oliveira, R.A.; Hakala, T.; Koivumäki, N.; Honkavaara, E.; Pölönen, I. Combining YOLO V5 and Transfer Learning for Smoke-Based Wildfire Detection in Boreal Forests. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2023, XLVIII-1/W, 1771–1778. [Google Scholar] [CrossRef]
- Krstinić, D.; Šerić, L.; Ivanda, A.; Bugarić, M. Multichannel Data from Temporal and Contextual Information for Early Wildfire Detection. In Proceedings of the 2022 7th International Conference on Smart and Sustainable Technologies (SpliTech), Bol, Croatia, 20–23 June 2023; IEEE: New York, NY, USA, 2023. [Google Scholar]
- Xu, H.; Li, B.; Zhong, F. Light-YOLOv5: A Lightweight Algorithm for Improved YOLOv5 in Complex Fire Scenarios. Appl. Sci. 2022, 12, 12312. [Google Scholar] [CrossRef]
- Gonçalves, L.A.O.; Ghali, R.; Akhloufi, M.A. YOLO-Based Models for Smoke and Wildfire Detection in Ground and Aerial Images. Fire 2024, 7, 140. [Google Scholar] [CrossRef]
- Tao, Y.; Li, B.; Li, P.; Qian, J.; Qi, L. Improved Lightweight YOLOv11 Algorithm for Real-Time Forest Fire Detection. Electronics 2025, 14, 1508. [Google Scholar] [CrossRef]
- Casas, E.; Ramos, L.; Bendek, E.; Rivas, F. Assessing the Effectiveness of YOLO Architectures for Smoke and Wildfire Detection. IEEE Access 2023, 11, 96554–96583. [Google Scholar] [CrossRef]
- Li, T.; Zhao, E.; Zhang, J.; Hu, C. Detection of Wildfire Smoke Images Based on a Densely Dilated Convolutional Network. Electronics 2019, 8, 1131. [Google Scholar] [CrossRef]
- Gonçalves, A.M.; Brandão, T.; Ferreira, J.C. Wildfire Detection with Deep Learning—A Case Study for the CICLOPE Project. IEEE Access 2024, 12, 82095–82110. [Google Scholar] [CrossRef]
- CICLOPE Wildfire Monitoring System. Available online: https://www.ciclope.com.pt/ (accessed on 28 May 2025).
- Baptista, M.; Oliveira, B.; Paulo, C.; Joao, C.F.; Tomaso, B. Improved Real-Time Wildfire Detection Using a Surveillance System. In Proceedings of the World Congress on Engineering (WCE 2019), London, UK, 3–5 July 2019. [Google Scholar]
- Haroun, A.; Tagmouni, A.; Chergui, S.; Ladlani, R.; Bechar, A.; Melchane, S.; Kherbachi, S. Advancing Wildfire Detection: Using YOLOv8 on Multifarious Images. In Proceedings of the 9th International Conference on Control and Robotics Engineering (ICCRE), Osaka, Japan, 10–12 May 2024; pp. 359–364. [Google Scholar] [CrossRef]
- Wei, C.; Xu, J.; Li, Q.; Jiang, S. An Intelligent Wildfire Detection Approach through Cameras Based on Deep Learning. Sustainability 2022, 14, 15690. [Google Scholar] [CrossRef]
- Saleem, A.; Abdulhadi, S. Wildfire Detection System Using Yolov5 Deep Learning Model. Int. J. Comput. Digit. Syst. 2023, 14, 10149–10158. [Google Scholar] [CrossRef]
- Dwyer, B. Wildfire Smoke Dataset. Roboflow Universe Repository. 2022. Available online: https://public.roboflow.com/object-detection/wildfire-smoke (accessed on 27 May 2025).
- Hopkins, B.; O’Neill, L.; Afghah, F.; Razi, A.; Rowell, E.; Watts, A.; Fule, P.; Coen, J. FLAME 2: Fire Detection and Modeling: Aerial Multi-Spectral Image Dataset; IEEE Dataport: Piscataway, NJ, USA, 2022. [Google Scholar] [CrossRef]
- Maillard, S.; Khan, M.S.; Cramer, A.K.; Sancar, E.K. Wildfire and Smoke Detection Using YOLO-NAS. In Proceedings of the 2024 IEEE 3rd International Conference on Computing and Machine Intelligence (ICMI), Mt Pleasant, MI, USA, 13–14 April 2024; pp. 1–5. [Google Scholar] [CrossRef]
- Ramos, L.; Casas, E.; Bendek, E.; Romero, C.; Rivas, F. Hyperparameter Optimization of YOLOv8 for Smoke and Wildfire Detection: Implications for Agricultural and Environmental Safety. Artif. Intell. Agric. 2024, 12, 109–126. [Google Scholar] [CrossRef]
- de Venâncio, P.V.A.B.; Lisboa, A.C.; Barbosa, A.V. An Automatic Fire Detection System Based on Deep Convolutional Neural Networks for Low-Power, Resource-Constrained Devices. Neural Comput. Appl. 2022, 34, 15349–15368. [Google Scholar] [CrossRef]
- Hemateja, A.V.N.M. WildFire-Smoke-Dataset-Yolo. 2021. Available online: https://www.kaggle.com/datasets/ahemateja19bec1025/wildfiresmokedatasetyolo (accessed on 28 May 2025).
- Zhang, Q.; Lin, G.; Zhang, Y.; Gao, X.; Wang, J. Wildland Forest Fire Smoke Detection Based on Faster R-CNN Using Synthetic Smoke Images. Procedia Eng. 2018, 211, 441–446. [Google Scholar] [CrossRef]
- Zhou, J.; Li, Y.; Yin, P. A Wildfire Smoke Detection Based on Improved YOLOv8. Int. J. Inf. Commun. Technol. 2024, 25, 52–67. [Google Scholar] [CrossRef]
- Wang, L.; Doukhi, O.; Lee, D.J. FCDNet: A Lightweight Network for Real-Time Wildfire Core Detection in Drone Thermal Imaging. IEEE Access 2025, 13, 14516–14530. [Google Scholar] [CrossRef]
- San-Miguel-Ayanz, J.; Durrant, T.; Boca, R.; Maianti, P.; Libertà, G.; Artés Vivancos, T.; Oom, D.; Branco, A.; de Rigo, D.; Ferrari, D.; et al. Forest Fires in Europe, Middle East and North Africa 2019; JRC Technical Reports, EUR 30402 EN; Publications Office of the European Union: Luxembourg, 2020. [Google Scholar] [CrossRef]
- Bugarić, M.; Stipaničev, D.; Jakovčević, T. AdriaFirePropagator and AdriaFireRisk: User friendly Web based wildfire propagation and wildfire risk prediction software. In Advances in Forest Fire Research 2018; Viegas, D.X., Ed.; ADAI: Coimbra, Portugal, 2018; pp. 890–899. [Google Scholar]
- Bugarić, M.; Braović, M.; Stipaničev, D. Statistical evaluation of site-specific wildfire risk index calculation for Adriatic regions. In Advances in Forest Fire Research 2018, Proceedings of the VII International Conference on Forest Fire Research, Coimbra, Portugal, 17–20 November 2014; Viegas, D.X., Ed.; ADAI: Coimbra, Portugal, 2014. [Google Scholar]
- Chuvieco, E.; Yebra, M.; Martino, S.; Thonicke, K.; Gómez-Giménez, M.; San-Miguel, J.; Oom, D.; Velea, R.; Mouillot, F.; Molina, J.R.; et al. Towards an Integrated Approach to Wildfire Risk Assessment: When, Where, What and How May the Landscapes Burn. Fire 2023, 6, 215. [Google Scholar] [CrossRef]
- Wildfire Hazard Potential (WHP); Version 2023 Continuous (Metadata Created May 31, 2024; Updated April 21, 2025); U.S. Forest Service, Wildfire Modeling Institute: Missoula, MT, USA, 2023. Available online: https://research.fs.usda.gov/firelab/products/dataandtools/wildfire-hazard-potential (accessed on 20 August 2025).
- Thies, B. Machine Learning Wildfire Susceptibility Mapping for Germany. Nat. Hazards 2025, 121, 12517–12530. [Google Scholar] [CrossRef]
- Marquez Torres, A.; Signorello, G.; Kumar, S.; Adamo, G.; Villa, F.; Balbi, S. Fire risk modeling: An integrated and data-driven approach applied to Sicily. Nat. Hazards Earth Syst. Sci. 2023, 23, 2937–2959. [Google Scholar] [CrossRef]
- Finney, M.A. The challenge of quantitative risk analysis for wildland fire. For. Ecol. Manag. 2025, 211, 97–108. [Google Scholar] [CrossRef]
- Huang, Y.; Li, J.; Zheng, H. Modeling of Wildfire Digital Twin: Research Progress in Detection, Simulation, and Prediction Techniques. Fire 2024, 7, 412. [Google Scholar] [CrossRef]
- Stipaničev, D.; Bugarić, M.; Šerić, L.; Jakovčević, T. Web GIS Technologies in Advanced Cloud Computing Based Wildfire Monitoring System. In Proceedings of the 5th International Wildland Fire Conference WILDFIRE 2011, Sun City, South Africa, 9–13 May 2011. Paper No. 68. [Google Scholar]
- Shabnam, S.E. Mixed reality and remote sensing application of unmanned aerial vehicle in fire and smoke detection. J. Ind. Inf. Integr. 2019, 15, 42–49. [Google Scholar] [CrossRef]
- Costa, C.; Gomes, E.; Rodrigues, N.; Gonçalves, A.; Ribeiro, R.; Costa, P.; Pereira, A. Augmented reality mobile digital twin for unmanned aerial vehicle wildfire prevention. Virtual Real. 2025, 29, 71. [Google Scholar] [CrossRef]
- Allison, R.S.; Johnston, J.M.; Craig, G.; Jennings, S. Airborne Optical and Thermal Remote Sensing for Wildfire Detection and Monitoring. Sensors 2016, 16, 1310. [Google Scholar] [CrossRef]
- Szpakowski, D.M.; Jensen, J. A Review of the Applications of Remote Sensing in Fire Ecology. Remote Sens. 2019, 11, 2638. [Google Scholar] [CrossRef]
- Lentile, L.B.; Holden, Z.A.; Smith, A.M.S.; Falkowski, M.J.; Hudak, A.T.; Morgan, P.; Lewis, S.A.; Gessler, P.E.; Benson, N. Remote Sensing Techniques to Assess Active Fire Characteristics and Post-Fire Effects. Int. J. Wildland Fire 2006, 15, 319–345. [Google Scholar] [CrossRef]
- Chan, C.C.; Alvi, S.A.; Zhou, X.; Durrani, S.; Wilson, N.; Yebra, M. IoT Ground Sensing Systems for Early Wildfire Detection: Technologies, Challenges and Opportunities. arXiv 2023, arXiv:2312.10919. [Google Scholar] [CrossRef]
Name | Equation 1 | Ideal Observer |
---|---|---|
True Positive Rate (Sensitivity, Recall) | TPR = TP / (TP + FN) | 1 |
False Positive Rate (Fallout) | FPR = FP / (FP + TN) | 0 |
True Negative Rate (Specificity, Inverse Recall) | TNR = TN / (TN + FP) | 1 |
False Negative Rate (Miss Rate) | FNR = FN / (FN + TP) | 0 |
Positive Predictive Value (Precision, Confidence) | PPV = TP / (TP + FP) | 1 |
Accuracy | ACC = (TP + TN) / (TP + TN + FP + FN) | 1 |
True Negative Accuracy (Inverse Precision) | TNAC = TN / (TN + FN) | 1 |
Jaccard (Tanimoto Similarity) | JACC = TP / (TP + FP + FN) | 1 |
F1 Score 2 | F1 = 2TP / (2TP + FP + FN) | 1 |
Matthews Correlation Coefficient 3 | MCC = (TP·TN − FP·FN) / √((TP + FP)(TP + FN)(TN + FP)(TN + FN)) | 1 |
No. of Images | Type | URL | Name |
---|---|---|---|
462 | sequence of images Wildfire Ignition | https://www.hpwren.ucsd.edu/FIgLib/ (accessed on 20 August 2025) | HPWREN Fire Ignition Images Library |
400 | image/smoke Mediterranean Landscape | http://wildfire.fesb.hr/index.php?option=com_content&view=article&id=66&Itemid=76 (accessed on 20 August 2025) | FESB MILD Dataset |
1005 | image/smoke Natural Landscape | https://universe.roboflow.com/mazhar-cakir/wildfire-j801f (accessed on 20 August 2025) | Wildfire Computer Vision Project–Mazhar Cakir |
2192 | image/smoke Natural Landscape | https://github.com/aiformankind/wildfire-smoke-dataset (accessed on 20 August 2025) | Open Wildfire Smoke Datasets |
596 | image/smoke/fire Natural Landscape | https://universe.roboflow.com/rapeepong-nrqia/wildfire-itolc (accessed on 20 August 2025) | Wildfire Computer Vision Project–Rapeepong 1 |
271 | image/smoke/fire Natural Landscape | https://universe.roboflow.com/rapeepong-nrqia/wildfire-ckwxt (accessed on 20 August 2025) | Wildfire Computer Vision Project–Rapeepong 2 |
— | UAV images/videos Prescribed Burning | https://etsin.fairdata.fi/dataset/1dce1023-493a-4d63-a906-f2a44f831898 (accessed on 20 August 2025) | Boreal Forest Fire
32,375 | UAV fire frames binary: fire or non-fire | https://ieee-dataport.org/open-access/flame-dataset-aerial-imagery-pile-burn-detection-using-drones-uavs (accessed on 20 August 2025) | FLAME Dataset |
122,624 (Land) 53,530 (UAV) | flame/smoke UAV and Land | https://doi.org/10.57760/sciencedb.j00104.00103 (accessed on 20 August 2025) | FASSD |
999 | flame Outdoor | https://www.kaggle.com/datasets/phylake1337/fire-dataset (accessed on 20 August 2025) | FIRE Dataset |
7000+ | smoke/flame Urban/Rural | https://www.kaggle.com/datasets/dataclusterlabs/fire-and-smoke-dataset (accessed on 20 August 2025) | Fire and Smoke Dataset |
9462 | smoke/flame Urban/Rural | https://github.com/siyuanwu/DFS-FIRE-SMOKE-Dataset (accessed on 20 August 2025) | DFS (Dataset for Fire and Smoke Detection) |
100,000 | smoke/synthetic images Natural Landscape Urban/Rural/Indoor | https://bigmms.github.io/cheng_gcce19_smoke100k/ (accessed on 20 August 2025) | Smoke 100k
Satellite/Sensor | Spatial Resolution | Update Frequency | Highlights |
---|---|---|---|
VIIRS (Suomi-NPP, JPSS) | 375 m | 2 times a day | Fire detection, both day and night |
MODIS (Terra/Aqua) | 1 km | 4 times a day | Standard for global wildfire monitoring |
Sentinel-3 (SLSTR) | 500 m (VNIR/SWIR) 1 km (thermal) | 1–2 times a day | European fire detection. Additional channels optimized for fire and thermal detection. |
Landsat TIRS (Thermal) | 100 m (30 m with interpolation) | 16-day revisit | Thermal imaging, but too infrequent for early fire detection.
OroraTech (FOREST-3 and Future Constellation) | ~16 m² (minimum detectable fire size) | <30 min | Launched in January 2025, the FOREST-3 satellite carries a miniaturized infrared system with on-orbit AI processing for near-real-time alerts. OroraTech plans to expand to a constellation of 96 satellites, capable of detecting fires as small as 16 m², with global revisit times under 30 min.
FireSat (High-res IR/thermal) | ~5 × 5 m | ~15–20 min | AI-powered to distinguish real fire events from false positives. The first FireSat satellite was deployed in March 2025. It is part of a planned constellation of 50+ satellites, expected to be fully operational by 2030.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).