Article

3D AQI Mapping Data Assessment of Low-Altitude Drone Real-Time Air Pollution Monitoring

by Sarun Duangsuwan 1,*, Phoowadon Prapruetdee 1, Mallika Subongkod 2 and Katanyoo Klubsuwan 3

1 Electrical Engineering, Department of Engineering, King Mongkut’s Institute of Technology Ladkrabang, Prince of Chumphon Campus, Chumphon 86160, Thailand
2 Business Administration, Department of Business Administration, King Mongkut’s Institute of Technology Ladkrabang, Prince of Chumphon Campus, Chumphon 86160, Thailand
3 Smart City Thailand Association (SCTA), Bangkok 10230, Thailand
* Author to whom correspondence should be addressed.
Drones 2022, 6(8), 191; https://doi.org/10.3390/drones6080191
Submission received: 27 June 2022 / Revised: 25 July 2022 / Accepted: 26 July 2022 / Published: 29 July 2022
(This article belongs to the Special Issue UAV Photogrammetry for 3D Modeling)

Abstract: Air pollution primarily originates from substances emitted directly by natural or anthropogenic processes, such as carbon monoxide (CO) in vehicle exhaust or sulfur dioxide (SO2) released from factories. A major air pollution problem, however, is particulate matter (PM), a by-product of wildfires and open burning. Tools for real-time air pollution monitoring of risk areas using drones have emerged. A new air quality index (AQI) for monitoring and display, such as three-dimensional (3D) mapping based on data assessment, is essential for timely environmental surveying. The objective of this paper is to present a 3D AQI mapping data assessment using a hybrid machine-learning model for drone real-time air pollution monitoring (Dr-TAPM). Dr-TAPM was designed by equipping a drone with multi-environmental sensors for carbon monoxide (CO), ozone (O3), nitrogen dioxide (NO2), particulate matter (PM2.5,10), and sulfur dioxide (SO2), with data pre- and post-processing performed by the hybrid model. The hybrid model for data assessment combines backpropagation neural network (BPNN) and convolutional neural network (CNN) algorithms. Experimentally, we considered a case study detecting smoke emissions from an open burning scenario. PM2.5,10 and CO were detected as the dominant air pollutants from open burning. 3D AQI map locations were produced, and the validation learning rates showed that the accuracy of the predicted AQI data assessment was 98%.

1. Introduction

Open burning is the burning of unwanted materials such as waste, tree branches, leaves, grass, and plastics; it releases smoke and emissions directly into the air. The resulting air pollutants are a major concern for human health, society, and cities. Globally, 5.4 million deaths were attributed to ambient air pollution in 2021, with low- and middle-income countries accounting for the largest share at 89.0%. In a regional breakdown, Southeast Asia accounted for 943,000 deaths, while the Western Pacific region had the highest toll at 1,254,000 deaths [1].
In Thailand, open burning is a major cause of air pollution, resulting in numerous public health problems and even death, and is the source of haze smoke in northern Thailand. Most open burning arises from activities such as agricultural management, forest foraging, waste burning, and combustion. During the summer season from January to April, a high air quality index (AQI) above 300, a level dangerous to human health, is typically reported every year. Smoke from the open burning of fields is a leading cause of the smog crisis in many areas, and the problem impacts socio-economic development, tourism, and the daily public health of citizens in various regions [2].
The objective of air pollution monitoring is to fulfill statutory air quality reporting requirements [3]. An important step in developing an effective monitoring technology is to assess where air quality in a region is likely to deteriorate or whether an area is at risk, such as agricultural zones, sites of industrial gas leakage, forest boundaries, and open burning areas. Such areas are difficult to monitor with on-site human surveys. In Thailand, one methodology for estimating air pollutant emissions is the moderate-resolution imaging spectroradiometer (MODIS) [4]. The MODIS solution is based on satellite image processing that can monitor a burned area with a 500 m × 500 m spatial resolution of the land surface, and the monitoring results can show air pollutant parameters such as CO2, CO, CH4, NO2, PM2.5, PM10, and NOx on a two-dimensional (2D) map. An alternative solution is to use unmanned aerial vehicles (UAVs), or drones, to monitor and survey risk areas or locations with poor accessibility. A UAV mounted with multi-gas sensors can fly to and survey an area to estimate the AQI by 3D mapping [5].
UAV-based air quality monitoring challenges research pioneers to develop cost-effective, multi-functional UAV monitoring for 3D AQI mapping. In the literature, the authors of [6] presented a self-driven drone and a LoRaWAN network for air quality data assessment; the sensor box, mounted on a rotary UAV, consisted of CO, NO2, SO2, PM10, and PM2.5 sensors. The advantage of the LoRaWAN network is that it enables long-range communication up to 5 km, through which all sensors send data back to the cloud network. Several works [7,8,9,10,11,12] have proposed developments in drone air pollution monitoring. In [7], the authors equipped UAVs with a broad set of gas sensors covering CO2, CO, NH3, SO2, O3, and NO2, using a Zigbee module to send data to the internet; monitoring is based on text messages and generates an air quality health index (AQHI) map showing the measured data for each location. Similarly, the authors of [8] presented low-cost, off-the-shelf sensors and components that perform real-time monitoring of pollutants at low and high altitudes, through which the AQI can be monitored via a mobile application for real-time data assessment. In [9], the authors presented various types of air quality sensors for CO, SO2, NO2, O3, PM2.5, PM10, and black carbon (BC), and described the challenges of UAV-based systems, such as power consumption, weather conditions, equipment cost, and size. However, these related works did not consider 3D AQI mapping.
3D visualization and mapping are a challenge for innovative air pollution monitoring. Existing work in [13] presents a spatial interpolation model for 3D AQI, where the goal of spatial interpolation is to create a surface representing the 3D space as closely as possible. The results showed that the trilinear interpolation model is the most suitable for the 3D distribution of the air quality model in the case studies. Such 3D visualization and mapping employ post-processing techniques based on the Gaussian plume model, a traditional method for air pollution monitoring. In [14], the authors evaluated the performance of a Gaussian plume model combined with a neural network algorithm (GPM-NN) for 3D AQI mapping, simulating the GPM-NN method with optimized power consumption to increase AQI accuracy. Unfortunately, the GPM-NN method is not designed to model dispersion under low wind conditions or at sites close to the source, that is, at distances of less than 100 m. This motivates machine learning-based data prediction models for air quality monitoring.
Machine learning algorithms play an important role in new air pollution monitoring. Machine learning-based prediction models for air pollution are statistical models that depend on observation and training data; the main types are regression models and neural network models. In [15], four machine learning algorithms, namely decision tree, random forest (RF), multilayer perceptron (MLP), and radial basis function (RBF) neural network, were applied to PM2.5 data acquisition, and the comparative results showed that the RBF neural network is the best-suited algorithm for predicting air quality models. In [16], nine machine learning algorithms, namely logistic regression (LR), linear discriminant analysis (LDA), K-nearest neighbors (K-NN), decision tree, Gaussian naïve Bayes (GNB), MLP, Adaboost (ADB), RF, and support vector machine (SVM), were used to classify photogrammetric point clouds from a drone camera and LiDAR; the MLP algorithm achieved the highest precision and the GNB algorithm the lowest. Among these machine learning methods, MLP-based neural networks have shown successful results, but they have not been considered for 3D AQI mapping. 3D mapping by UAV was presented using a convolutional neural network (CNN) [17]; this method allows the UAV to operate when the global navigation satellite system (GNSS) signal is unavailable by transforming RGB images from the UAV camera into a 3D reconstruction and visualization map, a concept that could also serve the design of 3D AQI mapping by UAV. In case studies of smoke detection [18,19], the CNN algorithm was used to classify images of smoke regions for protection against forest fires, and a 3D-CNN was developed to find the smoke target location [20]. Interestingly, the CNN algorithm has also been studied for air pollution prediction [21,22]; it is well known that CNNs are suitable for image feature extraction and video prediction. In [23], the authors presented AQNet-based PM2.5 monitoring, which uses a K-NN algorithm and a deep neural network (DNN) to jointly perform hybrid deep learning between a regression model and a neural network model for 3D air quality.
As mentioned above, research on machine learning models has made significant progress. Although there are related works on 3D visualization maps, 3D AQI mapping as proposed here has rarely been reported.
This paper presents a 3D AQI mapping data assessment for low-altitude drone real-time air pollution monitoring. The measured 3D AQI mapping data are assessed using the BPNN–CNN algorithm (hybrid model). The objective of the proposed BPNN–CNN is to validate the dataset through training as a real-time monitor and to render the output layer of results as 3D AQI mapping graphs on the x-, y-, and z-axes. Furthermore, the measurement of AQI using Dr-TAPM takes into account a real open burning smoke detection scenario.

2. Design of Frameworks

The framework for 3D AQI mapping is shown in Figure 1, where the procedure is divided into five sections: the AQI is measured remotely by the Dr-TAPM; the cloud network provides data storage; data processing performs the pre- and post-processing of 3D AQI values using computer vision; the hybrid model is the proposed machine-learning model using the BPNN–CNN algorithm; and 3D AQI mapping is the output in 3D graphical form.

2.1. Dr-TAPM

The Dr-TAPM platform is a rotary-wing UAV for air pollution monitoring. In previous work [6], the structure of the Dr-TAPM was built entirely of aluminum to make it lightweight, low-cost, and robust. Figure 2 shows the structure and dimensions of the Dr-TAPM. The flight time is 30 min while carrying the payload (sensor box), powered by two 6-cell lithium polymer (Li-Po) 6800 mAh batteries. The airframe weighs 2.3 kg without the payload and supports a payload of up to 3 kg. The remote controller uses WiFi at 2.4 GHz with a range of 100 m, the Pixhawk board serves as the flight controller, and flight positioning is provided by a GPS module. A sensor box can be fitted for flight, containing CO, O3, NO2, PM2.5,10, and SO2 sensors to measure the AQI value. An Arduino MEGA2560 controller board controls the sensor modules, and a Raspberry Pi 3 handles the data connection to the cloud network. Figure 3 shows a circuit diagram of all sensors and controller boards in the sensor box. In this work, all sensors were reset to their defaults before testing to ensure the return of accurate data.
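To illustrate this data path, the following is a minimal Python sketch of the Raspberry Pi side, assuming the Arduino streams comma-separated sensor readings over USB serial and that samples are pushed to a Firebase Realtime Database REST endpoint. The URL, port name, baud rate, and field names are hypothetical placeholders, not the project's actual values.

```python
import time
import serial    # pyserial: reads the Arduino's USB serial stream
import requests  # pushes JSON to the Firebase Realtime Database REST API

# Hypothetical endpoint and serial port; the actual project values differ.
FIREBASE_URL = "https://example-dr-tapm.firebaseio.com/aqi.json"
PORT = "/dev/ttyACM0"

FIELDS = ["co_ppm", "o3_ppm", "no2_ppb", "pm25_ugm3", "pm10_ugm3", "so2_ppb"]

def main():
    link = serial.Serial(PORT, 9600, timeout=2)
    while True:
        line = link.readline().decode(errors="ignore").strip()
        values = line.split(",")
        if len(values) != len(FIELDS):
            continue  # skip malformed frames
        sample = dict(zip(FIELDS, map(float, values)))
        sample["timestamp"] = time.time()
        # POST appends the sample under an auto-generated key.
        requests.post(FIREBASE_URL, json=sample, timeout=5)
        time.sleep(10)  # the paper records a sample every 10 s

if __name__ == "__main__":
    main()
```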

2.2. Cloud Network

The framework for the cloud network uses Google Firebase for real-time data storage and processing. For AQI monitoring, the mobile and web applications display the real-time AQI status: green at levels 0–50 means ‘good’, yellow at 51–100 means ‘moderate’, orange at 101–150 means ‘starting to impact health’, red at 151–200 means ‘unhealthy’, and purple at 201–300 means ‘very unhealthy’.
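These display bands amount to a simple lookup. A minimal sketch follows; the function and color-string names are our own, while the thresholds are those listed above.

```python
def aqi_category(aqi: int) -> tuple[str, str]:
    """Map an AQI value to the (color, label) band used by the display."""
    bands = [
        (50, "green", "good"),
        (100, "yellow", "moderate"),
        (150, "orange", "starting to impact health"),
        (200, "red", "unhealthy"),
        (300, "purple", "very unhealthy"),
    ]
    for upper, color, label in bands:
        if aqi <= upper:
            return color, label
    return "purple", "very unhealthy"  # clamp values above 300

print(aqi_category(120))  # ('orange', 'starting to impact health')
```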

2.3. Data Processing

A framework for data processing was developed for comparing data between the measured AQI and the AQI predicted by the hybrid model. A computer-aided simulation is used to display the 3D AQI mapping in real-time from the Dr-TAPM surveying.
The design of the framework for data processing is based on high-level synthesis (HLS). HLS is an automated design process that interprets an algorithmic description written in a programming environment (MATLAB) and creates a model in a hardware description language (HDL). The development steps of our framework are illustrated in Figure 4, where steps involving software, middleware, and hardware are depicted in different shades of color. The first step of the framework is to take real-time raw AQI data and convert it into the HLS language. To generate the HLS code describing the software-based MATLAB program, the Simulink tool provides a set of HLS functions that diagnose the raw data and the data predicted by the hybrid model. The HLS code is then translated into HDL code by the Simulink tool, which applies optimization directives such as plotting the 3D AQI mapping, and the results are displayed in the mobile and web applications.

3. Methodology

Hybrid Model

Accurate air pollution monitoring by UAV is difficult because the 3D AQI must be predicted precisely for a region at a specific time. A machine learning model fills the gap between the capability of the Dr-TAPM and 3D AQI mapping. Computer vision features of CNNs [17,22] have been used to construct 3D map imagery, while a BPNN [24,25,26] is a prediction method for accurate air pollutant values in real-time monitoring. The motivation for this work is a hybridization approach (BPNN–CNN) that combines a BPNN and a CNN to produce reliable and accurate 3D AQI mapping with the low-altitude Dr-TAPM. Figure 5 shows a schematic diagram of the hybrid model using BPNN and CNN algorithms, and a graphical illustration of the proposed hybrid framework is shown in Figure 6. In Figure 6, the BPNN is a fully connected neural network with three layers: one input layer, one hidden layer, and one output layer. The BPNN operates in two main passes. First, in the feedforward pass, raw AQI data are input signals that are weighted and passed through the rectified linear unit (ReLU) activation function in the hidden layer; the result is sent to the output layer to calculate the predicted AQI values. Second, backpropagation optimizes the match between the predicted data and the test data by minimizing the error indicators. After the data are predicted, the next step is the CNN procedure with three layers: the feature extraction layer, the convolution layer, and the 3D feature map in different colors.
The formulation of the BPNN algorithm was proposed by R. Hecht-Nielsen in 1992. It can be seen as an extension of the perceptron algorithm [27]. A general formula for a single layer is as follows:
$$y = f(\mathbf{w}^{T}\mathbf{x} + b)$$
where x is the test data at the input layer, y is the output, w is the weight vector, and b is the bias. The activation function adjusts the weights and bias so that the backpropagation error is minimized, using either the sigmoid or the ReLU function. The most widely used activation functions are the following:
$$f(l) = \frac{1}{1 + e^{-l}}$$

$$\mathrm{ReLU}(l) = \begin{cases} l, & l > 0 \\ 0, & l \le 0 \end{cases}$$

where l is the sub-dataset input to the activation unit.
The hidden layer in Figure 6 denotes a weighted connection that uses the ReLU unit to activate the input data. The BPNN training model is optimized using stochastic gradient descent (SGD), an optimization method that reduces the loss function, or backpropagation error, by adjusting the weights and biases simultaneously. For the training model, the output unit activation function is
$$\hat{y} = f(\mathbf{w}^{T}\mathbf{h} + b)$$
where h is the hidden-layer output obtained from training with SGD.
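To make the two passes concrete, here is a minimal NumPy sketch (not the authors' implementation) of one feedforward pass plus one SGD backpropagation update for a single-hidden-layer regression network. The sizes mirror the seven-input, 16-node configuration described in Section 4.1, while the learning rate and initialization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 7 input features, 16 hidden units, 1 output.
W1, b1 = rng.normal(0, 0.1, (16, 7)), np.zeros(16)
W2, b2 = rng.normal(0, 0.1, (1, 16)), np.zeros(1)
lr = 0.01

def relu(l):
    return np.maximum(l, 0.0)

def sgd_step(x, y):
    """One feedforward pass and one backpropagation update for a single sample."""
    # Feedforward: hidden activation h, prediction y_hat (Equations (1) and (4)).
    l = W1 @ x + b1
    h = relu(l)
    y_hat = W2 @ h + b2          # linear output unit for regression
    err = y_hat - y              # backpropagated error signal
    # Gradients of the squared loss, propagated back through the ReLU.
    gW2 = np.outer(err, h)
    gb2 = err
    gh = W2.T @ err
    gl = gh * (l > 0)            # ReLU derivative
    gW1 = np.outer(gl, x)
    gb1 = gl
    # In-place SGD update of weights and biases.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
    return float(y_hat[0])

# Example: one update on a random sample.
print(sgd_step(rng.normal(size=7), np.array([1.0])))
```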
The accuracy of the backpropagation error indicators, or learning rates, is evaluated by the mean absolute error (MAE), the root mean square error (RMSE), and the coefficient of determination (R2). The formulas for the three indicators are given by
$$\mathrm{MAE} = \frac{1}{K}\sum_{i=1}^{K}\left| y_i - \hat{y}_i \right|$$

$$\mathrm{RMSE} = \sqrt{\frac{1}{K}\sum_{i=1}^{K}\left( y_i - \hat{y}_i \right)^{2}}$$

$$R^{2} = \left( \frac{K\sum_{i=1}^{K} y_i \hat{y}_i - \sum_{i=1}^{K} y_i \sum_{i=1}^{K} \hat{y}_i}{\sqrt{K\sum_{i=1}^{K} y_i^{2} - \left(\sum_{i=1}^{K} y_i\right)^{2}}\,\sqrt{K\sum_{i=1}^{K} \hat{y}_i^{2} - \left(\sum_{i=1}^{K} \hat{y}_i\right)^{2}}} \right)^{2}$$
where K denotes the total number of raw datasets.
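These indicators can be computed off the shelf. A short illustrative sketch with scikit-learn and NumPy follows; the arrays are toy values, not measured data.

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_true = np.array([20.0, 45.0, 80.0, 120.0, 160.0])  # measured values (toy)
y_pred = np.array([22.0, 43.0, 78.0, 125.0, 155.0])  # model predictions (toy)

mae = mean_absolute_error(y_true, y_pred)
rmse = np.sqrt(mean_squared_error(y_true, y_pred))   # RMSE = sqrt(MSE)
# Note: r2_score uses the 1 - SS_res/SS_tot definition, which closely tracks
# the squared-correlation form above for well-fit models.
r2 = r2_score(y_true, y_pred)

print(f"MAE={mae:.3f}  RMSE={rmse:.3f}  R2={r2:.3f}")
```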
The formulation of the CNN algorithm was proposed by LeCun et al. in 1989. A CNN resembles a feedforward neural network, but it uses a convolution operation to extract the features of the input data. The main functions of a CNN are divided into the feature extraction layer, the convolution layer, and the feature map. In feature extraction, the input data are filtered before the data sequence is fed into the convolution layer, which is activated by ReLU units and, in general, followed by a max-pooling layer. Max-pooling serves two purposes here. First, it performs a 2D convolution of the predicted data from the BPNN, which is useful when determining the 3D map in the next layer; second, the pooling layer condenses the size of the output layer. The last layer is the 3D feature map, which translates the AQI values into color levels and maps them in the display context as follows
$$S(\hat{y}, K) = \sum_{n}\sum_{m} I(x, y, z)\, C(\hat{y}_n, K_m)$$
where I(x, y, z) denotes the 3D AQI map levels from index 1 to 300 in each color (green, yellow, orange, red, and purple), n and m index the pollutant parameters and the tested locations, respectively, and C is the function of the convolution layer from the max-pooling step, which depends on the predicted data.
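To illustrate the final coloring step, the following sketch bins synthetic predicted AQI values into the five display bands of Section 2.2 and renders them as a 3D scatter map. The grid, values, and plotting choices are illustrative assumptions, not the paper's rendering pipeline.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# Synthetic predicted AQI values on a small 3D measurement grid
# (x, y in metres along the survey path; z is the 5 m / 10 m altitude).
xs, ys, zs = [a.ravel() for a in np.meshgrid(np.arange(0, 21, 2),
                                             np.arange(0, 21, 2), [5, 10])]
aqi = rng.uniform(0, 300, xs.size)

# Band edges and colors follow the display convention in Section 2.2.
edges = [50, 100, 150, 200]
colors = np.array(["green", "yellow", "orange", "red", "purple"])
point_colors = colors[np.digitize(aqi, edges)]  # band index 0..4 per point

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(xs, ys, zs, c=point_colors)
ax.set_xlabel("x (m)"); ax.set_ylabel("y (m)"); ax.set_zlabel("altitude (m)")
plt.savefig("aqi_3d_map.png")
```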

4. Data Assessment

4.1. Training Model

The dataset is separated into two parts. First, 20% of the actual data were used for testing; these data are $x = [x_1, x_2, x_3, \ldots, x_n]$, where n indexes the pollutant sensors, namely the CO, O3, NO2, PM2.5,10, and SO2 sensors. Second, 80% of the data were used for training the BPNN–CNN model (see Table 1).
The framework of the training model was implemented in Python using Scikit-learn and TensorFlow. Scikit-learn provides machine-learning model operations with hyperparameter settings, facilitating the construction of a BPNN model according to library requirements; TensorFlow is used to construct the CNN that renders the 3D graphs. To validate the training model, four parameter sets for the BPNN–CNN model were configured: input layers, hidden layers, the activation function, and an optimizer. The input layer was set to seven nodes, with one hidden layer of 16 nodes; the activation function was a ReLU unit, and SGD was used as the optimizer. A sketch of this configuration is given below.
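A minimal sketch of this configuration with scikit-learn follows; the dataset is a random placeholder, the split and hyperparameters mirror those stated above, and the learning rate and iteration count are assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Placeholder dataset: 1400 samples, 7 input features, AQI-scale target.
X = np.random.rand(1400, 7)
y = np.random.rand(1400) * 300

# 80% training / 20% testing split, as in Section 4.1.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# One 16-node hidden layer, ReLU activation, SGD optimizer, as stated above.
model = MLPRegressor(hidden_layer_sizes=(16,), activation="relu",
                     solver="sgd", learning_rate_init=0.01,
                     max_iter=1000, random_state=42)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```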

4.2. Experimental Setup in Case Study of Open Burning Smoke Detection

In this work, we consider a case study of open burning, which allows the Dr-TAPM to detect smoke-based air pollutants using remote sensing. Figure 7a shows a schematic model of the experimental setup on the x-, y-, and z-axes. The UAV altitude was set to 5 m and 10 m, with measurement steps from 1 m to 20 m from the source. For the open burning point source, we burned agricultural waste materials such as palm wood, unwanted trees, leaves, grass, and dry weeds. Figure 7b shows the field test at King Mongkut’s Institute of Technology Ladkrabang, Prince of Chumphon Campus. We note that the surrounding environment was clear and rarely affected by wind. Data were captured 30 times per position and recorded to the cloud network every 10 s, giving a total of 1400 datasets for training the BPNN–CNN model.

4.3. Results

This subsection reports the experimental results for the two altitude levels of the Dr-TAPM. The results for all air pollutant parameters at 5 m altitude are shown in Figure 8a–j, while those at 10 m altitude are shown in Figure 9a–j. The 3D AQI maps are depicted in Figure 10a,b.
Figure 8a shows the result at 5 m UAV altitude for the measured PM2.5,10 data, plotting the actual data against the curve predicted by the BPNN–CNN model. The actual PM2.5,10 level varied from 20 µg/m3 to 160 µg/m3, while the BPNN–CNN model used the real-time monitoring of actual data to create a smoothed curve. The predicted data model can then render the 3D mapping result for PM2.5,10, as shown in Figure 8b. We can see that the PM2.5,10 curve increased slightly from a distance of 10 m to 20 m. Figure 8c,d shows the actual data plot of CO emissions from the open burning source; the CO level ranged from 5.5 ppm to 45 ppm according to the BPNN–CNN learning curve. For health safety, CO must remain below 50 ppm. The ozone (O3) level is shown in Figure 8e,f, where the BPNN–CNN curve model predicted the actual O3 data from 6.3 ppm to 22 ppm; the ozone level also increased from 10 m to 20 m. Generally, ozone above 10 ppm poses a health risk. Next, sulfur dioxide (SO2) is shown in Figure 8g,h, where the BPNN–CNN model curve varies from 40 ppb to 400 ppb. In practice, SO2 is dangerous to health at levels above 100 ppb [3], and the result exceeded 100 ppb from 10 m to 20 m. Additionally, nitrogen dioxide (NO2) is shown in Figure 8i,j, with results ranging from 12 ppb to 65 ppb; NO2 should not exceed 20 ppb [3]. When the UAV steps were close to the open burning source, the result was higher than 20 ppb. Both SO2 and NO2 are therefore dangerous when sensed close to the open burning source.
The air pollutant results at 10 m UAV altitude are shown in Figure 9a–j and are similar to those at 5 m, although the actual data are more widely distributed and scattered than before; the BPNN–CNN model nevertheless learns this uncertain data in real time. Figure 9a,b shows PM2.5,10, where the BPNN–CNN curve was detected from 18 µg/m3 to 90 µg/m3. PM2.5,10 should not exceed 50 µg/m3, and we observe that the smoke detection by Dr-TAPM captured PM2.5,10 above 50 µg/m3 at 12 m to 20 m near the open burning source. CO is shown in Figure 9c,d, where the concentration ranged from 5 ppm to 36 ppm. Ozone (O3) was detected at 7–22 ppm, as shown in Figure 9e,f. SO2 was detected at 48 ppb to 200 ppb, and NO2 at 10 ppb to 70 ppb, as shown in Figure 9g–j.
It is apparent that the smoke from open burning produced highly intensive air pollution. In particular, PM2.5,10 and CO were the main air pollution parameters in this experiment. The 3D AQI map locations for both UAV monitoring levels are shown in Figure 10a,b, where the color separation on the 3D AQI map shows the air pollution in the open burning case study.
The accuracy of the BPNN–CNN model in terms of MAE, RMSE, and R2 is shown in Table 2 and Table 3. The average accuracy of the proposed BPNN–CNN model, measured by R2, was 0.992 in the 5 m UAV altitude test and 0.965 at 10 m. Therefore, the average R2 across the two UAV altitude scenarios was 0.979, or 98% as a percentage precision learning rate.

5. Conclusions

This paper presented a 3D AQI mapping data assessment for low-altitude drone real-time air pollution monitoring in a case study of smoke detection from an open burning scenario. The data assessment was performed successfully using Dr-TAPM as the mobile tool for operational air pollutant monitoring. The air pollutant parameters considered were carbon monoxide (CO), ozone (O3), nitrogen dioxide (NO2), particulate matter (PM2.5,10), and sulfur dioxide (SO2). The proposed BPNN–CNN model predicted the air pollution parameters and generated 3D AQI mapping locations, achieving 98% accuracy for data assessment in the open burning scenario. The results of this work benefit the design and development of a low-altitude UAV for ground-truth air pollution monitoring based on 3D remotely sensed AQI data-driven processing. In future work, path planning and 3D AQI trajectory mapping will be presented, and large-scale scenarios, flight operation time, and multiple open burning sources will be investigated.

Author Contributions

Conceptualization, S.D. and K.K.; methodology, S.D.; software, S.D.; validation, S.D., P.P., M.S. and K.K.; investigation, S.D.; writing—original draft preparation, S.D.; writing—review and editing, S.D., P.P., M.S. and K.K. All authors have read and agreed to the published version of the manuscript.

Funding

The research on 3D AQI Mapping Data Assessment of Low-Altitude Drone Real-Time Air Pollution Monitoring by King Mongkut’s Institute of Technology Ladkrabang, Prince of Chumphon Campus, received funding support from the National Science, Research and Innovation Fund (NSRF) under Grant No. RE-KRIS/FF65/50.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. 2021 World Air Quality Report. Available online: https://www.iqair.com/th-en/world-most-polluted-countries (accessed on 19 July 2022).
2. Pansuk, J.; Junpen, A.; Garivait, S. Assessment of air pollution from household solid waste open burning in Thailand. Sustainability 2018, 10, 2553.
3. Bhola, R.G.; Luisa, T.M.; Chandra, S.P.O. Air Pollution Health and Environmental Impacts; CRC Press: Boca Raton, FL, USA, 2010.
4. Thanonphat, B.; Savitri, G.; Sebastein, B.; Agapol, J. An inventory of air pollutant emissions from biomass open burning in Thailand using MODIS burned area product (MCD45A1). J. Sustain. Energy Environ. 2014, 5, 85–94.
5. Bui, X.N.; Lee, C.; Nguyen, Q.L.; Adeel, A.; Cao, X.C.; Nguyen, V.N.; Le, V.C.; Nguyen, H.; Le, Q.T.; Duong, T.H.; et al. Use of unmanned aerial vehicles for 3D topographic mapping and monitoring the air quality of open-pit mines. J. Pol. Miner. Eng. Soc. 2019, 21, 222–238.
6. Attila, S.; Simona, D.; Ioan, D.; Mihaela, F.-I.; Flaviu, M.F.-I. Air quality assessment system based on self-driven drone and LoRaWAN network. Comput. Commun. 2021, 175, 13–24.
7. Rohi, G.; Ejofodomi, O.; Ofualagba, G. Autonomous monitoring, analysis, and countering of air pollution using environmental drones. Heliyon 2020, 6, e03252.
8. Duangsuwan, S.; Jamjareekulagarn, P. Development of drone real-time air pollution monitoring for mobile smart sensing in areas with poor accessibility. Sens. Mater. 2020, 32, 511–520.
9. Pochwala, S.; Gardecki, A.; Lewandowski, P.; Somogyi, V.; Anweiler, S. Developing of low-cost air pollution sensor-measurements with the unmanned aerial vehicles in Poland. Sensors 2020, 20, 3582.
10. Lambey, V.; Prasad, A. A review on air quality measurement using an unmanned aerial vehicle. Water Air Soil Pollut. 2021, 232, 109.
11. Alexandru, C.; Adrian, C.F.; Den, T.; Laura, R. Autonomous multi-rotor aerial platform for air pollution monitoring. Sensors 2021, 22, 860.
12. Dharmendra, S.; Meenakshi, D.; Rahul, K.; Chintan, N. Sensors and systems for air quality assessment monitoring and management: A review. J. Environ. Manag. 2021, 289, 112510.
13. Nguyen, Q.L.; Cao, X.C.; Le, V.C.; Nguyen, N.B.; Dang, A.T.; Le, Q.T.; Bui, X.N. 3D spatial interpolation methods for open-pit mining air quality with data acquired by small UAV based monitoring system. J. Pol. Miner. Eng. Soc. 2020, 263–273.
14. Yang, Y.; Zheng, Z.; Bian, K.; Song, L.; Han, Z. Real-time profiling of fine-grained air quality index distribution using UAV sensing. IEEE Internet Things J. 2018, 5, 186–196.
15. Wang, S.-Y.; Lin, W.-B.; Shu, Y.-C. Design of machine learning prediction system based on the internet of things framework for monitoring fine PM concentrations. Environments 2021, 8, 99.
16. Duran, Z.; Ozcan, K.; Atik, E.M. Classification of photogrammetric and airborne LiDAR point clouds using machine learning algorithms. Drones 2021, 5, 104.
17. Steenbeek, A.; Nex, F. CNN-based dense monocular visual SLAM for real-time UAV exploration in emergency conditions. Drones 2022, 6, 79.
18. Ding, Z.; Zhao, Y.; Li, A.; Zheng, Z. Spatial-temporal attention two-stream convolution neural network for smoke region detection. Fire 2021, 4, 66.
19. Guede-Fernandez, F.; Martins, L.; de Almeida, R.V.; Gamboa, H.; Vieira, P. A deep learning based object identification system for forest fire detection. Fire 2021, 4, 75.
20. Lin, G.; Zhang, Y.; Guo, X.; Zhang, Q. Smoke detection on video sequences using 3D convolutional neural networks. Fire Technol. 2019, 55, 1827–1847.
21. Wang, W.; Mao, W.; Tong, X.; Xu, G. A novel recursive model based on a convolutional long short-term memory neural network for air pollution prediction. Remote Sens. 2021, 13, 1284.
22. Ahmed, M.; Xiao, Z.; Shen, Y. Estimation of ground PM2.5 concentrations in Pakistan using convolutional neural network and multi-pollution satellite images. Remote Sens. 2022, 14, 1735.
23. Yang, Y.; Bai, Z.; He, Z.; Zheng, Z.; Bian, K.; Song, L. AQNet: Fine-grained 3D spatio-temporal air quality monitoring by aerial-ground WSN. In Proceedings of the IEEE Conference on Computer Communications Posters and Demos, Honolulu, HI, USA, 15–19 April 2018; pp. 1–2.
24. Kow, P.-Y.; Wang, Y.-S.; Zhou, Y.; Kao, I.-F.; Issermann, M.; Chang, L.-G.; Chang, F.-J. Seamless integration of convolutional and back-propagation neural networks for regional multi-step-ahead PM2.5 forecasting. J. Clean. Prod. 2020, 261, 121285.
25. Bai, L.; Wang, J.; Ma, X.; Lu, H. Air pollution forecasts: An overview. Int. J. Environ. Res. Public Health 2018, 15, 780.
26. Bai, Y.; Li, Y.; Wang, X.; Xie, J.; Li, C. Air pollutants concentrations forecasting using back propagation neural network based on wavelet decomposition with meteorological conditions. Atmos. Pollut. Res. 2016, 7, 557–566.
27. Zaini, N.; Ean, L.W.; Ahmed, A.N.; Malek, M.A. A systematic literature review of deep learning neural network for time series air quality forecasting. Environ. Sci. Pollut. Res. 2022, 29, 4958–4990.
Figure 1. Framework for 3D AQI mapping data assessment.
Figure 2. Dr-TAPM prototype.
Figure 3. Circuit diagram of equipment inside the sensor box.
Figure 4. Data processing of synthesis 3D AQI mapping.
Figure 5. A schematic diagram of the hybrid model.
Figure 6. A schematic diagram of the hybrid model using BPNN–CNN.
Figure 7. (a) Schematic model setup in 3D; (b) open burning smoke detection in the field test.
Figure 8. Air pollutant parameters at 5 m altitude: (a) PM2.5,10 in 2D; (b) PM2.5,10 in 3D; (c) CO in 2D; (d) CO in 3D; (e) O3 in 2D; (f) O3 in 3D; (g) SO2 in 2D; (h) SO2 in 3D; (i) NO2 in 2D; (j) NO2 in 3D.
Figure 9. Air pollutant parameters at 10 m altitude: (a) PM2.5,10 in 2D; (b) PM2.5,10 in 3D; (c) CO in 2D; (d) CO in 3D; (e) O3 in 2D; (f) O3 in 3D; (g) SO2 in 2D; (h) SO2 in 3D; (i) NO2 in 2D; (j) NO2 in 3D.
Figure 10. 3D AQI map location in color (a) at 5 m of UAV altitude; (b) at 10 m of UAV altitude.
Table 1. Parameters for training the BPNN–CNN model.

Parameters | Values
Number of K raw datasets | 1400
Number of n (pollutant sensors) | 5
Number of m (tested locations) | 40
Testing data | 20%
Training data | 80%
Table 2. Learning rate between the actual data and BPNN–CNN model in the 5 m altitude test.

Air Pollutant Parameters | MAE | RMSE | R2
PM2.5,10 | 0.351 | 1.561 | 0.995
CO | 0.510 | 2.054 | 0.983
O3 | 0.352 | 1.565 | 0.994
SO2 | 0.310 | 1.341 | 0.997
NO2 | 0.250 | 1.118 | 0.991
Average | 0.402 | 1.787 | 0.992
Table 3. Learning rate between the actual data and the BPNN–CNN model in the 10 m altitude test.

Air Pollutant Parameters | MAE | RMSE | R2
PM2.5,10 | 1.050 | 4.696 | 0.964
CO | 0.950 | 4.248 | 0.968
O3 | 1.100 | 4.919 | 0.961
SO2 | 1.150 | 5.143 | 0.956
NO2 | 0.850 | 3.801 | 0.979
Average | 1.020 | 4.561 | 0.965
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
