Article

A Novel Artificial Intelligence Technique to Estimate the Gross Calorific Value of Coal Based on Meta-Heuristic and Support Vector Regression Algorithms

1 Faculty of Geosciences and Geoengineering, Hanoi University of Mining and Geology, 18 Vien Street, Duc Thang Ward, Bac Tu Liem District, Hanoi 100000, Vietnam
2 Center for Excellence in Analysis and Experiment, Hanoi University of Mining and Geology, 18 Vien Street, Duc Thang Ward, Bac Tu Liem District, Hanoi 100000, Vietnam
3 Institute of Research and Development, Duy Tan University, Da Nang 550000, Vietnam
4 Department of Energy Resources Engineering, Pukyong National University, Busan 48513, Korea
5 Department of Surface Mining, Mining Faculty, Hanoi University of Mining and Geology, 18 Vien Street, Duc Thang Ward, Bac Tu Liem District, Hanoi 100000, Vietnam
6 Center for Mining, Electro-Mechanical Research, Hanoi University of Mining and Geology, 18 Vien Street, Duc Thang Ward, Bac Tu Liem District, Hanoi 100000, Vietnam
7 Division of Computational Mathematics and Engineering, Institute for Computational Science, Ton Duc Thang University, Ho Chi Minh City 70000, Vietnam
8 Faculty of Civil Engineering, Ton Duc Thang University, Ho Chi Minh City 70000, Vietnam
9 Department of Civil Engineering, Tabriz Branch, Islamic Azad University, Tabriz 51368, Iran
* Authors to whom correspondence should be addressed.
Appl. Sci. 2019, 9(22), 4868; https://doi.org/10.3390/app9224868
Submission received: 25 September 2019 / Revised: 3 November 2019 / Accepted: 7 November 2019 / Published: 14 November 2019
(This article belongs to the Special Issue Meta-heuristic Algorithms in Engineering)

Abstract

Gross calorific value (GCV) is one of the essential parameters for evaluating coal quality. Therefore, accurate GCV prediction is one of the primary ways to improve heating value as well as coal production. A novel evolutionary-based predictive system was proposed in this study for predicting GCV with high accuracy, namely the particle swarm optimization (PSO)-support vector regression (SVR) model, developed from the SVR and PSO algorithms. Three different kernel functions were employed to establish the PSO-SVR models: radial basis, linear, and polynomial functions. In addition, three benchmark machine learning models, i.e., classification and regression trees (CART), multiple linear regression (MLR), and principal component analysis (PCA), were also developed to estimate GCV and then compared with the proposed PSO-SVR model; 2583 coal samples were used to analyze the proximate components and GCV for this study. They were then used to develop the aforementioned models as well as to check their performance on experimental results. Root-mean-squared error (RMSE), correlation coefficient (R2), ranking, and color intensity criteria were computed to evaluate the developed GCV predictive models. The results revealed that the proposed PSO-SVR model with the radial basis function had better accuracy than the other models. The PSO algorithm optimized the SVR model with high efficiency. The proposed model can be used as a supporting tool in practical engineering to determine the heating value of coal seams in complex geological conditions.

1. Introduction

Coal is one of the non-renewable natural resources, like oil and gas [1]. It is a fossil fuel that is widely used in the metallurgical industry and in thermal power plants [2,3]. It is also used in cement production, industrial chemicals, and aluminum smelting, to name a few applications [4]. Coal still accounts for 40% to 50% of energy generation in the world [5]. The main elemental components of coal are carbon (C), hydrogen (H), nitrogen (N), sulfur (S), and oxygen (O). In terms of proximate parameters, fixed carbon (FC), ash (A), volatile matter (VM), and moisture (M) are the essential components of coal [6]. These are the main parameters that determine the heating value of coal [7,8]. A literature review indicated that most coal applications are related to heating value (gross calorific value, GCV) [9]. Depending on the specific field (e.g., the steel industry, cement, power plants), the GCV requirements differ [10]. Therefore, determination of GCV in coal seams is essential for calculating energy demand, improving economic effectiveness, and enabling selective extraction.
In recent years, scientific and technological research related to energy and fuel has achieved many new results. Simulation and optimization methods for fuel use have been proposed [11,12,13,14,15,16,17,18,19,20]. Environmental issues caused by fuel impacts have also been considered and studied [21,22,23,24,25]. Among fuel types, coal is a primary fuel for thermal power plants, and its GCV is a key property. To determine coal GCV, many studies have investigated the relationship between GCV and proximate analysis (i.e., FC, A, VM, and M) as well as other coal properties [26,27,28]. For instance, many empirical equations and soft computing models have been proposed. Empirical methods for estimating the GCV of coal were introduced very early [29,30,31,32]. However, they usually take a long time and have low accuracy. In recent years, artificial intelligence (AI) has become well known as an advanced technique for solving complex problems relevant to engineering, especially in the field of energy and fuels [33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51]. For predicting the GCV of coal, Akkaya [52] applied multiple nonlinear regression models with high reliability. Tan et al. [28] developed a soft computing model for predicting GCV based on the support vector regression (SVR) algorithm, with promising results. In another study, an artificial neural network (ANN) was successfully studied and applied to GCV prediction by Mesroghli et al. [53]. They concluded that the ANN model was suitable for estimating GCV with high accuracy. Erik and Yilmaz [54] also developed an ANN and an adaptive neuro-fuzzy inference system (ANFIS) to simulate the GCV of coal; in their study, the ANFIS model exhibited better accuracy than the ANN model. Another technique using wavelet neural networks (WNNs) was introduced and applied to estimate GCV by Wen et al. [55].
They compared their WNN model with previous methods (re-implementing earlier models) to draw a complete conclusion. Their results indicated that the WNN model yielded higher accuracy than the earlier methods. Taking a new approach, Wood [56] developed a transparent open-box (TOB) learning network algorithm for predicting the GCV of coal, and found that the TOB algorithm can predict GCV with very small error. In addition, many similar works have been carried out to predict GCV using AI techniques, such as [57,58,59,60,61,62,63,64,65].
As our review of GCV prediction shows, many soft computing models have been widely developed and applied in recent years. However, their performance has not been confirmed in other areas, since coal quality differs between regions, areas, and seams. Furthermore, new soft computing/AI models capable of accurate GCV prediction remain a goal for scholars. Therefore, this study aims at several novelties with scientific soundness, which can be summarized as follows:
- A big dataset of 2583 coal samples was used to analyze proximate components and GCV.
- A meta-heuristic algorithm, namely the particle swarm optimization (PSO) algorithm, was used to optimize support vector regression (SVR) models for predicting GCV. Subsequently, the PSO-SVR model with the radial basis kernel function was proposed as the best model for GCV prediction.
- A variety of other AI models were also developed to predict GCV and compared with the proposed PSO-SVR model, including classification and regression trees (CART), multiple linear regression (MLR), and principal component analysis (PCA).

2. Study Area and Data Properties

This study was undertaken in an underground coal mine in Quang Ninh province, Vietnam. It lies geographically between longitudes 106°36′23″ E and 106°36′47″ E and latitudes 21°04′10″ N and 21°04′30″ N (Figure 1). The study site has a complex geological structure with many faults and folds. Stratigraphic studies indicated that the mine area consists of coal-bearing sediments of the Hon Gai Formation (T3n-r hg) with a thickness of 600–800 m (Figure 1). The coal seams have varying thickness, alternating with layers of gravel, sandstone, siltstone, and claystone. Coal at the study site ranges from anthracite to bituminous. It is black with gray streaks and a glassy or metallic luster. The coal is mainly homogeneous, brittle, and little cracked. Previous results indicated that the study site's coal has a high calorific value and low volatile matter and sulfur content, making it well suited to industrial use.
In this study, the database was collected from geological reports compiled at the Mong Duong coal mine over many years [66]. The database consists of the results of proximate and ultimate analyses along with the gross calorific values of coal samples taken from different coal seams in the mine. Sampling, processing, and analysis were carried out according to Vietnam's national standards (TCVN) [67,68,69,70]. The results of the proximate analysis were used to predict GCV. The moisture content (M), ash (A), and volatile matter (VM) of each sample were measured in the laboratory. According to Patel et al. [71], fixed carbon (FC) can be considered an input variable for predicting GCV. However, FC can be calculated by the equation FC = 100 − (M + A + VM); therefore, FC is perfectly (100%) correlated with the combination of M, A, and VM. As recommended in statistical practice, highly correlated parameters should be removed from the dataset to avoid degrading model accuracy [72,73]. Matin and Chelgani [74] gave similar recommendations, and FC was likewise not used in their GCV prediction study. Therefore, FC was not used to predict GCV here. The resulting dataset of 2583 coal samples includes M, A, VM, and GCV. A summary of the dataset is listed in Table 1, and an illustration of it is shown in Figure 2.

3. Methods

3.1. Support Vector Regression (SVR)

SVR is one of the most commonly used AI techniques; it was first proposed by Vapnik [75]. Alongside SVR, support vector classification (SVC) is the other main type of support vector machine (SVM). SVR is the tool of choice for modeling continuous (regression) problems. More details about SVR are available in previous studies such as [76,77,78,79]. Three kernel functions suitable for regression problems are the linear, polynomial, and radial basis functions, formulated in Equations (1)–(3), respectively:
$F(X_{GCV}, Y_{GCV}) = X_{GCV}^{T} Y_{GCV}$ (1)
$F(X_{GCV}, Y_{GCV}) = (\gamma X_{GCV}^{T} Y_{GCV} + r)^{d}; \quad \gamma > 0; \quad d = 1, 2, \ldots, n$ (2)
$F(X_{GCV}, Y_{GCV}) = \exp\left[-\frac{\|X_{GCV} - Y_{GCV}\|^{2}}{2\sigma^{2}}\right]$ (3)
where n is the number of observations; X and Y denote the input and output variables, respectively; and r, d, γ, and σ are the hyper-parameters of the SVR models. In addition, C (cost) is a penalty parameter that also controls the quality of the SVR models.
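The three kernels in Equations (1)–(3) can be written directly in code. A minimal sketch, evaluating them on two hypothetical proximate-analysis vectors (the sample values are invented for illustration):

```python
import math

def linear_kernel(x, y):
    # Equation (1): inner product of the two feature vectors
    return sum(xi * yi for xi, yi in zip(x, y))

def polynomial_kernel(x, y, gamma=1.0, r=1.0, d=2):
    # Equation (2): (gamma * <x, y> + r)^d with gamma > 0
    return (gamma * linear_kernel(x, y) + r) ** d

def rbf_kernel(x, y, sigma=1.0):
    # Equation (3): exp(-||x - y||^2 / (2 * sigma^2))
    sq_dist = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

# Hypothetical (M, A, VM) vectors for two coal samples
x, y = [3.1, 18.5, 6.2], [2.8, 20.1, 5.9]
print(linear_kernel(x, y), polynomial_kernel(x, y), rbf_kernel(x, y, sigma=2.0))
```

Note that the RBF kernel of a sample with itself is always 1, which is why it behaves as a similarity measure.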

3.2. Particle Swarm Optimization (PSO)

The PSO algorithm is one of the most powerful optimization algorithms and was introduced by Eberhart and Kennedy [80]. Inspired by the behavior of social animals, PSO simulates the movement of a school of fish, the choreography of a flock of birds, or a swarm of insects. It is explained by the feeding instincts of animals: they tend to follow individuals that lead to the best sources of food [81], and information is exchanged continuously between individuals during migration to point toward the most optimal food source. More details on the PSO algorithm can be found in the literature [82,83,84,85,86]. The pseudo-code of the PSO algorithm is described as follows [87]:
Algorithm: The pseudo-code of PSO (particle swarm optimization) algorithm
for each particle i
    for each dimension d
        Initialize position x_id randomly within the permissible range
        Initialize velocity v_id randomly within the permissible range
    end for
end for
Iteration k = 1
do
    for each particle i
        Calculate fitness value
        if the fitness value is better than p_best_id in history
            Set current fitness value as the new p_best_id
        end if
    end for
    Choose the particle having the best fitness value as g_best_id
    for each particle i
        for each dimension d
            Calculate velocity according to:
            $v_j^{(i+1)} = w v_j^{(i)} + c_1 r_1 (localbest_j - x_j^{(i)}) + c_2 r_2 (globalbest_j - x_j^{(i)}); \quad v_{\min} \le v_j^{(i)} \le v_{\max}$
            Update particle position according to:
            $x_j^{(i+1)} = x_j^{(i)} + v_j^{(i+1)}; \quad j = 1, 2, \ldots, n$
        end for
    end for
    k = k + 1
while maximum iterations or minimum error criteria are not attained
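The pseudo-code above can be sketched in Python. This is a minimal illustration, not the authors' implementation: the parameter values (w = 0.9, c1 = c2 = 1.8, Vmax = 2.0) match those used later in Section 5.1, while the quadratic bowl fitness is a toy stand-in for a real objective such as the RMSE of an SVR model:

```python
import random

def pso(fitness, bounds, n_particles=30, n_iter=300,
        w=0.9, c1=1.8, c2=1.8, v_max=2.0):
    """Minimize `fitness` over a box-bounded search space (pseudo-code above)."""
    dim = len(bounds)
    # Initialize positions and velocities randomly within the permissible range
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[random.uniform(-v_max, v_max) for _ in range(dim)] for _ in range(n_particles)]
    p_best = [p[:] for p in pos]                 # personal best positions
    p_best_val = [fitness(p) for p in pos]
    g_idx = min(range(n_particles), key=p_best_val.__getitem__)
    g_best, g_best_val = p_best[g_idx][:], p_best_val[g_idx]  # global best
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (p_best[i][d] - pos[i][d])
                             + c2 * r2 * (g_best[d] - pos[i][d]))
                vel[i][d] = max(-v_max, min(v_max, vel[i][d]))  # velocity clamp
                pos[i][d] += vel[i][d]
            val = fitness(pos[i])
            if val < p_best_val[i]:              # update personal best
                p_best[i], p_best_val[i] = pos[i][:], val
                if val < g_best_val:             # update global best
                    g_best, g_best_val = pos[i][:], val
    return g_best, g_best_val

random.seed(0)  # reproducible run
# Toy fitness: quadratic bowl with its minimum at (3, -2)
best, best_val = pso(lambda p: (p[0] - 3) ** 2 + (p[1] + 2) ** 2,
                     bounds=[(-10, 10), (-10, 10)])
print(best, best_val)
```

Since the global best is only ever replaced by a better value, `best_val` decreases monotonically over the iterations.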

3.3. PSO-SVR Model for Estimating GCV

As the primary goal of this study, the new hybrid artificial intelligence model, PSO-SVR, is described in this section. It combines the PSO and SVR algorithms to generate an optimal model for predicting GCV. Accordingly, SVR was used as the base model for predicting GCV. For the development of the SVR model, three kernel functions (i.e., radial basis function (RBF), polynomial (P), and linear (L)) were applied to map the GCV database. For each kernel function, the hyper-parameter(s) were determined as listed in Table 2. These hyper-parameters control the accuracy of the SVR model; selecting their optimal values is therefore complicated and requires an effective search method. Here, the PSO algorithm was used to search for the optimal hyper-parameters of the SVR model. It performs a global search based on particles and their experience. For each candidate set of hyper-parameter values, it evaluated the performance of the SVR model through a fitness function (i.e., root-mean-squared error (RMSE)) and then checked the termination criteria. If the RMSE was satisfactory (lowest RMSE), the search process stopped and the final PSO-SVR model was returned; otherwise, the search continued until it was satisfied or the maximum number of iterations was reached. Figure 3 describes the framework of the proposed PSO-SVR model with the three kernel functions.
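The fitness-evaluation loop of this framework can be illustrated with a small sketch. For brevity, it swaps PSO for a coarse grid search and replaces actual SVR training with a hypothetical smooth RMSE surface (`svr_rmse` is an invented stand-in, not the paper's model); the point is the loop that keeps the hyper-parameter pair with the lowest RMSE:

```python
# Hypothetical stand-in for "train an SVR with (C, sigma) and return the
# validation RMSE": a smooth bowl whose minimum we place at C = 100, sigma = 1.
def svr_rmse(log_c, log_sigma):
    return 80.0 + 50.0 * ((log_c - 2.0) ** 2 + log_sigma ** 2)

best = None
for log_c in [i / 4 for i in range(-4, 17)]:        # C from 10^-1 to 10^4
    for log_sigma in [i / 4 for i in range(-8, 9)]:  # sigma from 10^-2 to 10^2
        rmse = svr_rmse(log_c, log_sigma)
        if best is None or rmse < best[0]:
            best = (rmse, 10 ** log_c, 10 ** log_sigma)

print(best)  # lowest-RMSE (C, sigma) pair found on the grid
```

Searching in log space, as done here, is a common choice for scale parameters such as C and σ; PSO would explore the same surface adaptively rather than exhaustively.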

3.4. Multiple Linear Regression (MLR)

MLR is one of the most easily implemented regression models and is used to model linear relationships between two or more variables. It works on the principle of analyzing the relationship between the independent (i.e., influential) variables and the dependent (i.e., target) variable [88]. The general equation of MLR is as follows:
$Y = a_1 X_1 + a_2 X_2 + \cdots + a_n X_n + b$
where X1 to Xn represent the independent factors; n is the number of independent factors; Y is the dependent variable; a1 to an are the regression coefficients of the independent variables; and b is the intercept.
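As a concrete illustration, the coefficients a1…an and b can be obtained by ordinary least squares via the normal equations. A self-contained sketch on synthetic data (not the coal dataset):

```python
def fit_mlr(X, y):
    """Least-squares fit of y = a1*x1 + ... + an*xn + b via the normal
    equations A^T A beta = A^T y, solved by Gaussian elimination."""
    rows = [list(xi) + [1.0] for xi in X]   # append a column of 1s for b
    m = len(rows[0])
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(m)] for i in range(m)]
    aty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(m)]
    # Gaussian elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, m):
            f = ata[r][col] / ata[col][col]
            for c in range(col, m):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    beta = [0.0] * m
    for r in range(m - 1, -1, -1):          # back substitution
        beta[r] = (aty[r] - sum(ata[r][c] * beta[c]
                                for c in range(r + 1, m))) / ata[r][r]
    return beta  # [a1, ..., an, b]

# Exact data generated from y = 2*x1 - 3*x2 + 5
X = [[1, 1], [2, 0], [0, 3], [4, 2], [3, 5]]
y = [2 * x1 - 3 * x2 + 5 for x1, x2 in X]
coeffs = fit_mlr(X, y)
print(coeffs)  # ~[2.0, -3.0, 5.0]
```

Because the synthetic data lie exactly on the plane, the fitted coefficients recover the generating equation up to floating-point error.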

3.5. Classification and Regression Tree (CART)

CART is a well-known decision tree approach in data mining, proposed by Breiman [89] (the book was first published in 1984 and reprinted in 2017). The two main advantages of the CART algorithm are:
- The ability to quickly explain the rules it creates;
- Applicability to both classification and regression problems.
For regression problems, CART performs separation rules of the form "$X \le C$?" or "$X > C$?" [90]. Its separation mechanism splits a parent node into two child nodes [91]. The separation process can be conducted over multiple branches until the stopping condition is satisfied. The pseudo-code of the CART algorithm can be found in [92].
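The split rule can be illustrated for the regression case: among candidate thresholds C on a predictor X, CART keeps the one that minimizes the summed squared error of the two child nodes. A minimal sketch on toy data (one predictor only; a full tree would apply this recursively over all predictors):

```python
def best_split(x, y):
    """Find the threshold on one predictor minimizing the total sum of
    squared errors (SSE) of the two child nodes, as in CART regression."""
    def sse(vals):
        if not vals:
            return 0.0
        mean = sum(vals) / len(vals)
        return sum((v - mean) ** 2 for v in vals)

    pairs = sorted(zip(x, y))
    best_thr, best_cost = None, float("inf")
    for i in range(1, len(pairs)):
        if pairs[i][0] == pairs[i - 1][0]:
            continue  # no split possible between equal x values
        thr = (pairs[i][0] + pairs[i - 1][0]) / 2.0  # midpoint candidate
        left = [yv for xv, yv in pairs if xv <= thr]
        right = [yv for xv, yv in pairs if xv > thr]
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_thr, best_cost = thr, cost
    return best_thr, best_cost

# Toy data: the response jumps between x = 4 and x = 6
x = [1, 2, 3, 4, 6, 7, 8, 9]
y = [10, 11, 10, 11, 30, 31, 30, 31]
thr, cost = best_split(x, y)
print(thr, cost)  # threshold 5.0, combined child SSE 2.0
```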

3.6. Principal Component Analysis (PCA)

PCA is one of the multivariate data-processing techniques, suggested by Wold et al. [93]. The essence of PCA is dimensionality reduction. The method is based on the observation that data are often not distributed randomly in space but tend to lie near specific lines or planes [94]. PCA considers the particular case in which those surfaces are linear subspaces [95]. For PCA modeling, seven steps are implemented, as illustrated in Figure 4.
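As a minimal illustration of the dimensionality-reduction idea (not the seven-step procedure of Figure 4), the first principal component of 2-D data can be computed in closed form from the covariance matrix:

```python
import math

def first_principal_component(points):
    """First principal component of 2-D data via the closed-form
    eigendecomposition of the 2x2 covariance matrix."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # Covariance matrix entries [[a, b], [b, c]]
    a = sum((p[0] - mx) ** 2 for p in points) / n
    c = sum((p[1] - my) ** 2 for p in points) / n
    b = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Largest eigenvalue of a symmetric 2x2 matrix
    lam = (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b ** 2)
    # Corresponding eigenvector, normalized (axis-aligned fallback if b == 0)
    vx, vy = (b, lam - a) if abs(b) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(vx, vy)
    return lam, (vx / norm, vy / norm)

# Points lying exactly on the line y = 2x: the first component
# should point along the direction (1, 2) / sqrt(5)
pts = [(0, 0), (1, 2), (2, 4), (3, 6), (-1, -2)]
lam, (vx, vy) = first_principal_component(pts)
print(lam, vx, vy)
```

Because the points lie exactly on one line, the second eigenvalue is zero: all the variance is captured by a single component, which is precisely the reduction PCA exploits.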

4. Model Assessment Indices

Before developing the models, assessment indices were needed to evaluate their accuracy as well as their error level. Herein, two benchmark statistical criteria were used, RMSE and R2, computed as:
$RMSE = \sqrt{\frac{1}{n}\sum_{k=1}^{n}\left(y_{GCV} - \hat{y}_{GCV}\right)^{2}}$
$R^{2} = 1 - \frac{\sum_{k}\left(y_{GCV} - \hat{y}_{GCV}\right)^{2}}{\sum_{k}\left(y_{GCV} - \bar{y}\right)^{2}}$
where n is the number of samples, and $y_{GCV}$, $\hat{y}_{GCV}$, and $\bar{y}$ are the measured, predicted, and mean measured GCV values, respectively.
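The two criteria follow directly from their definitions. A short sketch on hypothetical GCV values (the numbers are invented for illustration):

```python
import math

def rmse(y_true, y_pred):
    # Root-mean-squared error over n samples
    n = len(y_true)
    return math.sqrt(sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred)) / n)

def r2(y_true, y_pred):
    # 1 - (residual sum of squares) / (total sum of squares)
    mean = sum(y_true) / len(y_true)
    ss_res = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred))
    ss_tot = sum((yt - mean) ** 2 for yt in y_true)
    return 1.0 - ss_res / ss_tot

y_true = [6500, 6800, 7100, 6900]  # hypothetical measured GCV values
y_pred = [6450, 6850, 7050, 6950]  # hypothetical model predictions
print(rmse(y_true, y_pred), r2(y_true, y_pred))  # 50.0 and ~0.947
```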

5. Results and Discussion

Before developing the aforementioned models, the dataset needed to be prepared and normalized. According to previous research, the dataset should be divided into two groups with an 80/20 ratio [96,97,98]. Hence, a split procedure was applied, with 80% of the whole dataset (~2327 samples) used for training and the remaining 20% (~256 samples) used for evaluating the developed predictive models. Note that all models were generated on the same training dataset and tested on the same testing dataset. RMSE and R2 were used to evaluate their performance on both the training and testing datasets.
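A reproducible random split of this kind can be sketched as follows. (For reference, a strict 80/20 split of 2583 samples yields 2066 training and 517 testing samples; the seed and the use of plain indices are illustrative assumptions.)

```python
import random

def train_test_split(data, train_frac=0.8, seed=42):
    """Shuffle and split a dataset into training/testing portions."""
    rng = random.Random(seed)   # fixed seed so the split is reproducible
    shuffled = data[:]
    rng.shuffle(shuffled)
    cut = int(round(train_frac * len(shuffled)))
    return shuffled[:cut], shuffled[cut:]

# Stand-ins for the 2583 (M, A, VM, GCV) rows; indices suffice to show the split
samples = list(range(2583))
train, test = train_test_split(samples)
print(len(train), len(test))  # 2066 517
```

Fixing the seed means every model sees exactly the same training and testing rows, which is what makes the comparison between models fair.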

5.1. PSO-SVR Models

As stated above, the proposed PSO-SVR model was tried with the three kernel functions (i.e., RBF, P, and L) and their parameters, as shown in Table 2. To implement the PSO-SVR models, the framework in Figure 3 was applied. The PSO algorithm and its parameters needed to be set up before searching for the optimal hyper-parameters of the SVR models. Here, the swarm size (s), the maximum particle velocity (Vmax), the maximum number of iterations (i), the individual and group cognitive parameters (ϕ1, ϕ2), and the inertia weight (w) were the PSO parameters used for the optimization process. According to previous works, s should be large enough to ensure diversity [35,99,100]; thus, swarm sizes of 100, 200, 300, 400, and 500 were applied in the PSO procedure. The other parameters (i.e., Vmax, ϕ1, ϕ2, and w) were set to 2.0, 1.8, 1.8, and 0.9, respectively, following the recommendations of previous researchers [101,102,103,104]. To check the termination criteria of the optimization process and ensure an adequate search, i was set to 1000 iterations.
Once the parameters of the PSO algorithm were established, the SVR models with the various kernel functions were generated with initial parameters. Then, the PSO procedure searched for optimal values of the hyper-parameters in Table 2. Finally, the optimal values of the hyper-parameters were determined with the lowest RMSE, and the PSO global search procedure was stopped after 1000 iterations. Table 3 shows the optimal values of the hyper-parameters for the respective kernel functions. In addition, Figure 5, Figure 6 and Figure 7 show the performance of the optimization process for each kernel function (i.e., PSO-SVR-L, PSO-SVR-P, and PSO-SVR-RBF).

5.2. MLR Model

For MLR modeling, a simple multivariate regression technique was implemented in the EViews environment. As a result, the MLR formula for predicting GCV based on the training dataset was determined as the following function:
$GCV = -14.212M - 92.162A - 21.985VM + 8647.379$

5.3. CART Model

Regarding the CART model for predicting GCV, the complexity parameter (cp) was used as the main parameter for adjusting model performance. The k-fold cross-validation method was used in the development of the CART model to avoid overfitting [105]. A grid search with cp in the range of 0 to 1 was established to find its optimal value. Eventually, a cp value of 0 was found to be optimal for the CART model, as shown in Figure 8.

5.4. PCA Model

For PCA modeling, the number of components (ϑ) was the key parameter for tuning the performance of the PCA model. A trial-and-error procedure over candidate values of ϑ was conducted. As a result, the best PCA model was reached with ϑ = 2, as computed in Table 4. Note that the k-fold cross-validation method was also used in the development of the PCA model.

5.5. Evaluation

Once the models were established on the training dataset, their performance needed to be re-checked against experimental results, i.e., the testing dataset. At this stage, the testing dataset was applied to the developed models. To provide an overall assessment, simple ranking and color intensity methods were applied, as shown in Table 5, based on the RMSE and R2 values. Both the training and testing processes were evaluated in this section.
The results in Table 5 reveal that the AI models provided high reliability in predicting GCV in this study. The color intensity shows that the PSO-SVR model with the radial basis kernel function (i.e., PSO-SVR-RBF) reached the highest accuracy on both the training and testing datasets; its total ranking of 24 also reflects this performance. With slightly lighter color intensity and an overall ranking of 20, the PSO-SVR model with the polynomial kernel function (i.e., the PSO-SVR-P model) yielded somewhat weaker performance than the PSO-SVR-RBF model. Among the PSO-SVR models with different kernel functions (i.e., RBF, P, L), the PSO-SVR model with the linear kernel function (i.e., the PSO-SVR-L model) provided the lowest performance, with a total ranking of eight; its performance was even lower than that of the MLR model (overall ranking of 13). The CART and PCA models provided the poorest performance in this study, especially PCA, based on the RMSE, R2, and ranking values in Table 5. Indeed, the PCA model yielded the weakest performance, with an RMSE of 439.585, an R2 of 0.794, and a total ranking of four. Figure 9, Figure 10, Figure 11, Figure 12, Figure 13 and Figure 14 show the deviation between the analyzed and predicted values of GCV for the developed models.

6. Conclusions

Coal is one of the essential fuels for the development of countries, especially developing countries. The demand for the heat supplied by coal is enormous and needs to be calculated and forecast. Based on experimental analysis of coal GCV in the laboratory, this study proposed a new hybrid AI model for predicting the GCV of coal with high reliability and accuracy, i.e., the PSO-SVR-RBF model. It can be considered a robust supporting tool for estimating GCV before exploiting or using coal for heating purposes, such as in thermal power plants, the metallurgy industry, and heating load systems, to name a few. The PSO algorithm optimized the SVR model by searching for the most suitable hyper-parameters. Diverse kernel functions, including the radial basis, polynomial, and linear functions, were tested and evaluated in combination with the SVR model and optimized by the PSO algorithm. The results showed that the RBF is the best-suited kernel function for PSO-SVR in predicting the GCV of coal in this study. This study thus contributes new knowledge as well as a new AI model (i.e., PSO-SVR-RBF) to the statistical community for predicting GCV with high accuracy. In addition, the developed models for estimating GCV may be applicable to other solid fuels such as biomass or peat; however, this requires further research, especially regarding the influential parameters of those fuels. Future work can also build on the results obtained here; for example, the efficient energy of solid fuels, as well as renewable energy, could be predicted using potential AI models.

Author Contributions

Data collection and experimental works: H.-B.B., H.N., X.-N.B. Writing, discussion, analysis, and revision: H.-B.B., H.N., X.-N.B., T.N.-T., Y.C. and Y.Z.

Funding

This work was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2018R1D1A1A09083947).

Acknowledgments

The authors would like to thank Hanoi University of Mining and Geology (HUMG), Hanoi, Vietnam; Duy Tan University, Da Nang, Vietnam; the Center for Excellence in Analysis and Experiment and the Center for Mining, Electro-Mechanical research of HUMG, Duy Tan University, Da Nang, Vietnam, and Ton Duc Thang University, Ho Chi Minh City, Vietnam.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Suárez-Ruiz, I.; Diez, M.A.; Rubiera, F. Coal. In New Trends in Coal Conversion; Elsevier: Amsterdam, The Netherlands, 2019; pp. 1–30. [Google Scholar]
  2. Corbin, D.A. Life, Work, and Rebellion in the Coal Fields: The Southern West Virginia Miners, 1880–1922, 2nd ed.; West Virginia University Press: Morgantown, WV, USA, 2015. [Google Scholar]
  3. McHugh, L. World energy needs: A role for coal in the energy mix. In Coal in the 21st Century; The Royal Society of Chemistry: Paris, France; pp. 1–29.
  4. Dai, S.; Finkelman, R.B. Coal as a promising source of critical elements: Progress and future prospects. Int. J. Coal Geol. 2018, 186, 155–164. [Google Scholar] [CrossRef]
  5. Jordan, B.; Lange, I.; Linn, J. Coal Demand, Market Forces, and U.S. Coal Mine Closures; Munich Society for the Promotion of Economic Research—CESifo GmbH: Munich, Germany, 2018. [Google Scholar]
  6. Ward, C.R.; Suárez-Ruiz, I. Introduction to applied coal petrology. In Applied Coal Petrology; Elsevier: Amsterdam, The Netherlands, 2008; pp. 1–18. [Google Scholar]
  7. Majumder, A.; Jain, R.; Banerjee, P.; Barnwal, J. Development of a new proximate analysis based correlation to predict calorific value of coal. Fuel 2008, 87, 3077–3081. [Google Scholar] [CrossRef]
  8. Pandey, J.; Mohalik, N.; Mishra, R.K.; Khalkho, A.; Kumar, D.; Singh, V. Investigation of the role of fire retardants in preventing spontaneous heating of coal and controlling coal mine fires. Fire Technol. 2015, 51, 227–245. [Google Scholar] [CrossRef]
  9. Li, J.; Zhuang, X.; Querol, X.; Font, O.; Moreno, N. A review on the applications of coal combustion products in China. Int. Geol. Rev. 2018, 60, 671–716. [Google Scholar] [CrossRef]
  10. Durković, R.; Grujičić, R. An approach to determine the minimum specific fuel consumption and engine economical operation curve model. Measurement 2019, 132, 303–308. [Google Scholar] [CrossRef]
  11. Wang, B.; Du, Y.; Xu, N. Simulation and experimental verification on dynamic calibration of fuel gear flowmeters. Measurement 2019, 138, 570–577. [Google Scholar] [CrossRef]
  12. Oliveira, S.P.; Rocha, A.C.; Jorge Filho, T.; Couto, P.R. Uncertainty of measurement by Monte-Carlo simulation and metrological reliability in the evaluation of electric variables of PEMFC and SOFC fuel cells. Measurement 2009, 42, 1497–1501. [Google Scholar] [CrossRef]
  13. Albarbar, A.; Gu, F.; Ball, A. Diesel engine fuel injection monitoring using acoustic measurements and independent component analysis. Measurement 2010, 43, 1376–1386. [Google Scholar] [CrossRef]
  14. Kumar, S.; Dinesha, P. Optimization of engine parameters in a bio diesel engine run with honge methyl ester using response surface methodology. Measurement 2018, 125, 224–231. [Google Scholar] [CrossRef]
  15. Zhou, M.; Zhang, Y.; Jin, S. Dynamic optimization of heated oil pipeline operation using PSO–DE algorithm. Measurement 2015, 59, 344–351. [Google Scholar] [CrossRef]
  16. Sierra, A.; Gercek, C.; Übermasser, S.; Reinders, A. Simulation-supported testing of smart energy product prototypes. Appl. Sci. 2019, 9, 2030. [Google Scholar] [CrossRef]
  17. Nguyen, H.; Bui, X.-N.; Nguyen-Thoi, T.; Ragam, P.; Moayedi, H. Toward a state-of-the-art of fly-rock prediction technology in open-pit mines using EANNs model. Appl. Sci. 2019, 9, 4554. [Google Scholar] [CrossRef]
  18. Fang, Q.; Nguyen, H.; Bui, X.-N.; Nguyen-Thoi, T. Prediction of blast-induced ground vibration in open-pit mines using a new technique based on imperialist competitive algorithm and M5Rules. Nat. Resour. Res. 2019, 1–16. [Google Scholar] [CrossRef]
  19. Fang, Q.; Nguyen, H.; Bui, X.-N.; Tran, Q.-H. Estimation of blast-induced air overpressure in quarry mines using cubist-based genetic algorithm. Nat. Resour. Res. 2019, 1–15. [Google Scholar] [CrossRef]
  20. Bui, X.-N.; Choi, Y.; Atrushkevich, V.; Nguyen, H.; Tran, Q.-H.; Long, N.Q.; Hoang, H.-T. Prediction of blast-induced ground vibration intensity in open-pit mines using unmanned aerial vehicle and a novel intelligence system. Nat. Resour. Res. 2019, 1–20. [Google Scholar] [CrossRef]
  21. Andria, G.; Attivissimo, F.; Di Nisio, A.; Trotta, A.; Camporeale, S.; Pappalardi, P. Design of a microwave sensor for measurement of water in fuel contamination. Measurement 2019, 136, 74–81. [Google Scholar] [CrossRef]
  22. Korhonen, I.; Ahola, J. Studying thermal protection for mobile sensor operating in combustion environment. Measurement 2019. [Google Scholar] [CrossRef]
  23. Aguilar-Jiménez, J.A.; Velázquez, N.; López-Zavala, R.; González-Uribe, L.A.; Beltrán, R.; Hernández-Callejo, L. Simulation of a solar-assisted air-conditioning system applied to a remote school. Appl. Sci. 2019, 9, 3398. [Google Scholar] [CrossRef]
  24. Le, L.T.; Nguyen, H.; Dou, J.; Zhou, J. A Comparative study of PSO-ANN, GA-ANN, ICA-ANN, and ABC-ANN in estimating the heating load of buildings’ energy efficiency for smart city planning. Appl. Sci. 2019, 9, 2630. [Google Scholar] [CrossRef]
  25. Le, L.T.; Nguyen, H.; Zhou, J.; Dou, J.; Moayedi, H. Estimating the heating load of buildings for smart city planning using a novel artificial intelligence technique PSO-XGBoost. Appl. Sci. 2019, 9, 2714. [Google Scholar] [CrossRef]
  26. Mancini, M.; Rinnan, Å.; Pizzi, A.; Toscano, G. Prediction of gross calorific value and ash content of woodchip samples by means of FT-NIR spectroscopy. Fuel Process. Technol. 2018, 169, 77–83. [Google Scholar] [CrossRef]
  27. Feng, Q.; Zhang, J.; Zhang, X.; Wen, S. Proximate analysis based prediction of gross calorific value of coals: A comparison of support vector machine, alternating conditional expectation and artificial neural network. Fuel Process. Technol. 2015, 129, 120–129. [Google Scholar] [CrossRef]
  28. Tan, P.; Zhang, C.; Xia, J.; Fang, Q.-Y.; Chen, G. Estimation of higher heating value of coal based on proximate analysis using support vector regression. Fuel Process. Technol. 2015, 138, 298–304. [Google Scholar] [CrossRef]
  29. Spooner, C. Swelling power of coal. Fuel 1951, 30, 193–202.
  30. Mazumdar, B. Coal systematics: Deductions from proximate analysis of coal part I. J. Sci. Ind. Res. B 1954, 13, 857–863.
  31. Given, P.H.; Weldon, D.; Zoeller, J.H. Calculation of calorific values of coals from ultimate analyses: Theoretical basis and geochemical implications. Fuel 1986, 65, 849–854.
  32. Parikh, J.; Channiwala, S.; Ghosal, G. A correlation for calculating HHV from proximate analysis of solid fuels. Fuel 2005, 84, 487–494.
  33. Nguyen, H.; Bui, X.-N.; Tran, Q.-H.; Mai, N.-L. A new soft computing model for estimating and controlling blast-produced ground vibration based on hierarchical K-means clustering and cubist algorithms. Appl. Soft Comput. 2019, 77, 376–386.
  34. Nguyen, H.; Drebenstedt, C.; Bui, X.-N.; Bui, D.T. Prediction of blast-induced ground vibration in an open-pit mine by a novel hybrid model based on clustering and artificial neural network. Nat. Resour. Res. 2019, 1–19.
  35. Nguyen, H.; Moayedi, H.; Jusoh, W.A.W.; Sharifi, A. Proposing a novel predictive technique using M5Rules-PSO model estimating cooling load in energy-efficient building system. Eng. Comput. 2019, 1–10.
  36. Nguyen, H.; Moayedi, H.; Foong, L.K.; Al Najjar, H.A.H.; Jusoh, W.A.W.; Rashid, A.S.A.; Jamali, J. Optimizing ANN models with PSO for predicting short building seismic response. Eng. Comput. 2019, 1–15.
  37. Balaeva, Y.S.; Kaftan, Y.S.; Miroshnichenko, D.V.; Kotliarov, E.I. Influence of coal properties on the gross calorific value and moisture-holding capacity. Coke Chem. 2018, 61, 4–11.
  38. Balaeva, Y.S.; Miroshnichenko, D.V.; Kaftan, Y.S. Predicting the classification characteristics of coal. Part 1. The gross calorific value in the wet ash-free state. Coke Chem. 2015, 58, 321–328.
  39. Kumari, P.; Singh, A.K.; Wood, D.A.; Hazra, B. Predictions of gross calorific value of Indian coals from their moisture and ash content. J. Geol. Soc. India 2019, 93, 437–442.
  40. Wood, D.A. Sensitivity analysis and optimization capabilities of the transparent open-box learning network in predicting coal gross calorific value from underlying compositional variables. Model. Earth Syst. Environ. 2019, 5, 1–14.
  41. Martinka, J.; Martinka, F.; Rantuch, P.; Hrušovský, I.; Blinová, L.; Balog, K. Calorific value and fire risk of selected fast-growing wood species. J. Therm. Anal. Calorim. 2018, 131, 899–906.
  42. Sampath, K.; Perera, M.; Ranjith, P.; Matthai, S.; Tao, X.; Wu, B. Application of neural networks and fuzzy systems for the intelligent prediction of CO2-induced strength alteration of coal. Measurement 2019, 135, 47–60.
  43. Sun, J.; Qi, G.; Zhu, Z. A sparse neural network based control structure optimization game under DoS attacks for DES frequency regulation of power grid. Appl. Sci. 2019, 9, 2217.
  44. Wang, D.-L.; Sun, Q.-Y.; Li, Y.-Y.; Liu, X.-R. Optimal energy routing design in energy internet with multiple energy routing centers using artificial neural network-based reinforcement learning method. Appl. Sci. 2019, 9, 520.
  45. Asteris, P.; Kolovos, K.; Douvika, M.; Roinos, K. Prediction of self-compacting concrete strength using artificial neural networks. Eur. J. Environ. Civ. Eng. 2016, 20 (Suppl. 1), s102–s122.
  46. Asteris, P.; Roussis, P.; Douvika, M. Feed-forward neural network prediction of the mechanical properties of sandcrete materials. Sensors 2017, 17, 1344.
  47. Asteris, P.G.; Ashrafian, A.; Rezaie-Balf, M. Prediction of the compressive strength of self-compacting concrete using surrogate models. Comput. Concr. 2019, 24, 137–150.
  48. Xu, H.; Zhou, J.; Asteris, P.G.; Jahed Armaghani, D.; Tahir, M.M. Supervised machine learning techniques to the prediction of tunnel boring machine penetration rate. Appl. Sci. 2019, 9, 3715.
  49. Sarir, P.; Chen, J.; Asteris, P.G.; Armaghani, D.J.; Tahir, M.M. Developing GEP tree-based, neuro-swarm, and whale optimization models for evaluation of bearing capacity of concrete-filled steel tube columns. Eng. Comput. 2019, 1–19.
  50. Apostolopoulou, M.; Armaghani, D.J.; Bakolas, A.; Douvika, M.G.; Moropoulou, A.; Asteris, P.G. Compressive strength of natural hydraulic lime mortars using soft computing techniques. Procedia Struct. Integr. 2019, 17, 914–923.
  51. Armaghani, D.J.; Hatzigeorgiou, G.D.; Karamani, C.; Skentou, A.; Zoumpoulaki, I.; Asteris, P.G. Soft computing-based techniques for concrete beams shear strength. Procedia Struct. Integr. 2019, 17, 924–933.
  52. Akkaya, A.V. Proximate analysis based multiple regression models for higher heating value estimation of low rank coals. Fuel Process. Technol. 2009, 90, 165–170.
  53. Mesroghli, S.; Jorjani, E.; Chelgani, S.C. Estimation of gross calorific value based on coal analysis using regression and artificial neural networks. Int. J. Coal Geol. 2009, 79, 49–54.
  54. Erik, N.Y.; Yilmaz, I. On the use of conventional and Soft Computing Models for prediction of gross calorific value (GCV) of coal. Int. J. Coal Prep. Util. 2011, 31, 32–59.
  55. Wen, X.; Jian, S.; Wang, J. Prediction models of calorific value of coal based on wavelet neural networks. Fuel 2017, 199, 512–522.
  56. Wood, D.A. Transparent open-box learning network provides auditable predictions for coal gross calorific value. Model. Earth Syst. Environ. 2019, 5, 395–419.
  57. Acikkar, M.; Sivrikaya, O. Prediction of gross calorific value of coal based on proximate analysis using multiple linear regression and artificial neural networks. Turk. J. Electr. Eng. Comput. Sci. 2018, 26, 2541–2552.
  58. Qi, M.; Luo, H.; Wei, P.; Fu, Z. Estimation of low calorific value of blended coals based on support vector regression and sensitivity analysis in coal-fired power plants. Fuel 2019, 236, 1400–1407.
  59. Saha, U.K.; Sonon, L.; Kane, M. Prediction of calorific values, moisture, ash, carbon, nitrogen, and sulfur content of pine tree biomass using near infrared spectroscopy. J. Near Infrared Spectrosc. 2017, 25, 242–255.
  60. Ozveren, U. An artificial intelligence approach to predict gross heating value of lignocellulosic fuels. J. Energy Inst. 2017, 90, 397–407.
  61. De la Roza-Delgado, B.; Modroño, S.; Vicente, F.; Martínez-Fernández, A.; Soldado, A. Suitability of faecal near-infrared reflectance spectroscopy (NIRS) predictions for estimating gross calorific value. Span. J. Agric. Res. 2015, 13, 203.
  62. Jing, L. Predicting the gross calorific value of coal based on support vector machine and partial least squares algorithm. In Proceedings of the 2016 IEEE International Conference on Knowledge Engineering and Applications (ICKEA), Singapore, 28–30 September 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 221–225.
  63. Uzun, H.; Yıldız, Z.; Goldfarb, J.L.; Ceylan, S. Improved prediction of higher heating value of biomass using an artificial neural network model based on proximate analysis. Bioresour. Technol. 2017, 234, 122–130.
  64. Wang, K.; Zhang, R.; Li, X.; Ning, H. Calorific value prediction of coal based on least squares support vector regression. In Information Technology and Intelligent Transportation Systems; Springer: Berlin/Heidelberg, Germany, 2017; pp. 293–299.
  65. Boumanchar, I.; Charafeddine, K.; Chhiti, Y.; Alaoui, F.E.M.H.; Sahibed-dine, A.; Bentiss, F.; Jama, C.; Bensitel, M. Biomass higher heating value prediction from ultimate analysis using multiple regression and genetic programming. Biomass Convers. Biorefinery 2019, 9, 1–11.
  66. Vite Company. Report on the Results of Exploration of Mong Duong Coal Mine, Cam Pha City, Quang Ninh Province; General Department of Geology and Minerals of Vietnam, Hanoi University of Mining and Geology: Hanoi, Vietnam, 2017.
  67. TCVN 1693:1995: Hard Coal—Sampling; TCVN 172:1997: Hard Coal—Determination of Total Moisture; Ministry of Science and Technology of Vietnam: Hanoi, Vietnam, 1995.
  68. TCVN 173:1995: Solid Mineral Fuels—Determination of Ash; Ministry of Science and Technology of Vietnam: Hanoi, Vietnam, 1995.
  69. TCVN 174:1995: Hard Coal and Coke—Determination of Volatile Content; Ministry of Science and Technology of Vietnam: Hanoi, Vietnam, 1995.
  70. TCVN 200:2011: Solid Mineral Fuels—Determination of Gross Calorific Value by the Bomb Calorimetric Method and Calculation of Net Calorific Value; Ministry of Science and Technology of Vietnam: Hanoi, Vietnam, 2011.
  71. Patel, S.U.; Kumar, B.J.; Badhe, Y.P.; Sharma, B.; Saha, S.; Biswas, S.; Chaudhury, A.; Tambe, S.S.; Kulkarni, B.D. Estimation of gross calorific value of coals using artificial neural networks. Fuel 2007, 86, 334–344.
  72. Nguyen, H.; Bui, X.-N. Predicting blast-induced air overpressure: A robust artificial intelligence system based on artificial neural networks and random forest. Nat. Resour. Res. 2019, 28, 893–907.
  73. Asteris, P.G.; Nikoo, M. Artificial bee colony-based neural network for the prediction of the fundamental period of infilled frame structures. Neural Comput. Appl. 2019, 1–11.
  74. Matin, S.; Chelgani, S.C. Estimation of coal gross calorific value based on various analyses by random forest method. Fuel 2016, 177, 274–278.
  75. Vapnik, V. Three remarks on the support vector method of function estimation. In Advances in Kernel Methods; MIT Press: Cambridge, MA, USA, 1999; pp. 25–41.
  76. Smola, A.J.; Schölkopf, B. A tutorial on support vector regression. Stat. Comput. 2004, 14, 199–222.
  77. Dutta, S.; Gupta, J. PVT correlations of Indian crude using support vector regression. Energy Fuels 2009, 23, 5483–5490.
  78. Zheng, L.; Zhou, H.; Wang, C.; Cen, K. Combining support vector regression and ant colony optimization to reduce NOx emissions in coal-fired utility boilers. Energy Fuels 2008, 22, 1034–1040.
  79. Liu, J.; Shi, G.; Zhu, K. Vessel trajectory prediction model based on AIS sensor data and adaptive chaos differential evolution support vector regression (ACDE-SVR). Appl. Sci. 2019, 9, 2983.
  80. Eberhart, R.; Kennedy, J. A new optimizer using particle swarm theory. In Proceedings of the Sixth International Symposium on Micro Machine and Human Science (MHS ’95); IEEE: Piscataway, NJ, USA, 1995; pp. 39–43.
  81. Chen, H.; Asteris, P.G.; Jahed Armaghani, D.; Gordan, B.; Pham, B.T. Assessing dynamic conditions of the retaining wall: Developing two hybrid intelligent models. Appl. Sci. 2019, 9, 1042.
  82. Zendehboudi, S.; Ahmadi, M.A.; James, L.; Chatzis, I. Prediction of condensate-to-gas ratio for retrograde gas condensate reservoirs using artificial neural network with particle swarm optimization. Energy Fuels 2012, 26, 3432–3447.
  83. Esmin, A.A.; Coelho, R.A.; Matwin, S. A review on particle swarm optimization algorithm and its variants to clustering high-dimensional data. Artif. Intell. Rev. 2015, 44, 23–45.
  84. Ghamisi, P.; Benediktsson, J.A. Feature selection based on hybridization of genetic algorithm and particle swarm optimization. IEEE Geosci. Remote Sens. Lett. 2015, 12, 309–313.
  85. Kaloop, M.R.; Kumar, D.; Samui, P.; Gabr, A.R.; Hu, J.W.; Jin, X.; Roy, B. Particle Swarm Optimization Algorithm-Extreme Learning Machine (PSO-ELM) model for predicting resilient modulus of stabilized aggregate bases. Appl. Sci. 2019, 9, 3221.
  86. Abdullah, N.A.; Abd Rahim, N.; Gan, C.K.; Nor Adzman, N. Forecasting solar power using Hybrid Firefly and Particle Swarm Optimization (HFPSO) for optimizing the parameters in a Wavelet Transform-Adaptive Neuro Fuzzy Inference System (WT-ANFIS). Appl. Sci. 2019, 9, 3214.
  87. Kulkarni, R.V.; Venayagamoorthy, G.K. An estimation of distribution improved particle swarm optimization algorithm. In Proceedings of the 2007 3rd International Conference on Intelligent Sensors, Sensor Networks and Information, Melbourne, Australia, 3–6 December 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 539–544.
  88. Khademi, F.; Jamal, S.M.; Deshpande, N.; Londhe, S. Predicting strength of recycled aggregate concrete using artificial neural network, adaptive neuro-fuzzy inference system and multiple linear regression. Int. J. Sustain. Built Environ. 2016, 5, 355–369.
  89. Breiman, L. Classification and Regression Trees; Routledge: Abingdon, UK, 2017.
  90. Byeon, H. Development of prediction model for endocrine disorders in the Korean elderly using CART algorithm. Int. J. Adv. Comput. Sci. Appl. 2015, 6, 125–129.
  91. El Moucary, C. Data mining for engineering schools: Predicting students’ performance and enrollment in masters programs. Int. J. Adv. Comput. Sci. Appl. 2011, 2, 1–9.
  92. Suknović, M.; Čupić, M.; Martić, M. Data warehousing and data mining—A case study. Yugosl. J. Oper. Res. 2005, 15, 125–145.
  93. Wold, S.; Esbensen, K.; Geladi, P. Principal component analysis. Chemom. Intell. Lab. Syst. 1987, 2, 37–52.
  94. Jolliffe, I. Principal Component Analysis; Springer: Berlin/Heidelberg, Germany, 2011.
  95. Abdi, H.; Williams, L.J. Principal component analysis. Wiley Interdiscip. Rev. Comput. Stat. 2010, 2, 433–459.
  96. Bui, X.N.; Nguyen, H.; Le, H.A.; Bui, H.B.; Do, N.H. Prediction of blast-induced air over-pressure in open-pit mine: Assessment of different artificial intelligence techniques. Nat. Resour. Res. 2019, 1–21.
  97. Nguyen, H.; Bui, X.-N.; Bui, H.-B.; Mai, N.-L. A comparative study of artificial neural networks in predicting blast-induced air-blast overpressure at Deo Nai open-pit coal mine, Vietnam. Neural Comput. Appl. 2018, 1–17.
  98. Moayedi, H.; Hayati, S. Applicability of a CPT-based neural network solution in predicting load-settlement responses of bored pile. Int. J. Geomech. 2018, 18, 06018009-1–06018009-11.
  99. Marinakis, Y.; Migdalas, A.; Sifaleras, A. A hybrid particle swarm optimization–variable neighborhood search algorithm for constrained shortest path problems. Eur. J. Oper. Res. 2017, 261, 819–834.
  100. Agrawal, A.P.; Kaur, A. A comprehensive comparison of ant colony and hybrid particle swarm optimization algorithms through test case selection. In Data Engineering and Intelligent Computing; Springer: Singapore, 2018; pp. 397–405.
  101. Poli, R.; Kennedy, J.; Blackwell, T. Particle swarm optimization. Swarm Intell. 2007, 1, 33–57.
  102. Kennedy, J. The behavior of particles. In International Conference on Evolutionary Programming; Springer: Berlin/Heidelberg, Germany, 1998; pp. 579–589.
  103. Clerc, M.; Kennedy, J. The particle swarm-explosion, stability, and convergence in a multidimensional complex space. IEEE Trans. Evolut. Comput. 2002, 6, 58–73.
  104. Eberhart, R.C.; Shi, Y. Comparing inertia weights and constriction factors in particle swarm optimization. In Proceedings of the 2000 Congress on Evolutionary Computation; IEEE: Piscataway, NJ, USA, 2000; pp. 84–88.
  105. Pérez-Guaita, D.; Kuligowski, J.; Lendl, B.; Wood, B.R.; Quintás, G. Assessment of discriminant models in infrared imaging using constrained repeated random sampling–cross validation. Anal. Chim. Acta 2018, 1033, 156–164.
Figure 1. Location and geological conditions of the study area.
Figure 2. Visualization of the data used in this study. M—moisture content; A—ash; VM—volatile matter; GCV—gross calorific value.
Figure 3. The proposed PSO-SVR framework for predicting GCV in this study.
Figure 4. The principal component analysis (PCA) procedure (source: https://machinelearningcoban.com/2017/06/15/pca/).
Figure 5. Performance of the PSO-SVR model with the RBF kernel function during training.
Figure 6. Performance of the PSO-SVR model with the polynomial kernel function during training.
Figure 7. Performance of the PSO-SVR model with the linear kernel function during training.
Figure 8. The accuracy of the CART model for various values of the complexity parameter (cp).
Figure 9. Analyzed versus predicted GCV by the PSO-SVR-RBF model.
Figure 10. Analyzed versus predicted GCV by the PSO-SVR-P model.
Figure 11. Analyzed versus predicted GCV by the PSO-SVR-L model.
Figure 12. Analyzed versus predicted GCV by the MLR model.
Figure 13. Analyzed versus predicted GCV by the CART model.
Figure 14. Analyzed versus predicted GCV by the PCA model.
Table 1. Summary of the gross calorific value (GCV) data used.

| Category | Moisture (M, %) | Ash (A, %) | Volatile Matter (VM, %) | GCV (kcal/kg) |
|---|---|---|---|---|
| Min. | 0.200 | 1.32 | 3.580 | 4352 |
| 1st quartile | 1.520 | 8.95 | 6.435 | 6128 |
| Median | 2.000 | 17.70 | 7.800 | 6816 |
| Mean | 2.037 | 17.60 | 7.860 | 6825 |
| 3rd quartile | 2.540 | 24.89 | 9.175 | 7625 |
| Max. | 4.350 | 39.96 | 11.990 | 8654 |
Table 2. The support vector regression (SVR) hyper-parameters with various kernel functions (x: used; -: not used).

| PSO-SVR Model | C | d | γ | σ |
|---|---|---|---|---|
| Linear function | x | - | - | - |
| Polynomial function | x | x | x | - |
| Radial basis function | x | - | - | x |
Table 3. The optimal values of the PSO-SVR hyper-parameters.

| Model | C | d | γ | σ |
|---|---|---|---|---|
| PSO-SVR-L | 147.025 | - | - | - |
| PSO-SVR-P | 0.157 | 3 | 0.870 | - |
| PSO-SVR-RBF | 4.567 | - | - | 0.279 |
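The hyper-parameters in Table 3 correspond to the standard SVR kernel definitions. As a minimal illustrative sketch (kernel conventions differ between implementations, and the polynomial kernel's constant `coef0` is an assumed default, not a value reported in the paper), the two non-linear kernels can be written as:

```python
import math

def rbf_kernel(x, z, sigma=0.279):
    # Gaussian (RBF) kernel K(x, z) = exp(-||x - z||^2 / (2 * sigma^2)),
    # using sigma = 0.279 as tuned for the PSO-SVR-RBF model (Table 3).
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

def polynomial_kernel(x, z, d=3, gamma=0.870, coef0=1.0):
    # Polynomial kernel K(x, z) = (gamma * <x, z> + coef0)^d, using the
    # PSO-tuned d = 3 and gamma = 0.870; coef0 = 1.0 is an assumed default.
    dot = sum(a * b for a, b in zip(x, z))
    return (gamma * dot + coef0) ** d
```

PSO searches over C together with these kernel parameters, which is why each kernel in Table 2 exposes a different subset of hyper-parameters to the optimizer.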
Table 4. Performance of the PCA model with various ϑ.

| ϑ | RMSE | R² |
|---|---|---|
| 1 | 475.018 | 0.755 |
| 2 | 432.723 | 0.797 |
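Here ϑ is the number of retained principal components. As a hedged sketch of the underlying computation (assuming standard covariance-based PCA; the function name is illustrative), the leading component for ϑ = 1 can be obtained by power iteration on the sample covariance matrix:

```python
def first_principal_component(rows, iters=200):
    # Power iteration on the sample covariance matrix to find the leading
    # principal component (the ϑ = 1 case in Table 4). `rows` is a list of
    # equal-length feature vectors, e.g. [M, A, VM] per coal sample.
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    centered = [[r[j] - means[j] for j in range(p)] for r in rows]
    cov = [[sum(c[i] * c[j] for c in centered) / (n - 1) for j in range(p)]
           for i in range(p)]
    v = [1.0] * p  # initial direction; repeatedly multiplied by cov and normalized
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

Subsequent components would be extracted the same way after deflating the covariance matrix; retaining ϑ = 2 components improves both RMSE and R² in Table 4.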
Table 5. The results of the developed GCV predictive models.

| Model | Training RMSE | Training R² | Rank (RMSE) | Rank (R²) | Testing RMSE | Testing R² | Rank (RMSE) | Rank (R²) | Total Rank |
|---|---|---|---|---|---|---|---|---|---|
| PSO-SVR-RBF | 196.878 | 0.956 | 6 | 6 | 212.831 | 0.952 | 6 | 6 | 24 |
| PSO-SVR-P | 204.347 | 0.953 | 5 | 5 | 215.767 | 0.950 | 5 | 5 | 20 |
| PSO-SVR-L | 247.581 | 0.933 | 2 | 2 | 254.216 | 0.931 | 2 | 2 | 8 |
| Multiple linear regression (MLR) | 224.283 | 0.945 | 3 | 3 | 225.943 | 0.946 | 4 | 3 | 13 |
| Classification and regression trees (CART) | 220.891 | 0.946 | 4 | 4 | 226.434 | 0.946 | 3 | 3 | 14 |
| Principal component analysis (PCA) | 432.728 | 0.797 | 1 | 1 | 439.585 | 0.794 | 1 | 1 | 4 |
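The RMSE and R² values in Table 5 follow the conventional definitions (a minimal sketch under that assumption; the paper does not restate the formulas here):

```python
def rmse(y_true, y_pred):
    # Root-mean-square error between analyzed and predicted GCV values.
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5

def r2(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot.
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

Each model is then ranked per metric on each data split (higher rank = better), and the four ranks are summed into the Total Rank column, which is why PSO-SVR-RBF's four sixth-place ranks give a total of 24.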

Share and Cite

MDPI and ACS Style

Bui, H.-B.; Nguyen, H.; Choi, Y.; Bui, X.-N.; Nguyen-Thoi, T.; Zandi, Y. A Novel Artificial Intelligence Technique to Estimate the Gross Calorific Value of Coal Based on Meta-Heuristic and Support Vector Regression Algorithms. Appl. Sci. 2019, 9, 4868. https://doi.org/10.3390/app9224868
