# Coal Gangue Recognition during Coal Preparation Using an Adaptive Boosting Algorithm


## Abstract


## 1. Introduction

When burned, coal gangue releases harmful gases, such as SO₂, CO, CO₂, and NOₓ, which results in coal quality reduction and environmental pollution, affecting the clean and efficient use of coal [1]. The existing coal gangue sorting methods mainly involve manual and mechanical sorting. As shown in Figure 1, when the raw coal flow enters the raw coal preparation workshop, the iron and other sundries are first removed by the iron remover, and the flow then enters the raw coal classification screen for screening. Material with a particle size of less than 50 mm goes directly to the mechanical separation operation. Material with a particle size of more than 50 mm is manually sorted to remove part of the sundries and the visible gangue, crushed to a qualified particle size (less than 50 mm), and then separated further by other mechanical methods such as a moving-screen jig. During manual sorting, illustrated in Figure 2, workers identify the gangue in the coal flow by eye and pick it out by hand. The labor intensity is high and the working environment is harsh; workers can easily inhale fine particles (even when wearing protective masks) and can be injured by the high-speed belt or scraper conveyor, seriously affecting their health and posing great safety risks [2]. To keep the sorting workers away from this harsh working environment, intelligent coal gangue sorting equipment, especially coal gangue sorting robots, has received considerable attention in the industry [3,4,5,6]. Coal gangue recognition is the foundation of intelligent sorting and a crucial technology for coal gangue sorting robots.

- Aiming at the shortcomings of the SVM classifier for coal gangue recognition, this paper used a genetic algorithm (GA) to address its sensitivity to noise and the difficulty of parameter tuning, used an adaptive boosting (AdaBoost) algorithm to enhance its recognition accuracy, and constructed a coal gangue recognition and classification model;
- The indices of the gray-level gradient co-occurrence matrix (GGCM) were introduced to characterize the features of coal and gangue, and a coal-gangue sample dataset was constructed from the coal and gangue images obtained in experiments and on-site to verify the performance of the proposed algorithm.

## 2. Principle and Theory

#### 2.1. SVM Algorithm and SVM Classifier

Given a training sample set {(x_i, y_i), i = 1, 2, …, N, x ∈ R, y ∈ (−1, 1)}, x_i is the sample data to be classified, and y_i is the label of the data x_i. The classification plane (ω, b) can be described using the following linear equation:

$$\omega \cdot x+b=0$$

With slack variables ξ_i ≥ 0, the problem of finding the optimal classification plane comes down to solving

$$\underset{\omega ,b,\xi}{\min}\ \frac{1}{2}{\Vert \omega \Vert}^{2}+C\sum_{i=1}^{N}{\xi}_{i}\quad \text{s.t.}\quad {y}_{i}\left(\omega \cdot {x}_{i}+b\right)\ge 1-{\xi}_{i},\ {\xi}_{i}\ge 0$$

where C is the penalty factor. For linearly inseparable problems, a kernel function K(**x**_i, **x**_j) is introduced, and the classification problem is reduced to its dual form

$$\underset{\alpha}{\max}\ \sum_{i=1}^{N}{\alpha}_{i}-\frac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{N}{\alpha}_{i}{\alpha}_{j}{y}_{i}{y}_{j}K\left({x}_{i},{x}_{j}\right)\quad \text{s.t.}\quad \sum_{i=1}^{N}{\alpha}_{i}{y}_{i}=0,\ 0\le {\alpha}_{i}\le C$$
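For reference, the radial basis function (RBF) kernel adopted later in the paper (Section 4.1) can be evaluated as a full kernel matrix. The following numpy sketch is illustrative; the toy 12-dimensional samples merely mimic the GGCM feature dimension used in Section 3.2:

```python
import numpy as np

def rbf_kernel_matrix(X, g):
    """Compute K[i, j] = exp(-g * ||x_i - x_j||^2) for all sample pairs."""
    # Squared distances via the expansion ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    d2 = np.maximum(d2, 0.0)  # guard against tiny negative values from rounding
    return np.exp(-g * d2)

# Toy 12-dimensional feature vectors (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 12))
K = rbf_kernel_matrix(X, g=3.0)
```

The resulting matrix is symmetric with a unit diagonal, which is what the dual problem above consumes in place of explicit inner products.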

#### 2.2. GA Optimization and SVM Base Classifier Construction

- (1) Set the evolutionary iteration counter t = 0 and the maximum number of iterations T, and randomly generate M individuals as the initial population P(0);
- (2) Calculate the fitness of each individual in the population P(t);
- (3) Obtain the next generation's population P(t + 1) through selection, crossover, and mutation of population P(t);
- (4) Judge whether the termination condition is reached: if t < T, set t = t + 1 and repeat Step (3); if t = T, terminate the evolution;
- (5) Take the individual with the greatest fitness obtained during the evolution as the optimal solution, and construct the SVM base classifier using the optimal parameters.
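The five steps can be sketched as follows. This is a minimal, self-contained illustration: the 8-bit binary coding, value ranges ([0, 100] for C, [0, 10] for g), and roulette selection follow Section 3.3, but the fitness function here is a smooth stand-in for the paper's real objective (four-fold cross-validated SVM accuracy), and the population size, iteration count, and seed are arbitrary:

```python
import random

BITS = 8  # 8-bit binary coding per parameter, as in Section 3.3

def decode(chrom):
    """Map a 16-bit chromosome to (C, g): C in [0, 100], g in [0, 10]."""
    c_int = int("".join(map(str, chrom[:BITS])), 2)
    g_int = int("".join(map(str, chrom[BITS:])), 2)
    return 100.0 * c_int / 255, 10.0 * g_int / 255

def fitness(chrom):
    # Stand-in for the real fitness (cross-validated SVM accuracy):
    # a smooth bump peaking near C = 50, g = 3 keeps the sketch runnable.
    C, g = decode(chrom)
    return 1.0 / (1.0 + ((C - 50) / 50) ** 2 + ((g - 3) / 3) ** 2)

def roulette(pop, fits):
    """Roulette-wheel selection: pick an individual with probability ~ fitness."""
    r = random.uniform(0, sum(fits))
    acc = 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def ga_optimize(pop_size=20, generations=30, pc=0.9, pm=0.0005, seed=1):
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(2 * BITS)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]
        nxt = []
        while len(nxt) < pop_size:
            a, b = roulette(pop, fits)[:], roulette(pop, fits)[:]
            if random.random() < pc:                  # single-point crossover
                cut = random.randint(1, 2 * BITS - 1)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):
                for i in range(2 * BITS):             # bit-flip mutation
                    if random.random() < pm:
                        child[i] ^= 1
                nxt.append(child)
        pop = nxt[:pop_size]
        best = max(pop + [best], key=fitness)         # keep the best-so-far
    return decode(best), fitness(best)
```

In the paper's pipeline, `fitness` would instead train an RBF-SVM with the decoded (C, g) and return its four-fold cross-validation accuracy.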

#### 2.3. ADAB-GA-SVM Classifier Construction

- (1) Initialize the weights of the sample set D to ω_i^1 = 1/N, i = 1, 2, …, N.
- (2) Let the number of iterations be M. For t = 1, 2, …, M:
  - Train the GA-SVM classifier using the sample set D with weights ω_i^t and obtain a base classifier f_t(x);
  - Calculate the classification error rate e_t and the weight λ_t of the classifier:

$${e}_{t}=\sum_{i=1}^{N}{\omega}_{i}^{t}\cdot I\left({y}_{i}\ne {f}_{t}\left({x}_{i}\right)\right)$$

$${\lambda}_{t}=\frac{1}{2}\mathrm{ln}\left(\frac{1-{e}_{t}}{{e}_{t}}\right)$$

where I(y_i ≠ f_t(x_i)) is a discriminant function, which returns 1 when the prediction result of the base classifier f_t(x) is inconsistent with the sample label y_i and 0 otherwise;

  - Update the weights of the training set samples to ω_i^(t+1) according to the prediction result of the base classifier f_t(x):

$${\omega}_{i}^{t+1}=\begin{cases}\dfrac{{\omega}_{i}^{t}\mathrm{exp}(-{\lambda}_{t})}{\sum_{i=1}^{N}{\omega}_{i}^{t}\mathrm{exp}(-{\lambda}_{t})}, & {f}_{t}\left({x}_{i}\right)={y}_{i}\\[2ex]\dfrac{{\omega}_{i}^{t}\mathrm{exp}({\lambda}_{t})}{\sum_{i=1}^{N}{\omega}_{i}^{t}\mathrm{exp}({\lambda}_{t})}, & {f}_{t}\left({x}_{i}\right)\ne {y}_{i}\end{cases}$$

- (3) Build the final strong classifier as

$$F\left(x\right)=\sum_{t=1}^{M}{\lambda}_{t}{f}_{t}\left(x\right)={F}_{M-1}\left(x\right)+{\lambda}_{M}{f}_{M}\left(x\right)$$
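The boosting loop can be sketched numerically. In this illustrative version, simple decision stumps stand in for the GA-SVM base learners so the example stays self-contained, and the compact update w·exp(−λ·y·f(x)) followed by normalization is equivalent to the two-branch weight formula:

```python
import numpy as np

def stump_train(X, y, w):
    """Pick the threshold and sign on a 1-D feature minimizing weighted error."""
    best = (None, 1, np.inf)                     # (threshold, sign, error)
    for thr in np.unique(X):
        for sign in (1, -1):
            pred = np.where(X >= thr, sign, -sign)
            err = np.sum(w * (pred != y))
            if err < best[2]:
                best = (thr, sign, err)
    thr, sign, _ = best
    return lambda Xq: np.where(Xq >= thr, sign, -sign)

def adaboost(X, y, M=5):
    N = len(y)
    w = np.full(N, 1.0 / N)                      # step (1): uniform initial weights
    learners, lambdas = [], []
    for _ in range(M):                           # step (2)
        f = stump_train(X, y, w)
        pred = f(X)
        e = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        lam = 0.5 * np.log((1 - e) / e)          # classifier weight lambda_t
        w = w * np.exp(-lam * y * pred)          # compact form of the two-case update
        w /= w.sum()                             # normalization
        learners.append(f)
        lambdas.append(lam)
    def F(Xq):                                   # step (3): weighted vote
        return np.sign(sum(l * f(Xq) for l, f in zip(lambdas, learners)))
    return F, w

# Toy 1-D data: negatives below 0, positives above
X = np.array([-3.0, -2.0, -1.0, 1.0, 2.0, 3.0])
y = np.array([-1, -1, -1, 1, 1, 1])
F, w = adaboost(X, y, M=3)
```

In the paper, each `stump_train` call is replaced by a full GA-SVM training run on the weighted sample set.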

## 3. Materials and Methods

#### 3.1. Coal and Gangue Image Collection and Preprocessing

#### 3.2. Gray-Level Gradient Co-Occurrence Matrix Texture Feature Extraction

Let L_f denote the maximum gray level of the grayscale image and L_g the maximum gradient level of the gradient image. The GGCM H(x, y) is normalized as follows:

$$\widehat{H}\left(x,y\right)=H\left(x,y\right)\Big/\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}H\left(x,y\right)$$

The selected feature vector was **x** = $[{T}_{1},{T}_{3},{T}_{4},{T}_{5},{T}_{6},{T}_{7},{T}_{8},{T}_{9},{T}_{10},{T}_{13},{T}_{14},{T}_{15}]$. The coal image was labeled as 1, and the gangue image was labeled as −1. Figure A1 presents the texture features of 100 groups of coal gangue image samples; the abscissa represents the sample serial number, and the ordinate represents the feature value of the sample. The aforementioned 12 GGCM texture features were extracted from the coal gangue image training and test sets, respectively, to provide the training dataset

$$\{({x}_{i},{y}_{i})\ |\ {x}_{i}=[{T}_{1i},{T}_{3i},{T}_{4i},{T}_{5i},{T}_{6i},{T}_{7i},{T}_{8i},{T}_{9i},{T}_{10i},{T}_{13i},{T}_{14i},{T}_{15i}],\ {y}_{i}\in (-1,1),\ i=1,2,\dots ,4000\}$$

and the test dataset

$$\{({x}_{j},{y}_{j})\ |\ {x}_{j}=[{T}_{1j},{T}_{3j},{T}_{4j},{T}_{5j},{T}_{6j},{T}_{7j},{T}_{8j},{T}_{9j},{T}_{10j},{T}_{13j},{T}_{14j},{T}_{15j}],\ {y}_{j}\in (-1,1),\ j=1,2,\dots ,1000\}$$

for the subsequent model training and testing.
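As a rough sketch of this feature extraction, the following computes a normalized GGCM and two of the Table A1 features (T₁, large gradient advantage, and T₁₄, inertia). The gradient operator, the 16 quantization levels, and the random test image are illustrative assumptions, not the paper's exact preprocessing:

```python
import numpy as np

def ggcm(gray, levels_f=16, levels_g=16):
    """Normalized gray-level gradient co-occurrence matrix H_hat(x, y)."""
    gray = gray.astype(float)
    # Gradient magnitude via numpy's central differences (operator is illustrative)
    gy, gx = np.gradient(gray)
    grad = np.sqrt(gx ** 2 + gy ** 2)
    # Quantize the gray image and the gradient image to discrete levels
    f = np.floor(gray / (gray.max() + 1e-12) * (levels_f - 1)).astype(int)
    g = np.floor(grad / (grad.max() + 1e-12) * (levels_g - 1)).astype(int)
    # Joint histogram of (gray level, gradient level) pairs, then normalize
    H = np.zeros((levels_f, levels_g))
    np.add.at(H, (f.ravel(), g.ravel()), 1)
    return H / H.sum()

def t1_large_gradient_advantage(Hh):
    y = np.arange(Hh.shape[1])
    return np.sum(Hh * y[None, :] ** 2)  # denominator is 1 after normalization

def t14_inertia(Hh):
    x = np.arange(Hh.shape[0])[:, None]
    y = np.arange(Hh.shape[1])[None, :]
    return np.sum((x - y) ** 2 * Hh)

rng = np.random.default_rng(42)
img = rng.integers(0, 256, size=(64, 64))  # stand-in for a coal/gangue image
Hh = ggcm(img)
```

The remaining Table A1 features follow the same pattern of weighted sums over `Hh`.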

#### 3.3. Classification Model Training Process

- (1) Input the coal gangue training dataset (x_i, y_i), i = 1, 2, …, 4000, set the initial weights ω_i^t = 1/4000 (t = 1), and construct the weighted training set (ω_i^t x_i, y_i);
- (2) Set the value ranges of the penalty factor C and the parameter g of the RBF-SVM to [0, 100] and [0, 10], respectively, and convert C and g into chromosomes by 8-bit binary coding. According to the research results above, the initial population size, crossover probability, and mutation probability of the GA were set to 80, 0.9, and 0.0005, respectively, and the number of evolutionary iterations was set to 100. The roulette selection method was adopted;
- (3) Using the weighted training dataset and taking the average recognition accuracy Acc of four-fold cross-validation as the chromosome's fitness, cross, mutate, and select the current population to generate the next generation and calculate each individual's fitness;
- (4) Judge whether the number of iterations has been reached. If not, return to Step (3); otherwise, select the individual with the highest fitness over all iterative populations to obtain the GA-SVM base classifier f_t(x);
- (5) Calculate the error rate e_t of f_t(x) and its weight λ_t, and update the sample weights to ω_i^(t+1) according to the prediction result of f_t(x);
- (6) Loop through Steps (2)–(5) until all 20 GA-SVM base classifiers are obtained; the final classifier F(x) is constructed using Equation (16).

The classification performance was evaluated using the accuracy (Acc), precision (P), recall (R), and F_1 score, which are defined as follows:

$$Acc=\frac{TP+TN}{TP+TN+FP+FN},\quad P=\frac{TP}{TP+FP},\quad R=\frac{TP}{TP+FN},\quad {F}_{1}=\frac{2PR}{P+R}$$

where TP, FP, FN, and TN are the numbers of true positives, false positives, false negatives, and true negatives, respectively.
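These standard metrics can be checked directly against the confusion-matrix counts reported later in Table 3; a minimal sketch:

```python
def classification_metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall, and F1 from confusion-matrix counts."""
    acc = (tp + tn) / (tp + fp + fn + tn)
    p = tp / (tp + fp)        # precision: fraction of predicted gangue that is gangue
    r = tp / (tp + fn)        # recall: fraction of actual gangue that is found
    f1 = 2 * p * r / (p + r)  # harmonic mean of precision and recall
    return acc, p, r, f1

# Counts for the AdaB-GA-SVM model from Table 3
acc, p, r, f1 = classification_metrics(tp=464, fp=36, fn=13, tn=487)
# acc, p, r, f1 round to 0.951, 0.928, 0.973, and 0.950, matching Table 3
```

The GA-SVM column of Table 3 (TP = 426, FP = 74, FN = 15, TN = 485) reproduces its reported 0.911 accuracy and 0.852 precision the same way.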

#### 3.4. Experimental Configuration

## 4. Results and Discussion

#### 4.1. Kernel Function Selection

#### 4.2. Genetic Algorithm Parameter Tuning

#### 4.3. Number of Base Classifiers

#### 4.4. Classification Model Training Results and Evaluation

The training results of each GA-SVM base classifier f_t(x) are shown in Table A2, in which the parameters in the red box are those of the final selected base classifier. Table A3 presents the accuracy, the penalty factor C, the parameter g, the error rate e_t, the classifier weight λ_t, and the training time of each of the 20 GA-SVM base classifiers.

The accuracy, gangue precision, and F_1 score of the AdaB-GA-SVM model increased by 4%, 7.6%, and 4.5% compared with the GA-SVM model, reaching 95.1%, 92.8%, and 95%, respectively. During coal preparation, industry specialists focus more on the precision rate for gangue. The gangue precision rate of the GA-SVM model was 85.2% with a recall rate of 96.6%, while the gangue precision rate of the AdaB-GA-SVM model was 92.8% with a recall rate of 97.3%, indicating that the recognition model proposed in this paper performs better. The KS value of 0.79 shown in Figure 9 indicates that the AdaB-GA-SVM model performs well in coal gangue identification.
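The Kolmogorov–Smirnov (KS) value cited above measures the maximum gap between the cumulative score distributions of the two classes; larger gaps mean better separation. A sketch with illustrative decision scores (not the paper's data):

```python
import numpy as np

def ks_statistic(scores_pos, scores_neg):
    """Max vertical gap between the empirical CDFs of the two score samples."""
    thresholds = np.sort(np.concatenate([scores_pos, scores_neg]))
    cdf_pos = np.searchsorted(np.sort(scores_pos), thresholds, side="right") / len(scores_pos)
    cdf_neg = np.searchsorted(np.sort(scores_neg), thresholds, side="right") / len(scores_neg)
    return float(np.max(np.abs(cdf_pos - cdf_neg)))

# Toy decision scores: coal (positive) scores tend to be higher than gangue (negative)
pos = np.array([0.9, 0.8, 0.75, 0.6, 0.4])
neg = np.array([0.5, 0.35, 0.3, 0.2, 0.1])
ks = ks_statistic(pos, neg)
```

A KS of 1.0 corresponds to perfectly separated score distributions; a KS near 0 means the classifier's scores carry no class information.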

#### 4.5. Comparison with Other SVM Base Classifiers

## 5. Conclusions

- (1) The coal gangue image data were collected on-site, the gray-level gradient co-occurrence matrix texture features were extracted, and the coal gangue image dataset was constructed. The AdaB-GA-SVM classification model proposed in this paper was trained and tested. The results indicated that the model had a precision rate of 92.8% for gangue, a recall rate of 97.3%, and a KS value of 0.79, suggesting that the AdaB-GA-SVM model has excellent classification and identification performance and good generalization ability in coal gangue identification.
- (2) The coal gangue identification effects of the proposed algorithm and of boosted models built on other SVM base classifiers, such as SVM, GS-SVM, and PSO-SVM, were compared and analyzed. The results indicated that adaptive boosting improved the accuracy of each classification model. The AdaB-GA-SVM classifier had the highest accuracy at 95%, 5% to 11% higher than the AdaB-SVM, AdaB-GS-SVM, and AdaB-PSO-SVM classifiers, with comparable runtimes.
- (3) Image texture features and classification algorithms significantly impact the effect of coal gangue identification. More texture feature extraction methods and machine learning algorithms, such as improved local ternary patterns [36], XGBoost (eXtreme Gradient Boosting) [37,38], and deep learning algorithms, will be studied further for coal gangue recognition.

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest

## Appendix A

| No. | Texture Feature | Calculation Formula |
|---|---|---|
| 1 | Large gradient advantage | ${T}_{1}=\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}{y}^{2}\widehat{H}\left(x,y\right)\big/\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)$ |
| 2 | Small gradient advantage | ${T}_{2}=\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}\frac{\widehat{H}\left(x,y\right)}{{\left(y+1\right)}^{2}}\big/\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)$ |
| 3 | Gray distribution nonuniformity | ${T}_{3}=\sum_{x=0}^{{L}_{f}-1}{\left[\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)\right]}^{2}\big/\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)$ |
| 4 | Gradient distribution nonuniformity | ${T}_{4}=\sum_{y=0}^{{L}_{g}-1}{\left[\sum_{x=0}^{{L}_{f}-1}\widehat{H}\left(x,y\right)\right]}^{2}\big/\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)$ |
| 5 | Energy | ${T}_{5}=\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}{\widehat{H}}^{2}\left(x,y\right)$ |
| 6 | Gray average | ${T}_{6}=\sum_{x=0}^{{L}_{f}-1}x\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)$ |
| 7 | Gradient average | ${T}_{7}=\sum_{y=0}^{{L}_{g}-1}y\sum_{x=0}^{{L}_{f}-1}\widehat{H}\left(x,y\right)$ |
| 8 | Gray mean square error | ${T}_{8}={\left[\sum_{x=0}^{{L}_{f}-1}{\left(x-{T}_{6}\right)}^{2}\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)\right]}^{\frac{1}{2}}$ |
| 9 | Gradient mean square error | ${T}_{9}={\left[\sum_{y=0}^{{L}_{g}-1}{\left(y-{T}_{7}\right)}^{2}\sum_{x=0}^{{L}_{f}-1}\widehat{H}\left(x,y\right)\right]}^{\frac{1}{2}}$ |
| 10 | Correlation | ${T}_{10}=\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}\left(x-{T}_{6}\right)\left(y-{T}_{7}\right)\widehat{H}\left(x,y\right)$ |
| 11 | Gray-level entropy | ${T}_{11}=-\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)\mathrm{log}\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)$ |
| 12 | Gradient entropy | ${T}_{12}=-\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)\mathrm{log}\sum_{x=0}^{{L}_{f}-1}\widehat{H}\left(x,y\right)$ |
| 13 | Mixed entropy | ${T}_{13}=-\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)\mathrm{log}\widehat{H}\left(x,y\right)$ |
| 14 | Inertia | ${T}_{14}=\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}{\left(x-y\right)}^{2}\widehat{H}\left(x,y\right)$ |
| 15 | Inverse difference moment | ${T}_{15}=\sum_{x=0}^{{L}_{f}-1}\sum_{y=0}^{{L}_{g}-1}\widehat{H}\left(x,y\right)\big/\left[1+{\left(x-y\right)}^{2}\right]$ |

| No. | | 1 | 2 | 3 | 4 | … | 46 | … | 97 | 98 | 99 | 100 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Acc | 0.9013 | 0.9005 | 0.8982 | 0.9013 | … | 0.9009 | … | 0.9004 | 0.9004 | 0.9101 | 0.9018 |
| | C | 32.560 | 87.607 | 71.395 | 88.029 | … | 25.495 | … | 14.757 | 27.174 | 60.671 | 52.160 |
| | g | 1.405 | 6.562 | 0.027 | 6.561 | … | 2.870 | … | 6.015 | 6.739 | 3.300 | 23.737 |
| 2 | Acc | 0.8924 | 0.8888 | 0.8911 | 0.8926 | … | 0.9108 | … | 0.9013 | 0.9044 | 0.8960 | 0.8906 |
| | C | 83.454 | 92.829 | 84.333 | 35.835 | … | 60.278 | … | 3.282 | 78.138 | 39.206 | 28.448 |
| | g | 3.944 | 1.337 | 2.899 | 2.4575 | … | 2.739 | … | 7.124 | 2.966 | 6.233 | 6.873 |
| … | … | … | … | … | … | … | … | … | … | … | … | … |
| 79 | Acc | 0.8968 | 0.8960 | 0.9105 | 0.8946 | … | 0.9031 | … | 0.8986 | 0.8991 | 0.9009 | 0.9039 |
| | C | 50.863 | 54.997 | 53.990 | 72.738 | … | 84.658 | … | 45.364 | 84.817 | 84.817 | 65.289 |
| | g | 8.624 | 8.624 | 2.312 | 9.072 | … | 5.033 | … | 7.015 | 1.391 | 5.189 | 0.999 |
| 80 | Acc | 0.8977 | 0.9035 | 0.8964 | 0.8964 | … | 0.9018 | … | 0.9022 | 0.9098 | 0.8960 | 0.9012 |
| | C | 10.203 | 54.631 | 54.631 | 29.729 | … | 72.700 | … | 52.399 | 72.750 | 3.401 | 35.384 |
| | g | 8.163 | 3.749 | 8.291 | 3.134 | … | 1.884 | … | 6.040 | 2.011 | 4.009 | 2.291 |

| Base Classifier | Accuracy | C | g | e_{t} | λ_{t} | Train Time | Base Classifier | Accuracy | C | g | e_{t} | λ_{t} | Train Time |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| f_{1}(x) | 0.91 | 51.73 | 3.16 | 0.18 | 1.475 | 15 min 26 s | f_{11}(x) | 0.91 | 62.02 | 2.86 | 0.45 | 0.187 | 17 min 22 s |
| f_{2}(x) | 0.91 | 66.54 | 2.82 | 0.34 | 0.635 | 17 min 42 s | f_{12}(x) | 0.91 | 61.47 | 2.85 | 0.48 | 0.064 | 17 min 42 s |
| f_{3}(x) | 0.91 | 55.30 | 3.08 | 0.39 | 0.444 | 18 min 21 s | f_{13}(x) | 0.91 | 61.73 | 2.86 | 0.45 | 0.188 | 17 min 16 s |
| f_{4}(x) | 0.91 | 57.36 | 2.94 | 0.40 | 0.418 | 17 min 36 s | f_{14}(x) | 0.91 | 67.35 | 2.77 | 0.45 | 0.182 | 18 min 13 s |
| f_{5}(x) | 0.91 | 94.35 | 2.52 | 0.40 | 0.400 | 17 min 20 s | f_{15}(x) | 0.91 | 83.86 | 2.02 | 0.46 | 0.156 | 17 min 43 s |
| f_{6}(x) | 0.91 | 55.41 | 3.02 | 0.43 | 0.279 | 17 min 30 s | f_{16}(x) | 0.91 | 66.99 | 2.79 | 0.47 | 0.110 | 15 min 58 s |
| f_{7}(x) | 0.91 | 63.79 | 2.87 | 0.47 | 0.099 | 17 min 43 s | f_{17}(x) | 0.91 | 67.71 | 2.80 | 0.49 | 0.009 | 17 min 22 s |
| f_{8}(x) | 0.91 | 54.26 | 2.82 | 0.47 | 0.119 | 17 min 25 s | f_{18}(x) | 0.91 | 93.30 | 2.51 | 0.44 | 0.233 | 17 min 23 s |
| f_{9}(x) | 0.91 | 56.29 | 2.98 | 0.48 | 0.077 | 16 min 30 s | f_{19}(x) | 0.91 | 69.78 | 2.76 | 0.45 | 0.201 | 16 min 47 s |
| f_{10}(x) | 0.91 | 44.73 | 3.26 | 0.41 | 0.361 | 17 min 35 s | f_{20}(x) | 0.91 | 68.63 | 1.94 | 0.50 | 0.001 | 17 min 35 s |

## References

1. Liu, Q.; Li, J.; Li, Y.; Gao, M. Recognition Methods for Coal and Coal Gangue Based on Deep Learning. IEEE Access **2021**, 9, 77599–77610.
2. Sun, Z.; Huang, L.; Jia, R. Coal and Gangue Separating Robot System Based on Computer Vision. Sensors **2021**, 21, 1349.
3. Lei, S.; Xiao, X.; Zhang, M.; Dai, J. Visual classification method based on CNN for coal-gangue sorting robots. In Proceedings of the 2020 5th International Conference on Automation, Control and Robotics Engineering (CACRE), Dalian, China, 18–20 September 2020; pp. 543–547.
4. Li, M.; Duan, Y.; He, X.; Yang, M. Image positioning and identification method and system for coal and gangue sorting robot. Int. J. Coal Prep. Util. **2020**, 42, 1759–1777.
5. Liu, P.; Qiao, X.; Zhang, X. Stability sensitivity for a cable-based coal–gangue picking robot based on grey relational analysis. Int. J. Adv. Robot. Syst. **2021**, 18, 1–12.
6. Wang, Z.; Xie, S.; Chen, G.; Chi, W.; Ding, Z.; Wang, P. An Online Flexible Sorting Model for Coal and Gangue Based on Multi-Information Fusion. IEEE Access **2021**, 9, 90816–90827.
7. Eshaq, R.M.A.; Hu, E.; Li, M.; Alfarzaeai, M.S. Separation Between Coal and Gangue Based on Infrared Radiation and Visual Extraction of the YCbCr Color Space. IEEE Access **2020**, 8, 55204–55220.
8. Guo, Y.; Wang, X.; Wang, S.; Hu, K.; Wang, W. Identification Method of Coal and Coal Gangue Based on Dielectric Characteristics. IEEE Access **2021**, 9, 9845–9854.
9. Hu, F.; Zhou, M.; Yan, P.; Bian, K.; Dai, R. Multispectral Imaging: A New Solution for Identification of Coal and Gangue. IEEE Access **2019**, 7, 169697–169704.
10. Zhang, Y.; Zhu, H.; Zhu, J.; Ou, Z.; Shen, T.; Sun, J.; Feng, A. Experimental study on separation of lumpish coal and gangue using X-ray. Energy Sources Part A Recovery Util. Environ. Eff. **2021**, 9, 1–13.
11. Zou, L.; Yu, X.; Li, M.; Lei, M.; Yu, H. Nondestructive Identification of Coal and Gangue via Near-infrared Spectroscopy based on Improved Broad Learning. IEEE Trans. Instrum. Meas. **2020**, 69, 8043–8052.
12. Wang, J.; Li, L.; Yang, S. Experimental study on gray and texture features extraction of coal and gangue image under different illuminance. J. China Coal Soc. **2018**, 43, 3051–3061. (In Chinese)
13. Zhao, Y.; Wang, S.; Guo, Y.; Cheng, G.; He, L.; Wang, W. The identification of coal and gangue and the prediction of the degree of coal metamorphism based on the EDXRD principle and the PSO-SVM model. Gospod. Surowcami Miner. **2022**, 38, 113–129.
14. Su, L.; Cao, X.; Ma, H.; Li, Y. Research on Coal Gangue Identification by Using Convolutional Neural Network. In Proceedings of the 2018 2nd IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC), Xi'an, China, 25–27 May 2018; pp. 810–814.
15. Pu, Y.; Apel, D.B.; Szmigiel, A.; Chen, J. Image Recognition of Coal and Coal Gangue Using a Convolutional Neural Network and Transfer Learning. Energies **2019**, 12, 1735.
16. Li, D.; Zhang, Z.; Xu, Z.; Xu, L.; Meng, G.; Li, Z.; Chen, S. An Image-Based Hierarchical Deep Learning Framework for Coal and Gangue Detection. IEEE Access **2019**, 7, 184686–184699.
17. McCoy, J.; Auret, L. Machine learning applications in minerals processing: A review. Miner. Eng. **2019**, 132, 95–109.
18. Hou, W. Identification of Coal and Gangue by Feed-forward Neural Network Based on Data Analysis. Int. J. Coal Prep. Util. **2017**, 39, 33–43.
19. Alfarzaeai, M.S.; Niu, Q.; Zhao, J.; Eshaq, R.M.A.; Hu, E. Coal/Gangue Recognition Using Convolutional Neural Networks and Thermal Images. IEEE Access **2020**, 8, 76780–76789.
20. Li, D.; Wang, G.; Zhang, Y.; Wang, S. Coal gangue detection and recognition algorithm based on deformable convolution YOLOv3. IET Image Process. **2022**, 16, 134–144.
21. Yan, P.; Sun, Q.; Yin, N.; Hua, L.; Shang, S.; Zhang, C. Detection of coal and gangue based on improved YOLOv5.1 which embedded scSE module. Measurement **2022**, 188, 110530.
22. Li, M.; He, X.; Duan, Y.; Yang, M. Experimental study on the influence of external factors on image features of coal and gangue. Int. J. Coal Prep. Util. **2021**, 42, 2770–2787.
23. Li, N.; Gong, X. An Image Preprocessing Model of Coal and Gangue in High Dust and Low Light Conditions Based on the Joint Enhancement Algorithm. Comput. Intell. Neurosci. **2021**, 2021, 1–10.
24. Vapnik, V.N.; Chervonenkis, A. A note on one class of perceptrons. Autom. Remote Control **1964**, 25, 821–837.
25. Goldberg, D.E. Genetic algorithms in search, optimization, and machine learning. Choice Rev. **1989**, 27, 39–45.
26. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the ICNN'95—International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; pp. 1942–1948.
27. Dorigo, M.; Caro, G.D. Ant colony optimization: A new meta-heuristic. In Proceedings of the 1999 Congress on Evolutionary Computation—CEC99 (Cat. No. 99TH8406), Washington, DC, USA, 6–9 July 1999; pp. 1470–1477.
28. Freund, Y.; Schapire, R.E. Experiments with a New Boosting Algorithm. In Proceedings of the 13th International Conference on Machine Learning, Bari, Italy, 3–6 July 1996; pp. 148–156.
29. Dou, P.; Chen, Y.; Yue, H. Remote-sensing imagery classification using multiple classification algorithm-based AdaBoost. Int. J. Remote Sens. **2018**, 39, 619–639.
30. Zhang, Y.; Ni, M.; Zhang, C.; Liang, S.; Fang, S.; Li, R.; Tan, Z. Research and Application of AdaBoost Algorithm Based on SVM. In Proceedings of the 2019 IEEE 8th Joint International Information Technology and Artificial Intelligence Conference (ITAIC), Chongqing, China, 24–26 May 2019; pp. 662–666.
31. Hong, J. Gray-gradient co-occurrence matrix texture analysis method. Acta Autom. Sin. **1984**, 10, 22–25.
32. Rezaei, M.; Saberi, M.; Ershad, S.F. Texture classification approach based on combination of random threshold vector technique and co-occurrence matrixes. In Proceedings of the 2011 International Conference on Computer Science and Network Technology (ICCSNT), Harbin, China, 24–26 December 2011; Volume 4, pp. 2303–2306.
33. Xue, G.; Li, X.; Qian, X. Coal-gangue image recognition in fully-mechanized caving face based on random forest. Ind. Mine Autom. **2020**, 46, 57–62. (In Chinese)
34. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine Learning in Python. J. Mach. Learn. Res. **2011**, 12, 2825–2830.
35. Cho, G.-S.; Gantulga, N.; Choi, Y.-W. A comparative study on multi-class SVM & kernel function for land cover classification in a KOMPSAT-2 image. KSCE J. Civ. Eng. **2017**, 21, 1894–1904.
36. Fekri-Ershad, S. Bark texture classification using improved local ternary patterns and multilayer neural network. Expert Syst. Appl. **2020**, 158, 113509.
37. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the KDD'16: 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 785–794.
38. Zhou, M.; Lai, W. Coal gangue recognition based on spectral imaging combined with XGBoost. PLoS ONE **2023**, 18, e0279955.

**Figure 2.** The manual sorting scene and the sorted coal gangue: (**a**) manual sorting; (**b**) sorted coal gangue.

**Figure 4.** Effect comparison of the coal and gangue images before and after preprocessing: (**a**) raw image, (**b**) gray conversion, (**c**) gamma correction, and (**d**) enhancement.

**Figure 6.** Importance of the texture features of the coal gangue image [33].

**Figure 7.** The variation curve of the coal gangue image recognition accuracy of AdaB-GA-SVM with different numbers of integrated base classifiers.

**Figure 8.** The variation curve of the highest individual fitness during the training process of each of the 20 base classifiers.

**Table 1.** The accuracy and runtime of coal-gangue identification by SVM models with different kernel functions.

| Kernel Function | Accuracy Rate | Runtime (s) |
|---|---|---|
| Polynomial | 73% | 0.0078 |
| RBF | 83% | 0.0142 |
| Sigmoid | 82% | 0.0128 |

| No. | Population Size | Crossover Probability | Mutation Probability | Accuracy |
|---|---|---|---|---|
| 1 | 30 | 0.4 | 0.01 | 0.892 |
| 2 | 20 | 0.4 | 0.0001 | 0.884 |
| … | … | … | … | … |
| 80 | 70 | 0.5 | 0.01 | 0.897 |
| 81 | 80 | 0.8 | 0.05 | 0.889 |
| k_{1} | 0.8976 | 0.8995 | 0.8977 | |
| k_{2} | 0.9016 | 0.8958 | 0.9047 | |
| k_{3} | 0.8936 | 0.8997 | 0.8996 | |
| k_{4} | 0.9012 | 0.9001 | 0.8999 | |
| k_{5} | 0.9 | 0.8967 | 0.8966 | |
| k_{6} | 0.9031 | 0.9029 | 0.9004 | |
| k_{7} | 0.9038 | 0.9017 | 0.8997 | |
| k_{8} | 0.8963 | | | |
| k_{9} | 0.8976 | | | |

**Table 3.** The results of the GA-SVM model and the AdaB-GA-SVM model when tested with the aforementioned test set.

| Evaluation Indicator | GA-SVM | AdaB-GA-SVM |
|---|---|---|
| TP | 426 | 464 |
| FP | 74 | 36 |
| FN | 15 | 13 |
| TN | 485 | 487 |
| Acc | 0.911 | 0.951 |
| P | 0.852 | 0.928 |
| R | 0.966 | 0.973 |
| F_{1} | 0.905 | 0.950 |

**Table 4.** Coal gangue recognition accuracy and recognition runtime of different base classifiers before and after adaptive boosting.

| Base Classifier | Accuracy Before (%) | Accuracy After (%) | Runtime Before (s) | Runtime After (s) |
|---|---|---|---|---|
| SVM | 83 | 84 | 0.0142 | 0.076 |
| GS-SVM | 85 | 86 | 0.0121 | 0.104 |
| PSO-SVM | 86 | 90 | 0.0139 | 0.171 |
| GA-SVM | 91 | 95 | 0.0173 | 0.124 |


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Xue, G.; Hou, P.; Li, S.; Qian, X.; Han, S.; Gao, S.
Coal Gangue Recognition during Coal Preparation Using an Adaptive Boosting Algorithm. *Minerals* **2023**, *13*, 329.
https://doi.org/10.3390/min13030329
