A Review of Artificial Intelligence Technologies in Mineral Identification: Classification and Visualization
Abstract
1. Introduction
- In this paper, artificial-intelligence-based mineral identification models are classified into three categories. (1) Artificial neural networks. Mineral identification models based on artificial neural networks are accurate and hold a potential advantage over other methods when, for example, Raman spectroscopy datasets are used for mineral identification, since fluorescence does not need to be removed. However, artificial neural networks demand considerable mineral expertise and experience to avoid overtraining and undertraining. (2) Machine learning. In this paper, machine learning is divided into statistical-based and rule-based machine learning. A model acquires its ability to identify minerals during training; when predefined rules are adopted to steer that training and improve its efficiency, we refer to rule-based machine learning. Statistical-based machine learning, on the other hand, requires little mineral expertise, relies mainly on the quality of the dataset, and needs large amounts of data for training. (3) Deep learning. The emergence of deep learning broke the deadlock in artificial intelligence and extends artificial neural networks. Deep learning models use deeper hidden layers to approximate reality as closely as possible, giving them greater learning capacity and better performance. However, the accuracy of deep learning depends heavily on data: the larger the dataset, the higher the accuracy. In addition, the nonlinear function mapping capability applied in deep learning models reduces the computation time needed to extract features from images and improves model accuracy.
- Visualization analysis is performed to explore the panorama of developments in the field based on the literature related to intelligent mineral identification. Specifically, we first analyzed the development paths of the field to trace how academic hot spots and themes have evolved over time. Then, recent literature was analyzed to explore which topics currently concern scholars. Finally, based on the taxonomy established above, we performed keyword detection analysis on the literature related to each of the three intelligent identification methods and summarized the focus and development trends of the research on each method. The visualization results show that a wider range of application scenarios and more accurate identification results have become the main goals pursued by scholars.
2. Related Work
3. Preliminaries
3.1. Basic Process of Intelligent Mineral Identification
3.2. Network Architecture Overview
3.2.1. Artificial Neural Network (ANN)
3.2.2. Convolutional Neural Network (CNN)
- Convolutional Layer: As the central layer of the convolutional neural network, the convolutional layer extracts different features of the input data through convolution operations, and it also reduces the number of parameters, preventing the overfitting that too many parameters would cause. A convolutional layer can have multiple convolution kernels, and each element of each kernel has a corresponding weight coefficient and bias. As the kernel slides to each position, it operates on the input image and projects the information within its receptive field onto the feature map. The parameters of the convolutional layer mainly consist of the kernel size, the padding, and the step size. The kernel must be smaller than the input image, and the larger the kernel, the more complex the input features that can be extracted. Padding artificially enlarges the feature map before it passes through the kernel to counteract the shrinkage that occurs during the computation. The step size defines the distance between two adjacent positions of the kernel as it scans the feature map: when it is 1, the kernel scans every element of the feature map; when it is n, it skips n − 1 pixels after each scan before continuing. The output dimensionality of the convolutional layer can be calculated from the filter of size (k, k, C), the input image of fixed size (H, W, C), the step size S, and the amount of zero padding P as H_out = floor((H − k + 2P)/S) + 1 and W_out = floor((W − k + 2P)/S) + 1, with the output depth equal to the number of filters. The convolutional layer is usually followed by an activation layer; the two are commonly combined and referred to together as the "convolutional layer". The activation layer applies a nonlinear mapping to the output of the convolutional layer, and the activation function used is usually the ReLU function.
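A minimal Python sketch of the output-size formula above (the function name and the example sizes are illustrative assumptions):

```python
import math

def conv_output_size(h, w, k, stride=1, padding=0):
    """Spatial output size of a convolution: floor((dim - k + 2*padding) / stride) + 1."""
    h_out = math.floor((h - k + 2 * padding) / stride) + 1
    w_out = math.floor((w - k + 2 * padding) / stride) + 1
    return h_out, w_out

# A 3x3 kernel with stride 1 and padding 1 preserves the spatial size.
print(conv_output_size(224, 224, k=3, stride=1, padding=1))  # (224, 224)
# A 5x5 kernel with stride 2 and no padding shrinks the feature map.
print(conv_output_size(224, 224, k=5, stride=2, padding=0))  # (110, 110)
```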
- Pooling Layer: The pooling layer sits between successive convolutional layers and is mainly used for feature selection and information filtering. Feature selection reduces the number of training parameters and thus the dimensionality of the feature vectors output by the convolutional layer, while information filtering retains only useful information, reducing the propagation of noise and effectively preventing overfitting. Pooling layers are usually inserted periodically between successive convolutional layers. The two main pooling methods are max pooling, which takes the maximum value within the sliding window, and average pooling, which takes its average value. For an input feature map of size (H, W, C), a pooling window of size k, and a step size S, the output dimensions of the pooling layer are H_out = floor((H − k)/S) + 1 and W_out = floor((W − k)/S) + 1, with the depth C unchanged.
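Below is a small NumPy sketch of max and average pooling over a 2 × 2 window (the helper name and the toy feature map are assumptions for illustration):

```python
import numpy as np

def pool2d(x, size=2, stride=2, mode="max"):
    """Pool a 2-D feature map; output dims follow floor((dim - size) / stride) + 1."""
    h_out = (x.shape[0] - size) // stride + 1
    w_out = (x.shape[1] - size) // stride + 1
    out = np.empty((h_out, w_out))
    for i in range(h_out):
        for j in range(w_out):
            window = x[i * stride:i * stride + size, j * stride:j * stride + size]
            out[i, j] = window.max() if mode == "max" else window.mean()
    return out

feature_map = np.arange(16, dtype=float).reshape(4, 4)
print(pool2d(feature_map, mode="max"))   # 2x2 map of per-window maxima
print(pool2d(feature_map, mode="mean"))  # 2x2 map of per-window averages
```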
- Fully Connected Layer: The fully connected layer is located in the last part of the hidden layers of the convolutional neural network and passes signals only to other fully connected layers. Its role is to perform a nonlinear combination of the extracted features to produce the output used for classification; that is, the features obtained from the convolutional and pooling layers are classified by the fully connected layer. During training, the weights of each neuron are adjusted according to the feedback they receive, and the network is tuned until the final classification results are obtained.
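To make the convolution → pooling → fully connected pipeline concrete, here is a minimal PyTorch sketch of a toy classifier; the layer sizes, the 64 × 64 RGB input, and the ten output classes are illustrative assumptions, not a model from the surveyed literature:

```python
import torch
import torch.nn as nn

class TinyMineralCNN(nn.Module):
    """Toy CNN: two conv + ReLU + max-pool stages followed by a fully connected classifier."""

    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # 3x64x64 -> 16x64x64
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 16x32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),   # -> 32x32x32
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 32x16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128),  # nonlinear combination of the extracted features
            nn.ReLU(),
            nn.Linear(128, num_classes),   # class scores for classification
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# A batch of four hypothetical 64x64 RGB mineral images.
logits = TinyMineralCNN(num_classes=10)(torch.randn(4, 3, 64, 64))
print(logits.shape)  # torch.Size([4, 10])
```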
3.2.3. Difference between ANN and CNN
4. Mineral Identification Method Based on Artificial Intelligence
4.1. Artificial Neural Network
4.2. Machine Learning
4.3. Deep Learning
4.4. Other Models
5. Visualization and Analysis
5.1. Data and Visualization Tools
5.2. Field of Mineral Identification
5.3. Mineral Identification Methods Based on Taxonomy
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Ramil, A.; López, A.; Pozo-Antonio, J.; Rivas, T. A computer vision system for identification of granite-forming minerals based on RGB data and artificial neural networks. Measurement 2018, 117, 90–95. [Google Scholar] [CrossRef]
- López, A.; Ramil, A.; Pozo-Antonio, J.; Fiorucci, M.; Rivas, T. Automatic Identification of Rock-Forming Minerals in Granite Using Laboratory Scale Hyperspectral Reflectance Imaging and Artificial Neural Networks. J. Nondestruct. Eval. 2017, 36, 52. [Google Scholar] [CrossRef]
- DeTore, A.W. An introduction to expert systems. J. Insur. Med. 1989, 21, 233–236. [Google Scholar]
- Folorunso, I.; Abikoye, O.; Jimoh, R.; Raji, K. A rule-based expert system for mineral identification. J. Emerg. Trends Comput. Inf. Sci. 2012, 3, 205–210. [Google Scholar]
- Okada, N.; Maekawa, Y.; Owada, N.; Haga, K.; Shibayama, A.; Kawamura, Y. Automated identification of mineral types and grain size using hyperspectral imaging and deep learning for mineral processing. Minerals 2020, 10, 809. [Google Scholar] [CrossRef]
- Mishra, K.A. AI4R2R (AI for Rock to Revenue): A Review of the Applications of AI in Mineral Processing. Minerals 2021, 11, 1118. [Google Scholar] [CrossRef]
- Izadi, H.; Sadri, J.; Bayati, M. An intelligent system for mineral identification in thin sections based on a cascade approach. Comput. Geosci. 2017, 99, 37–49. [Google Scholar] [CrossRef]
- Prabhavathy, P.; Tripathy, B.; Venkatesan, M. Unsupervised learning method for mineral identification from hyperspectral data. In Proceedings of the International Conference on Innovations in Bio-Inspired Computing and Applications, Gunupur, India, 16–18 December 2019; Springer: Cham, Switzerland, 2019; pp. 148–160. [Google Scholar]
- Chen, Z.; Liu, X.; Yang, J.; Little, E.; Zhou, Y. Deep learning-based method for SEM image segmentation in mineral characterization, an example from Duvernay Shale samples in Western Canada Sedimentary Basin. Comput. Geosci. 2020, 138, 104450. [Google Scholar] [CrossRef]
- Lobo, A.; Garcia, E.; Barroso, G.; Martí, D.; Fernandez-Turiel, J.L.; Ibáñez-Insa, J. Machine-learning for mineral identification and ore estimation from hyperspectral imagery in tin-tungsten deposits. Remote Sens. 2021, 13, 3258. [Google Scholar] [CrossRef]
- De Lima, R.P.; Duarte, D.; Nicholson, C.; Slatt, R.; Marfurt, K.J. Petrographic microfacies classification with deep convolutional neural networks. Comput. Geosci. 2020, 142, 104481. [Google Scholar] [CrossRef]
- Tang, D.G.; Milliken, K.L.; Spikes, K.T. Machine learning for point counting and segmentation of arenite in thin section. Mar. Pet. Geol. 2020, 120, 104518. [Google Scholar] [CrossRef]
- Wang, Q.; Zhang, X.; Tang, B.; Ma, Y.; Xing, J.; Liu, L. Lithology identification technology using BP neural network based on XRF. Acta Geophys. 2021, 69, 2231–2240. [Google Scholar] [CrossRef]
- Thompson, S.; Fueten, F.; Bockus, D. Mineral identification using artificial neural networks and the rotating polarizer stage. Comput. Geosci. 2001, 27, 1081–1089. [Google Scholar] [CrossRef]
- Baykan, N.A.; Yılmaz, N. Mineral identification using color spaces and artificial neural networks. Comput. Geosci. 2010, 36, 91–97. [Google Scholar] [CrossRef]
- Mlynarczuk, M.; Skiba, M. The application of artificial intelligence for the identification of the maceral groups and mineral components of coal. Comput. Geosci. 2017, 103, 133–141. [Google Scholar] [CrossRef]
- Xu, S.T.; Zhou, Y.Z. Artificial intelligence identification of ore minerals under microscope based on deep learning algorithm. Acta Petrol. Sin. 2018, 34, 3244–3252. [Google Scholar]
- Guo, Y.J.; Zhou, Z.; Lin, H.X.; Liu, X.H.; Chen, D.Q.; Zhu, J.Q.; Wu, J.Q. The mineral intelligence identification method based on deep learning algorithms. Earth Sci. Front. 2020, 27, 39–47. [Google Scholar]
- Ren, W.; Zhang, S.; Huang, J.Q. The rock and mineral intelligence identification method based on deep learning. Geol. Rev. 2021, 67, 2. [Google Scholar]
- Wen, L.; Jia, M.; Wang, Q.; Fu, Q.; Zhao, J. Automated mineralogy part I. advances and applications in quantitative, automated mineralogical methods. China Min. Mag. 2020, 341–349. [Google Scholar]
- Qiang, Z.; Runxin, Z.; Junming, L.; Cong, W.; Hezhe, Z.; Ying, T. Review on coal and rock identification technology for intelligent mining in coal mines. Coal Sci. Technol. 2022. [Google Scholar]
- Huizhen, H.; Qing, G.; Xiumian, H. Research Advances and Prospective in Mineral Intelligent Identification Based on Machine Learning. Earth Sci.-J. China Univ. Geosci. 2021, 46, 3091–3106. [Google Scholar]
- Murugan, P. Feed forward and backward run in deep convolution neural network. arXiv 2017, arXiv:1711.03278. [Google Scholar]
- Mi, Z.; Kai, Q.; Ling, Z.; Yuechao, Y. Deep Learning Method for Mineral Spectral Unmixing. Uranium Geol. 2022. [Google Scholar]
- Jiang, G.; Zhou, K.; Wang, J.; Cui, S.; Zhou, S.; Tang, C. Identification of iron-bearing minerals based on HySpex hyperspectral remote sensing data. J. Appl. Remote. Sens. 2019, 13, 047501. [Google Scholar] [CrossRef]
- Izadi, H.; Sadri, J.; Mehran, N.A. Intelligent mineral identification using clustering and artificial neural networks techniques. In Proceedings of the 2013 First Iranian Conference on Pattern Recognition and Image Analysis (PRIA), Birjand, Iran, 6–8 March 2013; pp. 1–5. [Google Scholar]
- Aligholi, S.; Khajavi, R.; Razmara, M. Automated mineral identification algorithm using optical properties of crystals. Comput. Geosci. 2015, 85, 175–183. [Google Scholar] [CrossRef]
- Anigbogu, P.; Odunayo, D.; Okwudili, S.; Olanloye. An Intelligent System for Mineral Prospecting Using Supervised and Unsupervised Learning Approach. Int. J. Eng. Tech. Res. (IJETR) 2015, 3, 200–208. [Google Scholar]
- Yousefi, B.; Castanedo, C.I.; Maldague, X.P.; Beaudoin, G. Assessing the reliability of an automated system for mineral identification using LWIR Hyperspectral Infrared imagery. Miner. Eng. 2020, 155, 106409. [Google Scholar] [CrossRef]
- Khajehzadeh, N.; Haavisto, O.; Koresaar, L. On-stream and quantitative mineral identification of tailing slurries using LIBS technique. Miner. Eng. 2016, 98, 101–109. [Google Scholar] [CrossRef]
- Cochrane, C.J.; Blacksberg, J. A fast classification scheme in Raman spectroscopy for the identification of mineral mixtures using a large database with correlated predictors. IEEE Trans. Geosci. Remote Sens. 2015, 53, 4259–4274. [Google Scholar] [CrossRef]
- El Haddad, J.; de Lima Filho, E.S.; Vanier, F.; Harhira, A.; Padioleau, C.; Sabsabi, M.; Wilkie, G.; Blouin, A. Multiphase mineral identification and quantification by laser-induced breakdown spectroscopy. Miner. Eng. 2019, 134, 281–290. [Google Scholar] [CrossRef]
- Domínguez-Olmedo, J.L.; Toscano, M.; Mata, J. Application of classification trees for improving optical identification of common opaque minerals. Comput. Geosci. 2020, 140, 104480. [Google Scholar] [CrossRef]
- Tiwary, A.K.; Ghosh, S.; Singh, R.; Mukherjee, D.P.; Shankar, B.U.; Dash, P.S. Automated coal petrography using random forest. Int. J. Coal Geol. 2020, 232, 103629. [Google Scholar] [CrossRef]
- Liu, C.; Li, M.; Zhang, Y.; Han, S.; Zhu, Y. An enhanced rock mineral recognition method integrating a deep learning model and clustering algorithm. Minerals 2019, 9, 516. [Google Scholar] [CrossRef]
- Liu, X.; Song, H. Automatic identification of fossils and abiotic grains during carbonate microfacies analysis using deep convolutional neural networks. Sediment. Geol. 2020, 410, 105790. [Google Scholar] [CrossRef]
- Anderson, T.I.; Vega, B.; Kovscek, A.R. Multimodal imaging and machine learning to enhance microscope images of shale. Comput. Geosci. 2020, 145, 104593. [Google Scholar] [CrossRef]
- Latif, G.; Bouchard, K.; Maitre, J.; Back, A.; Bédard, L.P. Deep-Learning-Based Automatic Mineral Grain Segmentation and Recognition. Minerals 2022, 12, 455. [Google Scholar] [CrossRef]
- Zhao, H.; Deng, K.; Li, N.; Wang, Z.; Wei, W. Hierarchical Spatial-Spectral Feature Extraction with Long Short Term Memory (LSTM) for Mineral Identification Using Hyperspectral Imagery. Sensors 2020, 20, 6854. [Google Scholar] [CrossRef]
- Tanaka, S.; Tsuru, H.; Someno, K.; Yamaguchi, Y. Identification of alteration minerals from unstable reflectance spectra using a deep learning method. Geosciences 2019, 9, 195. [Google Scholar] [CrossRef]
- Jahoda, P.; Drozdovskiy, I.; Payler, S.J.; Turchi, L.; Bessone, L.; Sauro, F. Machine learning for recognizing minerals from multispectral data. Analyst 2021, 146, 184–195. [Google Scholar] [CrossRef]
- Cai, Y.; Xu, D.; Shi, H. Rapid identification of ore minerals using multi-scale dilated convolutional attention network associated with portable Raman spectroscopy. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2022, 267, 120607. [Google Scholar] [CrossRef]
- Qing-Lin, T.; Bang-Jie, G.; Fa-Wang, Y.; Yao, L.; Peng-Fei, L.; Xue-Jiao, C. Mineral Spectra Classification Based on One-Dimensional Dilated Convolutional Neural Network. Spectrosc. Spectr. Anal. 2022, 42, 873–877. [Google Scholar]
- Zeng, X.; Xiao, Y.; Ji, X.; Wang, G. Mineral identification based on deep learning that combines image and mohs hardness. Minerals 2021, 11, 506. [Google Scholar] [CrossRef]
- Lou, W.; Zhang, D.; Bayless, R.C. Review of mineral recognition and its future. Appl. Geochem. 2020, 122, 104727. [Google Scholar] [CrossRef]
- Zhang, Y.; Li, M.; Han, S.; Ren, Q.; Shi, J. Intelligent identification for rock-mineral microscopic images using ensemble machine learning algorithms. Sensors 2019, 19, 3914. [Google Scholar] [CrossRef] [PubMed]
- Peng, W.; Bai, L.; Shang, S.; Tang, X.; Zhang, Z. Common mineral intelligent recognition based on improved InceptionV3. Geol. Bull. China 2019, 38, 2059–2066. [Google Scholar]
- Wei, R.; Zhang, M.; Zhang, S.; Qiao, J.; Huang, J. Identifying rock thin section based on convolutional neural networks. In Proceedings of the 2019 the 9th International Workshop on Computer Science and Engineering, Hong Kong, China, 15–17 June 2019. [Google Scholar]
- Remus, J.J.; Harmon, R.S.; Hark, R.R.; Haverstock, G.; Baron, D.; Potter, I.K.; Bristol, S.K.; East, L.J. Advanced signal processing analysis of laser-induced breakdown spectroscopy data for the discrimination of obsidian sources. Appl. Opt. 2012, 51, B65–B73. [Google Scholar] [CrossRef]
- Xuefeng, L.; Xiaowei, Z.; Xin, Z.; Daojie, C.; Hao, N.; Chaoliu, L.; Jun, Y.; Falong, H.; Changxi, L.; Baojun, W. Pore structure characterization of shales using SEM and machine learning-based segmentation method. J. China Univ. Pet. (Ed. Nat. Sci.) 2022, 46, 23–33. [Google Scholar]
- Baklanova, O.; Shvets, O. Cluster analysis methods for recognition of mineral rocks in the mining industry. In Proceedings of the 2014 4th International Conference on Image Processing Theory, Tools and Applications (IPTA), Paris, France, 14–17 October 2014; pp. 1–5. [Google Scholar]
- Zhang, Z.L.; Zhang, Z.W.; Hu, Q.; Wang, L. Study on multi-product coal image classification method based on deep learning. Coal Sci. Technol. 2021, 49. [Google Scholar]
- Wu, B.; Li, X.; Yuan, F.; Li, H.; Zhang, M. Transfer learning and siamese neural network based identification of geochemical anomalies for mineral exploration: A case study from the CuAu deposit in the NW Junggar area of northern Xinjiang Province, China. J. Geochem. Explor. 2022, 232, 106904. [Google Scholar] [CrossRef]
- Van Eck, N.; Waltman, L. Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics 2010, 84, 523–538. [Google Scholar] [CrossRef]
- Aria, M.; Cuccurullo, C. bibliometrix: An R-tool for comprehensive science mapping analysis. J. Inf. 2017, 11, 959–975. [Google Scholar] [CrossRef]
- Pooley, F. The Identification of Asbestos Dust with an Electron Microscope Microprobe Analyser. Ann. Occup. Hyg. 1975, 18, 181–186. [Google Scholar]
- Susilowati, Y.; Rahyuwibowo, H.; Mengko, T.R. Characteristic of interference color in rock forming mineral images. In Proceedings of the Asia-Pacific Conference on Circuits and Systems, Denpasar, Indonesia, 28–31 October 2002; Volume 2, pp. 265–268. [Google Scholar]
- Wang, A.; Haskin, L.A.; Lane, A.L.; Wdowiak, T.J.; Squyres, S.W.; Wilson, R.J.; Hovland, L.E.; Manatt, K.S.; Raouf, N.; Smith, C.D. Development of the Mars microbeam Raman spectrometer (MMRS). J. Geophys. Res. Planets 2003, 108, E1. [Google Scholar] [CrossRef]
- Verberckmoes, S.; Persy, V.; Behets, G.; Neven, E.; Hufkens, A.; Zebger-Gong, H.; Müller, D.; Haffner, D.; Querfeld, U.; Bohic, S.; et al. Uremia-related vascular calcification: More than apatite deposition. Kidney Int. 2007, 71, 298–303. [Google Scholar] [CrossRef]
- Lemmens, H.; Butcher, A.; Botha, P. FIB/SEM and SEM/EDX: A New Dawn for the SEM in the Core Lab? Petrophysics-SPWLA J. Form. Eval. Reserv. Descr. 2011, 52, 452–456. [Google Scholar]
- Li, Y.; Lu, H.; Zhang, L.; Li, J.; Serikawa, S. Real-time visualization system for deep-sea surveying. Math. Probl. Eng. 2014, 2014, 437071. [Google Scholar] [CrossRef]
- Li, N.; Huang, P.; Zhao, H.; Jia, G. The quantitative evaluation of application of hyperspectral data based on multi-parameters joint optimization. Sci. China Technol. Sci. 2014, 57, 2249–2255. [Google Scholar] [CrossRef]
- Izadi, H.; Sadri, J.; Mehran, N.A. A new approach to apply texture features in minerals identification in petrographic thin sections using ANNs. In Proceedings of the 2013 8th Iranian Conference on Machine Vision and Image Processing (MVIP), Zanjan, Iran, 10–12 September 2013; pp. 257–261. [Google Scholar]
- Ellis, B.R.; Peters, C.A. 3D Mapping of calcite and a demonstration of its relevance to permeability evolution in reactive fractures. Adv. Water Resour. 2016, 95, 246–253. [Google Scholar] [CrossRef]
- Hao, H.; Guo, R.; Gu, Q.; Hu, X. Machine learning application to automatically classify heavy minerals in river sand by using SEM/EDS data. Miner. Eng. 2019, 143, 105899. [Google Scholar] [CrossRef]
- Iglesias, J.C.Á.; Santos, R.B.M.; Paciornik, S. Deep learning discrimination of quartz and resin in optical microscopy images of minerals. Miner. Eng. 2019, 138, 79–85. [Google Scholar] [CrossRef]
- Tang, K.; Da Wang, Y.; Mostaghimi, P.; Knackstedt, M.; Hargrave, C.; Armstrong, R.T. Deep convolutional neural network for 3D mineral identification and liberation analysis. Miner. Eng. 2022, 183, 107592. [Google Scholar] [CrossRef]
- Ehlmann, B.L.; Mustard, J.F.; Swayze, G.A.; Clark, R.N.; Bishop, J.L.; Poulet, F.; Des Marais, D.J.; Roach, L.H.; Milliken, R.E.; Wray, J.J.; et al. Identification of hydrated silicate minerals on Mars using MRO-CRISM: Geologic context near Nili Fossae and implications for aqueous alteration. J. Geophys. Res. Planets 2009, 114, E2. [Google Scholar] [CrossRef]
- Das, R.; Mondal, A.; Chakraborty, T.; Ghosh, K. Deep neural networks for automatic grain-matrix segmentation in plane and cross-polarized sandstone photomicrographs. Appl. Intell. 2022, 52, 2332–2345. [Google Scholar] [CrossRef]
- Casetou-Gustafson, S.; Akselsson, C.; Hillier, S.; Olsson, B.A. The importance of mineral determinations to PROFILE base cation weathering release rates: A case study. Biogeosciences 2019, 16, 1903–1920. [Google Scholar] [CrossRef]
- Ting-Yue, L.; Jing-Jing, D.; Shu-Fang, T. A Neural Network Recognition Method for Garnets Subclass Based on Hyper Spectroscopy. Spectrosc. Spectr. Anal. 2021, 41, 1758–1763. [Google Scholar]
- Shao, Y.; Chen, Q.; Zhang, D. The application of improved BP neural network algorithm in lithology recognition. In Proceedings of the International Symposium on Intelligence Computation and Applications, Wuhan, China, 19–21 December 2008; Springer: Cham, Switzerland, 2008; pp. 342–349. [Google Scholar]
- Tsuji, T.; Yamaguchi, H.; Ishii, T.; Matsuoka, T. Mineral classification from quantitative X-ray maps using neural network: Application to volcanic rocks. Isl. Arc 2010, 19, 105–119. [Google Scholar] [CrossRef]
- Ritz, M.; Vaculíková, L.; Plevová, E. Application of infrared spectroscopy and chemometric methods to identification of selected minerals. Acta Geodyn. Geomater. 2011, 8, 47–58. [Google Scholar]
- Carey, C.; Boucher, T.; Mahadevan, S.; Bartholomew, P.; Dyar, M. Machine learning tools for mineral recognition and classification from Raman spectroscopy. J. Raman Spectrosc. 2015, 46, 894–903. [Google Scholar] [CrossRef]
- Bi, Y.; Yan, M.; Dong, X.; Li, Z.; Zhang, Y.; Li, Y. Recognition of 25 natural geological samples using a modified correlation analysis method and laser-induced breakdown spectroscopic data. Optik 2018, 158, 1058–1062. [Google Scholar] [CrossRef]
- Wang, Q.; Li, F.; Jiang, X.; Wu, S.; Xu, M. On-stream mineral identification of tailing slurries of tungsten via NIR and XRF data fusion measurement techniques. Anal. Methods 2020, 12, 3296–3307. [Google Scholar] [CrossRef]
- Yousefi, B.; Sojasi, S.; Castanedo, C.I.; Maldague, X.P.; Beaudoin, G.; Chamberland, M. Comparison assessment of low rank sparse-PCA based-clustering/classification for automatic mineral identification in long wave infrared hyperspectral imagery. Infrared Phys. Technol. 2018, 93, 103–111. [Google Scholar] [CrossRef]
- Li, N.; Hao, H.; Jiang, Z.; Jiang, F.; Guo, R.; Gu, Q.; Hu, X. A multi-task multi-class learning method for automatic identification of heavy minerals from river sand. Comput. Geosci. 2020, 135, 104403. [Google Scholar] [CrossRef]
Identification Model | Algorithm Type | Main Advantages and Disadvantages |
---|---|---|
Artificial Neural Networks | Perceptron [1,2], Autoencoder [24], BP neural network [13,25], multilayer perceptron network [26] | It is possible to fit complex patterns and solve linearly nonseparable problems. User expertise and experience are required. |
Statistical-based machine learning | Statistical learning [27], clustering [8,28,29] | The principle is relatively simple, easy to implement, and the convergence speed is fast. It is easy to fall into a local optimum. |
Rule-based machine learning | Principal component analysis [29] | Any dataset can be used as input to the principal component analysis algorithm. |
Rule-based machine learning | Partial least squares regression [30,31,32], decision trees [33], random forests [10,34] | Easy to understand, requires only a small amount of data, can handle both numerical and categorical data, and has strong robustness. |
Deep Learning | Transfer learning [5,11,35,36], Convolutional Neural Networks [5,9,12,37,38,39,40,41,42,43,44,45], Inception-v3 [35,46,47], ResNet [18,38,48] | Complex structured data can be represented and end-to-end learning can be achieved. |
Algorithm | Pros | Cons |
---|---|---|
Perceptron [1,2] | The model is simple and easy to implement. | Cannot perfectly handle linearly inseparable training data. The final number of iterations is strongly influenced by the resulting hyperplane and by the data in the training set. The loss function only aims to move misclassified points to the correct side of the hyperplane, so some sample points may end up very close to it; such a classification is not particularly good, a problem that is well addressed by the support vector machine. |
Autoencoder [24] | Strong generalization, and as unsupervised learning it requires no data labeling. | It is lossy: the decompressed output is degraded compared with the original input. It is data-dependent and can only compress data similar to the training data. |
BP Neural Network [13,25] | Strong nonlinear mapping capability. Highly self-learning and self-adaptive. Some generalization ability. Fault tolerance. | Prone to local minima. Slow convergence. No established rule for choosing the network structure. Contradiction between application scale and network size. Contradiction between predictive and training ability. |
Multilayer perceptron networks [26] | High parallelism. Strong nonlinear global mapping. Good fault tolerance and associative memory. | The large number of parameters makes training difficult. Spatial information between pixels is lost, and only vector inputs are accepted. |
Algorithm | Pros | Cons |
---|---|---|
Statistical Learning [27] | Simple and stable. | The iteration speed is slow, the number of iterations is high, and it is easy to fall into a local optimum. |
Clustering [8,28,29] | Simple, direct, and efficient. Fast convergence. Strongly interpretable results. Good clustering effect. | The mean value must be definable. The number of clusters must be specified, and its value affects the clustering effect. Sensitive to outliers. |
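Following the clustering entry in the table above, a minimal scikit-learn sketch of k-means applied to synthetic "spectral" feature vectors; the data, the eight-band features, and the choice of three clusters are assumptions for demonstration only:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic stand-in for per-pixel spectral features of three mineral phases.
spectra = np.vstack([
    rng.normal(loc=0.2, scale=0.05, size=(100, 8)),
    rng.normal(loc=0.5, scale=0.05, size=(100, 8)),
    rng.normal(loc=0.8, scale=0.05, size=(100, 8)),
])

# The number of clusters must be fixed in advance, one of the drawbacks noted above.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(spectra)
print(np.bincount(kmeans.labels_))  # roughly 100 samples per cluster
```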
Algorithm | Pros | Cons |
---|---|---|
Principal component analysis [29] | Makes datasets easier to use. Reduces the computational overhead of the algorithm. Removes noise. Makes the results easier to understand. Completely free of parameter restrictions. | If the user has a priori knowledge of the observed object and its data features but cannot intervene in the processing through, e.g., parameterization, the expected results may not be obtained and efficiency may be low. The eigenvalue decomposition has certain limitations. For non-Gaussian distributions, the resulting principal components may not be optimal. |
Partial least squares regression [30,31,32] | The regression of multiple dependent variables on multiple independent variables can be performed simultaneously, which is also applicable when the sample is small, and the exact regression equation can be obtained. The degree of influence of independent variables on dependent variables can be quantified when the number of variables is suitable. It is possible to control and predict more effectively. | The regression coefficients are difficult to interpret. Not applicable when the number of independent variables is small. |
Decision Tree [33] | Easy to understand and simple to explain the mechanism. Can be used for small datasets. Less time complexity. Can handle numbers and classes of data. Can handle multiple output problems. Insensitive to missing values. Can handle uncorrelated feature data. High efficiency, requiring only one construction and repeated use, with the maximum number of calculations per prediction not exceeding the depth of the decision tree. | More difficult to predict for continuous fields. Prone to overfitting. When there are too many categories, the error may increase faster. Does not perform too well when dealing with data with strong feature correlation. For data with inconsistent sample sizes in each category, the information gain results in favor of those features with more values in the decision tree. |
Random Forest [10,34] | Training can be highly parallelized. When the sample features are of high dimensionality, the model can still be trained efficiently. After training, the importance of each feature for the output can be given. Due to the use of random sampling, the variance of the trained model is small and the generalization ability is strong. The implementation is relatively simple. Insensitive to partial missing features. | On certain sample sets with more noise, it is easy to fall into overfitting. Features that take more divided values tend to have a greater impact, which affects the effectiveness of the fitted model. |
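As a hedged illustration of the tree-based entries above, the sketch below trains a scikit-learn random forest on fabricated feature vectors standing in for, e.g., per-grain spectral or textural descriptors; every name and number here is an assumption, not data from the surveyed studies:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
# 300 synthetic samples, 12 features, 3 hypothetical mineral classes.
X = rng.normal(size=(300, 12))
y = rng.integers(0, 3, size=300)
X[y == 1, :4] += 1.5   # give each class a weak, learnable signature
X[y == 2, 4:8] += 1.5

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
# Per-feature importances, one of the advantages listed in the table above.
print("feature importances:", clf.feature_importances_.round(2))
```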
Algorithm | Pros | Cons |
---|---|---|
Transfer learning [5,11,35,36] | Requires less training data and makes more efficient use of existing data. Better model generalization through transfer learning. The training process is more stable and easier to debug, increasing the robustness of the model. Makes deep learning easier. Enables customization. | Although its benefit can be quantified, it has an upper limit and is not suitable for solving all problems. |
Convolutional Neural Network [5,9,12,37,38,39,40,41,42,43,44,45] | Shared convolution kernels, which can handle high-dimensional data. No manual feature selection, and good feature classification. | The dataset needs to be normalized; inputs of different sizes are difficult to train together. No memory function. The physical meaning is not clear enough. Parameters need to be tuned; a large number of samples is required; training is best done on a GPU. Lacks the capability for processing sequential data such as video, speech, and natural language. |
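A minimal transfer learning sketch in the spirit of the table above: load an ImageNet-pretrained ResNet-18 from torchvision (recent versions), freeze the backbone, and replace the final fully connected layer for a hypothetical set of mineral classes; the class count and the omitted training loop are assumptions:

```python
import torch.nn as nn
from torchvision import models

NUM_MINERAL_CLASSES = 12  # hypothetical number of target classes

# Load a backbone pretrained on ImageNet and freeze its weights.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in backbone.parameters():
    param.requires_grad = False

# Replace only the classification head; only this layer is then trained on mineral images.
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_MINERAL_CLASSES)
```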
Algorithm | Pros | Cons |
---|---|---|
Inception-v3 [35,46,47] | Fast computation. Increased network depth. Increased network width. Decomposing large convolutions into small ones effectively reduces the number of parameters, mitigates overfitting, and increases the nonlinearity and expressiveness of the network. Structures spatial information, transforming it into higher-order abstract feature information. Richer network expressiveness. | The information loss caused by information compression cannot be solved without increasing the computational volume. The topology of the model cannot be enlarged to improve its expressiveness without increasing the computational volume. |
ResNet [18,38,48] | Enables feedforward/feedback propagation algorithms to proceed smoothly with a simpler structure. Adding identity mappings basically does not degrade the performance of the network. | Long training time. |