Pollen Grain Classification Using Some Convolutional Neural Network Architectures
Abstract
1. Introduction
2. Literature Review of Related Work
3. Description of Dataset POLLEN73S
4. Methods
4.1. Block Diagram of the Proposed Methodology
4.2. CNN Architectures for Pollen Grain Classification
4.2.1. Inception V3
4.2.2. VGG 16
4.2.3. VGG 19
4.2.4. ResNet 50
4.2.5. NASNet
4.2.6. Xception
4.2.7. DenseNet 201
4.2.8. InceptionResNet V2
4.3. Holdout Cross-Validation Method
4.4. Experimental Setup
- Data augmentation: In deep learning, this is a common way to increase the amount of training data available to the model, which can improve generalization and reduce overfitting. The Keras (https://keras.io/api/ accessed on 9 November 2022) parameters we used are as follows (a minimal code sketch is given after this list):
  - rescale: 1./255, rescales pixel values to the range [0, 1] by dividing each pixel by 255;
  - shear_range: 0.2, applies a random shear transformation to the image;
  - zoom_range: 0.2, randomly zooms the image in or out;
  - featurewise_center: True, subtracts the mean of all training images from each image;
  - featurewise_std_normalization: True, divides each pixel by the standard deviation of the training images;
  - rotation_range: 20, applies a random rotation of up to 20 degrees to the image;
  - width_shift_range: 0.2, applies a random horizontal shift to the image;
  - height_shift_range: 0.2, applies a random vertical shift to the image;
  - horizontal_flip: True, applies a random horizontal flip to the image;
  - vertical_flip: True, applies a random vertical flip to the image;
  - validation_split: 0.2, reserves 20% of the training data for validation and uses the remaining 80% for training.
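The following is a minimal sketch of this augmentation pipeline using the Keras ImageDataGenerator API; the dataset directory "POLLEN73S/", the target image size and the batch size are illustrative assumptions, not values taken from the paper.

```python
# Minimal augmentation sketch with Keras ImageDataGenerator.
# The directory layout and image size below are assumptions for illustration.
from tensorflow.keras.preprocessing.image import ImageDataGenerator

datagen = ImageDataGenerator(
    rescale=1.0 / 255,                   # scale pixel values to [0, 1]
    shear_range=0.2,                     # random shear transformation
    zoom_range=0.2,                      # random zoom in/out
    featurewise_center=True,             # subtract the training-set mean
    featurewise_std_normalization=True,  # divide by the training-set std
    rotation_range=20,                   # random rotations of up to 20 degrees
    width_shift_range=0.2,               # random horizontal shift
    height_shift_range=0.2,              # random vertical shift
    horizontal_flip=True,                # random horizontal flip
    vertical_flip=True,                  # random vertical flip
    validation_split=0.2,                # reserve 20% of the data for validation
)

# featurewise_center / featurewise_std_normalization need statistics computed
# from a sample of training images before the generator is used:
# datagen.fit(sample_of_training_images)

train_gen = datagen.flow_from_directory(
    "POLLEN73S/", target_size=(224, 224), batch_size=16, subset="training")
val_gen = datagen.flow_from_directory(
    "POLLEN73S/", target_size=(224, 224), batch_size=16, subset="validation")
```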
- Dropout layers: Connecting all features directly to the Fully Connected (FC) layer typically leads to overfitting on the training dataset. Overfitting occurs when a model fits the training data too closely, which then degrades its performance on new data. To mitigate this, a dropout layer is added that randomly deactivates a fraction of the neurons during training, reducing the model's complexity. For ResNet50, a dropout rate of 0.7 is used, so 70% of the units in that layer are randomly dropped during training (a minimal sketch is given below).
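A minimal sketch of this idea, assuming a ResNet50 backbone from the Keras applications API with global average pooling and a 73-class softmax head; the input size and pretrained weights are illustrative assumptions, not details reported in the paper.

```python
# Sketch: Dropout(0.7) placed before the classifier head of a ResNet50 backbone.
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

base = ResNet50(weights="imagenet", include_top=False,
                pooling="avg", input_shape=(224, 224, 3))

model = models.Sequential([
    base,
    layers.Dropout(0.7),                     # randomly drop 70% of the units during training
    layers.Dense(73, activation="softmax"),  # one output per POLLEN73S class
])
```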
- Fine Tuning: This builds on transfer learning, a machine learning approach in which knowledge gained from training on one problem is reused for a related task or domain. In transfer learning, the final layers of the pretrained network are removed and new layers are trained for the target task. We applied this process to the InceptionResNetV2 architecture, increasing the batch size from 12 to 16 and using data augmentation. The top layer of this model was truncated and replaced by a new, fully connected softmax layer. Fine-tuning was carried out with the Adam (adaptive moment estimation) optimizer [32] and a learning rate of 0.001 (see the sketch after this paragraph).
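A hedged sketch of this fine-tuning step, again using the Keras applications API; the input size, the loss function and the freezing policy are assumptions for illustration, not details reported in the paper.

```python
# Sketch: replace the top of a pretrained InceptionResNetV2 with a new softmax
# classifier and fine-tune with Adam at a learning rate of 0.001.
from tensorflow.keras import layers, models, optimizers
from tensorflow.keras.applications import InceptionResNetV2

base = InceptionResNetV2(weights="imagenet", include_top=False,
                         pooling="avg", input_shape=(299, 299, 3))

model = models.Sequential([
    base,
    layers.Dense(73, activation="softmax"),  # new fully connected softmax top
])

model.compile(optimizer=optimizers.Adam(learning_rate=0.001),
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Fine-tune on the augmented generators; the batch size (16) is set on the generators.
# model.fit(train_gen, validation_data=val_gen, epochs=...)
```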
- Hyperparameters: These are parameters whose values control the learning process. They were selected with grid search and random search, which explore and rank candidate settings of the CNN models; the chosen values can then be adjusted to optimize model performance. The hyperparameters used in the experimental setup are listed in Table 1 (an illustrative search loop is sketched below).
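As an illustration only, a grid search over two of the hyperparameters in Table 1 could look like the following; build_model, make_train_gen and make_val_gen are hypothetical helpers standing in for the model and data pipeline, not functions from the paper.

```python
# Illustrative grid search over batch size and optimizer (two of the
# hyperparameters in Table 1). All helper functions are hypothetical.
import itertools

batch_sizes = [4, 8, 12, 16]
optimizer_names = ["adam", "rmsprop"]

best_acc, best_cfg = 0.0, None
for batch_size, opt_name in itertools.product(batch_sizes, optimizer_names):
    model = build_model(optimizer=opt_name)           # hypothetical: builds and compiles a CNN
    history = model.fit(make_train_gen(batch_size),   # hypothetical data generators
                        validation_data=make_val_gen(batch_size),
                        epochs=10)
    acc = max(history.history["val_accuracy"])
    if acc > best_acc:
        best_acc, best_cfg = acc, (batch_size, opt_name)

print("Best configuration:", best_cfg, "with validation accuracy", best_acc)
```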
4.5. Performance Evaluation Measures
- Confusion matrix: This gives a detailed picture of the model's performance in terms of four counts: true positives (TP), false positives (FP), false negatives (FN) and true negatives (TN). These counts are arranged in a table whose rows represent the actual classes and whose columns represent the predicted classes (see Figure 11).
- Precision: This is the proportion of predicted positive cases that are actually positive.
- Recall: This is the proportion of actual positive cases that the model correctly identifies; it measures the model's ability to find all positive instances.
- F1-score: This is the harmonic mean of precision and recall, providing a compromise between the two.
- Accuracy: This is the ratio of correct predictions to the total number of observations (total input samples). However, this measure is unreliable when the classes are imbalanced. The standard definitions of these measures in terms of the confusion-matrix counts are given below.
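For completeness, these are the textbook formulas for the four measures, reconstructed from the definitions above rather than copied from the paper:

```latex
\mathrm{Precision} = \frac{TP}{TP + FP}, \qquad
\mathrm{Recall} = \frac{TP}{TP + FN}, \qquad
F_1 = \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}, \qquad
\mathrm{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}
```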
5. Results and Discussion
5.1. Architectural Performance Summary Table
5.2. CNN Architectures’ Performances
5.3. Performance Comparison of CNN Architectures for Pollen Grain Classification: Current Results Compared to [4]
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Performance Graphs for the Eight Architectures
Appendix B. Graphs of POLLEN73S Class Probabilities When Predicting a Pollen Grain Using the Best-Performing Architecture (DenseNet201)
References
- Potts, S.G.; Biesmeijer, J.C.; Kremen, C.; Neumann, P.; Schweiger, O.; Kunin, W.E. Global pollinator declines: Trends, impacts and drivers. Trends Ecol. Evol. 2010, 25, 345–353.
- Demers, I. État des Connaissances sur le Pollen et les Allergies; desLibris, Institut national de santé publique du Québec: Québec, QC, Canada, 2013.
- Girard, M. La Mélissopalynologie: L'étude des Pollens dans le Miel; Université Laval: Québec, QC, Canada, 2014.
- Astolfi, G.; Goncalves, A.B.; Menezes, G.V.; Borges, F.S.B.; Astolfi, A.C.M.N.; Matsubara, E.T.; Alvarez, M.; Pistori, H. POLLEN73S: An image dataset for pollen grains classification. Ecol. Inform. 2020, 60, 101165.
- Bianco, S.; Cadene, R.; Celona, L.; Napoletano, P. Benchmark analysis of representative deep neural network architectures. IEEE Access 2018, 6, 64270–64277.
- Li, C.; Polling, M.; Cao, L.; Gravendeel, B.; Verbeek, F.J. Analysis of automatic image classification methods for Urticaceae pollen classification. Neurocomputing 2023, 522, 181–193.
- Bates, S.; Hastie, T.; Tibshirani, R. Cross-validation: What does it estimate and how well does it do it? J. Am. Stat. Assoc. 2023, 119, 1–12.
- Kee, E.; Chong, J.J.; Choong, Z.J.; Lau, M. A comparative analysis of cross-validation techniques for a smart and lean pick-and-place solution with deep learning. Electronics 2023, 12, 2371.
- Daood, A.; Ribeiro, E.; Bush, M. Pollen grain recognition using deep learning. In Proceedings of the Advances in Visual Computing: 12th International Symposium, ISVC 2016, Las Vegas, NV, USA, 12–14 December 2016; Proceedings, Part I 12. Springer: Berlin/Heidelberg, Germany, 2016; pp. 321–330.
- Sevillano, V.; Holt, K.; Aznarte, J.L. Precise automatic classification of 46 different pollen types with convolutional neural networks. PLoS ONE 2020, 15, e0229751.
- Battiato, S.; Ortis, A.; Trenta, F.; Ascari, L.; Politi, M.; Siniscalco, C. Detection and classification of pollen grain microscope images. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 14–19 June 2020; pp. 980–981.
- Treloar, W.; Taylor, G.; Flenley, J. Towards automation of palynology 1: Analysis of pollen shape and ornamentation using simple geometric measures, derived from scanning electron microscope images. J. Quat. Sci. Publ. Quat. Res. Assoc. 2004, 19, 745–754.
- Travieso, C.M.; Briceño, J.C.; Ticay-Rivas, J.R.; Alonso, J.B. Pollen classification based on contour features. In Proceedings of the 2011 15th IEEE International Conference on Intelligent Engineering Systems, Poprad, Slovakia, 23–25 June 2011; pp. 17–21.
- García, N.M.; Chaves, V.A.E.; Briceño, J.C.; Travieso, C.M. Pollen Grains Contour Analysis on Verification Approach. In Proceedings of the Hybrid Artificial Intelligent Systems, Salamanca, Spain, 8–12 March 2012; Corchado, E., Snášel, V., Abraham, A., Woźniak, M., Graña, M., Cho, S.B., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; pp. 521–532.
- Ticay-Rivas, J.R.; del Pozo-Baños, M.; Travieso, C.M.; Arroyo-Hernández, J.; Pérez, S.T.; Alonso, J.B.; Mora-Mora, F. Pollen Classification Based on Geometrical, Descriptors and Colour Features Using Decorrelation Stretching Method. In Proceedings of the Artificial Intelligence Applications and Innovations, Wroclaw, Poland, 23–25 May 2011; Iliadis, L., Maglogiannis, I., Papadopoulos, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2011; pp. 342–349.
- Sevillano, V.; Aznarte, J.L. Improving classification of pollen grain images of the POLEN23E dataset through three different applications of deep learning convolutional neural networks. PLoS ONE 2018, 13, e0201807.
- Polling, M.; Li, C.; Cao, L.; Verbeek, F.; de Weger, L.A.; Belmonte, J.; De Linares, C.; Willemse, J.; de Boer, H.; Gravendeel, B. Neural networks for increased accuracy of allergenic pollen monitoring. Sci. Rep. 2021, 11, 11357.
- Gallardo, R.; García-Orellana, C.J.; González-Velasco, H.M.; García-Manso, A.; Tormo-Molina, R.; Macías-Macías, M.; Abengózar, E. Automated multifocus pollen detection using deep learning. Multimed. Tools Appl. 2024, 1–16.
- Gallardo-Caballero, R.; García-Orellana, C.J.; García-Manso, A.; González-Velasco, H.M.; Tormo-Molina, R.; Macías-Macías, M. Precise pollen grain detection in bright field microscopy using deep learning techniques. Sensors 2019, 19, 3583.
- Gonçalves, A.B.; Souza, J.S.; Silva, G.G.d.; Cereda, M.P.; Pott, A.; Naka, M.H.; Pistori, H. Feature extraction and machine learning for the classification of Brazilian Savannah pollen grains. PLoS ONE 2016, 11, e0157044.
- Manikis, G.C.; Marias, K.; Alissandrakis, E.; Perrotto, L.; Savvidaki, E.; Vidakis, N. Pollen grain classification using geometrical and textural features. In Proceedings of the 2019 IEEE International Conference on Imaging Systems and Techniques (IST), Abu Dhabi, UAE, 8–10 December 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–6.
- del Pozo-Banos, M.; Ticay-Rivas, J.R.; Alonso, J.B.; Travieso, C.M. Features extraction techniques for pollen grain classification. Neurocomputing 2015, 150, 377–391.
- Marcos, J.V.; Nava, R.; Cristóbal, G.; Redondo, R.; Escalante-Ramírez, B.; Bueno, G.; Déniz, Ó.; González-Porto, A.; Pardo, C.; Chung, F.; et al. Automated pollen identification using microscopic imaging and texture analysis. Micron 2015, 68, 36–46.
- Djoulde, K.; Ousman, B.; Abboubakar, H.; Bitjoka, L.; Tchiegang, C. Classification of Pepper Seeds by Machine Learning Using Color Filter Array Images. J. Imaging 2024, 10, 41.
- Chen, X.; Ju, F. Automatic classification of pollen grain microscope images using a multi-scale classifier with SRGAN deblurring. Appl. Sci. 2022, 12, 7126.
- Kubera, E.; Kubik-Komar, A.; Kurasiński, P.; Piotrowska-Weryszko, K.; Skrzypiec, M. Detection and recognition of pollen grains in multilabel microscopic images. Sensors 2022, 22, 2690.
- Minowa, Y.; Shigematsu, K.; Takahara, H. A deep learning-based model for tree species identification using pollen grain images. Appl. Sci. 2022, 12, 12626.
- Tsiknakis, N.; Savvidaki, E.; Manikis, G.C.; Gotsiou, P.; Remoundou, I.; Marias, K.; Alissandrakis, E.; Vidakis, N. Pollen grain classification based on ensemble transfer learning on the Cretan Pollen Dataset. Plants 2022, 11, 919.
- Viertel, P.; König, M. Pattern recognition methodologies for pollen grain image classification: A survey. Mach. Vis. Appl. 2022, 33, 18.
- Shazia, A.; Xuan, T.Z.; Chuah, J.H.; Usman, J.; Qian, P.; Lai, K.W. A comparative study of multiple neural network for detection of COVID-19 on chest X-ray. EURASIP J. Adv. Signal Process. 2021, 2021, 50.
- Albahli, S.; Albattah, W. Detection of coronavirus disease from X-ray images using deep learning and transfer learning algorithms. J. X-ray Sci. Technol. 2020, 28, 841–850.
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980.
- Khanzhina, N.; Putin, E.; Filchenkov, A.; Zamyatina, E. Pollen grain recognition using convolutional neural network. In Proceedings of the ESANN, Bruges, Belgium, 25–27 April 2018.
Parameters | Parameter Values
---|---
batch size | 4, 8, 12 and 16
optimizer | Adam and RMSprop
lr (learning rate) | 0.001, 0.00001 (min_lr = 0.0001) and 0.00003 (min_lr = 0.0003)
betas (β₁ and β₂) |
eps (ε) |
weight decay | 0
momentum | 0.0
ema_momentum | 0.99
CNN Models | Batch Size | Precision | Recall | F1-Score | Accuracy (Loss)
---|---|---|---|---|---
VGG19 | 4 | 92.519 | 89.713 | 89.941 | 89.861 (0.3608)
 | 8 | 93.164 | 90.966 | 91.083 | 91.054 (0.3512)
 | 12 | 92.824 | 90.839 | 90.920 | 90.855 (0.3453)
 | 16 | 92.075 | 89.528 | 89.550 | 89.463 (0.3989)
VGG16 | 4 | 91.471 | 88.205 | 88.963 | 88.867 (0.3635)
 | 8 | 93.726 | 92.180 | 92.172 | 92.048 (0.3154)
 | 12 | 92.737 | 90.648 | 90.920 | 90.855 (0.3183)
 | 16 | 91.466 | 89.945 | 90.333 | 90.258 (0.3755)
DenseNet201 | 4 | 96.686 | 96.193 | 96.282 | 96.223 (0.2232)
 | 8 | 96.790 | 96.368 | 96.477 | 96.400 (0.17)
 | 12 | 97.057 | 96.462 | 96.477 | 96.421 (0.1918)
 | 16 | 97.554 | 97.202 | 97.260 | 97.217 (0.1352)
InceptionResNetV2 | 4 | 92.681 | 90.207 | 90.235 | 90.258 (0.8402)
 | 8 | 91.428 | 88.849 | 89.159 | 89.066 (0.5553)
 | 12 | 93.212 | 90.739 | 91.279 | 91.252 (0.4930)
 | 16 | 94.012 | 92.352 | 92.407 | 92.445 (0.5578)
InceptionV3 | 4 | 94.906 | 93.511 | 93.509 | 93.439 (0.5030)
 | 8 | 94.346 | 93.035 | 93.040 | 93.042 (0.4374)
 | 12 | 93.400 | 92.281 | 92.368 | 92.247 (0.4073)
 | 16 | 93.441 | 92.309 | 92.531 | 92.445 (0.3779)
NASNet | 4 | 92.297 | 90.742 | 90.809 | 90.855 (0.5790)
 | 8 | 93.240 | 91.696 | 91.553 | 91.650 (0.5016)
 | 12 | 93.120 | 92.030 | 92.094 | 92.048 (0.4452)
 | 16 | 92.667 | 91.429 | 91.553 | 91.451 (0.4603)
ResNet50 | 4 | 93.788 | 92.596 | 92.239 | 92.673 (0.2995)
 | 8 | 94.301 | 93.542 | 93.319 | 93.465 (0.3405)
 | 12 | 94.834 | 94.292 | 94.168 | 94.257 (0.2869)
 | 16 | 95.054 | 94.260 | 94.210 | 94.257 (0.2119)
Xception | 4 | 93.115 | 90.862 | 91.115 | 91.054 (0.5123)
 | 8 | 93.431 | 92.149 | 92.211 | 92.247 (0.3052)
 | 12 | 92.655 | 90.670 | 90.920 | 90.855 (0.4729)
 | 16 | 91.998 | 90.841 | 90.670 | 90.855 (0.3102)
P, R, F and A denote the precision, recall, F1-score and accuracy obtained in this work; the columns suffixed -Aut give the corresponding values reported by Astolfi et al. [4].

CNN Models | Batch Size | P | P-Aut | R | R-Aut | F | F-Aut | A | A-Aut
---|---|---|---|---|---|---|---|---|---
VGG19 | 4 | 92.519 | 89.9 | 89.713 | 89.2 | 89.941 | 91.5 | 89.861 | 89.7
 | 8 | 93.164 | 89.0 | 90.966 | 88.5 | 91.083 | 90.7 | 91.054 | 88.9
 | 12 | 92.824 | 89.3 | 90.839 | 88.8 | 90.920 | 91.0 | 90.855 | 89.1
 | 16 | 92.075 | 87.7 | 89.528 | 87.3 | 89.550 | 90.1 | 89.463 | 87.7
VGG16 | 4 | 91.471 | 87.6 | 88.205 | 87.0 | 88.963 | 89.5 | 88.867 | 87.4
 | 8 | 93.726 | 87.4 | 92.180 | 86.8 | 92.172 | 89.7 | 92.048 | 87.1
 | 12 | 92.737 | 88.6 | 90.648 | 88.1 | 90.920 | 90.1 | 90.855 | 88.4
 | 16 | 91.466 | 88.0 | 89.945 | 87.5 | 90.333 | 90.0 | 90.258 | 87.9
DenseNet-201 | 4 | 96.686 | 87.9 | 96.193 | 87.0 | 96.282 | 89.3 | 96.223 | 87.8
 | 8 | 96.790 | 95.6 | 96.368 | 95.6 | 96.477 | 96.3 | 96.4 | 95.7
 | 12 | 97.057 | 95.7 | 96.462 | 95.7 | 96.477 | 96.4 | 96.421 | 95.8
 | 16 | 97.554 | 95.3 | 97.202 | 95.1 | 97.260 | 95.9 | 97.217 | 95.2
InceptionResNet-V2 | 4 | 92.681 | 74.4 | 90.207 | 71.8 | 90.235 | 78.8 | 90.258 | 74.0
 | 8 | 91.428 | 88.6 | 88.849 | 88.1 | 89.159 | 90.9 | 89.066 | 88.6
 | 12 | 93.212 | 89.7 | 90.739 | 89.4 | 91.279 | 91.3 | 91.252 | 89.7
 | 16 | 94.012 | 89.6 | 92.352 | 89.3 | 92.407 | 90.9 | 92.445 | 89.6
Inception-V3 | 4 | 94.906 | 72.7 | 93.511 | 70.7 | 93.509 | 76.2 | 93.439 | 72.5
 | 8 | 94.346 | 87.8 | 93.035 | 86.9 | 93.040 | 89.0 | 93.042 | 87.6
 | 12 | 93.400 | 90.3 | 92.281 | 90.0 | 92.368 | 91.6 | 92.247 | 90.4
 | 16 | 93.441 | 89.9 | 92.309 | 89.5 | 92.531 | 91.2 | 92.445 | 89.9
NASNet | 4 | 92.297 | 76.8 | 90.742 | 77.0 | 90.809 | 81.7 | 90.855 | 77.0
 | 8 | 93.240 | 86.2 | 91.696 | 85.9 | 91.553 | 87.9 | 91.650 | 85.9
 | 12 | 93.120 | 87.4 | 92.030 | 87.4 | 92.094 | 89.8 | 92.048 | 87.4
 | 16 | 92.667 | 86.6 | 91.429 | 86.5 | 91.553 | 89.1 | 91.451 | 86.5
ResNet-50 | 4 | 93.788 | 85.0 | 92.596 | 85.0 | 92.239 | 88.0 | 92.673 | 85.0
 | 8 | 94.301 | 93.0 | 93.542 | 93.0 | 93.319 | 94.0 | 93.465 | 93.0
 | 12 | 94.834 | 94.0 | 94.292 | 95.0 | 94.168 | 95.0 | 94.257 | 95.0
 | 16 | 95.054 | 93.9 | 94.260 | 94.0 | 94.210 | 94.9 | 94.257 | 94.0
Xception | 4 | 93.115 | 85.0 | 90.862 | 85.0 | 91.115 | 86.0 | 91.054 | 85.0
 | 8 | 93.431 | 90.0 | 92.149 | 90.0 | 92.211 | 91.0 | 92.247 | 90.0
 | 12 | 92.655 | 90.1 | 90.670 | 90.1 | 90.920 | 92.0 | 90.855 | 90.1
 | 16 | 91.998 | 90.0 | 90.841 | 90.0 | 90.670 | 91.0 | 90.855 | 90.0
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).