Sensors | Article | Open Access | 24 May 2023

PlantInfoCMS: Scalable Plant Disease Information Collection and Management System for Training AI Models

1 Department of Computer Science and Engineering, Sejong University, Seoul 05006, Republic of Korea
2 Department of Convergence Engineering for Intelligent Drone, Sejong University, Seoul 05006, Republic of Korea
3 Department of Artificial Intelligence, Sejong University, Seoul 05006, Republic of Korea
* Author to whom correspondence should be addressed.
This article belongs to the Special Issue Sensor and AI Technologies in Intelligent Agriculture

Abstract

In recent years, the development of deep learning technology has significantly benefited agriculture in domains such as smart and precision farming. Deep learning models require a large amount of high-quality training data; however, collecting and managing such data while guaranteeing its quality is a critical issue. To meet these requirements, this study proposes a scalable plant disease information collection and management system (PlantInfoCMS). The proposed PlantInfoCMS consists of data collection, annotation, data inspection, and dashboard modules that generate accurate, high-quality pest and disease image datasets for training purposes. Additionally, the system provides various statistical functions that allow users to easily check the progress of each task, making management highly efficient. Currently, PlantInfoCMS handles data on 32 types of crops and 185 types of pests and diseases, and stores and manages 301,667 original and 195,124 labeled images. PlantInfoCMS is expected to contribute significantly to the diagnosis of crop pests and diseases by providing high-quality training images for AI models and by facilitating the management of crop pest and disease data.

1. Introduction

Agriculture is one of the oldest and most significant industries in the world. According to the United Nations, the global population is expected to reach approximately 9 billion by 2050 [1], and meeting the increased food demand of this growing population is an important challenge [2]. Therefore, increasing the yield and quality of agricultural products becomes necessary. However, according to the forestry and fishery survey, as of December 2021, the agricultural population in South Korea had decreased by 23.9% over the previous 10 years, and the aging rate of the agricultural population was 47%. This has led to a chronic shortage of manpower in Korean agriculture, hindering its sustainable development. To address these issues, the government is investing heavily in smart farms based on information and communications technology (ICT).
Smart farming incorporates ICT by remotely connecting various information technologies to greenhouses, orchards, livestock barns, etc., and can thus control the growth environment of crops and livestock [3]. Smart farming combines ICT such as remote sensing, Internet of Things (IoT), unmanned aerial vehicles (UAVs), machine learning, artificial intelligence (AI), and networking with traditional agricultural systems such as crop cultivation and livestock farming [4,5]. This enables automatic monitoring and optimization of processes such as environmental conditions, growth status, soil conditions, weed management, and pest and disease management, ultimately improving crop yields and reducing costs [6]. Pest and disease diagnosis and management are major challenges in improving crop yield and quality. Recently, the number of pest and disease types has been increasing owing to causes such as trade globalization and climate change [7], thereby increasing damage to farms. Therefore, rapid diagnosis and pest and disease control can minimize damage and economic losses on farms.
Numerous fields have effectively utilized the recent advancements in deep learning technology [8], including image-based pest and disease recognition research [9,10]. Building deep learning models requires large amounts of high-quality training data [11]. However, several difficulties exist in constructing training data for crop pests and diseases, as follows:
  • Collecting a large amount of data: It is difficult to collect a large amount of data on crop pests and diseases as they are seasonal [12]. Although data collection is relatively easy for pests and diseases that occur frequently, it is extremely difficult for rare yet destructive ones, such as fire blight (Erwinia amylovora).
  • Building high-quality data: The key to constructing AI training data is ensuring the quality of the data [13]. An essential task in building AI training data is labeling, which involves marking the data with tags. However, there are many cases where different types of pests and diseases have similar damage symptoms (e.g., fire blight and scab). In such cases, inexperienced individuals may make inaccurate diagnoses, so pest and disease experts need to label the data directly. However, there is a severe shortage of experts [14,15].
  • Consistency of data quality: A comprehensive data management system is needed to ensure effective and consistent data quality [16]. Multiple data sources (e.g., image, location, plant, disease, pesticide info, etc.) are essential to efficiently implement services such as pest and disease occurrence monitoring and image-recognition-based pest and disease diagnosis. This data must be managed in a standardized format using an integrated data management system that can expand the data for long-term use.
To address these issues, an integrated plant disease and pest image management system is needed. Unfortunately, there is currently no system for building and managing image datasets for AI training in the agricultural field. Our research makes the following contributions:
  • In this study, we propose a scalable plant disease information collection and management system (PlantInfoCMS) that can efficiently create and manage high-quality image datasets for computer vision tasks.
  • Through the app/web-based image collection module, various disease and pest images can be easily collected from crop cultivation sites.
  • Through the annotation and inspection module, not only can a training dataset for computer vision in a standardized format be built, but high-quality pest and disease training data can also be stored.
  • The dashboard module allows users to check the progress of each task through several statistical functions, and assists administrators in making various decisions based on the information provided.
  • Currently, PlantInfoCMS handles data on 32 types of crops and 185 types of pests and diseases, storing and managing 301,667 original and 195,124 labeled images.
The remainder of the study is organized as follows. Section 2 describes related research, including existing publicly available pest and disease open data and pest and disease data management systems. Section 3 details the proposed method, PlantInfoCMS. Section 4 presents the discussion and limitations. Section 5 concludes the study.

3. PlantInfoCMS

As revealed through related studies, existing pest and disease image management systems do not prioritize data construction and inspection functions to create reliable datasets. Furthermore, due to the lack of suitable tools, there is also a shortage of available large-scale, high-quality pest and disease image datasets for AI training. In this study, we propose PlantInfoCMS to address these issues by introducing efficient image collection, management, and inspection capabilities. Our proposed system allows multiple users to simultaneously collect and upload images via a mobile application, and facilitates efficient management through various visualization methods. Additionally, an inspection process is introduced to ensure the quality of the collected images.
The PlantInfoCMS proposed in this study includes upload, annotation, inspection, user and crop info, and dashboard modules as shown in Figure 1. In the upload module, users input the crop pest and disease images, and the annotation module specifies the damaged areas in the original images to create AI training data. In the inspection module, the uploaded original images or annotated images are reviewed to ensure accurate and high-quality training images. The user and crop info management module manages user, crop, pest, and disease information. Finally, the dashboard module displays the progress of tasks such as image uploads, annotations, and schedules using various visualization tools.
Figure 1. Diagram of PlantInfoCMS. PlantInfoCMS is comprised of modules for image upload, image annotation, image inspection, user and crop information management, and a dashboard.
The proposed system can be described as a multi-tiered architecture from a software engineering perspective. At the highest tier, the system has a front-end interface that enables users to upload images and manage them at the crop, disease, or pest level. The interface is implemented as a web-based application using the jQuery and Bootstrap libraries. The middle tier of the system consists of the image management, annotation, and inspection modules, which provide features such as image upload, annotation progress status analysis, and image inspection logic. This tier is responsible for managing and processing the large volume of images collected by the system and is implemented using Hypertext Preprocessor (PHP). At the lowest tier, the system includes an image storage system, which is responsible for storing the images and associated metadata such as crop type, disease type, and annotation information. A MySQL database is employed to store the images and metadata. Each module is explained in detail in the sections below.
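To make the tiered structure more concrete, the sketch below shows a hypothetical front controller for the PHP middle tier that dispatches requests from the jQuery/Bootstrap front end to the individual module scripts and hands them a shared MySQL (PDO) connection. The file names, request parameters, and connection details are illustrative assumptions, not the actual implementation.

```php
<?php
// index.php -- hypothetical front controller for the PHP middle tier.
// The front end calls index.php?module=<name>; each module script then
// handles its own logic and MySQL (PDO) access. Names are assumptions.

$modules = [
    'upload'     => 'modules/upload.php',      // image upload handling (Section 3.1)
    'annotation' => 'modules/annotation.php',  // affected-area labeling (Section 3.2)
    'inspection' => 'modules/inspection.php',  // expert review workflow (Section 3.3)
    'dashboard'  => 'modules/dashboard.php',   // progress statistics (Section 3.4)
];

$module = $_GET['module'] ?? 'dashboard';

if (!isset($modules[$module])) {
    http_response_code(404);
    exit('Unknown module');
}

// Shared MySQL connection handed to every module (storage tier).
$pdo = new PDO('mysql:host=localhost;dbname=plantinfocms;charset=utf8mb4',
               'cms_user', 'cms_password',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

require $modules[$module];
```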
Figure 2 shows the design of the database structure for the main tables used in the system. The fb_standard_pest and fb_standard_disease tables in the database store information about uploaded pest and disease images, respectively. The fb_gmember table stores users’ information, while the fb_gcontent table stores metadata related to the images uploaded by users. The metadata includes information such as the crop type, pest or disease type, and shooting location, and is linked to the table that stores information about the uploaded images using foreign keys. The fb_inspection table is used to record the results of pest and disease expert inspection for each image. This table is connected to the fb_gcontent and fb_gmember tables using foreign keys to manage the metadata of the images to be inspected and the information of the users who uploaded the images.
Figure 2. Entity relationship diagram of PlantInfoCMS. The ERD illustrates the database structure design for the primary tables employed in the system. The tables responsible for storing uploaded images, user information, and image inspection data are interconnected using foreign keys.
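To illustrate these relationships, the following minimal sketch creates simplified versions of the fb_gmember, fb_gcontent, and fb_inspection tables with the foreign-key links described above. The individual column names and types are assumptions for illustration and do not reproduce the actual schema.

```php
<?php
// Simplified sketch of three of the tables shown in Figure 2. Table names and
// foreign-key links follow the text; columns and types are assumptions.

$pdo = new PDO('mysql:host=localhost;dbname=plantinfocms;charset=utf8mb4',
               'cms_user', 'cms_password',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

// Users (uploaders, annotators, and inspectors).
$pdo->exec('CREATE TABLE IF NOT EXISTS fb_gmember (
    member_id  INT AUTO_INCREMENT PRIMARY KEY,
    group_name VARCHAR(64) NOT NULL,
    full_name  VARCHAR(64) NOT NULL
)');

// Metadata of uploaded images (crop, pest/disease, shooting location, file paths).
$pdo->exec('CREATE TABLE IF NOT EXISTS fb_gcontent (
    content_id      INT AUTO_INCREMENT PRIMARY KEY,
    member_id       INT NOT NULL,
    crop_name       VARCHAR(64),
    pest_name       VARCHAR(128),
    capture_lat     DECIMAL(10, 7) NULL,
    capture_lon     DECIMAL(10, 7) NULL,
    file_path       VARCHAR(255) NOT NULL,
    annotation_path VARCHAR(255) NULL,
    FOREIGN KEY (member_id) REFERENCES fb_gmember (member_id)
)');

// Per-inspector review results for each image.
$pdo->exec('CREATE TABLE IF NOT EXISTS fb_inspection (
    inspection_id INT AUTO_INCREMENT PRIMARY KEY,
    content_id    INT NOT NULL,
    inspector_id  INT NOT NULL,
    passed        TINYINT(1) NOT NULL,
    FOREIGN KEY (content_id)   REFERENCES fb_gcontent (content_id),
    FOREIGN KEY (inspector_id) REFERENCES fb_gmember (member_id)
)');
```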

3.1. Pest and Disease Image Upload Module

The pest and disease image upload module receives crop pest and disease images taken by users and uploads them to the server. The users are crop pest and disease experts affiliated with the government agency, the Rural Development Administration (RDA). These users gather numerous pest and disease images taken in the field through regular site inspections and communication with local farmers.
The pest and disease image upload screen is shown in Figure 3. Users first select the crop pest and disease information for the image to be uploaded, as shown in Figure 3 ➀: the affiliation of the uploader, the crop type, the pest or disease type, the form of the pest (applicable only to pests), and the affected area. Seven uploader affiliations are currently defined, and each affiliation handles different types of crops, pests, and diseases. There are 32 types of crops, such as apples and pears, and users can select the pests and diseases that occur in the chosen crop from a given list. The pest form can be selected only when a pest (rather than a disease) was chosen in the previous step, and there are four categories to choose from: egg, larva, adult, and damage caused by pests. Finally, when selecting the affected area, users can specify the damaged part of the crop, such as leaves, fruit, stems, or roots.
Figure 3. Pest and disease image upload screen. This user interface allows users to conveniently upload one or multiple images simultaneously.
After selecting the crop pest and disease information, the “Select image files” button in Figure 3 ➁ is used to select the images to be uploaded, and multiple images can be selected simultaneously. Only JPG and PNG formats are supported. The selected images are then displayed on the screen as shown in Figure 3 ➂, and users can either upload the images or cancel the upload after rechecking.
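The following is a minimal sketch of how such a multi-file upload could be handled on the PHP side. Only the JPG/PNG restriction and the ability to upload several images at once come from the description above; the form field names, storage path, and database columns follow the earlier schema sketch and are hypothetical.

```php
<?php
// upload.php -- hypothetical handler for the multi-file upload form in Figure 3.
// Field names, storage paths, and columns are illustrative assumptions.

session_start();
$pdo = new PDO('mysql:host=localhost;dbname=plantinfocms;charset=utf8mb4',
               'cms_user', 'cms_password',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

$allowed = ['image/jpeg' => '.jpg', 'image/png' => '.png'];
$finfo   = new finfo(FILEINFO_MIME_TYPE);

$stmt = $pdo->prepare(
    'INSERT INTO fb_gcontent (member_id, crop_name, pest_name, file_path)
     VALUES (:member, :crop, :pest, :path)'
);

// $_FILES["images"] holds all files chosen via the "Select image files" button.
foreach ($_FILES['images']['tmp_name'] as $i => $tmpPath) {
    if ($_FILES['images']['error'][$i] !== UPLOAD_ERR_OK) {
        continue;                          // skip files that failed to upload
    }
    $mime = $finfo->file($tmpPath);
    if (!isset($allowed[$mime])) {
        continue;                          // reject anything that is not JPG or PNG
    }
    $dest = 'uploads/' . uniqid('img_', true) . $allowed[$mime];
    move_uploaded_file($tmpPath, $dest);

    $stmt->execute([
        ':member' => $_SESSION['member_id'],   // logged-in uploader
        ':crop'   => $_POST['crop_name'],      // selections made in Figure 3 (1)
        ':pest'   => $_POST['pest_name'],
        ':path'   => $dest,
    ]);
}
```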
After the crop pest and disease image upload process, information such as the uploader, the pest or disease, and the image itself is automatically stored in the database; the items are summarized in Table 3. Most items in Table 3 are generated automatically; however, the crop name, pest or disease name, pest form, and affected area of the uploaded images must be selected manually through the selection menu in Figure 3 ➀. The system is designed to automatically parse information about the captured images from their metadata. For example, the shooting location is automatically stored in the metadata when the GPS function is enabled on the shooting device, but it is not stored if GPS is disabled. As location information is useful for data management and visualization, the system is designed so that users add it manually when it is not provided. Currently, image location information is used only for the map view visualization; in the future, it can be utilized to analyze the nationwide distribution of crops and patterns of pest occurrence. Section 3.4 provides a detailed introduction to the map view, which utilizes this location information.
Table 3. Image File Metadata Standardization Table. The table stores information about the users who uploaded the images, information about the crops, pests, and disease types included in the images, as well as location information about where the images were obtained.
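As a rough illustration of this metadata parsing, the sketch below reads the GPS tags from an image’s EXIF metadata using PHP’s exif_read_data() and returns no location when the shooting device did not record GPS information, in which case the user adds it manually. The helper names and return format are assumptions for illustration.

```php
<?php
// Hypothetical sketch of extracting the shooting location from image metadata.
// exif_read_data() returns GPS tags only when the device recorded them; when
// they are absent, NULL is returned so the user can enter the location manually.

function exifGpsToDecimal(array $coord, string $ref): float
{
    // Each EXIF GPS component is a rational string such as "37/1".
    $toFloat = function (string $r): float {
        [$num, $den] = array_pad(explode('/', $r), 2, 1);
        return (float)$num / (float)$den;
    };
    $deg = $toFloat($coord[0]) + $toFloat($coord[1]) / 60 + $toFloat($coord[2]) / 3600;
    return in_array($ref, ['S', 'W'], true) ? -$deg : $deg;   // sign by hemisphere
}

function readShootingLocation(string $imagePath): ?array
{
    $exif = @exif_read_data($imagePath);
    if ($exif === false || !isset($exif['GPSLatitude'], $exif['GPSLongitude'])) {
        return null;   // no GPS metadata: the user must add the location manually
    }
    return [
        'lat' => exifGpsToDecimal($exif['GPSLatitude'],  $exif['GPSLatitudeRef']  ?? 'N'),
        'lon' => exifGpsToDecimal($exif['GPSLongitude'], $exif['GPSLongitudeRef'] ?? 'E'),
    ];
}
```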
The uploaded pest and disease images and other information are displayed as shown in Figure 4. The person who uploaded the image can reconfirm the information on this screen and make any corrections. Uploaded images are marked as “waiting for inspection,” and they eventually become training data after going through the annotation and inspection modules.
Figure 4. Screen of uploaded pest and disease image list. This user interface enables users to conveniently access information including the uploader’s name, crop name, pest and disease names, and inspection results.

3.2. Pest and Disease Image Annotation Module

In the pest and disease image annotation module, labeling is performed to mark the affected areas in the images. Labeling is a crucial step in building training data for AI, as the labeled data are used to train AI models.
The pest and disease image annotation screen is shown in Figure 5. Figure 5 ➀ shows the list of images awaiting annotation, which displays the uploaded images. Upon clicking an image, it is enlarged and displayed as shown in Figure 5 ➁, and the affected areas can be marked. Two annotation shapes are supported: rectangles and polygons. Images marked with rectangles can be used in object detection tasks, and the annotation information contains coordinate values in the format (X1, X2, Y1, Y2). As pest and disease symptoms take various forms, marking them with rectangles alone has limitations. To address this, a polygon marking feature was added, enabling users to mark complex damage symptoms more precisely. Images marked with polygons can be used in both object detection and segmentation tasks. The annotation information is automatically saved in a separate JSON file.
Figure 5. Image annotation screen. Users can select images for annotation from the ➀ list of images to annotate. The system supports two annotation methods, as indicated by the ➂ annotation type: rectangles and polygons.
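A minimal sketch of how one annotated image could be serialized to its own JSON file is shown below. The rectangle coordinate order follows the (X1, X2, Y1, Y2) format described above, while the JSON field names and the polygon point layout are illustrative assumptions.

```php
<?php
// Hypothetical sketch of serializing an annotation to a separate JSON file.
// Field names and the polygon point format are assumptions for illustration.

$annotation = [
    'image'   => 'IMG_0001.jpg',
    'crop'    => 'apple',
    'disease' => 'fire blight',
    'regions' => [
        [   // rectangle region, usable for object detection
            'type'   => 'rect',
            'coords' => [120, 480, 85, 300],        // (x1, x2, y1, y2)
        ],
        [   // polygon region, usable for detection and segmentation
            'type'   => 'polygon',
            'points' => [[130, 90], [300, 95], [310, 240], [140, 260]],  // (x, y) vertices
        ],
    ],
];

// Each annotation is stored in its own JSON file alongside the image metadata.
file_put_contents(
    'annotations/IMG_0001.json',
    json_encode($annotation, JSON_PRETTY_PRINT | JSON_UNESCAPED_UNICODE)
);
```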

3.3. Pest and Disease Image Inspection Module

The quality of labeled images is inspected in the pest and disease image inspection module.
Image inspections are performed by pest and disease experts, and only those who have been pre-approved gain access to the inspection module. Through the group management feature, PlantInfoCMS allows only users of a specific group authorized by the administrator to access the module. Pest and disease experts inspect the annotated images for the following:
(1) Check whether the captured images include diseases or pests and whether the original image is too dark or overexposed.
(2) Verify whether the captured image is assigned the correct crop name and pest or disease name.
(3) Inspect whether the affected areas of disease symptoms or pest occurrence have been accurately annotated without omissions.
The inspection module is designed to have at least two inspectors review a single image. The inspection process for one image is as follows (also shown in Figure 6):
Figure 6. Flow chart describing image inspection process. This flowchart illustrates the step-by-step procedure followed by crop pest and disease experts during the inspection of a single image.
(1) Two inspectors each review a single image and assign a status of “inspection passed” or “inspection not passed”.
(2) If both inspectors agree, then their opinion (passed or not passed) is used as the final inspection result.
(3) If the two inspectors have different opinions, then a third inspector determines the final inspection result.
Information about these inspections is stored in the inspection database, and if an image fails the inspection, feedback is provided to the person who annotated the image through a notification to make corrections.
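The consensus rule itself can be summarized by the following minimal sketch, which returns the final verdict when the two inspectors agree and defers to a third inspector otherwise. The function name and return convention are assumptions for illustration.

```php
<?php
// Hypothetical sketch of the consensus rule used in the inspection module:
// two inspectors review each image; if they agree, their verdict is final,
// otherwise a third inspector decides.

/**
 * @param bool      $first   verdict of the first inspector (true = passed)
 * @param bool      $second  verdict of the second inspector
 * @param bool|null $third   verdict of the tie-breaking inspector, if already given
 * @return bool|null final result, or null if a third opinion is still needed
 */
function finalInspectionResult(bool $first, bool $second, ?bool $third = null): ?bool
{
    if ($first === $second) {
        return $first;     // both inspectors agree: their verdict is final
    }
    return $third;         // disagreement: the third inspector decides (null until given)
}

// Example: the two inspectors disagree, so a third verdict settles the result.
var_dump(finalInspectionResult(true, false));        // NULL       -> request a third inspector
var_dump(finalInspectionResult(true, false, true));  // bool(true) -> inspection passed
```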

3.4. Dashboard Module

PlantInfoCMS provides various visualization graphs through the dashboard, making it easy to check the progress of image uploads and annotation tasks. Users can check the upload and annotation progress by month, crop, and group, and can also check task progress by date and shooting location through visualization methods such as the calendar view and map view. The main dashboard screen displayed to users in PlantInfoCMS is shown in Figure 7.
Figure 7. Dashboard main screen. The dashboard provides various visualization graphs, such as status bars, pie charts, and radar charts, to facilitate monitoring the progress of image uploads and annotation tasks.
It consists of a status bar, a menu bar, and panels that display various visualization results. As shown in Figure 7 ①, the Image Upload Status panel provides features to check the image upload and annotation status of each group through pie charts and bar charts. The pie chart displays the number of uploaded images in each group, the number of annotated images, and the proportion of annotated images among the total. Upon clicking the pie chart, the image upload and annotation task status for each group is displayed as percentages, as shown in Figure 8, allowing users to check the progress at a glance.
Figure 8. Status of image upload/annotation tasks by group. This panel enables users to conveniently track the status of image uploads and annotation tasks categorized by groups.
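As an illustration of how such group-level statistics could be computed, the following sketch aggregates the uploaded and annotated image counts per group with a single SQL query and derives the annotation percentage. The table and column names follow the earlier schema sketch and are assumptions rather than the actual implementation.

```php
<?php
// Hypothetical sketch of the per-group statistics behind Figure 8: number of
// uploaded images, number of annotated images, and the annotation percentage.

$pdo = new PDO('mysql:host=localhost;dbname=plantinfocms;charset=utf8mb4',
               'cms_user', 'cms_password',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

$rows = $pdo->query(
    'SELECT m.group_name,
            COUNT(c.content_id)                AS uploaded,
            SUM(c.annotation_path IS NOT NULL) AS annotated
       FROM fb_gcontent c
       JOIN fb_gmember  m ON m.member_id = c.member_id
      GROUP BY m.group_name'
)->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    $pct = $row['uploaded'] > 0
         ? round(100 * $row['annotated'] / $row['uploaded'], 1)
         : 0.0;
    // The front end renders these figures as pie/bar charts and percentages.
    printf("%s: %d uploaded, %d annotated (%.1f%%)\n",
           $row['group_name'], $row['uploaded'], $row['annotated'], $pct);
}
```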
Using the Year Selection feature in Figure 7 ②, users can select a specific year and view the corresponding task statistics. The Image Upload Status by Year panel in Figure 7 ③ visualizes the number of uploaded pest and disease images as a line graph showing the number of image uploads for each crop per month. The status of images uploaded by each group is displayed in Figure 7 ④ using a radar chart, making it easy to identify the groups with the most or fewest uploaded images.
PlantInfoCMS supports a map view that allows users to check the pest and disease image upload status by region. The Map View panel, featured in Figure 7 ⑤, allows users to easily identify the shooting locations and distribution of the uploaded images. The map view, shown in Figure 9, was implemented using the image metadata (e.g., location). The shooting location metadata is stored so that the system can be expanded into a control system in the future.
Figure 9. Map view screen. The map view feature enables users to check the status of pest and disease image uploads by region, as well as visualize the geographical distribution of uploaded images. The Korean text on the map indicates the names of cities and bodies of water.

4. Discussion and Limitations

This study introduces PlantInfoCMS, a scalable system designed for the efficient collection and management of large-scale crop pest and disease data. The proposed system consists of data upload, annotation, inspection, and dashboard modules. The data upload module provides plant pest and disease experts with a user-friendly interface and automatic parsing of image metadata, simplifying the process of uploading images taken in the field. The annotation module offers labeling functions using rectangles and polygons, making it easy to create datasets for training object detection and segmentation models. In the image inspection module, we introduced a process in which at least two pest and disease experts inspect each image to ensure the quality of the collected data. The dashboard provides various charts and a map view for easily visualizing the shooting locations and distribution of the uploaded images.
Previous research on data collection and management systems in the agricultural field has primarily focused on managing tabular data such as plant phenotyping or breeding data, with limited research on systems for managing image data. This study addresses this gap by proposing a system that efficiently collects and manages plant pest and disease images and builds datasets from them. The proposed system is expected to facilitate the creation of multiple pest and disease image datasets. Currently, PlantInfoCMS has collected a total of 301,667 images covering 185 types of pests and diseases across 32 different crops.
However, the current system is limited in that it manages only image data. Future research aims to integrate the system with the open API provided by the Korea Meteorological Administration to enable the analysis and management of weather and climate information for the image-capturing location at the time of capture. Additionally, we plan to add functionality to analyze data from soil and temperature sensors used in smart farms and to integrate with the National Crop Pest Management System (NCPMS) of the Rural Development Administration (RDA) of Korea, so that diagnosis and treatment prescription information for pests and diseases can be checked within the system.
The currently developed system for collecting and managing images of pests and diseases distinguishes images by crop, disease, and pest levels. Crop images can be divided into leaves, stems, branches, fruits, etc. depending on the shooting position, while diseases can be divided into early stage, mid-stage, and late stage according to the onset time. Pests can also be divided into eggs, larvae, pupae, adults, etc. depending on their growth stage. Incorporating such detailed information in the annotation during dataset construction enables more sophisticated diagnosis of diseases and pests, and can serve as evidence for assessing disease severity. Therefore, in future research, we plan to expand the system to include more detailed information such as the parts of the crop image captured, the progression of diseases in terms of early, mid, and late stages, and the growth stages of pests, in the dataset construction.
The currently developed plant disease information collection and management system focuses on pest and disease image collection and dataset construction. However, early diagnosis of crop pests and diseases cannot be adequately achieved using vision data alone. To achieve more accurate early diagnosis, crop growth data and weather data must also be utilized. For instance, a more accurate diagnosis can be obtained by comprehensively analyzing the results of image-based prediction models together with the growth information of the corresponding crops. By using growth meta-information to determine whether the disease predicted by the model is likely to occur at the time of prediction on an actual farm, comprehensive diagnosis results can be provided to the user. Nevertheless, the current system lacks the functionality to collect, manage, and analyze data related to crop production, weather information, and other sources. In future research, we plan to expand PlantInfoCMS to include the ability to collect, manage, and analyze such data, including growth data and weather data.

5. Conclusions

This study proposed PlantInfoCMS, a system for efficiently collecting and managing crop pest and disease image data. The proposed system consists of data collection, annotation, inspection, and dashboard modules to create high-quality pest and disease training data through these processes. Moreover, the proposed system has excellent scalability and provides various statistical functions, making management highly efficient by allowing users to easily check the progress of each task.
In this study, we proposed a system for collecting crop pest and disease images and constructing a high-quality dataset. With the proposed system, we collected approximately 301,557 pest and disease images and conducted annotation and inspection for 255,183 of those images. However, we have not yet systematically validated the accuracy of a diagnostic model trained on the dataset constructed with the system. In future research, we plan to train a pest and disease diagnostic model using the constructed dataset and to test the feasibility of the dataset through accuracy verification.

Author Contributions

Conceptualization, D.J., H.Y. and Y.H.G.; methodology, D.J. and H.Y.; software, D.J., H.Y. and R.Z.; validation, D.J., H.Y. and Y.H.G.; investigation, D.J., H.Y. and R.Z.; writing—original draft preparation, D.J. and H.Y.; writing—review and editing, S.J.Y. and Y.H.G.; supervision, S.J.Y. and Y.H.G.; project administration, Y.H.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work was carried out with the support of the “Cooperative Research Program for Agriculture Science and Technology Development (Project No. PJ016294, Development of pests and plant diseases diagnosis using intelligent image recognition)” of the Rural Development Administration, Republic of Korea.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. UN. United Nations Population Division. Available online: https://www.un.org/development/desa/pd/ (accessed on 25 April 2023).
  2. Araus, J.L.; Cairns, J.E. Field high-throughput phenotyping: The new crop breeding frontier. Trends Plant Sci. 2014, 19, 52–61. [Google Scholar] [CrossRef] [PubMed]
  3. Yoon, C.; Lim, D.; Park, C. Factors affecting adoption of smart farms: The case of Korea. Comput. Human Behav. 2020, 108, 106309. [Google Scholar] [CrossRef]
  4. Triantafyllou, A.; Tsouros, D.C.; Sarigiannidis, P.; Bibi, S. An architecture model for smart farming. In Proceedings of the 2019 15th International Conference on Distributed Computing in Sensor Systems (DCOSS), Santorini, Greece, 29–31 May 2019; pp. 385–392. [Google Scholar]
  5. Sundmaeker, H.; Verdouw, C.; Wolfert, S.; Freire, L.P. Internet of Food and Farm 2020. In Internet of Things Connecting the Physical, Digital and Virtual Worlds; River Publishers: Aalborg, Denmark, 2022; pp. 129–151. [Google Scholar]
  6. Boursianis, A.D.; Papadopoulou, M.S.; Diamantoulakis, P.; Liopa-Tsakalidi, A.; Barouchas, P.; Salahas, G.; Karagiannidis, G.; Wan, S.; Goudos, S.K. Internet of Things (IoT) and Agricultural Unmanned Aerial Vehicles (UAVs) in smart farming: A comprehensive review. Internet Things 2022, 18, 100187. [Google Scholar] [CrossRef]
  7. Aji, G.K.; Hatou, K.; Morimoto, T. Modeling the dynamic response of plant growth to root zone temperature in hydroponic chili pepper plant using neural networks. Agriculture 2020, 10, 234. [Google Scholar] [CrossRef]
  8. Yu, H.; Miao, C.; Leung, C.; White, T.J. Towards AI-powered personalization in MOOC learning. NPJ Sci. Learn. 2017, 2, 15. [Google Scholar] [CrossRef]
  9. Dhaka, V.S.; Meena, S.V.; Rani, G.; Sinwar, D.; Kavita; Ijaz, M.F.; Woźniak, M. A survey of deep convolutional neural networks applied for prediction of plant leaf diseases. Sensors 2021, 21, 4749. [Google Scholar] [CrossRef]
  10. Atila, Ü.; Uçar, M.; Akyol, K.; Uçar, E. Plant leaf disease classification using EfficientNet deep learning model. Ecol. Inform. 2021, 61, 101182. [Google Scholar] [CrossRef]
  11. Kaya, A.; Keceli, A.S.; Catal, C.; Yalic, H.Y.; Temucin, H.; Tekinerdogan, B. Analysis of transfer learning for deep neural network based plant classification models. Comput. Electron. Agric. 2019, 158, 20–29. [Google Scholar] [CrossRef]
  12. Liu, J.; Wang, X. Plant diseases and pests detection based on deep learning: A review. Plant Methods 2021, 17, 22. [Google Scholar] [CrossRef]
  13. Saiz-Rubio, V.; Rovira-Más, F. From smart farming towards agriculture 5.0: A review on crop data management. Agronomy 2020, 10, 207. [Google Scholar] [CrossRef]
  14. Yin, H.; Gu, Y.H.; Park, C.J.; Park, J.H.; Yoo, S.J. Transfer learning-based search model for hot pepper diseases and pests. Agriculture 2020, 10, 439. [Google Scholar] [CrossRef]
  15. Gu, Y.H.; Yin, H.; Jin, D.; Zheng, R.; Yoo, S.J. Improved Multi-Plant Disease Recognition Method Using Deep Convolutional Neural Networks in Six Diseases of Apples and Pears. Agriculture 2022, 12, 300. [Google Scholar] [CrossRef]
  16. Wong, Z.S.Y.; Zhou, J.; Zhang, Q. Artificial Intelligence for infectious disease Big Data Analytics. Infect. Dis. Health 2019, 24, 44–48. [Google Scholar] [CrossRef] [PubMed]
  17. Hughes, D.P.; Salathe, M. An open access repository of images on plant health to enable the development of mobile disease diagnostics. arXiv 2015, arXiv:1511.08060. [Google Scholar]
  18. Cap, Q.H.; Uga, H.; Kagiwada, S.; Iyatomi, H. LeafGAN: An Effective Data Augmentation Method for Practical Plant Disease Diagnosis. IEEE Trans. Autom. Sci. Eng. 2022, 19, 1258–1267. [Google Scholar] [CrossRef]
  19. Wu, X.; Zhan, C.; Lai, Y.K.; Cheng, M.M.; Yang, J. IP102: A Large-Scale Benchmark Dataset for Insect Pest Recognition. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 16–20 June 2019; pp. 8779–8788. [Google Scholar]
  20. AIHUB. An AI Training Data Platform. Available online: https://aihub.or.kr/ (accessed on 25 April 2023).
  21. Fruit Fire Blight Shooting Image Dataset. Available online: https://aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=realm&dataSetSn=146 (accessed on 25 April 2023).
  22. The Open-Field Crop Disease Diagnosis Image Dataset. Available online: https://aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=realm&dataSetSn=147 (accessed on 25 April 2023).
  23. The Open-Field Crop Pest Diagnosis Image Dataset. Available online: https://aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=realm&dataSetSn=148 (accessed on 25 April 2023).
  24. Wang, R.; Liu, L.; Xie, C.; Yang, P.; Li, R.; Zhou, M. Agripest: A large-scale domain-specific benchmark dataset for practical agricultural pest detection in the wild. Sensors 2021, 21, 1601. [Google Scholar] [CrossRef] [PubMed]
  25. Patel, V. A framework for secure and decentralized sharing of medical imaging data via blockchain consensus. Health Inform. J. 2019, 25, 1398–1411. [Google Scholar] [CrossRef] [PubMed]
  26. Tournier, J.D.; Smith, R.; Raffelt, D.; Tabbara, R.; Dhollander, T.; Pietsch, M.; Christiaens, D.; Jeurissen, B.; Yeh, C.H.; Connelly, A. MRtrix3: A fast, flexible and open software framework for medical image processing and visualisation. Neuroimage 2019, 202, 116137. [Google Scholar] [CrossRef]
  27. Seetharaman, K. A Fully Automated Crop Disease Monitoring and Management System Based on IoT: IoT-Based Disease Identification for Banana Leaf. In Deep Learning Applications and Intelligent Decision Making in Engineering; IGI Global: Hershey, PA, USA, 2021; pp. 192–211. [Google Scholar]
  28. Klukas, C.; Chen, D.; Pape, J.M. Integrated analysis platform: An open-source information system for high-throughput plant phenotyping. Plant Physiol. 2014, 165, 506–518. [Google Scholar] [CrossRef]
  29. Reynolds, D.; Ball, J.; Bauer, A.; Davey, R.; Griffiths, S.; Zhou, J. CropSight: A scalable and open-source information management system for distributed plant phenotyping and IoT-based crop management. Gigascience 2019, 8, giz009. [Google Scholar] [CrossRef]
  30. Yang, Y.; Wilson, L.T.; Wang, J.; Li, X. Development of an integrated Cropland and Soil Data Management system for cropping system applications. Comput. Electron. Agric. 2011, 76, 105–118. [Google Scholar] [CrossRef]
  31. Jung, S.; Lee, T.; Gasic, K.; Campbell, B.T.; Yu, J.; Humann, J.; Ru, S.; Edge-Garza, D.; Hough, H.; Main, D. The Breeding Information Management System (BIMS): An online resource for crop breeding. Database 2021, 2021, baab054. [Google Scholar] [CrossRef] [PubMed]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
