Article

A Cloud Computing Framework for Space Farming Data Analysis

by
Adrian Genevie Janairo
,
Ronnie Concepcion II
,
Marielet Guillermo
* and
Arvin Fernando
Gokongwei College of Engineering, De La Salle University, Manila 1004, Philippines
*
Author to whom correspondence should be addressed.
AgriEngineering 2025, 7(5), 149; https://doi.org/10.3390/agriengineering7050149
Submission received: 10 February 2025 / Revised: 15 March 2025 / Accepted: 21 March 2025 / Published: 8 May 2025

Abstract

This study presents a system framework by which cloud resources are utilized to analyze crop germination status in a 2U CubeSat. This research aims to address the onboard computing constraints in nanosatellite missions to boost space agricultural practices. Through the Espressif Simple Protocol for Network-on-Wireless (ESP-NOW) technology, communication between ESP32 modules was established. The corresponding sensor readings and image data were securely streamed through Amazon Web Services Internet of Things (AWS IoT) to an ESP-NOW receiver and Roboflow. Real-time monitoring of plant growth predictors was implemented through the web application provisioned at the receiver end. Sprouts on the germination bed, in turn, were determined through the custom-trained Roboflow computer vision model. The feasibility of remote computational data analysis and monitoring for a 2U CubeSat, given its minute form factor, was successfully demonstrated through the proposed cloud framework. The germination detection model achieved a mean average precision (mAP), precision, and recall of 99.5%, 99.9%, and 100.0%, respectively. The temperature, humidity, heat index, LED and fogger states, and bed sprout data were shown in real time through a web dashboard. With this use case, immediate actions can be performed when abnormalities occur. The scalable nature of the framework allows adaptation to various crops to support sustainable agricultural activities in extreme environments such as space farming.

1. Introduction

The rapid expansion of space exploration into new areas has created a growing need for new technologies and methods that can support human life in space. In the past, space exploration was mostly controlled by big government agencies like NASA and the European Space Agency (ESA). However, this is changing. The development of small, affordable satellites called CubeSats has played a big role in this shift. CubeSats are small, typically measuring 10 × 10 × 10 cm, and their simple and modular design makes them accessible to a wide range of groups, including universities, private companies, and even smaller countries. This has opened up space research to more people, allowing a wider variety of participants to contribute to space science and exploration [1].
One of the most promising applications of CubeSats lies in astrobiology, specifically the study of plant biology in microgravity. As humanity looks towards long-term space missions, including potential inhabitation on the Moon, Mars, and beyond, understanding the biological responses of plants in space environments is critical. Plants are fundamental to life support systems in space, providing oxygen, food, and psychological benefits to astronauts [2]. However, microgravity has been shown to influence various physiological processes in plants, including seed germination, growth, and reproduction. Research conducted on the International Space Station (ISS) has demonstrated that while seeds can germinate in microgravity, the absence of gravity affects the orientation, root growth, and overall morphology of the plants, raising questions about how best to support plant life in space [3]. To advance our understanding of seed germination in microgravity, it is essential to develop sophisticated experimental platforms that can replicate the environmental conditions of space while providing the necessary support for biological studies. CubeSats, with their small size and modular design, offer an ideal platform for such experiments [4,5]. However, their standard payload attributes constrain onboard computing and storage functions, thus restraining precise, real-time control of environmental factors such as temperature, humidity, and light to sustain plant growth. To address this, there is a need to establish a framework that minimizes data handling tasks onboard while maximizing the satellite data that can be obtained for a crop germination nanosatellite mission. Figure 1 illustrates an exploded view of the standard CubeSat subsystems [6] and its communications system structure to Earth through the ground stations [7]. Ground stations are physical infrastructures housing the necessary communications subsystems for transmitting, receiving, and decoding signals.
Base or controller facilities are often stationed to perform more specific tasks according to their primary designation, such as security, network, and weather monitoring.
In this study, tasks specific to crop growth status and growth-predictor condition monitoring are proposed to reside neither on the payload nor at the base stations, but in cyberspace, effectively implementing a space-to-edge-to-cloud solution framework with provision for a potential Internet of Space Things application. This is discussed further in the next section. The ARCHER (Agricultural CubeSat for Horticulture Experiments and Research), shown in Figure 2, was used as the CubeSat for the experimental setup.
Each specific objective listed below contributes strategically to bridging terrestrial and space-based monitoring systems for the reinforcement of the agricultural sector:
  • Development of a webserver as a monitoring hub;
  • Establishment of edge-to-cloud communications using AWS IoT Core;
  • Data collection and transmission from 2U CubeSat using the ESP-NOW technology;
  • Development and deployment of a machine vision model for germination detection using Roboflow.
Furthermore, the extension of the proposed framework to space-based applications underscores its potential to contribute to interdisciplinary research areas, including sustainable precision agriculture, environmental monitoring, and space science. By combining advanced technological solutions with practical utility, this study represents a significant contribution to the evolving landscape of smart monitoring systems.

1.1. Review of Related Studies

The convergence of Internet of Things (IoT) technologies with agricultural and space research has driven significant advancements in real-time monitoring and data collection. In agriculture, IoT systems enable precise tracking of environmental parameters such as temperature, humidity, and soil conditions, improving resource management and crop productivity [8]. Similarly, in space research, IoT technologies have become indispensable for managing experiments under the constraints of harsh conditions, limited access, and the need for real-time data transmission [9]. Innovations such as low-cost microcontrollers, cloud-based platforms, and machine learning (ML) models have expanded the potential for automation, accessibility, and data-driven insights [10,11,12]. These opportunities in the near future include automated space vehicles, wearables and mobile apps for space travel, spatial object tracking, and potential smart constructions and data centers in space. This review explores the application of IoT systems in agricultural and space-based research, with a focus on CubeSat missions and the development of solutions for studying seed germination in microgravity. In agricultural settings, the adoption of low-cost and versatile microcontrollers, such as ESP32, has facilitated the development of robust, real-time monitoring systems [13]. Studies have shown that these systems efficiently collect and transmit environmental data, making them ideal for use in remote or resource-constrained environments [14,15]. Furthermore, the integration of remote access solutions like NGROK has expanded the accessibility of such systems, allowing users to retrieve data from any location with minimal infrastructure requirements [16]. The application of IoT in space science further underscores its versatility. 
CubeSat missions often rely on IoT-based systems to monitor internal conditions such as temperature, humidity, and light, ensuring that experiments onboard remain within operational parameters [17]. These systems also support the collection of biological data, such as plant growth metrics, enabling detailed analysis of processes like seed germination under microgravity conditions [18]. Cloud-based platforms such as AWS IoT have become instrumental in managing the large volumes of data generated by IoT devices [19]. These platforms provide secure storage, scalable data processing, and user-friendly real-time visualization and analysis dashboards. This capability benefits agricultural and space applications, where consistent data monitoring and secure storage are essential [20,21].
The development of CubeSats for space-based research presents significant opportunities and notable challenges, particularly in studying biological processes like seed germination in microgravity [22]. CubeSats, small, modular satellites that typically measure 10 × 10 × 10 cm per unit, offer a unique platform for conducting scientific experiments in space [23]. Their cost-effective, scalable design has opened doors for a wider range of organizations, including universities and private companies, to contribute to space research [23]. However, the limitations of CubeSat platforms must be carefully considered when it comes to biological research, such as studying plant growth and seed germination in microgravity. Microgravity significantly affects plant development, including seed germination, root growth, and overall morphology [24]. Without gravity, plants struggle to orient themselves, leading to altered root growth and changes in nutrient and water distribution. The challenge is to design a system that can manage environmental factors such as temperature, humidity, and light and even simulate gravity or microgravity conditions. The 2U CubeSat, twice the size of the standard 1U CubeSat, offers more space for research payloads [25]. However, it is still limited in terms of overall volume and power capacity, making it difficult to incorporate comprehensive environmental controls and sensors. Despite these constraints, CubeSats remain valuable due to their small size, low cost, and ability to perform targeted experiments in space [26]. Yet, the harsh space environment—characterized by radiation, extreme temperatures, and vacuum conditions—poses additional challenges to payload design [27]. Quasi-static loading and vibration directly affect the structure during launch. Communication is also constrained to high frequencies, consequently drawing more power onboard.
Temperature fluctuations across various orbital regions destabilize the payload temperature, which may damage sensors and other electronic subsystem components. These factors require that CubeSats be designed with durable materials and efficient power systems to ensure their functionality during long-duration missions. Advanced monitoring systems and power management techniques are critical to overcoming these limitations. For instance, integrating sensors for real-time data collection allows researchers to monitor plants’ growth conditions during experiments [28]. With effective power management, it is possible to optimize the use of CubeSat resources and maintain the payload’s functionality. These technologies enable continuous data collection despite the limited payload size and power. Understanding how plants behave in microgravity is essential for long-term space missions and technical considerations [24]. The insights gained from seed germination studies in microgravity can inform strategies for supporting space agriculture and ensuring food sustainability during extended space missions [29]. Furthermore, these findings can enhance agricultural practices on Earth, particularly in extreme or resource-limited environments. One example is chamber growing in areas with low-nutrient soil, a practice replicated from a salad crop successfully cultivated in microgravity after long-term analysis of the plant’s behavior [30]. Another is the use of an artificial bioregenerative ecosystem to sustain marine life by converting fish waste into food [31].
Machine learning and vision models are vital in improving data analysis in CubeSat-based biological research [32]. These technologies can automate the analysis of large datasets generated from plant growth monitoring, making it easier and faster to identify trends and anomalies that would otherwise be difficult to detect. ML models can be trained to recognize subtle changes in plant health, while vision models can provide real-time visual assessments of seed germination and growth [33,34]. This approach increases the efficiency of data processing and enables more accurate and actionable insights. ML and computer vision (CV) models, combined with cloud-based platforms like Roboflow, are increasingly used for crop germination detection and classification [35]. Roboflow enables the development of custom object detection models by training on labeled image datasets, allowing researchers to monitor and classify plant growth stages, including seed germination, using visual data. This is especially beneficial for space applications, where traditional monitoring methods are impractical, such as with CubeSat payloads in microgravity environments [36]. Roboflow’s platform supports image analysis techniques like object detection and image segmentation, ideal for identifying subtle differences in germination stages [37]. By integrating this with CubeSat technology, real-time monitoring of plant growth in space becomes possible, providing essential data on seed behavior in microgravity. The cloud-based system offers remote monitoring and real-time feedback, allowing researchers to analyze plant health remotely [38]. As new images are collected, the detection models can continuously improve through adaptive learning, enhancing accuracy over time [39]. Additionally, the scalability of cloud-based platforms allows for easy application across different crops or growth conditions, making them versatile for space-based agriculture. In the study of Priyadarshini et al. (2022), it was suggested that plants grown in space often deviate from ideal attributes, such as size, compared with those grown on the ground surface, and that cloud monitoring allows such indicators to be detected early so that corrective actions can be taken [40].
In conclusion, this review underscores the critical role of integrating IoT technologies, cloud-based platforms, and machine learning in overcoming the challenges inherent in space farming research. CubeSat missions, constrained by limitations in size, power, and accessibility, demand innovative approaches to facilitate real-time monitoring and analysis of biological experiments. By leveraging tools such as AWS IoT, NGROK, and Roboflow, researchers can implement scalable and efficient systems for automated germination detection and environmental control. These advancements not only enhance the functionality of CubeSat payloads for studying seed germination in microgravity but also contribute to the broader field of sustainable agricultural practices in extreme environments. Collectively, this research provides a robust framework for addressing current gaps in space-based biological studies, offering valuable insights for future explorations in space farming.

1.2. The Proposed Framework for Space Farming Data Analysis

To achieve the objective of performing advanced crop growth monitoring on nanosatellites despite the payload constraints, the framework illustrated in Figure 3 is proposed. The framework is composed of four segments represented by green dashed-line boxes with blue identifier labels: CubeSat, ground station, cloud services, and end user. These are end blocks utilizing different technologies for different purposes, interconnected through IoT running on the cloud platform. The corresponding technology per block is marked in a red box. The first two blocks use ESP-NOW while the remaining blocks use HTTP. The orange boxes denote the functions of these blocks.
The CubeSat block covers the standard nanosatellite mission subsystems such as imaging, power, and communications. The components specific to space farming are further explained in Section 2. The block performs the transmission function, wherein data captured through the onboard sensors are sent to the receiver end through the satellite dish at the ground station. The data are decoded at the transport layer, in a form the receiver understands, using the ESP-NOW protocol. The ground station block covers the receiving antenna and its communications subsystems, which convert RF signals into a data format readable at the data link layer. For simplicity, the block is labeled ground, but it also signifies other stations, such as base, access gateway, or customer premise, which are physically situated on the ground to perform various functions. The block performs the receiving and preliminary decoding functions. It uses the ESP-NOW protocol for basic sensor reading and image store-and-forward tasks, and the HTTP protocol for an in-house web server that accepts HTTP client requests from the end user block.
The data from the ground station block are passed onto the cloud, where the IoT technology resides. The technology performs internetworking, or interconnectivity of devices from all end blocks, using the IP protocol. Depending on the data type, TCP or UDP can be utilized. The cloud services block performs the storing and computing functions. Cloud services also perform application programming interfacing between different cloud service providers and applications; in this case, Roboflow houses the CV model-related tasks, as discussed further in the next section. The end user block performs HTTP requests to the web server to access and visualize the crop germination monitoring application. Both the cloud services and end user blocks have two-way communication with the IoT block.
The proposed framework can be further reconfigured to advance the nanosatellite mission into an Internet of Space Things (IoST) structure, such that multiple CubeSats with different missions can be deployed in space and communicate with each other. In essence, this converts them into active satellites with the ability to perform decision making and control onboard. See Figure 4 for the illustration.
The first two blocks in the initial framework are now replaced with the IoST framework as introduced in an existing study [41], which explores bringing the cyber-physical system into space to realize true global connectivity. In this hybrid scenario, multiple CubeSats can be deployed either in a homogeneous setup, performing similar missions [42,43] and serving as repeaters, or in heterogeneous form [44], each designated with a unique mission and acting with a server–client relationship in space.
In the existing framework for plant growth experiments in CubeSat missions, the nanosatellite carrying the chamber is launched into space and stays for a period long enough to acquire substantial data for analysis. The collected data are then examined at the ground station after the mission. The same framework was used in a study analyzing the growth of alfalfa and tobacco, as shown in Figure 5.
Comparing the existing framework with the one proposed in this study, the key difference is the position of plant growth analysis in the framework. Analysis happens at the end, post-mission deployment, in the existing framework, while in the proposed framework analysis can be performed as soon as the CubeSat is deployed in space. Being able to monitor and analyze seedling status in flight makes a huge difference in chamber environment control and in sustaining plant growth.
This study addresses significant gaps in space-based agricultural research, particularly the need for real-time remote monitoring systems for seed germination experiments in CubeSat missions. By developing an integrated system using AWS IoT for data collection, storage, and remote access, this research ensures efficient and accessible monitoring of space-based plant experiments. With the aid of the proposed user interface, the controlled environment can be reconfigured remotely when anomalies occur or when various crops are monitored requiring different temperature values or light intensities. Additionally, it advances the field by incorporating deep learning and cloud-based platforms like Roboflow for automated germination detection, providing a scalable solution for space research. This study also contributes to the design of a 2U CubeSat capable of supporting plant growth experiments in microgravity, highlighting the importance of environmental control in CubeSat payloads. By combining cloud-based image detection and automated biological analysis, this research fills critical gaps in space farming, offering a practical and sustainable framework for future experiments in space-based biological research.

2. Materials and Methods

This section details the development of an ESP32-based system for efficient data collection, monitoring, and transmission in a controlled agricultural setting. Leveraging the ESP-NOW protocol for device communication and advanced libraries for data handling, the system integrates sensor data acquisition, real-time monitoring, and cloud-based access via AWS IoT Core. Additionally, the system framework incorporates Roboflow, a cloud-based machine vision modeling platform [46], for detecting crop germination using images captured by the ESP32-Cam, enabling automated analysis and insights. Each component ensures reliability, scalability, and precision, supporting agricultural applications in a CubeSat environment through seamless data management and accessibility.

2.1. Data Collection Using the ESP32-Based System Simulation

Using two ESP32-based microcontrollers, data collected from the DHT11 sensor by the ESP32, together with images from the ESP32-Cam, are temporarily stored in the microSD memory card inserted in the ESP32-Cam microcontroller. The process is illustrated in Figure 6. This uses the ESP-NOW technology, where the ESP32 broadcasts a structure of data directly to the ESP32-Cam over a local wireless link. This simulates the data transfer between the first two blocks in the proposed framework.
The ESP32 microcontroller serves as the main clock of the whole system by setting the time internally using the <ESP32Time.h> header. This method does not use an internet connection to obtain the real time and date but starts from the date given in the Arduino code. In practice, the date input is the starting day of data collection and the time is set to 00:00:00 in hr:min:sec format. The next data handled by the ESP32 come from the DHT11: temperature and heat index in degrees Celsius (°C), and humidity in relative percentage (%). This is made possible using the <DHT.h> header. Since the ESP32 also controls the lighting and irrigation systems of the CubeSat, the last two pieces of information to be transferred are the status of the LED light, on or off, and the status of the pump, on or off as well. Figure 7 shows the corresponding code snippet.
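The heat index transmitted alongside the raw DHT11 readings is typically derived by common DHT libraries from temperature and humidity using the NOAA Rothfusz regression. As the actual firmware snippet lives in Figure 7, the following is a hedged, standalone Python sketch of that calculation; the function name `heat_index_c` is ours, not the authors':

```python
def heat_index_c(temp_c: float, rh: float) -> float:
    """Approximate heat index (°C) via the Rothfusz regression,
    mirroring the approach used by common DHT sensor libraries.
    temp_c: dry-bulb temperature in °C; rh: relative humidity in %."""
    t = temp_c * 9.0 / 5.0 + 32.0  # the regression is defined in °F
    # Simple formula first; kept when the result stays below ~79 °F
    hi = 0.5 * (t + 61.0 + (t - 68.0) * 1.2 + rh * 0.094)
    if hi > 79.0:
        hi = (-42.379 + 2.04901523 * t + 10.14333127 * rh
              - 0.22475541 * t * rh - 6.83783e-3 * t * t
              - 5.481717e-2 * rh * rh + 1.22874e-3 * t * t * rh
              + 8.5282e-4 * t * rh * rh - 1.99e-6 * t * t * rh * rh)
        # Standard low- and high-humidity adjustments
        if rh < 13 and 80 <= t <= 112:
            hi -= ((13 - rh) / 4) * ((17 - abs(t - 95)) / 17) ** 0.5
        elif rh > 85 and 80 <= t <= 87:
            hi += ((rh - 85) / 10) * ((87 - t) / 5)
    return (hi - 32.0) * 5.0 / 9.0  # convert back to °C
```

For instance, at 30 °C and 70% relative humidity the perceived temperature climbs to roughly 35 °C, illustrating why heat index is logged in addition to raw temperature.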
This study effectively leverages ESP-NOW for wireless communication between ESP32 devices. By initializing ESP-NOW, defining a data structure, and establishing a peer-to-peer connection, the code enables the transmission of sensor data, LED control commands, and fogger state updates. ESP-NOW’s simplicity and low-latency characteristics make it suitable for this application. The code transmits data efficiently, without the complexity of a full Wi-Fi connection to the local network. After this, the data structure in Figure 8 is received at the ESP32-Cam, which captures images of the current state of the plant bed in the ARCHER CubeSat in JPEG format, named according to the data number. In addition, information such as the data number is also generated in the ESP32-Cam and added to the data structure.
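ESP-NOW frames carry at most 250 bytes of payload, so the shared data structure must pack comfortably within that limit. The exact byte layout in Figure 8 is not reproduced here; the sketch below is a hypothetical layout of the fields named in the text (three floats plus two state bytes), shown in Python's `struct` notation purely to illustrate the size budget:

```python
import struct

# Hypothetical layout mirroring the described ESP-NOW data structure
# (field order and types are our assumption, not the authors' exact struct):
# temperature (float32), humidity (float32), heat index (float32),
# LED state (uint8), fogger state (uint8), little-endian, packed.
PAYLOAD_FMT = "<fffBB"
ESP_NOW_MAX_PAYLOAD = 250  # ESP-NOW per-frame payload limit, in bytes

def pack_reading(temp_c, humidity, heat_index, led_on, fogger_on):
    """Serialize one sensor reading exactly as it would cross the air link."""
    payload = struct.pack(PAYLOAD_FMT, temp_c, humidity, heat_index,
                          int(led_on), int(fogger_on))
    assert len(payload) <= ESP_NOW_MAX_PAYLOAD
    return payload

def unpack_reading(payload):
    """Recover the tuple of fields on the receiving ESP32-Cam side."""
    return struct.unpack(PAYLOAD_FMT, payload)
```

At 14 bytes per reading, the structure uses well under a tenth of the available frame, leaving ample headroom for the ID and timestamp fields added at the receiver.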
Using the built-in SD card module in the ESP32-Cam, the numerical and text data were each saved as a string in a comma-separated values (CSV) format. Each row corresponds to a single reading and includes the following columns: ID, Date, Time, Temperature, Humidity, Heat Index, LED State, and Fogger State. The use of an SD card for data saving is an integral part of the methodology in this study, ensuring secure and accessible storage of sensor and image data. This approach is designed to facilitate the systematic collection, organization, and retrieval of information critical to evaluating crop health and environmental parameters. The process begins with SD card initialization, which is carried out using the SD_MMC library. Headers such as <Arduino.h>, <FS.h>, and <SD_MMC.h> enable core functionality for handling data storage operations, while <Arduino_JSON.h> was utilized for formatting data into JSON strings when needed. Each entry in the data file is uniquely identified using an ID and timestamp, ensuring traceability and chronological organization. For the image data, high-resolution images captured by the hyperspectral or thermal imaging system are stored separately on the SD card in a compressed format, such as JPEG. The files are dynamically named using a combination of a unique identifier and the timestamp of capture. These filenames are cross-referenced in the sensor data logs to maintain a cohesive relationship between the numerical data and corresponding visual or spectral information.
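The logging scheme above, one CSV row per reading plus a cross-referenced image filename, can be sketched in Python. The column order follows the text exactly; the filename pattern combining ID and timestamp is our illustrative assumption, since the paper does not spell out the exact format:

```python
import csv
import io

# Column layout as described for the SD-card log
CSV_HEADER = ["ID", "Date", "Time", "Temperature", "Humidity",
              "Heat Index", "LED State", "Fogger State"]

def format_row(data_id, date, time, temp_c, humidity, heat_index,
               led_state, fogger_state):
    """Return one CSV line matching the logger's column layout."""
    buf = io.StringIO()
    csv.writer(buf, lineterminator="").writerow(
        [data_id, date, time, temp_c, humidity, heat_index,
         led_state, fogger_state])
    return buf.getvalue()

def image_filename(data_id, date, time):
    # Hypothetical naming scheme: unique ID plus capture timestamp,
    # cross-referenced in the CSV log (exact scheme is ours, not the paper's).
    return f"{data_id}_{date}_{time.replace(':', '-')}.jpg"
```

Storing the filename in the log alongside the numeric fields is what keeps the visual and sensor records in a cohesive one-to-one relationship.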
This section discussed the process by which image and string data from the microcontrollers are collected, pre-processed, and temporarily stored. The next section discusses how the curated data are made publicly visible in a user-friendly format.

2.2. Development of the Asynchronous Local Webserver on ESP32

The development of the ESP32-based web server integrates advanced functionalities such as live data updates, as shown in Figure 9. The ESP32-based web server includes a live image upload feature, enabled by the ESP32-CAM module, which allows real-time monitoring through camera feeds. Images are dynamically captured and displayed on the web interface, offering users the ability to visually monitor the germination environment. Complementing this, the server provides live sensor readings, showcasing critical environmental parameters such as temperature, humidity, and heat index. These real-time data aid in making informed decisions based on the monitored conditions. Additionally, the inclusion of a live date and time update feature ensures temporal accuracy in all data interactions that the web server tracks and displays, which is crucial for time-sensitive applications. This is accompanied by the current data ID and image filename saved on the microSD card, providing transparency and ease of access to historical data. This feature simplifies data management and retrieval, especially in scenarios where large datasets are generated over extended periods. The entire system was developed using PlatformIO within Visual Studio Code (https://code.visualstudio.com/, accessed on 10 February 2025), leveraging the Arduino framework for the ESP32. The key libraries used are listed in Table 1 with their corresponding functions.
To manage the web server’s interface, an index.html file was designed and uploaded to the ESP32-CAM as a filesystem image. This process utilized the LittleFS file system, with the LittleFS.h library enabling the ESP32 to access and serve the uploaded web page. The ESP32 microcontroller then facilitates both Wi-Fi-based web server functionality and device-to-device communication via ESP-NOW. The use of WebSocket technology within the web server allows for real-time, bidirectional communication, enabling instantaneous updates of sensor readings, image feeds, and system states without requiring repeated HTTP requests. This reduces latency and optimizes bandwidth, making the system suitable for applications demanding high responsiveness.
Numeric and string data, such as sensor readings, current date and time, data IDs, and image filenames, are sent to the client in structured JSON format. The client-side JavaScript listens for WebSocket messages and dynamically updates the HTML elements using the Document Object Model (DOM). Captured images from the ESP32-CAM are stored with unique filenames on the microSD card or the ESP32’s memory. The images are then encoded in Base64 format and made accessible via WebSocket connections, enabling remote monitoring and analysis. The index.html file dynamically requests the latest image, which is served over HTTP and displayed on the web page. To avoid caching issues, the image URL includes a timestamp, ensuring the browser fetches the most recent capture. The image filename is updated through WebSocket alongside other data.
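The two delivery tricks described above, Base64-encoding captured JPEGs for the WebSocket channel and appending a timestamp to the image URL to defeat browser caching, can be sketched as follows. This is an illustrative Python rendering of the logic, not the firmware code; the function names are ours:

```python
import base64
import time

def encode_image_for_websocket(jpeg_bytes: bytes) -> str:
    """Base64-encode a captured JPEG so it can travel inside a
    text-based WebSocket message."""
    return base64.b64encode(jpeg_bytes).decode("ascii")

def cache_busted_url(filename, ts=None):
    # Appending a timestamp query string forces the browser to
    # fetch the newest capture instead of serving a cached copy.
    ts = int(time.time()) if ts is None else ts
    return f"/{filename}?t={ts}"
```

Because the filename travels over WebSocket alongside the sensor JSON, the client can rebuild the cache-busted URL for each new capture and refresh only the image element.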
This section explored the web interface through which a user can remotely view the germination status inside a CubeSat via the sensor readings and images. The next section explores how this interface can be accessed publicly through a cloud service.

2.3. Establishment of Data Monitoring and Saving Using AWS IoT

This part of the methodology aims to connect the ESP32-Cam microcontroller to Amazon AWS IoT Core via the MQTT protocol, which provides a structured approach to integrating IoT devices with cloud services. This allows device integration with cloud-based solutions for remote monitoring outside the local network, enables data transmission, and facilitates bidirectional communication securely and efficiently.
The first step involves configuring the AWS IoT Core to recognize the ESP32-Cam as a device, referred to as a “Thing”. This representation serves as the bridge between the hardware and cloud services. By generating the required certificates and private keys for SSL/TLS encryption, the process ensures the creation of a secure environment where data integrity and secure communication between the ESP32-Cam and AWS IoT Core are established. After this, the setup involves defining MQTT topics that act as channels for data exchange. In the Arduino code in PlatformIO, two important libraries are included, namely <PubSubClient.h> for MQTT and <ArduinoJSON.h> for data formatting. The code also uses the <WiFi.h> library to connect to the internet. After connecting to the internet, the next step is configuring the MQTT client with the AWS IoT Core endpoint details and the generated security credentials in a created header file named <secrets.h>. Functions for publishing sensor data and subscribing to topics are implemented to enable bidirectional communication. The inclusion of SSL/TLS settings ensures secure data transmission. Once the sketch is uploaded to the ESP32-Cam, the serial monitor in VS Code confirms, through the device’s serial output, successful Wi-Fi and MQTT connections. Next is validation in the AWS IoT Console, where the MQTT Test Client is used to verify that data messages from the ESP32-Cam are received by subscribing to the topic “dlsuARCHER/pub”. Once subscribed to the topic, the MQTT Test Client receives the data from the CubeSat in JSON format, with variables named data_id, date_time, image_id, temperature, humidity, heat_index, ledState, and foggerState. Publishing real-time sensor data to AWS IoT Core enables remote access to the data, even outside the local network, and confirms that the CubeSat is still working.
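The JSON message published to the “dlsuARCHER/pub” topic uses the variable names listed above. A minimal Python sketch of the payload assembly is shown below; the field names and topic come from the text, while the value types and the helper name `build_payload` are our assumptions:

```python
import json

MQTT_TOPIC = "dlsuARCHER/pub"  # topic subscribed to in the AWS IoT MQTT Test Client

def build_payload(data_id, date_time, image_id, temperature,
                  humidity, heat_index, led_state, fogger_state):
    """Assemble the JSON message published to AWS IoT Core, using the
    variable names described in the text."""
    return json.dumps({
        "data_id": data_id,
        "date_time": date_time,
        "image_id": image_id,
        "temperature": temperature,
        "humidity": humidity,
        "heat_index": heat_index,
        "ledState": led_state,
        "foggerState": fogger_state,
    })
```

On the device this role is played by <ArduinoJSON.h>, with <PubSubClient.h> handling the actual `publish` to the topic over the TLS-secured MQTT session.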
The previous sections explained data transmission from the physical layer up to the application layer, where a user receives the data in an understandable format through an online interface. The next section focuses on how seedlings are detected in the images captured inside the CubeSat.

2.4. Development of Roboflow Model for Germination Detection

The development of a Roboflow-based model for detecting seed germination represents an innovative approach to advancing the monitoring system inside a CubeSat. See Figure 10 for the actual germination bed setup in the 2U CubeSat. The model enables an efficient, non-invasive method for real-time germination monitoring, with the objective of counting the number of germinated seeds as an input for quantifying the growth rate.
Images from the ARCHER’s SD card, mounted in the ESP32-Cam, were retrieved and used as the image data for annotation. Manual annotation was performed in the Roboflow application by placing bounding boxes on the germinated seeds in the plant bed, as shown in Figure 11.
The image dataset contains 95 annotated images, with germinated crops labeled as “seedling”. Preprocessing and augmentation were then performed, generating additional versions of the original dataset for a total of 162 images. The preprocessing and augmentation steps are listed in Table 2.
The 162 images were divided into training, validation, and test datasets at 83%, 12%, and 5%, respectively. These percentages were estimated by Roboflow from a target 85-10-5 split to avoid overfitting issues. The ARCHER germination detection model was trained using the Roboflow 3.0 object detection framework to achieve efficient and accurate germination stage detection. The selected checkpoint, COCO, provides a robust foundation, as it is pre-trained on a large, diverse dataset of objects. The model was evaluated with metrics such as mean average precision (mAP), precision, and recall. The mAP metric provides a comprehensive measure of the model’s ability to correctly identify germination stages across varying levels of overlap between predicted and actual bounding boxes, ensuring both localization and classification accuracy. Precision highlights the proportion of correctly identified germinating seeds out of all positive predictions, minimizing false positives and ensuring reliability in practical applications. Conversely, recall measures the proportion of actual germinating seeds correctly detected, reflecting the model’s sensitivity and ability to minimize false negatives. In addition, training graphs such as mAP, box loss, class loss, and object loss provide insight into the model’s performance during training. The mAP curve tracks the model’s accuracy in detecting and classifying germination stages across training epochs, with a steady upward trend indicating improved learning and convergence. The box loss graph evaluates the precision of bounding box predictions, with a declining trend reflecting enhanced localization of germinating seeds. Similarly, the class loss graph measures errors in classification, and its reduction signifies improved differentiation between germination stages and other objects.
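The split logic described above can be sketched as follows. This is an illustrative Python sketch of an 85-10-5 shuffle-and-split, not Roboflow’s internal implementation; its exact counts differ slightly from the 83-12-5 outcome Roboflow reported for the 162-image set.

```python
import random

def split_dataset(items, train_frac=0.85, valid_frac=0.10, seed=0):
    """Shuffle and split a dataset into train/validation/test subsets,
    mirroring the 85-10-5 target split. Counts may differ slightly from
    Roboflow's reported 83-12-5 outcome for the same dataset."""
    items = list(items)
    random.Random(seed).shuffle(items)  # deterministic shuffle for the sketch
    n = len(items)
    n_train = round(n * train_frac)
    n_valid = round(n * valid_frac)
    return (items[:n_train],
            items[n_train:n_train + n_valid],
            items[n_train + n_valid:])

train, valid, test = split_dataset(range(162))  # the 162 augmented images
```

For n = 162 this yields 138/16/8 images, with the remainder assigned to the test subset.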
Lastly, the object loss graph monitors the model’s confidence in identifying objects, with decreasing values showcasing growing reliability in distinguishing germinating seeds from background noise.

3. Results and Discussions

This section analyzes the outcomes of implementing the crop germination detection and remote monitoring system within the 2U CubeSat using the proposed framework. It presents data from various methodologies, including local web server performance, AWS IoT Core effectiveness for data management and remote monitoring, and deep learning model evaluation for seedling detection. The analysis emphasizes the significance of agricultural innovation and space research, demonstrating how advanced technologies can assist in real-time monitoring [47] and data collection.

3.1. Local Web Server Based on ESP32

After building the Arduino code in PlatformIO (v6.1.15) using VS Code to dynamically update the local web server with information, sensor, and image data, and uploading the HTML file as an image to the ESP32-CAM system, the local server can be accessed by typing the server name “archerserver.local” into the browser’s address bar. The use of a unique DNS name was enabled by adding the multicast DNS (mDNS) protocol to the Arduino code. This removes the need to manually look up and type the dynamic IP address, which usually changes whenever the router to which the ESP32 devices are connected restarts, enabling zero-configuration networking.
Using the WebSocket communication protocol, dynamic updates from the ESP32-CAM to the web server were successfully established. This protocol enabled real-time, bidirectional communication between a client (a web browser) and the server over a single, persistent TCP connection to the microcontroller. This surpasses the traditional HTTP request-response model, in which both parties must repeatedly re-establish connections to exchange messages. It provided real-time remote updates of the date, time, data ID number, filename of the latest saved image, sensor readings for temperature, humidity, and heat index, LED and fogger states, and an image of the current state of the germination bed. Image transfer through the WebSocket protocol was made possible by encoding the image file in Base64 format, which the web server automatically decodes to display the image. An example output of the web application is presented in Figure 12.
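The Base64 round trip described above can be sketched as follows. This is a minimal Python illustration of the encoding performed on the ESP32-CAM side and the decoding done by the web page; the actual firmware implements the same idea in C++.

```python
import base64

def encode_frame(image_bytes: bytes) -> str:
    """Base64-encode raw JPEG bytes so the image can travel inside a
    text WebSocket message (the step performed on the camera side)."""
    return base64.b64encode(image_bytes).decode("ascii")

def decode_frame(b64_text: str) -> bytes:
    """Reverse the encoding, as the web client does before rendering."""
    return base64.b64decode(b64_text)

# Stand-in bytes for a JPEG frame (a real frame starts with FF D8).
frame = b"\xff\xd8\xff\xe0" + b"\x00" * 16
restored = decode_frame(encode_frame(frame))
```

The round trip is lossless: `restored` is byte-for-byte identical to `frame`, at the cost of roughly 33% transfer overhead inherent to Base64.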

3.2. Secure Remote Monitoring Using AWS IoT

Data monitoring and storage using AWS IoT Core was effectively implemented, enabling a comprehensive framework for remote monitoring and data management through the ESP32-CAM. The system facilitated bidirectional communication by setting up MQTT topics for publishing sensor data, such as temperature and humidity, while also subscribing to relevant topics for receiving updates from the cloud. This capability ensures that users can both send and receive critical information in real time. However, the feature of sending data or commands back to the microcontroller is not yet utilized in the current configuration. The objective of data management and accessibility is also met, as real-time sensor data are stored in the cloud, making them readily accessible for monitoring and analysis from any location. A snippet of the topic being published and a sample of received data in JSON format are exhibited in Figure 13.
By connecting the device to AWS IoT Core via the MQTT protocol, users can achieve remote monitoring and data transmission, allowing for secure and efficient data transfer beyond local networks. This is complemented by the secure device integration objective, where the ESP32-Cam is configured as a “Thing” in AWS IoT Core, utilizing SSL/TLS encryption to ensure that all communications maintain data integrity and security.

3.3. Performance and Evaluation of the Roboflow 3.0 Model for Germination Detection

The evaluation of the Roboflow model for crop germination detection begins with analyzing the training graphs, which offer valuable insights into the model’s learning process. These graphs track the progression of key loss metrics and demonstrate how well the model improves over time.
The mAP graph in Figure 14a shows a steady increase over training epochs, indicating the model’s growing accuracy in detecting germinating seeds. The stabilization of the mAP curve at a high value during training indicates that the model has effectively converged, successfully distinguishing between germinated and non-germinated seeds. Further training would likely yield minimal improvement. The model achieves an impressive mAP of 99.5%, a strong indication of its ability to correctly predict germination across multiple overlap thresholds between predicted and actual bounding boxes.
The box loss graph in Figure 14b exhibits fluctuations in the early stages of training, showing that the model is refining its ability to localize germinating seeds by adjusting the bounding boxes. These fluctuations reflect the model’s exploration of optimal bounding box placements, while the subsequent decline indicates that the model is progressively improving, ensuring that the detected bounding boxes closely match the actual seed locations.
The class loss graph shown in Figure 14c exhibited a rapid decrease in the early epochs, suggesting that the model quickly learns to detect the germinated seeds. This sharp reduction in class loss indicates that the model is efficiently learning the classification task. The stable low value nearing the end implies that the model has reached a strong understanding of the class distinctions.
The object loss graph presented in Figure 14d shows initial fluctuations as the model learns to detect germinated seeds and adjusts its predictions. These fluctuations occur as the model refines its understanding of localization and classification. As training progressed, the object loss decreased steadily, signaling that the model was becoming increasingly proficient in detecting seedlings.
After the training, validation, and testing processes, the Roboflow model is evaluated by analyzing key performance metrics [48] such as mean average precision (mAP), precision, and recall. These metrics are instrumental in understanding the resulting model’s efficacy in accurately detecting and classifying germinated seeds. The model achieves an outstanding mAP of 99.5%, reflecting its high detection accuracy across various overlap thresholds between predicted and actual bounding boxes. This is further supported by a precision of 99.9%, which indicates a minimal false-positive rate, ensuring that almost every prediction of germination is correct. The model’s recall of 100.0% highlights its perfect sensitivity, capturing every instance of seed germination without missing any true positives. These metrics establish the model’s robustness and reliability in accurately identifying germinated seeds, providing a highly accurate tool for crop germination detection. This is demonstrated in Figure 15, which shows one of the images used in the testing phase of the object detection model. When running the model on a video compiled from the images, the confidence threshold was set at 70% to accurately distinguish the germinated seeds, while the overlap threshold was maintained at the default level of 50%.
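The confidence-threshold step above can be sketched as a simple post-processing filter. This is an illustrative Python sketch; the detection dictionaries are a simplified, hypothetical stand-in for the JSON predictions a hosted detection model such as Roboflow’s returns, not its exact response schema.

```python
def filter_detections(detections, conf_threshold=0.70):
    """Keep only predictions at or above the confidence threshold,
    the post-processing applied when running the germination model."""
    return [d for d in detections if d["confidence"] >= conf_threshold]

# Hypothetical predictions in a simplified detection-output shape.
preds = [
    {"class": "seedling", "confidence": 0.98},
    {"class": "seedling", "confidence": 0.74},
    {"class": "seedling", "confidence": 0.55},  # below threshold, dropped
]
kept = filter_detections(preds)
```

With the 70% threshold used in this study, only the first two detections survive; raising the threshold trades recall for precision.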
The descriptive statistics of the actual and detected seedling count values from the training, validation, and testing processes for the germination detection model are summarized in Table 3, Table 4 and Table 5.
The confusion matrix result for the germination model is illustrated in Table 6. The columns correspond to the actual object counts while the rows correspond to the predicted objects using the model deployed in Roboflow.
The first cell in the matrix shows that 395 of the aggregated seedling counts from the image datasets were correctly predicted by the model. The second cell shows a false-positive value of 1, meaning that one non-seedling object was incorrectly predicted as a seedling. The third cell corresponds to a false-negative value of 1, meaning that one actual seedling was not detected. The last cell value of 0 indicates that there were no true negatives. From this matrix, it can readily be deduced that the model yielded high accuracy and precision. The corresponding equations are given in (1).
Accuracy = (TP + TN)/(TP + FP + TN + FN)
Precision = TP/(TP + FP)
Recall = TP/(TP + FN)
where TP is true positive, FP is false positive, FN is false negative, and TN is true negative.
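As an illustration, Equation (1) applied to the Table 6 counts can be sketched in Python. Note that these aggregate values are computed from the confusion matrix alone and differ slightly from the Roboflow-reported validation metrics cited earlier.

```python
def accuracy(tp, fp, tn, fn):
    # Accuracy = (TP + TN) / (TP + FP + TN + FN), per Equation (1)
    return (tp + tn) / (tp + fp + tn + fn)

def precision(tp, fp):
    # Precision = TP / (TP + FP)
    return tp / (tp + fp)

def recall(tp, fn):
    # Recall = TP / (TP + FN)
    return tp / (tp + fn)

# Aggregated counts read from the confusion matrix in Table 6
tp, fp, fn, tn = 395, 1, 1, 0
acc = accuracy(tp, fp, tn, fn)
prec = precision(tp, fp)
rec = recall(tp, fn)
```

For these counts, accuracy evaluates to about 0.995 and both precision and recall to about 0.997.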
In comparison with existing studies on seedling detection using machine learning models, the proposed model achieved the highest mAP, as shown in Table 7.

4. Conclusions and Recommendations

In conclusion, this study successfully developed a cloud-based crop germination detection and real-time remote monitoring system tailored for a 2U CubeSat environment following the proposed framework, with evaluation metric values of 99% and above for mAP, precision, and recall. By integrating IoT technologies, deep learning, and cloud platforms such as AWS IoT and Roboflow, this research addresses critical challenges in space agricultural practices, particularly the need for efficient monitoring solutions [52,53,54]. Transmitting data across heterogeneous networks is one of these prominent problems; maintaining data quality is another. This calls for a mechanism to process and relocate data off the onboard computing machine in the shortest time possible given the payload constraints. The implementation of an ESP32-based system for data collection and the potential for deploying a machine vision model for automated germination detection demonstrated significant advancements in real-time monitoring capabilities. These innovations not only enhance the functionality of CubeSat missions, whether in on-Earth simulations or in space, but also contribute to the broader field of sustainable agriculture in extreme environments. Overall, this research highlights the potential of combining advanced technologies to facilitate plant growth in microgravity, ultimately supporting long-term human habitation in space.
Given the promising results, future research should focus on expanding the system’s capabilities, such as integrating additional sensors and extending the proposed hybrid framework to advance nanosatellite missions toward an IoST architecture. Moreover, further optimization and actual deployment of the machine vision model could enhance its adaptability to different crop stages, types or varieties, and environmental conditions, thereby increasing its applicability across various space missions and even in conventional applications such as greenhouses or aquaponic systems. It is also recommended to explore the integration of autonomous data collection and decision-making systems, enabling greater operational efficiency in remote or resource-limited settings. Lastly, conducting field tests in simulated microgravity environments on Earth could provide additional insight into the system’s performance and reliability under space-like conditions, helping to mitigate space-specific anomalies such as radiation, mechanical vibration, and power fluctuations that could affect internal payload components and further solidifying its potential for space farming applications.

Author Contributions

Conceptualization, A.G.J. and M.G.; methodology, software, visualization, and writing—original draft preparation, A.G.J.; resources and supervision, R.C.II and M.G.; writing—review and editing, formal analysis, and funding acquisition, A.F. and M.G. All authors have read and agreed to the published version of the manuscript.

Funding

The article processing charge (APC) for this research was funded by De La Salle University through its Science Foundation publication development grant.

Data Availability Statement

Data can be requested from the corresponding author as needed.

Acknowledgments

The authors wish to thank the Department of Manufacturing Engineering and Management and the Department of Mechanical Engineering of De La Salle University for the administrative and technical support with the use of laboratory equipment and spaces. Special thanks to Engr. Valencia for the help in designing the CubeSat used in this study. Lastly, the peer reviewers and the editorial board are very much appreciated for their significant contributions to the improvement of this article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Krakos, A. Lab-on-chip technologies for space research—Current trends and prospects. Microchim. Acta 2023, 191, 31. [Google Scholar]
  2. Medina, F.J. Space explorers need to be space farmers: What we know and what we need to know about plant growth in space. Mètode Sci. Stud. J.-Annu. Rev. 2021, 11, 55–62. [Google Scholar]
  3. Mortimer, J.C.; Gilliham, M. SpaceHort: Redesigning plants to support space exploration and on-earth sustainability. Curr. Opin. Biotechnol. 2022, 73, 246–252. [Google Scholar] [CrossRef] [PubMed]
  4. Puig-Suari, J.; Turner, C.; Ahlgren, W. Development of the standard CubeSat deployer and a CubeSat class PicoSatellite. In Proceedings of the 2001 IEEE Aerospace Conference Proceedings (Cat. No. 01TH8542), Big Sky, MT, USA, 10–17 March 2001; pp. 1–347. [Google Scholar]
  5. Kawa, B.; Śniadek, P.; Walczak, R.; Dziuban, J. Nanosatellite Payload for Research on Seed Germination in a 3D Printed Micropot. Sensors 2023, 23, 1974. [Google Scholar] [CrossRef]
  6. Zhang, J.; Su, J.; Wang, C.; Sun, Y. Modular design and structural optimization of CubeSat separation mechanism. Acta Astronaut. 2024, 225, 758–767. [Google Scholar]
  7. Nagavarapu, S.C.; Mogan, L.B.; Chandran, A.; Hastings, D.E. CubeSats for space debris removal from LEO: Prototype design of a robotic arm-based deorbiter CubeSat. Adv. Space Res. 2024; in press. [Google Scholar] [CrossRef]
  8. Mahajan, H.B.; Badarla, A. Application of Internet of Things for Smart Precision Farming: Solutions & Challenge. Int. J. Adv. Sci. Technol. 2018, 2018, 37–45. [Google Scholar]
  9. Bhanumathi, V.; Kalaivanan, K. The Role of Geospatial Technology with IoT for Precision Agriculture. In Cloud Computing for Geospatial Big Data Analytics; Das, H., Barik, R.K., Dubey, H., Roy, D.S., Eds.; in Studies in Big Data; Springer International Publishing: Cham, Switzerland, 2019; Volume 49, pp. 225–250. [Google Scholar]
  10. Capogrosso, L.; Cunico, F.; Cheng, D.S.; Fummi, F.; Cristani, M. A Machine Learning-Oriented Survey on Tiny Machine Learning. IEEE Access 2024, 12, 23406–23426. [Google Scholar] [CrossRef]
  11. Kua, J.; Loke, S.W.; Arora, C.; Fernando, N.; Ranaweera, C. Internet of things in space: A review of opportunities and challenges from satellite-aided computing to digitally-enhanced space living. Sensors 2021, 21, 8117. [Google Scholar] [CrossRef]
  12. Arzo, S.T.; Sikeridis, D.; Devetsikiotis, M.; Granelli, F.; Fierro, R.; Esmaeili, M.; Akhavan, Z. Essential technologies and concepts for massive space exploration: Challenges and opportunities. IEEE Trans. Aerosp. Electron. Syst. 2022, 59, 3–29. [Google Scholar]
  13. Witczak, D.; Szymoniak, S. Review of Monitoring and Control Systems Based on Internet of Things. Appl. Sci. 2024, 14, 8943. [Google Scholar] [CrossRef]
  14. Singh, P.; Krishnamurthi, R. IoT-based real-time object detection system for crop protection and agriculture field security. J. Real-Time Image Proc. 2024, 21, 106. [Google Scholar] [CrossRef]
  15. Miao, J.; Rajasekhar, D.; Mishra, S.; Nayak, S.K.; Yadav, R. A Microservice-Based Smart Agriculture System to Detect Animal Intrusion at the Edge. Future Internet 2024, 16, 296. [Google Scholar] [CrossRef]
  16. Babu, B.R.; Khan, P.M.A.; Vishnu, S.; Raju, K.L. Design and Implementation of an IoT-Enabled Remote Surveillance Rover for Versatile Applications. In Proceedings of the 2022 IEEE Conference on Interdisciplinary Approaches in Technology and Management for Social Innovation (IATMSI), Gwalior, India, 21–23 December 2022; pp. 1–6. [Google Scholar]
  17. Farooq, H.; Rehman, H.U.; Javed, A.; Shoukat, M.; Dudely, S. A Review on Smart IoT Based Farming. Ann. Emerg. Technol. Comput. 2020, 4, 17–28. [Google Scholar]
  18. Marzioli, P.; Gugliermetti, L.; Santoni, F.; Delfini, A.; Piergentili, F.; Nardi, L.; Metelli, G.; Benvenuto, E.; Massa, S.; Bennici, E.; et al. CultCube: Experiments in autonomous in-orbit cultivation on-board a 12-units CubeSat platform. Life Sci. Space Res. 2020, 25, 42–52. [Google Scholar]
  19. Fortino, G.; Guerrieri, A.; Pace, P.; Savaglio, C.; Spezzano, G. IoT Platforms and Security: An Analysis of the Leading Industrial/Commercial Solutions. Sensors 2022, 22, 2196. [Google Scholar] [CrossRef]
  20. Debauche, O.; Mahmoudi, S.; Manneback, P.; Lebeau, F. Cloud and distributed architectures for data management in agriculture 4.0: Review and future trends. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 7494–7514. [Google Scholar]
  21. Liu, W.; Wu, M.; Wan, G.; Xu, M. Digital Twin of Space Environment: Development, Challenges, Applications, and Future Outlook. Remote Sens. 2024, 16, 3023. [Google Scholar] [CrossRef]
  22. Shymanovich, T.; Kiss, J.Z. Conducting Plant Experiments in Space and on the Moon. In Plant Gravitropism; Blancaflor, E.B., Ed.; Methods in Molecular Biology; Springer: New York, NY, USA, 2022; Volume 2368, pp. 165–198. [Google Scholar] [CrossRef]
  23. Jiménez, D.A.; Reyna, A.; Balderas, L.I.; Panduro, M.A. Design of 4 × 4 Low-Profile Antenna Array for CubeSat Applications. Micromachines 2023, 14, 180. [Google Scholar] [CrossRef]
  24. Sathasivam, M.; Hosamani, R.; Swamy, B.K. Plant responses to real and simulated microgravity. Life Sci. Space Res. 2021, 28, 74–86. [Google Scholar]
  25. Poghosyan, A.; Golkar, A. CubeSat evolution: Analyzing CubeSat capabilities for conducting science missions. Prog. Aerosp. Sci. 2017, 88, 59–83. [Google Scholar]
  26. Bandyopadhyay, S.; Foust, R.; Subramanian, G.P.; Chung, S.J.; Hadaegh, F.Y. Review of Formation Flying and Constellation Missions Using Nanosatellites. J. Spacecr. Rocket. 2016, 53, 567–578. [Google Scholar]
  27. Arneodo, F.; Di Giovanni, A.; Marpu, P. A Review of Requirements for Gamma Radiation Detection in Space Using CubeSats. Appl. Sci. 2021, 11, 2659. [Google Scholar] [CrossRef]
  28. Dixon, D.J.; Callow, J.N.; Duncan, J.M.A.; Setterfield, S.A.; Pauli, N. Satellite prediction of forest flowering phenology. Remote Sens. Environ. 2021, 255, 112197. [Google Scholar] [CrossRef]
  29. Carillo, P.; Morrone, B.; Fusco, G.M.; De Pascale, S.; Rouphael, Y. Challenges for a Sustainable Food Production System on Board of the International Space Station: A Technical Review. Agronomy 2020, 10, 687. [Google Scholar] [CrossRef]
  30. Kyriacou, M.C.; De Pascale, S.; Kyratzis, A.; Rouphael, Y. Microgreens as a component of space life support systems: A cornucopia of functional food. Front. Plant Sci. 2017, 8, 1587. [Google Scholar]
  31. Przybyla, C. Space aquaculture: Prospects for raising aquatic vertebrates in a bioregenerative life-support system on a lunar base. Front. Astron. Space Sci. 2021, 8, 699097. [Google Scholar]
  32. Johansen, K.; Ziliani, M.G.; Houborg, R.; Franz, T.E.; McCabe, M.F. CubeSat constellations provide enhanced crop phenology and digital agricultural insights using daily leaf area index retrievals. Sci. Rep. 2022, 12, 5244. [Google Scholar]
  33. Falk, K.G.; Jubery, T.Z.; Mirnezami, S.V.; Parmley, K.A.; Sarkar, S.; Singh, A.; Ganapathysubramanian, B.; Singh, A.K. Computer vision and machine learning enabled soybean root phenotyping pipeline. Plant Methods 2020, 16, 5. [Google Scholar] [CrossRef]
  34. Dhanya, V.G.; Subeesh, A.; Kushwaha, N.L.; Vishwakarma, D.K.; Kumar, T.N.; Ritika, G.; Singh, A.N. Deep learning based computer vision approaches for smart agricultural applications. Artif. Intell. Agric. 2022, 6, 211–229. [Google Scholar]
  35. Fuentes-Peñailillo, F.; Carrasco Silva, G.; Pérez Guzmán, R.; Burgos, I.; Ewertz, F. Automating Seedling Counts in Horticulture Using Computer Vision and AI. Horticulturae 2023, 9, 1134. [Google Scholar] [CrossRef]
  36. Mahendrakar, T.; White, R.T.; Tiwari, M.; Wilde, M. Unknown Non-Cooperative Spacecraft Characterization with Lightweight Convolutional Neural Networks. J. Aerosp. Inf. Syst. 2024, 21, 455–460. [Google Scholar] [CrossRef]
  37. Colmer, J.; O’neill, C.M.; Wells, R.; Bostrom, A.; Reynolds, D.; Websdale, D.; Shiralagi, G.; Lu, W.; Lou, Q.; Cornu, T.L.; et al. Methods SeedGerm: A cost-effective phenotyping platform for automated seed imaging and machine-learning based phenotypic analysis of crop seed germination. New Phytol. 2020, 228, 778–793. [Google Scholar] [PubMed]
  38. Thilakarathne, N.N.; Bakar, M.S.A.; Abas, P.E.; Yassin, H. Towards making the fields talks: A real-time cloud enabled iot crop management platform for smart agriculture. Front. Plant Sci. 2023, 13, 1030168. [Google Scholar]
  39. Paul, A.; Machavaram, R.; Kumar, D.; Nagar, H. Smart solutions for capsicum Harvesting: Unleashing the power of YOLO for Detection, Segmentation, growth stage Classification, Counting, and real-time mobile identification. Comput. Electron. Agric. 2024, 219, 108832. [Google Scholar]
  40. Priyadarshini, I.; Bhola, B.; Kumar, R.; So-In, C. A Novel Cloud Architecture for Internet of Space Things (IoST). IEEE Access 2022, 10, 15118–15134. [Google Scholar]
  41. Akyildiz, I.F.; Kak, A. The Internet of Space Things/CubeSats. IEEE Network 2019, 33, 212–218. [Google Scholar] [CrossRef]
  42. Fernando, A.; Lim, L.; Bandala, A.; Vicerra, R.; Dadios, E.; Guillermo, M.; Naguib, R. Simulated vs Actual Application of Symbiotic Model on Six Wheel Modular Multi-Agent System for Linear Traversal Mission. J. Adv. Comput. Intell. Intell. Inform. 2024, 28, 12–20. [Google Scholar]
  43. Fernando, A.H.; Guillermo, M.A.; Lim, L.A.; Bandala, A.A.; Vicerra, R.R.; Dadios, E.P.; Naguib, R.N. Load Pushing Capacity Analysis of Individual and Multi-Cooperative Mobile Robot through Symbiotic Application. Int. J. Mech. Eng. Robot. Res. 2024, 13, 304–313. [Google Scholar]
  44. Bedruz, R.A.R.; Maningo, J.M.Z.; Fernando, A.H.; Bandala, A.A.; Vicerra, R.R.P.; Dadios, E.P. Dynamic Peloton Formation Configuration Algorithm of Swarm Robots for Aerodynamic Effects Optimization. In Proceedings of the 2019 7th International Conference on Robot Intelligence Technology and Applications (RiTA), Daejeon, Republic of Korea, 1–3 November 2019; pp. 264–267. [Google Scholar] [CrossRef]
  45. Pearman, J.; Suissa, A.; Klingler, M.; Fernandes, A.; Curet, O.; Zhang, X.H. Analyzing Plant Growth After a Flight to an Altitude of 30,000 Meters via a CubeSat Satellite Carried by a High-Altitude Balloon. Fla. Atl. Univ. Stud. Res. J. 2024, 8, 49–54. [Google Scholar]
  46. Paule, J.M.; Roca, J.R.; Subia, K.M.; Tiong, T.J.T.; Guillermo, M.; de Veas-Abuan, D. Integration of AWS and Roboflow Mask R-CNN Model for a Fully Cloud-Based Image Segmentation Platform. In Proceedings of the 2023 IEEE 15th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), Coron, Palawan, Philippines, 19–23 November 2023; pp. 1–6. [Google Scholar] [CrossRef]
  47. Pascua, A.R.A.; Rivera, M.; Guillermo, M.; Bandala, A.; Sybingco, E. Face Recognition and Identification Using Successive Subspace Learning for Human Resource Utilization Assessment. In Proceedings of the 2022 13th International Conference on Information and Communication Technology Convergence (ICTC), Jeju Island, Republic of Korea, 19–21 October 2022; pp. 1375–1380. [Google Scholar] [CrossRef]
  48. Fernando, A.H.; Maglaya, A.B.; Ubando, A.T. Optimization of an algae ball mill grinder using artificial neural network. In Proceedings of the 2016 IEEE Region 10 Conference (TENCON), Singapore, 22–25 November 2016; pp. 3752–3756. [Google Scholar] [CrossRef]
  49. Liu, Y.; Guo, L.; Ren, X.; Li, G.; Zhang, Y. Rapid and automatic counting of sorghum seedlings from UAV images based on machine learning and deep learning. Front. Plant Sci. 2024, 15, 1399872. [Google Scholar]
  50. Sheng, R.T.-C.; Huang, Y.-H.; Chan, P.-C.; Bhat, S.A.; Wu, Y.-C.; Huang, N.-F. Rice Growth Stage Classification via RF-Based Machine Learning and Image Processing. Agriculture 2022, 12, 2137. [Google Scholar] [CrossRef]
  51. Xu, R.; Zhang, Y.; Cheng, T.; Ma, X.; Guo, W. Recognizing rice seedling growth stages to timely do field operations based on UAV imagery. Front. Plant Sci. 2022, 13, 914771. [Google Scholar]
  52. Agoo, J.; Lanuza, R.J.; Lee, J.; Rivera, P.A.; Velasco, N.O.; Guillermo, M.; Fernando, A. Geographic Information System-Based Framework for Sustainable Small and Medium-Sized Enterprise Logistics Operations. ISPRS Int. J. Geo-Inf. 2025, 14, 1. [Google Scholar] [CrossRef]
  53. Amante, K.; Ho, L.; Lay, A.; Tungol, J.; Maglaya, A.; Fernando, A. Design, fabrication, and testing of an automated machine for the processing of dried water hyacinth stalks for handicrafts. In Proceedings of the IOP Conference Series: Materials Science and Engineering, Sanya, China, 12–14 November 2021; IOP Publishing: Bristol, UK, 2021; Volume 1109, p. 012008. [Google Scholar]
  54. Fernando, A.; GanLim, L. Velocity analysis of a six wheel modular mobile robot using MATLAB-Simulink. In Proceedings of the IOP Conference Series: Materials Science and Engineering, Sanya, China, 12–14 November 2021; IOP Publishing: Bristol, UK, 2021; Volume 1109. [Google Scholar]
Figure 1. CubeSat (a) physical subsystems and (b) data communications to ground stations.
Figure 2. ARCHER (Agricultural CubeSat for Horticulture Experiments and Research) version 1.
Figure 3. Space farming data analysis using a cloud computing framework.
Figure 4. The proposed framework expanded to IoST architecture.
Figure 5. Framework used in an existing CubeSat mission for alfalfa and tobacco plant growth [45].
Figure 6. Process of data transfer using ESP-NOW technology.
Figure 7. Arduino code snippet for initializations of data structure and transmitting data via ESP-NOW.
Figure 8. Data structure in the SD card with filename ARCHER_dataLog.log.
Figure 9. Framework of asynchronous live updates in the local web server using WebSocket.
Figure 10. Germinating bed for the ARCHER CubeSat with the ESP32-CAM placed on top and the DHT11 sensor at the back.
Figure 11. Manual annotation performed in Roboflow where germinated crops are labeled as “seedling” with three different light conditions such as (a) no light, and two different levels of exposure as in (b,c).
Figure 12. The web application for remote monitoring of the ARCHER CubeSat.
Figure 13. The AWS IoT Core console where a user subscribes to the “dlsuARCHER/pub” MQTT topic, displaying real-time sensor data in JSON format.
Figure 14. Training graphs for the Roboflow 3.0 seedling detection model: (a) mAP graph, (b) box loss graph, (c) class loss graph, and (d) object loss graph.
Figure 14. Training graphs for the Roboflow 3.0 seeding detection model: (a) mAP graph, (b) box loss graph, (c) class loss graph, and (d) object loss graph.
Figure 15. Detection of seedlings using the developed Roboflow 3.0 model, with detection confidence ranging from 74% to 98%.
Table 1. Key Libraries used in Web Server Setup.
| Library Header | Function |
| --- | --- |
| WiFi.h | Provides functions to connect the ESP32 to a Wi-Fi network, enabling network communication. |
| WebServer.h | Allows the ESP32 to act as a web server, handling HTTP requests and serving web pages and files. |
| WebSocketsServer.h | Enables real-time, bidirectional communication between the ESP32 and the client via WebSockets. |
| FS.h | Provides file system functionality for managing files on the ESP32’s internal flash memory. |
| LittleFS.h | A lightweight file system used for storing files (e.g., images) on the ESP32’s internal storage. |
| ArduinoJson.h | Used to format and parse JSON data for easy transmission between the ESP32 and the client. |
Table 2. Preprocessing and Augmentation Parameters for the Germination Detection Model.
| Category | Parameter | Details |
| --- | --- | --- |
| Preprocessing | Auto-Orient | Applied |
| Preprocessing | Resize | Stretch to 640 × 640 |
| Augmentations | Outputs per Training Example | 2 |
| Augmentations | Grayscale | Apply to 100% of images |
| Augmentations | Saturation | Between −72% and +72% |
| Augmentations | Brightness | Between −38% and +38% |
| Augmentations | Blur | Up to 2.1 px |
| Augmentations | Noise | Up to 1.6% of pixels |
Table 3. Summary Statistics Parameters for Germination Detection Model Training.
| Statistics Parameters | Training Dataset | Actual Seedling | Detected Seedling |
| --- | --- | --- | --- |
| Mean | 67.5 | 2.947761 | 2.947761 |
| Standard Error | 3.354102 | 0.172552 | 0.172552 |
| Median | 67.5 | 3 | 3 |
| Mode | N/A | 3 | 3 |
| Standard Deviation | 38.82654 | 1.997431 | 1.997431 |
| Sample Variance | 1507.5 | 3.989732 | 3.989732 |
| Kurtosis | −1.2 | −1.251445 | −1.251445 |
| Skewness | 2.03 × 10⁻¹⁷ | −0.001745 | −0.001745 |
| Range | 133 | 6 | 6 |
| Minimum | 1 | 0 | 0 |
| Maximum | 134 | 6 | 6 |
| Sum | 9045 | 395 | 395 |
| Count | 134 | 134 | 134 |
| Confidence Level (98.0%) | 7.897945 | 0.40631 | 0.40631 |
Table 4. Summary Statistics Parameters for Germination Detection Model Validation.
| Statistics Parameters | Validation Dataset | Actual Seedling | Detected Seedling |
| --- | --- | --- | --- |
| Mean | 152.5 | 2.65 | 2.6 |
| Standard Error | 1.322876 | 0.334625 | 0.327671 |
| Median | 152.5 | 3 | 3 |
| Mode | N/A | 4 | 1 |
| Standard Deviation | 5.91608 | 1.496487 | 1.46539 |
| Sample Variance | 35 | 2.239474 | 2.147368 |
| Kurtosis | −1.2 | −1.168687 | −1.021609 |
| Skewness | 0 | −0.060463 | 0.00446 |
| Range | 19 | 5 | 5 |
| Minimum | 143 | 0 | 0 |
| Maximum | 162 | 5 | 5 |
| Sum | 3050 | 53 | 52 |
| Count | 20 | 20 | 20 |
| Confidence Level (98.0%) | 3.35942 | 0.849774 | 0.832116 |
Table 5. Summary Statistics Parameters for Germination Detection Model Testing.
| Statistics Parameters | Testing Dataset | Actual Seedling | Detected Seedling |
| --- | --- | --- | --- |
| Mean | 138.5 | 4 | 4 |
| Standard Error | 0.866025 | 0.944911 | 0.944911 |
| Median | 138.5 | 5.5 | 5.5 |
| Mode | N/A | 6 | 6 |
| Standard Deviation | 2.44949 | 2.672612 | 2.672612 |
| Sample Variance | 6 | 7.142857 | 7.142857 |
| Kurtosis | −1.2 | −1.01584 | −1.01584 |
| Skewness | 0 | −0.957864 | −0.957864 |
| Range | 7 | 6 | 6 |
| Minimum | 135 | 0 | 0 |
| Maximum | 142 | 6 | 6 |
| Sum | 1108 | 32 | 32 |
| Count | 8 | 8 | 8 |
| Confidence Level (98.0%) | 2.596302 | 2.832798 | 2.832798 |
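The summary statistics in Tables 3-5 follow the standard descriptive-statistics relations, which can be cross-checked against the testing split (n = 8): the standard error is SE = s/√n, and the 98% confidence level is t·SE, where t is the two-tailed Student's t critical value at α = 0.02 with n − 1 = 7 degrees of freedom (t ≈ 2.998, a standard tabulated value used here as an assumption, since the original spreadsheet is not available).

```python
import math

# Reported values from Table 5 (testing split, detected seedlings).
s, n = 2.672612, 8

se = s / math.sqrt(n)    # standard error, SE = s / sqrt(n)
t_crit = 2.998           # two-tailed t critical value, alpha = 0.02, df = 7
cl98 = t_crit * se       # 98% confidence level reported in the table

print(f"SE = {se:.6f}, 98% CL = {cl98:.6f}")
# Both agree with Table 5's 0.944911 and 2.832798 within rounding.
```

The same relations reproduce the training and validation rows as well, confirming the tables were generated with conventional sample statistics.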
Table 6. Germination Detection Model Confusion Matrix Result.
| | Actual Seedling | Actual Background |
| --- | --- | --- |
| Predicted Seedling | 395 | 1 |
| Predicted Background | 1 | 0 |
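From the raw counts in Table 6 (395 true positives, 1 background region predicted as a seedling, and 1 missed seedling), precision and recall can be recomputed directly. Note that Roboflow's headline precision/recall figures are confidence-threshold dependent, so they need not coincide exactly with these raw-count values.

```python
# Counts taken from the Table 6 confusion matrix.
tp, fp, fn = 395, 1, 1

precision = tp / (tp + fp)  # fraction of detections that were real seedlings
recall = tp / (tp + fn)     # fraction of real seedlings that were detected
f1 = 2 * precision * recall / (precision + recall)

print(f"precision={precision:.4f} recall={recall:.4f} f1={f1:.4f}")
# Both precision and recall come out near 0.9975 from these counts.
```

With only two misclassifications over 396 seedlings, the raw-count metrics are consistent with the near-ceiling performance reported in the abstract.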
Table 7. Proposed Model mAP Comparison with Existing Studies.
| Model | Environment | mAP |
| --- | --- | --- |
| Proposed Roboflow | Space | 99.50% |
| YOLOv8 [49] | UAV | 98.00% |
| RF (SMOTE-ENN) [50] | Space | 98.77% |
| EfficientNetB4 [51] | UAV | 99.47% |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Janairo, A.G.; Concepcion, R., II; Guillermo, M.; Fernando, A. A Cloud Computing Framework for Space Farming Data Analysis. AgriEngineering 2025, 7, 149. https://doi.org/10.3390/agriengineering7050149
