Article

Interoperable IoT/WSN Sensing Station with Edge AI-Enabled Multi-Sensor Integration for Precision Agriculture

1 C-MAST—Centre for Mechanical and Aerospace Science and Technologies, 6201-001 Covilhã, Portugal
2 Department of Electromechanical Engineering, University of Beira Interior, Calçada Fonte do Lameiro, 6200-358 Covilhã, Portugal
* Author to whom correspondence should be addressed.
Agriculture 2026, 16(1), 69; https://doi.org/10.3390/agriculture16010069
Submission received: 30 November 2025 / Revised: 25 December 2025 / Accepted: 26 December 2025 / Published: 28 December 2025
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)

Abstract

This study presents an in-depth exploration of an innovative monitoring system that contributes to precision agriculture (PA) and supports sustainability and biodiversity. Amidst the challenges of global population growth and the need for sustainable, high-yield agricultural practices, PA, supported by modern technology and data-driven methodologies, emerges as a pivotal approach for optimizing crop yield and resource management. The proposed monitoring system integrates wireless sensor networks (WSNs) into PA, enabling real-time acquisition of environmental data and multimodal observations through cameras and microphones, with data transmission via LTE and/or LoRaWAN for cloud-based analysis. Its main contribution is a physically modular, pole-mounted station architecture that simplifies sensor integration and reconfiguration across use cases, while remaining solar-powered for long-term off-grid operation. The system was evaluated in two field deployments, including a year-long wild-flora monitoring campaign (three stations; 365 days; 1870 images; 63–100% image-based operational availability), during which stations remained operational through a wildfire event. In the viticulture deployment, the acoustic module supported bat monitoring as a bio-indicator of ecosystem health, achieving bat call detection performance of 0.94 (AP Det) and species classification performance of 0.85 (mAP Class). Overall, the results support the use of modular, energy-aware monitoring stations to perform sustained agricultural and ecological data collection under practical field constraints.

1. Introduction

Agriculture and farming are vital for sustenance and human life and have witnessed a transformative shift in the wake of technological advancements. With the increase in global population, the demand for sustainable, high-yield agricultural practices has become imperative [1]. Precision Agriculture (PA), driven by modern technology and data-driven methodologies, emerges as a pivotal approach to optimize crop yield and resource management [2].
The integration of wireless sensor networks (WSNs) into PA holds immense potential. WSNs enable real-time data collection on various environmental parameters, providing farmers with a comprehensive understanding of their crops’ needs [3]. These networks serve multifaceted purposes, from early pest and disease detection to climate monitoring, and even facilitating precise management techniques like targeted pesticide application [4]. However, the effective utilization of WSNs in agriculture requires addressing crucial challenges, particularly concerning energy efficiency and routing protocols. Energy-efficient routing schemes are crucial to extend the lifespan of sensor networks, ensuring continuous and reliable data transmission [5,6].
Precision agriculture, a contemporary agricultural trend, seeks to optimize production efficiency while minimizing environmental impacts [7]. Its applications extend to epidemic disease control, mitigating the adverse effects of climate-induced diseases, and reducing the excessive use of chemical fungicides [8].
The convergence of Internet of Things (IoT) technologies with PA improves distributed monitoring and control systems, revolutionizing diverse application areas [9]. However, challenges persist, particularly concerning energy management and interoperability issues within heterogeneous sensor networks [10].
Efforts to standardize communication interfaces and establish consensus-based standards are key to simplifying the integration of diverse sensors, reducing complexities, and boosting interoperability in data acquisition networks [11,12].
The combination of PA, IoT technologies, and WSNs marks a substantial shift in the landscape of agricultural practices. Addressing energy-efficiency and standardization challenges is crucial to unlocking the full potential of these advancements for agricultural sustainability and productivity.
Despite these advances, current systems are predominantly single-task, lack hardware modularity, and do not support agro-ecological duality (simultaneous monitoring of production and biodiversity). In addition, most solutions rely on cloud processing and therefore face limitations in remote locations, where data integration via Edge AI is still lacking.
This study addresses these limitations by investigating the following research questions:
  • Is it possible for a single modular station with flexible hardware architecture to simultaneously support PA and biodiversity monitoring without the need for redesign?
  • Can the implementation of dual-level Edge AI ensure continuous autonomous operation in agricultural environments?
  • How can the energy autonomy of a modular system be ensured under variable computational loads throughout the year?
In this context, this article presents the LITecS station with two case studies: the Montanha Viva system and the BioD’Agro system. These systems were designed with a modular architecture that eliminates the need for redesign when adding new sensors. By combining modular hardware with artificial intelligence algorithms for specific tasks and verified energy autonomy, this solution proposes a new model for technically sustainable environmental and agricultural monitoring.
These systems represent cutting-edge solutions that integrate advanced sensor technology with sustainable energy solutions to address the evolving challenges faced by modern agriculture. The scientific contribution lies not in the individual hardware components or AI, but rather in the design of a new concept, a modular station, which can be adapted to different needs, thus filling the gap found in the scientific literature. The systems described in the literature focus on only one task. In contrast, the monitoring station proposed here was designed from the outset for dual agro-ecological purposes through the integration of multiple sensors that can be easily changed as needed, as shown in the two case studies described in the manuscript.
Through the analysis of these case studies, this paper describes the practical applications, technological features, and benefits of such integrated systems in improving agricultural resilience, biodiversity monitoring, and environmental sustainability.

2. Literature Review

The fusion of IoT, WSNs, and artificial intelligence (AI) is revolutionizing PA, addressing critical challenges, and leveraging new opportunities for sustainable farming practices. This review analyzes the state-of-the-art in key areas, including AI-driven monitoring, system interoperability, and energy efficiency, to identify the research gaps addressed by our work. Table 1 provides a comparative summary of relevant studies.
AI and IoT play a pivotal role in advanced agricultural monitoring, particularly for early disease detection. Systems like the one proposed by Khattab et al. [9] use environmental and soil data from WSNs to forecast diseases in crops such as tomatoes and potatoes, successfully minimizing chemical use and enhancing crop quality. Similarly, computational models like Convolutional Neural Networks (CNNs) have been integrated to improve classification accuracy for crop management. While powerful, these systems are often highly specialized for a single agricultural task, such as disease management or weed classification. This narrow focus represents a gap in developing integrated platforms that can serve both agronomic needs and broader ecological monitoring objectives, such as biodiversity assessment.
In addition to hardware integration, advances in non-destructive monitoring have been driven by new methods based on image analysis. A recent study by Tsaniklidis et al. [13] developed a method for estimating leaf area and greenness without direct contact with plants, enabling automated screening of underdeveloped plants before transplanting to ensure that plants reaching the field are of high quality. Their findings highlight that the yellow/blue colour channel (b*) serves as an indicator of SPAD (chlorophyll) values, enabling early detection of nitrogen deficiencies or biotic/abiotic stress. Similarly, Makraki et al. [14] identified that the cyan component (µC) of images can indicate Relative Water Content (RWC), enabling the detection of invisible dehydration before fruits show visible damage. These studies demonstrate the growing ability of ML-driven decision support to adjust fertilization and irrigation in real time. The station architecture we propose is designed to host these specialized algorithms, providing the modular hardware and cutting-edge processing power needed to translate these analytical findings into autonomous field operations.
The effective deployment of WSNs in PA hinges on solving challenges related to interoperability and energy management. To address the integration of diverse sensors, frameworks based on the IEEE 1451 standard [15] have been proposed to streamline data acquisition in heterogeneous networks. The work by Fernandes et al. [16] is particularly relevant, demonstrating a framework for managing diverse sensors in precision viticulture based on this standard.
Pandiyaraju et al. [17] focus on improving WSNs in PA by optimizing energy consumption in sensor nodes. Their work introduces a multi-objective clustering approach, a hybrid optimization technique, and CNN integration, improving performance metrics such as classification accuracy, throughput, packet delivery ratio, network lifetime, and energy consumption. Potential enhancements include a new routing mechanism, a lightweight optimizer, and a lightweight CNN to further refine the approach and improve crop yield.
Focusing on energy efficiency, recent innovations in WSNs employ multi-objective clustering methods and deep learning techniques to optimize energy usage. These methods enhance classification accuracy, extend network lifetimes, and reduce energy consumption, offering robust solutions to energy challenges in PA. Haseeb et al. [18] proposed an IoT-based WSN framework for smart agriculture, utilizing agricultural sensors to capture data and determine cluster heads. The framework measures signal strength and security, resulting in improved communication performance. Simulations show a 13.5% increase in network throughput, together with reductions of 38.5% in packet drop ratio, 13.5% in network latency, 16% in energy consumption, and 26% in routing overhead compared to other solutions.
Exploring genetic algorithm-based routing protocols, Patil & Kohle [19] emphasize the importance of high-scalability, low-latency solutions for improving WSN performance, particularly in time-sensitive applications. Their protocol proved suitable for highly distributed and rapidly expanding networks, outperforming conventional routing algorithms and achieving lower latency in highly scalable WSNs.
Fuentes-Peñailillo et al. [20] researched low-cost WSNs for vine phenology monitoring, especially flowering stages. The DHT22 sensor, chosen for its low cost and high accuracy, was integrated into a network with a central node and several remote nodes connected via the Zigbee communication protocol. This approach highlights the potential for cost-effective, accurate, and detailed measurement of spatial variability in agriculture.
Dehwah et al. [21] discuss the Universal Dynamic Weather-Conditioned Moving Average (UD-WCMA) as a new energy estimation and forecasting algorithm for solar-powered wireless sensor networks. UD-WCMA uses real-time data and historical energy patterns to improve prediction accuracy in diverse weather conditions. The system used custom-developed sensor boards powered by a 32-bit ARM Cortex M4 microcontroller and a 20Wp/20.8Voc solar panel.
Zhang et al. [22] focused on organic photovoltaic (OPV) energy harvesting systems and demonstrated the viability of OPV-powered WSNs in indoor settings, contributing to sustainable energy solutions for sensor networks. The system used sensors for temperature, humidity, air pressure, CO2 levels, a real-time clock, and battery voltage. The sensor nodes were powered by a LiPo battery and a flexible OPV panel module from InfinityPV (Jyllinge, Denmark). Data transmission used a ZigBee RF module for ease of deployment and a high data rate.
Foughali et al. [23] investigated a cost-effective WSN for potato farming, built on the NodeMCU IoT platform with a DHT11 sensor for temperature and humidity measurements. The system, powered by two 1.5 V batteries and housed in IP66-rated enclosures, enables efficient, low-cost monitoring of micro-climate conditions, which is crucial for managing and preventing late blight in potato crops.
These studies collectively emphasize the critical role of IoT, WSN, and AI in advancing PA. They highlight the importance of sensor integration, energy efficiency, AI-based forecasting, and cost-effective solutions in shaping future agricultural practices that are data-driven, efficient, and environmentally sustainable. The integration of these technologies facilitates the development of modular monitoring systems that are adaptable, scalable, and efficient, catering to the evolving needs of modern agriculture [24]. Along with the academic monitoring systems summarized in Table 1, Table 2 presents the commercial monitoring stations on the market and their comparison with the LITecS station.
Table 1. Comparative Analysis of Monitoring Systems in Precision Agriculture.
Study | Application | Key Technologies Used | Contributions | Validation Type | Limitations
Khattab et al. [9] | Early disease forecasting in crops | IoT, WSN, AI-driven data fusion | Reduces chemical usage; versatile for different plant disease models | Field deployment | No Edge AI (cloud-dependent); no imaging branch for visual symptom validation; specific to disease forecasting
Foughali et al. [23] | Late blight prevention in potatoes | IoT (NodeMCU), WSN (DHT11) | Cost-effective solution for micro-climate monitoring | Field prototype | Low-precision sensors (DHT11); relies on static mechanistic models (no AI); limited energy autonomy (battery powered)
Fuentes-Peñailillo et al. [20] | Vine phenology monitoring | Low-cost WSN (DHT22 sensor), Zigbee | Cost-effective approach for detailed spatial variability measurement | Field deployment | Lacks visual phenology validation (scalar data only); no biodiversity or acoustic monitoring; local storage limits real-time analytics
Pandiyaraju et al. [17] | Energy optimization in PA | Multi-objective clustering, CNN | Improves network lifetime, throughput, and classification accuracy | Simulation only | No hardware implementation; energy metrics based on theoretical models; computational complexity unverified on low-power edge devices
Fernandes et al. [16] | Precision viticulture (PV) | IEEE 1451 standard, WSN | Addresses data interoperability challenges in heterogeneous sensor networks | Field prototype | Focuses on software/protocol standard (IEEE 1451) complexity rather than holistic hardware autonomy; high protocol overhead
Haseeb et al. [18] | Smart agriculture | IoT, WSN, custom routing protocol | Improved network throughput and reduced energy use in simulations | Simulation only | Performance validated via simulation; no physical deployment or environmental hardening; assumes idealized link conditions
Table 2. Comparison of commercially available monitoring systems with the LITecS station.
Product | Application | Key Technologies Used | Contributions | Limitations | Cost
Davis Instruments Vantage Pro2 [25] | Acquisition and collection of meteorological data (basic meteorological/agronomic use) | Raw data recording; transmission to proprietary software; closed ecosystem | Simple and consolidated solution for meteorology | Proprietary expansion modules only; closed ecosystem; paid historical data | $$$
Onset HOBO RX3000 [26] | Meteorology, environmental research, and building management; remote monitoring via the cloud | Raw data logging; HOBOlink cloud platform; accepts multiple proprietary smart sensors | Plug-and-play proprietary sensors and integrated remote monitoring | Proprietary cloud (HOBOlink); proprietary “Smart Sensors” | $$
Rynan Smart Weather Station [27] | Agricultural meteorology and smart irrigation; real-time environmental monitoring and automatic data transmission for precision farming | IoT connectivity (LoRaWAN/4G); integrated meteorological sensors; cloud-based dashboard and data storage; automated data analytics and alerts | Enables real-time decision-making for irrigation and crop management; integrates with farm management systems; user-friendly online dashboard; low maintenance | Fixed sensor configuration; closed platform | Quote-based
Senseca HDMCS-100 [28] | Weather monitoring for agriculture, research, or field sites | Ultrasonic anemometer (no moving parts) for wind; integrated meteorological sensors; independent power and data link | Rapid deployment, low maintenance (no moving parts in anemometer), self-powered with remote data transmission | All-in-one (non-expandable); proprietary protocols; fixed hardware; dependent on GSM coverage | Quote-based
LITecS (this proposal) | Precision Agriculture (BioD’Agro) and biodiversity monitoring (Montanha Viva); agro + ecological acquisition with edge processing | Edge AI (pre-processing on ESP32; AI inference on Raspberry Pi); easy integration of third-party sensors; open source (FastAPI, MySQL); open GPIO/I2C/SPI (universal) | High modularity; integration of cameras/microphones and others; based on open-source software; user-owned data | Computer vision algorithms limited by dataset quality and variable field conditions | $
Note: $: <EUR 1000; $$: EUR 1000–1500; $$$: >EUR 1500.

3. Materials and Methods

This section describes the design and architecture of the LITecS Station, a modular monitoring system developed for precision agriculture and ecological applications. The system is designed to support long-term autonomous operation and reliable data collection under real-world agricultural conditions. Because each case study entails different deployment requirements, details such as the number and placement of monitoring stations, as well as other site-specific specifications, are provided within the respective case study sections. The present section, therefore, focuses on the station’s general operating principles, highlighting how its modular design enables straightforward adaptation across scenarios.

3.1. System Architecture and Core Components

The LITecS station is built on a modular pole design that allows flexible sensor integration, adapted for specific monitoring requirements (Figure 1). The system’s electronic architecture employs a two-tier processing strategy to improve energy efficiency.
An ESP32 microcontroller, selected for its low power consumption, manages continuous data acquisition from environmental sensors. For more computationally intensive tasks, such as image capture and data transmission, a Raspberry Pi single-board computer is used. This hierarchical design ensures that the higher-power Raspberry Pi is activated only when necessary, which conserves a significant amount of energy. A summary of the core hardware components is provided in Table 3.
A summary of the LITecS station’s architecture, including its hardware components and the data acquisition, processing, and visualization flow, is shown in Figure 2.
All digital sensors were used with the manufacturer’s factory calibration, and no additional laboratory calibration was performed prior to deployment. Field validation was limited to basic quality-control procedures, including range screening, step-change detection, and consistency checks between coupled variables (e.g., air temperature and relative humidity). In addition, the station’s meteorological time series were compared against expected regional conditions reported by the National Weather Institute (IPMA). The tipping-bucket rain gauge was levelled during installation. The temperature and humidity module was housed in a shaded, ventilated protective enclosure to reduce sun exposure and weather-related biases while maintaining exposure to ambient air. Cameras were installed in a dedicated protective housing with a brim to reduce rain and wind exposure and to limit direct solar glare that can degrade image quality. Camera placement and orientation were defined according to the target scene and the specific monitoring objective at each site.

3.2. Sensor Configuration and Data Acquisition

The main advantage of the LITecS station is its modularity, which allows for a customized sensor array. A representative configuration for comprehensive environmental monitoring includes a thermometer, pluviometer, anemometer, wind direction sensor, barometer, and hygrometer. Additional sensors for soil moisture or leaf wetness can be integrated as needed.
Data acquisition is managed by the ESP32, which samples all connected environmental sensors every 10 min. This sampling interval aligns with World Meteorological Organization (WMO) recommendations, ensuring data quality and comparability [29]. Wind speed and precipitation are computed as 10-min aggregates: wind is summarized as the mean and maximum values over the interval, and precipitation is reported as the accumulated tipping-bucket total. Other variables (e.g., temperature, humidity, and pressure) are sampled once per 10-min interval and reported at the corresponding timestamp. The resulting data packet is then transmitted to the Raspberry Pi for storage and downstream processing.
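The 10-min aggregation rule described above can be sketched in Python. The field names and per-sample record layout below are illustrative assumptions for the sketch, not the station’s actual payload schema:

```python
from statistics import mean

def aggregate_interval(samples):
    """Summarise one 10-min window of raw readings into the record sent to
    the Raspberry Pi. `samples` is a list of dicts of raw values; the field
    names are illustrative, not the station's real payload format."""
    wind = [s["wind_speed"] for s in samples]
    return {
        "wind_mean": round(mean(wind), 2),  # mean wind over the interval
        "wind_max": max(wind),              # maximum (gust) over the interval
        # precipitation: accumulated tipping-bucket total for the interval
        "rain_mm": sum(s.get("rain_tip_mm", 0.0) for s in samples),
        # scalar variables are reported as a single per-interval sample
        "temp_c": samples[-1]["temp_c"],
        "rh_pct": samples[-1]["rh_pct"],
    }
```

A usage example: two raw readings with wind speeds of 2.0 and 4.0 m/s and one 0.2 mm bucket tip would be reduced to `wind_mean=3.0`, `wind_max=4.0`, `rain_mm=0.2`.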

3.3. Power System and Energy Management

To ensure continuous, long-term operation in remote agricultural settings, the LITecS station is designed to be fully self-sufficient by harnessing solar energy. The system is powered by a solar panel and a rechargeable battery.
The maximum expected power was determined based on component manufacturer datasheets and technical documentation. These estimates offer a conservative upper bound for evaluating the adequacy of the power system.
The main energy management strategy is duty cycling of the Raspberry Pi. The low-power ESP32 runs continuously to collect sensor data, while the Raspberry Pi, the most power-intensive component, is kept off by default. The ESP32, which is responsible for power management of the image branch, activates the Raspberry Pi only for the short duration required to capture and transmit images, after which it is powered down again. This strategy is the primary mechanism for achieving the system’s long-term energy autonomy.
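The benefit of this duty cycling can be illustrated with a back-of-envelope energy model. The power draws and daily Raspberry Pi on-time below are assumed figures for the sketch, not measured values from the station:

```python
# Illustrative comparison of always-on vs. duty-cycled operation.
# All figures are assumptions for the sketch, not measured station values.
ESP32_AVG_W = 0.5          # assumed continuous draw of the sensing tier
PI_AVG_W = 5.0             # assumed draw of the Raspberry Pi while active
PI_ON_MIN_PER_DAY = 30     # assumed total daily on-time for capture/upload

# If the Raspberry Pi ran continuously alongside the ESP32:
always_on_wh = (ESP32_AVG_W + PI_AVG_W) * 24

# With duty cycling, the Pi contributes only during its short on-time:
duty_cycled_wh = ESP32_AVG_W * 24 + PI_AVG_W * PI_ON_MIN_PER_DAY / 60
```

Under these assumptions, duty cycling cuts the daily budget from 132 Wh to 14.5 Wh, which is why keeping the Pi off by default dominates the autonomy calculation.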
Based on this operation, the average daily consumption of the set was estimated at around 80 Wh/day, including losses in the voltage converters and solar controller. A 12 V/30 Ah battery was chosen. Considering the nominal stored energy and an assumed 80% depth of discharge (DoD), it provides on the order of 250–290 Wh of usable energy, which is adequate to support operation and absorb short-term variations in solar production. A 30 W solar panel was selected to match the system’s daily energy demand. Considering the average solar radiation and a south-facing installation with a 30° inclination, the panel can generate 60 to 90 Wh/day during the winter and higher values throughout the other seasons. Under winter conditions, this means the daily balance may range from slightly negative to near neutral, depending on weather, so occasional shutdowns can occur during extended periods of low irradiance, with operation resuming once solar production recovers and the battery recharges. This configuration provides a good balance between performance, size, and cost, while supporting stable operation for most of the year. The operational impact of these interruptions is discussed in the case-study results.
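As a sanity check, the sizing arithmetic above can be reproduced directly from the stated figures (12 V/30 Ah battery, 80% DoD, ~80 Wh/day load, 60–90 Wh/day winter generation):

```python
# Reproducing the power-budget arithmetic quoted in the text.
NOMINAL_WH = 12 * 30              # 12 V x 30 Ah battery -> 360 Wh nominal
USABLE_WH = NOMINAL_WH * 0.80     # 80% depth of discharge -> 288 Wh usable
LOAD_WH_PER_DAY = 80              # estimated consumption incl. converter losses

# Days of operation on battery alone, with no solar input at all:
autonomy_days = USABLE_WH / LOAD_WH_PER_DAY

# Winter daily balance for the 30 W panel (60-90 Wh/day generation):
winter_balance = [gen - LOAD_WH_PER_DAY for gen in (60, 90)]
```

The 288 Wh usable figure falls within the 250–290 Wh range stated above, gives roughly 3.6 days of battery-only autonomy, and the winter balance of −20 to +10 Wh/day explains why extended overcast spells can force occasional shutdowns.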

3.4. Data Processing and Visualization Pipeline

The data stored in the cloud is processed through a pipeline that manages the two data types in parallel. For the sensor data, the pipeline retrieves the files from the cloud and uploads them to a MySQL database.
For the images, the pipeline includes pre-processing before applying computer vision algorithms. A key algorithm is the “greenness level” analysis, used to quantify vegetation health over time. This metric is derived from a multi-step image processing pipeline. First, an input image is converted from the RGB color model to the HSV (Hue, Saturation, Value) color space, which is more robust for separating color from illumination. A specific HSV mask is then applied to isolate all vegetation pixels and exclude the background. The script employs the K-Means clustering algorithm on these masked pixels to identify the 15 most dominant colors within the plant. The Hue cut-offs (32–75 for green; 0–31 for brown/yellow) were chosen empirically from representative images by inspecting the dominant-color clusters in the segmented vegetation region to separate healthy canopy greens from senescent/stressed yellow–brown tones.
The “greenness index” is then scientifically quantified by classifying these dominant colors based on their Hue value: pixels with a Hue between 32 and 75 are classified as “Green” (indicating healthy, chlorophyll-rich tissue), while those with a Hue between 0 and 31 are classified as “Brown Yellow” (indicating stressed or senescing tissue). The final metric is the percentage of “Green” pixels relative to the total classified vegetation colors.
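A minimal sketch of this hue-based classification is given below, using plain NumPy and assuming OpenCV-style hue units (0–179), consistent with the 32–75 and 0–31 cut-offs above. For brevity, the sketch classifies all masked vegetation pixels directly, omitting the K-Means dominant-color step, and the saturation/value mask thresholds are illustrative assumptions:

```python
import numpy as np

def rgb_to_hsv(img):
    """Vectorised RGB->HSV for a float image in [0, 1], shape (H, W, 3).
    Returns hue in OpenCV-style 0-179 units plus saturation and value."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    maxc = img.max(axis=-1)
    minc = img.min(axis=-1)
    delta = maxc - minc
    hue = np.zeros_like(maxc)
    nz = delta > 0
    idx = nz & (maxc == r)   # red-dominant pixels
    hue[idx] = (60 * ((g[idx] - b[idx]) / delta[idx])) % 360
    idx = nz & (maxc == g)   # green-dominant pixels
    hue[idx] = 60 * ((b[idx] - r[idx]) / delta[idx]) + 120
    idx = nz & (maxc == b)   # blue-dominant pixels
    hue[idx] = 60 * ((r[idx] - g[idx]) / delta[idx]) + 240
    sat = np.where(maxc > 0, delta / np.maximum(maxc, 1e-12), 0.0)
    return hue / 2.0, sat, maxc  # hue rescaled from degrees to 0-179

def greenness_index(img, sat_min=0.25, val_min=0.15):
    """Percentage of masked vegetation pixels classified as 'Green'
    (Hue 32-75) vs. 'Brown Yellow' (Hue 0-31), per the stated cut-offs.
    The sat/val mask thresholds are illustrative assumptions."""
    hue, sat, val = rgb_to_hsv(img)
    veg = (sat >= sat_min) & (val >= val_min)   # crude vegetation mask
    green = veg & (hue >= 32) & (hue <= 75)
    brown = veg & (hue >= 0) & (hue <= 31)
    total = green.sum() + brown.sum()
    return 100.0 * green.sum() / total if total else 0.0
```

For example, an image containing three pure-green pixels and one yellow-brown pixel yields a greenness index of 75%.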
To conduct temporal analysis, the system programmatically processes images organized by date and named using Unix timestamps. By applying this classification pipeline to each timestamped image in chronological order using os.walk, the system generates a discrete time-series of the green-to-brown ratio. Both the image metadata and the corresponding classification result are stored in the MySQL (8.4.x version) database for dashboard display.
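The chronological traversal of timestamp-named images can be sketched as follows. The directory layout and accepted file extensions are illustrative assumptions:

```python
import os

def timestamped_images(root):
    """Walk `root` with os.walk and return image paths whose file names are
    Unix timestamps (e.g. 1714659000.jpg), sorted chronologically, as done
    before classifying each frame into the green/brown time series."""
    found = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            stem, ext = os.path.splitext(name)
            # keep only timestamp-named image files; skip everything else
            if ext.lower() in {".jpg", ".jpeg", ".png"} and stem.isdigit():
                found.append((int(stem), os.path.join(dirpath, name)))
    return [path for _ts, path in sorted(found)]
```

Sorting on the integer timestamp (rather than the string name) guarantees correct chronological order even when file names differ in length.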
Data integration from both the sensor and imaging branches is managed by the system’s back-end infrastructure, which retrieves processed data from the MySQL database. The back end, implemented using FastAPI (version 0.120.0), coordinates the handling of environmental measurements and classifications. This configuration ensures that meteorological data and insights from the output of the computer vision algorithm are combined to enable comprehensive monitoring of environmental conditions and other goal-oriented factors.
The system interface, developed with React, presents the collected information through an interactive panel designed as a Decision Support System (DSS). The dashboard is intended for operational stakeholders directly involved in monitoring and intervention, such as farmers/producers and agricultural technicians. Users can explore real-time and historical meteorological data (temperature, humidity), which are essential for producers to calculate evapotranspiration rates and determine ideal irrigation windows, based on the actual water stress measured by the LITecS station’s soil and weather sensors. Furthermore, the platform allows for the viewing of plant images and their processing status, as well as the uploading of field photographs for model classification. This visual feedback, combined with the environmental data, helps technicians identify localized anomalies without the need for constant physical presence in the field. The photo upload functionality fosters a symbiotic relationship with users, who contribute valuable data to further improve the model.
Figure 3 shows representative examples of the images uploaded for model classification, highlighting the diversity of installation sites and the two distinct camera viewpoints, normal and wide-angle.
The LITecS station integrates IoT sensing, wireless networking, and on-device data processing to support precision-agriculture monitoring while also producing datasets of potential relevance for environmental and climate studies, including applications related to irrigation management and extreme-weather readiness. The main contribution is a modular hardware architecture that facilitates sensor integration, maintenance, and incremental scaling. This is complemented by a dual-use design supporting both on-farm decision support and broader environmental monitoring, and an off-grid power subsystem sized for seasonal operation, noting that prolonged low irradiance can lead to occasional winter downtime. These elements provide a configurable platform for field deployment, supporting data-driven agricultural monitoring and related environmental applications under practical operational constraints.

4. Case Studies

4.1. BioD’Agro System Case Study

BioD’Agro was a viticulture-oriented project in which LITecS stations were the data-acquisition system. The project relied on three stations placed along the vineyard rows to provide the sensor data and imagery used in the subsequent analyses. Each station collected soil and microclimate measurements to support irrigation-relevant monitoring, complemented by observations relevant to biodiversity. In addition to standard variables (e.g., soil moisture and temperature, wind speed/direction, and solar exposure), the deployed configuration included leaf-wetness sensing and acoustic recordings of bat activity. Each station also acquired time-stamped imagery. These images supported computer-vision analyses of vegetation and field conditions, including greenness-based metrics and visual cues associated with growth and stress. The resulting multimodal dataset supported the evaluation of learning-based approaches for irrigation management and soil-water availability prediction [30]. It also provided longitudinal imagery for computer-vision pipelines such as vine trunk detection for localization and vineyard flora classification relevant to weed-management workflows [31].

4.2. Montanha Viva System Case Study

The Montanha Viva project was a field deployment aimed at monitoring native mountain flora, with an emphasis on phenological characterization across seasonal cycles. Within this project, LITecS stations were the field sensing system used for data acquisition. Site selection for station placement was carried out with guidance from agronomic engineers from a collaborating research group within the Montanha Viva consortium. The choice of target plant species was also defined with their input, prioritizing species with local conservation relevance (e.g., restricted distribution) and/or recognized medicinal and food value. Three stations were installed at distinct locations on the slopes of Serra da Gardunha, Portugal. Each station recorded local meteorological conditions (air temperature, relative humidity, precipitation, wind speed and direction, and atmospheric pressure) and captured imagery using two cameras. Data were collected over approximately one year, covering all seasons and a range of weather conditions. In this case study, the imagery supported phenological stage assessment to inform a tourism-oriented application (e.g., route recommendations based on observed flowering and vegetation state). The meteorological data collected was also shared with local farmers to support their agricultural activities.
The dataset acquired by the LITecS stations supported the development of computer-vision pipelines for detecting and tracking phenological stages of wild flora, using seasonally consistent image sequences aligned with environmental measurements [32]. A mobile application was implemented to deploy lightweight YOLO-based detectors for field use, trained and validated on images collected by the stations and supported by the same edge/cloud processing approach evaluated in the viticulture case study [33]. This work also benefited from methodological transfer between deployments, including computer-vision components and data-management practices established in BioD’Agro [30,31,32], as well as biodiversity assessment [33,34]. The data-management pipeline (FastAPI/MySQL/cloud), energy-efficient acquisition strategies, and dashboard interfaces enabled AI and sensing research focused on biodiversity monitoring. Together, these elements demonstrate how the LITecS system facilitated knowledge transfer and enabled mobile, accessible tools for conservation, education, and ecotourism [33]. The monitored species and deployment context are summarized in Table 4. Operational availability is reported as the ratio between captured and expected images over the monitoring period and serves as a practical indicator of field uptime.

5. Results

This section presents the practical results obtained from the data collected by the LITecS stations in the different case studies.
In the BioD’Agro project, producers used the stations’ meteorological and imaging outputs to contextualize vegetation development and indicators relevant to vineyard management (e.g., pest-related activity and canopy condition). The stations operated for most of the deployment period, including during winter, but experienced occasional interruptions during extended low-irradiance (overcast) episodes that limited solar charging. The subsequent Montanha Viva deployment incorporated the revised duty-cycled configuration and power-management strategy, reducing the frequency of such interruptions. In Montanha Viva, the stations supported a tourism-oriented application based on plant phenology and local meteorological conditions and were exposed to substantial seasonal variability (cold, wet winters and hot, dry summers). Across this deployment, the stations sustained operation for most of the monitoring period, with the ESP32-based power-management mechanism playing a central role in maintaining availability during low-insolation conditions, although further reductions in energy demand and improved winter resilience remain possible. Additionally, during the Montanha Viva monitoring year, Serra da Gardunha experienced a wildfire. The stations continued transmitting data and images, documenting the resulting devastation. Figure 4 depicts the pre- and post-fire conditions at one of the LITecS station sites, providing visual evidence of the extensive ecological impact observed in the aftermath of the event.

5.1. Results BioD’Agro System Case Study

To extract relevant agronomic information, the images were processed through the computer vision pipeline detailed in Section 3.4. This analysis quantifies the “greenness level”, providing a data-driven metric for crop monitoring, as illustrated by the segmentation example in Figure 5.
The results of this analysis inform farming strategy. The ratio of healthy “green” tissue to “brown/yellow” tissue serves as a quantitative indicator of crop health: a detectable increase in the brown/yellow percentage provides an early warning of plant stress and enables immediate, targeted interventions. For example, a rising brown/yellow index is a common indicator of water stress, prompting a targeted irrigation cycle, while a shift towards yellow hues can signify nitrogen deficiency, guiding precise fertilizer application. By tracking this ratio over time, a farmer can also estimate the optimal harvest window to maximize crop quality and yield. The system therefore translates raw image data into actionable, evidence-based farming decisions.
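As a minimal sketch of how such a green-to-brown/yellow ratio can be computed, the function below uses the standard Excess Green vegetation index (ExG = 2G − R − B); the threshold value and the crude foreground mask are illustrative assumptions, not the thresholds of the pipeline described in Section 3.4.

```python
import numpy as np

def greenness_ratio(rgb: np.ndarray, exg_thresh: float = 20.0) -> float:
    """Fraction of vegetation pixels classified as green (vs. brown/yellow)."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    exg = 2 * g - r - b              # Excess Green index: high for green tissue
    vegetation = (r + g + b) > 0     # crude foreground mask: ignore black background
    green = (exg > exg_thresh) & vegetation
    total = vegetation.sum()
    return float(green.sum() / total) if total else 0.0
```

A ratio trending downward over successive acquisitions would correspond to the rising brown/yellow index discussed above.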
The Autonomous Bat Echolocation (ABE) monitoring system uses ultrasonic microphones (10–160 kHz) to capture bat vocalizations. These signals are buffered and preprocessed on an edge device (a Raspberry Pi) before real-time inference. Rather than training a new neural network architecture from scratch, this study focuses on the field implementation and real-time validation of the pre-existing CNN-based BatDetect2 model within the LITecS modular station. Deploying this pre-trained model for edge-level inference yields information on the movement patterns and spatial distribution of species. Because bat activity correlates with insect abundance, the system serves as an early-warning indicator, with bats acting as bioindicators of vineyard ecosystem health. The model achieved 0.94 AP for detection (bat vs. non-bat) and 0.85 mAP for species classification, indicating strong call detection and good species labelling. Complementary diet analyses, based on genetic sequencing of guano collected in shelters, identified the meadow spittlebug (Philaenus spumarius), a known viticultural pest. Figure 6 displays real-time monitoring and classification of bat species based on echolocation calls.
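To illustrate how per-call detector outputs can be turned into a species activity index on the edge device, the sketch below aggregates BatDetect2-style detection records; the record keys (`class`, `det_prob`) and the 0.5 confidence threshold are illustrative assumptions, not the deployed configuration.

```python
from collections import Counter

def activity_summary(detections, min_prob=0.5):
    """Count confident calls per species to build a nightly activity index."""
    return dict(Counter(d["class"] for d in detections
                        if d["det_prob"] >= min_prob))
```

Summaries of this kind, rather than raw audio, are what a bandwidth-constrained station would transmit upstream.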
To make the collected data accessible, an interactive web dashboard was developed to centralize all the information. The platform enabled farmers and agricultural technicians to analyze sensor data through interactive tables and graphs, compare variables, and export results. A visual monitoring component, composed of a photo gallery, complemented the data analysis, offering farmers a remote, real-time view of the vegetative state of the vineyards.

5.2. Results Montanha Viva System Case Study

In Montanha Viva, the selection of target species and monitoring plots was guided by agronomic experts within the consortium, who identified spring to autumn as the period of greatest phenological interest for the monitored flora. Under this monitoring focus, short winter data gaps associated with prolonged low irradiance were not expected to materially affect the intended project outputs, consistent with the reduced growth activity typically observed during winter dormancy in temperate vegetation [35]. Over the 365-day monitoring period, Station 3 captured 710 of 730 expected images (≈97% uptime), including during the wildfire period. Station 1 captured 680/730 (≈93% uptime), with gaps attributed mainly to reduced insolation at the installation site. Station 2 captured 482/730 (≈66% uptime); this station suffered wildfire-related damage (a localized enclosure breach and burned cabling). These figures provide quantitative availability information alongside the qualitative field observations.
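The reported uptime percentages follow directly from the captured-to-expected ratio defined in Section 4.2, with 730 expected images per station (two cameras, one image per camera per day, over 365 days):

```python
def availability(captured: int, days: int, images_per_day: int = 2) -> float:
    """Image-based operational availability: captured / expected images."""
    return captured / (days * images_per_day)

# Stations 3, 1, and 2 respectively:
# availability(710, 365) ≈ 0.97, availability(680, 365) ≈ 0.93,
# availability(482, 365) ≈ 0.66
```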
A computer vision pipeline was implemented for the automated analysis of captured images. The process used a YOLOv8 model for initial plant detection, followed by an EfficientNetB5 model for classification of the species and its phenological state. The pipeline was trained and evaluated on a locally captured dataset, divided into 70% for training, 20% for validation, and 10% for testing. To increase data variability and improve the generalization capacity of the models, data augmentation techniques were applied, including rotations between −15° and +15°. The YOLOv8 model was trained for 100 epochs with a batch size of 32. The results demonstrated high accuracy in most classes, as detailed in Table 5. A correlation was observed between the volume of training images and mean average precision (mAP@50), with species such as Armeria transmontana and Umbilicus rupestris reaching values greater than 0.96, whereas classes with less representation in the dataset, such as Echinospartum ibericum, proved challenging to detect. Accurate detection is crucial, as misclassifications can lead to trail recommendations where the desired species is not in its ideal phenological stage; for species with lower mAP, such as Echinospartum ibericum, the higher error rate can cause the system to suggest routes where the desired species or phenological stage is absent. After this processing, the images were made available on the dashboard for user visualization. The extracted phenological information formed the basis for a route recommendation system, which suggested the hiking trails with the highest probability of observing certain species in phases of interest, such as flowering.
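The 70/20/10 split described above can be reproduced deterministically; the seeded-shuffle procedure below is an illustrative sketch, as the paper does not specify its splitting implementation.

```python
import random

def split_dataset(paths, seed=42):
    """Deterministic 70/20/10 train/val/test split of image paths."""
    rng = random.Random(seed)       # fixed seed for reproducible splits
    shuffled = list(paths)
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train, n_val = int(0.7 * n), int(0.2 * n)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])
```

Fixing the seed makes the split repeatable across training runs, which matters when comparing per-class mAP@50 values such as those in Table 5.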
Additionally, the platform offered robust tools for sensor-data analysis: users could access the data in tabular or graphical format, as presented in Figure 7, export both the information (as CSV files) and the graphs themselves, and perform comparative analyses between different parameters measured by the station’s sensors.
The monitored environmental data showcased in Figure 7 reveal generally stable atmospheric and climatic conditions throughout the observation period. Atmospheric pressure remained around 1000 hPa, with minor oscillations likely caused by natural diurnal heating and cooling cycles, as well as local wind effects that are more pronounced due to the station’s position on a small mountain. These fluctuations reflect the influence of altitude and temperature-driven air movement rather than significant weather changes. Temperature and ambient humidity also remained steady, indicating a stable air mass and limited precipitation activity. Soil humidity showed a slight increase toward the end of the period, suggesting localized moisture accumulation possibly due to ground absorption rather than rainfall, as the pluviometer readings stayed constant. These readings were taken in August, in Serra da Gardunha, when dry and stable weather conditions typically prevail, which further explains the absence of rain and the overall stability observed in the data.
The comparative analysis of meteorological and environmental parameters (e.g., precipitation, temperature, humidity), considered alongside time-stamped images, helps producers assess how local conditions align with relevant climatic regimes (e.g., Mediterranean). Linking these observations to the plants’ recorded growth phases in the dashboard shows which conditions coincide with each stage, which is particularly useful when a species is cultivated outside its native range. In addition, daily data acquisition can reveal climate-change effects on phenological development (e.g., shifts in the timing of flowering), making the dashboard valuable for both producers and researchers. Users can also upload pictures to be classified by the model. This design not only provides a clear and actionable view of the monitored ecosystem but also fosters a symbiotic relationship with users, who contribute valuable data to further improve the model.

5.3. Limitations

Despite the promising results obtained by the AI models in classifying flora, it is essential to recognize the factors that can limit their accuracy in real-world application scenarios. One of the main sources of error lies in the visual similarity between different plant species, especially when they are at similar growth stages or share morphological characteristics; this overlap of visual features can lead to incorrect classifications, an inherent challenge in flora classification. Additionally, the phenological state and physiological condition of a plant can significantly alter its visual appearance (color, texture), potentially hindering the model’s ability to identify the species correctly. Another set of challenges relates to the variable environmental conditions inherent in field monitoring, for which the LITecS system was designed. The accuracy of visual recognition can be affected by significant variations in natural lighting, such as intense shadows and low light at dawn or dusk. Wind can also cause plant movement and motion blur, making it difficult to capture sharp images and extract stable features for the model. These factors, related to both the nature of the data and the acquisition environment, represent important limitations. Quantifying their exact impact on the performance of the LITecS system and developing strategies to mitigate their effects are important directions for future research.

6. Conclusions

This study presented the LITecS station as a modular IoT/WSN sensing platform designed to support heterogeneous agricultural and environmental monitoring without hardware redesign. The same core architecture was deployed in two distinct contexts, (i) precision viticulture (BioD’Agro) and (ii) wild-flora monitoring (Montanha Viva), by reconfiguring the sensor suite and the associated data products (meteorological sensing, imaging, and acoustics) while keeping the station concept unchanged, demonstrating that a single modular architecture can be adapted to distinct precision agriculture approaches and biodiversity-oriented objectives. The two-tier ESP32–Raspberry Pi architecture enabled low-power continuous sensing while duty-cycling higher-load imaging and communication tasks, supporting the use of task-specific AI pipelines when required. The field deployments demonstrate the practical performance of the sensing and edge-processing approach. In BioD’Agro, time-stamped images supported an automated vegetation “greenness” analysis that produced a quantitative green-to-brown/yellow ratio for crop-condition tracking, and the acoustic module enabled real-time bat-call monitoring using BatDetect2, achieving 0.94 AP for bat vs. non-bat detection and 0.85 mAP for species classification. In Montanha Viva, three stations operated over 365 days, providing an empirical availability indicator based on captured images (93%, 66%, and 97%), including continued operation and data acquisition during a wildfire event. The same dataset supported computer-vision pipelines for flora monitoring, with YOLO-based detection reaching mAP@50 > 0.96 for well-represented classes (e.g., Armeria transmontana, Umbilicus rupestris) and lower values for under-represented classes (e.g., Echinospartum ibericum), highlighting the dependence of model robustness on class balance and training volume.
Energy autonomy was addressed through a two-tier architecture in which the ESP32 handles continuous low-power acquisition while the Raspberry Pi is duty-cycled for imaging and transmission. Under the adopted operating strategy, the platform’s average consumption was estimated at approximately 80 Wh/day, and the implemented power subsystem (12 V/30 Ah battery and 30 W photovoltaic panel) was sized to ensure seasonal off-grid operation, although with occasional winter interruptions under prolonged low irradiance and site-dependent insolation constraints.
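The stated sizing can be checked with back-of-envelope arithmetic; the 80% usable-capacity fraction below is an illustrative assumption (actual usable capacity depends on battery chemistry and depth-of-discharge limits).

```python
def autonomy_days(voltage_v, capacity_ah, load_wh_per_day, usable_fraction=0.8):
    """Days of off-grid operation on battery alone, ignoring solar input."""
    return (voltage_v * capacity_ah * usable_fraction) / load_wh_per_day

# 12 V x 30 Ah battery at ~80 Wh/day -> roughly 3-4 days without charging
```

At ~80 Wh/day, the 30 W panel needs roughly 2.7 equivalent peak-sun hours per day (80 Wh / 30 W) to balance the load, which is consistent with the reported vulnerability to prolonged low-irradiance winter episodes.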
Overall, the results show that a single, modular station can support field monitoring across agro-ecological use cases, combining multi-sensor acquisition with edge-enabled AI workflows under realistic operational constraints. The observed limitations, such as the winter irradiance sensitivity and dataset imbalance effects on certain vision classes, define clear priorities for subsequent iterations: improved low-irradiance resilience through further load reduction and/or power sizing, and expanded, better-balanced training data for under-represented species.

Author Contributions

Conceptualization, M.A., N.P., and P.D.G.; methodology, M.A., N.P. and P.D.G.; software, M.S., A.A., and N.P.; validation, M.A., N.P. and P.D.G.; formal analysis, M.S., A.A., R.A., M.A., N.P. and P.D.G.; investigation, M.S., A.A., R.A., M.A., N.P., and P.D.G.; resources, N.P., and P.D.G.; data curation, M.S., A.A., R.A., M.A., and N.P.; writing—original draft preparation, M.S., A.A., R.A., M.A., N.P., and P.D.G.; writing—review and editing, M.S., A.A., R.A., M.A., N.P., and P.D.G.; supervision, N.P. and P.D.G.; project administration, P.D.G.; funding acquisition, P.D.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work falls within the activities of project BioDAgro—Sistema operacional inteligente de informação e suporte à decisão em AgroBiodiversidade (PD20-00011) and project Montanha Viva—An intelligent prediction system for decision support in sustainability (PD21-00009), promoted by the PROMOVE program funded by Fundação La Caixa and supported by Fundação para a Ciência e a Tecnologia and BPI. The authors acknowledge the financial support of the Fundação para a Ciência e a Tecnologia (FCT), I.P., through project UIDB/0151/2025, Centre for Mechanical and Aerospace Sciences and Technologies (C-MAST). DOI: https://doi.org/10.54499/UID/00151/2025. The authors are solely responsible for the content of this article, which does not reflect the opinion of the FCT.

Data Availability Statement

Data is contained within the article. The data presented in this study are available at www.montanhaviva.pt (accessed on 5 December 2025).

Acknowledgments

The authors acknowledge the support provided by LITecS (Laboratory of Innovation and Technologies for Sustainability) (www.litecs.ubi.pt) (accessed on 25 December 2025).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ahmed, N.; De, D.; Hussain, I. Internet of Things (IoT) for smart precision agriculture and farming in rural areas. IEEE Internet Things J. 2018, 5, 4890–4899. [Google Scholar] [CrossRef]
  2. Popescu, D.; Stoican, F.; Stamatescu, G.; Ichim, L.; Dragana, C. Advanced UAV–WSN system for intelligent monitoring in precision agriculture. Sensors 2020, 20, 817. [Google Scholar] [CrossRef]
  3. Wang, S. Multipath routing based on genetic algorithm in wireless sensor networks. Math. Probl. Eng. 2021, 2021, 4815711. [Google Scholar] [CrossRef]
  4. Akhter, R.; Sofi, S.A. Precision agriculture using IoT data analytics and machine learning. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 5602–5618. [Google Scholar] [CrossRef]
  5. Nehra, V.; Sharma, A.K.; Tripathi, R.K. NMR inspired energy efficient protocol for heterogeneous wireless sensor network. Wirel. Netw. 2019, 25, 3689–3700. [Google Scholar] [CrossRef]
  6. Tawfeek, M.A.; Alrashdi, I.; Alruwaili, M.; Jamel, L.; Elhady, G.F.; Elwahsh, H. Improving energy efficiency and routing reliability in wireless sensor networks using modified ant colony optimization. EURASIP J. Wirel. Commun. Netw. 2025, 2025, 22. [Google Scholar] [CrossRef]
  7. Rogers, D.; Tsirkunov, V. Costs and Benefits of Early Warning Systems. Global Assessment Report on Disaster Risk Reduction 2011, Background Document. United Nations International Strategy for Disaster Reduction: Geneva, Switzerland, 2011. Available online: https://www.preventionweb.net/english/hyogo/gar/2011/en/bgdocs/Rogers_&_Tsirkunov_2011.pdf (accessed on 25 December 2025).
  8. Lim, J.A.; Yaacob, J.S.; Mohd Rasli, S.R.A.; Eyahmalay, J.E.; El Enshasy, H.A.; Zakaria, M.R.S. Mitigating the repercussions of climate change on diseases affecting important crop commodities in Southeast Asia, for food security and environmental sustainability—A review. Front. Sustain. Food Syst. 2023, 6, 1030540. [Google Scholar] [CrossRef]
  9. Khattab, A.; Habib, S.E.; Ismail, H.; Zayan, S.; Fahmy, Y.; Khairy, M.M. An IoT-based cognitive monitoring system for early plant disease forecast. Comput. Electron. Agric. 2019, 166, 105028. [Google Scholar] [CrossRef]
  10. Tani, F.K.; Cugnasca, C.E. Agriculture and the IEEE 1451 smart transducer interface standards. In Proceedings of the EFITA WCCA 2005 Proceedings, Vila Real, Portugal, 25–28 July 2005; Universidade de Trás-os-Montes e Alto Douro/EFITA/APDTICA: Trás-os-Montes, Portugal, 2005. [Google Scholar]
  11. Chen, C.; Helal, S. Sifting through the jungle of sensor standards. IEEE Pervasive Comput. 2008, 7, 84–88. [Google Scholar] [CrossRef]
  12. Oostdyk, R.L.; Mata, C.T.; Perotti, J.M. A Kennedy Space Center implementation of IEEE 1451 networked smart sensors and lessons learned. In Proceedings of the 2006 IEEE Aerospace Conference, Big Sky, MT, USA, 4–11 March 2006; IEEE: Piscataway, NJ, USA, 2006; p. 20. [Google Scholar]
  13. Tsaniklidis, G.; Makraki, T.; Papadimitriou, D.; Nikoloudakis, N.; Taheri-Garavand, A.; Fanourakis, D. Non-Destructive Estimation of Area and Greenness in Leaf and Seedling Scales: A Case Study in Cucumber. Agronomy 2025, 15, 2294. [Google Scholar] [CrossRef]
  14. Makraki, T.; Tsaniklidis, G.; Papadimitriou, D.M.; Taheri-Garavand, A.; Fanourakis, D. Non-Destructive Monitoring of Postharvest Hydration in Cucumber Fruit Using Visible-Light Color Analysis and Machine-Learning Models. Horticulturae 2025, 11, 1283. [Google Scholar] [CrossRef]
  15. IEEE 1451.0-2007; IEEE Standard for a Smart Transducer Interface for Sensors and Actuators—Common Functions, Communication Protocols, and Transducer Electronic Data Sheet (TEDS) Formats. IEEE: New York, NY, USA, 2007.
  16. Fernandes, M.A.; Matos, S.G.; Peres, E.; Cunha, C.R.; López, J.A.; Ferreira, P.J.S.G.; Reis, M.; Morais, R. A framework for wireless sensor networks management for precision viticulture and agriculture based on IEEE 1451 standard. Comput. Electron. Agric. 2013, 95, 19–30. [Google Scholar] [CrossRef]
  17. Pandiyaraju, V.; Ganapathy, S.; Mohith, N.; Kannan, A. An optimal energy utilization model for precision agriculture in WSNs using multi-objective clustering and deep learning. J. King Saud Univ.-Comput. Inf. Sci. 2023, 35, 101803. [Google Scholar] [CrossRef]
  18. Haseeb, K.; Ud Din, I.; Almogren, A.; Islam, N. An energy efficient and secure IoT-based WSN framework: An application to smart agriculture. Sensors 2020, 20, 2081. [Google Scholar] [CrossRef]
  19. Patil, V.B.; Kohle, S. A high-scalability and low-latency cluster-based routing protocol in time-sensitive WSNs using genetic algorithm. Meas. Sens. 2024, 31, 100941. [Google Scholar] [CrossRef]
  20. Fuentes-Peñailillo, F.; Acevedo-Opazo, C.; Ortega-Farías, S.; Rivera, M.; Verdugo-Vásquez, N. Spatialized system to monitor vine flowering: Towards a methodology based on a low-cost wireless sensor network. Comput. Electron. Agric. 2021, 187, 106233. [Google Scholar] [CrossRef]
  21. Dehwah, A.H.; Elmetennani, S.; Claudel, C. UD-WCMA: An energy estimation and forecast scheme for solar powered wireless sensor networks. J. Netw. Comput. Appl. 2017, 90, 17–25. [Google Scholar] [CrossRef]
  22. Zhang, S.; Bristow, N.; David, T.W.; Elliott, F.; O’Mahony, J.; Kettle, J. Development of an organic photovoltaic energy harvesting system for wireless sensor networks; application to autonomous building information management systems and optimisation of OPV module sizes for future applications. Sol. Energy Mater. Sol. Cells 2022, 236, 1115503. [Google Scholar] [CrossRef]
  23. Foughali, K.; Fathallah, K.; Frihida, A. Using Cloud IOT for disease prevention in precision agriculture. Procedia Comput. Sci. 2018, 130, 575–582. [Google Scholar] [CrossRef]
  24. Sharma, A.; Jain, A.; Gupta, P.; Chowdary, V. Machine learning applications for precision agriculture: A comprehensive review. IEEE Access 2020, 9, 4843–4873. [Google Scholar] [CrossRef]
  25. Davis. Available online: https://www.davisinstruments.com/pages/vantage-pro2 (accessed on 10 November 2025).
  26. HOBOnet® Wireless Sensor Network. Available online: https://www.onsetcomp.com/hobonet-sensor-network-remote-monitoring-system (accessed on 10 November 2025).
  27. Smart Weather Station for Farms. Available online: https://rynanagriculture.com/smart-weather-station (accessed on 10 November 2025).
  28. HDMCS-100—All-in-One Meteo Compact Station. Available online: https://environmental.senseca.com/product/hdmcs-100-all-in-one-meteo-compact-station/ (accessed on 10 November 2025).
  29. World Meteorological Organization (WMO). Guide to Instruments and Methods of Observation; (WMO-No. 8); WMO: Geneva, Switzerland, 2024; Volume I. [Google Scholar] [CrossRef]
  30. Alibabaei, K.; Gaspar, P.D.; Assunção, E.; Alirezazadeh, S.; Lima, T.M.; Soares, V.N.G.J.; Caldeira, J.M.L.P. Comparison of On-Policy Deep Reinforcement Learning A2C with Off-Policy DQN in Irrigation Optimization: A Case Study at a Site in Portugal. Computers 2022, 11, 104. [Google Scholar] [CrossRef]
  31. Corceiro, A.; Pereira, N.; Alibabaei, K.; Gaspar, P.D. Leveraging Machine Learning for Weed Management and Crop Enhancement: Vineyard Flora Classification. Algorithms 2024, 17, 19. [Google Scholar] [CrossRef]
  32. Videira, J.; Gaspar, P.D.; Soares, V.N.G.J.; Caldeira, J.M.L.P. Detecting and monitoring the development stages of wild flowers and plants using computer vision: Approaches, challenges and opportunities. Int. J. Adv. Intell. Inform. 2023, 9, 347–362. [Google Scholar] [CrossRef]
  33. Videira, J.; Gaspar, P.D.; Soares, V.N.G.J.; Caldeira, J.M.L.P. A mobile application for detecting and monitoring the development stages of wild flowers and plants. Inform.—Int. J. Comput. Inform. 2024, 48, 43–58. [Google Scholar] [CrossRef]
  34. Marcelino, S.; Gaspar, P.D.; Paço, A. Sustainable Waste Management in the Production of Medicinal and Aromatic Plants—A Systematic Review. Sustainability 2023, 15, 13333. [Google Scholar] [CrossRef]
  35. Körner, C.; Möhl, P.; Hiltbrunner, E. Four ways to define the growing season. Ecol. Lett. 2023, 26, 1277–1292. [Google Scholar] [CrossRef]
Figure 1. Example of sensors used in the LITecS station. The station is modular, so it allows the integration of multiple sensors and cameras.
Figure 2. Flow diagram representing the sequence of steps of the data flow architecture in the LITecS station: (Purple) Power supply; (Blue) Data acquisition, transmission, and storage; (Green) Data Processing; (Yellow) Data organization and analysis; (Red) User interface.
Figure 3. Example images obtained from the imaging system, demonstrating camera performance and the two viewing modes (wide-angle and normal).
Figure 4. Pre- and post-wildfire images captured at a LITecS station site, showing system applicability and resilience of the monitoring station.
Figure 5. Example of the model segmentation of the vineyard.
Figure 6. Spectrogram of ultrasonic bat calls captured by the ABE Monitoring System and analyzed by BatDetect2 for real-time detection and species classification. Distinct calls from Pipistrellus pipistrellus and Nyctalus leisleri are visible.
Figure 7. Examples of the different graphs that users can access on the Montanha Viva Dashboard.
Table 3. Core Hardware Components of the LITecS Station.

| Subsystem | Component | Manufacturer/Model | Key Specs (Range/Accuracy) | Interface/Protocol | Role in Station |
|---|---|---|---|---|---|
| Edge compute (low power) | Microcontroller board | NodeMCU ESP32 | Dual-core LX6 up to 240 MHz; 512 kB SRAM; 4 MB flash | UART/I2C/SPI/ADC/DAC | Continuous sensor acquisition; local preprocessing; power control of higher-load modules |
| Edge compute (higher power) | Single-board computer | Raspberry Pi Zero 2 W | Quad-core 64-bit ARM Cortex-A53 @ 1 GHz; 512 MB SDRAM; 2.4 GHz Wi-Fi | CSI-2 (camera); GPIO; microSD | Image acquisition/processing and network transmission |
| Imaging | Camera | Raspberry Pi Camera Module 3 Wide | 120° FoV (wide); autofocus; HDR mode; 4608 × 2592 px resolution; low-light sensitivity | CSI-2 | Wide high-resolution imagery for computer vision tasks |
| Imaging | Camera | Raspberry Pi Camera Module 3 | 76° FoV; autofocus; HDR mode; 4608 × 2592 px resolution; low-light sensitivity | CSI-2 | High-resolution imagery for computer vision tasks |
| Atmosphere | Temperature, relative humidity, and pressure module | M5Stack ENV III HAT | Temperature −40 to 120 °C (±0.2 °C, 0–60 °C); RH 10–90% (±2% RH); pressure 300–1100 hPa (±1 hPa) | I2C | Ambient monitoring (temperature, relative humidity, pressure) |
| Weather | Wind vane, cup anemometer, tipping-bucket rain gauge | External mechanical weather kit | Reed-switch outputs; rain: tipping bucket, 0.2794 mm/tip; wind speed: pulse output, 2.4 km/h ≈ 1 Hz; wind direction: resistor network encoding 16 directions (22.5° resolution) via voltage divider (ADC) | Rain + anemometer: digital pulse; wind vane: analog voltage (ADC) | Wind speed/direction and precipitation measurement |
Table 4. Montanha Viva deployment overview and image-based operational availability. The table reports station locations, monitoring period, target species, acquisition settings, total images collected (two cameras per station), and the derived operational availability, computed as the fraction of expected images successfully captured over the monitoring period.

| Station ID | Site | Monitoring Period (Days) | Ecosystem/Crop Type | Main Species | Sampling Interval (min) | Daily Images per Camera | Total Images | Operational Availability |
|---|---|---|---|---|---|---|---|---|
| 1 | 40°07′28.0″ N, 7°30′22.8″ W | 365 | Wildflowers | Echinospartum ibericum; Armeria transmontana | 10 | 1 | 680 | 93.2% |
| 2 | 40°06′15.0″ N, 7°30′17.0″ W | 365 | Wildflowers | Glandora prostrata; Cytisus multiflorus | 10 | 1 | 482 | 66.0% |
| 3 | 40°04′26.0″ N, 7°33′49.6″ W | 365 | Wildflowers | Cytisus multiflorus; Halimium umbellatum | 10 | 1 | 710 | 97.3% |
Table 5. Performance metrics of the YOLOv8 model by class, detailing the dataset and mean average precision (mAP@50).

| Class | Dataset Images | mAP@50 (YOLO) |
|---|---|---|
| Armeria transmontana | 517 | 0.966 |
| Cistus salviifolius | 144 | 0.923 |
| Coincya monensis | 341 | 0.742 |
| Cytisus multiflorus | 133 | 0.518 |
| Echinospartum ibericum | 32 | 0.351 |
| Echium lusitanicum | 186 | 0.904 |
| Glandora prostrata | 52 | 0.904 |
| Halimium umbellatum | 112 | 0.730 |
| Umbilicus rupestris | 71 | 0.971 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Sousa, M.; Alves, A.; Antunes, R.; Aguiar, M.; Gaspar, P.D.; Pereira, N. Interoperable IoT/WSN Sensing Station with Edge AI-Enabled Multi-Sensor Integration for Precision Agriculture. Agriculture 2026, 16, 69. https://doi.org/10.3390/agriculture16010069

