This is an early access version; the complete PDF, HTML, and XML versions will be available soon.
Article

Mobile Ground-Truth 3D Detection Environment for Agricultural Robot Field Testing

by Daniel Barrelmeyer, Stefan Stiene, Jannik Jose and Mario Porrmann
1 Faculty of Engineering and Computer Science, University of Applied Sciences Osnabrück, 49076 Osnabrück, Germany
2 Institute of Computer Science, Osnabrück University, 49090 Osnabrück, Germany
* Author to whom correspondence should be addressed.
Sensors 2025, 25(13), 4103; https://doi.org/10.3390/s25134103
Submission received: 14 May 2025 / Revised: 10 June 2025 / Accepted: 24 June 2025 / Published: 30 June 2025
(This article belongs to the Collection Sensors and Robotics for Digital Agriculture)

Abstract

Safety and performance validation of autonomous agricultural robots is critically dependent on realistic, mobile test environments that provide high-fidelity ground truth. Existing infrastructures focus either on component-level sensor evaluation in fixed setups or on system-level black-box testing under constrained conditions, and lack true mobility, multi-object capability, and the ability to track or detect objects in multiple Degrees of Freedom (DOFs) in unstructured fields. In this paper, we present a sensor station network designed to overcome these limitations. Our mobile testbed consists of self-powered stations, each equipped with a high-resolution 3D Light Detection and Ranging (LiDAR) sensor, dual-antenna Global Navigation Satellite System (GNSS) receivers and on-board edge computers. By synchronising over GNSS time and calibrating rigid LiDAR-to-LiDAR transformations, we fuse point clouds from multiple stations into a coherent geometric representation of a real agricultural environment, which we sample at up to 20 Hz. We demonstrate the performance of the system in field experiments with an autonomous robot traversing a 26,000 m² area at up to 20 km/h. Our results show continuous and consistent detection of the robot, even at the field boundaries. This work will enable a comprehensive evaluation of geofencing and environmental perception capabilities, paving the way for safety and performance benchmarking of agricultural robot systems.
Keywords: agricultural robotics; ground truth; 3D-LiDAR; GNSS time synchronisation; multi-sensor fusion; autonomous field testing; geofencing
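
The abstract summarises the fusion pipeline without implementation detail. The Python sketch below illustrates the core idea of the described fusion step under stated assumptions: per-station point clouds carry timestamps on the shared GNSS time base, and pre-calibrated rigid LiDAR-to-LiDAR transforms map each station's frame into a common reference frame. All names (StationFrame, fuse_frames, calib) are illustrative, not taken from the paper.

# Minimal illustrative sketch (not the authors' code): merge GNSS-time-stamped
# point clouds from several stations into one reference frame using
# pre-calibrated rigid LiDAR-to-LiDAR transforms.
from dataclasses import dataclass
import numpy as np

@dataclass
class StationFrame:
    gnss_time: float    # seconds on the shared GNSS time base
    points: np.ndarray  # (N, 3) points in the station's own LiDAR frame

def rigid_transform(R: np.ndarray, t: np.ndarray, points: np.ndarray) -> np.ndarray:
    """Apply the rigid transform x' = R @ x + t to an (N, 3) point array."""
    return points @ R.T + t

def fuse_frames(frames: dict, calib: dict, ref_time: float,
                max_skew: float = 0.025) -> np.ndarray:
    """Merge clouds whose GNSS timestamps lie within max_skew seconds of
    ref_time; 0.025 s is half the sample period at the 20 Hz rate quoted
    in the abstract. calib maps station id -> (R, t) into the common frame."""
    merged = []
    for station_id, frame in frames.items():
        if abs(frame.gnss_time - ref_time) > max_skew:
            continue  # frame too far from the fusion instant; skip it
        R, t = calib[station_id]
        merged.append(rigid_transform(R, t, frame.points))
    return np.vstack(merged) if merged else np.empty((0, 3))

In practice, the rotation R and translation t for each station would come from the rigid LiDAR-to-LiDAR calibration the paper describes, and the timestamp gate stands in for whatever synchronisation tolerance the authors actually apply.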

