# Selection of a Navigation Strategy According to Agricultural Scenarios and Sensor Data Integrity


## Abstract


## 1. Introduction

#### 1.1. Related Works

#### 1.2. Contributions

## 2. Materials and Methods

#### 2.1. AgBot

#### 2.2. Experimental Setup

#### 2.3. Predicted and Predictor Variables

#### 2.3.1. GNSS

#### 2.3.2. LiDAR

#### 2.3.3. RGB Camera

#### 2.4. Classifier Techniques

#### 2.4.1. Multinomial Logistic Regression

#### 2.4.2. Decision Tree

#### 2.4.3. Random Forest

- Minimum sample (min_n): minimum number of samples required to perform a new split. The values of this hyperparameter were 1, 11, 21, 30, and 40;
- Number of trees (trees): number of weak classifiers in the ensemble. The values of this hyperparameter were 2, 26, 51, 75, and 100.
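The hyperparameter names (min_n, trees) suggest R's tidymodels, but the grid can be sketched with any framework. Below is a hypothetical scikit-learn version: min_n maps roughly to `min_samples_split` (which scikit-learn requires to be at least 2, so the paper's value 1 becomes 2 here) and trees maps to `n_estimators`. The synthetic data is ours, for illustration only.

```python
# Hypothetical sketch of the Random Forest hyperparameter grid described
# above, using scikit-learn; parameter-name mappings are assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Stand-in data; the paper uses sensor features, not synthetic samples.
X, y = make_classification(n_samples=200, n_features=3, n_informative=3,
                           n_redundant=0, random_state=0)

grid = {
    # min_n in the text; scikit-learn needs min_samples_split >= 2,
    # so the grid value 1 is mapped to 2.
    "min_samples_split": [2, 11, 21, 30, 40],
    # trees in the text.
    "n_estimators": [2, 26, 51, 75, 100],
}
search = GridSearchCV(RandomForestClassifier(random_state=0), grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```

Each of the 5 × 5 = 25 grid points is evaluated by cross-validation and the best-scoring combination is kept.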

#### 2.4.4. Gradient Boosting

- Tree depth (tree_depth): maximum depth the tree can reach. The values of this hyperparameter were 1, 4, 8, 11, and 15;
- Learning rate (learn_rate): learning rate applied to each weak model. The values of this hyperparameter were ${10}^{-1}$, ${10}^{-2}$, ${10}^{-3}$, ${10}^{-4}$, and ${10}^{-5}$.
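As a sketch, the Gradient Boosting grid above can be enumerated with scikit-learn's `ParameterGrid`; the mapping of tree_depth to `max_depth` and learn_rate to `learning_rate` is our assumption, and the learning rates are taken as one value per decade from $10^{-1}$ down to $10^{-5}$.

```python
# Hypothetical enumeration of the Gradient Boosting hyperparameter grid
# described above; parameter-name mappings are assumptions.
from sklearn.model_selection import ParameterGrid

gb_grid = ParameterGrid({
    "max_depth": [1, 4, 8, 11, 15],            # tree_depth in the text
    "learning_rate": [1e-1, 1e-2, 1e-3, 1e-4, 1e-5],  # learn_rate in the text
})
combos = list(gb_grid)
print(len(combos))  # 25 candidate models to cross-validate
```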

#### 2.5. Classifier Fusion

#### 2.5.1. Classification Vector

#### 2.5.2. Sensors’ Feasibility Matrix

#### 2.5.3. Strategy Selection Vector

## 3. Results

#### 3.1. GNSS

#### 3.2. LiDAR

#### 3.3. Camera

#### 3.4. Navigation Selections or Navigation Methodologies

## 4. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## Abbreviations

Abbreviation | Definition
---|---
AgBot | Agricultural robot
AIC | Akaike information criterion
DOF | Degrees of freedom
DT | Decision Tree
GB | Gradient Boosting
GLM | Generalized linear models
GNSS | Global navigation satellite system
HSL | Hue, saturation and lightness
IMU | Inertial measurement unit
LiDAR | Light detection and ranging
LL | Log-likelihood
LRT | Likelihood-ratio test
MCC | Matthews correlation coefficient
MLR | Multinomial logistic regression
RF | Random Forest
RGB | Red, green and blue
RNN | Recurrent neural networks
ROS | Robot Operating System
RTK | Real-time kinematic
SLAM | Simultaneous localization and mapping

## References


**Figure 2.** Comparison among models using GNSS data: (**a**) Decision Tree; (**b**) Random Forest; and (**c**) Gradient Boosting.

**Figure 4.** Comparison among models using LiDAR data: (**a**) Decision Tree; (**b**) Random Forest; and (**c**) Gradient Boosting.

**Figure 6.** Comparison among models using camera data: (**a**) Decision Tree; (**b**) Random Forest; and (**c**) Gradient Boosting.

**Figure 8.** Trajectory: (**a**) InsideCrop to OutsideCrop: presence of crop; (**b**) InsideCrop to OutsideCrop: selection of sensors for navigation; (**c**) InsideCrop to OutsideCrop: probability indicated by the methodology of selecting each navigation strategy; (**d**) OutsideCrop to InsideCrop: presence of crop; (**e**) OutsideCrop to InsideCrop: selection of sensors for navigation; (**f**) OutsideCrop to InsideCrop: probability indicated by the methodology of selecting each navigation strategy; (**g**) OutsideCrop to UnderCanopy: presence of crop; (**h**) OutsideCrop to UnderCanopy: selection of sensors for navigation; and (**i**) OutsideCrop to UnderCanopy: probability indicated by the methodology of selecting each navigation strategy.

AgBot * | Components | Characteristics
---|---|---
| Battery | 14.8 V, 10,000 mAh
| Motor | Maytech Brushless Outrunner Hub
| Computer | Intel NUC i7 processor
| Operating system | Ubuntu 18.04
| LiDAR | Hokuyo UST-10LX
| IMU | Bosch BNO055
| GNSS | u-blox ZED-F9P
| Camera | RGB cameras with LED illumination

Representation | Class | Distance between Crops (m)
---|---|---
| GNSSOutsideCrop | ≥3.00
| GNSSInsideCrop | ≥1.00
| UnderCanopy | <1.00

Feature | Variable Name | Unit
---|---|---
Heading Accuracy | HeadAcc | ${}^{\circ}$
Horizontal Accuracy | HorAcc | m
Speed Accuracy | SpeedAcc | m/s

Representation | Class | Description
---|---|---
| LiDAROutsideCrop | There is no crop.
| LiDARInsideCrop | There is at least one crop row.
| LiDAROcclusion | Many points surround the origin.

Feature | Variable Name | Equation
---|---|---
Mean | ${\mu}_{X,Y,I}$ | $\frac{1}{k}{\sum}_{i=1}^{k}\left({x}_{i}\right)$
Variance | ${\sigma}_{X,Y,I}$ | $\frac{1}{k}{\sum}_{i=1}^{k}{\left({x}_{i}-\overline{x}\right)}^{2}$
Skewness | ${\gamma}_{X,Y,I}$ | $\frac{1}{k}{\sum}_{i=1}^{k}{\left(\frac{{x}_{i}-\overline{x}}{\sigma}\right)}^{3}$
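The per-scan statistics in the table above (population mean, variance, and skewness, all with $1/k$ normalization) can be sketched directly in NumPy; the function and variable names here are ours, not the paper's.

```python
# Sketch of the statistical features computed per LiDAR scan channel
# (x, y, or intensity values), following the table's 1/k equations.
import numpy as np

def scan_features(values):
    """Return (mean, variance, skewness) of one channel of a scan."""
    x = np.asarray(values, dtype=float)
    k = x.size
    mu = x.sum() / k                          # mean
    var = ((x - mu) ** 2).sum() / k           # population variance
    sigma = np.sqrt(var)
    skew = (((x - mu) / sigma) ** 3).sum() / k  # population skewness
    return mu, var, skew

mu, var, skew = scan_features([1.0, 2.0, 2.0, 3.0])
```

For a symmetric sample like the one above, the skewness is zero, as expected.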

Representation | Class | Description
---|---|---
| Ideal | There is no problem in the image.
| HighBrightness | Image is too bright.
| LowBrightness | Image is too dark.
| Blur | Image loses details.
| CameraOcclusion | Camera is occluded.

Feature | Variable Name | Equation
---|---|---
Mean | ${\mu}_{G}$ | $\frac{1}{k}{\sum}_{i=1}^{k}\left({x}_{i}\right)$
Variance | ${\sigma}_{G}$ | $\frac{1}{k}{\sum}_{i=1}^{k}{\left({x}_{i}-\overline{x}\right)}^{2}$
Skewness | ${\gamma}_{G}$ | $\frac{1}{k}{\sum}_{i=1}^{k}{\left(\frac{{x}_{i}-\overline{x}}{\sigma}\right)}^{3}$
Laplacian Descriptor | ${\sigma}_{lap}^{2}$ | $\frac{{\partial}^{2}I(x,y)}{\partial {x}^{2}}+\frac{{\partial}^{2}I(x,y)}{\partial {y}^{2}}$
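The Laplacian descriptor ${\sigma}_{lap}^{2}$ is the variance of the image Laplacian, a standard blur score: sharp images have strong second derivatives, blurred ones do not. A minimal NumPy-only sketch (in practice one would typically use OpenCV's `cv2.Laplacian`; the images below are synthetic stand-ins):

```python
# Sketch of the variance-of-Laplacian blur descriptor from the table
# above: approximate the Laplacian with the usual 3x3 finite-difference
# stencil, then take its variance over the image interior.
import numpy as np

def laplacian_variance(gray):
    g = np.asarray(gray, dtype=float)
    # d2/dx2 + d2/dy2 via finite differences; crop the border so the
    # np.roll wrap-around does not contaminate the estimate.
    lap = (np.roll(g, 1, 0) + np.roll(g, -1, 0)
           + np.roll(g, 1, 1) + np.roll(g, -1, 1) - 4 * g)[1:-1, 1:-1]
    return lap.var()

rng = np.random.default_rng(0)
sharp = rng.random((32, 32))       # high-frequency content -> large score
blurred = np.full((32, 32), 0.5)   # featureless "image" -> zero score
```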

Characteristics | Total Model | Stepwise Model
---|---|---
${\beta}_{\mathrm{HeadingAccuracy}}$ | ✔ | ✔
${\beta}_{\mathrm{HorizontalAccuracy}}$ | ✔ | -
${\beta}_{\mathrm{SpeedAccuracy}}$ | ✔ | -
Accuracy [%] | 100.00 | 100.00
AIC | 16.05 | 10.46
LL | −0.02 | −1.23
DOF | 8 | 4
LRT | 2.41 |
p-value | 0.66 |
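The likelihood-ratio test in the table above can be checked from its own entries: the statistic is twice the log-likelihood gap between the total and stepwise models, compared against a chi-square distribution whose degrees of freedom equal the DOF difference (here 8 − 4 = 4). For four degrees of freedom the chi-square survival function has the closed form $e^{-x/2}(1 + x/2)$, so no statistics library is needed:

```python
# Worked check of the likelihood-ratio test reported in the table above.
import math

ll_total, ll_stepwise = -0.02, -1.23
dof_total, dof_stepwise = 8, 4

lrt = 2 * (ll_total - ll_stepwise)   # ~2.42 (table rounds to 2.41)
df = dof_total - dof_stepwise        # = 4

# Chi-square survival function for df = 4: exp(-x/2) * (1 + x/2).
p_value = math.exp(-lrt / 2) * (1 + lrt / 2)
print(round(p_value, 2))             # matches the table's 0.66
```

A p-value of 0.66 means the extra predictors in the total model do not significantly improve the fit, which is why the stepwise model (lower AIC) is preferred.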

| | GNSSInsideCrop | UnderCanopy |
|---|---|---|
| $\alpha$ | −49.75 | 59.76 |
| ${\beta}_{\mathrm{HorizontalAccuracy}}$ | 337.37 | 103.98 |
| ${\beta}_{\mathrm{HeadingAccuracy}}$ | - | - |
| ${\beta}_{\mathrm{SpeedAccuracy}}$ | - | - |
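In a multinomial logistic regression these coefficients map a feature value to class probabilities via a softmax over the per-class linear predictors. The sketch below assumes GNSSOutsideCrop is the reference class (linear predictor fixed at 0), which is the usual convention when only the other classes get coefficient columns; the input value is illustrative.

```python
# Sketch: class probabilities from the stepwise GNSS model's
# coefficients (table above), assuming GNSSOutsideCrop is the
# reference class of the multinomial logistic regression.
import math

def gnss_class_probs(hor_acc):
    logits = {
        "GNSSOutsideCrop": 0.0,                         # reference class
        "GNSSInsideCrop": -49.75 + 337.37 * hor_acc,
        "UnderCanopy":     59.76 + 103.98 * hor_acc,
    }
    m = max(logits.values())                            # numerical stability
    exps = {c: math.exp(v - m) for c, v in logits.items()}
    z = sum(exps.values())
    return {c: e / z for c, e in exps.items()}

probs = gnss_class_probs(0.05)  # horizontal accuracy of 0.05 m (illustrative)
```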

Characteristics | Total Model | Stepwise Model
---|---|---
${\beta}_{{\mu}_{X}}$ | ✔ | ✔
${\beta}_{{\sigma}_{X}}$ | ✔ | ✔
${\beta}_{{\gamma}_{X}}$ | ✔ | ✔
${\beta}_{{\mu}_{Y}}$ | ✔ | ✔
${\beta}_{{\sigma}_{Y}}$ | ✔ | -
${\beta}_{{\gamma}_{Y}}$ | ✔ | ✔
${\beta}_{{\mu}_{I}}$ | ✔ | ✔
${\beta}_{{\sigma}_{I}}$ | ✔ | -
${\beta}_{{\gamma}_{I}}$ | ✔ | ✔
Accuracy [%] | 70.22 | 70.84
AIC | 1546.41 | 1541.36
LL | −753.21 | −754.68
DOF | 20 | 16
LRT | 2.944 |
p-value | 0.57 |

| | LiDAROcclusion | LiDARInsideCrop |
|---|---|---|
| $\alpha$ | −1.32 | 0.73 |
| ${\beta}_{{\mu}_{X}}$ | 1.65 | 0.47 |
| ${\beta}_{{\sigma}_{X}}$ | 1.78 | −0.07 |
| ${\beta}_{{\gamma}_{X}}$ | −1.08 | −0.09 |
| ${\beta}_{{\mu}_{Y}}$ | −2.31 | −0.44 |
| ${\beta}_{{\sigma}_{Y}}$ | - | - |
| ${\beta}_{{\gamma}_{Y}}$ | 0.74 | 0.33 |
| ${\beta}_{{\mu}_{I}}$ | 5.75 | 1.00 |
| ${\beta}_{{\sigma}_{I}}$ | - | - |
| ${\beta}_{{\gamma}_{I}}$ | −0.53 | −0.82 |

Characteristics | Total Model | Stepwise Model
---|---|---
${\beta}_{{\mu}_{G}}$ | ✔ | ✔
${\beta}_{{\sigma}_{G}}$ | ✔ | ✔
${\beta}_{{\gamma}_{G}}$ | ✔ | ✔
${\beta}_{{\sigma}_{lap}^{2}}$ | ✔ | ✔
Accuracy [%] | 93.76 | 93.76
AIC | 1619.65 | 1619.65
LL | −789.83 | −789.83
DOF | 20 | 20
LRT | 0 |
p-value | 1.00 |

| | Ideal | HighBrightness | CameraOcclusion | LowBrightness |
|---|---|---|---|---|
| $\alpha$ | 2.01 | −4.53 | 1.56 | −13.18 |
| ${\beta}_{{\mu}_{G}}$ | −3.53 | 17.31 | −2.98 | −13.11 |
| ${\beta}_{{\sigma}_{G}}$ | −8.72 | 2.65 | −2.09 | −2.22 |
| ${\beta}_{{\gamma}_{G}}$ | −1.85 | −1.25 | 11.00 | 10.78 |
| ${\beta}_{{\sigma}_{lap}^{2}}$ | 17.95 | 10.01 | 6.44 | 16.83 |

Crop? | InsideCrop to OutsideCrop: GNSS | InsideCrop to OutsideCrop: LiDAR or Camera | OutsideCrop to InsideCrop: GNSS | OutsideCrop to InsideCrop: LiDAR or Camera | OutsideCrop to UnderCanopy: GNSS | OutsideCrop to UnderCanopy: LiDAR or Camera
---|---|---|---|---|---|---
Yes | 0 | 70 | 24 | 165 | 0 | 192
No | 170 | 23 | 10 | 0 | 0 | 9


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Bonacini, L.; Tronco, M.L.; Higuti, V.A.H.; Velasquez, A.E.B.; Gasparino, M.V.; Peres, H.E.N.; Oliveira, R.P.d.; Medeiros, V.S.; Silva, R.P.d.; Becker, M.
Selection of a Navigation Strategy According to Agricultural Scenarios and Sensor Data Integrity. *Agronomy* **2023**, *13*, 925.
https://doi.org/10.3390/agronomy13030925
