# Research on the Relative Position Detection Method between Orchard Robots and Fruit Tree Rows


## Abstract


## 1. Introduction

## 2. Materials and Methods

#### 2.1. Hardware Composition

#### 2.2. Technological Route

#### 2.3. Improvements to the YOLOv4 Algorithm

#### 2.3.1. YOLOv4 Algorithm

#### 2.3.2. Improvement of the YOLOv4 Algorithm

The convolution kernel size k of the ECA module is adaptively determined from the number of feature channels C as $k = {\left|\frac{\log_{2}C}{a^{\prime}} + \frac{b^{\prime}}{a^{\prime}}\right|}_{odd}$, where ${\left|*\right|}_{odd}$ denotes the nearest odd number to |*|, and a′ and b′ are function coefficients. In this paper, a′ takes the value of 2, and b′ takes the value of 1.
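The adaptive kernel-size rule above can be sketched as follows (a minimal sketch; `eca_kernel_size` is an illustrative name, not from the paper):

```python
import math

def eca_kernel_size(channels: int, a: int = 2, b: int = 1) -> int:
    """Adaptive 1-D convolution kernel size of the ECA module:
    k = |log2(C)/a' + b'/a'|_odd, with a' = 2 and b' = 1 as in the paper."""
    t = abs(math.log2(channels) / a + b / a)
    k = round(t)
    # snap to the nearest odd number (the |*|_odd operation)
    if k % 2 == 0:
        k = k + 1 if t >= k else k - 1
    return max(k, 1)
```

For example, a 256-channel feature map yields k = 5, while 64 channels yield k = 3, so deeper feature maps get a wider cross-channel interaction range without any extra learned parameters.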

#### 2.4. Fruit Tree Trunk Positioning

#### 2.4.1. Fruit Tree Trunk Camera Coordinates

The camera coordinates of the fruit tree trunk are (${X}_{C},{Y}_{C},{Z}_{C}$). The method in this paper projects the trunk onto the ground, so the ${Y}_{C}$ coordinate does not need to be solved, which simplifies the calculation process. The pixel coordinates of the geometric center of the fruit tree trunk in the left image are (${u}_{L},{v}_{L}$), and the image coordinates are (${x}_{L},{y}_{L}$); the pixel coordinates of the geometric center of the fruit tree trunk in the right image are (${u}_{R},{v}_{R}$), and the image coordinates are (${x}_{R},{y}_{R}$).

The optical centers of the left and right cameras are ${O}_{L}$ and ${O}_{R}$. With ${O}_{L}$ as the origin, horizontally to the right as the positive ${X}_{L}$ direction, vertically downward as the positive ${Y}_{L}$ direction, and horizontally forward as the positive ${Z}_{L}$ direction, the camera coordinate system ${O}_{L}$-${X}_{L}{Y}_{L}{Z}_{L}$ is established. f indicates the focal length of the binocular camera in mm; b indicates the baseline of the left and right cameras in mm. The structure of the binocular camera is shown in Figure 5.

where ${u}_{0}$ denotes the amount of lateral translation of the image coordinate system origin in the pixel coordinate system. The depth ${Z}_{C}$ is obtained from the disparity, and the coordinates of the trunk in the plane ${X}_{L}{O}_{L}{Z}_{L}$ are obtained from Equations (7) and (8), where b, ${u}_{0}$, and ${f}_{x}$ are camera internal parameters that can be obtained by camera calibration.
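Under the standard parallel binocular model, the relations behind Equations (7) and (8) are the textbook disparity equations; the sketch below assumes that form (function and variable names are illustrative, not from the paper):

```python
def trunk_camera_coords(u_L, u_R, u0, fx, b):
    """Recover (X_C, Z_C) of a trunk center from its left/right pixel columns.

    Standard parallel stereo geometry: disparity d = u_L - u_R,
    depth Z_C = b * fx / d, lateral offset X_C = (u_L - u0) * Z_C / fx.
    b: baseline (mm); fx: focal length in pixels; u0: principal point column.
    """
    d = u_L - u_R
    if d <= 0:
        raise ValueError("disparity must be positive for a point in front of the camera")
    Z_C = b * fx / d
    X_C = (u_L - u0) * Z_C / fx
    return X_C, Z_C
```

For instance, with a 120 mm baseline, fx = 1000 px, u0 = 640 px, and matched columns 700/660 px, the trunk lies at Z_C = 3000 mm and X_C = 180 mm.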

#### 2.4.2. Fruit Tree Trunk Coordinate Conversion

The camera plane coordinate system ${X}_{L}{O}_{L}{Z}_{L}$ is translated to the ground coordinate system; the translations in the X and Z directions are ${X}_{0}$ and ${Z}_{0}$, respectively. When the camera coordinates are converted to ground coordinates, the Y-axis direction does not need to be calculated. The coordinates of the fruit tree trunk in the plane coordinate system ${X}_{L}{O}_{L}{Z}_{L}$ are (${X}_{C},{Z}_{C}$), and its coordinates in the ground coordinate system are (${X}_{g},{Z}_{g}$); the conversion relationship is shown in Equation (9).
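Since only the translations ${X}_{0}$ and ${Z}_{0}$ are recoverable from the text, the sketch below assumes Equation (9) is a pure planar translation (no rotation term); the function name is illustrative:

```python
def camera_to_ground(X_C, Z_C, X_0, Z_0):
    """Convert trunk coordinates from the camera plane X_L-O_L-Z_L to the
    ground coordinate system by translating X_0 and Z_0; the Y direction
    is dropped after projection onto the ground."""
    return X_C + X_0, Z_C + Z_0
```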

#### 2.5. Calculation of Navigation Path and Attitude Parameters

The ground coordinates of the detected fruit tree trunks are ${G}_{i}({X}_{gi},{Z}_{gi}$). A coordinate point with ${X}_{gi} < 0$ is regarded as being located on the left side of the orchard operation vehicle, and its left-side coordinate is recorded as ${G}_{Lj}({X}_{glj},{Z}_{glj}$). A coordinate point with ${X}_{gi} \geq 0$ is regarded as being located on the right side of the orchard operation vehicle, and its right-side coordinate is recorded as ${G}_{Rk}({X}_{grk},{Z}_{grk}$). The midpoints of the corresponding left and right trunk pairs are recorded as ${G}_{j,k}({X}_{j,k},{Z}_{j,k}$).
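The partition and midpoint step can be sketched as follows. Pairing left and right trunks by their order along the Z (travel) direction is an assumption here; the paper's exact pairing rule is not recoverable from the text:

```python
def row_midpoints(trunks):
    """Split trunk ground coordinates (X_g, Z_g) into a left row (X_g < 0)
    and a right row (X_g >= 0), pair the rows in order of increasing Z,
    and return the midpoint of each pair."""
    left = sorted((p for p in trunks if p[0] < 0), key=lambda p: p[1])
    right = sorted((p for p in trunks if p[0] >= 0), key=lambda p: p[1])
    # zip truncates to the shorter row, so unpaired trunks are ignored
    return [((xl + xr) / 2, (zl + zr) / 2)
            for (xl, zl), (xr, zr) in zip(left, right)]
```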

#### 2.5.1. Navigation Path Calculation

The navigation path is obtained by fitting the midpoint coordinates with the straight line $Z = {a}_{1}X + {a}_{0}$. The calculation process is as follows: the midpoints are fitted by least squares, and the coefficients ${a}_{0}$ and ${a}_{1}$ are solved from Equation (11):
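The least-squares solution for the line coefficients can be sketched directly from the normal equations (a minimal sketch; `fit_path` is an illustrative name, and this is assumed to match the closed form of Equation (11)):

```python
def fit_path(points):
    """Least-squares fit of midpoint coordinates (X, Z) to Z = a1*X + a0."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sz = sum(z for _, z in points)
    sxx = sum(x * x for x, _ in points)
    sxz = sum(x * z for x, z in points)
    # closed-form normal-equation solution for slope and intercept
    a1 = (n * sxz - sx * sz) / (n * sxx - sx * sx)
    a0 = (sz - a1 * sx) / n
    return a1, a0
```

For midpoints lying exactly on Z = 2X + 1, the fit recovers a1 = 2 and a0 = 1.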

#### 2.5.2. Calculation of Postural Parameters

The heading angle φ of the vehicle can be obtained from the slope of the fitted straight line $Z = {a}_{1}X + {a}_{0}$; the lateral deviation λ of the vehicle can be obtained by calculating the perpendicular distance from the origin O to the fitted straight line $Z = {a}_{1}X + {a}_{0}$. The calculation formulas are shown in Equations (13) and (14):
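These two quantities can be sketched as follows. The lateral deviation is the standard point-to-line distance $\lambda = |{a}_{0}|/\sqrt{{a}_{1}^{2}+1}$; for the heading angle, the arctangent of the slope is the usual choice, but the paper's exact angle convention (reported values near 150–160°) is not recoverable here, so treat the φ line as an assumption:

```python
import math

def pose_parameters(a1, a0):
    """Pose of the vehicle relative to the fitted path Z = a1*X + a0.

    phi: angle of the fitted line in degrees (arctan of the slope;
         the paper's exact reference direction is assumed).
    lam: perpendicular distance from the origin O to the line,
         |a0| / sqrt(a1**2 + 1).
    """
    phi = math.degrees(math.atan(a1))
    lam = abs(a0) / math.sqrt(a1 ** 2 + 1)
    return phi, lam
```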

## 3. Results

#### 3.1. Data Acquisition and Model Training

where ${T}_{P}$ denotes the number of correctly detected fruit tree trunks in the picture, ${F}_{P}$ denotes the number of false detections in the picture, and ${F}_{N}$ denotes the number of missed targets in the picture.
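With these counts, precision and recall follow the usual definitions, Precision = ${T}_{P}/({T}_{P}+{F}_{P}$) and Recall = ${T}_{P}/({T}_{P}+{F}_{N}$); a one-line sketch:

```python
def precision_recall(tp, fp, fn):
    """Precision = TP/(TP+FP); Recall = TP/(TP+FN) for trunk detection."""
    return tp / (tp + fp), tp / (tp + fn)
```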

#### 3.2. Posture Parameter Determination Test

#### 3.2.1. Binocular Camera Internal Reference Measurement

#### 3.2.2. Experimental Design and Evaluation Indicators

The real values of the heading angle and lateral deviation are recorded as ${\varphi}_{t}$ and ${\lambda}_{t}$; the values measured by the method in this paper are estimated values and recorded as ${\varphi}_{e}$ and ${\lambda}_{e}$. The differences between the real values and the estimated values of the heading angle and lateral deviation are the error values, recorded as ${E}_{\varphi}$ and ${E}_{\lambda}$, which were taken as the evaluation indexes of the present experiment. The evaluation indexes are calculated as follows:

#### 3.2.3. Experimental Results and Analysis

## 4. Discussion

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest


**Figure 3.** The ECA module. Note: C, W, and H denote the feature channel size, width, and height of the feature map, respectively; σ denotes the Sigmoid activation function.

| Model | Precision/% | Recall/% | Frame Rate/(frame·s⁻¹) |
|---|---|---|---|
| YOLOv4 | 91.13 | 87.51 | 15.47 |
| ECA5-YOLOv4 | 97.05 | 95.42 | 17.59 |
| SENet5-YOLOv4 | 94.25 | 91.01 | 18.03 |
| CBAM5-YOLOv4 | 96.83 | 95.14 | 17.50 |

| Site | YOLOv4 φe/° | YOLOv4 λe/m | ECA5-YOLOv4 φe/° | ECA5-YOLOv4 λe/m | SENet5-YOLOv4 φe/° | SENet5-YOLOv4 λe/m | CBAM5-YOLOv4 φe/° | CBAM5-YOLOv4 λe/m |
|---|---|---|---|---|---|---|---|---|
| 1 | 152.9 | 0.48 | 151.1 | 0.50 | 151.9 | 0.46 | 152.2 | 0.48 |
| 2 | 156.2 | 0.45 | 158.0 | 0.49 | 156.7 | 0.52 | 157.5 | 0.45 |
| 3 | 161.6 | 0.50 | 160.7 | 0.55 | 159.6 | 0.48 | 159.9 | 0.46 |
| Mean error | 2.07 | 0.03 | 0.57 | 0.02 | 1.43 | 0.05 | 1.17 | 0.05 |


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Gu, B.; Liu, Q.; Gao, Y.; Tian, G.; Zhang, B.; Wang, H.; Li, H.
Research on the Relative Position Detection Method between Orchard Robots and Fruit Tree Rows. *Sensors* **2023**, *23*, 8807.
https://doi.org/10.3390/s23218807
