Special Issue "Agricultural and Field Robotics"

A special issue of Robotics (ISSN 2218-6581).

Deadline for manuscript submissions: closed (31 December 2018)

Special Issue Editors

Guest Editor
Prof. Dr. Qin Zhang

Director, Center for Precision and Automated Agricultural Systems, Washington State University, 24106 N. Bunn Road, Prosser, WA 99350, USA
Interests: agricultural robotics; intelligent agricultural equipment; agricultural automation
Guest Editor
Prof. Dr. Manoj Karkee

Associate Professor, Department of Biosystems Engineering, Center for Precision and Automated Agricultural Systems, Washington State University, 24106 N. Bunn Road, Prosser, WA 99350, USA
Interests: machine vision; field robotics; human–machine collaboration; sensing and control; agricultural system modeling and simulation

Special Issue Information

Dear Colleagues,

Agriculture is one of the oldest and most important industries that human civilization has established. Fundamentally, agriculture relies on the efficient use of natural resources, such as land, water, nutrients, and other inputs, to produce the basic necessities of human life, including food, fiber, feed, and fuel. The United Nations predicts that the world’s population will reach approximately 10 billion by 2050. The continuously increasing pressure to feed this rapidly growing population presents a huge challenge to the agricultural industry: how to sustainably produce enough agricultural products to meet such demand.

Agricultural mechanization, the use of machinery to perform laborious operations, has helped improve agricultural productivity through more efficient use of labor, increased timeliness of operations, and more efficient input management. Continuing advancement in agricultural mechanization and automation technologies in recent decades has led agriculture into an era of robotic farming. Agricultural robots can be broadly defined as intelligent machines that exhibit behaviors similar to those of a human operator, such as perception, reasoning, and manipulation in farming settings, to perform predetermined operations and tasks with or without human supervision. Such robotic technologies have the potential to further reduce the use of labor and increase the precision and efficiency of production inputs, thus contributing to increased agricultural productivity and the long-term sustainability of the industry.

Agricultural products can be broadly grouped into food, feed, and raw materials for various other products, all cultivated differently in different geographic regions around the world. This results in a wide variation in the mechanisms, technologies, and machines required to complete different agricultural operations or handle special agricultural challenges. For example, after a few decades of research and development, thousands of milking robots have been installed on dairy farms, tractors are auto-guided and auto-steered in performing different field operations, and drones are offering unique and novel applications in agriculture worldwide to improve productivity and reduce labor and input use. This is just the beginning of what is expected to be a revolution in the way the agricultural industry operates. The objective of this Special Issue is, therefore, to promote a deeper understanding of major conceptual and technical challenges and to facilitate the spread of recent breakthroughs in agricultural robotics. By achieving this objective, this Special Issue is expected to enable safe, efficient, and economical agricultural production, and to advance the state of the art in sensing, mobility, manipulation, and management technologies applied to the production of grain, fruit, vegetables, meat, milk, and other agricultural products.

Topics of interest include (but are not limited to):

  • Sensing technologies for situation awareness in agricultural applications
  • Control strategies for robot manipulation in agricultural applications
  • Automatic guidance of robotic vehicles at agricultural sites
  • UASs or drones in agriculture
  • Robotics for row crop production
  • Robotics for specialty crop production (including fruits and vegetables)
  • Robotics for greenhouse and vertical farming systems
  • Robots for animal production
  • Machine learning and artificial intelligence in agriculture
  • Applications and supervision of agricultural robots
  • Management and maintenance of agricultural robots
  • Economic analysis of robotic farming

Prof. Dr. Qin Zhang
Prof. Dr. Manoj Karkee
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Robotics is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 350 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Machine vision and other sensing systems
  • Modeling, simulation, and controls
  • Navigation and guidance
  • End-effectors and manipulators
  • Artificial intelligence
  • Soft computing and machine learning
  • Autonomous operations
  • Task management
  • Operation supervision
  • Robot maintenance
  • Application economics
  • Robot operator training

Published Papers (4 papers)


Research

Open Access Article: Leaf Area Estimation of Reconstructed Maize Plants Using a Time-of-Flight Camera Based on Different Scan Directions
Received: 19 September 2018 / Revised: 8 October 2018 / Accepted: 11 October 2018 / Published: 11 October 2018
Abstract
Leaf area is an important plant parameter for assessing plant status and crop yield. In this paper, a low-cost time-of-flight camera, the Kinect v2, was mounted on a robotic platform to acquire 3-D data of maize plants in a greenhouse. The robotic platform drove through the maize rows and acquired 3-D images that were later registered and stitched. Three different maize row reconstruction approaches were compared: merging point clouds generated from both sides of the row in both directions, merging point clouds scanned from just one side, and merging point clouds scanned from opposite directions of the row. The resulting point cloud was subsampled and rasterized, and the normals were computed and re-oriented with a Fast Marching algorithm. Poisson surface reconstruction was applied to the point cloud, and new vertices and faces generated by the algorithm were removed. The results showed that the approaches of aligning and merging four point clouds per row and two point clouds scanned from the same side generated very similar average mean absolute percentage errors of 8.8% and 7.8%, respectively. The worst error, 32.3%, resulted from the two point clouds scanned from both sides in opposite directions.
(This article belongs to the Special Issue Agricultural and Field Robotics)
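Two of the numerical steps this abstract describes, thinning a merged point cloud and scoring leaf-area estimates by mean absolute percentage error, can be sketched as follows. This is an illustrative NumPy sketch on toy data, not the authors' implementation; the function names, voxel size, and sample values are hypothetical.

```python
import numpy as np

def voxel_subsample(points, voxel_size):
    """Subsample a point cloud by keeping one centroid per occupied voxel."""
    # Assign each point to an integer voxel index.
    idx = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel and average the points in each group.
    _, inverse = np.unique(idx, axis=0, return_inverse=True)
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]

def mape(estimated, reference):
    """Mean absolute percentage error, the score used for leaf-area estimates."""
    estimated, reference = np.asarray(estimated), np.asarray(reference)
    return 100.0 * np.mean(np.abs(estimated - reference) / reference)

# Merge two scans of the same row, then thin the combined cloud.
scan_a = np.random.rand(1000, 3)
scan_b = np.random.rand(1000, 3)
merged = np.vstack([scan_a, scan_b])
thinned = voxel_subsample(merged, voxel_size=0.1)
print(thinned.shape)
print(mape([92.0, 108.0], [100.0, 100.0]))  # 8% average error on toy values
```

The centroid-per-voxel rule keeps the cloud's overall shape while bounding its density, which is what makes a later surface reconstruction step tractable.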

Open Access Article: A Novel Multirobot System for Plant Phenotyping
Received: 30 July 2018 / Revised: 7 September 2018 / Accepted: 12 September 2018 / Published: 26 September 2018
Abstract
Phenotypic studies require large datasets for accurate inference and prediction. Collecting plant data on a farm can be very labor-intensive and costly. This paper presents the design, architecture (hardware and software), and deployment of a multi-robot system for row crop field data collection. The proposed system has been deployed in a soybean research farm at Iowa State University.

Open Access Feature Paper Article: Smart Agricultural Machine with a Computer Vision-Based Weeding and Variable-Rate Irrigation Scheme
Received: 4 June 2018 / Revised: 1 July 2018 / Accepted: 17 July 2018 / Published: 19 July 2018
Abstract
This paper proposes a scheme that combines computer vision and multi-tasking processes to develop a small-scale smart agricultural machine that can automatically weed and perform variable-rate irrigation within a cultivated field. Image processing methods such as HSV (hue (H), saturation (S), value (V)) color conversion, estimation of thresholds during the image binary segmentation process, and morphology operator procedures are used to confirm the positions of the plants and weeds, and those results are used to perform weeding and watering operations. Furthermore, data on the wet distribution area of surface soil (WDAS) and the moisture content of the deep soil are provided to a fuzzy logic controller, which drives pumps to perform variable-rate irrigation and achieve water savings. The proposed system has been implemented in small machines, and the experimental results show that the system can classify plants and weeds in real time with an average classification rate of 90% or higher. This allows the machine to weed and water while maintaining the moisture content of the deep soil at 80 ± 10%, with an average weeding rate of 90%.
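The HSV-based binary segmentation step mentioned in this abstract can be illustrated roughly as follows: convert each RGB pixel to HSV and threshold the hue and saturation channels to separate green vegetation from background. The threshold values below are hypothetical, not those estimated in the paper.

```python
import colorsys

def vegetation_mask(rgb_pixels, hue_range=(0.20, 0.45), min_sat=0.25):
    """Binary segmentation: mark pixels whose hue falls in the green band.

    rgb_pixels: iterable of (r, g, b) tuples with channels in [0, 1].
    hue_range and min_sat are illustrative thresholds, not the paper's values.
    """
    mask = []
    for r, g, b in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        mask.append(hue_range[0] <= h <= hue_range[1] and s >= min_sat)
    return mask

# A green (plant-like) pixel and a brown (soil-like) pixel.
pixels = [(0.2, 0.7, 0.2), (0.5, 0.35, 0.2)]
print(vegetation_mask(pixels))  # → [True, False]
```

In a real pipeline this mask would then be cleaned up with the morphology operators the abstract mentions (erosion/dilation) before plants and weeds are distinguished.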

Open Access Article: Combining Hector SLAM and Artificial Potential Field for Autonomous Navigation Inside a Greenhouse
Received: 23 March 2018 / Revised: 17 May 2018 / Accepted: 19 May 2018 / Published: 22 May 2018
Abstract
The key factor for autonomous navigation is efficient perception of the surroundings while being able to move safely from an initial to a final point. In this paper, we deal with a wheeled mobile robot working in a GPS-denied environment typical of a greenhouse. The Hector Simultaneous Localization and Mapping (SLAM) approach is used to estimate the robot’s pose using a LIght Detection And Ranging (LIDAR) sensor. Waypoint following and obstacle avoidance are ensured by means of a new artificial potential field (APF) controller presented in this paper. The combination of Hector SLAM and the APF controller allows the mobile robot to perform periodic tasks that require autonomous navigation between predefined waypoints. It also provides the mobile robot with robustness to changing conditions that may occur inside the greenhouse, caused by the dynamics of plant development through the season. In this study, we show that the robot is safe to operate autonomously in the presence of humans, and that, in contrast to classical odometry methods, no calibration is needed for repositioning the robot over repetitive runs. We include both hardware and software descriptions, as well as simulation and experimental results.
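The general artificial potential field idea behind such a controller can be sketched as below: an attractive force pulls the robot toward the next waypoint while repulsive forces push it away from nearby obstacles (e.g., ones detected by the LIDAR). This is a textbook-style sketch, not the paper's controller; the gains and influence radius are illustrative values.

```python
import numpy as np

def apf_step(pose, goal, obstacles, k_att=1.0, k_rep=0.5, d0=1.0):
    """One artificial-potential-field update: return a unit heading vector.

    pose, goal: 2-D positions; obstacles: list of 2-D obstacle positions.
    Gains k_att, k_rep and influence radius d0 are illustrative values.
    """
    pose, goal = np.asarray(pose, float), np.asarray(goal, float)
    # Attractive force: proportional to the vector toward the goal.
    force = k_att * (goal - pose)
    for obs in obstacles:
        diff = pose - np.asarray(obs, float)
        d = np.linalg.norm(diff)
        # Repulsion acts only inside the influence radius d0.
        if 0.0 < d < d0:
            force += k_rep * (1.0 / d - 1.0 / d0) / d**2 * (diff / d)
    norm = np.linalg.norm(force)
    return force / norm if norm > 0 else force

# With no obstacle inside the influence radius, the robot heads
# straight for the waypoint.
heading = apf_step([0.0, 0.0], [5.0, 0.0], obstacles=[[2.0, 2.0]])
print(heading)  # → [1. 0.]
```

A known limitation of plain APF is local minima where attraction and repulsion cancel; pairing it with a SLAM-built map, as this paper does, is one way to choose waypoints that avoid such traps.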

Robotics EISSN 2218-6581, published by MDPI AG, Basel, Switzerland.