1. Introduction
The US is the third-largest cotton-producing country in the world. Cotton producers must stay competitive by adopting the latest technologies. The US cotton industry has a significant impact on the economy, accounting for 190,000 jobs and a turnover of more than
$25 billion per year. This year’s yield forecast of 386 kg per harvested acre was slightly above that of the previous year [
1]. The US cotton industry has a long history of adopting disruptive technologies, starting with the invention of the cotton gin in the 1790s, the adoption of mechanical harvesters in the 1950s, and the development of the module builder in the 1970s [
2]. These technologies significantly decreased labor requirements and allowed the labor required to produce a 218 kg bale of cotton fiber to drop from 140 h in 1940 to less than 3 h today [
3]. Despite the success of technology adoption, US cotton producers still face many challenges. One major challenge is competition from polyester: overproduction in China has driven polyester prices to approximately 50% below those of cotton, which has in turn suppressed cotton prices [
4]. Thus, producers must continue to increase their production efficiency, as a recovery in cotton prices is not on the near horizon. Other challenges facing cotton producers are increased pest resistance, particularly glyphosate-resistant weeds [
5], and early indications of bollworm resistance to Bt cotton (genetically modified cotton that contains genes for an insecticide).
Robots are becoming more integrated into the manufacturing industry. Some examples of these integrations include material handling gantry robots [
6], autonomous transport vehicles [
7], automotive kitting applications [
8], TIREBOT [
9], and iRobot-Factory [
10]. Robots are also applied in other industries (services, construction, mining, transportation/communication, agriculture/forestry/fishery), smart factories, smart buildings, smart homes, smart cities, and even on smart farms, which employ new paradigms and technologies such as Industry 4.0, the internet of things (IoT), cyber-physical systems (CPS), artificial intelligence (AI), and machine learning (ML). Zielinska (2019) [
11] classified industrial robots as those that operate in fully structured environments. In contrast, field robots, like those in the agriculture industry, work in fully unstructured natural environments. Although it was more than three decades ago, Sistler (1987) [
12] provided a review of the different robotic applications and their future possibilities in agriculture. Since then, more robot-based technologies have been adopted in agriculture, implemented through automation and in a range of form factors, e.g., ground-based systems (smart tractors, unmanned ground vehicles [UGVs]), crane-based systems, and aerial-based systems (unmanned aerial vehicles [UAVs]). One rapidly adopted automation in agriculture, for example, is the automated milking system. Salfer et al. (2019) [
13] estimated that over 35,000 milking systems are currently in use worldwide. For row crops, weed control is a significant concern given the rise of herbicide-resistant weeds and the lack of new herbicide modes of action, and robotic systems are one of the proposed solutions [
14]. In the textile industry, growers of cotton
(Gossypium spp.) aim for optimized fiber yield and quality [
15]. The needs of the textile industry and advances in robotics have led to the proliferation and use of mobile robot platforms in the cotton industry. Several applications of robot platforms have been studied, including cotton phenotyping [
16,
17,
18], lint yield prediction [
19], path tracking [
20,
21], monitoring germination [
22], wireless tracking of cotton modules [
23], yield prediction [
24], yield monitoring [
25,
26], and cotton residue collection [
27]. Cotton growers who applied various technologies reported increased field performance and efficient use of resources [
28].
UGVs have been used for different purposes in agriculture. BoniRob is a four-wheel-steering robot with adjustable track width that has been used as a crop scout [
29]. Its sensor suite includes different cameras (3D time-of-flight, spectral) and laser distance sensors. It was initially designed as a phenotyping robot, but weeding functionality was added as its development progressed; it used a hammer-type mechanism to destroy weeds. Unfortunately, BoniRob development was discontinued for an unknown reason. Vinobot is a phenotyping UGV implemented on a popular mobile platform from Clearpath Robotics. Vinobot can measure phenotypic traits of plants using different sensors [
30]. TERRA-MEPP (Transportation Energy Resource from Renewable Agriculture Mobile Energy-crop Phenotyping Platform) is another UGV that was used for high-throughput phenotyping of energy sorghum. It used imaging sensors to measure the plant from both sides as it traverses within rows, thereby overcoming the limitations of bigger UGVs [
31]. A center-articulated hydrostatic rover [
32] was used for cotton harvesting. It used a red, green, and blue (RGB) stereo camera to localize/detect the cotton bolls and a 2D manipulator to harvest the cotton. The system achieved a picking performance of 17.3 s per boll and 38 s per boll under simulated and field conditions, respectively, but the authors indicated that the speed of harvest and rate of successful removal must be improved for commercial use. ByeLab (Bionic eYe Laboratory) is another UGV used to monitor and sense the health status of orchards and vineyards using multiple sensors [
33]. It used two light-detection and ranging (lidar) sensors to determine the plants’ shape and volume and six AgLeader OptRx crop sensors to assess the plants’ health, using the Normalized Difference Vegetation Index (NDVI) from the crop sensors to determine whether the vegetation is healthy or unhealthy. The results for measuring the plants’ thickness using the lidar sensors provided a relatively high
R2 (0.83 and 0.89 for two different experimental layouts). However, no field results were presented. Phenotron is a simulated (modeled) UGV system that includes a lidar, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver [
34]. The field where the robot was tested was also modeled and designed with SketchUp. It used multiple lidar configurations (nodding, tilted, side, and overhead) for phenotyping (canopy volume measurement), tested on a simulated field. The results showed that three configurations (nodding, tilted, and side) produced comparable volume results with an average percent error of 6%, while the overhead configuration had the highest average percent error at 15.2%. The paper noted that the position of the lidar sensor could heavily influence the results due to occlusion. The cotton plant model canopy and shape may also have affected the results, as the model may not have been dense enough to block the lidar. The model canopy used an oval cross-section, but real cotton plants exhibit different shapes, which may result in different volume calculations. This work is very beneficial for cotton breeders, as it speeds up the collection of important plant traits to determine which variety performs best under different inputs (fertilizer, water, management, etc.).
As noted above, robots are increasingly integrated into manufacturing. Although most manufacturing environments are not as complicated as the outdoors, recent advances in sensors and algorithms provide an interesting outlook on how robots will work outdoors with humans. Small commercial UGVs, or mobile ground robots with navigation-sensing modalities, provide a platform to increase farm management efficiency. The Husky platform (Clearpath Robotics) [
35] can be retrofitted with different manifolds that each perform a specific task, e.g., spraying, scouting (with multiple sensors), phenotyping, weeding, or harvesting.
Autonomous robot navigation was developed, and a selective-harvesting proof of concept was designed and field-tested. The robot was retrofitted with a vacuum-type system with a small storage bin. Harvesting performance was evaluated in terms of how effectively the harvester suctioned the cotton bolls and the effective distance between the suction cap and the bolls. The overall objective of this work is to investigate the potential of UGVs for multiple farm operations, e.g., weeding, harvesting, and phenotyping. The specific objectives are to design a selective-harvesting autonomous mobile platform for cotton and to investigate the module’s efficacy in terms of both laboratory and field performance.
2. Materials and Methods
2.1. Mobile Robot Platform and Harvesting Concept Design
The mobile platform used in this work is the Husky A200 (
Figure 1) from Clearpath Robotics. The platform is suitable for field operations, as its 68 cm width fits common cotton row spacings. It is also lightweight, so soil compaction from field traffic is not an issue, in contrast to large farm machines.
The platform is powerful enough to handle payloads of up to 75 kg and can operate at speeds of up to 1 m/s. It has a 24 V direct current (DC) lead-acid battery, which provides 2 h of operation; two new lithium polymer batteries with 6 cells each and a 10 Ah rating provide up to 3 h of operation. The Husky is equipped with an IMU (UM7, CH Robotics, Victoria, Australia), a GPS (Swiftnav, Swift Navigation, CA, USA), individual motors and encoders for each wheel for basic navigation, and a laser scanner (UST-10LX, Hokuyo, Osaka, Japan) for obstacle detection. The IMU has an Extended Kalman Filter (EKF) estimate rate of 500 Hz, with ±2° accuracy in static and ±4° in dynamic pitch and roll. The RTK GPS supports multiple bands (GPS L1/L2, GLONASS G1/G2, BeiDou B1/B2, and Galileo E1/E2), enabling faster convergence to high-precision mode. It has a maximum solution rate of 10 Hz and flexible interfaces, including Universal Asynchronous Receiver/Transmitter (UART), Ethernet, Controller Area Network (CAN), and Universal Serial Bus (USB). The lidar has a scanning range of 0.02–10 m with ±40 mm accuracy, an angular resolution of 0.25°, and a scanning frequency of 40 Hz; it uses Ethernet as its primary communication interface. The lidar and IMU were configured for an update rate of 10 Hz, while the GPS was set to 5 Hz. The robot can be programmed to perform specific tasks like mapping, navigation, and obstacle avoidance through its onboard PC (mini-ITX) running the Ubuntu 16.04 operating system and the Robot Operating System (ROS, Kinetic) framework. A mini liquid crystal display (LCD) screen, keyboard, and pointing device are connected to the onboard PC, allowing the user to easily write and test code, view data, and perform operations.
The cotton harvester module underwent several design revisions. The design requirements were based on the following constraints: the dimensions and payload of the mobile platform, the power supply of the platform, cotton plant height and boll positions, and temporary storage of the harvested cotton bolls. The first few prototypes used a fabric bag for temporary storage, but it was determined that the porous nature of the fabric significantly reduced the system’s suction. The fabric container was therefore replaced with a sealed container, a modified 19 L bucket with a sealed lid. A fluid simulation was performed for the storage container and is presented in
Section 4 (Discussion). The overall design of the cotton-harvesting autonomous platform (CHAP) was modified to accommodate the new collection device, as shown in
Figure 2.
Several different parts were modeled and created via 3D printing because of their unique shapes and requirements, as shown in
Figure 3. An outlet from the collection bucket was created to house a filter to prevent the blower from becoming clogged with cotton, as shown in
Figure 3a,b. The inlets to the collection bucket were designed to divert the cotton to the bottom as shown in
Figure 3c,d.
Figure 3e shows the collection bucket with all the 3D printed parts.
The blower was mounted above the collection bucket, and its mount was 3D printed using a higher-density infill setting to prevent it from failing under the blower’s weight and possible vibration, as shown in
Figure 4a. Several designs for the suction ports on the cotton intake fitting were printed and tested. The multiple suction port design (
Figure 4b) was first envisioned to collect multiple cotton bolls, but testing revealed a significant drawback: it lowers the suction pressure from the blower, which resulted in no bolls being collected most of the time. The final design therefore used only one suction port, as shown in
Figure 4c.
The complete harvesting module used a low-voltage blower motor (McMaster; 12 V DC, 12 A, 1000 rpm, 7 cubic meters per minute (CMM)) attached on top of a 19 L bin. The blower’s 10.16 cm diameter inlet port was connected to the bin’s top cover through a rubber hose. A hole with a 3.17 cm diameter opening was made on one side of the bin to fit a 1.8 m (6 ft) corrugated hose. At the tip of the hose was a 3D-printed nozzle tied to an extrusion frame that extends from one side of the robot. The nozzle’s position was fixed prior to harvesting operations at the height where most cotton bolls were situated along the plant row. A custom-built controller board (ATmega644P, Microchip, AZ, USA) interfaced to the robot’s onboard PC controls the blower motor and the light blinker that serves as a warning device during operation, as shown in
Figure 5. The controller board, blower, and blinker are powered by an external 12 V DC lithium polymer battery (
Figure 5b). All the harvester components were attached to an aluminum extrusion assembly frame that can be easily retrofitted to the robot’s frame. The combined setup of the mounted harvester integrated on the mobile robot platform is shown in
Figure 6.
2.2. Navigation
Autonomous field navigation is achieved by having a digital map of the field and localizing the robot on that map. Localization involves integrating the coordinate frame of the robot with the coordinate frame of the digital map. The robot’s coordinate frame, commonly referred to as its odometry, estimates the robot’s position and orientation over time. The accuracy of the robot’s odometry may be enhanced by integrating it with other positional readings from an IMU or a GPS device. The robot’s position is first determined using the kinematic model in
Figure 7. To simplify calculations, the kinematic model of the four-wheeled robot used in this study was treated as that of a two-wheeled differential-drive robot with virtual wheels WL and WR. The robot’s current position is described by the tuple (xc, yc, α), and its new position (xc, yc, α)′ after time δt is determined from the right and left virtual wheel linear speeds, vR and vL, respectively. The linear speed of each virtual wheel is shown in Equations (1) and (2),
where ω is the angular speed and r is the wheel radius. The angular speed ω and angular position φ of each virtual wheel are the averages of their real counterparts, as shown in Equations (3)–(6). The robot’s angular speed and position are given by Equations (7) and (8). Equations (9) and (10) compute the robot’s x and y displacement components, and the actual position is found using Equations (11) and (12).
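Because the equations themselves appear as figures in the original layout, the differential-drive update they describe can be sketched in code. The following is a minimal sketch of one odometry step under the standard differential-drive kinematics implied by the text; the wheel radius and virtual track width are placeholder values, not the platform’s actual specifications.

```python
import math

WHEEL_RADIUS = 0.165  # m; placeholder value, not the platform's actual spec
TRACK_WIDTH = 0.55    # m; assumed separation of the virtual wheels WL and WR

def update_pose(xc, yc, alpha, omega_l, omega_r, dt,
                r=WHEEL_RADIUS, d=TRACK_WIDTH):
    """One odometry step for the two-wheeled differential-drive model.

    omega_l and omega_r are the angular speeds (rad/s) of the virtual
    left and right wheels, each taken as the average of the two real
    wheels on that side (Equations (3)-(6)).
    """
    v_l = omega_l * r                       # Equations (1)-(2): wheel linear speeds
    v_r = omega_r * r
    omega = (v_r - v_l) / d                 # Equation (7): robot angular speed
    v = (v_r + v_l) / 2.0                   # mean forward speed
    alpha_new = alpha + omega * dt          # Equation (8): new heading
    xc_new = xc + v * math.cos(alpha) * dt  # Equations (9)-(12): new position
    yc_new = yc + v * math.sin(alpha) * dt
    return xc_new, yc_new, alpha_new
```

With equal wheel speeds the robot moves straight along its current heading; a faster right wheel turns it counterclockwise.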
The values of the variables above were stored in a data-serialization language, YAML Ain’t Markup Language (YAML). YAML is commonly used for configuration files, especially in ROS. The YAML file is one of the many files that were set up to correctly configure the ROS navigation stack on the robot’s PC.
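For illustration, such a configuration file might look like the following; the parameter names and values here are hypothetical, not the study’s actual configuration:

```yaml
# Hypothetical kinematic parameters stored for the ROS navigation stack
wheel_radius: 0.165    # m
track_width: 0.55      # m, separation of the virtual wheels
odom_rate: 10          # Hz, odometry update rate
```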
2.3. Robot Operating System (ROS) Navigation Stack and Cotton-Harvesting Autonomous Platform (CHAP) Navigation
The ROS Navigation Stack is an integrated framework of individual software or algorithmic packages bundled together as nodes for steering the robot from one point to the next, as shown in
Figure 8. Users configure the navigation stack by plugging either built-in or custom-built packages into any of the navigation stack nodes. Estimation of the robot’s odometry is therefore handled internally by the nodes in the navigation stack, which automatically load, reference, and update the configuration file during runtime execution of the robot.
In this study, the field’s digital map was generated using GMapping, a variant of the simultaneous localization and mapping (SLAM) algorithm [
36,
37]. GMapping involves fusing the robot’s odometry, GPS, IMU, and laser scanner readings using Kalman filters and Rao-Blackwellized particle filters (RBPF) to determine the robot’s current position and orientation on the map. A custom-built global/local planner package suitable for steering the robot within plant rows and avoiding obstacles using the laser scanner was then developed. For navigation, a graphical user interface (GUI) showing the map of the field allows users to either click points/segments on the map or hardcode the coordinates to which the robot will navigate autonomously. These points/segments could be the locations of the cotton bolls to be harvested.
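As a simple illustration of relating GPS fixes to the map frame (the actual fusion described above uses Kalman and particle filters, so this is only a sketch), a field-scale equirectangular approximation can convert a fix to local metres relative to the map origin. The function name and projection choice are illustrative assumptions, not the study’s implementation.

```python
import math

EARTH_RADIUS = 6371000.0  # mean Earth radius, m

def gps_to_local(lat, lon, origin_lat, origin_lon):
    """Convert a GPS fix (degrees) to local (x, y) metres relative to a
    map origin using an equirectangular approximation, which is adequate
    over field-scale distances."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    x = EARTH_RADIUS * d_lon * math.cos(math.radians(origin_lat))  # east
    y = EARTH_RADIUS * d_lat                                       # north
    return x, y
```

A fix 0.001° of latitude north of the origin maps to roughly 111 m in the +y direction.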
A computer program for the navigation and control of CHAP was developed. The program was tested via computer simulation using a pre-built model of the platform. For proper navigation, support devices like the GPS, IMU, and laser scanner were calibrated for the simulation to work. The results are shown in
Figure 9. The lower-right image shows the simulated playpen where the platform navigates, and the lower-left image shows its path and movement. The upper two terminals show the status readings of the different devices.
An overview of the hardware and software components needed for CHAP to navigate is shown in
Figure 10. CHAP needs to know its location on a given map in order to navigate. To do this, it needs readings from several sensors (GPS, IMU, and laser) to steer its wheel motors in the correct direction. Two essential tests (GPS accuracy and lidar precision) were conducted to configure and calibrate the hardware and software components and are presented in
Section 2.6.
2.4. Cotton Boll Harvest
To harvest the cotton bolls on a plant, a common reference coordinate was first established between the robot base and the plant location, as shown in
Figure 11. The relative positions of both the cotton bolls and the nozzle were then defined from this reference coordinate. Prior to harvesting, the average boll heights and boll offsets for each row or plot to be sampled were measured to calibrate the nozzle position. The nozzle was positioned where most of the cotton bolls were expected. The robot was then programmed to keep track of its position relative to the plants as it navigated along the rows. The robot also monitored the user’s points/segments, which signal the activation or deactivation of the blower motor and blinker to start or end harvesting.
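The activation logic described above can be sketched as a simple check of the robot’s along-row position against the user-defined segments; the function below is an illustrative sketch, not the study’s actual control code.

```python
def blower_on(position_m, harvest_segments):
    """Return True when the robot's along-row position (m) falls inside
    any user-defined harvest segment (start_m, end_m), signalling that
    the blower motor and warning blinker should be active."""
    return any(start <= position_m <= end for start, end in harvest_segments)
```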
2.5. Study Site
An 82 m × 142 m, 29-row field of cotton (Deltapine 1358 B2XF) on loamy sand was established with 96.5 cm row spacing and 10 cm in-row plant spacing at the Edisto Research and Education Center in Blackville, SC (33.3570° N, 81.3271° W). Seeding was done in early May and harvesting in the last week of November. Regular crop management practices were applied during the growing season. Laboratory and field tests were conducted to configure and evaluate the navigation and harvesting systems’ performance before they were integrated.
2.6. Global Positioning System (GPS) and Laser Scanner Testing and Mapping
Key components of the navigation and harvester systems were first tested to determine whether they satisfied the study’s objective and requirements. The onboard GPS, laser scanner, and other important components (e.g., blower motors, hose size, nozzle type and size) were all subjected to test trials, but only the GPS and laser scanner tests are presented here, as these were the critical factors in autonomous navigation. For the GPS, outdoor waypoint measurements were conducted to determine the accuracy of the longitude and latitude readings per waypoint; seven sampling sites were chosen for this test. The laser scanner’s resolution was tested to determine whether it could detect the diameter of a cotton stem in a simulated experiment. In this test, 6.25 mm diameter bolts were used as simulated cotton stems; the bolts were lined up along the left and right edges, and, due to their length, the laser scanner was moved to the lower front of the robot. For mapping, testing took place in both indoor and outdoor environments. The indoor navigation test was conducted in the Sensor and Automation Laboratory and throughout the building where the laboratory is located. The outdoor mapping test was conducted in the field on five rows.
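A waypoint-accuracy test of this kind typically reduces repeated fixes at a surveyed point to error statistics. A minimal sketch (not the study’s actual analysis script) is:

```python
import math
import statistics

def horizontal_error_stats(fixes, reference):
    """Mean and standard deviation (m) of the horizontal error of
    repeated local (x, y) GPS fixes around a surveyed reference point."""
    errors = [math.hypot(x - reference[0], y - reference[1])
              for x, y in fixes]
    return statistics.mean(errors), statistics.stdev(errors)
```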
2.7. Performance Evaluation
The robot’s cotton-harvesting performance was first evaluated in the laboratory in terms of how effectively the harvester suctioned the cotton bolls and how close the bolls needed to be to the nozzle. These tests helped calibrate the positioning of the robot relative to the plant and the positioning of the nozzle during actual field harvesting. Two rows in the field were initially identified as the sampling rows. Ten cotton plant samples per row were taken to the lab; each plant’s stem was cut above ground with all other parts of the plant intact. Plant samples were taken approximately two weeks after defoliation.
2.7.1. Lab Tests for Effective Suctioning Distance
Ten random cotton bolls per row were selected from the plant samples. Each cotton boll had about 4 to 5 locks. The nozzle was clamped to a vise grip, and a ruler was placed alongside its front opening to determine the suction distance. The blower motor was activated manually before individual cotton bolls were hand-drawn closer to the nozzle until they were suctioned. The distance between the nozzle’s tip and the edge of the cotton boll facing the nozzle was then measured and recorded.
2.7.2. Lab Tests for Suctioning Locks per Boll
A lab setup that simulated actual field harvesting was constructed. Plants were lined up and mounted in a makeshift rack inside the lab. Ten random cotton bolls per row were selected from the plant samples. The bolls of each plant targeted for harvesting were all aligned along the traversal path of the nozzle. The robot was then programmed to traverse along the simulated plant row with the harvester activated, at a traversal speed of 0.5 m/s. The number of locks successfully suctioned per boll per plant was recorded at the end of each pass. There were about five passes for each boll; on each pass, either the plant or the nozzle’s position was adjusted so that all the locks on a boll were tested for harvesting. Each boll was scored as the ratio of the number of locks suctioned to the total number of locks for that boll.
2.7.3. Field Tests for Harvester Performance
Cotton boll yield sampling based on Goodman et al. (2003) [
38] was applied during the field tests. The yield estimate is based on a standard sample length of 3 m of row. The bolls in the 3 m row would normally be picked and weighed to obtain a better result; however, assuming an average boll weight of ~4 g, counting the number of bolls is sufficient to predict the yield. In our case, we were interested in how many bolls our suction system could harvest in a 3 m row. Note that since we only had one suction cap, the number of cotton bolls that could be harvested was limited to that particular nozzle position. Two 3 m row sampling locations (RA and RB) were selected and each was subdivided into three subplots, labeled RA1 to RA3 and RB1 to RB3, as shown in
Figure 12a. Each subplot had 10 consecutive cotton plants. The 3 m rows were subdivided into subplots to account for the variability observed at the time of the measurements; differences in plant height and cotton boll opening along the row were visually apparent. To determine the position of the cotton bolls on each plant, measurements were made of the plant height, boll distance from the stem (boll offset), and boll height above ground. These measurements were used to calibrate the position of the harvester nozzle, targeting where most of the bolls were located, as shown in
Figure 12b. Measurements were all made two to three weeks after defoliation. The robot’s harvesting performance was evaluated by counting the number of cotton bolls harvested from the sampled rows.
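The yield-sampling arithmetic above can be made concrete: under the ~4 g average boll weight assumption, a sampled 3 m row’s yield estimate is simply the boll count scaled by that weight. A minimal sketch (the function names are illustrative, not from the study):

```python
def row_yield_g(boll_count, avg_boll_weight_g=4.0):
    """Estimated yield (g) for a 3 m sample row, assuming an average
    boll weight of ~4 g per boll (Goodman et al., 2003)."""
    return boll_count * avg_boll_weight_g

def harvest_rate(harvested_bolls, total_bolls):
    """Fraction of bolls in the sampled row that the suction system
    successfully harvested."""
    return harvested_bolls / total_bolls
```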