2.1. Rice-Duck Farming
Rice-duck farming is an environmentally friendly rice cultivation method that employs neither chemical fertilizers nor pesticides. Although hybrid ducks are generally used for rice-duck farming, farmers in northern Japan use mallards (Anas platyrhynchos) because of their value as livestock products. Considering these regional characteristics, this study specifically examines rice-duck farming using mallards.
Figure 2 depicts rice-duck farming using mallards.
Regional competition in rice production in Japan has become increasingly severe since the rice reduction policy was abolished in 2018. Rice farmers are expected not only to secure a safe food supply for the nation, but also to shift to new rice production styles that consumers demand and that are consequently rewarded in the market. One solution is organic rice production. To reduce the environmental load of agricultural production, organic farming uses no agricultural chemicals, including fertilizers and pesticides. Because it draws on natural production capacity, organic farming has come to be regarded as an effective approach to improving the value of products.
Since ancient times, mallards have been domesticated as poultry for human consumption. Mallards are used not only for rice-duck farming, but also for meat because their smell is not strong. Mallards eat the leaves, stems, seeds, and shells of plants. For weeding and pest control, mallards eat aquatic weeds such as Echinochloa esculenta, Cyperus microiria, and Juncus effusus, aquatic insects such as Lissorhoptrus oryzophilus, Sogatella furcifera, Nilaparvata lugens, and Laodelphax striatellus, and river snails in a paddy field. Mallard movements in a paddy field also produce the positive effect of constant paddling: the turbid water suppresses photosynthesis of weeds below the surface, thereby preventing weed growth. Moreover, mallards not only provide feces as nutrients for the growing rice, but also stimulate the rice plants through contact as they move. A paddy field is a place of abundant water and life for mallards [17]. Moreover, the grown paddy rice can provide refuge from natural enemies.
Organic farming using hybrid ducks or mallards is commonly known as rice-duck farming. The ducks provide weed control and pest control in paddy fields. As an example, the price of rice produced using rice-duck farming is up to three times higher than that of conventional farming using agricultural chemicals and fertilizers. Therefore, improved stability for farmers is expected. Generally, organic farming without pesticides requires more labor than conventional farming for weed control and cultivation management. Especially in underpopulated rural areas with population aging and labor shortages, rice-duck farming is extremely attractive because it provides a substitute for human labor. However, one important shortcoming is that a flock of ducks tends to gather in a specific area, which then becomes a spring pond; no rice grows in such a spring pond because the seedlings are pushed down. Another shortcoming is the ineffective weed control in areas outside of the ducks' active moving areas. We consider that robotics technologies, especially small robots [18,19], have the potential to resolve these shortcomings. The aim of this study was to develop autonomous mobile robots that guide mallards to realize highly efficient rice-duck farming; to that end, we developed prototypes of three robot models that navigate mallards.
2.2. Rice-Duck Robots
Yasuda et al. [20] developed a brush-roller-type paddy weeding robot that floats over a paddy field as an air-cushion vehicle (ACV) using a hovercraft mechanism. For the brush roller, they used a tension member of glass fiber coated with polypropylene. Rotation of the brush roller behind the robot actualized weeding down to the roots of the rice. Although the blowers that fed air into the skirts and the brush roller were driven by dedicated motors, a generator with a gasoline engine was used because the robot had no battery. For locomotion in a large paddy field, they provided not only manual operation, but also an automatic pilot using GPS. Their prototype robot, intended for practical use, was able to weed 10,000 m² in up to 4 h. However, miniaturization remained important future work because the body, 1900 mm long, 1860 mm wide, and 630 mm high, was too large, especially for carrying on the light trucks used for transportation by small-scale farmers in Japan. Furthermore, the brush roller using the tension member damaged the rice; an important difficulty was that the rice yield decreased by up to 30%. Although they considered an alternative cultivation approach of increasing the number of planted seedlings as a countermeasure against damage to the paddy rice, the production cost and the rice quality remained subjects for future work in this area.
For controlling weed growth with soil agitation, a small weeding robot named iGAMO was developed by the Gifu Prefectural Research Institute of Information Technology, Japan [
21]. This robot was improved in collaboration with an agricultural machine manufacturer for practical use and dissemination [
22]. The main body is 580 mm long, 480 mm wide, and 520 mm high. Two crawlers with a 150 mm gap provide locomotion to straddle a row of rice plants. The robot achieved autonomous locomotion based on rice plant distribution information detected using a near-infrared (IR) camera and two position-sensitive device (PSD) depth sensors. Two 1.41 N·m motors driven by a 180 Wh battery provided continuous operation of up to 3 h with a working efficiency of 1000 m²/h. Moreover, they improved the weeding efficiency using not only the stirring of the crawlers, but also metal chains that scraped the soil surface. They compared their robot with a conventional weeding machine and actual hybrid ducks in several paddy fields. The experimentally obtained results revealed the amounts of residual weeds and the differences in yield and rice grade. Sori et al. [23] proposed a rice paddy weeding robot with paddle wheels instead of crawlers. The main body is 428 mm long, 558 mm wide, and 465 mm high. The two paddle wheels provide locomotion to straddle a row of rice plants. The experimentally obtained results revealed that the number of tillers increased when using their robot compared with the case without weeding.
Nakai et al. [
24] described a small weed-suppression robot. The main body is 400 mm long, 190 mm wide, and 250 mm high. The robot achieved locomotion in a passage width of 300 mm, which is the standard rice plant interval in Japan. The robot accommodates a tri-axial manipulator with an iron brush that improves weeding efficiency in combination with crawler-based weeding. For autonomous locomotion, they actualized stable movement in rough paddy fields using a laser range finder (LRF). However, hybrid ducks provide not only weeding, but also pest predation and excrement, the latter of which provides nutrients for the rice; weeding robots provide no such effects. Moreover, hybrid ducks do not move throughout a paddy field as intended because they have no awareness of or responsibility for the agricultural work. Therefore, improving weeding efficiency, pest control, and nutrient injection from excrement remain challenging tasks. An approach that combines robots and hybrid ducks is positioned as an excellent solution for outdoor cultivation in conventional farming.
Yamada et al. [
25] proposed an autonomous mobile robot that navigated hybrid ducks. They conducted an imprinting experiment for baby hybrid ducks using the robot. The main body size is similar to the mean size of parent hybrid ducks. They used crawlers as the locomotion mechanism of the robot in the paddy field. In the imprinting experiment, they placed a baby hybrid duck, 48 h after hatching, in a square box with 300 mm sides. They applied visual stimulation for 45 min, repeated six times. They confirmed that baby hybrid ducks acted according to robot behavior patterns. This result demonstrated that imprinting using a robot was possible for baby hybrid ducks. The robot appearance need not be similar to that of parent hybrid ducks: imprinting also occurred on quadrangular objects with no pattern. Moreover, they conducted induction experiments with up to four baby hybrid ducks that had hatched seven days before. The experimentally obtained results revealed the effectiveness of imprinting when comparing feeding with bait as a reward and the imprinted stimulus itself as a direct reward. Nevertheless, no experiment was conducted in actual paddy fields. In an artificial environment 1500 mm long and 2000 mm wide, the robot merely repeated a reciprocating motion on an acrylic board. Regarding the influence of duck calls, no significant effect was found.
Moreover, Yamada et al. [
26] noted that, as a practical condition for farmers, about one week has already passed after hatching when baby hybrid ducks are introduced. They therefore conducted not only imprinting on ducklings that had passed the critical period of imprinting, but also induction experiments with feeding. This experiment was conducted in a paddy field. The robot carried a camera, a speaker used for sound reproduction, and two pyroelectric IR sensors at the rear. In addition, a feeding port for feed learning was provided at the tail end of the robot. The feed port opened and closed automatically according to the sensor conditions. The pyroelectric IR sensors covered from 0.3 m through 0.5 m for near measurements and from 0.5 m through 1.5 m for remote measurements to detect objects, including obstacles. Although they performed 45 min × 6 imprinting operations for two hybrid ducks, they concluded that no approach behavior toward the autonomous mobile robot was observed after the critical period. However, the respective behaviors after playing a parent bird call from the stationary robot revealed that feed-learned hybrid ducks approached the feed outlet of their robot system. Moreover, the results revealed, as the effect of feed learning, that feed-learned individuals were more likely to follow the robot than unlearned individuals. They inferred that the hybrid ducks learned that the robot was harmless because they contacted it several times. However, the movement ranges of the robot and the hybrid ducks in the evaluation experiment were only about 1.50 m². Compared with an actual paddy field, the result of this experiment is limited to induction within a narrow range. The robot guided two hybrid ducks while stopped and one hybrid duck while moving. Therefore, no indication was given for feed-based guidance of hybrid ducks, which show group behavior as a basic habit.
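The sensor-gated feeding mechanism described above can be summarized in a brief sketch. The decision rules, function name, and hold behavior below are our assumptions for illustration; they are not the published control logic of [26], which only reports the two detection bands.

# Illustrative sketch of a sensor-gated feed port as described for the robot in [26].
# The rules and interfaces here are assumptions; only the two detection bands
# (near: roughly 0.3-0.5 m, remote: roughly 0.5-1.5 m) come from the text above.

def update_feed_port(near_detected: bool, remote_detected: bool, port_open: bool) -> bool:
    """Return the next open/closed state of the feed port at the tail end of the robot."""
    if near_detected:        # a duck is within the near band: open the port as a reward
        return True
    if remote_detected:      # a duck is approaching within the remote band: hold the state
        return port_open
    return False             # nothing detected: keep the port closed

print(update_feed_port(near_detected=True, remote_detected=False, port_open=False))  # -> True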
2.3. Animal–Robot Interaction
Animal–robot interaction (ARI) is an extended concept of human–robot interaction (HRI) and the human–robot relation based on ethorobotics, which relies on evolutionary, ecological, and ethological concepts for developing social robots [27,28]. One important factor for ARI is that developers should have full knowledge of animal behaviors and emotions so that a robot can understand the animal's needs sufficiently for natural interaction [29]. Moreover, robot behaviors, such as sudden movements, should be well designed in consideration of animal ethology [29]. Recently, ARI studies and technologies have come to represent a relatively novel research field of bio-robotics and are opening up new opportunities for multidisciplinary studies, including biological investigations as well as bio-inspired engineering design [30]. Numerous ARI studies have been conducted, especially on zebrafish [31,32,33,34,35,36,37], the green bottle fly (Lucilia sericata) [38], squirrels, crabs, honeybees, rats, and other animal species, including studies on interactive bio-robotics [39]. In the case of waterfowl, two representative studies are as follows.
Vaughan et al. [
40] developed a mobile sheepdog robot that maneuvers a flock of ducks to a specified goal position. After verifying the basic characteristics through simulations, they evaluated the navigation of 12 ducks to an arbitrarily set goal position in an experimental arena with a 7 m diameter created as an actual environment. Nevertheless, they reported no specifications such as the locomotion performance of their robot.
Henderson et al. [
41] navigated domestic ducks (
Anas platyrhynchos domesticus) using two stimuli: a small mobile vehicle and a walking human. They navigated 37 adult ducks in 6 flocks on a donut-shaped experimental course. The experimentally obtained comparison results demonstrated differences in navigation capability between humans and robots based on the index of mean latency to return to food.
2.4. Autonomous Locomotion and Navigation
Numerous state-of-the-art robots, including flying robots such as unmanned aerial vehicles (UAVs), have been proposed for autonomous locomotion and navigation. Chen et al. [42] proposed a stable legged-walking control strategy based on multi-sensor information feedback for developing a large-load parallel hexapod wheel-legged robot. They developed a mobile robot with six legs and six wheels for use in complex terrain environments. The results revealed that their proposed active compliance controllers based on impedance control reduced the contact impact between the foot end and the ground, improving the stability of the robot body. Moreover, they actualized anti-sliding ability by introducing swing-leg retraction, which provided stable walking in complex terrain environments.
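The principle behind such active compliance can be illustrated with a minimal single-axis sketch in Python. The virtual mass, damping, and stiffness values, the control period, and the interface below are hypothetical illustration parameters; they are not the controller reported in [42].

# Minimal single-axis impedance-control sketch (illustrative assumptions only).
# The virtual dynamics M*x_dd + B*x_d + K*(x - x_ref) = f_ext convert a measured
# contact force into a compliant position offset, softening foot-ground impact.

M, B, K = 2.0, 40.0, 400.0    # virtual mass [kg], damping [N*s/m], stiffness [N/m]
DT = 0.001                    # assumed control period [s]

x, x_d = 0.0, 0.0             # foot-end offset [m] and its velocity [m/s]

def impedance_step(f_ext: float, x_ref: float = 0.0) -> float:
    """Integrate the virtual impedance dynamics for one control cycle."""
    global x, x_d
    x_dd = (f_ext - B * x_d - K * (x - x_ref)) / M
    x_d += x_dd * DT
    x += x_d * DT
    return x                  # compliant foot-end position command

# Example: a 50 N contact spike lasting 0.1 s produces a small yielding motion.
for _ in range(100):
    foot_cmd = impedance_step(f_ext=50.0)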
Li et al. [
43] developed a wheel-legged robot with a flexible lateral control scheme using a cubature Kalman filter algorithm. They proposed a fuzzy compensation and preview angle-enhanced sliding mode controller to improve tracking accuracy and robustness. The simulations and experimentally obtained results demonstrated that their proposed method achieved satisfactory performance in high-precision trajectory tracking and stability control of their mobile robot.
Chen et al. [
44] proposed an exact formulation based on mixed-integer linear programming to search the solution space fully and produce optimal flight paths for autonomous UAVs. They also designed an original clustering-based algorithm to classify regions into clusters and obtain approximately optimal point-to-point paths for UAVs. The experimentally obtained results with randomly generated regions demonstrated the efficiency and effectiveness of their methods for the coverage path planning problem of autonomous heterogeneous UAVs over a bounded number of regions.
In a paddy field, mallards often concentrate in a specific area because of their swarming habit. Some areas therefore retain persistent weeds because the mallards do not disperse; mallards do not weed a whole paddy field uniformly. Moreover, a stepping pond occurs where all of the paddy rice has been eaten by mallards. As a different approach, some farmers use feed to navigate mallards. The difficulty of this approach is the burden it places on humans, especially for large paddy fields. Therefore, farmers must weed such areas using a weeding machine. For this study, we examined three navigation approaches for mallards: imprinting, pheromone tracking, and feeding.
Imprinting is a unique behavior observed in nidifugous birds such as ducks, geese, and chickens [
45]. Imprinting is a contact-and-follow response to a stimulus that is received for the first time during a short period after hatching. Moreover, the following response is enhanced by sounds or moving shadows. For this study, we specifically examined imprinting-based navigation using a small robot as the imprinting target for baby mallards.
Pheromones are chemicals that are produced inside the body and secreted outside of it, promoting changes in the behavior and development of conspecific individuals [46]. Pheromones are used mainly by insects to communicate with conspecific individuals. Although no pheromone is available to navigate mallards directly, we considered indirect navigation using insects favored by mallards. Specifically, we devised an indirect usage that navigates mallards using insects gathered by a pheromone trap attached to a robot. Although baby mallards attempt to eat insects, adult birds have no interest in them; we therefore consider that the efficiency of this approach decreases as mallards develop. For this study, we conducted no experiments using pheromone-based navigation because of the difficulty of the procedures using insects and the weak overall effects.
For rice-duck farming, breeders use feed to gather ducks. Breeders give ducks only a minimum amount of feed because ducks stop eating weeds if too much feed is given. We expect feeding-based navigation to be effective for adult mallards because imprinting can be performed only during the baby mallard period. Yamada et al. tested the effects of feed learning and navigation for hybrid ducks; however, they described no test of feed learning for mallard navigation. For this study, we specifically examined a method of navigating mallards using feed combined with a small robot.
For actualizing self-driving cars, research and development of autonomous locomotion have been conducted actively [
47] in the field of automated driving using multiple sensors [
48]. Murase et al. [
49] applied a convolutional neural network (CNN) in their proposed method for automatic automobile driving to improve control precision and accuracy. Their CNN model was trained with vehicle states, using on-board camera images and vehicle speeds as system inputs and the volumes of steering, acceleration, and brake operations as system outputs. The experimentally obtained results demonstrated the effectiveness of combining time-series images and CNNs for automated driving. Although the method exhibited usefulness against weather and other changes, the evaluation results using video datasets remain at the level of simulations.
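As a rough illustration of this type of end-to-end pipeline, the following Python (PyTorch) sketch maps an on-board camera image and the current vehicle speed to steering, acceleration, and brake volumes. The network shape, layer sizes, and input resolution are assumptions for illustration and are not taken from [49].

import torch
import torch.nn as nn

class DrivingCNN(nn.Module):
    """Hypothetical end-to-end driving model: camera image + speed -> control volumes."""
    def __init__(self):
        super().__init__()
        # Convolutional encoder for the on-board camera image (3 x 66 x 200 assumed).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, kernel_size=3, stride=1), nn.ReLU(),
            nn.Flatten(),
        )
        with torch.no_grad():
            feat_dim = self.encoder(torch.zeros(1, 3, 66, 200)).shape[1]
        # Fuse visual features with the scalar vehicle speed.
        self.head = nn.Sequential(
            nn.Linear(feat_dim + 1, 100), nn.ReLU(),
            nn.Linear(100, 3),  # steering, acceleration, brake volumes
        )

    def forward(self, image: torch.Tensor, speed: torch.Tensor) -> torch.Tensor:
        feats = self.encoder(image)
        return self.head(torch.cat([feats, speed], dim=1))

model = DrivingCNN()
controls = model(torch.rand(1, 3, 66, 200), torch.tensor([[8.3]]))  # [steer, accel, brake]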
Kamiya et al. [
50] attempted to estimate car motion patterns in video images from a first-person view (FPV) [
51] using a recurrent convolutional neural network (RCNN) [
52]. They compared the estimation accuracies of three RCNN models trained with three input patterns: color images, dynamic vector images obtained from optical flow features, and both. Evaluation experiments targeting four behavior prediction patterns, comprising moving forward, turning right, turning left, and moving backward, demonstrated the usefulness of switching input features to accommodate the driving scenario characteristics.
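A recurrent convolutional estimator of this kind can be sketched as a per-frame convolutional encoder followed by a recurrent layer over the frame sequence. The channel configuration, which switches between color frames, optical-flow images, or both, and all layer sizes below are assumptions rather than the architecture of [50].

import torch
import torch.nn as nn

class MotionRCNN(nn.Module):
    """Hypothetical recurrent convolutional classifier for FPV motion patterns.

    in_channels = 3 for color frames, 2 for optical-flow images, or 5 for both
    stacked per frame; the four classes are forward, right turn, left turn, backward.
    """
    def __init__(self, in_channels: int = 3, num_classes: int = 4):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # 64-dim feature per frame
        )
        self.rnn = nn.LSTM(input_size=64, hidden_size=128, batch_first=True)
        self.fc = nn.Linear(128, num_classes)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        # clips: (batch, time, channels, height, width)
        b, t, c, h, w = clips.shape
        feats = self.cnn(clips.reshape(b * t, c, h, w)).reshape(b, t, -1)
        _, (h_n, _) = self.rnn(feats)
        return self.fc(h_n[-1])                      # logits over the four patterns

logits = MotionRCNN(in_channels=3)(torch.rand(2, 8, 3, 112, 112))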
Xu et al. [
53] proposed a generic vehicle motion model using an end-to-end trainable architecture to predict a distribution over future vehicle ego-motion from instantaneous monocular camera observations and previous vehicle states. They evaluated their model using the Berkeley DeepDrive Video dataset (BDDV) [
54] to predict four driving actions: straight, stop, turn left, and turn right. Their proposed model demonstrated superior accuracy compared with existing state-of-the-art prediction models based on deep learning (DL) algorithms.
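The output side of such a model can be sketched as a categorical distribution over the four driving actions, conditioned on monocular image features fused with previous vehicle states. The fusion scheme and dimensions below are assumptions, not the architecture evaluated on BDDV in [53].

import torch
import torch.nn as nn

class EgoMotionHead(nn.Module):
    """Hypothetical head producing a distribution over {straight, stop, left, right}."""
    def __init__(self, visual_dim: int = 256, state_dim: int = 4, num_actions: int = 4):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Linear(visual_dim + state_dim, 128), nn.ReLU(),
            nn.Linear(128, num_actions),
        )

    def forward(self, visual_feats: torch.Tensor, prev_states: torch.Tensor) -> torch.Tensor:
        logits = self.fuse(torch.cat([visual_feats, prev_states], dim=1))
        return torch.softmax(logits, dim=1)   # p(action | image features, previous states)

# Example: previous states could hold, e.g., speed and yaw rate over two past steps.
probs = EgoMotionHead()(torch.rand(1, 256), torch.rand(1, 4))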
As described above, various studies using DL-based methods for the elucidation, estimation, and prediction of semantics from video images for automatic driving have been conducted actively. Nevertheless, no report of the relevant literature has described a feasible means of realizing small farming robots that can move autonomously in a field based on DL and visual processing. Moreover, studies of mobile robots that can infer behavior patterns from time-series images in a field with complex surface conditions have not progressed. Therefore, the challenge remains of exploring the applicability of DL technologies to a small robot that moves autonomously in a paddy field [55].
For this study, we assume that the environment for our prototype mobile robots is a paddy field filled with water to a depth of 200 mm. In a paddy field, a robot encounters difficulties with locomotion because the ground is muddy, rough, and underwater. The robot must move between rice plants through passages of up to approximately 300 mm in width. Moreover, stems and leaves spread as the rice plants grow. Therefore, we assume that the robot body size is restricted to approximately that of a sheet of A4 paper.
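These assumptions can be collected into a simple design-constraint check. The A4 reference dimensions (297 mm × 210 mm) are standard, whereas the clearance margin and function below are hypothetical values used only for illustration.

# Design-constraint sketch for the paddy-field robot described above.
# The A4 reference size is standard; the clearance margin is an assumed value.

WATER_DEPTH_MM = 200       # assumed paddy water depth
PASSAGE_WIDTH_MM = 300     # approximate inter-row passage width
A4_MM = (297, 210)         # reference footprint (length, width)
CLEARANCE_MM = 30          # assumed margin to rice stems on each side

def fits_passage(body_length_mm: float, body_width_mm: float) -> bool:
    """Check whether a candidate footprint stays within the A4-like limit and the passage."""
    within_reference = body_length_mm <= A4_MM[0] and body_width_mm <= A4_MM[1]
    within_passage = body_width_mm + 2 * CLEARANCE_MM <= PASSAGE_WIDTH_MM
    return within_reference and within_passage

print(fits_passage(290, 200))  # -> True for an approximately A4-sized footprint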