Developing an Urban Landscape Fumigation Service Robot: A Machine-Learned, Gen-AI-Based Design Trade Study
Abstract
1. Introduction
2. Related Studies
2.1. LLM Models
2.2. GAN
2.3. VAE
2.4. Other Models in Gen-AI
- This study focuses on developing a “machine-learned multimodal and feedback-based VAE model” (MMF-VAE) for design creation using Gen-AI, integrating multimodal data types and feedback mechanisms to enhance precision, relevance, robustness, and adaptability for real-time deployable service robots.
- Conducting a comparative analysis of design outputs generated with and without spraying robot datasets to evaluate the role of data input in improving design quality.
- Additionally, this work involves developing an autonomous fumigation robot for urban landscapes, utilizing the proposed model for application-specific optimization and functionality.
3. Methods
Algorithm 1: Development of the MMF-VAE Model

Require:
  data: robot attributes (size, weight, speed, drive type, etc.) // dataset of readily available fumigation robots
  input_dim: total dimension of the flattened input
  latent_dim: dimension of the latent vector // 16
  hidden_dim: dimension of the hidden layers // 128
  batch_size: number of samples per training batch
  num_epochs: number of training epochs
  learning_rate: Adam optimizer step size

Objective: Generate an MMF-VAE model file from fumigation-robot data

1: Define AttributeEncoder(input_dim, latent_dim, hidden_dim):
2:   net ← Sequential(Linear(input_dim, hidden_dim), ReLU(), Linear(hidden_dim, hidden_dim), ReLU())
3:   fc_mu, fc_logvar ← Linear(hidden_dim, latent_dim), Linear(hidden_dim, latent_dim)
4:   forward(x): h ← net(x); return (fc_mu(h), fc_logvar(h))
5: end Define
6: Define SpecificationDecoder(latent_dim, hidden_dim):
7:   net ← Sequential(Linear(latent_dim, hidden_dim), ReLU(), Linear(hidden_dim, hidden_dim), ReLU())
8:   size_fc, weight_fc, speed_fc, motor_fc ← Linear(...), ...
9:   forward(z): h ← net(z); size ← ReLU(size_fc(h)) × 100; weight ← ReLU(weight_fc(h)) × 50
10:    speed ← ReLU(speed_fc(h)); motor_probs ← Softmax(motor_fc(h))
11:    return {size, weight, battery, autonomy_type, drive_type, payload, sensors, PC, navigation_system, spraying_type, spray_system, terrain, safety}
12: end Define
13: Define MMF_VAE(input_dim, latent_dim, hidden_dim):
14:   encoder ← AttributeEncoder(input_dim, latent_dim, hidden_dim)
15:   decoder ← SpecificationDecoder(latent_dim, hidden_dim)
16:   reparameterize(mu, logvar): std ← exp(0.5 × logvar); eps ← randn_like(std); return mu + eps × std
17:   forward(x): (mu, logvar) ← encoder(x); z ← reparameterize(mu, logvar); specs ← decoder(z)
18:     return (specs, mu, logvar)
19: end Define
20: mmf_vae_model ← MMF_VAE(input_dim, latent_dim, hidden_dim)
21: // Train with Adam and a domain-specific feedback loss to obtain the final model
22: // e.g., for epoch in 1..num_epochs: forward pass, VAE + feedback loss, backpropagation
23: // After training, save mmf_vae_model for LLM integration
24: return mmf_vae_model // the trained model
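The reparameterization step and the KL term of the standard VAE objective used when training Algorithm 1 can be sketched in plain Python. This is a minimal illustrative stand-in, not the trained MMF-VAE implementation; the function names simply mirror the pseudocode.

```python
import math
import random

def reparameterize(mu, logvar, rng=random):
    """z = mu + eps * std with eps ~ N(0, I), as in the MMF_VAE pseudocode."""
    return [m + rng.gauss(0.0, 1.0) * math.exp(0.5 * lv)
            for m, lv in zip(mu, logvar)]

def kl_divergence(mu, logvar):
    """Closed-form KL(q(z|x) || N(0, I)) term of the standard VAE loss."""
    return -0.5 * sum(1.0 + lv - m * m - math.exp(lv)
                      for m, lv in zip(mu, logvar))

# Example: a 2-D latent with zero mean and unit log-variance incurs zero KL cost.
z = reparameterize([0.2, -0.1], [0.0, 0.0])
```

In the full model, this KL term would be combined with the reconstruction loss and the domain-specific feedback loss before backpropagation with Adam.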
Algorithm 2: Integrating the MMF-VAE with an LLM

Input:
  mmf_vae_model: a trained MMF-VAE model file
  LLM_service: a large language model service (e.g., ChatGPT/Ollama)
  user_prompt: user prompts/instructions

Output: Generate an output that meets the targeted or desired requirements
Ensure: Interactive design refinement via the LLM based on MMF-VAE outputs

1: Load mmf_vae_model
2: while user is active:
3:   prompt ← GET_INPUT_FROM_USER()
4:   if prompt requests a new design:
5:     z_random ← SAMPLE_LATENT_VECTOR()
6:     generated_specs ← mmf_vae_model.decoder(z_random)
7:     SEND_TO_LLM(LLM_service, “Suggested Parameters: ” + STR(generated_specs))
8:   else if prompt provides feedback:
9:     feedback_vector ← CONVERT_TO_CONSTRAINTS(prompt)
10:    z_refined ← APPLY_FEEDBACK_TO_LATENT(mmf_vae_model, feedback_vector)
11:    refined_specs ← mmf_vae_model.decoder(z_refined)
12:    SEND_TO_LLM(LLM_service, “Refined Parameters: ” + STR(refined_specs))
13:  end if
14: end while
15: return // end interactive session
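The interactive loop of Algorithm 2 can be sketched as follows. The keyword-based constraint mapping, the stub decoder, and the specification values here are all illustrative assumptions: the real system uses the trained MMF-VAE decoder and forwards strings to an external LLM service.

```python
import random

def sample_latent_vector(dim=16, rng=random):
    # z_random ~ N(0, I), standing in for SAMPLE_LATENT_VECTOR()
    return [rng.gauss(0.0, 1.0) for _ in range(dim)]

def convert_to_constraints(prompt):
    # Hypothetical keyword rules standing in for CONVERT_TO_CONSTRAINTS()
    constraints = {}
    if "weight" in prompt.lower():
        constraints["weight"] = "reduce"
    if "rtk" in prompt.lower():
        constraints["localization"] = "slam"
    return constraints

def stub_decoder(z, constraints=None):
    # Toy stand-in for mmf_vae_model.decoder: maps a latent z to a spec dict
    specs = {"weight_kg": 60.0 + 5.0 * z[0], "localization": "RTK GPS"}
    for key, action in (constraints or {}).items():
        if key == "weight" and action == "reduce":
            specs["weight_kg"] -= 10.0
        elif key == "localization" and action == "slam":
            specs["localization"] = "SLAM LiDAR + visual odometry"
    return specs

def handle_prompt(prompt, decoder, send_to_llm):
    # Mirrors the two branches of Algorithm 2: new design vs. feedback refinement
    if "design" in prompt.lower():
        z_random = sample_latent_vector()
        send_to_llm("Suggested Parameters: " + str(decoder(z_random)))
    else:
        feedback = convert_to_constraints(prompt)
        z_refined = sample_latent_vector()  # stand-in for APPLY_FEEDBACK_TO_LATENT
        send_to_llm("Refined Parameters: " + str(decoder(z_refined, feedback)))
```

A driver loop would call handle_prompt once per user turn, with send_to_llm forwarding the resulting string to the ChatGPT/Ollama service.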
4. Incorporation of Proposed Model in Urban Landscape Fumigation Service Robot
4.1. Design Parameter Fixation of Fumigation Service Robot
4.2. Design Parameter Generation Using Proposed Model
4.3. Development of Fumigation Robot Based on Design Generation
4.4. Limitations and Future Scope
- Interpretability challenges: limited transparency in how designs are generated can cause misunderstanding or conflict between the human designer and the Gen-AI.
- Lack of contextual understanding: this method can provide inaccurate outcomes without proper training.
- Biased output: output is generated based only on the training database.
- Overreliance on AI: this method lacks human-centric considerations and creativity.
- Generalization challenges: requires diverse datasets for task generalization and design adaptability.
- User inputs: continuous inputs are essential to achieving accurate and desirable results.
- Irreplaceable human role: the designer’s role in defining the problem, interpreting results, and selecting solutions remains irreplaceable.
- Hybrid architecture: incorporating additional DGM models, GANs, or transformers improves diversity and quality while effectively managing complex multimodal data.
- Design validation metrics: integrating automated evaluation and validation metrics ensures that the generated design satisfies the required functionality, aesthetics, and core requirements.
- Interpretability and explainability: improving the transparency of the generative process makes outputs easier to interpret and validate.
- Human collaboration: develop a framework for enhanced collaboration in design generation between AI and human designer creativity.
- Energy and resource optimization: investigate approaches to integrating energy consumption and resource allocation metrics into the generative process, guiding designs to be more cost- and power-efficient.
- Cross-domain transfer learning: leverage insights from adjacent fields (e.g., automated manufacturing and construction robotics) to enrich the dataset and adapt the model’s learned representations for broader applications.
- Simulation-driven evaluation: employ large-scale simulation environments (e.g., Gazebo and Webots) to rapidly test generated designs in virtual conditions before physical deployment, reducing overall development cycles.
- Multi-agent collaboration: extend the MMF-VAE framework to design swarms or collaborative robot teams, focusing on communication protocols, task allocation, and synchronized operation.
- Explainable generative models: develop more interpretable architectures, such as feature-attribution or attention mechanisms, to clarify how specific design elements are selected and refined.
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- World Health Organization. Disease Outbreak News, Dengue-Global Situation; World Health Organization: Geneva, Switzerland, 2023; pp. 1–14.
- Onen, H.; Luzala, M.M.; Kigozi, S.; Sikumbili, R.M.; Muanga, C.-J.K.; Zola, E.N.; Wendji, S.N.; Buya, A.B.; Balciunaitiene, A.; Viškelis, J. Mosquito-Borne Diseases and Their Control Strategies: An Overview Focused on Green Synthesized Plant-Based Metallic Nanoparticles. Insects 2023, 14, 221.
- Choi, J.; Cha, W.; Park, M.-G. Evaluation of the effect of photoplethysmograms on workers’ exposure to methyl bromide using second derivative. Front. Public Health 2023, 11, 1224143.
- Bordas, A.; Le Masson, P.; Thomas, M.; Weil, B. What is generative in generative artificial intelligence? A design-based perspective. Res. Eng. Des. 2024, 35, 1–17.
- Wadinambiarachchi, S.; Kelly, R.M.; Pareek, S.; Zhou, Q.; Velloso, E. The Effects of Generative AI on Design Fixation and Divergent Thinking. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 11–16 May 2024; pp. 1–18.
- Regenwetter, L.; Nobari, A.H.; Ahmed, F. Deep generative models in engineering design: A review. J. Mech. Des. 2022, 144, 071704.
- Saadi, J.I.; Yang, M.C. Generative design: Reframing the role of the designer in early-stage design process. J. Mech. Des. 2023, 145, 041411.
- Zhu, Q.; Luo, J. Generative transformers for design concept generation. J. Comput. Inf. Sci. Eng. 2023, 23, 041003.
- Fitriawijaya, A.; Jeng, T. Integrating multimodal generative AI and blockchain for enhancing generative design in the early phase of architectural design process. Buildings 2024, 14, 2533.
- Konstantinou, C.; Antonarakos, D.; Angelakis, P.; Gkournelos, C.; Michalos, G.; Makris, S. Leveraging Generative AI Prompt Programming for Human-Robot Collaborative Assembly. Procedia CIRP 2024, 128, 621–626.
- Vemprala, S.H.; Bonatti, R.; Bucker, A.; Kapoor, A. ChatGPT for robotics: Design principles and model abilities. IEEE Access 2024, 12, 55682–55696.
- Zhang, Z.; Chai, W.; Wang, J. Mani-GPT: A generative model for interactive robotic manipulation. Procedia Comput. Sci. 2023, 226, 149–156.
- Oh, S.; Jung, Y.; Kim, S.; Lee, I.; Kang, N. Deep generative design: Integration of topology optimization and generative models. J. Mech. Des. 2019, 141, 111405.
- Regenwetter, L.; Srivastava, A.; Gutfreund, D.; Ahmed, F. Beyond statistical similarity: Rethinking metrics for deep generative models in engineering design. Comput.-Aided Des. 2023, 165, 103609.
- Gan, Y.; Ji, Y.; Jiang, S.; Liu, X.; Feng, Z.; Li, Y.; Liu, Y. Integrating aesthetic and emotional preferences in social robot design: An affective design approach with Kansei Engineering and Deep Convolutional Generative Adversarial Network. Int. J. Ind. Ergon. 2021, 83, 103128.
- Aristeidou, C.; Dimitropoulos, N.; Michalos, G. Generative AI and neural networks towards advanced robot cognition. CIRP Ann. 2024, 73, 21–24.
- Borkar, K.K.; Singh, M.K.; Dasari, R.K.; Babbar, A.; Pandey, A.; Jain, U.; Mishra, P. Path planning design for a wheeled robot: A generative artificial intelligence approach. Int. J. Interact. Des. Manuf. (IJIDeM) 2024, 1–12.
- Chan, W.K.; Wang, P.; Yeow, R.C.-H. Creation of Novel Soft Robot Designs using Generative AI. arXiv 2024, arXiv:2405.01824.
- Wang, L.; Chan, Y.-C.; Ahmed, F.; Liu, Z.; Zhu, P.; Chen, W. Deep generative modeling for mechanistic-based learning and design of metamaterial systems. Comput. Methods Appl. Mech. Eng. 2020, 372, 113377.
- Bucher, M.J.J.; Kraus, M.A.; Rust, R.; Tang, S. Performance-based generative design for parametric modeling of engineering structures using deep conditional generative models. Autom. Constr. 2023, 156, 105128.
- Ramezani, A.; Dangol, P.; Sihite, E.; Lessieur, A.; Kelly, P. Generative design of NU’s Husky Carbon, a morpho-functional, legged robot. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 4040–4046.
- Demirel, H.O.; Goldstein, M.H.; Li, X.; Sha, Z. Human-centered generative design framework: An early design framework to support concept creation and evaluation. Int. J. Hum.–Comput. Interact. 2024, 40, 933–944.
- Gradišar, L.; Dolenc, M.; Klinc, R. Towards machine learned generative design. Autom. Constr. 2024, 159, 105284.
- Chen, M.; Sun, Y.; Cai, X.; Liu, B.; Ren, T. Design and implementation of a novel precision irrigation robot based on an intelligent path planning algorithm. arXiv 2020, arXiv:2003.00676.
- Liu, Z.; Wang, X.; Zheng, W.; Lv, Z.; Zhang, W. Design of a sweet potato transplanter based on a robot arm. Appl. Sci. 2021, 11, 9349.
- Naïo Technologies. Oz—The Farming Assistant for Time-Consuming and Arduous Tasks. Available online: https://www.naio-technologies.com/en/oz/ (accessed on 10 December 2024).
- ViTiBOT. BAKUS—Electric Vine Straddle Robot That Meets the Challenges of Sustainable Viticulture. Available online: https://vitibot.fr/vineyards-robots-bakus/?lang=en (accessed on 10 December 2024).
- ROWBOT. ROWBOT—The Future of Farming Robotic Solutions for Row Crop Agriculture. Available online: https://www.rowbot.com/ (accessed on 10 December 2024).
- Chebrolu, N.; Lottes, P.; Schaefer, A.; Winterhalter, W.; Burgard, W.; Stachniss, C. Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields. Int. J. Robot. Res. 2017, 36, 1045–1052.
- Adamides, G.; Katsanos, C.; Christou, G.; Xenos, M.; Papadavid, G.; Hadzilacos, T. User interface considerations for telerobotics: The case of an agricultural robot sprayer. In Proceedings of the Second International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2014), Paphos, Cyprus, 7–10 April 2014; pp. 541–548.
- Small Robot Company. Tom Autonomously Digitises the Field, Helping Farmers Detect Every Weed and Understand the Crop’s Health. Available online: https://smallrobotco.com/#tom (accessed on 10 December 2024).
- Zhou, L.; Hu, A.; Cheng, Y.; Zhang, W.; Zhang, B.; Lu, X.; Wu, Q.; Ren, N. Barrier-free tomato fruit selection and location based on optimized semantic segmentation and obstacle perception algorithm. Front. Plant Sci. 2024, 15, 1460060.
- Bykov, S. World trends in the creation of robots for spraying crops. Proc. E3S Web Conf. 2023, 380, 01011.
- Baltazar, A.R.; Santos, F.N.d.; Moreira, A.P.; Valente, A.; Cunha, J.B. Smarter robotic sprayer system for precision agriculture. Electronics 2021, 10, 2061.
- Bhandari, S.; Raheja, A.; Renella, N.; Ramirez, R.; Uryeu, D.; Samuel, J. Collaboration between UAVs and UGVs for site-specific application of chemicals. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VIII, Orlando, FL, USA, 30 April–4 May 2023; pp. 123–130.
- Iriondo, A.; Lazkano, E.; Ansuategi, A.; Rivera, A.; Lluvia, I.; Tubío, C. Learning positioning policies for mobile manipulation operations with deep reinforcement learning. Int. J. Mach. Learn. Cybern. 2023, 14, 3003–3023.
- Bogue, R. Robots addressing agricultural labour shortages and environmental issues. Ind. Robot. Int. J. Robot. Res. Appl. 2024, 51, 1–6.
- Cantelli, L.; Bonaccorso, F.; Longo, D.; Melita, C.D.; Schillaci, G.; Muscato, G. A small versatile electrical robot for autonomous spraying in agriculture. AgriEngineering 2019, 1, 391–402.
- Oberti, R.; Marchi, M.; Tirelli, P.; Calcante, A.; Iriti, M.; Tona, E.; Hočevar, M.; Baur, J.; Pfaff, J.; Schütz, C. Selective spraying of grapevines for disease control using a modular agricultural robot. Biosyst. Eng. 2016, 146, 203–215.
- Konduri, S.; Chittoor, P.K.; Dandumahanti, B.P.; Yang, Z.; Elara, M.R.; Jaichandar, G.H. Boa Fumigator: An Intelligent Robotic Approach for Mosquito Control. Technologies 2024, 12, 255.
- Jeyabal, S.; Vikram, C.; Chittoor, P.K.; Elara, M.R. Revolutionizing Urban Pest Management with Sensor Fusion and Precision Fumigation Robotics. Appl. Sci. 2024, 14, 7382.
Ref. | Physical Features | Power | Spray System | Autonomy and Navigation | User Interface | Pesticide Application | Performance Metrics | Terrain of Application |
---|---|---|---|---|---|---|---|---|
[24] | Version: Jackal UGV Dimensions: 508 mm × 430 mm × 250 mm Weight: 17 kg | 4 h of operation time | 20 L tank | 2D and 3D LiDAR, RGB cameras, stereo camera | Point-to-point navigation, teleoperated | Sprinkler device, 16 small nozzles | 20 kg payload capacity | 4-wheel drive, suitable for urban terrains |
[25] | Dimensions: 1600 mm × 1200 mm × 1400 mm | 5 kWh | Customized | - | Human Teleoperated | Customized | 0.25 m/s | Tracked wheels, multi-terrain |
[26] | Version: Oz robot Dimensions: 1300 mm × 470 mm × 830 mm Weight: 150 kg | 8 h of operation time | Customized | 4 wheel drive, RTK GPS | App control | Customized | 1.8 km/h, covers 1000 m2/h | Agricultural tires, off-road applications |
[27] | Version: ViTi Bot BAKUS P75S Dimensions: 3500 mm × 1750 mm × 2000 mm Weight: 2050 kg | 75 kWh 2 h to charge to 80% | Customized | 2 RTK GPS sensors, stereo vision, LiDAR | Autonomous, app control | Customized | Ackerman steering | Agricultural tires or tractor tires, off-road applications |
[28] | Version: Rowbot Dimensions: - Weight: greater than 500 kg | - | 75 L tank | GPS, onboard sensors | App control | Customized | - | Agricultural tires, off-road applications |
[29] | Version: BoniRob Dimensions: 1000 mm × 2000 mm × 1750 mm Weight: greater than 500 kg | - | 20 L tank | 3D LiDAR, IMU, RTK GPS | App control Gazebo | Customized | - | Agricultural tires, off-road applications |
[30] | Version: Summit XL Robotnik Dimensions: 731 mm × 614 mm × 720 mm Weight: 65 kg | 5 h of operation time | 10 L tank | Cameras | Teleoperated | Sprinkling spray gun | 50 kg payload, 3 m/s speed | Agricultural tires, off-road applications |
[31] | Version: Tom V4 Dimensions: 1550 mm × 1540 mm × 1300 mm Weight: 350 kg | 4 batteries, each 1.56 kWh (6.24 kWh total) | - | RTK GPS, camera, LiDAR | Point-to-point navigation | Sprinkling spray gun | 1.5 m/s speed | Agricultural tires, off-road applications |
[32] | Version: Tomato picking Dimensions: - Weight: less than 100 kg | - | Customized | RGB depth camera, LiDARs | Point-to-point navigation | - | - | 4-wheel drive, hub motor, flat surfaces |
[33] | Version: Solinftec 1005 Dimensions: 1550 mm × 1540 mm × 1300 mm Weight: 350 kg | Solar charging | - | GPS, GNSS | - | Targeted sprayer | - | 2-wheel drive, outdoor operation |
[34] | Version: Rochen PRYSM Dimensions: 900 mm × 1500 mm × 1200 mm Weight: 350 kg | - | 20 L tank | LiDAR, GNSS | - | Spray nozzle | 300 kg payload | Agricultural tires, off-road applications |
[35] | Version: R150 XAG Dimensions: 1500 mm × 1000 mm × 500 mm Weight: 50 kg | 4 h of operation time | 20 L tank, dual nozzle, omnidirectional | RTK, GPS | Teleoperated and app-based control | Spraying application | Can cover 5 hectares/h | Agricultural tires, off-road applications |
[36] | Version: Segway omnidirectional platform Dimensions: 2000 mm × 1500 mm × 1000 mm Weight: more than 100 kg | - | Spot spray nozzle | LiDAR, IMU, RGB Depth camera | - | Spot spraying | - | 4-wheel drive, flat surfaces |
[37] | Version: Yanmar YV01 Dimensions: - Weight: 1000 kg | Gasoline, 19 L fuel tank | 200 L spray chemical tank | GPS, RTK | Teleoperated | Vineyard spraying | Top speed of 4 km/h | Tracked wheels, off-road applications |
[38] | Version: Small versatile electrical bot Dimensions: 880 mm × 1020 mm Weight: - | 6–8 h of operation time | 130 L tank | Stereo camera, RTK-DGPS, GNSS, 2D LiDAR | - | 20 spray nozzles | 3 km/h, 200 kg payload | Tracked wheels, off-road applications |
[39] | Version: Selective spraying robot Dimensions: 1610 mm × 480 mm × 670 mm Weight: 65 kg | - | 3.5 L tank | Multi-spectral camera | Point-to-point navigation | Precision spraying gun, fan, nozzle | Top speed of 0.2 m/s | Wheeled trailer platform |
Input Prompt | Output without Dataset | Remarks |
---|---|---|
Input Prompt 1: “Design a Fumigation robot for an urban landscape where the robot should be capable of traveling in indoor and semi-outdoor environments with flat surfaces. The output should be in a tabular format with two columns: Parameter and Value. Parameters should include the Dimensions of the robot, weight of the robot, battery capacity, type of autonomy, drive type, PC type, chemical payload capacity, sensors for localization, and spraying type.” | - | Engineer’s perspective: The generated specification suggests a 2.5 kWh battery but does not specify the battery type. According to the proposed specifications, the battery weighs approximately 25 kg. Four-wheel drive equips each wheel with a motor, adding more weight. |
Input Prompt 2: “It should work for 4 h and have precision spraying application.” | - | Engineer’s perspective: The model updates the battery requirements to meet the robot’s higher operational demand. However, compounding the previous mistake, the battery weight doubles but is not reflected in the new specifications, and the dimensions remain unchanged. |
Input Prompt 3: “I want a robot with fully autonomous capabilities. The weight is too much, and I don’t want to subscribe to RTK GPS.” | Final Output | Engineer’s perspective: The weight reduction did not correspond to any change in battery capacity or dimensions. The model suggested using SLAM LiDAR and visual odometry instead of RTK GPS. Although this is a valid suggestion, the design specifications generated via the model remain infeasible for real-world development. Overall, the generated specifications show minimal regard for the practicality of development. |
Input Prompt | Output with Dataset | Remarks |
---|---|---|
Input Prompt 1: “Design a Fumigation robot for an urban landscape where the robot should be capable of traveling in indoor and semi-outdoor environments with flat surfaces. The output should be in a tabular format with two columns: Parameter and Value. Parameters should include the Dimensions of the robot, weight of the robot, battery capacity, type of autonomy, drive type, PC type, chemical payload capacity, sensors for localization, and spraying type.”; Spraying Robot Dataset: | - | Engineer’s perspective: The proposed model generates specifications close to real-world deployment readiness, requiring only minor modifications, because it learns from the dataset of existing robots. The model suggests differential-drive DC-motor locomotion suitable for urban landscape applications. Learning from the dataset also informs the spraying type, battery type, and sensor combination appropriate to the real-world problem. |
Input Prompt 2: “It should work for 4 h and have precision spraying application.” | Final Desired Output | Engineer’s perspective: The updated specifications account for the change in battery capacity and update the overall weight accordingly. The proposed model treats the robot’s specifications as interdependent, adjusting each element based on the user’s requirements. This yields results closer to practical robot development. |
Parameter | Specifications Without Dataset | Specifications with Dataset |
---|---|---|
Dimensions | 1200 mm × 800 mm × 600 mm | 800 mm × 600 mm × 1200 mm |
Weight | 50 kg | 60 kg |
Battery capacity | 4.8 kWh | 30 Ah, 48 V (1.44 kWh) lithium-ion (for 4 h operation) |
Type of autonomy | Point-to-point navigation, LiDAR-based SLAM, visual odometry | Semi-autonomous with manual override |
Drive type | 4-wheel drive, suitable for flat surfaces | DC motor, differential drive with rubber wheels |
Payload capacity | 15 L precision spraying tank | 10 L |
Sensors for localization | LiDAR, RGB depth camera | LiDAR, depth camera, GPS (for semi-outdoors) |
PC | Industrial PC | Industrial PC |
Spraying type | Precision spraying gun, fan, nozzle | Precision nozzle with adjustable flow rate |
Spraying system | 4 h | High-pressure pump, adjustable spray range |
Navigation system | - | Simultaneous localization and mapping |
Precision spraying | - | Vision-guided spraying system |
Terrain | - | Indoor and semi-outdoor, flat surfaces |
Parameter | Specifications from the Proposed Model | Specifications of Developed Robot |
---|---|---|
Dimensions | 800 mm × 600 mm × 1200 mm | 692 mm × 517 mm × 1080 mm |
Weight | 60 kg | 52 kg |
Fumigation unit | 10 L | 10 L, precision spray gun |
CPU processor | Industrial PC | Nuvo-10108GC-RTX3080 industrial processor |
Sensors | 3D LiDAR, RGB depth camera | Hesai QT128 3D LiDAR, Intel RealSense D435i, IMU Vectornav VN-100 |
Battery | 48 V, 30 Ah | 48 V, 25 Ah |
Type of autonomy | Semi-autonomous with manual override | Semi-autonomous with manual override |
Drive type | Differential drive | Two BLDC motors, BLHM450KC-30, differential drive |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Chittoor, P.K.; Dandumahanti, B.P.; Veerajagadheswar, P.; Samarakoon, S.M.B.P.; Muthugala, M.A.V.J.; Elara, M.R. Developing an Urban Landscape Fumigation Service Robot: A Machine-Learned, Gen-AI-Based Design Trade Study. Appl. Sci. 2025, 15, 2061. https://doi.org/10.3390/app15042061