Article

Adaptive Outdoor Cleaning Robot with Real-Time Terrain Perception and Fuzzy Control

by
Raul Fernando Garcia Azcarate
,
Akhil Jayadeep
,
Aung Kyaw Zin
,
James Wei Shung Lee
,
M. A. Viraj J. Muthugala
* and
Mohan Rajesh Elara
Engineering Product Development Pillar, Singapore University of Technology and Design (SUTD), Singapore 487372, Singapore
*
Author to whom correspondence should be addressed.
Mathematics 2025, 13(14), 2245; https://doi.org/10.3390/math13142245
Submission received: 6 June 2025 / Revised: 26 June 2025 / Accepted: 8 July 2025 / Published: 10 July 2025
(This article belongs to the Special Issue Research and Applications of Neural Networks and Fuzzy Logic)

Abstract

Outdoor cleaning robots must operate reliably across diverse and unstructured surfaces, yet many existing systems lack the adaptability to handle terrain variability. This paper proposes a terrain-aware cleaning framework that dynamically adjusts robot behavior based on real-time surface classification and slope estimation. A 128-channel LiDAR sensor captures signal intensity images, which are processed by a ResNet-18 convolutional neural network to classify floor types as wood, smooth, or rough. Simultaneously, pitch angles from an onboard IMU detect terrain inclination. These inputs are transformed into fuzzy sets and evaluated using a Mamdani-type fuzzy inference system. The controller adjusts brush height, brush speed, and robot velocity through 81 rules derived from 48 structured cleaning experiments across varying terrain and slopes. Validation was conducted in low-light (night-time) conditions, leveraging LiDAR’s lighting-invariant capabilities. Field trials confirm that the robot responds effectively to environmental conditions, such as reducing speed on slopes or increasing brush pressure on rough surfaces. The integration of deep learning and fuzzy control enables safe, energy-efficient, and adaptive cleaning in complex outdoor environments. This work demonstrates the feasibility and real-world applicability of combining perception and inference-based control in terrain-adaptive robotic systems.

1. Introduction

The journey toward autonomous cleaning began with the shift from manual cleaning to semi-automated machines, which introduced basic mechanical assistance. As operational demands increased, especially in larger or more complex areas, fully autonomous systems emerged. For outdoor applications, early efforts involved retrofitting manual sweepers with drive-by-wire systems, onboard computing, and environmental sensors to enable autonomous navigation. Despite these improvements, many existing systems remain tailored to specific environments and lack the flexibility to operate across mixed-use urban spaces such as pavements, parks, and campuses [1,2].
Outdoor cleaning robots are increasingly deployed in environments with highly variable surface conditions, posing significant challenges to maintaining consistent and safe operation. In these unstructured settings, where surface material, friction, and slope may shift frequently, robots must continuously adapt their behavior to prevent instability or loss of performance [3,4]. Recognizing different surface types, such as wood, concrete, or smooth tile, is not merely helpful but essential to performing cleaning tasks safely and efficiently [3,5]. This need has spurred advancements in terrain-aware navigation, where robots sense and interpret environmental features in real time, transforming raw sensor data into actionable insights [6].
Perception is a cornerstone of autonomous cleaning, especially outdoors where lighting and terrain may shift unpredictably. In recent developments, artificial intelligence and advanced perception techniques have further enhanced the autonomy of cleaning robots. Depth cameras that fuse RGB and spatial data allow robots to identify high-traffic areas, focus their cleaning efforts accordingly, and minimize energy spent in low-activity zones [7,8]. Additionally, several studies have shown that robots can identify floor types and adjust cleaning behavior dynamically using image-based dirt detection and unsupervised learning techniques [9]. While RGB-D and stereo vision systems offer valuable depth and visual information, their dependence on ambient lighting often limits their effectiveness in dim or inconsistent conditions [10,11]. These limitations have motivated the adoption of Light Detection and Ranging (LiDAR), which emits its own near-infrared light in the non-visible spectrum and produces dense 3D point clouds that remain accurate regardless of environmental lighting. For this reason, LiDAR can provide robust, long-range sensing independent of illumination constraints [12].
To enable terrain awareness, modern systems commonly employ photometric or range-based images for surface classification [13]. Depth cameras and LiDAR sensors, in combination with deep learning techniques, have been widely adopted to support real-time recognition of terrain types. Among these, convolutional neural networks, particularly those using ResNet architectures, have demonstrated reliable segmentation performance when processing fused LiDAR and visual data [14,15]. These systems allow robots to recognize unstructured or risky surfaces but typically do not react to these conditions beyond passive identification. In contrast, our work integrates this perception into an adaptive control loop, enabling the robot to respond in real time by slowing down, adjusting brush speed, or modifying cleaning height to match surface requirements [14]. ResNet’s residual connections further help preserve performance in deeper networks, ensuring reliable terrain classification [15].
Slope estimation plays an equally vital role in ensuring safety, particularly on inclined or multilevel terrain. Inertial Measurement Units (IMUs) are frequently used to capture angular data that help estimate slope and adjust motion control strategies [16]. When combined with other sensing modalities such as stereo cameras, ultrasonic sensors, or LiDAR, these systems can distinguish between flat, stepped, and inclined surfaces. This enables the robot to take proactive steps, like reducing speed or rerouting, to maintain stability in complex environments [17,18].
While detecting terrain type and slope is essential, it is not sufficient on its own [19]. The robot must also interpret this data and adapt its behavior accordingly to ensure safe and efficient operation. Fuzzy logic offers an effective framework for these problems, allowing systems to handle uncertainty and imprecision in sensor data through rule-based reasoning [20]. Widely used in robotics, fuzzy control frameworks are often applied for real-time behavioral adjustments in response to environmental conditions [21]. Prior studies have used fuzzy logic primarily to influence navigation direction or turning angle in high-risk scenarios, such as obstacle avoidance [4,22]. However, these applications have not leveraged fuzzy inference to adjust internal cleaning parameters such as brush speed, motor torque, or robot speed. In contrast, our approach extends the role of fuzzy control by integrating terrain and slope perception directly into the cleaning decision-making process, allowing dynamic modulation of cleaning intensity and motion behavior based on surface conditions.
Beyond improving safety, fuzzy control has also been linked to energy-efficient behavior by balancing locomotion stability and power consumption, especially under changing or uncertain terrain conditions [23]. These systems often operate alongside higher-level path planning and obstacle avoidance algorithms, allowing for intelligent, responsive navigation. When inputs such as pavement width and surface condition are considered, the fuzzy inference engine can recommend appropriate speeds and adjust motor responses accordingly [9,19,24]. In cleaning-specific use cases, fuzzy logic has also been applied to regulate operational parameters such as brush speed and fan power, based on dirt type and quantity, typically in human-operated machines rather than autonomous robots [25].
By integrating terrain classification, slope detection, and fuzzy control, it becomes possible to build genuinely terrain-aware cleaning robots. While prior studies have demonstrated individual components, such as terrain classification [5,26], slope detection [27], or fuzzy logic for the control of cleaning machines [25], these capabilities were typically explored in isolation and not combined into a unified, responsive system. To the best of our knowledge, no previous work has implemented a complete pipeline that integrates these modules for real-time, adaptive outdoor cleaning. This paper introduces a safety-aware pavement-cleaning robot that adjusts its driving speed, brush height, and brush speed in real time based on the detected floor type and slope, powered by a ResNet-based terrain classifier, IMU-based slope estimation, and a fuzzy logic control engine. LiDAR serves as the primary sensor, ensuring consistent perception performance in outdoor environments regardless of lighting variability.
This work addresses the challenge of real-time terrain-adaptive cleaning for outdoor mobile robots operating on diverse and unpredictable surfaces. Unlike traditional cleaning systems that rely on fixed schedules or static configurations, Panthera 2.0 incorporates a fuzzy logic-based framework that dynamically adjusts key cleaning parameters, including brush height, brush speed, and robot velocity, based on live terrain classification and slope estimation. By relying solely on LiDAR signal data and IMU feedback, the system avoids the limitations of vision-based methods in low-light conditions. The proposed approach is validated through extensive experiments and offers a practical, energy-aware solution for safe and effective autonomous cleaning in complex outdoor environments.
The rest of this paper is organized as follows. Section 2 presents the robot platform used for the implementation. The proposed terrain classification and adaptive cleaning are presented in Section 3. The results for validating the proposed approach are given in Section 4. Section 5 concludes this paper.

2. Robot Platform, Panthera

The Panthera 2.0 is an outdoor pavement-cleaning robot equipped with cleaning modules and a sensing system that enables terrain-adaptive cleaning. Designed specifically for outdoor public environments such as park connectors and urban walkways, Panthera is capable of operating efficiently across different urban settings. The robot integrates a smart lighting system comprising beacons, mode indicator lights, and a user-friendly dashboard to facilitate operator interaction. The Panthera robot is shown in Figure 1.
Depending on the cleaning scenario, the robot supports two main navigation strategies. For long, linear routes (e.g., park connectors), it follows a pre-planned point-to-point trajectory using a path-tracking controller such as Pure Pursuit. In contrast, when operating in wider, open areas like plazas or pedestrian squares, a grid-based coverage path planner is employed to ensure complete surface coverage. These navigation approaches are selected based on task context and terrain layout, allowing Panthera to flexibly adapt its cleaning behavior. While the robot features multiple functional subsystems, this paper focuses exclusively on the cleaning modules and terrain detection system, which are discussed in the following sections.
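The Pure Pursuit tracker mentioned above reduces to a single geometric step: steer with a curvature proportional to the lateral offset of a lookahead point on the reference path. A minimal sketch follows; the pose convention and the example lookahead point are our illustrative assumptions, not details from the paper:

```python
import math

def pure_pursuit_curvature(robot_pose, lookahead_pt):
    """Given the robot pose (x, y, heading in radians) and a lookahead point
    on the reference path, return the steering curvature (1/m) that drives
    the robot onto an arc through that point."""
    x, y, theta = robot_pose
    dx = lookahead_pt[0] - x
    dy = lookahead_pt[1] - y
    # Lateral offset of the lookahead point in the robot frame
    lateral = -math.sin(theta) * dx + math.cos(theta) * dy
    ld2 = dx * dx + dy * dy  # squared lookahead distance
    return 2.0 * lateral / ld2

# Robot at the origin facing +x; lookahead point 2 m ahead, 1 m to the left
print(pure_pursuit_curvature((0.0, 0.0, 0.0), (2.0, 1.0)))  # 0.4 (1/m)
```

A point directly ahead yields zero curvature (straight driving), while larger lateral offsets command tighter arcs.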

2.1. Sensing System

Terrain detection is primarily handled by a 128-channel Ouster LiDAR mounted at the front of the robot. To support autonomous navigation, a second 128-channel Ouster LiDAR is mounted on top and used for either mapping or localization, depending on the operational phase. This setup ensures accurate positioning in outdoor environments during cleaning operations. Additionally, a 32-channel LiDAR is positioned at the rear of the robot to assist with obstacle detection and avoidance. This sensor enhances safety by identifying static and dynamic objects in the robot’s blind spots, such as pedestrians, animals, or bicycles. The arrangement of these LiDAR sensors is shown in Figure 1. In addition, ten ultrasonic sensors are distributed around the chassis to enable close-range obstacle detection, further enhancing autonomous navigation capabilities. To monitor the robot’s orientation during slope traversal, a VectorNav VN-100 IMU (VectorNav, Dallas, TX, USA) is used. Although it outputs quaternion values (x, y, z, w), our focus is on extracting the pitch angle, which is critical for evaluating cleaning performance during uphill movement.
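The pitch extraction from the IMU quaternion is a standard Euler-angle conversion. A minimal sketch, assuming the VN-100 reports a unit quaternion and the usual ZYX (yaw-pitch-roll) convention:

```python
import math

def pitch_from_quaternion(x, y, z, w):
    """Extract the pitch angle (rotation about the lateral axis) in degrees
    from a unit quaternion (x, y, z, w), ZYX Euler convention."""
    s = 2.0 * (w * y - z * x)
    s = max(-1.0, min(1.0, s))  # clamp against numerical drift near +/-90 deg
    return math.degrees(math.asin(s))

# Identity orientation -> flat ground, zero pitch
print(pitch_from_quaternion(0.0, 0.0, 0.0, 1.0))  # 0.0
```

Only this single angle feeds the fuzzy controller; roll and yaw are not needed for the slope-adaptation logic.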

2.2. Cleaning Modules

The cleaning modules consist of a brush system, a vacuum system, and a standard bin for the storage of waste.

2.2.1. Brush System

The brush system includes two types of brushes for sweeping: side brushes and cylindrical brushes. The Panthera 2.0 robot is equipped with two side brushes located at the front of the robot. The main function of the side brushes is to sweep trash and dirt on the ground, directing them toward the central section of the robot. Side brushes also enable the robot to clean edges and reach tight corners, increasing its cleaning effectiveness. The side brush mechanism is shown in Figure 2. The mechanism has several key components that are essential for the side brushes to carry out their main function of sweeping: the stepper motor, linear guide, brush frame, brush motor, and brush. A stepper motor and linear guide assembly is mounted to each brush frame. This assembly enables the vertical motion of the brush frame and the attached brush, allowing the brush to move up when not in use and move down into contact with the ground during cleaning operations. Each side brush is powered by a brush motor, providing the rotational motion required for sweeping.
The Panthera 2.0 robot is also fitted with a cylindrical brush assembly as shown in Figure 3, which is located centrally under the robot. The main function of the cylindrical brush assembly is to collect all debris directed and swept in by the side brushes. The debris is subsequently picked up and transferred to the cleaning tray to be kept temporarily. The cylindrical brush assembly consists of several components such as brush motors, cylindrical brushes, a cleaning tray, and the drive mechanism. The assembly has a hollow frame and brush covers acting as a support structure to mount the components. Two counter-rotating cylindrical brushes operating at rotational speeds of between 700 and 1200 RPM help to gather debris from the ground and transfer it to the cleaning tray for temporary storage. Powering these two cylindrical brushes are brush motors that provide rotational motion through the drive mechanisms, comprising belts and pulleys. The combination of side brushes and cylindrical brushes enables the Panthera 2.0 robot to sweep and remove debris from the ground effectively.

2.2.2. Vacuum System and Bin

The primary purpose of the vacuum system is to suck debris from the cleaning tray and transport it into a standard-size bin for storage. The vacuum is mounted to the bin by a bin mechanism assembly as shown in Figure 4. The vacuum system has two modes: vacuum mode and blower mode. In vacuum mode, the vacuum provides suction power to draw debris from the cleaning tray into the bin via the hose. In blower mode, the vacuum expels air to help unclog any debris stuck in the hose. A muffler is attached to the vacuum to silence the motor exhaust and improve sound absorption, with minimal disruption to suction.
The bin found on the Panthera 2.0 robot is a 120 L standard bin, having dimensions of 0.9 m (H) × 0.4 m (W) × 0.4 m (L), and weighing approximately 2.2 kg. It is supported by a bin lift frame that moves up and down together with the bin. The bin is raised up to a height of 140 mm from the ground when the robot is moving around and performing cleaning operations. The bin is lowered to ground level when it is full and needs to be manually emptied by a human operator. This vertical motion is made possible by the bin mechanism assembly. The assembly is made up of three key components: linear actuators, an enclosure with level detection sensor, and a bin alignment system. Two linear actuators on each side of the bin deliver the force required to lift and lower the bin in a steady manner. Bin guiding forks and the alignment system consisting of level measurement sensors ensure that the bin is aligned perpendicularly to the ground while it is moving up or down. An enclosure provides a tight fitting cover when the bin is raised up fully, preventing any debris from dispersing out when the robot is cleaning. The trash level detection sensor mounted on the enclosure measures the level of rubbish currently in the bin and updates the user interface to notify robot users of the trash levels.

2.3. Motor Control

Panthera 2.0 relies on a network of motors and controllers to coordinate both locomotion and cleaning operations. For movement, the robot is equipped with two Oriental BLV640NM100F drive motors, each linked to a drive wheel via a chain-and-sprocket mechanism, as shown in Figure 5. These motors are regulated by an Oriental BLVD40NM motor controller, enabling precise adjustment of driving speed and direction. Both items are manufactured by Oriental Motor based in Tokyo, Japan.
For cleaning, each side brush assembly includes a dedicated motor for rotation and a stepper motor for vertical adjustment. Brush rotation is controlled via a Roboteq SBLG2360T controller, while height adjustments are handled by an IGUS D1 controller actuating a lifting motor. These components allow for real-time adaptation of brush height and brush speed to suit varying terrain conditions (see Figure 2).
All motor controllers interface with a central Industrial PC (IPC), which serves as the control hub for locomotion and cleaning. Communication occurs over the Modbus protocol, allowing the IPC to transmit real-time commands derived from the fuzzy inference engine. This architecture enables the robot to adjust its mechanical behavior in direct response to changes in floor type and slope detected by the LiDAR and IMU modules.
This unified control structure ensures that brush speed, brush height, and drive velocity are updated dynamically, supporting safe and efficient cleaning performance across diverse terrain conditions.

3. Terrain Classification and Adaptive Cleaning

3.1. Terrain Classification

For this paper, a ResNet-18 convolutional neural network architecture was utilized through the Roboflow platform to perform floor-type classification using LiDAR signal images. The dataset consisted of grayscale images generated from the signal intensity of a 128-channel Ouster LiDAR sensor, mounted on the front-right corner of the Panthera platform, and can be accessed in the Supplementary Materials. The sensor, which also supports obstacle detection, is angled downward to capture consistent floor scans during motion. The classification task involved three floor categories: wood, characterized by distinct planks along with textured grain and straight, narrow gaps; smooth, comprising glossy and flat surfaces; and rough, consisting of gravel, asphalt, and concrete textures as shown in Figure 6 and Figure 7.
In order to focus exclusively on the floor and remove extraneous regions such as walls or distant objects, each raw image was cropped from a resolution of 1028 × 128 to 320 × 80 pixels. The resulting dataset was then split into training (70%), validation (20%), and testing (10%) subsets. Furthermore, to improve generalization and expose the model to realistic variations in the images, a series of augmentations were applied. These included horizontal and vertical flips, 90-degree rotations, and minor rotations of ±15 degrees. These augmentations help the model learn features that are rotationally independent and make it more resilient to changes in robot orientation and floor alignment during deployment. Crucially, no augmentations such as blurring and brightness shifts were applied, as LiDAR signal images are inherently invariant to lighting conditions and do not show photometric distortions found in conventional RGB images. Following the application of all the augmentations, the final dataset consisted of 14,727 images. The model was initialized with pre-trained ImageNet weights to enable effective transfer learning by adapting features learned from large-scale natural image datasets to the domain of LiDAR-based floor classification.
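The crop-and-augment pipeline above can be sketched with plain NumPy. The crop window (bottom-center of the frame) is our assumption, since the paper does not state where the 320 × 80 region sits within the raw image; the flips and 90-degree rotations follow the augmentations described, while photometric transforms are deliberately absent:

```python
import numpy as np

def crop_floor_region(img, out_h=80, out_w=320):
    """Crop a LiDAR signal image to the floor region, discarding walls and
    distant returns. Illustrative: we take the bottom-center of the frame."""
    h, w = img.shape
    top = h - out_h             # keep the lowest rows (closest to the floor)
    left = (w - out_w) // 2     # center horizontally
    return img[top:top + out_h, left:left + out_w]

def augment(img, rng):
    """Geometric augmentations only: horizontal/vertical flips and 90-degree
    rotations. No blur or brightness shifts, since LiDAR intensity is
    lighting-invariant."""
    if rng.random() < 0.5:
        img = np.fliplr(img)
    if rng.random() < 0.5:
        img = np.flipud(img)
    k = int(rng.integers(0, 4))  # rotate by 0, 90, 180, or 270 degrees
    return np.rot90(img, k)

rng = np.random.default_rng(0)
raw = rng.integers(0, 256, size=(128, 1028), dtype=np.uint8)  # one raw frame
patch = crop_floor_region(raw)
print(patch.shape)  # (80, 320)
```

Note that an odd number of 90-degree rotations transposes the patch to 320 × 80, which is why rotation-independent features are what the classifier ends up learning.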

3.2. Adaptive Cleaning

To enable adaptive cleaning behavior based on the terrain classification output, a fuzzy logic system was developed to convert surface and slope characteristics into real-time cleaning commands. The fuzzy logic engine was designed to take four inputs: the confidence scores for wood, smooth, and rough floors obtained from the ResNet classifier (normalized between 0 and 1 and mapped to fuzzy sets of Uncertain, Moderate, and Confident), and the pitch angle from the IMU, which was categorized into DownSlope, Flat, and UpSlope using predefined trapezoidal and triangular membership functions. The fuzzy outputs control three key cleaning parameters: brush height, brush speed, and robot speed. Each output was mapped into fuzzy categories, such as Low or Standard brush height, Low or High brush speed, and Low or Normal robot speed, with membership values calibrated through experimental data. The membership functions used are shown in Figure 8.
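The triangular and trapezoidal memberships used for fuzzification can be written in a few lines. The breakpoints below are illustrative assumptions chosen to respect the ±12° pitch range reported in the paper; the actual shapes are those in Figure 8:

```python
def tri(x, a, b, c):
    """Triangular membership: feet at a and c, peak (value 1) at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def trap(x, a, b, c, d):
    """Trapezoidal membership: rises on [a, b], plateau on [b, c], falls on [c, d]."""
    if b <= x <= c:
        return 1.0
    if x <= a or x >= d:
        return 0.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def pitch_memberships(pitch_deg):
    """Fuzzify the IMU pitch into the three slope sets (hypothetical breakpoints)."""
    return {
        "DownSlope": trap(pitch_deg, -12.0, -12.0, -4.0, -1.0),
        "Flat":      tri(pitch_deg, -2.0, 0.0, 2.0),
        "UpSlope":   trap(pitch_deg, 1.0, 4.0, 12.0, 12.0),
    }

print(pitch_memberships(0.0))  # Flat dominates on level ground
```

The three classifier confidence scores are fuzzified the same way into Uncertain, Moderate, and Confident sets over [0, 1].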
To define the fuzzy rules, a total of 48 cleaning experiments were conducted—16 for each terrain type: rough, smooth, and wood. For every surface type, eight tests were performed on flat ground and eight on inclined terrain, covering all combinations of three cleaning parameters: brush height (low/standard), brush speed (low/high), and robot speed (low/normal), encoded as binary triplets (e.g., 0, 0, 0 for all low settings). During each test, the average current drawn by the left and right brush motors was recorded, along with maximum and minimum values, to evaluate cleaning effort and surface resistance. A summary of selected results is shown in Table 1, highlighting conditions that motivated the definition of safe and energy-efficient fuzzy rules. The fuzzy membership functions for pitch were defined with conservative thresholds of ±12°, based on empirical slope measurements observed in some outdoor deployment areas. During preliminary testing, slopes up to 7° were measured, which informed the need to support inclines beyond standard flat or mildly sloped environments. The ±12° range provides additional margin to account for unexpected terrain variations, ensuring the system remains responsive and robust under diverse real-world conditions. Similarly, the membership functions for the three cleaning parameters—brush speed, robot speed, and brush height—were defined based on trends observed during the same 48 cleaning experiments. The parameter cutoffs were chosen to reflect configurations that balanced cleaning performance with energy efficiency. For example, brush speeds below 600 RPM consistently led to lower power consumption on smooth and wooden surfaces, while higher speeds (above 675 RPM) were reserved for rough terrain or uncertain classifications. Robot speed membership ranges were selected to avoid instability at high speeds on slopes or delicate surfaces. 
Likewise, brush height boundaries were informed by current draw measurements: a low brush height (19–20 cm from home) provided effective contact on flat terrain, whereas slightly elevated positions (20.75–21 cm) reduced drag and current spikes on inclines. These experimentally derived boundaries form the basis for the fuzzy sets used in the rule base.
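The experimental grid behind these boundaries is a full factorial over the three binary parameters. Reconstructing the counts described above (names are ours for illustration):

```python
from itertools import product

# 3 terrain types x 2 slope conditions x 2^3 binary parameter combinations = 48 runs
terrains = ["rough", "smooth", "wood"]
slopes = ["flat", "inclined"]
# Each triplet encodes (brush_height, brush_speed, robot_speed), e.g. (0, 0, 0) = all low
params = list(product([0, 1], repeat=3))

experiments = [
    (terrain, slope, p)
    for terrain in terrains
    for slope in slopes
    for p in params
]
print(len(experiments))  # 48
```

Recording mean, maximum, and minimum brush-motor current for each of these 48 configurations is what lets the rule base separate energy-efficient settings from wasteful ones per surface and slope.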
Results from the rough terrain experiments indicated that higher brush heights consistently reduced motor current, especially on inclined surfaces. Low brush height combined with high robot speed significantly increased power draw. On smooth surfaces, parameter sensitivity was lower, with the robot tolerating a broader range of speeds and heights without excessive current. Wood surfaces required special caution due to potential surface damage under aggressive configurations. As a result, fuzzy rules for wood surfaces prioritize conservative configurations, especially when classification confidence is high. These findings align with expected physical interactions between terrain and cleaning hardware. Increased surface roughness and incline naturally elevate mechanical resistance, raising motor current. Conversely, smooth or flat surfaces present minimal resistance, supporting faster motion and more aggressive cleaning configurations.
A Mamdani-type fuzzy inference system with centroid defuzzification was implemented, resulting in a total of 81 rules governing the output actions. These rules were derived empirically from trends observed across the 48 cleaning experiments and encode terrain-dependent control strategies. For instance, when terrain is classified as rough with high confidence and the slope is upward, the robot slows down and raises the brush to reduce drag. Conversely, when smooth terrain is detected on flat ground, the system increases speed and lowers the brush for optimal coverage. In cases of uncertainty or conflicting terrain confidence, the system defaults to a safe fallback rule with low brush speed, low robot speed, and standard brush height to avoid damage. A representative subset of these rules is listed in Table 2.

3.3. Mathematical Formulation of the Fuzzy Inference System

Let $x_1 \in [-12^{\circ}, +12^{\circ}]$ denote the pitch angle obtained from the IMU, and let $x_2, x_3, x_4 \in [0, 1]$ represent the confidence levels for rough, smooth, and wood terrain classifications, respectively, as output by the LiDAR-based ResNet model. Each input is associated with fuzzy sets defined via trapezoidal or triangular membership functions, denoted $\mu_{i,j}(x_i)$, where $j$ indicates the fuzzy label (e.g., Flat, UpSlope, Confident).
The fuzzy rule base comprises 81 rules of the form:
$$\text{IF } x_1 \in A_1^k \,\wedge\, x_2 \in A_2^k \,\wedge\, x_3 \in A_3^k \,\wedge\, x_4 \in A_4^k \ \text{THEN } y \in B^k$$
where $A_i^k$ are fuzzy sets corresponding to the input conditions in rule $k$, and $B^k = [y_1^k, y_2^k, y_3^k]$ defines the output actions: brush speed, robot speed, and brush height.
A Mamdani-type fuzzy inference system is used, where individual rule outputs are aggregated, and final crisp values are obtained via centroid defuzzification. For each output variable y i , the defuzzified result is calculated as:
$$y_i = \frac{\int y \cdot \mu_{B_i}(y)\, dy}{\int \mu_{B_i}(y)\, dy}$$
where $\mu_{B_i}(y)$ is the aggregated membership function for the $i$-th output parameter.
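As a concrete illustration of Mamdani min-max aggregation followed by centroid defuzzification, the sketch below resolves a hypothetical brush-speed output when two rules fire; the RPM universe and rule strengths are invented for illustration, not taken from the rule base:

```python
import numpy as np

def tri(y, a, b, c):
    """Triangular membership function, vectorized over the universe y."""
    return np.maximum(np.minimum((y - a) / (b - a), (c - y) / (c - b)), 0.0)

# Hypothetical output universe: brush speed in RPM
y = np.linspace(500.0, 800.0, 3001)
low_speed  = tri(y, 500.0, 575.0, 650.0)
high_speed = tri(y, 650.0, 725.0, 800.0)

# Suppose two rules fire with strengths 0.8 (-> Low) and 0.3 (-> High).
# Mamdani implication clips each consequent at its rule strength (min);
# the clipped sets are then aggregated with max.
aggregated = np.maximum(np.minimum(low_speed, 0.8),
                        np.minimum(high_speed, 0.3))

# Discrete centroid defuzzification: y* = sum(y * mu) / sum(mu)
crisp = float((y * aggregated).sum() / aggregated.sum())
print(round(crisp, 1))  # pulled toward the strongly fired Low set, below 650 RPM
```

Because the Low consequent fires more strongly, the centroid lands well inside the Low region, which is exactly the conservative bias the fallback rules rely on.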
This formulation enables real-time terrain-aware adjustment of cleaning parameters based on both surface classification confidence and slope, improving operational robustness, energy efficiency, and cleaning adaptability.
All fuzzy rules and membership functions were implemented onboard the robot’s control architecture, enabling real-time terrain-aware decision-making. This integration allows the robot to dynamically adjust its cleaning parameters in response to terrain classification and slope inputs, supporting energy efficiency, operational safety, and improved cleaning effectiveness in diverse environments. While the current fuzzy system uses a fixed rule base of 81 empirically derived rules, its scalability is limited due to the rapid growth of rules with each added input category. For instance, adding more terrain types would significantly expand the rule set. Future work could explore ways to simplify the rule base or automate its adaptation, making the system easier to maintain and extend to new environments.

3.4. Overall Architecture

Figure 9 illustrates the overall control architecture of the terrain-adaptive fuzzy logic system, showing the flow from LiDAR and IMU sensors through the classification model to the fuzzy inference system and adaptive cleaning commands.

4. Results

4.1. Training Results of the Terrain Classification Model

Training was conducted over 15 epochs, during which the model exhibited rapid convergence, characterized by a consistent decline in training loss and an overall improvement in validation accuracy (Figure 10). The training loss decreased sharply within the initial three epochs, apart from a small spike possibly caused by outlier features in a training batch, and remained near zero for the rest of training, indicating effective minimization of classification error. Validation accuracy fluctuated between 90.0% and 97.1% across epochs, with intermittent local maxima and minima. These fluctuations were probably due to variations in the validation set composition or the presence of ambiguous floor patterns. Despite this variability, the validation performance remained high overall, suggesting that the model consistently learned the more generalizable features from the LiDAR signal images.
The performance of the model was evaluated on a test set containing 614 LiDAR signal images: 204 from wood floors, 204 from smooth surfaces, and 206 from rough textures. These samples were not seen during training or validation and represent an independent test set acquired from the same sensor under varying surface conditions (Figure 11).
The ResNet-18 model achieved an overall classification accuracy of 97.1%, demonstrating its strong generalization to previously unseen data as well. The metrics, including precision, recall, and F1-score, are presented in Table 3 from the confusion matrix in Figure 12, confirming the model’s ability to effectively distinguish between the three floor categories.
The model achieved high classification performance across all three floor types, with an overall accuracy of 98.4%. Wood surfaces were identified with a recall of 1.000, indicating that all wood samples were correctly predicted. However, some rough images were misclassified as wood, slightly lowering wood precision to 0.981. Smooth floors were classified with perfect precision (1.000), meaning that all predictions of the smooth class were correct. A small number of actual smooth samples were incorrectly predicted as rough, leading to a lower recall of 0.980. Rough floor classification showed slight imbalance in performance, with a precision of 0.980 and a lower recall of 0.971, due to misclassifications involving both wood and smooth predictions. These confusions are most likely attributable to overlapping surface characteristics in LiDAR signal data, particularly in regions where wooden planks exhibit irregular reflectivity or where rough textures appear smoother due to lack of features and surface variability. The strong F1-scores (from 0.976 to 0.990) indicate reliable model performance across all classes.
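The per-class metrics quoted above follow directly from the confusion matrix. A sketch of the computation; the example counts are illustrative and chosen only to roughly match the reported class sizes, not the actual matrix in Figure 12:

```python
import numpy as np

def per_class_metrics(cm):
    """Per-class precision, recall, and F1 from a confusion matrix whose
    rows are true labels and columns are predicted labels."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    precision = tp / cm.sum(axis=0)  # column sums: all predictions of a class
    recall    = tp / cm.sum(axis=1)  # row sums: all true members of a class
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical 3-class matrix, classes ordered (wood, smooth, rough)
cm = [[204,   0,   0],   # all wood samples correctly recalled
      [  0, 200,   4],   # a few smooth samples predicted as rough
      [  4,   2, 200]]   # rough confused with both wood and smooth
p, r, f1 = per_class_metrics(cm)
print(np.round(p, 3), np.round(r, 3), np.round(f1, 3))
```

With these counts, wood recall is exactly 1.0 while its precision drops below 1.0 from the rough-to-wood confusions, mirroring the pattern described in the text.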
Furthermore, the effectiveness of the geometric augmentations that were implemented on the dataset is reflected in the model’s resilience to floor orientation and viewpoint variability. These results confirm that LiDAR signal images, when processed and classified using a CNN, provide a very reliable and scalable method for terrain-type identification in autonomous robotic systems without the use of additional photometric sensors.

4.2. Overall System

To evaluate the effectiveness of the fuzzy logic-based adaptive cleaning system, two validation experiments were conducted to simulate terrain transitions in real-world cleaning environments. These experiments focused on assessing how the robot adjusted its cleaning parameters, including brush speed, robot speed, and brush height, based on pitch and terrain classification confidence.
In Experiment 1 (Figure 13 and Figure 14), the robot navigated terrain with varying slopes composed of rough and wooden surfaces. It began on flat rough pavement with slight inclines (0–2°; see Figure 13a and Figure 14a), then passed through a flatter segment (1–1.7°; Figure 13b and Figure 14a) before encountering a steeper slope (2.6° at 50 s; Figure 14a). Around second 70, the surface transitioned to wood, where the slope increased further to 3.5°, continuing until second 100 (Figure 13d). Terrain classification during this trial is shown in Figure 14b. The system confidently detected rough terrain between 0 and 60 s, except for a temporary misclassification around seconds 40–45, likely caused by a clean patch with minimal texture. Between seconds 60 and 70, the detected terrain type fluctuated during the transition to wood, then stabilized with high wood confidence beyond second 70. The fuzzy system maintained low brush height, high brush speed, and normal robot speed while traversing flat rough terrain. As the slope increased (see 45–60 s in Figure 14a), the system responded by raising the brush height to reduce drag (Figure 14c). Upon reaching the steeper wooden section, the fuzzy controller reduced brush speed (Figure 14d) and robot speed (Figure 14e) to protect surface integrity and maintain stability. These results demonstrate the robot's ability to adjust cleaning behavior based on both terrain type and slope.
In Experiment 2 (Figure 15 and Figure 16), the robot operated on flat ground and transitioned across four surface types: smooth, carpet, rough, and wood. The trial started with confident detection of smooth terrain (see 0–30 s in Figure 16b) as the robot moved over a clean floor (Figure 15a). The fuzzy controller applied high brush speed, high robot speed, and standard brush height during this segment. Around 30–50 s, the robot encountered a carpet section (Figure 15b). Because this material was not included in the training set, the model produced low confidence across all classes (Figure 16b), triggering fallback logic that lowered brush speed and robot speed while keeping brush height unchanged (Figure 16c–e). Between 50 and 85 s, the robot transitioned to rough pavement (Figure 15c). Confidence in rough classification increased after 50 s, though brief dips were observed around seconds 58 and 75. The fuzzy controller responded by reducing brush height for better surface contact (Figure 16c), with occasional fallback adjustments due to uncertainty. At second 93, the robot began detecting wooden flooring (Figure 15d,e), and wood confidence rose significantly. The system lowered brush speed (Figure 16e) and restored brush height to standard, while keeping robot speed high (Figure 16d) to maintain cleaning efficiency. These test cases validate the system’s intended behavior under diverse conditions, confirming that the fuzzy inference system dynamically adjusts outputs in response to changing terrain and confidence levels. The fallback behavior demonstrates robustness in handling unknown or untrained environments, ensuring safe and efficient operation across diverse cleaning scenarios. While surface cleanliness was kept consistent across all trials by design, performance was evaluated based on the system’s ability to select energy-efficient configurations without compromising stability.
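The fallback behavior described above (triggered by the untrained carpet surface) can be sketched as a simple confidence gate ahead of the fuzzy rules. The 0.6 threshold and the output labels here are illustrative assumptions, not the paper's tuned values:

```python
def select_mode(confidences, threshold=0.6):
    """Pick a terrain label, or fall back to a conservative default when the
    classifier is uncertain about every class (e.g. an untrained surface
    such as carpet). The threshold is an assumption for illustration."""
    label, conf = max(confidences.items(), key=lambda kv: kv[1])
    if conf < threshold:
        # Unknown surface: slow down, keep brush height unchanged
        return {"terrain": "unknown", "brush_speed": "low",
                "robot_speed": "slow", "brush_height": "unchanged"}
    return {"terrain": label}

# On carpet, no class is confident, so the conservative fallback engages
mode = select_mode({"rough": 0.31, "smooth": 0.40, "wood": 0.29})
```

Gating on the maximum class confidence is what lets the controller degrade gracefully on surfaces outside the training distribution rather than committing to a wrong terrain class.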

5. Conclusions

This work proposed a terrain-aware fuzzy logic control system for adaptive cleaning in autonomous outdoor robots. The system combines LiDAR-based terrain classification, IMU-based slope detection, and a fuzzy inference engine to dynamically adjust key cleaning parameters, such as brush speed, robot speed, and brush height, in real time.
A ResNet-18 model was trained on LiDAR signal intensity images to classify terrain into three categories: rough, smooth, and wood. Classification confidence scores were mapped to fuzzy sets, along with slope angle input from the IMU, to drive a Mamdani-type fuzzy controller composed of 81 rules. These rules were empirically derived from 48 cleaning experiments designed to evaluate the relationship between terrain type, cleaning parameters, and brush motor current.
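A minimal two-rule Mamdani sketch, using illustrative triangular membership breakpoints rather than the paper's 81-rule base, shows how pitch and terrain confidence combine through min implication, max aggregation, and centroid defuzzification:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with vertices a < b < c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

# Input fuzzy sets (breakpoints are assumptions for illustration)
def pitch_flat(p):  return tri(p, -1.0, 0.0, 2.0)
def pitch_up(p):    return tri(p, 1.0, 4.0, 7.0)
def conf_high(c):   return tri(c, 0.6, 1.0, 1.4)   # "confident"

# Output fuzzy sets for robot speed over a normalized universe of discourse
speed = np.linspace(0.0, 1.0, 101)
speed_slow = tri(speed, 0.0, 0.2, 0.5)
speed_norm = tri(speed, 0.3, 0.6, 0.9)

def infer_robot_speed(pitch_deg, rough_conf):
    """Two-rule Mamdani inference: min implication, max aggregation,
    centroid defuzzification."""
    # Rule 1: IF pitch is flat AND rough is confident THEN speed is normal
    w1 = min(pitch_flat(pitch_deg), conf_high(rough_conf))
    # Rule 2: IF pitch is up AND rough is confident THEN speed is slow
    w2 = min(pitch_up(pitch_deg), conf_high(rough_conf))
    agg = np.maximum(np.minimum(w1, speed_norm), np.minimum(w2, speed_slow))
    if agg.sum() == 0:
        return 0.0
    return float((speed * agg).sum() / agg.sum())   # centroid

flat_speed = infer_robot_speed(0.0, 0.95)    # confident rough, flat ground
slope_speed = infer_robot_speed(4.0, 0.95)   # confident rough, uphill
```

The same pattern, scaled to four inputs and three outputs, yields the slope-dependent slowdown reported in the field trials: the uphill rule clips the "slow" output set, pulling the centroid toward lower speeds.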
The system was validated through two real-world experiments involving flat and sloped terrain transitions. Results demonstrated that the robot successfully adjusted its behavior according to terrain conditions, raising brush height and reducing speed on delicate or sloped surfaces, while increasing brush speed and lowering brush height on rough surfaces to improve contact. These findings support the system’s ability to enhance safety, cleaning effectiveness, and energy efficiency in unstructured outdoor environments. Notably, all experiments were performed under night-time conditions, confirming the system’s robustness in low-light environments.
Future work will focus on integrating multi-sensor fusion to improve terrain classification robustness under variable environmental conditions, and on extending the control logic to handle wet or slippery surfaces, where traction and cleaning dynamics present additional challenges.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/math13142245/s1.

Author Contributions

Conceptualization, R.F.G.A. and M.A.V.J.M.; methodology, R.F.G.A., M.A.V.J.M., A.J. and A.K.Z.; software, R.F.G.A., A.J. and A.K.Z.; validation, R.F.G.A., A.J. and A.K.Z.; formal analysis, R.F.G.A., M.A.V.J.M. and M.R.E.; investigation, R.F.G.A., A.K.Z. and A.J.; resources, M.R.E.; data curation, R.F.G.A., A.K.Z. and A.J.; writing—original draft preparation, J.W.S.L.; supervision, M.R.E. and M.A.V.J.M.; project administration, M.R.E.; funding acquisition, M.R.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research is supported by the National Robotics Programme under its National Robotics Programme (NRP) BAU, PANTHERA 2.0: Deployable Autonomous Pavement Sweeping Robot through Public Trials, Award No. M23NBK0065 and also supported by A*STAR under its RIE2025 IAF-PP programme, Modular Reconfigurable Mobile Robots (MR)2, Grant No. M24N2a0039.

Data Availability Statement

The dataset is provided as Supplementary Material.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. Panthera robot overview.
Figure 2. Panthera 2.0 side brush mechanism.
Figure 3. Panthera 2.0 cylindrical brush assembly.
Figure 4. Panthera 2.0 vacuum and bin mechanism assembly.
Figure 5. Panthera 2.0 drive mechanism assembly.
Figure 6. RGB images of three floor classes: (a) wood, (b) rough, and (c) smooth.
Figure 7. Representative LiDAR signal intensity images of the three floor classes: (a) wood, (b) rough, and (c) smooth.
Figure 8. Fuzzy membership functions used for pitch angle, terrain classification confidence levels, and cleaning parameters. Inputs include pitch from the IMU and confidence scores from the ResNet-18 terrain classifier. Outputs control brush speed, robot speed, and brush height. All functions were defined using triangular and trapezoidal shapes based on experimental data.
Figure 9. System diagram of the fuzzy inference-based terrain-adaptive cleaning controller. The robot receives LiDAR and IMU data, which are processed through a ResNet-18 model to estimate terrain confidence levels and pitch angle. These inputs are passed through a Mamdani-type fuzzy inference engine, producing adaptive outputs: brush speed, robot speed, and brush height.
Figure 10. Training loss and validation accuracy of the ResNet-18 model over 14 epochs. The training was conducted on the Roboflow platform.
Figure 11. Representative LiDAR signal intensity images of the three floor types classified by the model: (a) rough, (b) wood, and (c) smooth.
Figure 12. Confusion matrix showing predictions of the ResNet-18 model on the combined test set of 324 LiDAR signal images. The confusion matrix is visualized using a color gradient, where darker shades indicate higher frequencies and lighter shades represent lower frequencies.
Figure 13. Experiment 1: slope terrain transition photos. (a) Robot on flat rough terrain, (b) transitioning to sloped rough terrain, (c) transitioning to sloped wood terrain, (d) on wooden slope.
Figure 14. Experiment 1: slope terrain transition plots. (a) IMU pitch angle data showing terrain inclination changes throughout the experiment. (b) Terrain classification confidence levels (rough, smooth, wood). (c) Brush height adaptation based on slope and terrain. (d) Brush speed regulation in response to terrain and classification certainty. (e) Robot speed modulation to ensure safe cleaning across slope transitions.
Figure 15. Experiment 2: flat terrain transition photos. (a) Robot on smooth floor, (b) transitioning to carpet, (c) transitioning to rough floor, (d) transitioning to wood, (e) on wooden floor.
Figure 16. Experiment 2: flat terrain transition plots. (a) IMU pitch data confirming flat terrain throughout the experiment. (b) Terrain classification confidence levels (rough, smooth, wood). (c) Brush height adjustment in response to terrain changes and classification fluctuations. (d) Robot speed variation to ensure control during uncertain classification phases. (e) Brush speed adaptation for effective cleaning and surface preservation.
Table 1. Selected experimental results illustrating current draw trends under different cleaning parameter combinations and terrain conditions.
| Terrain | Slope | Combo | Avg Current (A) | Observation | Fuzzy Rule Outcome |
|---|---|---|---|---|---|
| Rough | Flat | 0, 1, 0 | 3.86 | High current at low height | Raise brush height |
| Rough | Slope | 0, 0, 1 | 4.52 | High speed increases drag | Reduce robot speed |
| Wood | Flat | 0, 1, 0 | 3.11 | Surface risk due to speed | Lower brush speed |
| Wood | Slope | 0, 1, 0 | 3.60 | Aggressive on slope | Conservative setting |
| Smooth | Flat | 1, 0, 1 | 2.50 | Low current, stable | Acceptable config |
| Smooth | Slope | 1, 1, 1 | 2.48 | Minimal load observed | Maintain setting |
Table 2. Representative fuzzy logic rules linking slope and terrain confidence to adaptive cleaning parameters. Abbreviations: down = DownSlope, flat = Flat, up = UpSlope, conf = Confident, mod = Moderate, unc = Uncertain, stand = Standard brush height. Improbable rules with multiple confident terrain types are excluded.
| Pitch | Rough | Smooth | Wood | B Speed | R Speed | Brush Height |
|---|---|---|---|---|---|---|
| down | conf | mod/unc | mod/unc | high | slow | stand |
| down | mod/unc | conf | mod/unc | high | slow | stand |
| down | mod/unc | mod/unc | conf | low | slow | stand |
| flat | conf | mod/unc | mod/unc | high | normal | low |
| flat | mod/unc | conf | mod/unc | high | normal | stand |
| flat | mod/unc | mod/unc | conf | low | normal | stand |
| up | conf | mod/unc | mod/unc | high | slow | stand |
| up | mod/unc | conf | mod/unc | high | slow | stand |
| up | mod/unc | mod/unc | conf | low | slow | stand |
| down | mod/unc | mod/unc | mod/unc | low | slow | stand |
| flat | mod/unc | mod/unc | mod/unc | low | slow | stand |
| up | mod/unc | mod/unc | mod/unc | low | slow | stand |
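The representative crisp rule outcomes in Table 2 can be encoded as a lookup table. This sketch uses the table's abbreviations directly and assumes at most one confident terrain class (or none, the fallback rows) per decision step:

```python
# Rule table keyed by (pitch, confident terrain or None), following Table 2;
# values are (brush_speed, robot_speed, brush_height).
RULES = {
    ("down", "rough"):  ("high", "slow",   "stand"),
    ("down", "smooth"): ("high", "slow",   "stand"),
    ("down", "wood"):   ("low",  "slow",   "stand"),
    ("flat", "rough"):  ("high", "normal", "low"),
    ("flat", "smooth"): ("high", "normal", "stand"),
    ("flat", "wood"):   ("low",  "normal", "stand"),
    ("up",   "rough"):  ("high", "slow",   "stand"),
    ("up",   "smooth"): ("high", "slow",   "stand"),
    ("up",   "wood"):   ("low",  "slow",   "stand"),
    # Fallback rows: no class is confident, so act conservatively
    ("down", None):     ("low",  "slow",   "stand"),
    ("flat", None):     ("low",  "slow",   "stand"),
    ("up",   None):     ("low",  "slow",   "stand"),
}

def crisp_outputs(pitch, confident_terrain=None):
    """Return (brush_speed, robot_speed, brush_height) for one rule row."""
    return RULES[(pitch, confident_terrain)]
```

In the full Mamdani controller these consequents are fuzzy sets blended by rule firing strengths; the lookup shows only the dominant outcome of each row.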
Table 3. Classification performance of the ResNet-18 model on the test set (614 samples).
| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Wood | 0.981 | 1.000 | 0.990 |
| Smooth | 0.990 | 0.980 | 0.985 |
| Rough | 0.980 | 0.971 | 0.976 |
| Accuracy | | | 0.984 |
| Macro Average | 0.984 | 0.984 | 0.984 |
| Weighted Average | 0.984 | 0.984 | 0.984 |