Review

Performance of Advanced Rider Assistance Systems in Varying Weather Conditions

1 School of Science and Technology, University of Trás-os-Montes and Alto Douro (UTAD), 5000-801 Vila Real, Portugal
2 Institute for Systems and Computer Engineering, Technology and Science, 4200-465 Porto, Portugal
* Author to whom correspondence should be addressed.
Vehicles 2025, 7(4), 105; https://doi.org/10.3390/vehicles7040105
Submission received: 22 July 2025 / Revised: 21 September 2025 / Accepted: 22 September 2025 / Published: 24 September 2025

Abstract

Advanced rider assistance systems (ARAS) play a crucial role in enhancing motorcycle safety through features such as collision avoidance, blind-spot detection, and adaptive cruise control, which rely heavily on sensors like radar, cameras, and LiDAR. However, their performance is often compromised under adverse weather conditions, leading to sensor interference, reduced visibility, and inconsistent reliability. This study evaluates the effectiveness and limitations of ARAS technologies in rain, fog, and snow, focusing on how sensor performance, algorithms, techniques, and dataset suitability influence system reliability. A thematic analysis was conducted, selecting studies focused on ARAS in adverse weather conditions based on specific selection criteria. The analysis shows that while ARAS offer substantial safety benefits, their accuracy declines in challenging environments. Existing datasets, algorithms, and techniques were reviewed to identify the most effective options for ARAS applications. However, more comprehensive weather-resilient datasets and adaptive multi-sensor fusion approaches are still needed. Progress in these areas will be critical to improving the robustness of ARAS and ensuring safer riding experiences across diverse environmental conditions.

1. Introduction

Motorcycle safety is a significant concern worldwide, with motorcycle accidents often resulting in severe injuries and fatalities [1]. Motorcycles represent a prevalent and often economical mode of transportation globally, with a significant presence in Europe, where approximately 25 million were recorded in 2018 [2]. Despite their utility and popularity, motorcyclists face considerably elevated risks of severe injury and fatality on the roads due to their inherent vulnerability. In 2019, more than 3500 motorcyclists lost their lives on European roads, representing 16% of all road fatalities, a proportion that has slightly increased over the last decade despite a 16% reduction in absolute numbers [2]. The factors behind these fatalities and injuries can be broadly categorized into human factors (e.g., age, inexperience, speeding, helmet non-use), vehicle-related factors (e.g., lack of safety features, braking instability), and environmental factors (e.g., poor road infrastructure and adverse weather conditions) [3].
Several factors contribute to the prevalence of motorcycle accidents. A primary concern is that motorcyclists are frequently overlooked by other road users, often due to their smaller visual profile and blind spots [2]. Additionally, inadequate or poorly maintained road infrastructure, such as uneven surfaces, potholes, and loose gravel or debris, poses significant hazards specifically for two-wheeled vehicles [2,4]. Rider-specific factors also play a critical role; these include inexperience, particularly among younger riders, excessive speed, and impairment due to alcohol or drugs [2]. Fatigue and risky riding behaviors are further elements that can compromise safety [3].
Efforts to improve motorcycle safety have seen advancements in vehicle technology, with anti-lock braking systems notably contributing to a reduction in accidents [2]. However, beyond technological improvements, a holistic approach is deemed essential for addressing the multifaceted safety challenges confronting powered two- and three-wheeler users globally. This includes promoting safer road infrastructure, improving enforcement of traffic laws, fostering safer post-crash care, and enhancing public awareness regarding motorcycle safety [3]. The complex relationship among these factors requires continued and comprehensive strategies to mitigate risks and improve road safety for motorcyclists worldwide.
To address these risks, advanced rider assistance systems (ARAS) have emerged as a promising solution [5,6,7]. ARAS are systems designed to aid the rider during the operation of the vehicle. These systems integrate various technologies that enhance rider safety by providing real-time alerts and assistance, such as collision avoidance, blind-spot detection, adaptive cruise control [5], and anti-lock braking systems [8,9,10,11,12]. By offering these features, ARAS aims to reduce human error, which is the leading cause of motorcycle accidents.
However, the effectiveness of ARAS is not uniform across all conditions; adverse weather conditions, such as heavy rain or fog, can significantly affect the performance of the sensor-based systems integral to ARAS. These environmental factors can reduce the reliability and effectiveness of the assistance systems in real-world scenarios [13,14]. Recent research on vehicle speeds showed that all vehicle types traveled at higher speeds under sunny conditions, while significant variations were observed in rainy weather. In particular, motorcycles were more affected by adverse weather, reflecting their greater exposure compared with heavier vehicles [15]. Adverse weather, such as rain, fog, and snow, presents unique challenges to the sensor systems that are central to ARAS [16]. For instance, optical sensors such as cameras may experience reduced visibility in fog or rain, Radar (radio detection and ranging) may be disrupted by heavy precipitation or snow, and LiDAR (light detection and ranging) systems may face difficulties in low-contrast environments.
In recent years, several reviews have explored the development and potential of ARAS for motorcycles. For example, Savino et al. [11] provided a systematic review of active safety systems for powered two-wheelers, highlighting the diversity of technologies and the lack of comparative evaluations across systems. Kaye et al. [17] examined rider beliefs and perceptions toward ARAS using the theory of planned behaviour, offering insights into the psychological and behavioural barriers to adoption. Naweed and Blackman [18] critically analyzed how ARAS are discussed and marketed by manufacturers, exposing a gap between industry claims and empirical evidence. Other reviews have addressed human factors [19], safety discourse [18], and system-level effectiveness through real crash data [20]. Despite these valuable contributions, prior reviews have typically focused on specific aspects such as rider psychology, industry framing, or single-system performance. This paper extends the literature by providing an integrated, weather-focused evaluation of ARAS, synthesizing information on individual modules, sensor technologies, machine learning algorithms, and public datasets. Uniquely, we structure the review around system suitability under adverse weather conditions, an area that remains underexplored in the existing literature.
This study evaluates and highlights the effectiveness of current ARAS technologies under different weather conditions. It assesses how various sensors (LiDAR, Radar, and cameras) perform in rain, fog, and snow scenarios, and identifies which technologies and datasets are best suited to maintaining optimal performance in these conditions. By understanding these limitations, the study provides insights into the most reliable ARAS systems and algorithms for enhancing motorcycle safety across diverse environmental challenges.
The organization of the paper is outlined in Figure 1 and structured as follows: Section 2 describes the methodology and data collection process for the review. Section 3 discusses the different modules of ARAS under various weather conditions, where the main focus is on evaluating rainy, foggy, and snowy conditions for each module. Section 4 focuses on ARAS sensor technologies, with particular attention to LiDAR, Radar, and camera behavior in adverse weather conditions. Section 5 provides an overview of relevant datasets that are applicable to ADAS or ARAS and have also been evaluated for adverse weather conditions. Furthermore, Section 6 examines algorithms and techniques that are directly or indirectly related to ARAS, and Section 7 presents the discussion, analysis, conclusion, and future recommendations.

2. Method

This study employs a qualitative, literature-driven methodology to investigate the performance of ARAS under diverse weather conditions. Rather than following a formal systematic review protocol, this work employs a thematic review approach in which existing research is collected, examined, and synthesized around clearly defined themes rather than being organized strictly by chronology or methodology. The emphasis is on identifying patterns, recurring topics, and conceptual categories across studies, which are then grouped and discussed under thematic headings. This approach was selected to provide a structured overview of existing work by organizing studies according to predefined themes. The scope was limited to ARAS-related research that explicitly considered environmental influences.
Google Scholar, IEEE Xplore, and ScienceDirect were used to identify relevant articles from peer-reviewed journals, conference proceedings, and established datasets in the fields of intelligent transportation systems and computer vision. The selection criteria prioritized studies evaluating ARAS or related advanced driver assistance systems (ADAS) in relation to environmental factors, such as sun, rain, fog, and snow. A special focus was placed on papers presenting empirical results, technology comparisons, or novel methods involving Radar, LiDAR, and camera-based systems. A total of 136 papers were selected. To better understand the evolution of research activity in the ARAS field, the publication years of the reviewed articles were analyzed. As illustrated in Figure 2, ARAS-related studies remained limited until around 2015, after which a steady growth trend emerged. The number of publications increased significantly from 2019 onward, aligning with the broader adoption of advanced sensing technologies, increased interest in motorcycle safety, and the rising availability of benchmark datasets. The recent surge suggests a transition from feasibility studies toward system integration and evaluation under diverse operational conditions, including adverse weather, underscoring the relevance of this review’s focus.

3. Modules of ARAS Under Various Weather Conditions

ARAS has become a central technology in improving motorcycle safety and rider confidence, particularly in challenging weather conditions. The system integrates a range of technologies, including Radar, LiDAR, cameras, and advanced control algorithms, to support riders with real-time situational awareness and automated responses. Key ARAS functionalities include adaptive cruise control (ACC) for maintaining safe distances, anti-lock braking systems (ABS) to prevent wheel lock-up, and automatic emergency braking (AEB) for rapid collision mitigation. Additional features like blind-spot detection (BSD), curve warning, front collision warning (FCW), and intersection support system (ISS) provide proactive hazard recognition, especially under poor visibility or slippery road surfaces. Lane-keeping assist system (LKAS) and speed alerts (SAs) help drivers maintain vehicle discipline, while stability and traction control systems (SCS and TCS) ensure optimal handling during rain, fog, or snow. All these technologies from ADAS are increasingly central to ARAS, with companies like Robert Bosch GmbH and Continental AG developing radar-based safety features such as adaptive cruise control, collision warning, and blind-spot detection. Motorcycle manufacturers, including BMW Motorrad, Honda, Yamaha, Kawasaki, Ducati, and KTM, are integrating these features into higher-end models. Weather-related challenges pose significant risks to motorcyclists; therefore, the role of ARAS in maintaining safety and performance across environmental conditions is increasingly critical and well-supported by empirical research and dataset-driven innovations.

3.1. Adaptive Cruise Control Under Various Weather Conditions

ACC systems are designed to maintain a safe vehicle distance and speed automatically, enhancing both comfort and safety during driving. Under optimal weather conditions, such as sunny or dry environments, ACC systems demonstrate reliable performance, with minimal velocity and relative distance errors, less than 0.6% and 0.5%, respectively, thereby ensuring high operational accuracy and passenger comfort [21].
In rainy weather, system performance slightly degrades due to increased braking distances and reduced tire-road friction. Real-world studies on commercial motor vehicles indicated a modest reduction in ACC usage during rain and a significant increase in following headways to maintain safety [22]. Simulation studies corroborated these findings, showing increased travel times and reduced average speeds in rainy scenarios [23].
Although specific experimental data on foggy conditions were sparse, the literature recognizes fog as a major challenge due to sensor performance limitations. Radar and camera systems, critical components of ACC, may experience degraded object detection in fog, prompting many manufacturers to discourage ACC use in such environments [21].
Snowy and icy conditions pose the most significant challenges to ACC functionality. Drastically reduced road friction requires extended stopping distances, up to four times greater than on dry roads. However, adaptive control methods such as fuzzy logic and dynamically adjusted reference signals (e.g., Gaussian or sine-based profiles) enable ACC systems to maintain safe operation despite severe weather. These systems modulate the target velocity and safe distance in real time, effectively balancing safety and comfort [21,24]. Table 1 provides a summary of ACC performance under different weather conditions.
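As an illustration of the adaptive reference-signal idea, the sketch below blends the current speed toward a friction-adjusted target along a Gaussian-shaped ramp and widens the time headway as grip drops. All function names, parameters, and threshold values are hypothetical and chosen for readability; they are not taken from the controllers cited above.

```python
from math import erf, sqrt

def gaussian_velocity_ramp(v_current, v_target, t, t_mid=4.0, sigma=1.5):
    """Blend from the current speed to a reduced target speed along a smooth
    Gaussian-CDF-shaped ramp, avoiding abrupt ACC set-point steps.
    Parameter names and values are illustrative only."""
    w = 0.5 * (1.0 + erf((t - t_mid) / (sigma * sqrt(2.0))))  # weight rises 0 -> 1 around t_mid
    return (1.0 - w) * v_current + w * v_target

def adjust_for_friction(v_set, headway_set, mu):
    """Scale the ACC set speed and time headway with an estimated friction
    coefficient mu (about 1.0 dry, 0.5 wet, 0.2 ice). Purely illustrative."""
    v_adj = v_set * min(1.0, mu / 0.9)                 # slow down as grip drops
    headway_adj = headway_set * (0.9 / max(mu, 0.2))   # open up the time gap
    return v_adj, headway_adj

# Example: easing from 100 km/h toward a snow-adjusted set speed (mu ~ 0.25).
v_snow, h_snow = adjust_for_friction(v_set=100.0, headway_set=1.8, mu=0.25)
profile = [gaussian_velocity_ramp(100.0, v_snow, t) for t in range(0, 8)]
print([round(v, 1) for v in profile], "target headway:", round(h_snow, 2), "s")
```

The Gaussian-shaped ramp is only one possible reference profile; the point is that the set point changes smoothly rather than stepping, which is the behavior the adaptive methods above aim for.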

3.2. ABS Performance Under Different Weather Conditions

ABS exhibits variable performance depending on road surface conditions, which are influenced by weather. Under sunny and dry conditions, ABS systems perform optimally. They help maintain vehicle stability, minimize wheel lock, and reduce stopping distances [25]. Enhanced friction from both the tire-road interface and high-performance brake materials like carbon-fiber ceramic discs further boosts ABS effectiveness in such scenarios [25].
In rainy weather, ABS remains beneficial but with diminished effectiveness due to a significant drop in tire-road friction caused by water films. While ABS-equipped vehicles still outperform non-ABS ones, studies report increased stopping distances and late-phase wheel lock during braking on wet asphalt [25]. Nonetheless, ABS has been shown to reduce crash involvement significantly, with injury reductions reported as high as 57–60% in wet road conditions [26].
Foggy conditions were not explicitly tested in the analyzed studies. However, given that fog often results in damp road surfaces, ABS performance can be inferred to follow patterns similar to rainy conditions, providing continued benefits through improved steering control, albeit with some performance loss due to moisture-related traction reductions.
On snowy or icy roads, ABS effectiveness is most compromised. The significantly lower friction and unstable slip conditions on these surfaces increase stopping distances even with ABS activated, and some studies show that ABS can become less effective, or even detrimental, under such conditions if not paired with additional systems such as traction control or adaptive braking logic [27,28]. Table 2 summarizes the effectiveness of ABS systems across weather conditions.

3.3. Automatic Emergency Braking Systems in Different Weather Conditions

AEB systems are a critical component of modern vehicle safety technology. Their performance, however, is notably influenced by environmental conditions, particularly weather. This section synthesizes findings from recent studies to evaluate how AEB systems perform in sunny, rainy, foggy, and snowy conditions.
Under sunny weather, AEB systems generally perform at optimal levels. Sensor visibility is clear, and traction is high, which supports precise object detection and effective braking response [29].
In rainy weather, performance begins to degrade. Camera and LiDAR sensors experience reduced reliability due to water interference, and lower road adhesion leads to increased braking distances. While AEB systems still offer some crash mitigation, particularly for pedestrians, their effectiveness for bicyclist-related incidents is not statistically significant [29,30].
Foggy conditions pose even greater challenges, as dense fog interferes with sensor accuracy, especially optical systems. Although some crash risk reduction has been observed in pedestrian scenarios, the effectiveness remains inconsistent and largely ineffective for bicyclist incidents [30].
In snowy environments, AEB performance is significantly compromised. Snow reduces both traction and sensor visibility, particularly for cameras and LiDAR. Unless systems are specifically designed or adapted to low-friction conditions, braking performance suffers greatly. However, research indicates that driver-trust and safety perceptions can be improved when AEB systems incorporate snow-adaptive braking algorithms [31]. Table 3 provides an overview of the AEB system performance under various weather conditions.

3.4. Blind-Spot Detection Systems Under Different Weather Conditions

BSD systems demonstrate strong performance in clear weather but face various degrees of degradation under adverse conditions. In sunny conditions, vision-based systems utilizing optical flow, feature extraction, and convolutional neural networks consistently perform with high accuracy, often exceeding 95% detection rates [32,33,34].
Under rainy conditions, the performance of purely camera-based systems deteriorates due to visual noise from water droplets, reflections, and reduced contrast [34,35]. However, sensor fusion systems integrating Radar and cameras show greater resilience, compensating for visual weaknesses with Radar robustness [35,36].
Foggy weather was less commonly tested directly, but vision-based systems are expected to underperform due to low visibility and contrast loss. Radar and LiDAR may help mitigate these issues, although their effectiveness can vary with fog density [35].
Snowy conditions pose significant challenges for both Radar and vision systems. Snow introduces signal attenuation and environmental clutter for Radar and obscures visual cues for cameras. Most single-sensor systems are insufficient under such conditions, reinforcing the need for multi-sensor redundancy and adaptive algorithms [35].
Overall, BSD systems perform best in sunny conditions and can maintain acceptable performance in rain when sensor fusion is used. Fog and snow remain problematic and require further research into adaptive sensing strategies and robust environmental perception.

3.5. Curve Warning System Performance Under Different Weather Conditions

Curve warning systems, as part of broader ARAS, demonstrate varying levels of effectiveness across different weather conditions. Under sunny or clear conditions, these systems are generally assumed to function optimally, as environmental interference is minimal. While no specific performance limitations were identified, user studies primarily focused on adverse conditions, offering limited insights into their relative benefit during ideal weather [17].
In rainy conditions, riders consistently acknowledged the utility of systems like ABS, traction control, and curve warnings. These features enhance motorcycle control and safety by mitigating slip risks and improving braking and cornering reliability on wet surfaces. Riders found ARAS particularly reassuring in rain, though concerns about potential overreliance and system malfunction were also noted [17].
Foggy conditions presented a significant context for performance testing. Simulator-based experiments revealed that curve warning systems integrated with Head-Up Displays (HUDs), particularly when paired with auditory alerts, substantially reduced lane departure and speed deviations in dense fog. These systems were especially effective for female and older riders, offering meaningful improvements in situational awareness and control [37].
In snowy or icy conditions, systems targeting icy-curve warnings were found to be beneficial for hazard anticipation and reaction. Early alerts improved rider responses, though stability concerns arose with abrupt automated interventions like autonomous emergency braking. These findings underscore the need for carefully timed and user-transparent warnings to maintain rider control [19]. Table 4 provides the performance summary of the curve warning system in different weather conditions.

3.6. FCW System Performance Under Different Weather Conditions

FCW systems demonstrate varying levels of performance depending on environmental conditions, particularly weather. Under sunny conditions, systems generally exhibit optimal detection accuracy. Chou et al. [38] reported detection rates exceeding 90% in both sunny and cloudy environments, reflecting the effectiveness of vision-based detection under high-visibility conditions.
In rainy conditions, performance moderately declines due to the reduction in visibility and interference caused by rain droplets and road spray. Despite this, Chou et al. [38] observed detection rates over 80%, suggesting resilience to moderate visual distortion. Pan et al. [39] similarly noted a drop in recognition accuracy from ~91% to ~81.9% when transitioning from clear to rainy weather, alongside increased computational demand.
Foggy conditions pose a significant challenge to FCW systems, especially those reliant on vision or Radar. Several studies have demonstrated that fog can severely impair detection accuracy and reduce warning times. Chen et al. [40] introduced the ViCoWS system, which adjusted its warning horizon based on visibility and extended the warning time up to 4.5 s under dense fog (120 m visibility). Additionally, Zhang et al. [41] developed a low-visibility FCW algorithm that significantly reduced collision risk compared to standard systems, with a reported improvement of up to 10.88 times in fog scenarios.
In contrast, snowy conditions remain less explored within the current literature. While snow presents visual and sensor interference challenges similar to those of fog and rain, no empirical performance data specific to snow were found among the reviewed studies. It is reasonable to infer that performance would similarly degrade unless systems are explicitly designed to compensate for snow-induced distortions. The system performance is summarized in Table 5.
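To illustrate how a visibility-aware warning horizon of the kind described for foggy conditions can be computed, the sketch below estimates the distance needed to react and brake, and lengthens the warning time when that envelope approaches the visible range. The model and thresholds are deliberately simple placeholders and do not reproduce the ViCoWS or low-visibility algorithms cited above.

```python
def warning_distance(v_ego_kmh, v_lead_kmh, reaction_time_s, mu):
    """Distance (m) needed to react and shed the closing speed.
    Simple constant-deceleration model with a_max = mu * g."""
    g = 9.81
    v_ego = v_ego_kmh / 3.6
    v_lead = v_lead_kmh / 3.6
    closing = max(v_ego - v_lead, 0.0)
    a_max = mu * g
    return v_ego * reaction_time_s + closing ** 2 / (2.0 * a_max)

def visibility_aware_warning_time(v_ego_kmh, v_lead_kmh, visibility_m,
                                  mu=0.8, reaction_time_s=1.2):
    """Issue the warning earlier when the required stopping envelope is large
    relative to the visible range. Thresholds are illustrative placeholders."""
    d_need = warning_distance(v_ego_kmh, v_lead_kmh, reaction_time_s, mu)
    closing = max((v_ego_kmh - v_lead_kmh) / 3.6, 0.1)
    # Add a safety margin when visibility is tight relative to the required distance.
    margin = 0.25 * d_need if visibility_m < 3 * d_need else 0.0
    return (d_need + margin) / closing   # seconds of warning before the predicted conflict

print(round(visibility_aware_warning_time(90, 40, visibility_m=120, mu=0.5), 1), "s in dense fog")
print(round(visibility_aware_warning_time(90, 40, visibility_m=1000), 1), "s in clear weather")
```

The example numbers (90 km/h ego speed, 40 km/h lead speed, 120 m visibility) are arbitrary; the mechanism of interest is that reduced visibility and reduced friction both stretch the warning horizon.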

3.7. Intersection Support Systems Under Different Weather Conditions

ISS, particularly those designed for powered two-wheeled vehicles and vehicular ad hoc networks (VANETs), show promising performance under ideal or sunny conditions. Evaluations performed in simulated or controlled environments demonstrate the systems’ ability to assess threats effectively using receding horizon control and onboard sensors [42]. These systems offer accurate real-time assessments, effective rider feedback mechanisms, and high user acceptance in clear weather [42,43].
However, the performance under adverse weather conditions such as rain, fog, and snow is largely untested in the reviewed literature, as shown in Table 6. While the VANET-based frameworks acknowledge environmental and sensor reliability challenges [44], there is a lack of explicit modeling or testing under such conditions. Rain may degrade the accuracy of visual sensors and laser scanners, while fog introduces visibility constraints that could impact non-cooperative systems. Snow poses more severe limitations due to both environmental occlusions and reduced vehicle maneuverability, factors that challenge the assumptions made in current dynamic models and control strategies [44].
Although technologies like Radar and DSRC communications are better suited for adverse weather, the current systems have not incorporated or tested such adaptations. This indicates a need for expanded validation across realistic, weather-diverse scenarios to ensure robust and reliable system performance in all conditions [43,44].

3.8. Lane-Keeping Assist System Under Different Weather Conditions

LKAS demonstrates high reliability in sunny weather, benefiting from optimal visibility and well-marked lanes. Under such conditions, lane detection algorithms achieve accuracy rates above 98%, making them highly effective for real-time applications [45].
However, performance begins to degrade in adverse weather. In rainy conditions, LKAS effectiveness diminishes due to factors like reduced contrast and water interference on sensors. Despite this, systems with enhanced image processing, such as those using convolutional neural networks (CNNs) or histogram equalization, retain moderate accuracy [45].
Foggy conditions present a greater challenge, significantly impairing system reliability. Visibility is reduced, and lane markings become indistinct, which affects both human and algorithmic lane-keeping capabilities. While systems employing optical flow and contextual regularization offer some improvement, the overall performance remains suboptimal [46].
Snowy conditions are among the most problematic for LKAS. Snow often covers lane markings entirely, leading to substantial degradation in detection accuracy. Nonetheless, a few robust algorithms manage to extract features despite the snow cover, though their success rates are still notably lower compared to clear weather scenarios [45]. The overall performance is summarized in Table 7.

3.9. Traffic Sign Detection Systems Under Weather Conditions

Traffic sign detection systems, particularly those used for “speed alert” functions, demonstrate varying levels of effectiveness depending on weather conditions. In sunny weather, most systems perform at their best due to high visibility and clear contours. For example, YOLOv3-based systems achieve high average precision (AP) and low false negatives in clear conditions [47]. Similarly, DFR-TSD and improved YOLOv5-based models report excellent recall and precision under sunny scenarios [48,49].
In rainy conditions, performance consistently drops across all models due to visibility reduction and motion blur. However, the DFR-TSD framework’s enhancement module and the YOLOv5-based system with coordinate attention (CA) show significant robustness and improved detection rates despite these challenges [48,49].
Foggy environments pose the greatest challenge for detection models, particularly due to image blurring and low contrast. DPF-YOLOv8, which was explicitly trained on fog-augmented datasets, improves mean average precision (mAP) by over 2% compared to standard YOLOv8, demonstrating superior adaptation to hazy weather [50]. Similarly, DFR-TSD’s modular architecture enhances sign visibility, reducing false negatives even in dense fog [48].
Under snowy conditions, the detection accuracy is moderately affected. While visibility is sometimes compromised, models like DFR-TSD and improved YOLOv5 maintain high precision and relatively stable recall due to robust feature extraction techniques [48,49]. The summarized performance of the system is illustrated in Table 8.

3.10. Stability Control System Performance

SCS, including technologies such as ABS, TCS, and motorcycle stability control (MSC), demonstrate varying levels of effectiveness depending on environmental conditions.
In sunny weather, these systems perform optimally. Favorable conditions such as dry, warm roads have been associated with increased motorcycle use and extended riding seasons, and not with any reduction in system performance [51]. This suggests that under dry conditions, these technologies function as designed, providing reliable support for vehicle control and crash avoidance.
In rainy conditions, performance may be compromised. The effectiveness of stability-related ADAS can be diminished due to decreased traction and sensor limitations caused by precipitation [51,52]. While systems like ABS offer significant benefits, their full potential can be impaired by real-world conditions, such as wet surfaces.
Under foggy weather, system performance is further impacted. Detection systems such as Radar and cameras may suffer from reduced visibility, affecting the timely activation of emergency interventions [52]. Although direct data on fog-related motorcycle stability are limited, modeling adjustments in ADAS performance confirm these limitations.
In snowy weather, performance is least reliable. Snow creates severe traction challenges and may obscure sensor functionality entirely. As a result, the reduction in crash potential attributed to these systems is expected to be significantly lower, and no empirical motorcycle-specific data are yet available [52]. Table 9 summarizes the performance of the system in adverse weather conditions.

3.11. Traction Control System Performance Across Different Weather Conditions

TCS performance varies significantly depending on environmental conditions. Under sunny conditions, TCS operates in its optimal state due to high road–tire friction. These conditions are frequently used as benchmarks in modeling and control design, where the system can maintain stability with minimal wheel slip [53].
In rainy conditions, TCS performance is moderately affected. Although road–tire friction is reduced, the damping effect on engine-to-slip dynamics can assist in maintaining control, assuming the control system is properly tuned for such transitions [54]. However, timely torque modulation becomes more critical.
Under foggy conditions, while road traction may remain relatively unchanged, sensor reliability, particularly for vision-based or sensor fusion-dependent systems, declines. This can hinder the TCS’s ability to accurately assess traction and react effectively, highlighting a need for robust perception algorithms [55].
In snowy conditions, the challenges are most severe. The road–tire friction is drastically lowered, and maintaining a stable slip ratio becomes increasingly difficult. Advanced control strategies, such as model predictive control or machine learning-based methods, are often required to mitigate excessive slip and loss of stability [53,54]. The system performance is evaluated in Table 10.
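The slip ratio mentioned above is the quantity most traction controllers regulate. The sketch below computes it and applies a crude proportional torque cut once slip exceeds a surface-dependent target; the targets and gains are placeholders, and production systems rely on the MPC or learning-based strategies cited above rather than this rule.

```python
def slip_ratio(v_wheel, v_vehicle):
    """Longitudinal slip of the driven wheel during acceleration:
    lambda = (v_wheel - v_vehicle) / v_wheel (0 = pure rolling, 1 = full spin)."""
    if v_wheel <= 0.1:
        return 0.0
    return max(v_wheel - v_vehicle, 0.0) / v_wheel

def tcs_torque_request(torque_cmd, v_wheel, v_vehicle, mu_estimate):
    """Proportional torque cut once slip exceeds a surface-dependent target.
    Target values are illustrative placeholders, not calibrated figures."""
    # Lower slip targets on low-friction surfaces.
    slip_target = 0.12 if mu_estimate > 0.6 else 0.06 if mu_estimate > 0.3 else 0.03
    lam = slip_ratio(v_wheel, v_vehicle)
    if lam <= slip_target:
        return torque_cmd
    # Reduce torque in proportion to the slip excess, never below zero.
    reduction = min((lam - slip_target) / slip_target, 1.0)
    return torque_cmd * (1.0 - reduction)

# Example: wheel at 10.8 m/s while the motorcycle moves at 10 m/s on a wet road.
print(round(tcs_torque_request(80.0, v_wheel=10.8, v_vehicle=10.0, mu_estimate=0.5), 1), "Nm")
```

On snow or ice the same logic would cut torque far more aggressively, which is why the text above stresses the need for more sophisticated, predictive control on very low-friction surfaces.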
The analysis of the ARAS modules under different weather conditions described in Section 3 is illustrated in the heatmap in Figure 3. The heatmap displays the performance of ARAS across sunny, rainy, foggy, and snowy conditions, using the normalized performance levels defined in Table 11. These normalized terms reflect the systems’ relative effectiveness in providing reliable support in adverse environments.
From Figure 3, it is clear that ACC, LKAS, and the stability control system (SCS) maintain relatively high performance across most weather conditions. For example, ACC shows Excellent performance in sunny conditions and sustains at least moderate performance in rain and fog, outperforming more weather-sensitive systems like AEB, which drops to Poor in snowy conditions.
The ISS module is notably vulnerable to extreme weather, becoming unusable/unknown in snowy conditions, likely due to the complexity of decision-making at intersections under reduced visibility or occluded road markings.
In terms of robustness, SCS and ACC are among the most consistent performers across all conditions, while AEB and ISS show significant degradation, especially in snow, the most challenging weather condition overall.
This analysis suggests that while many ARAS components are effective in favorable weather, there is a need for improvement in perception and control systems to handle low-visibility or low-traction environments, particularly for systems that rely heavily on camera input.
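For readers who wish to reproduce this kind of visualization, the short sketch below shows how qualitative levels of the kind listed in Table 11 can be mapped to a numeric scale and rendered as a heatmap with matplotlib. The module list and ratings in the sketch are illustrative placeholders, not the exact values behind Figure 3.

```python
import matplotlib.pyplot as plt
import numpy as np

# Qualitative levels mapped to numbers (placeholder scale, not the Table 11 codes).
levels = {"Excellent": 3, "Good": 2.5, "Moderate": 2, "Poor": 1, "Unknown": 0}

modules = ["ACC", "ABS", "AEB", "BSD", "LKAS", "SCS", "ISS"]
weather = ["Sunny", "Rainy", "Foggy", "Snowy"]

# Illustrative ratings only; see Table 11 and Figure 3 for the review's values.
ratings = [
    ["Excellent", "Moderate", "Moderate", "Moderate"],   # ACC
    ["Excellent", "Good",     "Moderate", "Poor"],       # ABS
    ["Excellent", "Moderate", "Poor",     "Poor"],       # AEB
    ["Excellent", "Good",     "Moderate", "Poor"],       # BSD
    ["Excellent", "Moderate", "Poor",     "Poor"],       # LKAS
    ["Excellent", "Good",     "Moderate", "Moderate"],   # SCS
    ["Good",      "Moderate", "Poor",     "Unknown"],    # ISS
]

grid = np.array([[levels[r] for r in row] for row in ratings])

fig, ax = plt.subplots(figsize=(5, 4))
im = ax.imshow(grid, cmap="RdYlGn", vmin=0, vmax=3)
ax.set_xticks(range(len(weather)))
ax.set_xticklabels(weather)
ax.set_yticks(range(len(modules)))
ax.set_yticklabels(modules)
for i in range(len(modules)):
    for j in range(len(weather)):
        ax.text(j, i, ratings[i][j][0], ha="center", va="center", fontsize=8)
fig.colorbar(im, ax=ax, label="Normalized performance level")
plt.tight_layout()
plt.show()
```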

4. Sensor Technologies of ARAS

ARAS rely heavily on sensor technologies to perceive and interpret the surrounding environment. The performance and reliability of these systems depend on the type, configuration, and integration of sensors. This section focuses on three key sensor types commonly used in ARAS: Radar, LiDAR, and cameras. Each has unique advantages and limitations, particularly when operating in challenging weather conditions.

4.1. Radar in ARAS

Radar is a system that uses radio waves to determine the distance to objects by analyzing reflected signals. While initially crucial in military applications for detecting aircraft and other objects, and later expanded to fields such as traffic control, meteorology, and astronomy [56], Radar is now a core technology in car-based ADAS. In cars, Radar sensors are used to perceive the vehicle’s surroundings, identify different object types by analyzing signal reflections [57,58,59,60,61], and provide real-time data to enhance safety and comfort.
Building on Radar’s foundational role in automotive ADAS, this technology is now adapted for ARAS in two-wheelers. Similar to its use in cars, Radar in ARAS employs radio waves to detect and track surrounding objects, such as vehicles, pedestrians, or obstacles, even in challenging conditions like low visibility or heavy rain [62]. For riders, Radar-enabled ARAS enhances safety by providing critical features such as ACC (maintaining a safe distance from vehicles ahead) and collision warning systems, which leverage Radar–vision fusion techniques to improve accuracy in dynamic environments [63]. Additionally, blind-spot detection benefits from advancements in synthetic aperture Radar (SAR) imaging, enabling high-resolution object recognition tailored to motorcycle dynamics [64]. However, implementing Radar in ARAS requires overcoming challenges like sensor miniaturization and vibration resistance, which are being addressed through innovations in multi-polarimetric imaging and deep learning [65]. By integrating Radar with cameras and LiDAR, ARAS aims to create a comprehensive safety net for riders, reducing accident risks and bridging the gap between automotive and motorcycle safety innovations. Table 12 provides an overview of various Radar types, detailing their specific applications and functional roles.
Automotive Radar systems operate across different frequency bands, each optimized for specific detection ranges and fields of view (FoV). Short-range Radars (SRRs) typically use the 24 GHz band, providing a wide azimuth FoV for applications like blind-spot detection and parking assist. Mid-range Radars (MRRs) and long-range Radars (LRRs) primarily operate in the 76–77 GHz band, offering increased range and precision for collision avoidance and adaptive cruise control [69,76]. The 77–81 GHz band is emerging for future high-resolution Radar applications, enhancing detection accuracy for autonomous driving [57,77]. Table 13 summarizes the classification, frequency ranges, and key applications of automotive Radar.

4.2. Radar in Different Weather Conditions

Radar systems, leveraging millimeter-wave technology, maintain robust performance across diverse weather conditions. In sunny conditions, Radar operates unaffected by ambient lighting, ensuring consistent object detection [83]. During rain, Radar signals penetrate droplets effectively, though heavy rain may cause minor attenuation, reducing range [83,84]. In fog, Radar excels because its wavelength (1–10 mm) is much larger than fog particles (1–10 µm), minimizing scattering and enabling reliable detection [85]. Snow has a limited impact on Radar, as light snowflakes do not significantly disrupt signals; heavy snow can cause scattering, but even then Radar retains functionality. These attributes make Radar critical for ARAS, ensuring all-weather reliability [83,85]. Table 14 provides a focused evaluation of Radar performance under different weather conditions, including the types of Radar used, their effectiveness, and relevant applications.
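To make the wavelength argument concrete, a back-of-the-envelope comparison (assuming a 77 GHz radar with $\lambda \approx 3.9$ mm, a 905 nm LiDAR, and a fog droplet of diameter $d \approx 10\ \mu$m) shows why fog barely scatters millimeter waves:

$$\frac{d}{\lambda_{\text{radar}}} \approx \frac{10\ \mu\text{m}}{3.9\ \text{mm}} \approx 2.6\times 10^{-3} \ll 1 \quad\Rightarrow\quad \text{Rayleigh regime, } \sigma_s \propto \frac{d^6}{\lambda^4}\ (\text{negligible}),$$

$$\frac{d}{\lambda_{\text{LiDAR}}} \approx \frac{10\ \mu\text{m}}{905\ \text{nm}} \approx 11 \gg 1 \quad\Rightarrow\quad \text{Mie/geometric regime, } \sigma_s \sim \frac{\pi d^2}{4}\ (\text{strong scattering}).$$

These are order-of-magnitude estimates only, but they illustrate why the radar return is largely insensitive to fog droplets while an optical beam is strongly attenuated over short distances.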

4.3. LiDAR in Different Weather Conditions

LiDAR systems face significant challenges in adverse weather due to light scattering and signal attenuation caused by atmospheric particles. In fog, dense water droplets scatter laser beams, severely reducing detection range and accuracy [91,92]. Rain and snow exacerbate this issue, with heavy precipitation causing signal attenuation and incomplete point clouds. For example, studies show LiDAR performance drops by over 50% in heavy rain or snow [93,94,95], while fog can degrade object classification by up to 13.43%. Arctic conditions further impair LiDAR due to snow accumulation and salt residue on sensors. These limitations highlight LiDAR’s vulnerability compared to Radar, which penetrates such obstacles more effectively [96]. Table 15 offers a detailed analysis of LiDAR performance across different weather conditions, focusing on its sensing capabilities, types of LiDAR, and related applications.
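The attenuation behavior described above is commonly modeled with a Beer–Lambert extinction term in the LiDAR range equation; the simplified form below is a standard textbook approximation rather than a model taken from the cited studies:

$$P_r(R) \;\propto\; \frac{P_0\,\rho}{R^2}\, e^{-2\alpha R},$$

where $P_0$ is the emitted power, $\rho$ the target reflectivity, $R$ the range, and $\alpha$ the weather-dependent extinction coefficient, often estimated from the meteorological visibility $V$ via Koschmieder’s relation $\alpha \approx 3.912/V$. As a rough illustration, dense fog with $V = 120$ m gives $\alpha \approx 0.033\ \text{m}^{-1}$, so the two-way extinction over a 50 m path is $e^{-2\alpha R} \approx e^{-3.3} \approx 0.04$: roughly 96% of the optical return is lost to extinction alone, consistent with the severe range reductions reported above.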
Figure 4 compares key performance attributes of 905 nm and 1550 nm LiDAR technologies. While 905 nm systems are more mature and cost-effective, they generally offer shorter range and lower environmental resilience. In contrast, 1550 nm LiDAR provides superior range, precision, and beam quality, but at the cost of higher complexity and greater sensitivity to water absorption. Each metric in the figure is rated from 1 (poor) to 3 (good): 905 nm scores higher on cost, maturity, power efficiency, and resilience to water absorption, whereas 1550 nm scores higher on range, penetrability, and optical performance (e.g., beam divergence).

4.4. Cameras in Different Weather Conditions

ARAS often relies on cameras as primary sensors for tasks such as object detection, lane-keeping, and collision avoidance. Cameras are versatile sensors for ARAS, performing well in sunny and low-light conditions with proper calibration. However, they face challenges in foggy, snowy, and rainy conditions, particularly regarding visibility and lens obstruction. To overcome these limitations, ARAS systems often integrate cameras with other sensors like LiDAR or Radar, leveraging multi-sensor fusion to ensure robust performance across all conditions. Additionally, advancements in adaptive algorithms and hardware design continue to improve camera resilience in adverse environments. Table 16 provides an evaluation of camera-based sensor performance under various weather conditions, focusing on image clarity, sensor type, and adaptability to environmental challenges.

4.5. Sensor Fusion

While previous sections have examined the individual capabilities of radar, LiDAR, and camera systems, it is increasingly clear that no single sensor is sufficient to guarantee reliable environmental perception in all weather conditions. Sensor fusion aims to combine the complementary information from multiple sensors to improve the robustness and accuracy of advanced rider assistance systems (ARAS).
In adverse conditions such as fog, heavy rain, or snow, sensors degrade in different ways. Rather than relying solely on one modality, multimodal fusion techniques process data from multiple sources, simultaneously leveraging the spatial accuracy of LiDAR, the semantic richness of vision, and the weather resilience of radar. This approach allows ARAS to maintain perception integrity even when some inputs are compromised [105,106].
Recent advances in deep learning-based sensor fusion have significantly improved detection performance under harsh conditions. For example, Robust-FusionNet introduces a pointwise-aligned fusion method using K-means++ clustering, along with an implicit feature pyramid network (i-FPN) and hybrid attention modules to mitigate sensor distortion caused by rain, fog, or exposure imbalance [107]. Similarly, other architectures like PointPainting and EPNet enrich point cloud data with semantic image features to enhance the detection of distant or occluded objects in poor visibility [108,109].
Beyond performance, adaptability is becoming central. Fusion networks are increasingly trained to adjust dynamically to changing weather, weighting sensor inputs based on real-time quality assessments [110]. These adaptive systems are better equipped to maintain reliability across diverse riding environments, especially in motorcycle contexts where compact design and sensor limitations add complexity.
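A minimal sketch of such adaptive weighting is shown below, assuming a simple late-fusion scheme in which each detection is weighted by its detector confidence multiplied by a weather-dependent quality prior. The class names, quality table, and example numbers are hypothetical and only mirror the qualitative sensor trends discussed in Section 4; real systems would also perform data association, calibration, and temporal filtering.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object hypothesis from a single sensor (positions in metres)."""
    x: float
    y: float
    confidence: float   # detector score in [0, 1]
    sensor: str         # "radar", "lidar", or "camera"

def weather_quality(sensor: str, condition: str) -> float:
    """Heuristic per-sensor quality prior in [0, 1]; placeholder values that
    loosely follow the qualitative trends in Sections 4.2-4.4."""
    table = {
        "radar":  {"clear": 1.0, "rain": 0.9, "fog": 0.9, "snow": 0.8},
        "lidar":  {"clear": 1.0, "rain": 0.6, "fog": 0.4, "snow": 0.4},
        "camera": {"clear": 1.0, "rain": 0.6, "fog": 0.3, "snow": 0.4},
    }
    return table[sensor][condition]

def fuse(detections, condition):
    """Confidence- and weather-weighted average of matched detections of the
    same object, returning (x, y, fused confidence)."""
    weights = [d.confidence * weather_quality(d.sensor, condition) for d in detections]
    total = sum(weights) or 1e-6
    x = sum(w * d.x for w, d in zip(weights, detections)) / total
    y = sum(w * d.y for w, d in zip(weights, detections)) / total
    return x, y, total / len(detections)

# Example: in fog, the radar estimate dominates the fused position.
dets = [Detection(20.4, 0.2, 0.85, "radar"),
        Detection(18.9, 0.6, 0.55, "lidar"),
        Detection(17.5, 0.9, 0.40, "camera")]
print(fuse(dets, "fog"))
```

The learned fusion networks cited above replace the fixed quality table with weights estimated from the data themselves, but the underlying principle of down-weighting degraded modalities is the same.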
Despite progress, challenges persist. Fusion models require precise calibration, synchronization, and real-time processing capability. Adverse weather also introduces uncertainty and noise, requiring advanced attention and alignment mechanisms to avoid cascading errors. The current trend toward multimodal datasets and simulation tools supports progress in this area, but weather-resilient sensor fusion for two-wheelers remains a developing field.

5. Datasets’ Performance in Adverse Weather Conditions

High-quality datasets are crucial for developing, training, and evaluating ARAS, particularly under real-world and diverse environmental conditions. Table 17 presents a comparative overview of key public datasets relevant to ARAS and autonomous driving, assessed by their coverage of different weather conditions. Each dataset is marked for the presence of sunny, rainy, foggy, and snowy scenarios, which are critical for developing and validating perception systems in diverse environments. This table identifies suitable datasets for training and testing ARAS algorithms under normal and adverse conditions.

Weather-Condition Suitability for ARAS Datasets

ARAS for motorcycles relies heavily on sensor fusion to navigate safely under diverse environmental conditions. However, weather plays a critical role as it influences the effectiveness of these sensors. Radar has been shown to maintain high performance in rain, fog, and snow, making it a key enabler for all-weather ARAS functions [90,111]. In contrast, LiDAR systems, although accurate in clear conditions, suffer from performance degradation due to light scattering and reflection in rain and snow [112]. Camera-based systems are particularly vulnerable to adverse lighting and weather, such as fog or direct sunlight, leading to reduced image clarity and incorrect object detection [111,117]. Inertial measurement units (IMUs) and GPS modules typically offer stable performance, although GPS signals are susceptible to attenuation in dense urban areas or heavy precipitation [112]. Multi-sensor datasets such as RADIATE [111], K-Radar [90], Boreas [112], and the Oxford Radar RobotCar Dataset [117] provide crucial benchmarks for testing sensor resilience across weather scenarios. Additionally, Sheeny [131] demonstrates that Radar-infrared fusion can improve object recognition in poor visibility, highlighting the need for advanced sensor integration under real-world conditions. These datasets and studies form the empirical backbone for designing ARAS capable of functioning safely year-round. Table 18 provides an overview of the suitability of datasets for ARAS, with emphasis on sensor modalities and weather condition coverage relevant for autonomous vehicle research.
Table 19 provides a consolidated summary of sensor limitations by weather condition, categorizing how Radar, LiDAR, and camera systems are affected across sunny, rainy, foggy, and snowy environments.

6. Algorithms and Techniques

Machine learning plays a central role in advanced rider assistance systems (ARAS), enabling key perception and decision-making capabilities required for safe operation in complex environments. The primary tasks involved in environmental perception include object detection (identifying the presence and location of relevant objects), segmentation (delineating object boundaries or regions), and classification (assigning semantic labels to objects or scene elements) [133]. These tasks are typically addressed using different machine learning paradigms, each suited to different aspects of the problem, as shown in Table 20.
Broadly, machine learning can be categorized into three main types: supervised learning, where models are trained on labelled datasets to learn mappings between inputs and outputs; unsupervised learning, which extracts structure from unlabelled data, supporting clustering or anomaly detection; and reinforcement learning, in which agents learn optimal actions through interaction and feedback, applicable in adaptive behaviour and control strategies [134,135]. Understanding these paradigms provides a foundation for evaluating the algorithmic approaches used in ARAS systems, especially when addressing the challenges posed by adverse weather and limited computing resources.
ARAS and ADAS rely on a layered pipeline of perception, interpretation, and control to ensure safe operation across diverse and dynamic environments. At the foundation is perception, where sensors and algorithms detect, segment, and classify environmental elements. The next layer is interpretation and understanding, which gives semantic context, such as predicting object trajectories and understanding scene layout. Finally, decision and control support uses these interpreted data to assist with real-time actions such as collision avoidance or path planning. Supporting all these layers are cross-cutting techniques like sensor fusion, uncertainty estimation, and real-time optimization, which enhance reliability and robustness, especially in challenging conditions, as shown in Table 21.
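A skeletal sketch of this layering is given below, with hypothetical class and method names and a naive time-to-collision rule standing in for real perception and planning models; it is meant only to show how the three layers hand data to one another.

```python
from dataclasses import dataclass, field

@dataclass
class Object:
    kind: str              # e.g. "vehicle", "pedestrian"
    distance_m: float
    closing_speed_ms: float

@dataclass
class SceneModel:
    objects: list = field(default_factory=list)
    time_to_collision_s: float = float("inf")

class PerceptionLayer:
    def detect(self, sensor_frames):
        """Detection/segmentation/classification would run here (e.g. a CNN)."""
        return sensor_frames  # placeholder: frames already contain Object instances

class InterpretationLayer:
    def understand(self, objects):
        """Adds semantic context, here a naive time-to-collision estimate."""
        scene = SceneModel(objects=objects)
        for obj in objects:
            if obj.closing_speed_ms > 0:
                scene.time_to_collision_s = min(scene.time_to_collision_s,
                                                obj.distance_m / obj.closing_speed_ms)
        return scene

class DecisionLayer:
    def act(self, scene):
        """Maps the interpreted scene to a rider-assistance action."""
        if scene.time_to_collision_s < 2.0:
            return "brake_warning"
        if scene.time_to_collision_s < 4.0:
            return "haptic_alert"
        return "no_action"

# Example run with one synthetic detection (TTC = 30 m / 10 m/s = 3 s).
frames = [Object("vehicle", distance_m=30.0, closing_speed_ms=10.0)]
scene = InterpretationLayer().understand(PerceptionLayer().detect(frames))
print(DecisionLayer().act(scene))   # -> "haptic_alert"
```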

Performance Considerations and Algorithmic Trade-Offs

Machine learning plays a central role in ARAS systems, supporting perception, interpretation, and decision-making across a wide range of tasks. In perception, CNN-based architectures such as YOLO, DeepLabv3+, and Mask R-CNN offer high accuracy and fast inference, making them suitable for real-time object detection and segmentation [105]. However, their effectiveness is often reduced in adverse weather conditions such as fog, rain, and low visibility, where sensor noise degrades performance [109]. Lane detection and monocular depth estimation models, while lightweight and efficient, show similar limitations in challenging environments and often require weather-adapted training data or sensor fusion to maintain reliability [109].
In interpretation tasks, trajectory and behaviour prediction methods—including LSTMs, attention-based models, and graph networks—have demonstrated success in structured environments. Yet, they remain sensitive to ambiguous perception inputs and may struggle with occlusions or unexpected behaviour from other road users [136]. Localization systems such as ORB-SLAM2 and VINS-Fusion are effective in favourable conditions but degrade under snow or rapidly changing lighting. LiDAR-based approaches provide better robustness but come with higher computational cost and integration complexity [105].
Control and decision-making in ARAS often rely on traditional approaches like PID controllers and model predictive control (MPC), known for their transparency, stability, and real-time performance [136]. While reinforcement learning and imitation learning offer flexibility and adaptability in simulation, they are currently limited by high data demands and a lack of interpretability, especially in safety-critical motorcycle contexts [136].
To improve system resilience in diverse and dynamic conditions, sensor fusion has become increasingly important. Multimodal fusion approaches combining radar, LiDAR, and cameras enable more reliable detection and environmental understanding, especially under adverse weather conditions [105,109]. However, fusion models also bring added challenges such as increased computational demands, the need for precise calibration, and real-time synchronization—factors that are particularly relevant in the compact and resource-constrained context of two-wheeled vehicles [105].
Emerging solutions in edge AI optimization, such as model pruning, quantization, and hardware acceleration, have shown promise in addressing real-time performance constraints [136]. Additionally, recent work on adaptive fusion weighting and uncertainty estimation may further enhance robustness in unpredictable conditions [109]. Together, these advances suggest a clear trend toward practical, weather-resilient ARAS architectures that aim to balance performance, efficiency, and safety.
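As one concrete example of these optimizations, the sketch below applies PyTorch post-training dynamic quantization to a small placeholder network. The network is not an ARAS model, and the actual accuracy and latency effects of quantization would need to be verified on the target perception model and embedded hardware.

```python
import torch
import torch.nn as nn

# Placeholder network standing in for a perception head; not an actual ARAS model.
model = nn.Sequential(
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 10),
)
model.eval()

# Post-training dynamic quantization: Linear weights stored as int8, activations
# quantized on the fly. Typically shrinks the model and speeds up CPU inference
# at a small accuracy cost.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 256)
with torch.no_grad():
    print(model(x).shape, quantized(x).shape)   # both torch.Size([1, 10])
```

Pruning and hardware-specific compilation follow the same pattern: compress or restructure the trained model offline, then validate that the reduced model still meets the detection and latency requirements on the edge device.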

7. Discussion and Analysis

This study provides a comprehensive evaluation of ARAS under diverse environmental conditions, with a particular focus on weather variability, a critical factor in ensuring motorcyclist safety and ARAS reliability. The motivation behind this work stems from the pressing need to understand how current ARAS solutions perform in real-world, adverse weather scenarios, and to identify technological and data-driven gaps that must be addressed to advance safety in two-wheeled transportation. The findings confirm that while ARAS modules such as ACC, ABS, AEB, and BSD perform reliably in clear weather, their performance declines in fog, snow, and heavy rain, mostly for systems dependent on optical sensors or visual recognition. This observation highlights not only the sensitivity of perception in adverse environments but also the limitations of benchmarking ARAS primarily under controlled conditions.
Subsequently, the paper shifts focus to enabling sensor technologies, RADAR, LiDAR, and camera systems, evaluating their individual and integrated performance in adverse conditions. RADAR maintains operational robustness in poor visibility, yet its relatively low resolution limits its usefulness for fine-grained detection and curve warning functions. By contrast, LiDAR and cameras deliver richer semantic information but are disproportionately affected by occlusion, light scattering, and precipitation. These complementary strengths and weaknesses indicate that sensor fusion is not a secondary improvement but a necessary design choice for ARAS. However, the transfer of sensor fusion strategies from car-based ADAS to motorcycles is not straightforward due to differences in dynamics, mounting constraints, and exposure to environmental noise.
Recognizing the pivotal role datasets play in both system training and performance benchmarking, the paper further explores datasets used across both ADAS and ARAS domains. Well-known resources such as KITTI, nuScenes, and RADIATE remain valuable for perception research, but they are car-centric and fail to capture motorcycle-specific dynamics such as roll angle, vibration, and limited sensor installation space. Although emerging ARAS datasets developed through academic–industry collaborations mark an important step forward, they are still limited in scale and diversity. As a result, algorithm development and performance benchmarking remain constrained, and there is an urgent need for large-scale, standardized datasets that reflect the realities of two-wheeled transportation in adverse weather conditions.
Lastly, the analysis of algorithms and techniques illustrates that deep learning approaches, especially with LiDAR input, can significantly enhance perception under low visibility conditions. However, their heavy reliance on large balanced datasets makes them prone to biases, an issue amplified by the lack of ARAS-specific data. Radar-based methods remain stable in rain but struggle with high-resolution classification of small or fast-moving targets. These limitations illustrate the trade-offs between robustness and resolution that continue to shape ARAS development and emphasize the importance of evaluating algorithms under conditions that mirror real-world complexity rather than controlled benchmarks.

7.1. Conclusions

This study examined the role of advanced rider assistance systems (ARAS) in enhancing motorcycle safety, with a particular focus on performance under varying weather conditions. The discussion highlighted different ARAS modules and their operational principles, emphasizing the limitations of sensor performance in adverse environments such as rain, fog, and snow. Addressing these challenges requires the use of diverse datasets that capture a wide range of weather scenarios, enabling effective benchmarking of ARAS modules through criteria that balance data quality, environmental diversity, and applicability to two-wheeler dynamics. In addition, a range of algorithms and techniques were reviewed, with links to recommended datasets and software tools to support practical deployment. By integrating insights on ARAS modules, sensor limitations, datasets, and algorithms, this study provides a structured foundation for advancing research in challenging weather contexts and contributes to the development of robust, weather-resilient ARAS solutions that enhance rider safety across diverse environmental conditions.

7.2. Future Recommendations

As ARAS technology advances toward real-world deployment, several critical challenges remain. One of the most pressing issues is ensuring reliable system performance in diverse and adverse weather conditions, where sensor degradation and data noise significantly affect perception accuracy. While this review highlights the benefits of radar, LiDAR, and camera fusion, deploying these multi-sensor systems on motorcycles presents unique constraints related to space, cost, and power efficiency. Additionally, deep learning models used for sensor fusion often require large datasets, high computational resources, and struggle with real-time inference, especially under changing environmental conditions. Future ARAS development must focus on creating lightweight, adaptive algorithms capable of processing fused data under strict latency and hardware limitations. Emerging approaches such as self-supervised learning, edge AI, and multimodal transformer architectures offer promising directions for improving system generalization, robustness, and efficiency. At the same time, mass adoption will depend on addressing broader integration and regulatory challenges, including standardization of sensor protocols, fail-safe mechanisms, and ensuring user trust in automated interventions. Moving forward, ARAS research must not only refine technical performance but also align with deployment realities and safety-critical requirements in the motorcycle domain.

Author Contributions

Conceptualization, Z.U. and J.A.C.d.S.; methodology, R.R.N.; validation, V.F., J.B. and A.R.; formal analysis, Z.U., J.A.C.d.S. and R.R.N.; investigation, R.R.N., Z.U. and E.J.S.P.; data curation, R.R.N. and E.J.S.P.; writing—original draft preparation, Z.U. and J.A.C.d.S.; writing—review and editing, J.A.C.d.S., Z.U. and E.J.S.P.; visualization, Z.U.; supervision, V.F., E.J.S.P. and A.R.; project administration, J.B. and A.R.; funding acquisition, J.B. and A.R. All authors have read and agreed to the published version of the manuscript.

Funding

The study was developed under the project A-MoVeR–“Mobilizing Agenda for the Development of Products & Systems towards an Intelligent and Green Mobility”, Operation No. 02/C05-i01.01/2022.PC646908627-00000069, approved under the terms of the call No. 02/C05-i01/2022–Mobilizing Agendas for Business Innovation, financed by European Funds provided to Portugal by the Recovery and Resilience Plan (RRP), in the Scope of the European Recovery and Resilience Facility (RRF), framed in the Next Generation EU, for the period from 2021–2026.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Bloomberg, M.R. Global Status Report on Road Safety 2023. 2023. Available online: https://iris.who.int/bitstream/handle/10665/375016/9789240086517-eng.pdf?sequence=1 (accessed on 25 May 2025).
2. Thematic Reports—European Commission. Available online: https://road-safety.transport.ec.europa.eu/european-road-safety-observatory/data-and-analysis/thematic-reports_en (accessed on 25 May 2025).
3. WHO. Powered Two-and Three-Wheeler Safety: A Road Safety Manual for Decision-Makers and Practitioners; WHO: Geneva, Switzerland, 2022.
4. Motorcycle Safety and Accidents in Europe. Available online: https://www.femamotorcycling.eu/motorcycle-safety-and-accidents/ (accessed on 23 June 2025).
5. Hamm, M.; Lichtenthäler, J. BMW Motorrad Rider Assistance Systems. In Proceedings of the 12th International Munich Chassis Symposium 2021: Chassis. Tech Plus, Munich, Germany, 29–30 June 2021; Springer: Berlin/Heidelberg, Germany, 2022; pp. 744–758.
6. Kuschefski, A.; Haasper, M.; Vallese, A. Advanced rider assistance systems for powered two-wheelers (ARAS-PTW). In Proceedings of the 22nd International Technical Conference on the Enhanced Safety of Vehicles (ESV), Washington, DC, USA, 13–16 June 2011. Available online: https://trid.trb.org/View/1363491 (accessed on 29 May 2025).
7. Bekiaris, E.D.; Spadoni, A.; Nikolaou, S.I. SAFERIDER Project: New safety and comfort in Powered Two Wheelers. In Proceedings of the 2009 2nd Conference on Human System Interactions, Catania, Italy, 21–23 May 2009; IEEE: New York, NY, USA, 2009; pp. 600–602.
8. Bosch, Advanced Rider Assistance Systems. Available online: https://www.bosch-mobility.com/en/solutions/assistance-systems/advanced-rider-assistance-systems-2w/ (accessed on 21 March 2024).
9. Noriaki, I.; Shintaro, O.; Yoshihiro, S.; Kazuya, O.; Grewe, R. Collision Risk Prediction Utilizing Road Safety Mirrors at Blind Intersections. In Proceedings of the 27th International Technical Conference on the Enhanced Safety of Vehicles (ESV), National Highway Traffic Safety Administration, Yokohama, Japan, 3–6 April 2023.
10. Teoh, E.R. Motorcycle antilock braking systems and fatal crash rates: Updated results. Traffic Inj. Prev. 2022, 23, 203–207.
11. Savino, G.; Lot, R.; Massaro, M.; Rizzi, M.; Symeonidis, I.; Will, S.; Brown, J. Active safety systems for powered two-wheelers: A systematic review. Traffic Inj. Prev. 2020, 21, 78–86.
12. Ullah, J.S.Z.; Reis, A.; Pires, E.; Pendão, C.; Filipe, V. Riding with Intelligence: Advanced Rider. In Proceedings of the Distributed Computing and Artificial Intelligence, Special Sessions I, 21st International Conference, Salamanca, Spain, 26–28 June 2024; Springer Nature: Berlin/Heidelberg, Germany; p. 226.
13. Chidambaram, R.K.; Pedapati, P.R. Challenges in implementing rider assisting system to enhance the power two-wheeler safety: A review. Int. J. Veh. Auton. Syst. 2023, 17, 73–105.
14. Hu, L.; Li, H.; Yi, P.; Huang, J.; Lin, M.; Wang, H. Investigation on AEB key parameters for improving car to two-wheeler collision safety using in-depth traffic accident data. IEEE Trans. Veh. Technol. 2022, 72, 113–124.
15. Omar, N.; Kasno, S.F.; Jablan, N.A.; Ibrahim, A.; Leh, F.L.N.; Chung, J.L.J.; Wen, P.J.; Dampa, D.; Zheng, L.J.; Hao, K.P.J. Effect of Weather on Vehicles Speed (Case Study at KM23 FT050). Recent Trends Civ. Eng. Built Environ. 2024, 5, 85–91.
16. Kutila, M.; Pyykönen, P.; Ritter, W.; Sawade, O.; Schäufele, B. Automotive LIDAR sensor development scenarios for harsh weather conditions. In Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil, 1–4 November 2016; IEEE: New York, NY, USA, 2016; pp. 265–270.
  16. Kutila, M.; Pyykönen, P.; Ritter, W.; Sawade, O.; Schäufele, B. Automotive LIDAR sensor development scenarios for harsh weather conditions. In Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil, 1–4 November 2016; IEEE: New York, NY, USA, 2016; pp. 265–270. [Google Scholar]
  17. Kaye, S.-A.; Nandavar, S.; Lewis, I.; Blackman, R.; Schramm, A.; McDonald, M.; Oviedo-Trespalacios, O.; Haworth, N. Exploring beliefs and perceptions towards Advanced Rider Assistance Systems (ARAS) in motorcycle safety. Transp. Res. Part F Traffic Psychol. Behav. 2024, 102, 77–87. [Google Scholar] [CrossRef]
  18. Naweed, A.; Blackman, R. The art of riding safely: A critical examination of advanced rider assistance systems in motorcycle safety discourse. Transp. Res. Part F Traffic Psychol. Behav. 2024, 107, 1198–1213. [Google Scholar] [CrossRef]
  19. Coelho, C.J.; Garets, S.B.; Bailey, J.D.; Frank, T.A.; Scully, I.D.; Cades, D.M. Human Factors Issues of Advanced Rider Assistance Systems (ARAS). In Proceedings of the Human Factors in Transportation 2023, San Francisco, CA, USA, 20–24 July 2023; pp. 292–303. [Google Scholar] [CrossRef]
  20. Ait-Moula, A.; Riahi, E.; Serre, T. Effect of advanced rider assistance system on powered two wheelers crashes. Heliyon 2024, 10, e26031. [Google Scholar] [CrossRef]
  21. Shojaeefard, M.H.; Mollajafari, M.; Ebrahimi-Nejad, S.; Tayebi, S. Weather-aware fuzzy adaptive cruise control: Dynamic reference signal design. Comput. Electr. Eng. 2023, 110, 108903. [Google Scholar] [CrossRef]
  22. Grove, K.; Atwood, J.; Hill, P.; Fitch, G.; DiFonzo, A.; Marchese, M.; Blanco, M. Commercial motor vehicle driver performance with adaptive cruise control in adverse weather. Procedia Manuf. 2015, 3, 2777–2783. [Google Scholar] [CrossRef][Green Version]
  23. Al-Hindawi, R.; Alhadidi, T.; Adas, M. Evaluation and Optimization of Adaptive Cruise Control in Autonomous Vehicles using the CARLA Simulator: A Study on Performance under Wet and Dry Weather Conditions. In Proceedings of the 2024 IEEE International Conference on Advanced Systems and Emerging Technologies (IC_ASET), Hammamet, Tunisia, 27–29 April 2024. [Google Scholar] [CrossRef]
  24. Gulzar, F.; Butt, Y.A.; Iqbal, A. Adaptive Cruise Control for Ground Vehicles using Control Barrier Function under Weather based Surface Conditions. In Proceedings of the 19th International Bhurban Conference on Applied Sciences and Technology (IBCAST), Islamabad, Pakistan, 16–20 August 2022. [Google Scholar] [CrossRef]
  25. Rabbani, G.A.; Syahriar, A.; Astharini, D. Controlling the Performance of Anti-lock Braking System at Various Tracks and Vehicle Conditions. In Proceedings of the International Conference on Engineering and Information Technology for Sustainable Industry (ICONETSI), Online, 21 November 2022. [Google Scholar] [CrossRef]
  26. Fildes, B.; Newstead, S.; Rizzi, M.; Fitzharris, M.; Budd, L. Evaluation of the Effectiveness of Anti-Lock Braking Systems on Motorcycle Safety in Australia. 2015. Available online: https://media.nrspp.org.au/wp-content/uploads/2024/10/06001604/muarc327.pdf (accessed on 1 June 2025).
  27. Ahangarnejad, A.H.; Radmehr, A.; Ahmadian, M. A Review of Vehicle Active Safety Control Methods: From Antilock Brakes to Semiautonomy. J. Vib. Control 2020, 27, 1683–1712. [Google Scholar] [CrossRef]
  28. Burton, D.; Delaney, A.; Newstead, S.; Logan, D.; Fildes, B. Evaluation of Anti-Lock Braking Systems Effectiveness. 2004. Available online: https://www.racv.com.au (accessed on 1 June 2025).
  29. Haus, S.H.; Sherony, R.; Gabler, H.C.; Cicchino, J.B. Potential Effectiveness of Bicycle-Automatic Emergency Braking using the Washtenaw Area Transportation Study Data Set. Transp. Res. Rec. J. Transp. Res. Board 2021, 2675, 265–270. [Google Scholar] [CrossRef]
  30. Hu, W.; Cicchino, J.B.; Sherony, R. Effects of automatic emergency braking systems to reduce risk of crash and serious injuries among pedestrians and bicyclists. Traffic Saf. Res. 2022, 9, e000085. [Google Scholar]
  31. Yang, W.; Zhang, X.; Lei, Q.; Cheng, X. Research on Longitudinal Active Collision Avoidance of Autonomous Emergency Braking Pedestrian System (AEB-P). Sensors 2019, 19, 4671. [Google Scholar] [CrossRef]
  32. Tseng, D.-C.; Hsu, C.-T.; Chen, W.-S. Blind-Spot Vehicle Detection Using Motion and Static Features. Int. J. Mach. Learn. Comput. 2014, 4, 516–523. [Google Scholar] [CrossRef]
  33. Fernández, C.; Llorca, D.F.; Sotelo, M.A.; Daza, I.G.; Hellín, A.M.; Álvarez, S. Real-time vision-based blind spot warning system: Experiments with motorcycles in daytime/nighttime conditions. Int. J. Automot. Technol. 2013, 14, 113–122. [Google Scholar] [CrossRef]
  34. Zhao, Y.; Bai, L.; Lyu, Y.; Huang, X. Camera-based blind spot detection with a general purpose lightweight neural network. Electronics 2019, 8, 233. [Google Scholar] [CrossRef]
  35. Bagi, S.S.G.; Khoshnevisan, M.; Garakani, H.G.; Moshiri, B. Sensing Structure for Blind Spot Detection System in Vehicles. In Proceedings of the 2019 International Conference on Control, Automation and Information Sciences (ICCAIS), Chengdu, China, 23–26 October 2019; pp. 1–6. [Google Scholar] [CrossRef]
  36. Huu, P.N.; Quynh, P.N.; Nhat, M.H.; Dang, D.D.; Phuong, A.N.T.; Le Chi, T.; Hai, T.L.T.; Nguyen, T.D.; Minh, Q.T. Vehicle Blind Spot Detection and Tracking System Based on Machine Learning Model. In Proceedings of the 2023 8th International Scientific Conference on Applying New Technology in Green Buildings (ATiGB), Danang, Vietnam, 10–11 November 2023; pp. 292–296. [Google Scholar] [CrossRef]
  37. Zhu, J. Investigation of Factors Contributing to Fog-Related Single Vehicle Crashes. 2018. Available online: https://stars.library.ucf.edu/etd/5775 (accessed on 2 June 2025).
  38. Chou, E.-F.; Tseng, D.-C. Weather-adapted Vehicle Detection for Forward Collision Warning System. In Proceedings of the World Congress on Engineering, London, UK, 6–8 July 2011; Volume II. [Google Scholar]
  39. Pan, J.-S.; Ma, S.; Chen, S.-H.; Yang, C.-S. Vision-based Vehicle Forward Collision Warning System Using Optical Flow Algorithm. J. Inf. Hiding Multimed. Signal Process. 2015, 6, 1029–1037. [Google Scholar]
  40. Chen, K.-P.; Hsiung, P.-A. Vehicle Collision Prediction under Reduced Visibility Conditions. Sensors 2018, 18, 3026. [Google Scholar] [CrossRef]
  41. Zhang, W.; Zhao, Y.; Zhou, F.; Deng, X.; Xie, D.; Xu, Y. Optimization of Forward Collision Warning Algorithm and Driver-in-the-Loop Validation for Intelligent Vehicles under Low-Visibility Conditions. Sensors 2022, 22, 8926. [Google Scholar] [CrossRef]
  42. Biral, F.; Lot, R.; Rota, S.; Fontana, M.; Huth, V. Intersection support system for powered two-wheeled vehicles: Threat assessment based on a receding horizon approach. IEEE Trans. Intell. Transp. Syst. 2012, 13, 805–816. [Google Scholar] [CrossRef]
  43. Barmpounakis, E.N.; Vlahogianni, E.I.; Golias, J.C. Intelligent transportation systems and powered two wheelers traffic. IEEE Trans. Intell. Transp. Syst. 2016, 17, 908–919. [Google Scholar] [CrossRef]
  44. Xia, Z.; Wu, J.; Wu, L.; Chen, Y.; Yang, J.; Yu, P.S. A comprehensive survey of the key technologies and challenges surrounding vehicular ad hoc networks. ACM Trans. Intell. Syst. Technol. 2021, 12, 1–30. [Google Scholar] [CrossRef]
  45. Faizan, M.; Hafeez, A.; Siddique, M.A.; Qureshi, M.A.; Ahsan, M. Design and development of in-vehicle lane departure warning system using standard global positioning system. In Proceedings of the 2019 16th International Bhurban Conference on Applied Sciences and Technology (IBCAST), Islamabad, Pakistan, 8–12 January 2019; pp. 699–704. [Google Scholar]
  46. Islam, M.S.; Nafi, S.N.; Tavakkoli, A. Analyzing the effect of fog weather conditions on driver lane-keeping performance using the SHRP2 naturalistic driving study data. J. Transp. Saf. Secur. 2020, 12, 86–109. [Google Scholar]
  47. Mijić, D.; Brisinello, M.; Vranješ, M.; Grbić, R. Traffic Sign Detection Using YOLOv3. In Proceedings of the 2020 IEEE International Conference on Consumer Electronics—Berlin (ICCE-Berlin), Berlin, Germany, 9–11 November 2020. [Google Scholar] [CrossRef]
  48. Ahmed, S.; Kamal, U.; Hasan, M.K. DFR-TSD: A Deep Learning Based Framework for Robust Traffic Sign Detection Under Challenging Weather Conditions. IEEE Trans. Intell. Transp. Syst. 2022, 23, 5150–5162. [Google Scholar] [CrossRef]
  49. Qu, S.; Yang, X.; Zhou, H.; Xie, Y. Improved YOLOv5-based for small traffic sign detection under complex weather. Sci. Rep. 2023, 13, 16219. [Google Scholar] [CrossRef]
  50. Zhang, Y.; Liu, H.; Dong, D.; Duan, X.; Lin, F.; Liu, Z. DPF-YOLOv8: Dual Path Feature Fusion Network for Traffic Sign Detection in Hazy Weather. Electronics 2024, 13, 4016. [Google Scholar] [CrossRef]
  51. Board, N.T.S. Select Risk Factors Associated with Causes of Motorcycle Crashes. NTSB Saf. Rep., No. NTSB/SR-18/01. 2018. Available online: https://www.ntsb.gov/safety/safety-studies/Documents/SR1801.pdf (accessed on 2 June 2025).
  52. Aleksa, M.; Schaub, A.; Erdelean, I.; Wittmann, S.; Soteropoulos, A.; Fürdös, A. Impact analysis of Advanced Driver Assistance Systems (ADAS) regarding road safety—Computing reduction potentials. Eur. Transp. Res. Rev. 2024, 16, 39. [Google Scholar] [CrossRef]
  53. Haque, T.S.; Rahman, M.H.; Islam, M.R.; Razzak, M.A.; Badal, F.R.; Ahamed, H.; Moyeen, S.I.; Das, S.K.; Ali, F.; Tasneem, Z.; et al. A Review on Driving Control Issues for Smart Electric Vehicles. IEEE Access 2021, 9, 135440–135467. [Google Scholar] [CrossRef]
  54. Massaro, M.; Sartori, R.; Lot, R. Numerical investigation of engine-to-slip dynamics for motorcycle traction control applications. Veh. Syst. Dyn. 2011, 49, 419–432. [Google Scholar] [CrossRef]
  55. Gaevskiy, V.V.; Ivanov, A.M. Problems of the application of intelligent driver assistance systems on a single-track vehicles. IOP Conf. Ser. Mater. Sci. Eng. 2018, 386, 12021. [Google Scholar] [CrossRef]
  56. Kari, S.S.; Raj, A.A.B. Evolutionary developments of today’s remote sensing radar technology—Right from the telemobiloscope: A review. IEEE Geosci. Remote Sens. Mag. 2023, 12, 67–107. [Google Scholar] [CrossRef]
  57. Bilik, I.; Longman, O.; Villeval, S.; Tabrikian, J. The rise of radar for autonomous vehicles: Signal processing solutions and future research directions. IEEE Signal Process. Mag. 2019, 36, 20–31. [Google Scholar] [CrossRef]
  58. Patole, S.M.; Torlak, M.; Wang, D.; Ali, M. Automotive radars: A review of signal processing techniques. IEEE Signal Process. Mag. 2017, 34, 22–35. [Google Scholar] [CrossRef]
  59. Neumann, T. Analysis of Advanced Driver-Assistance Systems for Safe and Comfortable Driving of Motor Vehicles. Sensors 2024, 24, 6223. [Google Scholar] [CrossRef]
  60. Wang, X.; Xu, L.; Sun, H.; Xin, J.; Zheng, N. On-road vehicle detection and tracking using MMW radar and monovision fusion. IEEE Trans. Intell. Transp. Syst. 2016, 17, 2075–2084. [Google Scholar] [CrossRef]
  61. Bilik, I. Comparative analysis of radar and lidar technologies for automotive applications. IEEE Intell. Transp. Syst. Mag. 2022, 15, 244–269. [Google Scholar] [CrossRef]
  62. Sasikumar, S.; Aravind Balaji, B.; Joshuva, A.; Deivanayagampillai, N. Cameraless sensor fusion: Developing a cost-effective driver assistance system using radar and ultrasonic sensor. Sens. Rev. 2025, 45, 186–197. [Google Scholar] [CrossRef]
  63. Naesseth, C.A. Vision and Radar Sensor Fusion for Advanced Driver Assistance Systems; Linköpings Universitet: Linköping, Sweden, 2013. [Google Scholar]
  64. Xing, M.; Ma, P.; Lou, Y.; Sun, G.; Lin, H. Review of fast back projection algorithms in synthetic aperture radar. J. Radar 2024, 13, 1–22. [Google Scholar]
  65. Guo, J. The Latest Development of Synthetic Aperture Radar: An Overview. Appl. Comput. Eng. 2024, 112, 186–193. [Google Scholar] [CrossRef]
  66. Emadi, M. Radar Technology. Adv. Driv. Assist. Syst. Auton. Veh. Fundam. Appl. 2022, 265–304. [Google Scholar] [CrossRef]
  67. Belgiovane, D.; Chen, C.-C. Bicycles and human riders backscattering at 77 GHz for automotive radar. In Proceedings of the 2016 10th European Conference on Antennas and Propagation (EuCAP), Davos, Switzerland, 10–15 April 2016; pp. 1–5. [Google Scholar]
  68. Deacon, P.; Hunt, R.; Koenigsknecht, D.; Leonard, C.; Oakley, C. Frequency modulated continuous wave (FMCW) radar. Des. Team 2011, 6. Available online: https://bu.edu.eg/portal/uploads/Engineering,%20Shoubra/Electrical%20Engineering/2443/crs-14023/Files/FM%202.pdf (accessed on 21 July 2025).
  69. Hasch, J.; Topak, E.; Schnabel, R.; Zwick, T.; Weigel, R.; Waldschmidt, C. Millimeter-wave technology for automotive radar sensors in the 77 GHz frequency band. IEEE Trans. Microw. Theory Tech. 2012, 60, 845–860. [Google Scholar] [CrossRef]
  70. Rasshofer, R.H.; Naab, K. 77 GHz long range radar systems status, ongoing developments and future challenges. In Proceedings of the European Radar Conference, 2005. EURAD 2005, Paris, France, 3–4 October 2005; IEEE: New York, NY, USA, 2005; pp. 161–164. [Google Scholar]
  71. Gao, X.; Xing, G.; Roy, S.; Liu, H. Ramp-cnn: A novel neural network for enhanced automotive radar object recognition. IEEE Sens. J. 2020, 21, 5119–5132. [Google Scholar] [CrossRef]
  72. Dios, F.; Torres-Benito, S.; Lázaro, J.A.; Casas, J.R.; Pinazo, J.; Lerín, A. Experimental evaluation of a MIMO radar performance for ADAS application. Telecom 2024, 5, 508–521. [Google Scholar] [CrossRef]
  73. Galle, C.; Amelung, J.; Dallmann, T.; Brueggenwirth, S. Vehicle environment recognition for safe autonomous driving: Research focus on Solid-State LiDAR and RADAR. In Proceedings of the AmE 2020-Automotive Meets Electronics; 11th GMM-Symposium, Dortmund, Germany, 10–11 March 2020; pp. 1–3. [Google Scholar]
  74. Sun, S.; Petropulu, A.P.; Poor, H.V. MIMO radar for advanced driver-assistance systems and autonomous driving: Advantages and challenges. IEEE Signal Process. Mag. 2020, 37, 98–117. [Google Scholar] [CrossRef]
  75. Bilik, I.; Bialer, O.; Villeval, S.; Sharifi, H.; Kona, K.; Pan, M.; Persechini, D.; Musni, M.; Geary, K. Automotive MIMO radar for urban environments. In Proceedings of the 2016 IEEE Radar Conference (RadarConf), Philadelphia, PA, USA, 2–6 May 2016; IEEE: New York, NY, USA, 2016; pp. 1–6. [Google Scholar]
  76. Rohling, H.; Meinecke, M.-M. Waveform design principles for automotive radar systems. In Proceedings of the 2001 CIE International Conference on Radar Proceedings (Cat No. 01TH8559), Beijing, China, 15–18 October 2001; IEEE: New York, NY, USA, 2001; pp. 1–4. [Google Scholar]
  77. Engels, F.; Heidenreich, P.; Wintermantel, M.; Stäcker, L.; Al Kadi, M.; Zoubir, A.M. Automotive radar signal processing: Research directions and practical challenges. IEEE J. Sel. Top. Signal Process. 2021, 15, 865–878. [Google Scholar] [CrossRef]
  78. Raghunandan, K. RADAR for a Better Society. In Proceedings of the Introduction to Wireless Communications and Networks: A Practical Perspective; Springer: Berlin/Heidelberg, Germany, 2022; pp. 379–404. [Google Scholar]
  79. Ramteke, A.Y.; Ramteke, P.; Dhongade, A.; Modak, U.; Thakre, L.P. Blind Spot Detection for Autonomous Driving Using RADAR Technique. J. Phys. Conf. Ser. 2024, 2763, 12015. [Google Scholar] [CrossRef]
  80. Liu, G.; Wang, L.; Zou, S. A radar-based blind spot detection and warning system for driver assistance. In Proceedings of the 2017 IEEE 2nd Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), Chongqing, China, 25–26 March 2017; IEEE: New York, NY, USA, 2017; pp. 2204–2208. [Google Scholar]
  81. Zhao, Z.; Zhou, L.; Zhu, Q.; Luo, Y.; Li, K. A review of essential technologies for collision avoidance assistance systems. Adv. Mech. Eng. 2017, 9, 1687814017725246. [Google Scholar] [CrossRef]
  82. Dickmann, J.; Klappstein, J.; Hahn, M.; Appenrodt, N.; Bloecher, H.L.; Werber, K.; Sailer, A. Automotive radar the key technology for autonomous driving: From detection and ranging to environmental understanding. In Proceedings of the 2016 IEEE Radar Conference (RadarConf), Philadelphia, PA, USA, 2–6 May 2016; pp. 1–6. [Google Scholar]
  83. Muckenhuber, S.; Museljic, E.; Stettinger, G. Performance evaluation of a state-of-the-art automotive radar and corresponding modeling approaches based on a large labeled dataset. J. Intell. Transp. Syst. 2022, 26, 655–674. [Google Scholar] [CrossRef]
  84. Wang, X.; Wei, M.; Wang, Y.; Sun, H.; Ma, J. Radar Signal Behavior in Maritime Environments: Falling Rain Effects. Electronics 2023, 13, 58. [Google Scholar] [CrossRef]
  85. Golovachev, Y.; Etinger, A.; Pinhasi, G.A.; Pinhasi, Y. Millimeter wave high resolution radar accuracy in fog conditions—Theory and experimental verification. Sensors 2018, 18, 2148. [Google Scholar] [CrossRef] [PubMed]
  86. Ersü, C.; Petlenkov, E.; Janson, K. A Systematic Review of Cutting-Edge Radar Technologies: Applications for Unmanned Ground Vehicles (UGVs). Sensors 2024, 24, 7807. [Google Scholar] [CrossRef] [PubMed]
  87. Zang, S.; Ding, M.; Smith, D.; Tyler, P.; Rakotoarivelo, T.; Kaafar, M.A. The impact of adverse weather conditions on autonomous vehicles: How rain, snow, fog, and hail affect the performance of a self-driving car. IEEE Veh. Technol. Mag. 2019, 14, 103–111. [Google Scholar] [CrossRef]
  88. Mohammed, A.S.; Amamou, A.; Ayevide, F.K.; Kelouwani, S.; Agbossou, K.; Zioui, N. The perception system of intelligent ground vehicles in all weather conditions: A systematic literature review. Sensors 2020, 20, 6532. [Google Scholar] [CrossRef]
  89. Enayati, J.; Asef, P.; Wilson, P. Resilient Multi-range Radar Detection System for Autonomous Vehicles: A New Statistical Method. J. Electr. Eng. Technol. 2024, 19, 695–708. [Google Scholar] [CrossRef]
  90. Paek, D.-H.; Kong, S.-H.; Wijaya, K.T. K-radar: 4d radar object detection for autonomous driving in various weather conditions. Adv. Neural Inf. Process. Syst. 2022, 35, 3819–3829. [Google Scholar]
  91. Hadj-Bachir, M.; De Souza, P. LIDAR sensor simulation in adverse weather condition for driving assistance development. HAL Open Sci. 2019. Available online: https://hal.science/hal-01998668 (accessed on 21 July 2025).
  92. Kutila, M.; Pyykönen, P.; Holzhüter, H.; Colomb, M.; Duthon, P. Automotive LiDAR performance verification in fog and rain. In Proceedings of the 2018 21st International Conference on Intelligent Transportation Systems (ITSC), Maui, HI, USA, 4–7 November 2018; IEEE: New York, NY, USA, 2018; pp. 1695–1701. [Google Scholar]
  93. Brzozowski, M.; Parczewski, K. Problems related to the operation of autonomous vehicles in adverse weather conditions. Combust. Engines 2023, 194, 109–115. [Google Scholar] [CrossRef]
  94. Kim, J.; Park, B.; Kim, J. Empirical analysis of autonomous vehicle’s lidar detection performance degradation for actual road driving in rain and fog. Sensors 2023, 23, 2972. [Google Scholar] [CrossRef]
  95. Tang, L.; Shi, Y.; He, Q.; Sadek, A.W.; Qiao, C. Performance test of autonomous vehicle lidar sensors under different weather conditions. Transp. Res. Rec. 2020, 2674, 319–329. [Google Scholar] [CrossRef]
  96. Dreissig, M.; Scheuble, D.; Piewak, F.; Boedecker, J. Survey on lidar perception in adverse weather conditions. In Proceedings of the 2023 IEEE Intelligent Vehicles Symposium (IV), Anchorage, AK, USA, 4–7 June 2023; IEEE: New York, NY, USA, 2023; pp. 1–8. [Google Scholar]
  97. Lee, J.; Bang, G.; Shimizu, T.; Iehara, M.; Kamijo, S. LiDAR-to-Radar Translation Based on Voxel Feature Extraction Module for Radar Data Augmentation. Sensors 2024, 24, 559. [Google Scholar] [CrossRef] [PubMed]
  98. Wojtanowski, J.; Zygmunt, M.; Kaszczuk, M.; Mierczyk, Z.; Muzal, M. Comparison of 905 nm and 1550 nm semiconductor laser rangefinders’ performance deterioration due to adverse environmental conditions. Opto-Electron. Rev. 2014, 22, 183–190. [Google Scholar] [CrossRef]
  99. Designs, T.I. LIDAR Pulsed Time of Flight Reference Design; Texas Instruments Incorporated: Dallas, TX, USA, 2016. [Google Scholar]
  100. Li, W.; Shi, T.; Wang, R.; Yang, J.; Ma, Z.; Zhang, W.; Fu, H.; Guo, P. Advances in LiDAR Hardware Technology: Focus on Elastic LiDAR for Solid Target Scanning. Sensors 2024, 24, 7268. [Google Scholar] [CrossRef]
  101. Yang, Y.; Liu, J.; Huang, T.; Han, Q.-L.; Ma, G.; Zhu, B. RaLiBEV: Radar and LiDAR BEV fusion learning for anchor box free object detection systems. IEEE Trans. Circuits Syst. Video Technol. 2024, 35, 4130–4143. [Google Scholar] [CrossRef]
  102. Brophy, T.; Mullins, D.; Parsi, A.; Horgan, J.; Ward, E.; Denny, P.; Eising, C.; Deegan, B.; Glavin, M.; Jones, E. A review of the impact of rain on camera-based perception in automated driving systems. IEEE Access 2023, 11, 67040–67057. [Google Scholar] [CrossRef]
  103. Mai, N.A.M.; Duthon, P.; Salmane, P.H.; Khoudour, L.; Crouzil, A.; Velastin, S.A. Camera and LiDAR analysis for 3D object detection in foggy weather conditions. In Proceedings of the 2022 12th International Conference on Pattern Recognition Systems (ICPRS), Saint-Etienne, France, 7–10 June 2022; IEEE: New York, NY, USA, 2022; pp. 1–7. [Google Scholar]
  104. Velázquez, J.M.R.; Khoudour, L.; Pierre, G.S.; Duthon, P.; Liandrat, S.; Bernardin, F.; Fiss, S.; Ivanov, I.; Peleg, R. Analysis of thermal imaging performance under extreme foggy conditions: Applications to autonomous driving. J. Imaging 2022, 8, 306. [Google Scholar] [CrossRef]
  105. Yeong, D.J.; Velasco-Hernandez, G.; Barry, J.; Walsh, J. Sensor and sensor fusion technology in autonomous vehicles: A review. Sensors 2021, 21, 2140. [Google Scholar] [CrossRef]
  106. Vargas, J.; Alsweiss, S.; Toker, O.; Razdan, R.; Santos, J. An overview of autonomous vehicles sensors and their vulnerability to weather conditions. Sensors 2021, 21, 5397. [Google Scholar] [CrossRef]
  107. Zhang, C.; Wang, H.; Cai, Y.; Chen, L.; Li, Y.; Sotelo, M.A.; Li, Z. Robust-FusionNet: Deep multimodal sensor fusion for 3-D object detection under severe weather conditions. IEEE Trans. Instrum. Meas. 2022, 71, 1–13. [Google Scholar] [CrossRef]
  108. Bijelic, M.; Gruber, T.; Mannan, F.; Kraus, F.; Ritter, W.; Dietmayer, K.; Heide, F. Seeing through fog without seeing fog: Deep multimodal sensor fusion in unseen adverse weather. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, DC, USA, 14–19 June 2020; pp. 11682–11692. [Google Scholar]
  109. Su, H.; Gao, H.; Wang, X.; Fang, X.; Liu, Q.; Huang, G.-B.; Li, X.; Cao, Q. Object detection in adverse weather for autonomous vehicles based on sensor fusion and incremental learning. IEEE Trans. Instrum. Meas. 2024, 73, 1–10. [Google Scholar] [CrossRef]
  110. Wang, Z.; Zhan, J.; Li, Y.; Zhong, Z.; Cao, Z. A new scheme of vehicle detection for severe weather based on multi-sensor fusion. Measurement 2022, 191, 110737. [Google Scholar] [CrossRef]
  111. Sheeny, M.; De Pellegrin, E.; Mukherjee, S.; Ahrabian, A.; Wang, S.; Wallace, A. Radiate: A radar dataset for automotive perception in bad weather. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May 2021–5 June 2021; IEEE: New York, NY, USA, 2021; pp. 1–7. [Google Scholar]
  112. Burnett, K.; Yoon, D.J.; Wu, Y.; Li, A.Z.; Zhang, H.; Lu, S.; Qian, J.; Tseng, W.-K.; Lambert, A.; Leung, K.Y.; et al. Boreas: A multi-season autonomous driving dataset. Int. J. Rob. Res. 2023, 42, 33–42. [Google Scholar] [CrossRef]
  113. Geiger, A.; Lenz, P.; Urtasun, R. Are we ready for autonomous driving? the kitti vision benchmark suite. In Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 16–21 June 2012; IEEE: New York, NY, USA, 2012; pp. 3354–3361. [Google Scholar]
  114. Sun, P.; Kretzschmar, H.; Dotiwalla, X.; Chouard, A.; Patnaik, V.; Tsui, P.; Guo, J.; Zhou, Y.; Chai, Y.; Caine, B.; et al. Scalability in perception for autonomous driving: Waymo open dataset. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020; pp. 2446–2454. [Google Scholar]
  115. Cordts, M.; Omran, M.; Ramos, S.; Rehfeld, T.; Enzweiler, M.; Benenson, R.; Franke, U.; Roth, S.; Schiele, B. The cityscapes dataset for semantic urban scene understanding. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 3213–3223. [Google Scholar]
  116. Maddern, W.; Pascoe, G.; Linegar, C.; Newman, P. 1 year, 1000 km: The oxford robotcar dataset. Int. J. Rob. Res. 2017, 36, 3–15. [Google Scholar] [CrossRef]
  117. Barnes, D.; Gadd, M.; Murcutt, P.; Newman, P.; Posner, I. The oxford radar robotcar dataset: A radar extension to the oxford robotcar dataset. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May 2020–31 August 2020; IEEE: New York, NY, USA, 2020; pp. 6433–6438. [Google Scholar]
  118. Caesar, H.; Bankiti, V.; Lang, A.H.; Vora, S.; Liong, V.E.; Xu, Q.; Krishnan, A.; Pan, Y.; Baldan, G.; Beijbom, O. nuscenes: A multimodal dataset for autonomous driving. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020; pp. 11621–11631. [Google Scholar]
  119. Piroli, A.; Dallabetta, V.; Kopp, J.; Walessa, M.; Meissner, D.; Dietmayer, K. Energy-based detection of adverse weather effects in lidar data. IEEE Robot. Autom. Lett. 2023, 8, 4322–4329. [Google Scholar] [CrossRef]
  120. Piroli, A.; Dallabetta, V.; Kopp, J.; Walessa, M.; Meissner, D.; Dietmayer, K. SemanticSpray++: A Multimodal Dataset for Autonomous Driving in Wet Surface Conditions. In Proceedings of the 2024 IEEE Intelligent Vehicles Symposium (IV), Jeju Island, Republic of Korea, 2–5 June 2024; IEEE: New York, NY, USA, 2024; pp. 3085–3091. [Google Scholar]
  121. Pham, Q.-H.; Sevestre, P.; Pahwa, R.S.; Zhan, H.; Pang, C.H.; Chen, Y.; Mustafa, A.; Chandrasekhar, V.; Lin, J. A* 3d dataset: Towards autonomous driving in challenging environments. In Proceedings of the 2020 IEEE INTERNATIONAL Conference on Robotics and Automation (ICRA), Paris, France, 31 May 2020–31 August 2020; IEEE: New York, NY, USA, 2020; pp. 2267–2273. [Google Scholar]
  122. Kenk, M.A.; Hassaballah, M. DAWN: Vehicle detection in adverse weather nature dataset. arXiv 2020, arXiv:2008.05402. [Google Scholar] [CrossRef]
  123. Huang, X.; Cheng, X.; Geng, Q.; Cao, B.; Zhou, D.; Wang, P.; Lin, Y.; Yang, R. The apolloscape dataset for autonomous driving. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Workshops, Salt Lake City, UT, USA, 18–23 June 2018; pp. 954–960. [Google Scholar]
  124. Neuhold, G.; Ollmann, T.; Bulo, S.R.; Kontschieder, P. The mapillary vistas dataset for semantic understanding of street scenes. In Proceedings of the IEEE International Conference on Computer Vision, Honolulu, HI, USA, 21–26 July 2017; pp. 4990–4999. [Google Scholar]
  125. Yu, F.; Chen, H.; Wang, X.; Xian, W.; Chen, Y.; Liu, F.; Madhavan, V.; Darrell, T. Bdd100k: A diverse driving dataset for heterogeneous multitask learning. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020; pp. 2636–2645. [Google Scholar]
  126. Sakaridis, C.; Dai, D.; Van Gool, L. ACDC: The adverse conditions dataset with correspondences for semantic driving scene understanding. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Nashville, TN, USA, 20–25 June 2021; pp. 10765–10775. [Google Scholar]
  127. Zendel, O.; Honauer, K.; Murschitz, M.; Steininger, D.; Dominguez, G.F. Wilddash-creating hazard-aware benchmarks. In Proceedings of the European Conference on Computer Vision (ECCV), Workshops, Salt Lake City, UT, USA, 18–23 June 2018; pp. 402–416. [Google Scholar]
  128. Braun, M.; Krebs, S.; Flohr, F.; Gavrila, D.M. Eurocity persons: A novel benchmark for person detection in traffic scenes. IEEE Trans. Pattern Anal. Mach. Intell. 2019, 41, 1844–1861. [Google Scholar] [CrossRef]
  129. Wenzel, P.; Wang, R.; Yang, N.; Cheng, Q.; Khan, Q.; von Stumberg, L.; Zeller, N.; Cremers, D. 4Seasons: A cross-season dataset for multi-weather SLAM in autonomous driving. In Proceedings of the Pattern Recognition: 42nd DAGM German Conference, DAGM GCPR 2020, Tübingen, Germany, 28 September–1 October 2020; Proceedings 42. Springer: Berlin/Heidelberg, Germany, 2021; pp. 404–417. [Google Scholar]
  130. Pitropov, M.; Garcia, D.E.; Rebello, J.; Smart, M.; Wang, C.; Czarnecki, K.; Waslander, S. Canadian adverse driving conditions dataset. Int. J. Rob. Res. 2021, 40, 681–690. [Google Scholar] [CrossRef]
  131. Sheeny, M. All-Weather Object Recognition Using Radar and Infrared Sensing. arXiv 2020, arXiv:2010.16285. [Google Scholar] [CrossRef]
  132. Hahner, M.; Sakaridis, C.; Bijelic, M.; Heide, F.; Yu, F.; Dai, D.; Van Gool, L. Lidar snowfall simulation for robust 3d object detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 16364–16374. [Google Scholar]
  133. Janai, J.; Güney, F.; Behl, A.; Geiger, A. Computer vision for autonomous vehicles: Problems, datasets and state of the art. Found. Trends® Comput. Graph. Vis. 2020, 12, 1–308. [Google Scholar] [CrossRef]
  134. Sarker, I.H. Machine learning: Algorithms, real-world applications and research directions. SN Comput. Sci. 2021, 2, 160. [Google Scholar] [CrossRef]
  135. Kiran, B.R.; Sobh, I.; Talpaert, V.; Mannion, P.; Al Sallab, A.A.; Yogamani, S.; Perez, P. Deep reinforcement learning for autonomous driving: A survey. IEEE Trans. Intell. Transp. Syst. 2021, 23, 4909–4926. [Google Scholar] [CrossRef]
  136. Parekh, D.; Poddar, N.; Rajpurkar, A.; Chahal, M.; Kumar, N.; Joshi, G.P.; Cho, W. A review on autonomous vehicles: Progress, methods and challenges. Electronics 2022, 11, 2162. [Google Scholar] [CrossRef]
Figure 1. Organization of the paper.
Figure 2. Evaluation of papers by year related to ARAS.
Figure 3. ARAS modules performance in sunny, rainy, foggy, and snowy conditions.
Figure 4. Evaluating LiDAR wavelengths, 905 nm vs. 1550 nm, across key metrics [100].
Table 1. ACC system performance across weather conditions.

| Weather Condition | ACC Performance Summary | Refs. |
|---|---|---|
| Sunny/Dry | Optimal performance with low error margins and high comfort levels. | [21] |
| Rainy | Slight degradation; longer travel times and headway; mitigated by control adjustments. | [22,23] |
| Foggy | Sensor limitations are likely and data are limited; manufacturers caution against use. | [21] |
| Snowy/Icy | Significant challenges; dynamic fuzzy control and extended safe distances are required for acceptable performance. | [21,24] |
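As a minimal illustration of the extended safe distances that Table 1 associates with degraded weather, the sketch below computes a weather-conditioned desired following gap from a standstill offset plus a time-headway term. The headway values and function names are assumptions chosen for illustration, not parameters reported in [21,22,23,24].

```python
# Illustrative weather-conditioned desired-gap rule for an ACC controller.
# Headway values and the standstill gap are assumptions for this sketch.

BASE_HEADWAY_S = {"dry": 1.5, "rain": 2.0, "fog": 2.5, "snow": 3.0}

def desired_gap(ego_speed_mps, weather="dry", standstill_gap_m=5.0):
    """Desired distance to the lead vehicle: standstill gap + time-headway term."""
    headway_s = BASE_HEADWAY_S.get(weather, BASE_HEADWAY_S["dry"])
    return standstill_gap_m + headway_s * ego_speed_mps

print(desired_gap(25.0, "dry"))   # 42.5 m at 90 km/h
print(desired_gap(25.0, "snow"))  # 80.0 m, roughly doubled following distance
```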
Table 2. ABS effectiveness by weather condition.

| Weather Condition | ABS Effectiveness | Key Observations | Refs. |
|---|---|---|---|
| Sunny (Dry) | Very High | Shortest stopping distances, optimal traction, stable steering | [25] |
| Rainy (Wet) | Moderate to High | Increased stopping distance, late wheel lock, crash/injury reduction up to 60% | [25,26] |
| Foggy | Moderate (inferred) | Expected reduced friction; performance similar to wet conditions | [25] |
| Snowy/Icy | Low to Moderate | Longer stopping distances, reduced friction; ABS effectiveness declines unless paired with advanced controls | [27,28] |
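The trend in Table 2 follows from the friction-limited braking relation d = v^2 / (2·mu·g): as the road-tyre friction coefficient mu drops on wet or snowy surfaces, the ideal stopping distance grows in inverse proportion. The short example below works through this relation with typical textbook friction values, which are assumptions and not measurements from the cited evaluations.

```python
# Friction-limited stopping distance d = v**2 / (2 * mu * g) on a flat road.
# The friction coefficients below are typical textbook assumptions, not
# values measured in the cited ABS studies.

G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance(speed_mps, mu):
    """Ideal braking distance for a given road-tyre friction coefficient."""
    return speed_mps ** 2 / (2.0 * mu * G)

for surface, mu in {"dry": 0.8, "wet": 0.5, "snow": 0.2}.items():
    print(surface, round(stopping_distance(20.0, mu), 1), "m")  # 20 m/s = 72 km/h
# dry ~25.5 m, wet ~40.8 m, snow ~101.9 m
```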
Table 3. Performance summary of AEB under different weather conditions.

| Weather Condition | AEB Performance | Key Challenges | Effectiveness Notes | Refs. |
|---|---|---|---|---|
| Sunny | Optimal performance | None | High detection accuracy, full braking capability | [29] |
| Rainy | Reduced effectiveness | Sensor obstruction, low friction | Crash risk reduced for pedestrians, not significant for cyclists | [29,30] |
| Foggy | Inconsistent performance | Sensor inaccuracy | Some benefit for pedestrians, low for cyclists | [30] |
| Snowy | Significantly reduced | Low traction, sensor interference | Improved with adaptive algorithms; otherwise, poor | [31] |
Table 4. Performance summary of curve warning system.

| Weather Condition | Performance Summary | Ref. |
|---|---|---|
| Sunny | Expected optimal performance with minimal interference; underexplored in research. | [17] |
| Rainy | Increased rider control; ABS and traction control are highly valued; some concern over trust. | [17] |
| Foggy | High effectiveness with HUD and audio alerts; reduced lane departure; strong user response. | [37] |
| Snowy/Icy | Effective warnings; early alerts are critical; control can be compromised by automation. | [19] |
Table 5. FCW system performance by weather condition.

| Weather Condition | Performance Summary | Refs. |
|---|---|---|
| Sunny | High detection accuracy (>90%) | [38] |
| Rainy | Moderate degradation (~81.9–85%) | [38,39] |
| Foggy | Significant performance drop in standard systems; advanced systems improve warning time by 3.4–10.88× | [40,41] |
| Snowy | Not experimentally evaluated; performance degradation likely (inferred) | No specific reference |
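Forward collision warning logic is commonly framed around time-to-collision (TTC), the current gap divided by the closing speed. The sketch below shows a minimal TTC check with a longer warning threshold when visibility is reduced; both threshold values are illustrative assumptions, not figures from [38,39,40,41].

```python
# Minimal time-to-collision (TTC) check of the kind used in FCW logic.
# The warning thresholds are illustrative assumptions; a longer threshold
# is applied here when visibility is reduced.

def time_to_collision(gap_m, closing_speed_mps):
    """TTC in seconds; infinite when the gap is not closing."""
    return gap_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")

def should_warn(gap_m, closing_speed_mps, low_visibility=False):
    threshold_s = 4.0 if low_visibility else 2.5  # assumed thresholds
    return time_to_collision(gap_m, closing_speed_mps) < threshold_s

print(should_warn(40.0, 12.0))                       # False, TTC ~3.3 s
print(should_warn(40.0, 12.0, low_visibility=True))  # True under the fog assumption
```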
Table 6. Intersection support system performance.

| Weather Condition | Performance Summary | Refs. |
|---|---|---|
| Sunny | High accuracy and effectiveness in threat detection and user feedback | [42,43] |
| Rainy | No direct testing; potential sensor degradation noted | [44] |
| Foggy | No direct evaluation; cooperative systems could mitigate visibility limitations | [43,44] |
| Snowy | Not tested; likely performance degradation due to occlusions and control limitations | [42,44] |
Table 7. LKAS performance under different weather conditions.

| Weather Condition | Performance Description | Ref. |
|---|---|---|
| Sunny | High accuracy (>98%); minimal noise; optimal conditions | [45] |
| Rainy | Moderate degradation; glare and water reduce image clarity; compensated by preprocessing | [45] |
| Foggy | Significant degradation; reduced visibility and lane contrast; optical flow helps | [46] |
| Snowy | Poor performance; markings often obscured; partial lane recovery with robust algorithms | [45] |
Table 8. Weather-based performance summary.

| Weather | Performance Summary | Best Models | Refs. |
|---|---|---|---|
| Sunny | High accuracy, high recall, low false positives | DFR-TSD, YOLOv5 Improved | [47,48,49] |
| Rainy | Moderate decline; robust models maintain acceptable recall | DFR-TSD, YOLOv5 Improved (CA) | [48,49] |
| Foggy | Significant challenge; only enhanced models maintain good performance | DPF-YOLOv8, DFR-TSD | [48,50] |
| Snowy | Slight degradation; detection remains effective with attention and feature refinement | YOLOv5 Improved, DFR-TSD | [48,49] |
Table 9. Performance of stability control systems.

| Weather Condition | Performance Summary | Refs. |
|---|---|---|
| Sunny | Optimal functionality, stable traction, and visibility | [51] |
| Rainy | Reduced effectiveness due to traction loss and sensor issues | [51,52] |
| Foggy | Detection limitations affect sensor-based systems | [52] |
| Snowy | Lowest reliability; sensor and traction impairment | [52] |
Table 10. TCS performance by weather condition.

| Weather Condition | Road–Tire Friction | TCS Performance | Key Challenges | Refs. |
|---|---|---|---|---|
| Sunny | High | Excellent | None | [53] |
| Rainy | Medium | Moderate | Timely torque modulation | [54] |
| Foggy | Medium | Limited (sensor-limited) | Reduced sensor accuracy, visibility | [55] |
| Snowy | Low | Poor to Moderate | Severe slip requires advanced control | [53,54] |
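Traction control interventions of the kind summarized in Table 10 typically hinge on the longitudinal slip ratio of the driven wheel. The sketch below computes a simple slip ratio and lowers the allowed slip on low-friction surfaces before requesting a torque cut; the per-surface limits and function names are assumptions for illustration only.

```python
# Simple longitudinal slip-ratio check of the kind a traction control system
# evaluates before cutting engine torque. The per-surface slip limits are
# illustrative assumptions.

def slip_ratio(wheel_speed_mps, vehicle_speed_mps):
    """Driven-wheel slip: positive when the wheel spins faster than the vehicle."""
    denom = max(wheel_speed_mps, vehicle_speed_mps, 0.1)
    return (wheel_speed_mps - vehicle_speed_mps) / denom

SLIP_LIMIT = {"dry": 0.15, "wet": 0.10, "snow": 0.05}  # assumed limits

def request_torque_cut(wheel_speed_mps, vehicle_speed_mps, surface="dry"):
    return slip_ratio(wheel_speed_mps, vehicle_speed_mps) > SLIP_LIMIT[surface]

print(request_torque_cut(23.0, 20.0, "dry"))   # slip ~0.13 -> no intervention
print(request_torque_cut(23.0, 20.0, "snow"))  # same slip -> torque cut on snow
```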
Table 11. Normalized terms to evaluate the performance of the ARAS module.

| Normalized Term | Description |
|---|---|
| Excellent | Near-perfect operation with minimal degradation; optimal conditions. |
| Good | Minor degradation, still performs reliably; mitigation strategies are effective. |
| Moderate | Noticeable performance drop; acceptable with limitations. |
| Poor | Significant degradation; limited utility without adaptive enhancements. |
| Unusable/Unknown | Performance is severely compromised, not recommended, or untested. |
Table 12. Radar types in ARAS.

| Radar Type | Application in ARAS | Functionality | Refs. |
|---|---|---|---|
| Long-Range Radar (LRR) | ACC | Maintains a safe distance from vehicles ahead | Bosch Mobility (Advanced rider assistance systems), Continental Engineering (Revolutionizing Motorcycle Safety: Advanced Rider Assistance Systems (ARAS)) [66] |
| Mid-Range Radar (MRR) | BSD, Lane Change Assist (LCA) | Monitors blind spots and adjacent lanes | Panasonic Industrial (What Is Radar Used for in the ADAS and AD System? Support Autonomous Driving by Using Radio Waves—Panasonic), Continental Engineering (Revolutionizing Motorcycle Safety: Advanced Rider Assistance Systems (ARAS)) [67] |
| Short-Range Radar (SRR) | FCW | Detects immediate obstacles, provides collision alerts | Geotab Blog (What is ADAS (Advanced Driver Assistance Systems)? - Geotab), NXP Radar Solutions (https://www.nxp.com/applications/RADAR-SYSTEMS—accessed on 10 June 2025), Continental Engineering, Bosch Mobility (Advanced rider assistance systems) |
| FMCW Radar | General ARAS applications | Measures distance and speed with high precision | [68,69] |
| 77 GHz Radar | ACC, BSD | High-resolution object detection | Panasonic Industrial (Revolutionizing Motorcycle Safety: Advanced Rider Assistance Systems (ARAS)) [67,70] |
| Monostatic Radar | Various ARAS applications | Simple design with co-located transmitter and receiver | [71,72] |
| Solid-State Radar | ARAS applications requiring durability | Reliable and maintenance-free operation | Continental Engineering (Revolutionizing Motorcycle Safety: Advanced Rider Assistance Systems (ARAS)) [73] |
| MIMO Radar (Multiple-Input Multiple-Output) | Advanced ARAS applications | Enhances spatial resolution and target discrimination | [74,75] |
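For the FMCW radars listed in Table 12, range follows from the beat frequency between the transmitted and received chirps, R = c·f_b·T_chirp / (2·B). The snippet below evaluates this standard relation with generic 77 GHz-style chirp parameters chosen purely for illustration.

```python
# Back-of-the-envelope FMCW range estimate, R = c * f_b * T_chirp / (2 * B).
# The chirp parameters are generic 77 GHz-style assumptions for illustration.

C = 3.0e8  # speed of light, m/s

def fmcw_range(beat_freq_hz, chirp_duration_s, bandwidth_hz):
    """Target range implied by the measured beat frequency of one chirp."""
    return C * beat_freq_hz * chirp_duration_s / (2.0 * bandwidth_hz)

# Assumed chirp: 4 GHz sweep over 40 microseconds, measured beat frequency 2 MHz.
print(fmcw_range(2.0e6, 40.0e-6, 4.0e9), "m")  # -> 3.0 m
```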
Table 13. Classification of radar sensors with their different aspects.

| Frequency Range | Classification | Azimuth FoV (°) | Detection Range | Key Applications | Refs. |
|---|---|---|---|---|---|
| 24.05–24.25 GHz | SRR | 80–120 | 0–10 m | Blind-spot detection (BSD), Lane Change Assist | Cetecom advance (https://cetecomadvanced.com/en/news/frequency-ranges-for-automotive-Radar-technology/?utm—accessed on 5 June 2025) [78,79,80] |
| 24.05–24.25 GHz | MRR | 30–60 | 1–60 m | Distance Warning, Lane Change Assist, Collision Avoidance | [78,81] |
| 76–77 GHz | MRR | 30–60 | 1–60 m | Front Cross Traffic Alert, Blind Spot Detection, Lane Change Assist, Rear Cross Traffic Alert | [78] |
| 76–77 GHz | LRR | 10–20 | 150–250 m | Adaptive Cruise Control (ACC), Forward Collision Warning (FCW) | [82] |
Table 14. Radar performance in diverse weather conditions.

| Weather Condition | Radar Types | Radar Performance | Applications | Refs. |
|---|---|---|---|---|
| Sunny | mmWave Radar | Unaffected; consistent performance | ACC, LKA | AutomotiveRadar (https://www.keysight.com/blogs/en/tech/educ/2023/automotive-radar—accessed on 5 June 2025) [83,86] |
| Rainy | Dual-Polarization (Dual-Pol) Radar | Minor attenuation in heavy rain; retains functionality | Collision Avoidance, Emergency Braking | [87,88,89] |
| Foggy | mmWave Radar | High reliability; minimal signal scattering due to wavelength advantage | Autonomous Vehicle Navigation, Blind-Spot Detection | [86,87,90] |
| Snowy | mmWave Radar, Dual-Pol Radar | Functional in light snow; heavy snow causes scattering but retains obstacle detection | Obstacle Detection, Cross-Traffic Alert | [87,88,89] |
Table 15. LiDAR performance in different weather conditions.

| Weather Condition | Performance | LiDAR Type | Applications | Refs. |
|---|---|---|---|---|
| Sunny | Unaffected; high-resolution 3D mapping | Standard LiDAR (e.g., 905 nm/1550 nm) | Autonomous Navigation, Object Detection | [97,98] |
| Rainy | Signal attenuation in heavy rain; reduced range and accuracy | 1550 nm solid-state or hybrid LiDAR | Collision Avoidance, Emergency Braking | [92,94,99] |
| Foggy | Dense scattering reduces detection range and accuracy; performs poorly in fog | 1550 nm LiDAR, solid-state LiDAR | Autonomous Vehicle Navigation, Obstacle Detection | [92,94,100] |
| Snowy | Snowflakes scatter laser beams; snow accumulation obstructs sensors | 1550 nm LiDAR, Multi-Sensor Fusion | Lane Detection, Cross-Traffic Alert | [92,95,101] |
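The range losses summarized in Table 15 can be pictured with the Beer-Lambert law over the two-way optical path, where the received power scales with exp(-2·alpha·R) and the extinction coefficient alpha rises sharply in fog, rain, and snow. The snippet below illustrates the trend with rough, assumed extinction values rather than measured ones.

```python
# Beer-Lambert attenuation of a LiDAR return over the two-way path,
# P_received proportional to exp(-2 * alpha * R). The extinction coefficients
# below are rough assumptions used only to show the trend.

import math

def two_way_transmission(range_m, alpha_per_m):
    """Fraction of optical power surviving the out-and-back path."""
    return math.exp(-2.0 * alpha_per_m * range_m)

for condition, alpha in {"clear air": 0.0001, "moderate fog": 0.02}.items():
    print(condition, round(two_way_transmission(100.0, alpha), 3))
# clear air: ~0.98 of the signal survives 100 m; fog: ~0.018, a heavy loss
```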
Table 16. Camera performance in different weather conditions.

| Weather Condition | Performance | Camera Type | Applications | Refs. |
|---|---|---|---|---|
| Sunny | High-resolution images, effective for object and lane detection, but prone to glare and shadows | Standard RGB camera | LKA, object detection, adaptive lighting | [102] |
| Rainy | Performance may improve where wet surfaces reflect more light, but water droplets distort images | Waterproof or water-resistant RGB camera | Collision avoidance, pedestrian detection, road marking recognition | [102] |
| Foggy | Reduced visibility; sharp decline in standalone performance, but stereo cameras perform better (90.15%) | Stereo camera or thermal imaging camera | Object detection in fog, multi-sensor fusion with LiDAR for enhanced accuracy | [103,104] |
| Snowy | Functional in cold temperatures, but snow causes optical and mechanical disruption of the camera | Cold-resistant RGB camera | Obstacle detection, lane departure warning, cross-traffic alert | [87] |
Table 17. Evaluation of datasets and their performance in different weather conditions.

| Dataset | Dataset Size | LiDAR | Radar | Camera | Sunny | Rain | Fog | Snow | Refs. |
|---|---|---|---|---|---|---|---|---|---|
| RADIATE | 200K+ images | | | | | | | | [111] |
| K-Radar | 35K frames | | | | | | | | [90] |
| Boreas | 7111 LiDAR frames | | | | | | | | [112] |
| KITTI | 12,919 images | | | | | | | | [113] |
| Waymo Open Dataset | Over 20 million images | | | | | | | | [114] |
| Cityscapes | 5K images, 20K | | | | | | | | [115] |
| Oxford Radar RobotCar | 37,700 frames | | | | | | | | [116,117] |
| nuScenes | 1000 scenes | | | | | | | | [118] |
| SemanticSpray | 200 scenes | | | | | | | | [119,120] |
| A*3D (AdverseKITTI) | 230K+ images | | | | | | | | [121] |
| DENSE Dataset | 12K samples | | | | | | | | [108] |
| DAWN | 1000 images | | | | | | | | [122] |
| ApolloScape | Over 140,000 frames | | | | | | | | [123] |
| Mapillary | 25,000 images | | | | | | | | [124] |
| BDD100k | 100,000 frames | | | | | | | | [125] |
| ACDC | 8012 images | | | | | | | | [126] |
| WildDash | 4256 frames | | | | | | | | [127] |
| EuroCity | 47,300 images | | | | | | | | [128] |
| 4Seasons | 350 km of driving data | | | | | | | | [129] |
| CADCD | 7000 frames | | | | | | | | [130] |
Table 18. ARAS suitability datasets’ performance.

| Dataset | LiDAR | Radar | Camera | Rain | Fog | Snow | Key Notes |
|---|---|---|---|---|---|---|---|
| DENSE | | | | | | | Multimodal with challenging weather + night; good for sensor fusion and robustness. |
| nuScenes | | | | | | | Large-scale, popular, multimodal; good annotations, urban/highway mix. |
| RADIATE | | | | | | | Focused on adverse weather; strong sensor fusion. |
| Oxford RobotCar | | | | | | | Real-world long-term dataset covering weather and lighting variations. |
| Boreas | | | | | | | A newer dataset with a variety of conditions; also supports multimodal fusion. |
| K-Radar | | | | | | | Includes challenging driving conditions, such as fog, rain, and snow. |
| Waymo | | | | | | | Large-scale, high-quality annotations; widely used in ADAS research. |
| A*3D | | | | | | | Limited sensors (no LiDAR), but decent weather and night coverage. |
Table 19. Limitations of sensor performance.

| Weather Condition | Sensor Type | Sensor Limitations | Refs. |
|---|---|---|---|
| Sunny | Radar | No significant limitations | [96,111] |
| Sunny | LiDAR | High reliability (no degradation in clear weather) | [16,120] |
| Sunny | Camera | Glare near low sun angles | |
| Rain | Radar | Minor signal noise implied by radar robustness in adverse weather (not explicitly quantified) | [111] |
| Rain | LiDAR | Degraded performance due to raindrop adherence on sensor surfaces, reducing visibility | [92,120] |
| Rain | Camera | Susceptible to blur/glare caused by rain | [102,120] |
| Fog | Radar | Angular resolution loss in fog (implied by reduced precision in extreme conditions) | [96,111] |
| Fog | LiDAR | Visibility range reduced due to laser beam absorption/attenuation | [92,119] |
| Fog | Camera | Poor image detail due to light scattering | [102,119] |
| Snow | Radar | Few false positives (implied by radar robustness in low visibility) | [96,111] |
| Snow | LiDAR | Heavy scattering from snowflakes | [92,132] |
| Snow | Camera | Glare from snow-covered surfaces | [102,120] |
Table 20. ADAS/ARAS scene understanding and perception tasks.

| Task | Description |
|---|---|
| Object Detection | Identifying and localizing objects such as vehicles and pedestrians |
| Semantic/Instance Segmentation | Assigning labels or separating individual objects in a scene |
| Tracking/Scene Flow | Following object movement over time or estimating 3D motion |
| Lane Detection | Identifying lane markers under varying conditions |
| Depth Estimation | Predicting the distance of objects using stereo or monocular inputs |

Interpretation and Understanding

| Task | Description |
|---|---|
| Scene Understanding | Contextualizing traffic layout, road semantics, and intersections |
| Trajectory Prediction | Predicting the motion of vehicles, pedestrians, and cyclists |
| Behaviour Prediction | Anticipating driver or pedestrian intent (e.g., crossing, turning) |
| Localization & Mapping | Estimating vehicle pose and building a local/global environmental map |

Decision and Control Support

| Task | Description |
|---|---|
| Collision Avoidance | Recognizing imminent risks and triggering alerts or actions |
| Path Planning | Computing safe and efficient trajectories under constraints |
| Adaptive Cruise & Lane Keep | Maintaining speed, gap, and lateral position under automation |

Cross-Cutting Tasks

| Task | Description |
|---|---|
| Sensor Fusion | Combining data from multiple sensors to improve robustness |
| Domain Adaptation/Robustness | Ensuring models generalize to new environments or weather |
| Real-Time Optimization | Enabling low-latency performance on constrained hardware |
Table 21. ADAS/ARAS algorithms used in scene understanding and perception tasks.

| Task | Common Techniques/Algorithms |
|---|---|
| Object Detection | YOLO, SSD, Faster R-CNN, RetinaNet, CenterPoint (LiDAR) |
| Semantic Segmentation | DeepLabv3+, PSPNet, HRNet, Swin Transformer, U-Net |
| Instance Segmentation | Mask R-CNN, PANet, SOLOv2 |
| Scene Flow/Tracking | Kalman Filters, Optical Flow (RAFT), CenterTrack, SORT, Deep SORT, SceneFlowNet |
| Depth Estimation | MonoDepth2, DORN, PackNet, PSMNet (stereo), self-supervised SfM |
| Lane Detection | SCNN, PolyLaneNet, LaneNet, ENet-SAD |

Interpretation and Understanding

| Task | Common Techniques/Algorithms |
|---|---|
| Scene Understanding | Scene Graph Networks, BEVFusion, PointPainting |
| Trajectory Prediction | LSTM, GRU, Social-GAN, MultiPath++, VectorNet, IntentNet |
| Behavior Prediction | Bayesian Networks, Reinforcement Learning, Graph-based models |
| Localization & Mapping | SLAM (ORB-SLAM2, VINS-Fusion), LiDAR Odometry, Visual-Inertial Odometry (VIO), NDT Matching |

Decision and Control Support

| Task | Common Techniques/Algorithms |
|---|---|
| Collision Avoidance | Rule-based logic, end-to-end RL, MPC (Model Predictive Control), risk-aware planning |
| Path Planning | A*, RRT*, MPC, hybrid A*/MPC, learning-based planners (e.g., conditional imitation learning) |
| Adaptive Cruise Control | PID controllers, fuzzy logic, sensor fusion + prediction-based control |
| Lane-Keeping Assist | Lane geometry + PID, CNN regression, Kalman filtering for lateral control |

Cross-Cutting Techniques

| Function | Techniques/Algorithms |
|---|---|
| Sensor Fusion | Kalman Filter, Extended KF, Bayesian fusion, PointPainting, EPNet, Robust-FusionNet, DeepFuse |
| Domain Adaptation | DAFormer, ADVENT, CyCADA, Style Transfer, adversarial training |
| Weather Robustness | Data augmentation, fog simulation, domain randomization, contrastive learning |
| Real-Time Optimization | TensorRT, model quantization, pruning, knowledge distillation, Edge AI inference |
| Uncertainty Estimation | Monte Carlo Dropout, Deep Ensembles, Bayesian Neural Networks, Evidential DL |
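Among the cross-cutting fusion techniques listed in Table 21, the Kalman filter is the classical baseline. The sketch below shows a deliberately simplified one-dimensional filter that sequentially fuses two noisy range readings (for example, radar and camera) into a single estimate; the noise variances, process noise, and initial values are assumptions chosen for illustration, not a prescribed ARAS implementation.

```python
# Deliberately simplified 1-D Kalman filter that fuses noisy range readings
# from two sensors (e.g. radar and camera). Noise variances, process noise,
# and initial values below are illustrative assumptions.

def kalman_1d(measurements, meas_vars, q=0.5, x0=0.0, p0=100.0):
    """Sequentially fuse (value, variance) pairs into one range estimate."""
    x, p = x0, p0
    for z, r in zip(measurements, meas_vars):
        p += q                # predict: constant state plus process noise
        k = p / (p + r)       # Kalman gain: trust the measurement if r is small
        x += k * (z - x)      # update with the measurement residual
        p *= (1.0 - k)        # updated estimate variance
    return x, p

# Radar (low noise) and a rain-degraded camera (higher noise) both see ~25 m.
estimate, variance = kalman_1d([25.3, 24.1], [0.5, 4.0])
print(round(estimate, 2), round(variance, 3))  # weighted towards the radar reading
```

In practice the measurement variances themselves can be made weather-dependent, which is one simple way the adaptive fusion strategies discussed in Section 7.2 can be realized on constrained motorcycle hardware.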
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
