1. Introduction
Motorcycle safety is a significant concern worldwide, with motorcycle accidents often resulting in severe injuries and fatalities [
1]. Motorcycles represent a prevalent and often economical mode of transportation globally, with a significant presence in Europe, where approximately 25 million were recorded in 2018 [
2]. Despite their utility and popularity, motorcyclists face considerably elevated risks of severe injury and fatality on the roads due to their inherent vulnerability. In 2019, more than 3500 motorcyclists lost their lives on European roads, representing 16% of all road fatalities, a proportion that has slightly increased over the last decade despite a 16% reduction in absolute numbers [
2]. This increased risk of fatalities and injuries can be broadly categorized into human factors (e.g., age, inexperience, speeding, helmet non-use), vehicle-related factors (e.g., lack of safety features, braking instability), and environmental factors (e.g., poor road infrastructure and adverse weather conditions) [
3].
Several factors contribute to the prevalence of motorcycle accidents. A primary concern is that motorcyclists are frequently overlooked by other road users, often due to their smaller visual profile and blind spots [
2]. Additionally, inadequate or poorly maintained road infrastructure, such as uneven surfaces, potholes, and loose gravel or material, poses significant hazards specifically for two-wheeled vehicles [
2,
4]. Rider-specific factors also play a critical role; these include inexperience, particularly among younger riders, excessive speed, and impairment due to alcohol or drugs [
2]. Fatigue and risky riding behaviors are further elements that can compromise safety [
3].
Efforts to improve motorcycle safety have seen advancements in vehicle technology, with anti-lock braking systems notably contributing to a reduction in accidents [
2]. However, beyond technological improvements, a holistic approach is deemed essential for addressing the multifaceted safety challenges confronting powered two- and three-wheeler users globally. This includes promoting safer road infrastructure, improving enforcement of traffic laws, fostering safer post-crash care, and enhancing public awareness regarding motorcycle safety [
3]. The complex relationship among these factors requires continued and comprehensive strategies to mitigate risks and improve road safety for motorcyclists worldwide.
To address these risks, advanced rider assistance systems (ARAS) have emerged as a promising solution [
5,
6,
7]. ARAS are systems designed to aid the rider during vehicle operation. These systems integrate various technologies designed to enhance rider safety by providing real-time alerts and assistance, such as collision avoidance, blind-spot detection, and adaptive cruise control [5], as well as anti-lock braking systems [8,9,10,11,12]. By offering these features, ARAS aims to reduce human error, which is the leading cause of motorcycle accidents.
However, the effectiveness of ARAS is not uniform across all conditions; adverse weather conditions, such as heavy rain or fog, can significantly affect the performance of sensor-based systems integral to ARAS. These environmental factors can reduce the reliability and effectiveness of the assistance systems in real-world scenarios [
13,
14]. Recent research on vehicle speeds showed that all vehicle types traveled at higher speeds under sunny conditions, while significant variations were observed in rainy weather. In particular, motorcycles were more affected by adverse weather, reflecting their greater exposure compared with heavier vehicles [
15]. Adverse weather, such as rain, fog, and snow, presents unique challenges to sensor systems that are central to ARAS [
16]. For instance, optical sensors such as cameras may experience reduced visibility in fog or rain, Radar (radio detection and ranging) may be disrupted by heavy precipitation or snow, and LiDAR (light detection and ranging) systems may face difficulties in low-contrast environments.
In recent years, several reviews have explored the development and potential of ARAS for motorcycles. For example, Savino et al. [
11] provided a systematic review of active safety systems for powered two-wheelers, highlighting the diversity of technologies and the lack of comparative evaluations across systems. Kaye et al. [
17] examined rider beliefs and perceptions toward ARAS using the theory of planned behaviour, offering insights into the psychological and behavioural barriers to adoption. Naweed and Blackman [
18] critically analyzed how ARAS are discussed and marketed by manufacturers, exposing a gap between industry claims and empirical evidence. Other reviews have addressed human factors [
19], safety discourse [
18], and system-level effectiveness through real crash data [
20]. Despite these valuable contributions, prior reviews have typically focused on specific aspects such as rider psychology, industry framing, or single-system performance. This paper extends the literature by providing an integrated, weather-focused evaluation of ARAS, synthesizing information on individual modules, sensor technologies, machine learning algorithms, and public datasets. Uniquely, we structure the review around system suitability under adverse weather conditions, an area that remains underexplored in the existing literature.
This study evaluates and highlights the effectiveness of current ARAS technologies under different weather conditions. It assesses how various sensors, namely LiDAR, Radar, and cameras, perform in rain, fog, and snow scenarios. Additionally, the study identifies which technologies and datasets are best suited to maintaining optimal performance in these conditions. By understanding these limitations, this study provides insights into the most reliable ARAS systems and algorithms for enhancing motorcycle safety across diverse environmental challenges.
The organization of the paper is outlined in
Figure 1 and structured as follows:
Section 2 describes the methodology and data collection process for the review.
Section 3 discusses the different modules of ARAS under various weather conditions, where the main focus is on evaluating rainy, foggy, and snowy conditions for each module.
Section 4 examines ARAS sensor technologies, focusing on LiDAR, Radar, and camera behavior in adverse weather conditions.
Section 5 provides an overview of relevant datasets that are applicable to ADAS or ARAS and have also been evaluated for adverse weather conditions. Furthermore,
Section 6 examines algorithms and techniques that are directly or indirectly related to ARAS, and
Section 7 presents the discussion, analysis, conclusion, and future recommendations.
2. Method
This study employs a qualitative, literature-driven methodology to investigate the performance of ARAS under diverse weather conditions. Rather than following a formal systematic review protocol, this work adopts a thematic review approach: existing research is collected, examined, and synthesized around clearly defined themes rather than organized strictly by chronology or methodology. The emphasis is on identifying patterns, recurring topics, and conceptual categories across studies, which are then grouped and discussed under thematic headings, providing a structured overview of existing work. The scope was limited to ARAS-related research that explicitly considered environmental influences.
Google Scholar, IEEE Xplore, and ScienceDirect were used to identify relevant articles from peer-reviewed journals, conference proceedings, and established datasets in the fields of intelligent transportation systems and computer vision. The selection criteria prioritized studies evaluating ARAS or related advanced driver assistance systems (ADAS) in relation to environmental factors, such as sun, rain, fog, and snow. A special focus was placed on papers presenting empirical results, technology comparisons, or novel methods involving Radar, LiDAR, and camera-based systems. A total of 136 papers were selected. To better understand the evolution of research activity in the ARAS field, the publication years of the reviewed articles were analyzed. As illustrated in
Figure 2, ARAS-related studies remained limited until around 2015, after which a steady growth trend emerged. The number of publications increased significantly from 2019 onward, aligning with the broader adoption of advanced sensing technologies, increased interest in motorcycle safety, and the rising availability of benchmark datasets. The recent surge suggests a transition from feasibility studies toward system integration and evaluation under diverse operational conditions, including adverse weather, underscoring the relevance of this review’s focus.
3. Modules of ARAS Under Various Weather Conditions
ARAS has become a central technology for improving motorcycle safety and rider confidence, particularly in challenging weather conditions. The system integrates a range of technologies, including Radar, LiDAR, cameras, and advanced control algorithms, to support riders with real-time situational awareness and automated responses. Key ARAS functionalities include adaptive cruise control (ACC) for maintaining safe distances, anti-lock braking systems (ABS) to prevent wheel lock-up, and automatic emergency braking (AEB) for rapid collision mitigation. Additional features such as blind-spot detection (BSD), curve warning, front collision warning (FCW), and intersection support systems (ISS) provide proactive hazard recognition, especially under poor visibility or on slippery road surfaces. Lane-keeping assist systems (LKAS) and speed alerts (SAs) help riders maintain vehicle discipline, while stability and traction control systems (SCS and TCS) ensure optimal handling during rain, fog, or snow. These technologies, originating in ADAS, are increasingly central to ARAS, with companies like Robert Bosch GmbH and Continental AG developing radar-based safety features such as adaptive cruise control, collision warning, and blind-spot detection. Motorcycle manufacturers, including BMW Motorrad, Honda, Yamaha, Kawasaki, Ducati, and KTM, are integrating these features into higher-end models. Weather-related challenges pose significant risks to motorcyclists; the role of ARAS in maintaining safety and performance across environmental conditions is therefore increasingly critical and well supported by empirical research and dataset-driven innovations.
3.1. Adaptive Cruise Control Under Various Weather Conditions
ACC systems are designed to maintain a safe vehicle distance and speed automatically, enhancing both comfort and safety during driving. Under optimal weather conditions, such as sunny or dry environments, ACC systems demonstrate reliable performance, with minimal velocity and relative distance errors, less than 0.6% and 0.5%, respectively, thereby ensuring high operational accuracy and passenger comfort [
21].
In rainy weather, system performance slightly degrades due to increased braking distances and reduced tire-road friction. Real-world studies on commercial motor vehicles indicated a modest reduction in ACC usage during rain and a significant increase in following headways to maintain safety [
22]. Simulated environments further validated these data by showing increased travel times and reduced average speeds in rainy scenarios [
23].
Although specific experimental data on foggy conditions were sparse, the literature recognizes fog as a major challenge due to sensor performance limitations. Radar and camera systems, critical components of ACC, may experience degraded object detection in fog, prompting many manufacturers to discourage ACC use in such environments [
21].
Snowy and icy conditions pose the most significant challenges to ACC functionality. Drastically reduced road friction requires extended stopping distances, up to four times greater than on dry roads. However, adaptive control methods such as fuzzy logic and dynamically adjusted reference signals (e.g., Gaussian, sine-based) enable ACC systems to maintain safe operation despite severe weather. These systems modulate target velocity and safe distances in real time, effectively balancing safety and comfort [
21,
24].
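As an illustrative sketch of how such adaptive reference signals might be realized, the following Python fragment computes a friction-scaled safe gap and a Gaussian-weighted target-speed reduction. The friction coefficients, the Gaussian width, and all function names are our own assumptions for illustration, not values or interfaces taken from the cited studies.

```python
import math

# Assumed friction coefficients per weather condition (illustrative only).
MU = {"dry": 0.9, "rain": 0.6, "snow": 0.25}

def safe_gap(speed_mps: float, weather: str, reaction_time: float = 1.5) -> float:
    """Safe following distance: reaction distance plus the friction-limited
    braking distance v^2 / (2*mu*g), which grows as friction drops."""
    mu = MU[weather]
    g = 9.81
    return speed_mps * reaction_time + speed_mps ** 2 / (2 * mu * g)

def gaussian_target_speed(v_set: float, dist: float, gap: float,
                          width: float = 10.0) -> float:
    """Smoothly reduce the target speed as the measured distance approaches
    the safe gap, using a Gaussian-shaped weighting instead of a hard switch."""
    if dist >= gap + 3 * width:
        return v_set
    shortfall = max(gap - dist, 0.0)
    scale = math.exp(-((shortfall / width) ** 2) / 2)
    return v_set * scale
```

In this sketch, switching the `weather` key from "dry" to "snow" roughly quadruples the braking-distance term, mirroring the stopping-distance increase reported above.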
Table 1 provides a summary of ACC performance under different weather conditions.
3.2. ABS Performance Under Different Weather Conditions
ABS exhibits variable performance depending on road surface conditions, which are influenced by weather. Under sunny and dry conditions, ABS systems perform optimally. They help maintain vehicle stability, minimize wheel lock, and reduce stopping distances [
25]. Enhanced friction from both the tire-road interface and high-performance brake materials like carbon-fiber ceramic discs further boosts ABS effectiveness in such scenarios [
25].
In rainy weather, ABS remains beneficial but with diminished effectiveness due to a significant drop in tire-road friction caused by water films. While ABS-equipped vehicles still outperform non-ABS ones, studies report increased stopping distances and late-phase wheel lock during braking on wet asphalt [
25]. Nonetheless, ABS has been shown to reduce crash involvement significantly, with injury reductions reported as high as 57–60% in wet road conditions [
26].
Foggy conditions were not explicitly tested in the analyzed studies. However, given that fog often results in damp road surfaces, ABS performance can be inferred to follow patterns similar to rainy conditions, providing continued benefits through improved steering control, albeit with some performance loss due to moisture-related traction reductions.
On snowy or icy roads, ABS effectiveness is most compromised. The significantly lower friction and unstable slip conditions on these surfaces increase stopping distances, even with ABS activated.
Some studies show that ABS can become less efficient, or even detrimental, under such conditions if not paired with additional systems such as traction control or adaptive braking logic [27,28]. Table 2 summarizes the effectiveness of ABS systems under different weather conditions.
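A minimal sketch of the slip-based logic underlying ABS may help clarify why low-friction surfaces are problematic: the controller regulates wheel slip around a target band, and on snow or ice the usable band narrows while slip estimates become noisy. The target and band values below are assumed for illustration only.

```python
def slip_ratio(v_vehicle: float, omega_wheel: float, r_wheel: float) -> float:
    """Braking slip ratio: 0 when the wheel rolls freely, 1 when fully locked."""
    if v_vehicle <= 0:
        return 0.0
    return max(0.0, (v_vehicle - omega_wheel * r_wheel) / v_vehicle)

def abs_command(slip: float, target: float = 0.15, band: float = 0.05) -> str:
    """Bang-bang ABS logic: release brake pressure when slip overshoots the
    target band, hold inside the band, and increase when slip is too low."""
    if slip > target + band:
        return "release"
    if slip < target - band:
        return "increase"
    return "hold"
```

On icy surfaces the peak-friction slip target would sit lower and the band tighter, which is precisely why unmodified ABS logic degrades there.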
3.3. Automatic Emergency Braking Systems in Different Weather Conditions
AEB systems are a critical component of modern vehicle safety technology. Their performance, however, is notably influenced by environmental conditions, particularly weather. This section synthesizes findings from recent studies to evaluate how AEB systems perform in sunny, rainy, foggy, and snowy conditions.
Under sunny weather, AEB systems generally perform at optimal levels. Sensor visibility is clear, and traction is high, which supports precise object detection and effective braking response [
29].
In rainy weather, performance begins to degrade. Camera and LiDAR sensors experience reduced reliability due to water interference, and lower road adhesion leads to increased braking distances. While AEB systems still offer some crash mitigation, particularly for pedestrians, their effectiveness for bicyclist-related incidents is not statistically significant [
29,
30].
Foggy conditions pose even greater challenges, as dense fog interferes with sensor accuracy, especially optical systems. Although some crash risk reduction has been observed in pedestrian scenarios, the effectiveness remains inconsistent and largely ineffective for bicyclist incidents [
30].
In snowy environments, AEB performance is significantly compromised. Snow reduces both traction and sensor visibility, particularly for cameras and LiDAR. Unless systems are specifically designed or adapted to low-friction conditions, braking performance suffers greatly. However, research indicates that driver-trust and safety perceptions can be improved when AEB systems incorporate snow-adaptive braking algorithms [
31].
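The friction dependence of AEB can be illustrated with a simple stopping-distance model: the system must trigger earlier as the friction coefficient drops. The reaction time, safety margin, and function names below are assumptions for illustration, not parameters of any cited system.

```python
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance(v: float, mu: float, t_react: float = 0.3) -> float:
    """System latency distance plus friction-limited braking distance."""
    return v * t_react + v ** 2 / (2 * mu * G)

def should_brake(range_m: float, v_rel: float, mu: float,
                 margin: float = 2.0) -> bool:
    """Fire AEB when the remaining range drops below the achievable
    stopping distance plus a safety margin."""
    return range_m <= stopping_distance(v_rel, mu) + margin
```

With a closing speed of 20 m/s, the sketch brakes at roughly four times the range on packed snow (mu ≈ 0.2) than on dry asphalt (mu ≈ 0.8), matching the qualitative degradation described above.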
Table 3 provides an overview of the AEB system performance under various weather conditions.
3.4. Blind-Spot Detection Systems Under Different Weather Conditions
BSD systems demonstrate strong performance in clear weather but face various degrees of degradation under adverse conditions. In sunny conditions, vision-based systems utilizing optical flow, feature extraction, and convolutional neural networks consistently perform with high accuracy, often exceeding 95% detection rates [
32,
33,
34].
Under rainy conditions, the performance of purely camera-based systems deteriorates due to visual noise from water droplets, reflections, and reduced contrast [
34,
35]. However, sensor fusion systems integrating Radar and cameras show greater resilience, compensating for visual weaknesses with Radar robustness [
35,
36].
Foggy weather was less commonly tested directly, but vision-based systems are expected to underperform due to low visibility and contrast loss. Radar and LiDAR may help mitigate these issues, although their effectiveness can vary with fog density [
35].
Snowy conditions pose significant challenges for both Radar and vision systems. Snow introduces signal attenuation and environmental clutter for Radar and obscures visual cues for cameras. Most single-sensor systems are insufficient under such conditions, reinforcing the need for multi-sensor redundancy and adaptive algorithms [
35].
Overall, BSD systems perform best in sunny conditions and can maintain acceptable performance in rain when sensor fusion is used. Fog and snow remain problematic and require further research into adaptive sensing strategies and robust environmental perception.
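A hedged sketch of the sensor-fusion idea discussed above: detection scores from the camera and Radar are combined with weather-dependent weights so that Radar dominates as visibility degrades. The weights and alert threshold are purely illustrative assumptions.

```python
# Assumed per-sensor reliability weights by weather (illustrative only).
CAMERA_W = {"sunny": 0.9, "rain": 0.5, "fog": 0.3, "snow": 0.2}
RADAR_W  = {"sunny": 0.8, "rain": 0.75, "fog": 0.7, "snow": 0.5}

def fused_confidence(cam_score: float, radar_score: float, weather: str) -> float:
    """Weighted average of detection scores, discounting the camera as
    visibility degrades so the Radar channel dominates in adverse weather."""
    wc, wr = CAMERA_W[weather], RADAR_W[weather]
    return (wc * cam_score + wr * radar_score) / (wc + wr)

def blind_spot_alert(cam_score: float, radar_score: float, weather: str,
                     threshold: float = 0.5) -> bool:
    """Raise a blind-spot alert when the fused confidence clears a threshold."""
    return fused_confidence(cam_score, radar_score, weather) >= threshold
```

In snow, a confident Radar return can still trigger an alert even when the camera score collapses, which is the redundancy argument made in the text.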
3.5. Curve Warning System Performance Under Different Weather Conditions
Curve warning systems, as part of broader ARAS, demonstrate varying levels of effectiveness across different weather conditions. Under sunny or clear conditions, these systems are generally assumed to function optimally, as environmental interference is minimal. While no specific performance limitations were identified, user studies primarily focused on adverse conditions, offering limited insights into their relative benefit during ideal weather [
17].
In rainy conditions, riders consistently acknowledged the utility of systems like ABS, traction control, and curve warnings. These features enhance motorcycle control and safety by mitigating slip risks and improving braking and cornering reliability on wet surfaces. Riders found ARAS particularly reassuring in rain, though concerns about potential overreliance and system malfunction were also noted [
17].
Foggy conditions presented a significant context for performance testing. Simulator-based experiments revealed that curve warning systems integrated with Head-Up Displays (HUDs), particularly when paired with auditory alerts, substantially reduced lane departure and speed deviations in dense fog. These systems were especially effective for female and older riders, offering meaningful improvements in situational awareness and control [
37].
In snowy or icy conditions, systems targeting icy-curve warnings were found to be beneficial for hazard anticipation and reaction. Early alerts improved rider responses, though stability concerns arose with abrupt automated interventions like autonomous emergency braking. These findings underscore the need for carefully timed and user-transparent warnings to maintain rider control [
19].
Table 4 provides the performance summary of the curve warning system in different weather conditions.
3.6. FCW System Performance Under Different Weather Conditions
FCW systems demonstrate varying levels of performance depending on environmental conditions, particularly weather. Under sunny conditions, systems generally exhibit optimal detection accuracy. Chou et al. [
38] reported detection rates exceeding 90% in both sunny and cloudy environments, reflecting the effectiveness of vision-based detection under high-visibility conditions.
In rainy conditions, performance moderately declines due to the reduction in visibility and interference caused by rain droplets and road spray. Despite this, Chou et al. [
38] observed detection rates over 80%, suggesting resilience to moderate visual distortion. Pan et al. [
39] similarly noted a drop in recognition accuracy from ~91% to ~81.9% when transitioning from clear to rainy weather, alongside increased computational demand.
Foggy conditions pose a significant challenge to FCW systems, especially those reliant on vision or Radar. Several studies have demonstrated that fog can severely impair detection accuracy and reduce warning times. Chen et al. [
40] introduced the ViCoWS system, which adjusted its warning horizon based on visibility and extended the warning time up to 4.5 s under dense fog (120 m visibility). Additionally, Zhang et al. [
41] developed a low-visibility FCW algorithm that significantly reduced collision risk compared to standard systems, with a reported improvement of up to 10.88 times in fog scenarios.
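The visibility-adaptive warning horizon can be sketched as a simple scaling rule: the lead time grows as visibility shrinks, up to a cap. The 4.5 s cap echoes the value reported for ViCoWS at 120 m visibility, but the inverse-visibility scaling law, reference visibility, and base lead time below are assumptions for illustration.

```python
def warning_lead_time(visibility_m: float, base_s: float = 2.0,
                      ref_visibility_m: float = 500.0,
                      max_s: float = 4.5) -> float:
    """Extend the forward-collision warning horizon as visibility drops,
    capped at a maximum lead time."""
    factor = max(ref_visibility_m / max(visibility_m, 1.0), 1.0)
    return min(base_s * factor, max_s)
```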
In contrast, snowy conditions remain less explored within the current literature. While snow presents similar visual and sensor interference challenges as fog and rain, no empirical performance data specific to snow were found among the reviewed studies. It is reasonable to infer that performance would similarly degrade unless systems are explicitly designed to compensate for snow-induced distortions. The system performance is summarized in
Table 5.
3.7. Intersection Support Systems Under Different Weather Conditions
ISS, particularly those designed for powered two-wheeled vehicles and vehicular ad hoc networks (VANETs), show promising performance under ideal or sunny conditions. Evaluations performed in simulated or controlled environments demonstrate the systems’ ability to assess threats effectively using receding horizon control and onboard sensors [
42]. These systems offer accurate real-time assessments, effective rider feedback mechanisms, and high user acceptance in clear weather [
42,
43].
However, the performance under adverse weather conditions such as rain, fog, and snow is largely untested in the reviewed literature, as shown in
Table 6. While the VANET-based frameworks acknowledge environmental and sensor reliability challenges [44], there is a lack of explicit modeling or testing under such conditions. Rain may degrade the accuracy of visual sensors and laser scanners, while fog introduces visibility constraints that could impact non-cooperative systems. Snow poses more severe limitations due to both environmental occlusions and reduced vehicle maneuverability, factors that challenge the assumptions made in current dynamic models and control strategies [44].
Although technologies like Radar and DSRC communications are better suited for adverse weather, the current systems have not incorporated or tested such adaptations. This indicates a need for expanded validation across realistic, weather-diverse scenarios to ensure robust and reliable system performance in all conditions [
43,
44].
3.8. Lane-Keeping Assist System Under Different Weather Conditions
LKAS demonstrates high reliability in sunny weather, benefiting from optimal visibility and well-marked lanes. Under such conditions, lane detection algorithms achieve accuracy rates above 98%, making them highly effective for real-time applications [
45].
However, performance begins to degrade in adverse weather. In rainy conditions, LKAS effectiveness diminishes due to factors like reduced contrast and water interference on sensors. Despite this, systems with enhanced image processing, such as those using convolutional neural networks (CNNs) or histogram equalization, retain moderate accuracy [
45].
Foggy conditions present a greater challenge, significantly impairing system reliability. Visibility is reduced, and lane markings become indistinct, which affects both human and algorithmic lane-keeping capabilities. While systems employing optical flow and contextual regularization offer some improvement, the overall performance remains suboptimal [
46].
Snowy conditions are among the most problematic for LKAS. Snow often covers lane markings entirely, leading to substantial degradation in detection accuracy. Nonetheless, a few robust algorithms manage to extract features despite the snow cover, though their success rates are still notably lower compared to clear weather scenarios [
45]. The overall performance is summarized in
Table 7.
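As an illustration of the contrast-enhancement preprocessing mentioned above, the following NumPy-only sketch performs classic histogram equalization, the kind of step that can partially restore lane-marking contrast in rainy or foggy frames before lane detection runs. It is a textbook technique, not the specific pipeline of any cited system.

```python
import numpy as np

def equalize_histogram(gray: np.ndarray) -> np.ndarray:
    """Classic histogram equalization for an 8-bit grayscale image:
    intensities are remapped through the normalized cumulative histogram
    so a low-contrast frame spreads over the full 0-255 range."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    if cdf[-1] == cdf_min:  # constant image: nothing to equalize
        return gray.copy()
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[gray]  # apply the lookup table pixel-wise
```

A frame whose intensities cluster in a narrow band (as under fog) is stretched to the full dynamic range, after which gradient-based lane-marking detectors have stronger edges to work with.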
3.9. Traffic Sign Detection Systems Under Weather Conditions
Traffic sign detection systems, particularly those used for “speed alert” functions, demonstrate varying levels of effectiveness depending on weather conditions. In sunny weather, most systems perform at their best due to high visibility and clear contours. For example, YOLOv3-based systems achieve high average precision (AP) and low false negatives in clear conditions [
47]. Similarly, DFR-TSD and improved YOLOv5-based models report excellent recall and precision under sunny scenarios [
48,
49].
In rainy conditions, performance consistently drops across all models due to visibility reduction and motion blur. However, the DFR-TSD framework’s enhancement module and the YOLOv5-based system with coordinate attention (CA) show significant robustness and improved detection rates despite these challenges [
48,
49].
Foggy environments pose the greatest challenge for detection models, particularly due to image blurring and low contrast. DPF-YOLOv8, which was explicitly trained on fog-augmented datasets, improves mean average precision (mAP) by over 2% compared to standard YOLOv8, demonstrating superior adaptation to hazy weather [
50]. Similarly, DFR-TSD’s modular architecture enhances sign visibility, reducing false negatives even in dense fog [
48].
Under snowy conditions, the detection accuracy is moderately affected. While visibility is sometimes compromised, models like DFR-TSD and improved YOLOv5 maintain high precision and relatively stable recall due to robust feature extraction techniques [
48,
49]. The summarized performance of the system is illustrated in
Table 8.
3.10. Stability Control System Performance
SCS, including technologies such as ABS, TCS, and motorcycle stability control (MSC), demonstrates varying levels of effectiveness depending on environmental conditions.
In sunny weather, these systems perform optimally. Favorable conditions, such as dry and warm roads, have been associated with increased motorcycle use and extended riding seasons, though not with any reduction in system performance [
51]. This suggests that under dry conditions, these technologies function as designed, providing reliable support for vehicle control and crash avoidance.
In rainy conditions, performance may be compromised. The effectiveness of stability-related ADAS can be diminished due to decreased traction and sensor limitations caused by precipitation [
51,
52]. While systems like ABS offer significant benefits, their full potential can be impaired by real-world conditions, such as wet surfaces.
Under foggy weather, system performance is further impacted. Detection systems like Radar and cameras may suffer reduced visibility, affecting the timely activation of emergency interventions [
52]. Although direct data on fog-related motorcycle stability is limited, modeling adjustments in ADAS performance confirm these limitations.
In snowy weather, performance is least reliable. Snow creates severe traction challenges and may obscure sensor functionality entirely. As a result, the reduction in crash potential attributed to these systems is expected to be significantly lower, and no empirical motorcycle-specific data are yet available [
52].
Table 9 summarizes the performance of the system in adverse weather conditions.
3.11. Traction Control System Performance Across Different Weather Conditions
TCS performance varies significantly depending on environmental conditions. Under sunny conditions, TCS operates in its optimal state due to high road–tire friction. These conditions are frequently used as benchmarks in modeling and control design, where the system can maintain stability with minimal wheel slip [
53].
In rainy conditions, TCS performance is moderately affected. Although road–tire friction is reduced, the damping effect on engine-to-slip dynamics can assist in maintaining control, assuming the control system is properly tuned for such transitions [
54]. However, timely torque modulation becomes more critical.
Under foggy conditions, while road traction may remain relatively unchanged, sensor reliability, particularly for vision-based or sensor fusion-dependent systems, declines. This can hinder the TCS’s ability to accurately assess traction and react effectively, highlighting a need for robust perception algorithms [
55].
In snowy conditions, the challenges are most severe. The road–tire friction is drastically lowered, and maintaining a stable slip ratio becomes increasingly difficult. Advanced control strategies, such as model predictive control or machine learning-based methods, are often required to mitigate excessive slip and loss of stability [
53,
54]. The system performance is evaluated in
Table 10.
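The slip-ratio regulation at the core of TCS can be sketched as follows; the per-surface target slip values and the proportional gain are illustrative assumptions, not parameters from the cited work.

```python
# Assumed target slip ratios per surface (illustrative only): lower-friction
# surfaces tolerate less drive slip before stability is lost.
TARGET_SLIP = {"dry": 0.15, "wet": 0.10, "snow": 0.05}

def drive_slip(v_vehicle: float, omega_wheel: float, r_wheel: float) -> float:
    """Acceleration slip ratio: 0 when rolling freely, 1 when spinning in place."""
    v_wheel = omega_wheel * r_wheel
    if v_wheel <= 0:
        return 0.0
    return max(0.0, (v_wheel - v_vehicle) / v_wheel)

def torque_scale(slip: float, surface: str, gain: float = 4.0) -> float:
    """Proportionally cut engine torque when slip exceeds the surface target;
    returns a multiplier in [0, 1] applied to the requested torque."""
    excess = slip - TARGET_SLIP[surface]
    if excess <= 0:
        return 1.0
    return max(0.0, 1.0 - gain * excess)
```

The same measured slip that is acceptable on dry asphalt triggers a sharp torque cut on snow, which is why surface estimation (and hence perception reliability) matters so much for TCS tuning.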
The analysis of ARAS modules under different weather conditions described in
Section 3 is illustrated in the heatmap. The heatmap displays the performance of ARAS across various weather conditions, sunny, rainy, foggy, and snowy, utilizing normalized performance levels as indicated in
Table 11. These normalized terms reflect the systems’ relative effectiveness in providing reliable support in adverse environments.
From
Figure 3, it is clear that ACC, LKAS, and the stability control system (SCS) maintain relatively high performance across most weather conditions. For example, ACC shows Excellent performance in sunny conditions and sustains at least moderate performance in rain and fog, outperforming more weather-sensitive systems like AEB, which drops to Poor in snowy conditions.
The ISS module is notably vulnerable to extreme weather, becoming unusable/unknown in snowy conditions, likely due to the complexity of decision-making at intersections under reduced visibility or occluded road markings.
In terms of robustness, SCS and ACC are among the most consistent performers across all conditions, while AEB and ISS show significant degradation, especially in snow, the most challenging weather condition overall.
This analysis suggests that while many ARAS components are effective in favorable weather, there is a need for improvement in perception and control systems to handle low-visibility or low-traction environments, particularly for systems that rely heavily on camera input.
6. Algorithms and Techniques
Machine learning plays a central role in advanced rider assistance systems (ARAS), enabling key perception and decision-making capabilities required for safe operation in complex environments. The primary tasks involved in environmental perception include object detection (identifying the presence and location of relevant objects), segmentation (delineating object boundaries or regions), and classification (assigning semantic labels to objects or scene elements) [
133]. These tasks are typically addressed using different machine learning paradigms, each suited to different aspects of the problem, as shown in
Table 20.
Broadly, machine learning can be categorized into three main types: supervised learning, where models are trained on labelled datasets to learn mappings between inputs and outputs; unsupervised learning, which extracts structure from unlabelled data, supporting clustering or anomaly detection; and reinforcement learning, in which agents learn optimal actions through interaction and feedback, applicable in adaptive behaviour and control strategies [
134,
135]. Understanding these paradigms provides a foundation for evaluating the algorithmic approaches used in ARAS systems, especially when addressing the challenges posed by adverse weather and limited computing resources.
ARAS and ADAS rely on a layered pipeline of perception, interpretation, and control to ensure safe operation across diverse and dynamic environments. At the foundation is perception, where sensors and algorithms detect, segment, and classify environmental elements. The next layer is interpretation and understanding, which adds semantic context, such as predicting object trajectories and understanding scene layout. Finally, decision and control support uses these interpreted data to assist with real-time actions such as collision avoidance or path planning. Supporting all these layers are cross-cutting techniques like sensor fusion, uncertainty estimation, and real-time optimization, which enhance reliability and robustness, especially in challenging conditions, as shown in
Table 21.
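The layered structure described above can be sketched as a minimal processing pipeline. The stage names, the stubbed perception output, and the simple time-to-collision rule are illustrative assumptions rather than an actual ARAS implementation; real systems run learned models at each stage.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Output of the perception layer: a classified object with range."""
    label: str
    distance_m: float
    closing_speed_mps: float  # positive = approaching the rider

def perceive(raw_frame):
    """Perception layer: detect and classify objects (stubbed with toy data)."""
    # A real system would run detection/segmentation on raw sensor data here.
    return [Detection(**obj) for obj in raw_frame]

def interpret(detections, horizon_s=2.0):
    """Interpretation layer: estimate time-to-collision for approaching objects."""
    threats = []
    for d in detections:
        if d.closing_speed_mps > 0:
            ttc = d.distance_m / d.closing_speed_mps
            if ttc < horizon_s:
                threats.append((d.label, ttc))
    return threats

def decide(threats):
    """Decision layer: issue a warning if any threat is imminent."""
    return "BRAKE_WARNING" if threats else "NO_ACTION"

frame = [
    {"label": "car", "distance_m": 12.0, "closing_speed_mps": 8.0},   # TTC 1.5 s
    {"label": "sign", "distance_m": 30.0, "closing_speed_mps": -1.0}, # receding
]
print(decide(interpret(perceive(frame))))  # -> BRAKE_WARNING
```

Keeping the layers as separate functions mirrors the pipeline's separation of concerns: each stage can be replaced (e.g., swapping a rule-based interpreter for a learned trajectory predictor) without disturbing the others.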
7. Discussion and Analysis
This study provides a comprehensive evaluation of ARAS under diverse environmental conditions, with a particular focus on weather variability, a critical factor in ensuring motorcyclist safety and ARAS reliability. The motivation behind this work stems from the pressing need to understand how current ARAS solutions perform in real-world, adverse weather scenarios, and to identify technological and data-driven gaps that must be addressed to advance safety in two-wheeled transportation. The findings confirm that while ARAS modules such as ACC, ABS, AEB, and BSD perform reliably in clear weather, their performance declines in fog, snow, and heavy rain, particularly for systems dependent on optical sensors or visual recognition. This observation highlights not only the sensitivity of perception in adverse environments but also the limitations of benchmarking ARAS primarily under controlled conditions.
Subsequently, the paper shifts focus to the enabling sensor technologies (RADAR, LiDAR, and camera systems), evaluating their individual and integrated performance in adverse conditions. RADAR maintains operational robustness in poor visibility, yet its relatively low resolution limits its usefulness for fine-grained detection and curve warning functions. By contrast, LiDAR and cameras deliver richer semantic information but are disproportionately affected by occlusion, light scattering, and precipitation. These complementary strengths and weaknesses indicate that sensor fusion is not a secondary improvement but a necessary design choice for ARAS. However, the transfer of sensor fusion strategies from car-based ADAS to motorcycles is not straightforward due to differences in dynamics, mounting constraints, and exposure to environmental noise.
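The complementarity argued for above can be illustrated with a minimal inverse-variance fusion sketch. The sensor values and variances below are invented for illustration, and the estimator is a generic weighted average, not any specific ARAS fusion architecture; it simply shows how a sensor that degrades in fog is automatically down-weighted when its reported uncertainty grows.

```python
import numpy as np

def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of independent range estimates.

    estimates: list of (value, variance) pairs from different sensors.
    Returns the fused value and its variance. Sensors reporting higher
    uncertainty (e.g., a camera in fog) receive proportionally less weight.
    """
    values = np.array([v for v, _ in estimates])
    variances = np.array([var for _, var in estimates])
    weights = 1.0 / variances
    fused_var = 1.0 / weights.sum()
    fused_val = fused_var * (weights * values).sum()
    return fused_val, fused_var

# Clear weather: RADAR and camera are both confident and agree.
clear_val, clear_var = fuse_estimates([(25.0, 0.5), (25.4, 0.5)])

# Fog: the camera's variance grows sharply, so RADAR dominates the result.
fog_val, fog_var = fuse_estimates([(25.0, 0.5), (40.0, 50.0)])

print(round(clear_val, 2))  # midpoint of the two estimates: 25.2
print(round(fog_val, 2))    # pulled close to the RADAR value of 25.0
```

The fused variance is always lower than that of the best individual sensor, which is the formal sense in which fusion is "not a secondary improvement": even a degraded sensor still contributes information rather than noise, provided its uncertainty is estimated honestly.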
Recognizing the pivotal role datasets play in both system training and performance benchmarking, the paper further explores datasets used across the ADAS and ARAS domains. Well-known resources such as KITTI, nuScenes, and RADIATE remain valuable for perception research, but they are car-centric and fail to capture motorcycle-specific dynamics such as roll angle, vibration, and limited sensor installation space. Although emerging ARAS datasets developed through academic–industry collaborations mark an important step forward, they are still limited in scale and diversity. As a result, algorithm development and performance benchmarking remain constrained, and there is an urgent need for large-scale, standardized datasets that reflect the realities of two-wheeled transportation in adverse weather conditions.
Lastly, the analysis of algorithms and techniques illustrates that deep learning approaches, especially with LiDAR input, can significantly enhance perception under low visibility conditions. However, their heavy reliance on large, balanced datasets makes them prone to biases, an issue amplified by the lack of ARAS-specific data. Radar-based methods remain stable in rain but struggle with high-resolution classification of small or fast-moving targets. These limitations illustrate the trade-offs between robustness and resolution that continue to shape ARAS development and emphasize the importance of evaluating algorithms under conditions that mirror real-world complexity rather than controlled benchmarks.
7.1. Conclusions
This study examined the role of advanced rider assistance systems (ARAS) in enhancing motorcycle safety, with a particular focus on performance under varying weather conditions. The discussion highlighted different ARAS modules and their operational principles, emphasizing the limitations of sensor performance in adverse environments such as rain, fog, and snow. Addressing these challenges requires the use of diverse datasets that capture a wide range of weather scenarios, enabling effective benchmarking of ARAS modules through criteria that balance data quality, environmental diversity, and applicability to two-wheeler dynamics. In addition, a range of algorithms and techniques were reviewed, with links to recommended datasets and software tools to support practical deployment. By integrating insights on ARAS modules, sensor limitations, datasets, and algorithms, this study provides a structured foundation for advancing research in challenging weather contexts and contributes to the development of robust, weather-resilient ARAS solutions that enhance rider safety across diverse environmental conditions.
7.2. Future Recommendations
As ARAS technology advances toward real-world deployment, several critical challenges remain. One of the most pressing issues is ensuring reliable system performance in diverse and adverse weather conditions, where sensor degradation and data noise significantly affect perception accuracy. While this review highlights the benefits of radar, LiDAR, and camera fusion, deploying these multi-sensor systems on motorcycles presents unique constraints related to space, cost, and power efficiency. Additionally, deep learning models used for sensor fusion often require large datasets and high computational resources, and struggle with real-time inference, especially under changing environmental conditions. Future ARAS development must focus on creating lightweight, adaptive algorithms capable of processing fused data under strict latency and hardware limitations. Emerging approaches such as self-supervised learning, edge AI, and multimodal transformer architectures offer promising directions for improving system generalization, robustness, and efficiency. At the same time, mass adoption will depend on addressing broader integration and regulatory challenges, including standardization of sensor protocols, fail-safe mechanisms, and ensuring user trust in automated interventions. Moving forward, ARAS research must not only refine technical performance but also align with deployment realities and safety-critical requirements in the motorcycle domain.