Article

Development of a Networked Multi-Participant Driving Simulator with Synchronized EEG and Telemetry for Traffic Research

by Poorendra Ramlall, Ethan Jones and Subhradeep Roy *
Department of Mechanical Engineering, Embry-Riddle Aeronautical University, 1 Aerospace Blvd, Daytona Beach, FL 32114, USA
* Author to whom correspondence should be addressed.
Systems 2025, 13(7), 564; https://doi.org/10.3390/systems13070564
Submission received: 21 May 2025 / Revised: 1 July 2025 / Accepted: 2 July 2025 / Published: 10 July 2025
(This article belongs to the Special Issue Modelling and Simulation of Transportation Systems)

Abstract

This paper presents a multi-participant driving simulation framework designed to support traffic experiments involving the simultaneous collection of vehicle telemetry and cognitive data. The system integrates motion-enabled driving cockpits, high-fidelity steering and pedal systems, immersive visual displays (monitor or virtual reality), and the Assetto Corsa simulation engine. To capture cognitive states, dry-electrode EEG headsets are used alongside a custom-built software tool that synchronizes EEG signals with vehicle telemetry across multiple drivers. The primary contribution of this work is the development of a modular, scalable, and customizable experimental platform with robust data synchronization, enabling the coordinated collection of neural and telemetry data in multi-driver scenarios. The synchronization software developed through this study is freely available to the research community. This architecture supports the study of human–human interactions by linking driver actions with corresponding neural activity across a range of driving contexts. It provides researchers with a powerful tool to investigate perception, decision-making, and coordination in dynamic, multi-participant traffic environments.

1. Introduction

Driving simulators have emerged as a powerful alternative to real-world experiments for studying driver behavior, vehicle dynamics, and traffic interactions. Their advantages are twofold: they ensure participant safety, and they offer unparalleled control over experimental conditions [1]. In contrast to on-road testing, simulators enable the reproduction of rare or dangerous scenarios—such as sudden merging, near-miss events, or collisions—without risk to human life [2]. Furthermore, in a simulator-based setup, researchers can precisely manipulate variables like the number and behavior of surrounding vehicles, road geometry, weather conditions, and even the time of day, thereby eliminating confounding effects that often complicate field studies [2,3,4,5]. However, a critical challenge remains in determining how closely participant behavior mimics real-world driving [6]. Drivers may behave differently if the simulation environment lacks realistic sensory cues. To address this, modern simulators rely on immersive technologies such as head-mounted VR headsets, motion platforms that simulate inertial feedback, and visually convincing physics engines that govern vehicle behavior [7,8,9]. The convergence of gaming hardware and high-fidelity simulation software has significantly reduced the cost barrier. For example, commercial racing platforms like Assetto Corsa now offer realistic vehicle physics, advanced rendering, and customization options suitable for academic use [10].
While these developments support naturalistic driving, a major limitation persists: most driving simulators are single-participant systems. In such setups, a human driver operates in a virtual world surrounded by simulated vehicles whose behaviors follow predefined, non-adaptive scripts. This restricts interaction to a unidirectional dynamic, where the human driver reacts to static or rule-based traffic, but the surrounding vehicles do not respond to the driver’s behavior in any meaningful or realistic way [11,12,13,14]. As a result, these systems fail to capture the rich, bidirectional human–human interactions observed in real-world traffic—such as mutual negotiation during lane changes, merging behavior, or implicit signaling between drivers. The literature identifies three dominant approaches to studying driver interactions: simulation models that assume driver behavior through mathematical or rule-based formulations [15,16]; conventional simulators that include only one human driver surrounded by scripted traffic [11,17,18]; and naturalistic studies conducted on real roads with human drivers [19,20,21]. While the first two approaches lack authenticity in capturing human–human interactions, the third introduces safety risks and uncontrollable environmental variables [11,21,22,23]. To overcome these limitations, networked driving simulators (Figure 1) have emerged, connecting multiple human drivers to a shared virtual environment. These systems enable dynamic, reciprocal behavior between drivers, creating opportunities to study how complex traffic interactions unfold when all participants are real people [4,11,12,24,25]. Importantly, drivers tend to behave more attentively and naturally when surrounded by other human drivers than when surrounded by simulated traffic; without this human presence, it is difficult to draw scientifically sound conclusions about interactive driving behavior. Despite their growing relevance, very few studies have used networked simulators to collect data for building realistic models of multi-human driving interaction.
Driving simulators not only allow for the observation of behavioral responses but also provide a rich stream of vehicle telemetry, such as speed, acceleration, steering angle, brake pressure, and throttle input. These data can be supplemented with physiological measures to gain insight into drivers’ internal states. One of the most powerful tools in this context is electroencephalography (EEG), which captures brain activity with high temporal resolution and has been widely used to study attention, cognitive workload, and stress during dynamic tasks [26,27]. Traditional EEG systems, designed for clinical use, are often bulky and impractical for simulator environments, particularly those incorporating motion platforms. For such applications, dry scalp EEG systems—which require no conductive gel and can be worn comfortably—are preferable, despite their reduced signal fidelity [28,29,30]. Integrating EEG into a multi-participant simulator, however, introduces significant technical challenges related to data synchronization. Each participant’s EEG and driving data are captured using separate software platforms, such as Assetto Corsa for driving telemetry and a dedicated system for EEG signals. These must be precisely synchronized to a common time base to allow for their valid interpretation. This complexity is compounded when scaling to multiple participants, where synchronization must be ensured not just within each participant’s data but also across all participants involved in the experiment. Such alignment is crucial when studying collaborative or competitive driving tasks—like highway merging or overtaking—where researchers may wish to compare the timing of brake pressure and neural activity across individuals. In this paper, we present, for the first time, a comprehensive framework for a multi-participant networked driving simulator integrated with dry EEG, with the full synchronization of behavioral and neural data streams. We introduce custom-built software that synchronizes data acquisition across multiple systems and participants, making it significantly easier for other researchers to adopt this cost-effective and scalable approach.

2. System Architecture

The experimental system consists of a set of immersive, networked driving simulators designed to study human driving behavior and cognitive activity in a shared virtual environment. Each simulator functions as an independent node within a multi-participant traffic simulation, allowing for multiple participants to drive simultaneously while their driving telemetry and EEG data are recorded in real time. This section describes the core components of the system, including the hardware and software used in each simulator and the EEG apparatus.

2.1. Driving Simulator Configuration

The driving simulator used in this study is designed to mimic the physical and perceptual cues of real-world driving as closely as possible, while remaining modular and cost-effective. Each simulator (Figure 2) includes the following: a physical driving cockpit (seat and frame), a motion platform to provide inertial feedback, a steering wheel and pedal set, a visual interface (either a high-resolution curved monitor or a VR headset), the Assetto Corsa simulation software, and a dry-electrode EEG headset paired with a trigger hub. The simulators are connected via a local server running Assetto Corsa, allowing participants to interact with one another in a shared virtual scenario. While the configuration supports flexibility in hardware selection, such as different wheelbases or display types, compatibility with Assetto Corsa must be ensured. Together, these elements create an immersive and interactive driving environment where naturalistic driver behavior can be observed and recorded. The system is built with flexibility and reproducibility in mind, allowing for individual components to be modified or upgraded based on experimental needs and participant preferences.

2.1.1. Driving Cockpit

The driving cockpit serves as the physical foundation of the simulator. It consists of a rigid frame and an adjustable racing seat, both sourced from Next Level Racing. The frame is built from aluminum profiles and steel reinforcements, offering the structural integrity necessary to support a motion platform and high-torque steering systems. It also includes mounting interfaces for the pedal box, steering column, and motion platform, ensuring ergonomic alignment and minimal vibration interference.
The cockpit’s adaptable design is crucial for participant comfort and realism. Seat position, tilt, and distance from the pedals can all be adjusted to accommodate drivers of different heights and postures, allowing them to assume a natural driving position. This is important not only for realism, but also for minimizing muscular strain during extended trials and reducing motion artifacts in EEG recordings. The cockpit also includes brackets and cable channels for securing wiring and peripheral devices, reducing clutter and preventing mechanical interference during operation.

2.1.2. Motion Platform

A key feature of the simulator is the integration of a motion platform between the seat and the base of the cockpit. The motion platform used in this setup is the Motion Platform V3 by Next Level Racing, which is designed for racing and flight simulators. It provides two degrees of freedom—pitch and roll—simulating longitudinal and lateral vehicle dynamics such as braking, acceleration, and cornering forces.
The motion feedback adds kinesthetic realism to the visual experience, enabling participants to feel the forces associated with driving maneuvers. This haptic feedback plays a critical role in immersing the participant and eliciting naturalistic responses to stimuli, such as adjusting steering angle or braking force in response to perceived deceleration. Calibration of the platform was performed using the manufacturer’s Platform Manager software and was iteratively tuned to provide realistic yet non-fatiguing movement profiles. The gain and scaling of the motion response were adjusted to match the physical limits of the driving scenario while accounting for participant safety and comfort.
Importantly, the motion platform must operate silently and with minimal mechanical delay to avoid introducing latency between the visual and haptic feedback, as this can result in motion sickness or reduced immersion. To prevent platform vibrations from introducing noise into the EEG signal during recording, vibration isolation was implemented by mounting and tuning steel springs between the base of the motion platform and the cockpit frame. These springs absorb residual vibrations and decouple platform motion from the EEG headset, contributing to more stable neural signal acquisition.

2.1.3. Steering Wheel and Pedals

The steering and pedal system forms the primary input interface for the driver. In our setup, the steering wheel system includes a direct-drive servo motor base and an interchangeable wheel rim, both compatible with Assetto Corsa. Direct-drive steering systems are preferred over gear- or belt-driven alternatives due to their high torque resolution, instantaneous force feedback, and absence of mechanical lag. This ensures the driver feels realistic resistance and feedback during cornering, collisions, or road texture changes, enhancing behavioral fidelity.
The pedal set includes independent modules for the clutch, brake, and accelerator, each with customizable tension and travel distance. Load-cell-based braking is used to measure actual pedal force rather than displacement, providing a more accurate representation of real-world braking behavior. This level of granularity is essential in experiments focused on fine motor control and reaction time measurements. Both the steering and pedal systems are mounted securely to the cockpit frame to eliminate flex and shifting, preserving consistent input-response characteristics across trials.

2.1.4. Visual Interface

Visual immersion is achieved through either a high-resolution curved monitor (Samsung R55 series) or a VR headset (HTC Vive Pro 2), depending on participant preference and experimental requirements. The curved monitor offers a panoramic field of view that helps approximate the peripheral vision available in a real vehicle. With a wide aspect ratio and high refresh rate, it ensures smooth rendering of the simulation environment, minimizing aliasing and screen tearing.
For a more immersive experience, the VR headset is used to place the participant directly within the 3D driving environment. The HTC Vive Pro 2 features dual OLED displays with a combined resolution of 4896 × 2448 and a refresh rate of 120 Hz, providing crisp visuals and minimal latency. Head tracking allows the simulation to dynamically adjust the driver’s point of view based on head orientation, reinforcing spatial awareness and realism.
However, VR is not suitable for all participants. Some may experience discomfort, disorientation, or motion sickness due to the sensory mismatch between the visual and vestibular systems. For this reason, the system offers the option to use either visual interface, allowing researchers to select the most appropriate display mode based on participant tolerance and experimental goals. Regardless of their choice, the simulator delivers a consistent visual experience by maintaining identical field-of-view angles, screen distances, and rendering parameters across modalities.

2.1.5. Simulation Software: Assetto Corsa

The simulation environment is powered by Assetto Corsa, a commercially available racing simulator known for its realistic vehicle physics, detailed graphics, and extensive modding capabilities. Although Assetto Corsa is designed primarily for entertainment, its open asset architecture makes it highly suitable for academic research. Many commercial driving simulator platforms are expensive and do not support multi-driver scenarios [10]. de Frutos and Castro [10] evaluated multiple racing simulators for their suitability in transportation research and found that Assetto Corsa scored highly in both graphics quality and physical realism.
Its physics engine models tire friction, suspension behavior, drivetrain dynamics, and aerodynamic forces with high fidelity, creating a convincing driving experience. This realism is crucial when studying naturalistic behavior, as drivers are more likely to engage meaningfully with the environment when it behaves as expected. Additionally, Assetto Corsa supports multiple input types, customizable force feedback profiles, and telemetry output through shared memory, allowing researchers to collect real-time data on speed, throttle, braking, steering angle, gear selection, and more.
One of Assetto Corsa’s most valuable features for this project is its ability to host local multiplayer sessions via a dedicated server. This enables multiple simulators to be networked into a shared driving scenario, allowing human drivers to interact with one another in real time. Because all vehicles operate in the same virtual environment, it is possible to observe genuine driver–driver interactions, such as merging, overtaking, and cooperative maneuvers. The system supports the importing of custom-designed tracks and vehicles, allowing experiments to be tailored to specific road geometries, traffic densities, or behavioral hypotheses.
The combination of Assetto Corsa’s physics accuracy, real-time data access, and multiplayer networking makes it a powerful platform for cognitive and behavioral driving research. When paired with physiological data collection methods like EEG, it provides a rich multimodal dataset for investigating human performance in complex traffic scenarios.

2.2. EEG Apparatus and Configuration

To capture participants’ brain activity, the EEG setup includes the DSI-24 dry electrode headset and a trigger hub (Figure 3), both manufactured by Wearable Sensing [31,32]. The DSI-24 collects EEG signals using 21 dry electrodes arranged according to the international 10–20 system [33,34]. This electrode placement method ensures standardized coverage of key cortical regions while allowing for rapid deployment and minimal participant discomfort.
Each electrode records small variations in scalp potential due to underlying neural activity. These signals are sampled and streamed using the DSI Streamer software, which also supports synchronized data capture across multiple headsets. The trigger hub provides an external pulse to all EEG headsets at the start of each experiment, enabling the precise temporal alignment of EEG recordings across participants.
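To make the trigger-based alignment concrete, the following minimal Python sketch (not part of the published toolchain) crops each participant’s recording at the first trigger sample so that all EEG streams share a common zero point; the Time and Trigger column names are assumptions about the DSI Streamer CSV export and may need adjusting.

```python
import pandas as pd

def crop_at_trigger(eeg: pd.DataFrame,
                    time_col: str = "Time",
                    trigger_col: str = "Trigger") -> pd.DataFrame:
    """Drop samples recorded before the shared trigger pulse and rebase time.

    Column names are assumptions about the DSI Streamer CSV export; adjust
    them to the actual file layout.
    """
    # Index of the first nonzero trigger value (the shared start pulse).
    start = eeg.index[eeg[trigger_col] != 0][0]
    cropped = eeg.loc[start:].copy()
    # Rebase so every participant's recording starts at t = 0 s.
    cropped[time_col] -= cropped[time_col].iloc[0]
    return cropped
```

Applied to each participant’s file, the cropped recordings begin at the same physical instant, which is what makes the later cross-participant alignment possible.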
Dry-electrode EEG systems present a compelling solution for traffic research involving driving simulators, particularly in studies requiring rapid participant turnover, naturalistic behavior, and integration with immersive technologies like VR. Compared to traditional wet systems, dry EEG offers significant practical advantages, such as a minimal setup time, enhanced participant comfort, and better compatibility with head-mounted displays, making them ideal for non-clinical, real-world experimental settings [35,36,37]. These strengths are especially valuable in dynamic experiments where participants are exposed to motion feedback and frequent headset reuse is needed. While dry EEG systems typically exhibit higher electrode impedance and may be more susceptible to motion artifacts and environmental noise, these challenges can be mitigated with proper headset fit and calibration, particularly for participants with dense or curly hair. The lower electrode count does reduce spatial resolution, but for applications focused on cognitive state estimation rather than fine-grained source localization, this tradeoff is acceptable. Overall, the balance of usability and operational efficiency makes dry EEG a practical and effective choice for large-scale, motion-enabled traffic simulation studies [35,36,37].

3. Methodology

3.1. Data Collection

The system supports the simultaneous recording of two parallel data streams for each participant: EEG signals and driving telemetry. EEG data can be sampled at 300 Hz using the DSI-24 headset and managed through the DSI Streamer software. In parallel, the system can capture detailed driving telemetry, including speed, steering angle, brake pressure, and throttle input, via a custom Python 3.13 script that interfaces with Assetto Corsa’s shared memory API [38]. While Assetto Corsa’s server provides updates at approximately 8 Hz, the script is designed to sample key driving variables at up to 300 Hz, aligning with the EEG sampling rate to facilitate high-resolution synchronization across modalities.
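The acquisition script itself is not reproduced in the paper; as a rough sketch of the polling pattern it describes, the fragment below reads the physics page of Assetto Corsa’s shared memory on Windows at a fixed rate. Only the first few structure fields are declared here, and both the field layout and the memory tag should be verified against the shared-memory reference [38] before use.

```python
import csv
import ctypes
import mmap
import time

class SPageFilePhysics(ctypes.Structure):
    """Leading fields of the physics page only; see reference [38] for the
    full layout. Treat names, types, and ordering as assumptions."""
    _pack_ = 4
    _fields_ = [("packetId", ctypes.c_int),
                ("gas", ctypes.c_float),
                ("brake", ctypes.c_float),
                ("fuel", ctypes.c_float),
                ("gear", ctypes.c_int),
                ("rpms", ctypes.c_int),
                ("steerAngle", ctypes.c_float),
                ("speedKmh", ctypes.c_float)]

# Map the named shared-memory block exposed by Assetto Corsa (Windows only).
shm = mmap.mmap(-1, ctypes.sizeof(SPageFilePhysics), "Local\\acpmf_physics")

with open("telemetry.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["pc_time", "speed_kmh", "gas", "brake", "steer_angle"])
    period = 1.0 / 300.0                     # 300 Hz, matching the EEG rate
    next_tick = time.perf_counter()
    for _ in range(300 * 60):                # e.g., one minute of logging
        phys = SPageFilePhysics.from_buffer_copy(shm)
        writer.writerow([time.time(), phys.speedKmh, phys.gas,
                         phys.brake, phys.steerAngle])
        next_tick += period
        time.sleep(max(0.0, next_tick - time.perf_counter()))
```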
To ensure synchronized EEG recording across multiple simulators, the system uses a trigger hub that sends a simultaneous pulse to all EEG headsets at the start of each experiment. This mechanism enables the consistent alignment of EEG data across participants, independent of individual system clocks. Driving telemetry logging, in contrast, is initiated manually on each simulator PC, which means timestamps across telemetry files are not inherently synchronized. The system architecture is designed to generate two parallel raw data streams per participant, one for EEG signals and one for driving telemetry, supporting flexible post-processing and alignment strategies.
The EEG files include time-indexed data from 21 electrodes, metadata fields, and trigger markers. The driving data files include time-series values for vehicle parameters, a local PC clock variable (pc_time), and a clock (session_time_remaining) that represents the countdown of the simulation session.

3.2. Data Processing and Synchronization

To enable meaningful cross-participant comparisons, the synchronization of these datasets is essential. The session_time_remaining variable is first converted to a forward-counting value, time_elapsed, and all time variables are reformatted into consistent units (seconds). Coordinate strings are parsed and converted into numerical values where necessary.
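For concreteness, a sketch of this conversion is shown below; the column names mirror the description above, and the millisecond-to-second scale factor is an assumption to be checked against the actual telemetry export.

```python
import pandas as pd

drive = pd.read_csv("driver1_telemetry.csv")  # hypothetical file name

# session_time_remaining counts down, so subtracting it from its initial
# value yields a forward-counting clock; the /1000 assumes raw values in ms.
t0 = drive["session_time_remaining"].iloc[0]
drive["time_elapsed"] = (t0 - drive["session_time_remaining"]) / 1000.0
```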
Since all EEG headsets are triggered simultaneously using the trigger hub, the EEG datasets are inherently synchronized. Similarly, driving telemetry files from different simulators can be aligned using the shared session_time_remaining clock. However, to avoid introducing artifacts from EEG interpolation, synchronization is anchored to the EEG Time variable. All driving telemetry data are interpolated to match this reference timeline.
The pc_time values are synchronized by anchoring them to the EEG Time, which is calibrated across participants using a hardware trigger provided by the EEG trigger hub. Driving data streams from each simulator are synchronized using the time_elapsed variable derived from the Assetto Corsa server clock session_time_remaining, which itself is synchronized via NTP across machines. Additionally, each recording captures a reference pc_time at the start of both EEG and simulator data collection. Using the initial EEG pc_time and incremental Time values, a synthetic pc_time series is constructed. The simulator’s pc_time is then locally aligned with this synthetic EEG clock. Global synchronization is achieved by referencing all data streams to the unified EEG Time.
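A minimal sketch of this anchoring step is given below, with variable names mirroring the description above. The EEG stream is never resampled; each telemetry channel is interpolated onto the synthetic wall-clock series built from the EEG start time.

```python
import numpy as np

def sync_to_eeg(eeg_time, eeg_pc_time0, drive_pc_time, drive_values):
    """Resample one telemetry channel onto the EEG time base.

    eeg_time      : EEG 'Time' vector in seconds, starting at the trigger
    eeg_pc_time0  : PC wall-clock time recorded when EEG logging began
    drive_pc_time : wall-clock timestamps logged with each telemetry sample
    drive_values  : telemetry channel to resample (e.g., brake pressure)
    """
    # Synthetic wall-clock series: one timestamp per EEG sample.
    eeg_pc_time = eeg_pc_time0 + np.asarray(eeg_time)
    # Interpolate telemetry onto the EEG clock; EEG itself is untouched.
    return np.interp(eeg_pc_time, drive_pc_time, drive_values)
```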
The final result is a fully synchronized dataset in which each time sample corresponds to one EEG time point, containing EEG data and aligned driving telemetry from all participants. This unified dataset allows researchers to explore high-resolution patterns in driver behavior and brain activity under both solo and interactive driving conditions. The process of data synchronization is illustrated in Figure 4.
Although individual differences exist in EEG signals and driving behavior, synchronization in this framework is based on a shared temporal reference rather than behavioral or cognitive similarity. All data streams are aligned using a unified time base derived from the EEG trigger pulse and the simulator’s session clock. This approach ensures consistent cross-participant alignment while preserving the uniqueness of each participant’s response dynamics.
This synchronization ability is the central contribution of the presented work. While each EEG signal is inherently individual, the unified temporal alignment allows for future exploration of between-participant neural relationships, such as inter-brain coupling in different driving scenarios—for example, collaborative (merging) or competitive (overtaking) tasks.

Graphical User Interface for Data Synchronization

To support the synchronization approach described above and enhance overall usability, a custom graphical user interface (GUI) called syncApp was developed using MATLAB R2024a and JavaScript, and is publicly accessible at https://www.subhradeeproy.com/software (accessed on 31 June 2025).
A snapshot of this GUI is shown in Figure 5. The GUI enables users to load raw EEG and driving data, execute synchronization routines, and visualize time-aligned data streams. Researchers can inspect variable plots, validate data quality, and export synchronized datasets for further analysis. The interface is designed to lower the technical barrier for researchers interested in replicating or extending this experimental setup in their own laboratories.
While the GUI does not conduct EEG signal analysis itself, it implements the synchronization algorithm shown in Figure 4 to generate precisely time-aligned EEG and driving telemetry datasets. These synchronized outputs are compatible with widely used EEG analysis platforms such as EEGLAB [39], MNE-Python [40], and QStates [41]. By formatting the data for direct use, the GUI enables streamlined signal-level analyses—including spectral decomposition, phase-locking value computation (as demonstrated in our prior work [42]), and other cognitive state assessments. More importantly, this synchronization capability extends beyond conventional single-participant analysis. It creates new opportunities for advanced multi-participant paradigms such as hyperscanning, allowing researchers to explore inter-brain neural coupling and collective cognitive dynamics during complex driving interactions, an area rarely addressed in prior traffic simulation studies. This opens a path toward studying cognition not only at the individual level but also at the networked group level in realistic, dynamic environments.
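As an illustration of this compatibility, a synchronized export could be pulled into MNE-Python roughly as follows; the file name, channel subset, and microvolt scaling are assumptions about the syncApp output rather than a documented interface.

```python
import mne
import pandas as pd

sync = pd.read_csv("synchronized_driver1.csv")           # hypothetical export
ch_names = ["Fp1", "Fp2", "Fz", "Cz", "Pz", "O1", "O2"]  # subset, for brevity

# MNE expects volts with shape (n_channels, n_samples); assume microvolts.
data = sync[ch_names].to_numpy().T * 1e-6
info = mne.create_info(ch_names=ch_names, sfreq=300.0, ch_types="eeg")
raw = mne.io.RawArray(data, info)
raw.compute_psd(fmax=50.0).plot()                        # quick spectral check
```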
The GUI consists of two key features: data loading and synchronization (shown in Figure 5a), and the visualization of various synchronized variables for the drivers (illustrated in Figure 5b–d). The GUI is designed to handle both asynchronous data and data that has already been synchronized. During the data loading process, a toggle button is provided to indicate whether the data is raw or pre-synchronized. If pre-synchronized data is loaded, no further synchronization is required, and the data can be visualized immediately.

4. System Validation and Usability Testing

This setup is designed to be easily replicated. To achieve the best results, minimize latency, and maximize immersion, the minimum performance characteristics should include an Intel i9 processor (or similar), an NVIDIA RTX 3060 Ti GPU (or similar), and at least 16 GB of RAM. A major consideration when designing this system is its overall usability, reliability, and resilience. Consequently, tests are conducted to validate the system design.
Since the setup uses commercially available and well-tested software (Assetto Corsa), there is high confidence in the reliability of this environment. When motion feedback is enabled, the simulation inevitably moves the subject within their seat. Additionally, as subjects reposition and turn their heads, motion artifacts appear in the EEG data. These artifacts can be identified by their characteristic spread across multiple EEG channels, which distinguishes them from localized neural activity, and can then be attenuated using standard preprocessing techniques. The EEG system ensures data reliability by transmitting recordings wirelessly to a local computer over Bluetooth while also maintaining a redundant copy in onboard storage. Across repeated trials, the EEG signal quality was verified to be excellent, with no apparent discontinuities or breaks in transmission.
Pilot tests are conducted during and after the development of the synchronization algorithm to verify that the data are synchronized properly by computing lag times. After interpolation of the driving data and complete synchronization with the higher-rate EEG data, the data points are exactly aligned. In addition, a general problem of data collection from real systems is the challenge of maintaining precise clock sampling due to hardware imperfections and transmission times, among other issues [43]. Before synchronization, the driving data and EEG data exhibit clock drifts of approximately 3.312 ms and 3.333 ms, respectively, after about 30 min of continuous data collection. After synchronization, the latency between the two datasets is measured to be under 2 ms across repeated trials, with a mode of around 1.66 ms.
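One simple way to quantify such drift, consistent with the figures reported above though not necessarily the authors’ exact procedure, is to fit a line to the offsets between matched timestamps on the two clocks:

```python
import numpy as np

def accumulated_drift(ref_time, device_time, session_s=1800.0):
    """Linear-trend estimate of clock drift between two matched time vectors.

    ref_time and device_time hold timestamps (in seconds) for the same
    events on the reference and device clocks. The fitted slope is drift
    per second; scaled by the session length, it gives the accumulated
    drift (e.g., over ~30 min of recording).
    """
    offset = np.asarray(device_time) - np.asarray(ref_time)
    slope, _ = np.polyfit(ref_time, offset, 1)  # seconds of drift per second
    return slope * session_s                    # accumulated drift, seconds
```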
To evaluate system usability, 15 participants tested the simulator and provided structured feedback on comfort, immersion, and control through a post-session survey, provided in Appendix A. Their input was instrumental in refining several aspects of the setup, including the tuning of motion feedback amplitude and pedal stiffness, and determination of optimal experiment durations to minimize discomfort.
Three core usability dimensions were assessed: comfort, realism, and handling. Comfort was based on participant experiences with the VR headset, EEG headset, motion feedback, overall driving feel, and any VR-related disorientation (survey questions 1 through 6). Realism reflected user perceptions of the virtual environment, immersive quality, and the physical realism of motion and haptic feedback (questions 7 through 10). Handling captured impressions of the responsiveness of the brake, accelerator, and steering wheel (questions 11 through 13). The full list of survey questions is provided in Appendix A. The average ratings across these categories are reported in Table 1.
On average, participants rated comfort at 7.5/10, realism at 8.0/10, and handling at 8.1/10, resulting in an overall usability rating of 7.8/10 for the system. The most common sources of discomfort were motion sickness during prolonged VR use and physical strain from the EEG headset when worn for extended periods (over one hour). These findings support the inclusion of a mixed interface option, allowing participants to choose between VR and monitor displays, and highlight the importance of limiting session durations to enhance user comfort and data quality.

5. Demonstration of Synchronized Data and Sample EEG Analysis

The developed framework enables precise temporal synchronization between EEG and driving telemetry data collected from multiple participants. Figure 6 presents an example of successfully synchronized data, showing time-aligned gas and brake pedal inputs alongside O1 electrode EEG activity for two drivers participating in a shared driving scenario. This level of synchronization allows researchers to explore how neural and behavioral responses unfold in parallel across interacting participants, supporting rich multi-modal analysis of driving behavior.
While detailed EEG analysis is not the primary focus of this study, the synchronized EEG and driving telemetry datasets produced by the system are fully compatible with standard cognitive state assessment tools. For example, Figure 7 illustrates a sample application using QStates software, where concentration levels during a sustained attention task are estimated based on α-band EEG activity. Both a linear estimator and a multivariate normal probability density function (MVNPDF) are applied to classify cognitive states, with lower α-band activity being associated with higher concentration and elevated activity corresponding to relaxation. Similar analyses can be extended to other frequency bands (e.g., beta, delta, theta, gamma) to support broader assessments of driver attention, workload, and fatigue during various driving tasks.
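QStates itself is proprietary, but the underlying quantity, relative α-band power, can be approximated with standard tools. The sketch below uses Welch’s method from SciPy on a single synchronized channel; the window length and band edges are conventional choices, not parameters taken from QStates.

```python
import numpy as np
from scipy.signal import welch

def relative_alpha_power(x, fs=300.0, band=(8.0, 12.0)):
    """Fraction of total spectral power in the alpha band for one channel."""
    f, pxx = welch(x, fs=fs, nperseg=int(2 * fs))   # 2 s Welch windows
    in_band = (f >= band[0]) & (f <= band[1])
    # Lower relative alpha power is read as higher concentration above.
    return np.trapz(pxx[in_band], f[in_band]) / np.trapz(pxx, f)
```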
Beyond individual-level analysis, the framework also enables more advanced applications, such as hyperscanning, which examines inter-brain synchrony and coordinated neural activity across multiple participants. While hyperscanning has seen widespread use in social neuroscience, it remains largely unexplored in driving research due to the absence of systems capable of collecting synchronized multi-user EEG and behavioral data. By bridging this gap, the present system lays the foundation for future investigations into shared decision-making, neural coordination, and real-time interaction in dynamic traffic environments.
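As a concrete example of such an inter-brain measure, the phase-locking value used in prior work [42] can be computed directly on two drivers’ time-aligned, band-filtered channels; the snippet below assumes both inputs are already synchronized and filtered to the band of interest.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two equally sampled, time-aligned EEG traces.

    x and y could be, e.g., band-filtered O1 channels from two drivers
    after synchronization; a PLV near 1 indicates stable phase coupling.
    """
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))
```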

6. Comparison to Other Networked Simulators

To contextualize the capabilities of the developed framework, we compare it against existing networked driving simulators based on six key criteria: scalability, support for VR, EEG integration, immersion, availability, and cost. The immersion level is qualitatively assessed based on factors such as motion and haptic feedback, environmental realism (e.g., weather and lighting), physics fidelity, and visual rendering.
Table 2 summarizes this comparison. As shown, the present system is one of the few that combines high immersion, multi-participant scalability, and synchronized EEG capability, while remaining accessible and cost-effective for broader research adoption.
Unlike prior work that either involves single drivers or lacks neural measurement, our system uniquely enables synchronized cognitive and behavioral data collection from multiple interacting participants. This lays the foundation for future studies on human–human driving interactions, shared decision-making, and inter-brain neural dynamics in traffic settings.

7. Driving Scenarios and Environmental Conditions

The framework supports the simulation of a wide range of driving scenarios under dynamic conditions, such as varying weather conditions and different times of day. Vehicle handling and responsiveness were adapted using realistic physics models—for instance, reduced traction on wet roads and limited visibility during fog or nighttime conditions. These features allow researchers to examine how drivers perceive and respond to naturalistic environmental challenges.
To support targeted investigations of driver behavior under diverse conditions, we designed a flexible and modular virtual environment. A custom virtual route was developed (Figure 8) and divided into distinct zones featuring diverse traffic and road conditions. These segments were designed to support the study of a range of driving behaviors, including single-lane car-following, lane-changing maneuvers involving lateral interactions between vehicles, and responses to varying road surface quality, from rough dirt segments to smooth paved sections.
Two illustrative examples are shown in Figure 9 and Figure 10. Figure 9 illustrates a single-lane car-following scenario under three distinct environmental conditions—clear daytime, foggy daytime, and clear nighttime—captured at the same location along the route to highlight differences in visibility and ambient context. Figure 10 illustrates a two-vehicle merging scenario, where a red car is merging into the lane occupied by a white car. From the white car’s perspective, the red vehicle is visible directly ahead, while the red car’s driver can see the white car through the side mirror.
This setup demonstrates how drivers experience the same scenario from different viewpoints and interact in real-time, enabling the system to support real-time behavioral coupling between participants. When paired with EEG data, such scenarios can support future investigations into the cognitive and behavioral aspects of human–human driving interactions.

8. Limitations

While the system presented in this paper offers a robust platform for synchronized multi-participant traffic experiments, several limitations remain. First, although dry-electrode EEG systems provide ease of use and faster setup times, they can be susceptible to motion artifacts and signal noise. This may impact the fidelity of neural data, especially in motion-rich scenarios involving aggressive driving maneuvers or platform-induced vibrations.
To minimize motion-related artifacts during EEG acquisition, it is recommended that headsets are fitted snugly to ensure stable electrode contact. Environmental conditions should be optimized, such as maintaining a cool room temperature, to reduce perspiration, and participants should be advised to avoid physical exertion and spicy food prior to recording. A 1–50 Hz bandpass filter, applied in the DSI Streamer software, can attenuate high-frequency noise, including 60 Hz power line interference. Although this study does not focus on EEG analysis, the recorded EEG data are fully compatible with standard preprocessing pipelines, such as EEGLAB [39] and QStates [41], allowing future users of the framework to perform artifact rejection and advanced signal analysis as needed [39,48,49].
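The same filter can be reproduced offline when working with the raw recordings; a zero-phase Butterworth implementation in SciPy (a standard substitute, not the DSI Streamer internals) is sketched below.

```python
from scipy.signal import butter, filtfilt

def bandpass_1_50(x, fs=300.0, order=4):
    """Zero-phase 1-50 Hz Butterworth bandpass for one EEG channel."""
    nyq = fs / 2.0
    b, a = butter(order, [1.0 / nyq, 50.0 / nyq], btype="bandpass")
    # filtfilt applies the filter forward and backward, avoiding phase shift.
    return filtfilt(b, a, x)
```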
Second, the current system architecture does not incorporate additional physiological sensors such as eye-trackers, galvanic skin response monitors, or heart rate sensors, which could offer complementary insights into the driver’s cognitive and emotional state. While integration is technically feasible, it requires further development and synchronization infrastructure. Third, although Assetto Corsa offers excellent realism and modding capabilities, future work may benefit from transitioning to fully open-source simulation platforms for greater experimental control and transparency. Finally, scalability is currently constrained by hardware requirements and local networking limitations. While the system can support multiple participants, expanding to large-scale simulations with a dozen or more drivers may require significant infrastructure upgrades and performance optimization. Addressing these limitations will be a key focus of future work aimed at extending the platform’s capabilities and ensuring broader usability across research domains.

9. Conclusions

This paper presents a comprehensive, modular, and cost-effective framework for conducting traffic experiments using networked, multi-participant driving simulators integrated with synchronized EEG data collection. Built upon commercially available hardware and open simulation platforms, the system replicates key physical and perceptual cues of real-world driving, enabling the observation of naturalistic human behavior in a controlled lab environment. Through the use of Assetto Corsa, motion-enabled cockpits, flexible visual interfaces, and dry-electrode EEG systems, we demonstrate a setup that is both scalable and adaptable to various experimental needs.
A key contribution of this work lies in the synchronization methodology, which enables the fine-grained temporal alignment of behavioral (driving) and neural (EEG) data both within and across participants. The system captures rich datasets that can inform future studies on driver cognition, cooperative and competitive behavior, and distributed decision-making in traffic scenarios. We also release the synchronization software, along with sample datasets, to facilitate broader adoption and collaborative development.
By demonstrating system performance, data quality, synchronization accuracy, and compatibility with EEG analysis tools, this work establishes the necessary foundation for future research that integrates behavioral and neural data in multi-participant traffic scenarios.

10. Future Work

Beyond enabling synchronized data acquisition, this platform opens the door to a wide range of novel research questions that have previously been difficult to study due to the lack of an affordable setup capable of supporting multi-driver experiments with synchronized data streams. For example, researchers can investigate how drivers cognitively and behaviorally respond to complex traffic scenarios such as near-miss incidents, merging conflicts, or cooperative maneuvers. The system also supports multi-participant synchronization, facilitating hyperscanning studies that examine inter-brain neural coupling during real-time human–human driving interactions—an area largely unexplored in traffic research. Its flexibility allows for the implementation of diverse driving tasks under varying conditions such as different times of day, different road geometries, and varying weather conditions, all while maintaining experimental control. The modular setup further enables comparisons between hardware configurations (e.g., VR headset vs. monitor) to examine how different components affect cognitive engagement. Together, these capabilities address key gaps in the literature, where most of the existing work relies on single-driver setups or lacks physiological insight. By providing a robust framework for synchronized EEG and telemetry data collection, this work lays the foundation for future cognitive and behavioral analyses in interactive driving contexts.

Author Contributions

Within the scope of this study, the authors P.R. and S.R. contributed equally in most aspects. The author E.J. was responsible for the creation of the GUI under the guidance of S.R. and P.R. The following briefly outlines the contributions based on each category. Conceptualization, S.R. and P.R.; methodology, S.R.; software, P.R.; validation, P.R. and S.R.; formal analysis, P.R., E.J. and S.R.; resources, S.R.; data curation, P.R.; writing—original draft preparation, P.R.; writing—review and editing, P.R. and S.R.; visualization, P.R.; supervision, S.R.; project administration, S.R.; funding acquisition, S.R. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by a National Science Foundation CAREER award (CMMI-2238359).

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article. The GUI developed for this study is dubbed syncApp and is available at https://www.subhradeeproy.com/software (accessed on 31 June 2025).

Acknowledgments

During the preparation of this manuscript, the author(s) used ChatGPT 4o for the purposes of generating the illustrative schematic of the driving simulator. The authors thank the Mechanical Engineering Department and the Dean of the College of Engineering at Embry-Riddle Aeronautical University for providing the laboratory space that made this research possible.

Conflicts of Interest

The authors declare no conflicts of interest. The funding agency had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Appendix A

Survey Questions

Please rate each statement from 0 (Strongly Disagree/Not at all) to 10 (Strongly Agree/Extremely) by checking one box per row.
Section 1: Comfort
1. I was comfortable using the Virtual Reality environment
2. I was not disoriented by the VR environment
3. I did not encounter any motion sickness
4. The EEG cap was comfortable to wear
5. The feedback from the motion platform (seat) was tuned to a comfortable level
6. The driving simulator allowed for a comfortable driving experience
Section 2: Realism
7. I have extensive real-world driving experience
8. The VR environment improved immersiveness
9. The simulation environment is adequately realistic
10. The feedback from the motion platforms improved immersiveness
Section 3: Handling
11. The brake pedal was comfortable to use
12. The gas pedal was comfortable to use
13. The steering on the simulator rig was comfortable to use

References

  1. Krasniuk, S.; Toxopeus, R.; Knott, M.; McKeown, M.; Crizzle, A.M. The effectiveness of driving simulator training on driving skills and safety in young novice drivers: A systematic review of interventions. J. Saf. Res. 2024, 91, 20–37. [Google Scholar] [CrossRef] [PubMed]
  2. Alonso, F.; Faus, M.; Riera, J.V.; Fernandez-Marin, M.; Useche, S.A. Effectiveness of Driving Simulators for Drivers’ Training: A Systematic Review. Appl. Sci. 2023, 13, 5266. [Google Scholar] [CrossRef]
  3. de Winter, J.; van Leeuwen, P.M.; Happee, R. Advantages and Disadvantages of Driving Simulators: A Discussion. In Proceedings of the Measuring Behavior Conference, Utrecht, The Netherlands, 28–31 August 2012. [Google Scholar]
  4. Abdelgawad, K.; Gausemeier, J.; Grafe, M.; Berssenbrügge, J. Interest Manager for Networked Driving Simulation Based on High-Level Architecture. Designs 2017, 1, 3. [Google Scholar] [CrossRef]
  5. Chen, F.; Terken, J.; Chen, F.; Terken, J. Driving Simulator Applications. In Automotive Interaction Design: From Theory to Practice; Springer Nature: Singapore, 2023; pp. 239–256. [Google Scholar]
  6. Chen, L.; Fang, J.; Li, J.; Xie, J. Research on the Effectiveness of Driving Simulation Systems in Risky Traffic Environments. Systems 2025, 13, 329. [Google Scholar] [CrossRef]
  7. Tiu, J.; Harmon, A.C.; Stowe, J.D.; Zwa, A.; Kinnear, M.; Dimitrov, L.; Nolte, T.; Carr, D.B. Feasibility and Validity of a Low-Cost Racing Simulator in Driving Assessment after Stroke. Geriatrics 2020, 5, 35. [Google Scholar] [CrossRef]
  8. Zhang, S.; Zhao, C.; Zhang, Z.; Lv, Y. Driving simulator validation studies: A systematic review. Simul. Model. Pract. Theory 2025, 138, 103020. [Google Scholar] [CrossRef]
  9. Feliciani, C.; Crociani, L.; Gorrini, A.; Nagahama, A.; Nishinari, K.; Bandini, S. Experiments and Usability Tests of a VR-Based Driving Simulator to Evaluate Driving Behavior in the Presence of Crossing Pedestrians. In Proceedings of the Traffic and Granular Flow 2019, Pamplona, Spain, 2–5 July 2019; Springer International Publishing: Cham, Switzerland, 2020; pp. 471–477. [Google Scholar]
  10. de Frutos, S.H.; Castro, M. Assessing sim racing software for low-cost driving simulator to road geometric research. Transp. Res. Procedia 2021, 58, 575–582. [Google Scholar] [CrossRef]
  11. Paz, A.; Veeramisti, N.; Khaddar, R.; de la Fuente-Mella, H.; Modorcea, L. Traffic and Driving Simulator Based on Architecture of Interactive Motion. TheScientificWorldJournal 2015, 2015, 340576. [Google Scholar] [CrossRef]
  12. Muehlbacher, D. The Multi-Driver Simulation: A Tool to Investigate Social Interactions Between Several Drivers. In UR:BAN Human Factors in Traffic: Approaches for Safe, Efficient and Stress-free Urban Traffic; Bengler, K., Drüke, J., Hoffmann, S., Manstetten, D., Neukum, A., Eds.; Springer Fachmedien: Wiesbaden, Germany, 2018; pp. 379–391. [Google Scholar]
  13. Lane, D.; Roy, S. Validating a data-driven framework for vehicular traffic modeling. J. Phys. Complex. 2024, 5, 025008. [Google Scholar] [CrossRef]
  14. Lane, D.; Roy, S. Using information theory to detect model structure with application in vehicular traffic systems. IFAC PapersOnLine 2023, 56, 367–372. [Google Scholar] [CrossRef]
  15. Łach, Ł.; Svyetlichnyy, D. Comprehensive Review of Traffic Modeling: Towards Autonomous Vehicles. Appl. Sci. 2024, 14, 8456. [Google Scholar]
  16. Calvert, S.; van Arem, B. A generic multi-level framework for microscopic traffic simulation with automated vehicles in mixed traffic. Transp. Res. Part C Emerg. Technol. 2020, 110, 291–311. [Google Scholar] [CrossRef]
  17. Previati, G.; Mastinu, G. SUMO Roundabout Simulation with Human in the Loop. SUMO Conf. Proc. 2023, 4, 29–40. [Google Scholar] [CrossRef]
  18. Godley, S.T.; Triggs, T.J.; Fildes, B.N. Driving simulator validation for speed research. Accid. Anal. Prev. 2002, 34, 589–600. [Google Scholar] [CrossRef]
  19. Ashley, G.; Osman, O.A.; Ishak, S.; Codjoe, J. Investigating Effect of Driver-, Vehicle-, and Road-Related Factors on Location-Specific Crashes with Naturalistic Driving Data. Transp. Res. Rec. 2019, 2673, 46–56. [Google Scholar] [CrossRef]
  20. Transportation Research Board and National Academies of Sciences, Engineering, and Medicine. Analysis of Naturalistic Driving Study Data: Safer Glances, Driver Inattention, and Crash Risk; The National Academies Press: Washington, DC, USA, 2014. [Google Scholar] [CrossRef]
  21. Ehsani, J.P.; Harbluk, J.L.; Bärgman, J.; Williamson, A.; Michael, J.P.; Grzebieta, R.; Olivier, J.; Eusebio, J.; Charlton, J.; Koppel, S.; et al. Naturalistic Driving Studies: An Overview and International Perspective. In International Encyclopedia of Transportation; Vickerman, R., Ed.; Elsevier: Oxford, UK, 2021; pp. 20–38. [Google Scholar]
  22. Fitch, G.; Soccolich, S.; Guo, F.; McClafferty, J.; Olson, R.; Pérez-Toledano, M.; Hanowski, R.; Hankey, J.; Dingus, T. The Impact of Hand-Held and Hands-Free Cell Phone Use on Driving Performance and Safety-Critical Event Risk; Technical Report; Virginia Polytechnic Institute and State University: Blacksburg, VA, USA, 2013. [Google Scholar]
  23. Carsten, O.; Kircher, K.; Jamson, S. Vehicle-based studies of driving in the real world: The hard truth? Accid. Anal. Prev. 2013, 58, 162–174. [Google Scholar] [CrossRef]
  24. Abdelgawad, K.; Gausemeier, J.; Dumitrescu, R.; Grafe, M.; Stöcklein, J.; Berssenbrügge, J. Networked Driving Simulation: Applications, State of the Art, and Design Considerations. Designs 2017, 1, 4. [Google Scholar] [CrossRef]
  25. Roy, S. Quantifying interactions among car drivers using information theory. Chaos: Interdiscip. J. Nonlinear Sci. 2020, 30, 113125. [Google Scholar] [CrossRef]
  26. Casson, A.J. Wearable EEG and beyond. Biomed. Eng. Lett. 2019, 9, 53–71. [Google Scholar] [CrossRef]
  27. Cao, Z.; Chuang, C.H.; King, J.K.; Lin, C.T. Multi-channel EEG recordings during a sustained-attention driving task. Sci. Data 2019, 6, 19. [Google Scholar] [CrossRef]
  28. Usman, S.M.; Khalid, S.; Akhtar, R.; Bortolotto, Z.; Bashir, Z.; Qiu, H. Using scalp EEG and intracranial EEG signals for predicting epileptic seizures: Review of available methodologies. Seizure 2019, 71, 258–269. [Google Scholar] [CrossRef] [PubMed]
  29. Li, Z.; Fields, M.; Panov, F.; Ghatan, S.; Yener, B.; Marcuse, L. Deep Learning of Simultaneous Intracranial and Scalp EEG for Prediction, Detection, and Lateralization of Mesial Temporal Lobe Seizures. Front. Neurol. 2021, 12, 705119. [Google Scholar] [CrossRef] [PubMed]
  30. Sultana, M.; Jain, O.; Halder, S.; Matran-Fernandez, A.; Nawaz, R.; Scherer, R.; Chavarriaga, R.; del R. Millán, J.; Perdikis, S. Evaluating Dry EEG Technology Out of the Lab. In Proceedings of the 2024 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE), St Albans, UK, 21–23 October 2024; pp. 752–757. [Google Scholar]
  31. Wearable Sensing. DSI-24. Available online: https://wearablesensing.com/dsi-24 (accessed on 10 April 2025).
  32. Wearable Sensing. Wireless Trigger Hub. Available online: https://wearablesensing.com/wireless-trigger-hub (accessed on 10 April 2025).
  33. Acharya, J.N.; Hani, A.; Cheek, J.; Thirumala, P.; Tsuchida, T.N. American Clinical Neurophysiology Society Guideline 2: Guidelines for Standard Electrode Position Nomenclature. J. Clin. Neurophysiol. Off. Publ. Am. Electroencephalogr. Soc. 2016, 33, 308–311. [Google Scholar]
  34. EMOTIV. Understanding the 10-20 System of EEG Electrode Placement. Available online: https://www.emotiv.com/blogs/how-to/understanding-the-10-20-system-of-eeg-electrode-placement (accessed on 10 April 2025).
  35. Bang, J.S.; Won, D.O.; Kam, T.E.; Lee, S.W. Motion Sickness Prediction Based on Dry EEG in Real Driving Environment. IEEE Trans. Intell. Transp. Syst. 2023, 24, 5442–5455. [Google Scholar] [CrossRef]
  36. Wang, F.; Chen, D.; Lu, B.; Wang, H.; Fu, R. A Novel Semi-Dry Electrode Suitable for Collecting EEG Data for Detecting Driving Fatigue in Long-Period Driving Case. IEEE Sens. J. 2023, 23, 17891–17900. [Google Scholar] [CrossRef]
  37. Zander, T.O.; Andreessen, L.M.; Berg, A.; Bleuel, M.; Pawlitzki, J.; Zawallich, L.; Krol, L.R.; Gramann, K. Evaluation of a Dry EEG System for Application of Passive Brain-Computer Interfaces in Autonomous Driving. Front. Hum. Neurosci. 2017, 11, 78. [Google Scholar] [CrossRef]
  38. Romagnoli, G. ACSharedMemoryDocumentation. 2015. Available online: https://assettocorsamods.net/threads/doc-shared-memory-reference-acc.3061 (accessed on 17 November 2024).
  39. Delorme, A.; Makeig, S. EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods 2004, 134, 9–21. [Google Scholar] [CrossRef]
  40. Gramfort, A.; Luessi, M.; Larson, E.; Engemann, D.A.; Strohmeier, D.; Brodbeck, C.; Goj, R.; Jas, M.; Brooks, T.; Parkkonen, L.; et al. MEG and EEG data analysis with MNE-Python. Front. Neurosci. 2013, 7, 267. [Google Scholar] [CrossRef]
  41. McDonald, N.J.; Soussou, W. QUASAR’s QStates cognitive gauge performance in the cognitive state assessment competition 2011. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 6542–6546. [Google Scholar]
  42. Beauchene, C.; Roy, S.; Moran, R.; Leonessa, A.; Abaid, N. Comparing brain connectivity metrics: A didactic tutorial with a toy model and experimental data. J. Neural Eng. 2018, 15, 056031. [Google Scholar] [CrossRef]
  43. Salimnejad, M.; Pappas, N.; Kountouris, M. So timely, Yet so stale: The impact of clock drift in real-time systems. Comput. Res. Repos. 2025, 2501, 549. [Google Scholar] [CrossRef]
  44. Goedicke, D.; Zolkov, C.; Friedman, N.; Wise, T.; Parush, A.; Ju, W. Strangers in a Strange Land: New Experimental System for Understanding Driving Culture Using VR. IEEE Trans. Veh. Technol. 2022, 71, 3399–3413. [Google Scholar] [CrossRef]
  45. Miller, J.; Kalivarapu, V.; Holm, M.; Finseth, T.; Williams, J.; Winer, E. A Flexible Multi-Modal Multi-User Traffic Simulation for Studying Complex Road Design. In Proceedings of the 40th Computers and Information in Engineering Conference (CIE), International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Virtual, Online, 17–19 August 2020; Volume 9, p. V009T09A065. [Google Scholar]
  46. Yang, M.; Bao, Q.; Shen, Y.; Qu, Q.; Zhang, R.; Han, T.; Zhang, H.; Gao, M. Strategic crossing decisions in vehicle interactions at uncontrolled intersections: A networked driving simulator study. Accid. Anal. Prev. 2025, 215, 107990. [Google Scholar] [CrossRef] [PubMed]
  47. Xu, J.; Lin, Y. Impact of Distracted Drivers on Individual and Group Behavior of Following Vehicles: A Networked Multi-Driver Simulator Study. Transp. Res. Rec. 2018, 2672, 45–54. [Google Scholar] [CrossRef]
  48. Delorme, A.; Sejnowski, T.; Makeig, S. Enhanced detection of artifacts in EEG data using higher-order statistics and independent component analysis. NeuroImage 2007, 34, 1443–1449. [Google Scholar] [CrossRef]
  49. Michel, C.M.; Brunet, D. EEG Source Imaging: A Practical Review of the Analysis Steps. Front. Neurol. 2019, 10, 325. [Google Scholar] [CrossRef]
Figure 1. Illustrative schematic of the multi-participant driving simulator framework, showing two driving stations equipped with either VR headsets or curved screens. Each simulator represents an individual driver node within a shared virtual environment.
Figure 2. Experimental setup showing a participant operating the multi-participant driving simulator. Key components are labeled as follows: (1) driving cockpit, (2) motion platform, (3) steering wheel and pedals, (4) curved display screen, (5) VR headset, (6) EEG headset, and (7) wireless trigger hub.
Figure 3. (a) DSI-24 dry-electrode EEG headset and (b) wireless trigger hub used for EEG synchronization, both manufactured by Wearable Sensing [31,32].
Figure 4. Flowchart illustrating the data synchronization process used to align EEG signals and driving telemetry across multiple participants.
Figure 5. Snapshots of GUI interface showing data loading and sample EEG channel visualization for two drivers. (a) data loading and synchronization controls; (b) telemetry and EEG Channel selection; (c) data visualization for the first driver; (d) data visualization for the second driver.
Figure 6. Example of synchronized data showing gas and brake pedal inputs along with O1 electrode EEG signals recorded from two drivers during a shared driving session. (a) Gas and brake values. (b) O1 electrode values.
Figure 7. Variation in concentration levels during a sustained attention task, classified based on alpha band EEG activity.
Figure 8. Custom-designed virtual route developed to support a variety of driving scenarios for experimental testing.
Figure 9. Example simulation conditions, including variations in weather and time of day, for a single-lane car-following scenario on the custom-designed virtual route. (a) clear weather conditions; (b) foggy weather conditions; (c) clear weather conditions, nighttime.
Figure 10. Merging scenario on the custom-designed virtual route, showing the different perspectives of the involved drivers. (a) merging scenario: third-person view; (b) merging scenario: perspective of white car; (c) merging scenario: perspective of red car.
Table 1. Average participant ratings for key simulator characteristics including comfort, realism, and handling, collected during the system tuning and usability testing phase.

Characteristic | Average Score [1–10] | Survey Questions
Comfort | 7.5 | 1–6
Realism | 8.0 | 7–10
Handling | 8.1 | 11–13
Overall Rating | 7.8
Table 2. Comparison between the proposed simulation framework and existing networked driving simulators across key evaluation criteria.

Study | Scalable | VR | EEG | Immersion | Availability | Cost
CDSL (present study) | yes | yes | yes | very high | commercial | mid
DLR [24] | no | no | no | high | research | high
NII [24] | yes | no | no | very low | commercial | low
OSU [24] | no | no | no | high | research | high
Goedicke et al. [44] | no | yes | no | mid | commercial | low
Miller et al. [45] | yes | yes | no | very low | commercial | low
Yang et al. [46] | yes | no | no | low | research | mid
Xu and Lin [47] | yes | no | no | low | commercial | low
