Cellular Vehicle-to-Everything Automated Large-Scale Testing: A Software Architecture for Combined Scenarios
Abstract
1. Introduction
- Communication and Application Scenarios: Communication scenarios encompass communication channels and schemes that incorporate complex factors, including channel models, Doppler shifts, resource allocation strategies, and congestion control schemes. In a test case, two categories of vehicles are involved: key and environmental. Key vehicles are the Host Vehicle (HV) and Remote Vehicle (RV), which take part in the application testing, whereas environmental vehicles are used to simulate scenarios with differing vehicle densities. Application scenarios depict specific instances or processes of an application, such as the 5 s immediately before and after a predicted collision in an FCW scenario. These scenarios encompass the microscopic motion characteristics (coordinates, speeds, accelerations, etc.) of the key vehicles and the surrounding environmental vehicles within the traffic environment.
- Large-Scale: The concept of “large-scale” pertains to congestion scenarios characterized by high vehicle density rather than vast road networks where multiple roads intersect and connect.
- Field Testing: Field testing simulates a high-density traffic environment for the key vehicles by controlling the transmit power (i.e., the apparent distance) of nodes and the Basic Safety Message (BSM) content (coordinates, vehicle speed, etc.) transmitted by these nodes, rather than using actual environmental vehicles. This method enables the assessment of application performance under complex traffic conditions (a minimal power-to-distance sketch follows this list).
- Vehicle Testing: Autonomous Driving (AD), Advanced Driver Assistance Systems (ADAS), and overall vehicle performance;
- Road Infrastructure Testing: intelligent traffic signal control, Cooperative Vehicle Infrastructure System (CVIS), and smart parking;
- Pedestrian Testing: pedestrian safety warning and pedestrian priority;
- Other Scenarios: emergency management, data collection, and commercial applications.
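The field-testing idea above emulates vehicle distance by attenuating node transmit power instead of physically repositioning nodes. The sketch below illustrates the underlying arithmetic under an assumed log-distance path-loss model; the exponent, reference distances, and function name are illustrative assumptions, not the system's actual calibration.

```python
import math

def power_offset_for_virtual_distance(d_virtual_m: float,
                                       d_actual_m: float,
                                       path_loss_exponent: float = 2.7) -> float:
    """Extra attenuation (dB) applied at the transmitter so that a node physically
    d_actual_m away appears d_virtual_m away to the receiver, assuming an
    illustrative log-distance path-loss model."""
    return 10.0 * path_loss_exponent * math.log10(d_virtual_m / d_actual_m)

# Example: a testbed node 10 m from the DUT emulating an environmental
# vehicle 200 m away would lower its TX power by roughly 35 dB.
print(f"{power_offset_for_virtual_distance(200.0, 10.0):.1f} dB")
```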
2. Related Works
2.1. Test Scheme
2.2. Simulation Testing
2.3. Hardware-in-Loop Testing
2.4. Digital Twin Testing
2.5. Field Testing
2.6. Large-Scale Field Testing
2.7. Software and Tool Analysis
3. Requirements Analysis
3.1. C-V2X Large-Scale Testing System Architecture
3.2. Software Functional Analysis
3.2.1. Scenario Generator
3.2.2. Scenario Injection Module
3.2.3. Test Control and Analysis Module
- Test Task Generation: The test configuration information from the user is converted into test task commands that invoke the other modules. This conversion is crucial for user-friendliness, enabling users to ignore the intricacies of the underlying hardware and software. Take a communication scenario, for instance: the user's test configuration specifies channel-influencing factors such as vehicle density, relative speed, and weather conditions, which are converted into detailed channel parameters such as Doppler shift, channel fading models, and frequency bands (see the sketch after this list).
- Test Execution Control: Test execution control covers the interaction with the scenario injection module, i.e., the test process driven by a continuous stream of events. Additionally, the TS should provide functions for pausing, terminating, and restarting test tasks to handle potential failures during the tests.
- Test Data Analysis: For an automatic TS, the real-time test data from the testbed nodes and DUTs are collected and analyzed to monitor the test task's node status, communication performance, and application scenarios. Based on the test data, the test report is generated according to the requirements of the test.
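As an illustration of the test-task-generation step above, the sketch below maps a user-level communication-scenario description (relative speed, vehicle density, weather) onto low-level channel parameters. The 5.9 GHz carrier is the standard C-V2X band; the density-to-fading and weather-penalty mappings are illustrative assumptions rather than the system's actual conversion rules.

```python
from dataclasses import dataclass

C = 299_792_458.0    # speed of light, m/s
CARRIER_HZ = 5.9e9   # C-V2X carrier frequency

@dataclass
class ChannelParams:
    doppler_shift_hz: float
    fading_model: str
    extra_loss_db: float

def user_config_to_channel(relative_speed_mps: float,
                           vehicle_density: str,
                           weather: str) -> ChannelParams:
    # Maximum Doppler shift: f_d = (v / c) * f_c
    doppler = relative_speed_mps / C * CARRIER_HZ
    # Illustrative mapping: denser traffic -> richer multipath
    fading = {"low": "Rician", "medium": "Rayleigh",
              "high": "Rayleigh", "jam": "Rayleigh"}.get(vehicle_density, "Rician")
    # Illustrative weather penalty in dB
    extra = {"clear": 0.0, "rain": 2.0, "fog": 1.0}.get(weather, 0.0)
    return ChannelParams(doppler, fading, extra)

# Example: HV and RV closing at 27.8 m/s (100 km/h) in jammed traffic under rain
print(user_config_to_channel(27.8, "jam", "rain"))   # Doppler ~ 547 Hz
```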
3.2.4. C-V2X Testbed
3.3. Software Non-Functional Analysis
3.3.1. Node Access and Management
3.3.2. Real-Time Processing of High-Load Concurrent Data
3.3.3. Online Graphical User Interface
3.3.4. Maintenance of the Software
4. Software Architecture
4.1. Overall Architecture Design
- User Layer: This layer has two parts: the client GUI and the scenario simulator. Every user operation is passed to the logic layer for processing, and the client GUI is then updated to display the related data. The scenario simulator primarily injects scenario data.
- Logic Layer: This layer is the core of the entire system, receiving operations from the user layer on the database or the physical resources while receiving and processing events from the physical resources. It is mainly responsible for scenario and node management, communication testing, application testing, and other services, and conducts data management and hardware operations based on the business logic.
- Resource Layer: This layer contains physical resources in addition to the database used in classical information systems. The database stores user information, scenario information, node information, test execution information, and test result data. The physical resources mainly refer to the testbed nodes and the DUTs.
4.2. Core Functional Module Design
4.2.1. Scenario Management Module
4.2.2. Node Management Module
- Node Access: Node access is the most complex and crucial function of this module. The initial access process is illustrated in Figure 4 (a minimal sketch of the handshake follows this list). Note that the node information stored in the database determines whether a node is accessing the system for the first time. Upon powering on, a node initiates an application customized by the OBU manufacturer to connect to the logical controller and upload its MAC address. After the logical controller receives the node access request and queries the database, it notifies the user in the GUI that a new node is online and requests the user's configuration and authorization data for the node. Subsequently, the controller generates a unique identifier for the node based on its MAC address, notifies the node of its successful access, stores the node information in the database, and updates its access status on the GUI. Nodes left unconfigured are automatically shut down after a certain period to free up logical controller resources. Configured nodes become available resources, entering the node resource pool and being uniformly managed by the logical controller.
- Status Monitoring: This function allows the working status and statistics of the nodes to be viewed. A node's status can be idle, busy, faulty, or disabled. A node fault is critical and must be checked by management personnel in a timely manner. Statistical information includes historical working status, network traffic, packet counts, and other data.
- Information Change: This function changes and controls two kinds of information. The first is the configuration information, such as the bound physical scenario, node coordinates, and node name. The second is the node status, turning the node on and off as required.
- Connectivity Testing: This function confirms the connectivity between the node devices and rules out hardware faults. The platform supports C-V2X interoperability testing between nodes. If a node fails the testing process, its status is set to faulty.
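A minimal sketch of the node-access handshake described above: a node uploads its MAC address, the logical controller checks whether the node is already known, asks the user to configure first-time nodes, derives a unique identifier from the MAC, and places configured nodes into the resource pool. The names, the ID-derivation rule, and the in-memory stand-in for the database are illustrative assumptions, not the system's implementation.

```python
import hashlib
import time

node_db: dict[str, dict] = {}      # stand-in for the node table in the database
resource_pool: set[str] = set()    # configured, available nodes
UNCONFIGURED_TIMEOUT_S = 300       # auto-shutdown window for unconfigured nodes (assumed)

def on_access_request(mac: str, ask_user_for_config) -> str | None:
    """Handle a node access request; return the node ID, or None if the user never configures it."""
    record = node_db.get(mac)
    if record is None:
        # First-time node: notify the GUI and wait for the user's configuration/authorization
        config = ask_user_for_config(mac, timeout_s=UNCONFIGURED_TIMEOUT_S)
        if config is None:
            return None            # unconfigured node is shut down to free controller resources
        node_id = "TB-" + hashlib.sha1(mac.encode()).hexdigest()[:8]
        node_db[mac] = {"id": node_id, "config": config,
                        "status": "idle", "accessed_at": time.time()}
    else:
        record["status"] = "idle"
        node_id = record["id"]
    resource_pool.add(node_id)     # node becomes a uniformly managed resource
    return node_id
```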
4.2.3. Communication Testing Module
- Communication Control: The accessed nodes serve as the communication resources the system allocates to construct the scenarios. For communication testing, the environment is constructed from the device control data, which are distributed to each node once the task begins.
- Data Collection: This function collects all C-V2X data from actual testing for communication performance analysis. The most important fields include the source address, destination address, C-V2X data type, associated task ID, and original C-V2X data (a sketch of such a record follows this list).
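The record layout below sketches the kind of per-message data-collection entry listed above (source address, destination address, C-V2X data type, associated task ID, raw payload). Field names, types, and the example values are illustrative assumptions rather than the system's actual schema.

```python
from dataclasses import dataclass, asdict
import time

@dataclass
class CV2XRecord:
    source_addr: str    # layer-2 source address of the sending node
    dest_addr: str      # destination address (broadcast for BSMs)
    msg_type: str       # C-V2X data type, e.g. "BSM"
    task_id: str        # test task this capture belongs to
    raw_payload: bytes  # original encoded C-V2X message
    recv_time: float    # capture timestamp (s)

record = CV2XRecord("00:11:22:33:44:55", "ff:ff:ff:ff:ff:ff",
                    "BSM", "task-042", b"\x00\x14", time.time())
print(asdict(record)["msg_type"])
```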
4.2.4. Application Testing Module
- Scenario Simulator: The simulator models actual physical scenarios and defines vehicle movement models and safety application scenarios to simulate the C-V2X movement scenarios (e.g., FCW). During testing, the simulator specifies the BSM content broadcast by the testbed nodes through the logical controller to inject the application scenarios. Additionally, the simulator subscribes to the status information of the DUT, enabling users to monitor the execution status of the application testing and perform corrections, pauses, or other operations.
- Logical Controller: This provides the scenario simulator with interfaces for task information acquisition, DUT status subscription, and BSM injection (a broker-based sketch of these interfaces follows this list). It collects data from these three interfaces and prepares them for test data analysis.
- Testbed Nodes: These nodes broadcast the BSM information injected by the scenario simulator and collect the BSMs broadcast by the DUT, allowing the simulator to obtain the DUT's status information.
- DUT: This represents the RSU or vehicle provided by the manufacturer, equipped with C-V2X safety applications and operated by the testers. In the test environment, after the application under test is activated, DUTs periodically broadcast BSMs and upload their test data through the designated application testing interfaces.
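Section 5.4 names a message broker, and RabbitMQ is cited among the candidates [38]; the sketch below shows how the logical controller could publish an injected BSM to a testbed node's queue and consume DUT status updates using the pika client. The queue names, JSON payload, and broker host are illustrative assumptions, not the system's actual interface definition.

```python
import json
import pika  # RabbitMQ client

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="node.TB-0001.bsm_inject")
channel.queue_declare(queue="dut.status")

# Inject a BSM that the testbed node will broadcast on its C-V2X radio
bsm = {"lat": 29.5630, "lon": 106.5516, "speed_mps": 13.9,
       "heading_deg": 90.0, "timestamp_ms": 1700000000000}
channel.basic_publish(exchange="",
                      routing_key="node.TB-0001.bsm_inject",
                      body=json.dumps(bsm))

# Subscribe to DUT status so the scenario simulator can monitor the test
def on_dut_status(ch, method, properties, body):
    status = json.loads(body)
    print("DUT status:", status)

channel.basic_consume(queue="dut.status",
                      on_message_callback=on_dut_status,
                      auto_ack=True)
channel.start_consuming()
```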
5. Implementation and Deployment
5.1. Communication Protocol
5.2. Graphical User Interface
5.3. Traffic Simulation Software
5.4. Message Broker
5.5. Database
5.6. Test Procedure
- Preparation Stage: This stage mainly involves setting up the test type, scenario, and scale. Firstly, the test type must be specified, i.e., whether indoor or field tests are being conducted. Subsequently, the corresponding test scenario and scale are selected, and testbed nodes are allocated accordingly. The test scenario here refers to the actual physical environment; the allocated nodes are the device nodes connected to that physical environment. The GNSS signal simulator provides satellite signals for field tests, and timing services plus simulated positioning functions for indoor tests.
- Test Management Stage: This stage focuses on node management, task configuration, and task management. Once the test is officially initiated, the testbed receives real-time status data from the testbed nodes (including TBOBU and TBRSU) and the DUT in order to manage these nodes. Management includes device control (such as device on/off, start/stop of data reporting, congestion control activation/deactivation, device restart, and command execution), task configuration (such as setting test task types and configuring data content), and task management (including task start, pause, resume, and end; a state-machine sketch follows this list). Notably, the task configuration and task management functions apply only to testbed nodes.
- Application Scenario Injection Stage: The testbed generates and transparently transmits realistic scenario data simulated by the node array. This stage involves both communication scenarios and application scenarios. Data injection for static communication scenarios originates from the task configuration information, specifically parameters such as node message transmission power and broadcast message frequency. Data injection for application scenarios is dynamically generated by the testbed and transparently transmitted to the testbed nodes in real time. After receiving the scenario messages simulated by the test system, the testbed nodes send out V2X-simulated data, completing the scenario injection process.
- Test Data Collection Stage: Test data are continuously collected during the test run for subsequent analysis and evaluation. During the test, the testbed nodes send out V2X-simulated data and transmit their business data back to the C-V2X testbed. Simultaneously, the DUT sends its own data and receives V2X-simulated data that trigger the corresponding scenarios. The DUT then transmits its business data and application result data back to the C-V2X testbed.
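The task-management operations in the test-management stage (start, pause, resume, end) imply a small task state machine; the sketch below captures one plausible version of it. The state names and allowed transitions are assumptions for illustration, not the testbed's actual implementation.

```python
from enum import Enum, auto

class TaskState(Enum):
    CREATED = auto()
    RUNNING = auto()
    PAUSED = auto()
    FINISHED = auto()

# Allowed transitions for the start / pause / resume / end commands
TRANSITIONS = {
    ("start",  TaskState.CREATED): TaskState.RUNNING,
    ("pause",  TaskState.RUNNING): TaskState.PAUSED,
    ("resume", TaskState.PAUSED):  TaskState.RUNNING,
    ("end",    TaskState.RUNNING): TaskState.FINISHED,
    ("end",    TaskState.PAUSED):  TaskState.FINISHED,
}

class TestTask:
    def __init__(self, task_id: str):
        self.task_id = task_id
        self.state = TaskState.CREATED

    def handle(self, command: str) -> TaskState:
        key = (command, self.state)
        if key not in TRANSITIONS:
            raise ValueError(f"{command!r} not allowed in state {self.state.name}")
        self.state = TRANSITIONS[key]
        return self.state

task = TestTask("task-042")
for cmd in ("start", "pause", "resume", "end"):
    print(cmd, "->", task.handle(cmd).name)
```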
6. System Performance and Test Case Analysis
6.1. Testing System Performance for Large-Scale Loads
6.1.1. High-Load Performance Test
- Basic configuration of the test controller: Intel(R) Core i5-10400, 6 cores, 16 GB RAM (Intel, Santa Clara, CA, USA);
- Number of under-test nodes: 20, 50, 100, 200, 400, 800;
- Heartbeat interval: 5 s;
- BSM frequency: 10 Hz (a rough aggregate-load estimate follows this list).
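For context on the high-load configuration above, the arithmetic below estimates the aggregate message rate the test controller must absorb, assuming every BSM is reported back to the controller together with a heartbeat every 5 s per node; the ~300-byte BSM size used for the rough bandwidth figure is an assumption.

```python
def controller_load(num_nodes: int, bsm_hz: float = 10.0,
                    heartbeat_interval_s: float = 5.0,
                    bsm_bytes: int = 300) -> tuple[float, float]:
    """Return (messages per second, approximate Mbit/s) reaching the controller."""
    msgs_per_s = num_nodes * (bsm_hz + 1.0 / heartbeat_interval_s)
    mbit_per_s = num_nodes * bsm_hz * bsm_bytes * 8 / 1e6  # BSM traffic only
    return msgs_per_s, mbit_per_s

for n in (20, 50, 100, 200, 400, 800):
    mps, mbps = controller_load(n)
    print(f"{n:4d} nodes: {mps:7.0f} msg/s, ~{mbps:5.1f} Mbit/s")
# e.g. 800 nodes: 8160 msg/s, ~19.2 Mbit/s of BSM traffic
```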
6.1.2. Regular Load Performance Test
- Testing hardware environment: MacBook Air (M1), 8 GB RAM (Apple, Cupertino, CA, USA);
- Number of nodes accessed: 10, 20, 30, 40, 50;
- BSM frequency: 10 Hz, 20 Hz, 30 Hz.
6.2. FCW Test Case
6.2.1. Field Testing Environment
6.2.2. Testing Procedure and Results’ Analysis
- Test Requirements: The test requires that the DUTs have a communication range greater than 300 m, a data update (i.e., transmission) frequency of no more than 10 Hz, a system latency of no more than 100 ms, and a positioning error of no more than 1.5 m. The test expects the HV to issue a forward collision warning between the distance points marked in Figure 13, which correspond to TTC values of 4 s (≈55.56 m) and 2.1 s (≈29.17 m) at the 50 km/h test speed, as defined in DOT HS 811 501. TTC refers to Time To Collision (a worked distance calculation follows the procedure list below).
- Test Procedures: The FCW test procedure is as follows:
- The HV is parked at the center of the lane, 600 m from the RV. Its longitudinal direction is parallel to the road edge, facing in the same direction as the RV.
- The HV accelerates straight ahead and reaches a constant speed of 50 km/h before it is within 150 m of the RV.
- Test recording begins when the HV is 150 m away from the RV.
- The test ends when one of the following conditions occurs: (a) the FCW alarm is triggered by the HV between the TTC = 4 s (≈55.56 m) and TTC = 2.1 s (≈29.17 m) distance points; (b) the HV fails to trigger the FCW alarm before reaching the TTC = 2.1 s distance point; (c) the HV triggers the FCW alarm before reaching the TTC = 4 s distance point; (d) a false alarm is triggered.
- After the test, the HV should turn (recommended) or brake to avoid colliding with the RV.
- The real-time speeds, travel trajectories of both the HV and RV, the FCW application warning time, and the FCW application warning distance are recorded.
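The distance thresholds in the test requirements above follow directly from the constant HV approach speed and the two TTC values; the check below reproduces that arithmetic and classifies a warning distance against the expected window. This is a worked example, not part of the test software.

```python
HV_SPEED_MPS = 50 / 3.6               # 50 km/h constant approach speed
TTC_UPPER_S, TTC_LOWER_S = 4.0, 2.1   # thresholds from DOT HS 811 501

d_upper = HV_SPEED_MPS * TTC_UPPER_S  # ≈ 55.56 m
d_lower = HV_SPEED_MPS * TTC_LOWER_S  # ≈ 29.17 m

def classify_warning(distance_m: float | None) -> str:
    """Classify the distance at which the FCW alarm fired."""
    if distance_m is None:
        return "fail: no alarm before the TTC = 2.1 s point"
    if distance_m > d_upper:
        return "fail: alarm too early (before the TTC = 4 s point)"
    if distance_m < d_lower:
        return "fail: alarm too late"
    return "pass: alarm inside the expected window"

print(f"window: {d_lower:.2f} m – {d_upper:.2f} m")
print(classify_warning(42.0))
```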
- Test Result Analysis: The application response characteristics are shown in Figure 14, and the communication performance is shown in Figure 15. The test was repeated ten times under each working condition. As shown in Figure 14, the FCW TTC value declines slightly as node density increases. Moreover, there were differences in response-time TTC between products from different Tier 1 vendors, one possible reason being differences in their application response algorithms. Nevertheless, overall TTC performance was relatively stable. As shown in Figure 15, the packet loss rate rises as node density increases. 3GPP has specified quantitative performance requirements for advanced V2X applications in TS 22.186; the reliability range for these applications is 90% to 99.999% [41]. This implies that, for V2X applications, the maximum packet loss rate should not exceed 10%, as a higher rate would indicate a severely degraded communication environment. Comparing Figure 14 and Figure 15, we can conclude that communication performance has a limited impact on response-time TTC. Even when the packet loss rate reaches 15.6%, the FCW response-time TTC remains above the 2.1 s threshold. These test results demonstrate the effectiveness of C-V2X technologies from another perspective.
7. Discussion
7.1. System Effectiveness Comparison and Analysis
7.2. System Continuous Development
8. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Wang, J.; Shao, Y.; Ge, Y.; Yu, R. A Survey of Vehicle to Everything (V2X) Testing. Sensors 2019, 19, 334. [Google Scholar] [CrossRef] [PubMed]
- MCity. Available online: https://mcity.umich.edu/ (accessed on 7 May 2024).
- Astazero. Available online: https://www.astazero.com/ (accessed on 11 May 2024).
- Han, Q.; Yuan, X.; Zeng, L.; Zu, H.; Ye, L.; Lin, L. Scenario Oriented V2V Field Test Scheme with Dense Node Array. In Proceedings of the 2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC), Rhodes, Greece, 20–23 September 2020; pp. 1–6. [Google Scholar]
- Gani, S.M.O.; Fallah, Y.P.; Krishnan, H. Robust and Scalable V2V Safety Communication Based on the SAE J2945/1 Standard. IEEE Trans. Intell. Transp. Syst. 2022, 23, 861–872. [Google Scholar] [CrossRef]
- Facchina, C.; Jaekel, A. Speed Based Distributed Congestion Control Scheme for Vehicular Networks. In Proceedings of the 2020 IEEE Symposium on Computers and Communications (ISCC), Rennes, France, 7–10 July 2020; pp. 1–4. [Google Scholar]
- Lindstedt, R.; Kasparick, M.; Pilz, J.; Jaeckel, S. An Open Software-Defined-Radio Platform for LTE-V2X And Beyond. In Proceedings of the 2020 IEEE 92nd Vehicular Technology Conference (VTC2020-Fall), Victoria, BC, Canada, 18 November–16 December 2020; pp. 1–5. [Google Scholar]
- Li, L.; Wu, X.; Jiang, G.; Feng, J.; Zhang, X. A Virtual Driving Testing Method for C-V2X Performance Evaluation. In Proceedings of the 2021 IEEE International Workshop on Electromagnetics: Applications and Student Innovation Competition (iWEM), Guangzhou, China, 28–30 November 2021; Volume 1, pp. 1–3. [Google Scholar]
- Makinaci, K.M.; Acarman, T.; Yaman, C. Resource Selection for C-V2X and Simulation Study for Performance Evaluation. In Proceedings of the 2021 IEEE 93rd Vehicular Technology Conference (VTC2021-Spring), Helsinki, Finland, 25–28 April 2021; pp. 1–6. [Google Scholar]
- Malinverno, M.; Raviglione, F.; Casetti, C.; Chiasserini, C.F.; Mangues-Bafalluy, J.; Requena-Esteso, M. A Multi-stack Simulation Framework for Vehicular Applications Testing. In Proceedings of the 10th ACM Symposium on Design and Analysis of Intelligent Vehicular Networks and Applications, New York, NY, USA, 16–20 November 2020; pp. 17–24. [Google Scholar]
- Farag, M.M.G.; Rakha, H.A.; Mazied, E.A.; Rao, J. INTEGRATION Large-Scale Modeling Framework of Direct Cellular Vehicle-to-All (C-V2X) Applications. Sensors 2021, 21, 2127. [Google Scholar] [CrossRef]
- Marchetta, A.; Coppola, A.; Cinque, M.; Fiorentino, M.; Bifulco, G.N. An Eclipse MOSAIC-Based Hardware-in-Loop V2X Co-Simulation Framework for CCAM Services. In Proceedings of the 2023 IEEE 26th International Conference on Intelligent Transportation Systems (ITSC), Bilbao, Spain, 24–28 September 2023; pp. 5357–5362. [Google Scholar]
- Schiel, B.; Swindler, S.; Farmer, A.; Sharp, D.; Murali, A.H.; Corry, B.; Lundrigan, P. A Multi-layered Framework for Informing V2I Deployment Decisions Using Commercial Hardware-in-the-Loop Testing of RSUs. In Proceedings of the 2024 IEEE Vehicular Networking Conference (VNC), Kobe, Japan, 29–31 May 2024; pp. 313–320. [Google Scholar]
- Peters, S.; Sivrikaya, F.; Dang, X.T. SEP4CAM—A Simulative / Emulative Platform for C-V2X Application Development in Cross-Border and Cross-Domain Environments. In Proceedings of the 2021 IEEE/ACM 25th International Symposium on Distributed Simulation and Real Time Applications (DS-RT), Valencia, Spain, 27–29 September 2021; pp. 1–4. [Google Scholar]
- Chen, D.; Yan, Y.; Ye, L.; Hu, H.; Lei, J.; Zeng, L. On-road Features Based In-chamber C-V2X Application Test Scheme Design. In Proceedings of the 2022 IEEE Smartworld, Ubiquitous Intelligence & Computing, Scalable Computing & Communications, Digital Twin, Privacy Computing, Metaverse, Autonomous & Trusted Vehicles (SmartWorld/UIC/ScalCom/DigitalTwin/PriComp/Meta), Haikou, China, 15–18 December 2022; pp. 2191–2197. [Google Scholar]
- Marquez-Barja, J.; Lannoo, B.; Naudts, D.; Braem, B.; Maglogiannis, V.; Donato, C.; Mercelis, S.; Berkvens, R.; Hellinckx, P.; Weyn, M.; et al. Smart Highway: ITS-G5 and C-V2X based testbed for vehicular communications in real environments enhanced by edge/cloud technologies. In Proceedings of the European Conference on Networks and Communications (EuCNC), Valencia, Spain, 18–21 June 2019. [Google Scholar]
- Charpentier, V.; Slamnik-Krijestorac, N.; Marquez-Barja, J. Latency-aware C-ITS application for improving the road safety with CAM messages on the Smart Highway testbed. In Proceedings of the IEEE INFOCOM 2022—IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS), New York, NY, USA, 2–5 May 2022; pp. 1–6. [Google Scholar]
- Wang, J.; Ying, T.; Zhu, K.; Zhang, L.; Zhang, F.; Wang, Y. Large scale LTE-V2X Test Attempt via Real Deployment. In Proceedings of the 2022 IEEE 25th International Conference on Intelligent Transportation Systems (ITSC), Macau, China, 8–12 October 2022; pp. 3146–3151. [Google Scholar]
- Eckermann, F.; Wietfeld, C. SDR-based Open-Source C-V2X Traffic Generator for Stress Testing Vehicular Communication. In Proceedings of the 2021 IEEE 93rd Vehicular Technology Conference (VTC2021-Spring), Helsinki, Finland, 25–28 April 2021; pp. 1–5. [Google Scholar]
- Han, Q.; Yue, J.; Ye, L.; Zeng, L.; Long, Y.; Wang, Y. The Critical Scenario Extraction and Identification Method for ICV Testing. In Proceedings of the 2023 IEEE 26th International Conference on Intelligent Transportation Systems (ITSC), Bilbao, Spain, 24–28 September 2023; pp. 5778–5783. [Google Scholar]
- Chen, S.; Hu, J.L.; Zhao, L.; Zhao, R.; Fang, J.; Shi, Y.; Xu, H. 3.6 Technical Comparisons of IEEE 802.11p and C-V2X. In Cellular Vehicle-to-Everything (C-V2X), 1st ed.; Shen, X.S., Ed.; Springer: Singapore, 2023; pp. 106–109. [Google Scholar]
- IEEE Std 1471-2000; IEEE Recommended Practice for Architectural Description for Software-Intensive Systems. The Institute of Electrical and Electronics Engineers: Piscataway, NJ, USA, 2000. [CrossRef]
- YD/T 4771-2024; Technical Requirements for C-V2X Large-Scale Testing System and Data Interface. Ministry of Industry and Information Technology of the People’s Republic of China: Beijing, China, 2024.
- Chen, S.; Hu, J.L.; Zhao, L.; Zhao, R.; Fang, J.; Shi, Y.; Xu, H. 4.6 Resource Allocation Method. In Cellular Vehicle-to-Everything (C-V2X), 1st ed.; Shen, X.S., Ed.; Springer: Singapore, 2023; pp. 146–161. [Google Scholar]
- Rana, M.E.; Saleh, O.S. Chapter 15—High assurance software architecture and design. In System Assurances, 1st ed.; Johri, P., Anand, A., Vain, J., Singh, J., Quasim, M., Eds.; Academic Press: New York, NY, USA, 2022; pp. 271–285. [Google Scholar]
- Geppert, P.; Beilharz, J. Integration of C-V2X Into a Hybrid Testbed to Co-Simulate ITS Applications and Scenarios. In Proceedings of the 2022 IEEE International Conference on Cloud Engineering (IC2E), Pacific Grove, CA, USA, 26–30 September 2022; pp. 15–21. [Google Scholar]
- Bafna, S.A. Review on Study and Usage of MERN Stack for Web Development. Int. J. Res. Appl. Sci. Eng. Technol. (IJRASET) 2022, 10, 178–186. [Google Scholar] [CrossRef]
- Kaur, G.; Tiwari, R.G. Comparison and Analysis of Popular Frontend Frameworks and Libraries: An Evaluation of Parameters for Frontend Web Development. In Proceedings of the 2023 4th International Conference on Electronics and Sustainable Communication Systems (ICESC), Coimbatore, India, 6–8 July 2023; pp. 1067–1073. [Google Scholar]
- Gong, X.; Xiao, Y. A Skin Cancer Detection Interactive Application Based on CNN and NLP. J. Phys. Conf. Ser. 2021, 2078, 12–36. [Google Scholar] [CrossRef]
- Christie, M.A.; Marru, S.; Abeysinghe, E.; Upeksha, D.; Pamidighantam, S.; Adithela, S.P.; Mathulla, E.; Bisht, A.; Rastogi, S.; Pierce, M.E. An extensible Django-based web portal for Apache Airavata. In Proceedings of the PEARC ’20: Practice and Experience in Advanced Research Computing, Portland, OR, USA, 26–30 July 2020; pp. 160–167. [Google Scholar]
- Petropoulos, P.; Zoulias, E.; Liaskos, J.; Mantas, J. Web and Mobile Enabled Application for Public Health Inspections. Stud. Health Technol. Inform. 2023, 305, 425–426. [Google Scholar] [PubMed]
- Features, L.M.V. Available online: https://mcity.umich.edu/wp-content/uploads/2023/03/Mcity-Vehicle-Features_2023_03_13-1.pdf (accessed on 7 May 2024).
- Veins. Available online: https://veins.car2x.org/ (accessed on 8 April 2024).
- SUMO. Available online: https://eclipse.dev/sumo/ (accessed on 8 April 2024).
- Sharvari, T.; SowmyaNag, K. A Study on Modern Messaging Systems: Kafka, RabbitMQ and NATS Streaming. arXiv 2019, arXiv:1912.03715. [Google Scholar]
- Srinivas, S.; Karna, V.R. A Survey on Various Message Brokers for Real-Time Big Data. In Proceedings of the International Conference on Sustainable Communication Networks and Application (ICSCN 2019), Erode, India, 30–31 July 2019; pp. 164–172. [Google Scholar]
- Singh, P.K.; Chaitra, P. Comprehensive Review of Stream Processing Tools. Int. Res. J. Eng. Technol. (IRJET) 2020, 7, 3537–3540. [Google Scholar]
- RabbitMQ. Available online: https://www.rabbitmq.com/ (accessed on 2 April 2024).
- Khan, W.; Kumar, T.; Zhang, C.; Raj, K.; Roy, A.M.; Luo, B. SQL and NoSQL Database Software Architecture Performance Analysis and Assessments—A Systematic Literature Review. Big Data Cogn. Comput. 2023, 7, 97. [Google Scholar] [CrossRef]
- DOT HS 811 501; A Test Track Protocol for Assessing Forward Collision Warning Driver-Vehicle Interface Effectiveness. National Highway Traffic Safety Administration: Washington, DC, USA, 2011.
- TR 22.886 V16.2.0; Study on Enhancement of 3GPP Support for 5G V2X Services. 3rd Generation Partnership Project: Valbonne, France, 2018. Available online: https://portal.3gpp.org/desktopmodules/Specifications/SpecificationDetails.aspx?specificationId=3108 (accessed on 18 October 2024).
Testing System | Communication Standard | Network Simulator | Traffic Simulator | Features | Compared to Proposed
---|---|---|---|---|---
Resource Selection [9] | LTE-V2X PC5 | NS-3 | SUMO | Resource selection research | ×
Multi-stack [10] | IEEE 802.11p, C-V2X Mode 4 and LTE | NS-3 | SUMO | Multi-stack | ×
INTEGRATION Large-Scale Modeling Framework [11] | Direct C-V2X | Analytical Model | INTEGRATION | Large-scale (downtown LA area, 145,000 vehicles) | ×
Eclipse MOSAIC-based HIL [12] | IEEE 802.11p | Simple Network Simulator | SUMO | Flexible architecture for new add-in components | ✓ Layered software architecture
Commercial V2I RSUs Deployment [13] | C-V2X | / | / | User custom, GUI, COTS RSU, DOT | ✓ User custom ✓ GUI ✓ Devices, vehicles ✓ Commercial
SEP4CAM [14] | C-V2X | NS-3 | / | Cross-border and cross-domain environments, vehicle tracks from the real world | • Previous work achieves the extraction of real-world critical scenarios [20]
VDT Method for C-V2X [8] | C-V2X | Channel Simulator | / | Real-world channel parameters | • Previous work uses the same method to build the communication channel [4]
On-road Features based In-chamber Testing [15] | C-V2X | / | SUMO | GNSS control | • Previous work [15]
MCity [2] | / | / | / | Remote operating system | /
AstaZero [3] | / | / | / | Digital twin simulation platform | /
Smart Highway [16,17] | ITS-G5/C-V2X | / | / | User custom | ✓ User custom
Large-scale LTE-V2X Test [18] | LTE-V2X | SimuLTE | SUMO | Large-scale OBUs instead of vehicles | ✓ Scalable Density Node Array
Proposed | C-V2X | NS-3 | SUMO | Software architecture | /
Vehicle Density | | Equivalent Vehicle Number/100 m
---|---|---
Low | <0.3 | 15
Medium | 0.3–0.6 | 25
High | 0.6–0.8 | 50
Jam | >0.8 | 100
BSM Frequency \ Number of Nodes | 10 | 20 | 30 | 40 | 50
---|---|---|---|---|---
10 Hz | 0.53 ms @ 1.68 m/s | 0.51 ms @ 6.49 m/s | 0.37 ms @ 14.8 m/s | 0.46 ms @ 25.5 m/s | 0.46 ms @ 39.3 m/s
20 Hz | 0.38 ms @ 3.22 m/s | 0.53 ms @ 12.6 m/s | 0.57 ms @ 27.8 m/s | 0.51 ms @ 49.2 m/s | 0.57 ms @ 78.3 m/s
30 Hz | 0.57 ms @ 4.68 m/s | 0.52 ms @ 18.4 m/s | 0.51 ms @ 41.4 m/s | 0.36 ms @ 73.7 m/s | 0.58 ms @ 127 m/s