Assessing Cross-Domain Threats in Cloud–Edge-Integrated Industrial Control Systems
Abstract
1. Introduction
1. Systematic Framework Construction: We systematically construct a cross-domain attack evaluation framework for cloud–edge collaborative ICS through detailed collection and classification of system assets, cross-domain attack entry points, attack methods, and attack impacts. This includes identifying various types of ICS assets, categorizing potential attack entry points, surveying attack techniques/tools, and classifying attack impacts. These preparatory steps establish the necessary data foundation and basic structure for subsequent attack evaluation.
2. Cross-Domain Attack Indicator System: We develop a novel cross-domain attack evaluation indicator system tailored to cloud–edge ICS scenarios. This indicator system accounts for the unique characteristics of cross-domain attacks in ICS and covers four key dimensions: system modules, attack paths, attack methods, and their potential impacts. It provides a standardized theoretical basis and methodology for identifying, evaluating, and defending against cross-domain attacks.
3. Simulation-Based Validation: We conduct simulation experiments under a representative ICS topology to test the practical effectiveness of the proposed evaluation method. The experiments verify the validity and feasibility of the approach in identifying weak points of the system and improving overall security.
2. Related Work
3. Methodology
3.1. Framework Overview
- Cloud Layer: centralized analytics, long-term data storage, containerized services.
- Edge Layer: distributed compute nodes deployed near physical processes, often with constrained resources and minimal hardening.
- Field Layer: programmable logic controllers (PLCs), sensors, actuators, and associated industrial protocols.
1. Data Collection and Threat Modeling: inventory of assets and mapping of known ICS techniques.
2. Indicator Quantification: computation of 16 tailored metrics across four risk dimensions.
3. Score Aggregation and Analysis: synthesis into a composite risk score for benchmarking and “what-if” exploration.
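The three steps above can be sketched as a minimal pipeline. This is an illustrative Python sketch, not the paper's implementation: the class and function names are assumptions, and the plain sum in step 3 is an assumed aggregation rule.

```python
from dataclasses import dataclass


@dataclass
class DimensionScores:
    """Aggregated risk scores for the four evaluation dimensions."""
    module: float  # system-module indicators (FDS, EL, SPD, SE)
    path: float    # attack-path indicators (IAB, TS, DEP, OIP)
    method: float  # attack-method indicators (AS, RR, TT, DR)
    impact: float  # attack-impact indicators (LA, LPR, ..., TOI)


def composite_risk(scores: DimensionScores) -> float:
    """Step 3: synthesize the four dimension scores into one composite value.
    A plain sum is assumed here; other weightings are possible."""
    return scores.module + scores.path + scores.method + scores.impact


# "What-if" exploration: compare a baseline posture against a hardened one.
baseline = DimensionScores(module=0.4, path=1.0, method=0.9, impact=0.7)
hardened = DimensionScores(module=0.2, path=0.5, method=0.4, impact=0.4)
print(round(composite_risk(baseline), 2))  # 3.0
print(round(composite_risk(hardened), 2))  # 1.5
```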
3.2. Indicator Definition and Calculation
3.2.1. System Module Indicators
1. Functional Dependency Score (FDS): measures the degree to which other modules in the system depend on module $i$ for correct operation. Rather than relying on subjective importance ratings, $FDS_i$ is computed by analyzing the system’s functional dependency graph. A module that supports many downstream components or processes has a higher $FDS_i$, indicating that its compromise could cause cascading failures. Let $d_i$ be the number of modules directly dependent on module $i$ and $d_{\max}$ the maximum such value across all modules:
   $$FDS_i = \frac{d_i}{d_{\max}}$$
2. Exposure Level (EL): reflects the number and type of interfaces through which module $i$ communicates with other internal or external components, representing its potential attack surface. Interfaces may include APIs, network ports, or wireless connections. A module with many exposed endpoints or Internet-facing services has a higher risk of being targeted. Let $n_i$ be the number of such interfaces, normalized by the maximum observed value $n_{\max}$:
   $$EL_i = \frac{n_i}{n_{\max}}$$
3. Security Posture Deviation (SPD): quantifies how much module $i$ diverges from the recommended security configuration baseline, including outdated patches, misconfigurations, or policy violations. It is derived from audit data or security scans. A higher $SPD_i$ means the module has more security gaps. Let $v_i$ represent a normalized deviation score based on detected violations and $v_{\max}$ the worst-case deviation:
   $$SPD_i = \frac{v_i}{v_{\max}}$$
4. Segmentation Effectiveness (SE): evaluates how well module $i$ is isolated from the rest of the system, based on logical or physical segmentation. Modules within high-trust or tightly segmented zones have higher $SE_i$ values, limiting lateral movement. The score is obtained via architectural analysis (e.g., network design and VLAN separation). Let $s_i \in [0, 1]$ be the assigned isolation score:
   $$SE_i = s_i$$
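The max-normalization used by these module indicators reduces to simple ratios. A minimal sketch follows; the function names, argument names, and example values are illustrative assumptions, not taken from the paper.

```python
def normalize(value: float, max_value: float) -> float:
    """Max-normalize a raw measurement into [0, 1]; guard against an all-zero column."""
    return value / max_value if max_value > 0 else 0.0


def module_indicators(dependents: int, max_dependents: int,
                      interfaces: int, max_interfaces: int,
                      deviation: float, max_deviation: float,
                      isolation: float) -> dict:
    """Compute FDS, EL, SPD (max-normalized ratios) and SE (isolation score) for one module."""
    return {
        "FDS": normalize(dependents, max_dependents),  # cascading-failure potential
        "EL":  normalize(interfaces, max_interfaces),  # attack-surface exposure
        "SPD": normalize(deviation, max_deviation),    # baseline-configuration gaps
        "SE":  isolation,                              # assumed already in [0, 1]
    }


# Example: an edge gateway with 3 of at most 6 dependents and 4 of at most 8 interfaces.
print(module_indicators(3, 6, 4, 8, 0.2, 0.5, 0.7))
```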
3.2.2. Attack Path Indicators
1. Initial Access Barrier (IAB): represents the difficulty an attacker faces in gaining the initial foothold required to begin traversing path $j$. This includes required credentials, network reachability, and exploit availability. A higher $IAB_j$ means that starting the path requires more effort or privilege. Let $b_j$ be a normalized score reflecting these access barriers:
   $$IAB_j = b_j$$
2. Traversal Span (TS): measures how broadly path $j$ stretches across different zones, layers, or security domains. It captures lateral movement capability and the scope of compromise. Let $z_j$ be the number of unique zones traversed, normalized by $Z$, the maximum across all paths:
   $$TS_j = \frac{z_j}{Z}$$
3. Detection Evasion Potential (DEP): indicates how likely it is that attacks along path $j$ will evade detection. We assess the residual undetected risk $r_i$ at each node $i$ on the path based on sensor coverage, logging, and response latency. Averaging across the set of nodes $P_j$ on the path yields the following:
   $$DEP_j = \frac{1}{|P_j|} \sum_{i \in P_j} r_i$$
4. Operational Impact Potential (OIP): estimates the damage to operations if path $j$ is successfully exploited. This considers whether the path leads to critical process disruption, safety risks, or data loss. Let $o_j$ be a normalized score derived from threat impact analysis:
   $$OIP_j = o_j$$
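Since DEP is defined as an average of per-node residual risks while IAB, TS, and OIP are normalized scores, the path indicators can be computed directly. A minimal sketch under those definitions; the function name and example values are illustrative assumptions.

```python
def path_indicators(barrier: float, zones: int, max_zones: int,
                    node_residual_risks: list[float], impact: float) -> dict:
    """Compute the four attack-path indicators for one path j."""
    # DEP: mean residual undetected risk across the nodes on the path.
    dep = sum(node_residual_risks) / len(node_residual_risks)
    return {
        "IAB": barrier,            # normalized access-barrier score b_j
        "TS":  zones / max_zones,  # unique zones traversed / max across all paths
        "DEP": dep,
        "OIP": impact,             # normalized operational-impact score o_j
    }


# Example: a path crossing 3 of at most 5 zones, with per-node residual risks
# estimated from sensor coverage and logging gaps.
print(path_indicators(0.4, 3, 5, [0.2, 0.6, 0.7], 0.9))
```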
3.2.3. Attack Method Indicators
1. Adversarial Sophistication (AS): assesses the novelty and advancement of method $k$ based on whether it is typically associated with advanced persistent threats (APTs), novel exploits, or automation. A higher $AS_k$ implies the use of modern, often less detectable techniques.
2. Resource Requirement (RR): measures the effort, tools, and expertise needed to execute method $k$, including programming knowledge, availability of exploit kits, or hardware requirements. A higher $RR_k$ means fewer attackers are capable of using the method.
3. Target Tailoring (TT): evaluates how specifically method $k$ is designed for the victim system. Custom-built payloads or exploits tailored to particular ICS platforms yield a high $TT_k$, signaling targeted intent and prior reconnaissance.
4. Detection Resistance (DR): reflects the inherent stealthiness of method $k$, considering whether it uses encrypted channels, fileless execution, or blending with legitimate processes. A higher $DR_k$ indicates greater evasion capability.
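The four method indicators are qualitative, so in practice they must be mapped onto a normalized scale before aggregation. One possible encoding is sketched below; the three-level scale and its numeric values are entirely an assumption for illustration, not the paper's calibration.

```python
# Assumed ordinal scale; the paper does not prescribe these values.
LEVELS = {"low": 0.2, "medium": 0.6, "high": 1.0}


def method_indicators(sophistication: str, resources: str,
                      tailoring: str, stealth: str) -> dict:
    """Map qualitative ratings of attack method k to AS, RR, TT, DR scores in [0, 1]."""
    return {
        "AS": LEVELS[sophistication],  # APT association, novel exploits, automation
        "RR": LEVELS[resources],       # expertise, exploit kits, hardware needed
        "TT": LEVELS[tailoring],       # degree of victim-specific customization
        "DR": LEVELS[stealth],         # encrypted channels, fileless execution
    }


# Example: a custom ICS payload from a well-resourced actor with moderate stealth.
print(method_indicators("high", "high", "high", "medium"))
```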
3.2.4. Attack Impact Indicators
- Loss of Availability (LA)—reduction in or loss of system availability, causing downtime.
- Loss of Productivity and Revenue (LPR)—direct impact on production output and financial losses due to disruption.
- Loss and Denial of Control (LDC)—attacker gains or denies control of the system, e.g., via disabling control commands or performing unauthorized control actions (akin to Denial-of-Service or command hijacking).
- Manipulation of Control (MC)—attacker issues malicious or altered control commands, causing the system to operate unsafely or incorrectly.
- Loss of Safety (LS)—compromised safety mechanisms, potentially endangering human operators or equipment (e.g., disabling alarms or safety interlocks).
- Loss of Protection (LP)—disabling or bypassing security protections, such as encryption, authentication, or physical security measures.
- Damage to Property (DP)—physical destruction or damage to equipment, facilities, or products as a result of the attack.
- Theft of Operational Information (TOI)—exfiltration of sensitive operational data, such as proprietary process recipes, engineering drawings, or system configurations.
3.3. Overall Evaluation Score
4. Experiments and Results
4.1. Experimental Scenario Design
- Cloud layer—a three-node Kubernetes 1.30 cluster (each node: 4 vCPU @ 2.4 GHz, 16 GB RAM, 100 GB SSD) running analytic micro-services and a TimescaleDB historian.
- Edge layer—two ARM-based gateways (Raspberry Pi 4B, 4 GB RAM) equipped with Modbus and OPC UA protocol adapters and an Nvidia Jetson Nano (128 CUDA cores, 4 GB RAM) for on-prem inference.
- Field/terminal layer—three OpenPLC v3 virtual controllers (compiled for x86), emulating Siemens S7-1200-class PLCs, each interfaced with a MiniCPS physics model of a power-distribution feeder; sensors/actuators exchange data via Modbus/TCP at 100 ms scan cycles.
4.2. Results and Analysis
- Scenario A—Baseline: The system operates with default settings (PLC firmware v3.4.5, flat network topology, no VLAN/ACL rules), representing the unprotected risk posture.
- Scenario B—Targeted Patch at IP1: Deploy vendor hotfix on the PLC at IP1 (firmware upgraded to v3.4.6 via our CI/CD pipeline), closing the known exploit while retaining original network segmentation.
- Scenario C—Micro-Segmentation of IP2: Enforce strict isolation on the edge controller at IP2 by migrating its interface into a dedicated VLAN 42 and applying ACLs that block all IT-subnet ingress, plus a stateful firewall on Modbus TCP port 502 (deny-by-default), effectively severing lateral attack pathways.
- Scenario D—Combined Defenses: Apply both the IP1 firmware patch (v3.4.6) and IP2 micro-segmentation together, removing the easy entry vector and reinforcing the critical pivot to demonstrate compounded security improvements.
5. Conclusions and Future Work
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A
References
- Stouffer, K.; Falco, J.; Scarfone, K. Guide to industrial control systems (ICS) security. NIST Spec. Publ. 2011, 800, 16. [Google Scholar]
- Lee, E.A. Cyber physical systems: Design challenges. In Proceedings of the 2008 11th IEEE International Symposium on Object and Component-Oriented Real-Time Distributed Computing (ISORC), Orlando, FL, USA, 5–7 May 2008; pp. 363–369. [Google Scholar]
- Rajkumar, R.; Lee, I.; Sha, L.; Stankovic, J. Cyber-physical systems: The next computing revolution. In Proceedings of the 47th Design Automation Conference, Anaheim, CA, USA, 13–18 June 2010; pp. 731–736. [Google Scholar]
- Shi, W.; Cao, J.; Zhang, Q.; Li, Y.; Xu, L. Edge computing: Vision and challenges. IEEE Internet Things J. 2016, 3, 637–646. [Google Scholar] [CrossRef]
- Bonomi, F.; Milito, R.; Zhu, J.; Addepalli, S. Fog computing and its role in the internet of things. In Proceedings of the First Edition of the MCC Workshop on Mobile Cloud Computing, Helsinki, Finland, 17 August 2012; pp. 13–16. [Google Scholar]
- Wang, K.; Wang, Y.; Sun, Y.; Guo, S.; Wu, J. Green industrial Internet of Things architecture: An energy-efficient perspective. IEEE Commun. Mag. 2016, 54, 48–54. [Google Scholar] [CrossRef]
- Samie, F.; Bauer, L.; Henkel, J. Edge computing for smart grid: An overview on architectures and solutions. In IoT for Smart Grids: Design Challenges and Paradigms; Springer: Cham, Switzerland, 2018; pp. 21–42. [Google Scholar]
- Arthurs, P.; Gillam, L.; Krause, P.; Wang, N.; Halder, K.; Mouzakitis, A. A taxonomy and survey of edge cloud computing for intelligent transportation systems and connected vehicles. IEEE Trans. Intell. Transp. Syst. 2021, 23, 6206–6221. [Google Scholar] [CrossRef]
- Langner, R. Stuxnet: Dissecting a cyberwarfare weapon. IEEE Secur. Priv. 2011, 9, 49–51. [Google Scholar] [CrossRef]
- Case, D.U. Analysis of the cyber attack on the Ukrainian power grid. Electr. Inf. Shar. Anal. Cent. (E-ISAC) 2016, 388, 3. [Google Scholar]
- Zhou, Y.; Zhang, Z.; Zhao, K.; Zhang, Z. A Novel Dynamic Vulnerability Assessment Method for Industrial Control System Based on Vulnerability Correlation Attack Graph. Comput. Electr. Eng. 2024, 119, 109482. [Google Scholar] [CrossRef]
- Wang, S.; Ding, L.; Sui, H.; Gu, Z. Cybersecurity Risk Assessment Method of ICS Based on Attack-Defense Tree Model. J. Intell. Fuzzy Syst. 2021, 40, 10475–10488. [Google Scholar] [CrossRef]
- Zhang, Q.; Zhou, C.; Tian, Y.C.; Xiong, N.; Qin, Y.; Hu, B. A fuzzy probability Bayesian network approach for dynamic cybersecurity risk assessment in industrial control systems. IEEE Trans. Ind. Inform. 2017, 14, 2497–2506. [Google Scholar] [CrossRef]
- Zhu, M.; Li, Y.; Xu, W. A Quantitative Risk Assessment Model for Distribution Cyber-Physical System under Cyber Attack. IEEE Trans. Smart Grid 2022, 13, 545–555. [Google Scholar]
- Huang, K.; Zhou, C.; Tian, Y.-C.; Yang, S.; Qin, Y. Assessing the Physical Impact of Cyberattacks on Industrial Cyber-Physical Systems. IEEE Trans. Ind. Electron. 2018, 65, 8153–8162. [Google Scholar] [CrossRef]
- Fu, Y.; Wang, X.; Liu, J.; Zhang, R. Cross-Space Risk Assessment of Cyber-Physical Distribution System under Integrated Attack Based on Fuzzy Bayesian Networks. IEEE Trans. Ind. Inform. 2021, 17, 6423–6433. [Google Scholar]
- Huang, L.; Zhang, X.; Luo, C. Association Analysis-Based Cybersecurity Risk Assessment for Industrial Control Systems. IEEE Access 2020, 8, 131479–131490. [Google Scholar]
- Jin, J.; Pang, Z.; Kua, J.; Zhu, Q.; Johansson, K.H.; Marchenko, N.; Cavalcanti, D. Cloud–Fog Automation: The New Paradigm towards Autonomous Industrial Cyber–Physical Systems. arXiv 2025, arXiv:2504.04908. [Google Scholar] [CrossRef]
- Liu, Y.; Zhang, J.; Huang, Z.; Zhang, P. Cyber-Physical Risk Assessment of Power Grid Control Systems Based on Colored Petri Nets and Bayesian Games. J. Electr. Comput. Eng. 2024, 2024, 1–15. [Google Scholar]
- Tan, R.; Li, M. Analyzing the Impact of Cross-Domain Attacks on Industrial Control Systems Using Attack Graph Models. SSRN Preprint, Posted 15 December 2024, 18p. Available online: https://ssrn.com/abstract=5056771 (accessed on 12 July 2025).
- Cao, Z.; Liu, B.; Gao, D.; Zhou, D.; Han, X.; Cao, J. A Dynamic Spatiotemporal Deep Learning Solution for Cloud–Edge Collaborative Industrial Control System Distributed Denial of Service Attack Detection. Electronics 2025, 14, 1843. [Google Scholar] [CrossRef]
- Liu, Y.; Chen, T.; Zhang, P. Operational Risk Evaluation of Active Distribution Networks Considering Cyber Contingencies. IEEE Trans. Smart Grid 2019, 10, 5003–5013. [Google Scholar]
- Li, H.; Zhao, J.; Wu, C.; Sun, Y. An Integrated Cyber-Physical Risk Assessment Framework for Worst-Case Attacks in Industrial Control Systems. arXiv 2023, arXiv:2304.07363. [Google Scholar]
- Zheng, T.; Kumar, I.; Inam, R.; Naz, N.; Elmirghani, J.M. Cybersecurity Solutions for Industrial Internet of Things–Edge: A Comprehensive Survey. Sensors 2025, 25, 213. [Google Scholar] [CrossRef]
- Wu, Y.; Song, T.; Zhou, M.; Fang, J. Federated Real-Time Threat Scoring in Cloud–Edge Collaborative ICS: A Lightweight Anomaly Detection Approach. Electronics 2025, 14, 2050. [Google Scholar] [CrossRef]
- Baybulatov, A.; Promyslov, V. Industrial Control System Availability Assessment with a Metric Based on Delay and Dependency. IFAC-PapersOnLine 2021, 54, 472–476. [Google Scholar] [CrossRef]
- Ross, R.S. Guide for Conducting Risk Assessments 2012. Special Publication (NIST SP) 800-30 Rev 1, National Institute of Standards and Technology, Gaithersburg, MD, USA, September 2012 (Updated January 2020). Available online: https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-30r1.pdf (accessed on 8 July 2025).
- Wang, S.; Ko, R.K.; Bai, G.; Dong, N.; Choi, T.; Zhang, Y. Evasion attack and defense on machine learning models in cyber-physical systems: A survey. IEEE Commun. Surv. Tutorials 2023, 26, 930–966. [Google Scholar] [CrossRef]
- Strom, B.E.; Applebaum, A.; Miller, D.P.; Nickels, K.C.; Pennington, A.G.; Thomas, C.B. Mitre att&ck: Design and philosophy. In Technical Report; The MITRE Corporation: Bedford, MA, USA, 2018. [Google Scholar]
| Category | Method | Key Contribution | Model | Simulator | Dataset | Strength | Limitation |
|---|---|---|---|---|---|---|---|
| Static | Wang et al. [12] | ICS risk via attack–defense trees | Attack–defense tree | N/A | Synthetic configs | Formal attacker logic | No dynamics |
| | Zhou et al. [11] | Correlated attack graph | Correlation-based graph | N/A | Case study | Better exploit prediction | Still static |
| | Zhu et al. [14] | Fault-aware static risk evaluation | Fault + attack path | System simulator | CPS substation | Considers fault impact | No timing model |
| Dynamic | Zhang et al. [15] | Bayesian ICS inference | Bayesian network | ICS emulator | Synthetic logs | Sensor–actuator linkage | Fixed priors |
| | Fu et al. [16] | Cross-domain via fuzzy-Bayes | Fuzzy Bayes net | Grid emulator | SCADA flows | Models fuzzy transitions | Limited scale realism |
| | Huang et al. [17] | Event-state association mining | Association analysis | Event logger | ICS logs | Data-driven risk links | No causality |
| | Jin et al. [18] | Fog–cloud 3C coupling | Cloud–fog architecture | Digital twin | Co-simulated ICPS | Dynamic ICPS modeling | Deployment cost |
| Cross-Domain | Liu et al. [19] | Petri-game cyber–physical modeling | Petri net + Bayes game | Power emulator | Grid logs | Game-theoretic attacker | Hard-coded rules |
| | Tan et al. [20] | RL-trained attack propagation | Graph + RL | MDP simulator | Hybrid ICS | Learns adaptive tactics | High training cost |
| | Cao et al. [21] | FedDynST for DDoS detection | Fed-GCN + T-CNN | Edge testbed | ICS-DDoS traces | Federated DDoS learning | Specific to DDoS |
| Cloud–Edge | Liu et al. [22] | Co-simulation of contingencies in distribution networks | Co-simulation | Distribution network simulator | Synthetic grid | Fault propagation modeling | No cloud–edge focus |
| | Li et al. [23] | Optimizing worst-case attacker strategies | Optimization framework | Simulated ICS environment | Synthetic configs | Resource-aware attacker modeling | Lacks distribution |
| | Zheng et al. [24] | Survey on edge-IIoT threats | Threat taxonomy | N/A | N/A | Comprehensive taxonomy | No empirical eval |
| | Wu et al. [25] | FL-based edge threat scoring | Anomaly detection | ICS edge testbed | Labeled logs | Real-time scoring | Needs labeled data |
| Impact Category | Specific Indicators | Explanation and Calculation |
|---|---|---|
| System Availability | LA, LPR | LA = normalized downtime score in [0, 1]; LPR = production loss (USD) linearly scaled from [USD 0, USD 1M] into [0, 1]. |
| System Control | LDC, MC | LDC and MC = normalized severity scores in [0, 1] for loss/denial of control and manipulation of control, respectively. |
| Safety and Protection | LS, LP | LS and LP = normalized severity scores in [0, 1] for compromised safety mechanisms and disabled protections, respectively. |
| Integrity and Information | DP, TOI | DP = damage cost (USD) mapped to [0, 1] via banded thresholds (lowest band = 0, next band = 0.25, …, top band = 1); TOI = normalized score in [0, 1] for exfiltrated operational data. |
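The LPR mapping in the table, a linear scaling of production loss from [USD 0, USD 1M] into [0, 1], amounts to a divide-and-clamp. A minimal sketch; the function name is an assumption.

```python
def lpr_score(loss_usd: float, cap_usd: float = 1_000_000.0) -> float:
    """Linearly scale a production loss in USD into [0, 1], saturating at the cap."""
    return min(max(loss_usd / cap_usd, 0.0), 1.0)


print(lpr_score(250_000))    # 0.25
print(lpr_score(2_000_000))  # 1.0 (losses beyond USD 1M saturate)
```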
| Scenario | System Module Score | Attack Path Score | Attack Method Score | Attack Impact Score | Overall Score |
|---|---|---|---|---|---|
| A (Baseline) | 0.41 ± 0.02 | 1.00 | 0.90 ± 0.03 | 0.73 | 3.04 ± 0.05 |
| B (Patch IP1) | 0.30 ± 0.01 | 1.00 | 0.60 | 0.73 | 2.63 ± 0.01 |
| C (Isolate IP2) | 0.30 ± 0.01 | 0.60 | 0.80 | 0.60 | 2.30 ± 0.01 |
| D (Both Mitigations) | 0.21 ± 0.02 | 0.50 | 0.40 | 0.40 | 1.51 ± 0.02 |
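The overall scores in the results table are consistent with a simple sum of the four dimension scores, which can be checked mechanically. The sketch below uses only the mean values from the table, ignoring the ± spreads.

```python
# Mean scores per scenario: (module, path, method, impact, reported overall).
scenarios = {
    "A (Baseline)":         (0.41, 1.00, 0.90, 0.73, 3.04),
    "B (Patch IP1)":        (0.30, 1.00, 0.60, 0.73, 2.63),
    "C (Isolate IP2)":      (0.30, 0.60, 0.80, 0.60, 2.30),
    "D (Both Mitigations)": (0.21, 0.50, 0.40, 0.40, 1.51),
}

for name, (m, p, k, i, overall) in scenarios.items():
    total = m + p + k + i
    # Each reported overall score equals the sum of its four dimension scores.
    assert abs(total - overall) < 1e-6, name
    print(f"{name}: {total:.2f}")
```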
Share and Cite
Zhang, L.; Wang, Y.; Chang, C.; Shen, X. Assessing Cross-Domain Threats in Cloud–Edge-Integrated Industrial Control Systems. Electronics 2025, 14, 3242. https://doi.org/10.3390/electronics14163242