Article

Research on Endogenous Security Defense for Cloud-Edge Collaborative Industrial Control Systems Based on Luenberger Observer

College of Computer Science and Artificial Intelligence, Fudan University, Shanghai 200437, China
*
Author to whom correspondence should be addressed.
Mathematics 2025, 13(17), 2703; https://doi.org/10.3390/math13172703
Submission received: 15 July 2025 / Revised: 13 August 2025 / Accepted: 13 August 2025 / Published: 22 August 2025
(This article belongs to the Special Issue Research and Application of Network and System Security)

Abstract

Industrial Control Systems (ICSs) are fundamental to critical infrastructure, yet they face increasing cybersecurity threats, particularly data integrity attacks like replay and data forgery attacks. Traditional IT-centric security measures are often inadequate for the Operational Technology (OT) environment due to stringent real-time and reliability requirements. This paper proposes an endogenous security defense mechanism based on the Luenberger observer and residual analysis. By embedding a mathematical model of the physical process into the control system, this approach enables real-time state estimation and anomaly detection. We model the ICS using a linear state-space representation and design a Luenberger observer to generate a residual signal, which is the difference between the actual sensor measurements and the observer’s predictions. Under normal conditions, this residual is minimal, but it deviates significantly during a replay attack. We formalize the system model, observer design, and attack detection algorithm. The effectiveness of the proposed method is validated through a simulation of an ICS under a replay attack. The results demonstrate that the residual-based approach can detect the attack promptly and effectively, providing a lightweight yet robust solution for enhancing ICS security.

1. Introduction

1.1. Research Background

Industrial Control Systems (ICSs) [1,2,3] serve as the “neural center” of modern critical infrastructure, underpinning the stable operation of sectors that are vital to societal functioning—from power grids regulating electricity distribution across cities, to oil refineries controlling chemical reactions, from smart factories automating production lines, to water treatment plants ensuring clean water supply. These systems bridge the physical and digital worlds: sensors collect real-time data on physical processes (e.g., temperature, pressure, flow rate), controllers analyze this data to issue operational commands (e.g., adjusting valves, modifying motor speed), and actuators execute these commands to maintain processes within safe thresholds. For decades, ICS operated in relatively isolated environments, relying on proprietary hardware and protocols to ensure stability. However, this isolation has eroded rapidly in recent years [4].
The accelerating convergence of Information Technology (IT) and Operational Technology (OT) has revolutionized ICS efficiency but also exposed them to unprecedented cybersecurity risks. Driven by the rise of the Industrial Internet of Things (IIoT), cloud-edge collaboration, and remote monitoring demands, ICSs are now integrated with enterprise networks, cloud platforms, and even public internet services. For example, a smart grid may use cloud platforms to optimize energy distribution based on real-time user data, while a manufacturing plant might adopt edge computing to enable real-time analytics on production lines [5]. This connectivity, while enabling innovations like predictive maintenance and dynamic resource allocation, has shattered the “air gap” that once protected OT environments, creating new attack surfaces for malicious actors [6].
Among the emerging threats, replay attacks and data forgery attacks stand out for their stealth and destructive potential. A replay attack [7] involves an attacker intercepting and storing legitimate sensor data (e.g., a 10 min window of normal pressure readings) and retransmitting it to the controller at a later time. This tricks the controller into relying on outdated information, masking deviations in the actual physical process—for instance, a replay of “normal” temperature data could hide a dangerous overheating in a chemical reactor. Data forgery attacks [8], meanwhile, involve directly manipulating sensor readings (e.g., falsifying a flow meter’s output to make it appear within safe limits) to mislead the controller into issuing incorrect commands, such as increasing fuel supply to an already overloaded generator. Both attack types exploit a critical vulnerability: ICS controllers often lack inherent mechanisms to verify whether incoming data reflects the current physical state of the system, rather than manipulated or stale information. The consequences can be catastrophic, ranging from production shutdowns and equipment damage to large-scale blackouts or environmental disasters [9].
Traditional cybersecurity measures, designed for IT environments, are ill-suited to address these threats. IT-focused solutions like firewalls, Intrusion Detection Systems (IDSs), and antivirus software prioritize confidentiality and general threat blocking but struggle to meet the unique demands of OT:
  • Real-time performance [4]: ICSs require microsecond-level response times to maintain process stability. Complex security software, which introduces latency through deep packet inspection or signature matching, can disrupt control loops, leading to process oscillations or failures.
  • Protocol specificity: OT relies on legacy protocols (e.g., Modbus, DNP3, PROFINET) that lack built-in security features. Firewalls designed for IT protocols (e.g., TCP/IP) often fail to filter malicious traffic over these specialized protocols.
  • Resource constraints [10]: OT devices (e.g., programmable logic controllers, remote terminal units) typically have limited processing power and memory, making them unable to run heavyweight security tools.
  • Adaptive threats [11]: Signature-based IDSs, which rely on known attack patterns, are ineffective against zero-day attacks or targeted campaigns (e.g., Advanced Persistent Threats (APTs)) tailored to specific ICS environments.
These limitations highlight the need for a paradigm shift: endogenous security defense [12]. Unlike exogenous measures (e.g., add-on firewalls), endogenous security embeds protection directly into the control system’s core, leveraging the inherent properties of the physical process it regulates. By modeling the mathematical relationships between system states (e.g., how pressure changes with flow rate) and using these models to validate sensor data, endogenous defense can detect anomalies that violate physical laws—even if the attack is novel. This approach is lightweight (integrated with existing control logic), real-time (aligned with control cycles), and robust (grounded in the immutable physics of the process), making it uniquely suited to OT environments [13].
The central novelty of our work lies in demonstrating that this intentionally simple, non-adaptive observer framework, when combined with sensitive statistical monitoring, offers a more robust defense against persistent data integrity attacks than more complex adaptive estimators, providing a practical and computationally efficient solution for resource-constrained OT devices.

1.2. Research Objectives

Against this backdrop, this study aims to develop an endogenous security defense mechanism tailored to ICSs, with a focus on countering replay attacks through Luenberger observers [14,15] and residual analysis. The specific objectives are as follows:
Design a lightweight, integrable defense scheme: The proposed mechanism must operate with minimal computational overhead, ensuring it can be deployed on resource-constrained OT devices (e.g., edge controllers in a cloud-edge collaborative ICS). This requires optimizing the observer’s mathematical complexity and ensuring it aligns with the control loop’s sampling frequency (typically 10–100 Hz in industrial settings).
Enable real-time detection of replay attacks: Replay attacks exploit temporal discrepancies between stored and current data. The solution must identify these discrepancies within milliseconds of the attack initiation, preventing the controller from acting on stale information. This involves refining residual analysis to distinguish attack-induced deviations from normal noise (e.g., sensor inaccuracies, minor process fluctuations).
Validate robustness across attack scenarios: The mechanism’s performance will be rigorously evaluated using metrics critical to ICS security:
  • False Alarm Rate (FAR): Minimizing false alarms to avoid unnecessary process shutdowns (a single false alarm in a nuclear power plant could cost millions in downtime).
  • Missed Detection Rate (MDR): Ensuring even low-intensity replay attacks (e.g., short windows of replayed data) are detected to prevent cumulative system drift.
  • Detection Delay: Quantifying the time between attack initiation and alarm triggering, with a target of <5 control cycles for critical processes.
By achieving these objectives, the study seeks to provide a practical, model-driven solution that improves ICS resilience against data integrity attacks while preserving operational stability.

2. Related Work

2.1. Research on Industrial Control System Security Protection

Research on ICS security protection has evolved significantly in response to the increasing digitization and interconnectivity of industrial environments. Early efforts primarily revolved around adapting traditional IT security mechanisms to the ICS domain. Firewalls, originally designed for IT networks, were customized to filter traffic specific to ICS protocols such as Modbus and DNP3. This allowed for the segmentation of OT subnets from enterprise networks to restrict unauthorized access. Intrusion Detection Systems (IDSs) were also developed, with a focus on identifying malicious patterns in industrial communication. Signature-based IDSs, for example, were trained to recognize known attack signatures within PROFINET or EtherCAT frames, leveraging the existing knowledge of common attack vectors in the IT realm [16].
During the same time, as the need for more comprehensive security became evident, researchers began to explore network traffic analysis based on process semantics [17]. This approach involved validating that sensor data transmission rates corresponded to the physical process dynamics. For instance, in a slow-heating reactor, a temperature sensor should not logically update at an extremely high frequency. Another direction was the use of physical process invariants for security monitoring. Ensuring that the flow rate in a pipeline adhered to the laws of fluid dynamics, where pressure and flow rate maintain a proportional relationship under steady-state conditions, was one such application.
However, these existing research efforts have several notable limitations. Industrial firewalls, despite their role in network segmentation, are ineffective against threats originating from compromised internal devices. A malicious insider with legitimate access can bypass firewall rules and manipulate sensors [18], as demonstrated in numerous real-world incidents. Signature-based IDSs [16], being reliant on pre-defined attack patterns, are ill-equipped to deal with zero-day attacks or novel attack vectors. In the case of replay attacks using previously unseen historical data, these IDS systems fail to raise alerts, leaving the ICS vulnerable. Additionally, the computational demands of complex security algorithms, such as deep packet inspection [19], pose a significant challenge in ICSs. Resource-constrained OT devices, like legacy Programmable Logic Controllers (PLCs), struggle to handle the associated latency, which can disrupt real-time control loops and lead to process instability [10].
The method proposed in this paper addresses these limitations by adopting an endogenous security framework. Instead of relying solely on external security measures like firewalls or signature-based IDS, our approach embeds security directly into the control loop. By integrating a Luenberger observer into the control system, we continuously compare real-time sensor measurements with values predicted by a mathematical model of the physical process. Deviations from the expected behavior, based on the underlying physics of the system, trigger alerts. This approach is not only more robust against zero-day and novel attacks but also operates with minimal latency, ensuring that the real-time requirements of ICS are maintained. It does not rely on pre-defined attack signatures, making it adaptable to a wide range of potential threats, and can be deployed on resource-constrained OT devices without sacrificing performance.

2.2. Detection of Data Integrity Attacks

Model-based detection methods leverage the physical laws governing industrial processes. For instance, the work by Pasqualetti et al. [13] established a foundational framework for attack detection in cyber-physical systems by characterizing the effect of attacks on system dynamics. While powerful, their approach often assumes a perfect system model, making it sensitive to the model mismatch and noise commonly found in real-world industrial settings [20]. Similarly, many approaches utilize Kalman filters for state estimation and anomaly detection due to their optimality under Gaussian noise [21]. However, a critical shortcoming is that Kalman filters are designed to adapt to new data, which can cause them to "learn" and incorporate the effects of a persistent attack, thereby reducing the residual and causing the attack to be missed over time, a limitation clearly demonstrated in our own simulation results.
Our work builds on these model-based principles but addresses their shortcomings by using a Luenberger observer with a fixed gain. This intentionally non-adaptive design prevents the detector from normalizing persistent attacks, ensuring a sustained residual signal. We further enhance robustness by combining thresholding with CUSUM analysis to reliably detect both abrupt and stealthy attacks.
Data-driven methods [4,22,23] have also gained substantial traction. These methods train algorithms like neural networks or autoencoders on historical data to learn the normal behavior of an ICS. Any deviation from these learned patterns is flagged as an anomaly. While effective for complex, non-linear systems where accurate models are unavailable, they have their own limitations. They require vast amounts of high-quality training data covering all operational states, which is often impractical to obtain. Furthermore, these models can be “black boxes,” suffering from low interpretability, and their computational demands can be too high for resource-constrained OT devices [24].
However, both model-based and data-driven methods have notable limitations. Model-based methods are highly sensitive to model mismatch [20]. Even small errors in the system model, such as incorrect friction coefficients in a fluid flow model, can lead to frequent false alarms. This is especially problematic in ICSs, where the processes often exhibit high levels of noise and time-varying dynamics. In such cases, distinguishing between normal process variations and attack-induced deviations becomes challenging.
Data-driven methods, on the other hand, require large volumes of high-quality historical data that comprehensively cover all possible normal operating conditions. In ICSs, obtaining such data can be a significant challenge. Historical data may lack rare but normal transient states, such as the startup and shutdown phases of industrial equipment, which are crucial for training a robust model. Additionally, data-driven models often suffer from low interpretability. It can be difficult to understand exactly why a particular data point is flagged as an anomaly, which can be a major drawback for ICS operators who need to quickly diagnose and respond to potential attacks. Moreover, these methods typically have high computational demands, which can be a bottleneck for resource-constrained ICS devices and may compromise the real-time performance requirements of the system [24].
The method proposed in this paper aims to address these limitations. We build on the model-based approach by using a Luenberger observer, which is optimized for robustness against model mismatch. We employ estimation techniques to ensure that the observer can handle noise and minor model errors without generating false alarms. Instead of relying solely on absolute deviations between measured and predicted values, we focus on residual statistics, such as mean shift and variance changes. This approach is particularly effective in detecting stealthy replay attacks, where the initial deviation may be small but gradually grows as the system state evolves. Compared to data-driven methods, our framework does not require large-scale training datasets. It operates with a fixed and relatively low computational complexity, making it highly suitable for real-time ICS environments where resource availability and immediate response times are critical.

2.3. Application of Observers and Residual Analysis in Security Defense

Observers and residual analysis have been increasingly adopted in ICS security, building on their long-standing use in fault detection [25,26]. The Luenberger observer [15,27], a foundational tool in control theory, estimates unmeasurable system states by combining a mathematical model with real-time measurements, generating residual signals as the discrepancy between actual and predicted outputs. In fault detection, residuals reflect anomalies like sensor malfunctions, and this principle has been extended to cybersecurity: attacks altering system behavior create detectable residual deviations.
Early applications focused on Kalman filters for detecting false data injection, leveraging their optimality under Gaussian noise. More recently, Luenberger observers have gained traction for their low computational overhead, which is critical for resource-constrained OT devices. Residuals under normal conditions follow a zero-mean stochastic process, while attacks (e.g., replay) induce statistically significant shifts, forming the basis for detection.
However, existing methods have limitations. They often target specific attacks (for example, only false data injection [21]) and do not generalize to replay attacks, which exploit temporal mismatches between stale data and current system states. Observer gains are rarely optimized for attack detectability, prioritizing state estimation accuracy instead, which delays the detection of stealthy attacks. Additionally, simple threshold-based residual monitoring struggles with noise, leading to high false alarms in industrial environments.
Furthermore, the security of discrete event systems, which are common in ICSs for modeling logical operations, faces challenges like intermittent observation loss that can be exploited by attackers. Recent work by Cong et al. [28] explores the critical observability of such systems, highlighting the fundamental importance of ensuring that system state can be reliably inferred, a principle that directly underpins the feasibility of our observer-based detection approach.
This study addresses these gaps by explicitly modeling replay attacks and optimizing the Luenberger observer for detectability via robust design (e.g., LMI optimization). Residual analysis combines static thresholds with CUSUM monitoring to track persistent shifts, critical for detecting slow-evolving replay attacks. This integration enhances sensitivity to temporal anomalies while mitigating noise-induced false alarms.

3. Model and Algorithm Design

3.1. Introduction to ICS Security Modeling

Industrial Control Systems (ICSs) are fundamental to the functioning of critical infrastructures such as energy distribution, chemical processing, and water treatment. These systems operate at the intersection of the physical and cyber domains, where the physical processes are regulated by embedded digital controllers connected via communication networks. Despite the long-standing stability of ICS environments, the increasing integration of open networks and Internet-facing services has exposed these systems to sophisticated cyber-attacks.
To ensure resilient operation under both stochastic disturbances and malicious interference, a rigorous model-based security approach is necessary. The core idea is to embed security functions—such as state estimation, anomaly detection, and fault isolation—directly into the control logic using mathematically grounded frameworks. This chapter focuses on constructing a detailed model of ICS dynamics and designing an observer-based detection algorithm that can uncover replay and data-injection attacks in real time.

3.2. Mathematical Modeling of Cyber-Physical ICS Systems

We consider a discrete-time Linear Time-Invariant (LTI) model that encapsulates the internal state evolution and the observation process of the ICS:
$$x_{k+1} = A x_k + B u_k + E d_k + w_k, \qquad y_k = C x_k + F a_k + v_k$$
where:
  • $x_k \in \mathbb{R}^n$ is the internal state vector at time step $k$, where $\mathbb{R}^n$ denotes the n-dimensional space of real numbers, representing system variables such as temperature, pressure, motor speed, etc.
  • $u_k \in \mathbb{R}^m$ denotes the control input vector applied by the controller to the actuators.
  • $y_k \in \mathbb{R}^p$ is the measurement output vector collected by sensors and used for control decisions.
  • $d_k \in \mathbb{R}^q$ captures exogenous disturbances like ambient noise, load fluctuations, or supply variability.
  • $a_k \in \mathbb{R}^p$ models potential adversarial actions including sensor spoofing or replay.
  • $w_k \sim \mathcal{N}(0, Q)$ and $v_k \sim \mathcal{N}(0, R)$ are zero-mean, uncorrelated Gaussian process and measurement noise, respectively.
The matrices $A$, $B$, $C$ define the nominal dynamic behavior and measurement relationships. $E$ represents the entry point of environmental disturbances into the plant’s internal state, and $F$ models the effect of an attacker who can manipulate sensor outputs (e.g., injecting malicious measurements).
This modeling framework is expressive enough to support multi-loop control architectures, where different state components may be regulated by separate feedback channels, and it accommodates extensions to switching or hybrid dynamics through the inclusion of time-varying matrices or mode-dependent parameters.
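For illustration, the state-space model above can be simulated directly. The following Python sketch (the paper's own experiments use MATLAB/Simulink) instantiates an assumed toy 2-state plant; all matrix values are our own illustrative choices, not the paper's system.

```python
import numpy as np

# Toy 2-state instance of x_{k+1} = A x_k + B u_k + E d_k + w_k,
# y_k = C x_k + F a_k + v_k. All matrices are assumed illustrative values.
A = np.array([[0.95, 0.10],
              [0.00, 0.90]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
E = np.array([[0.05], [0.0]])
F = np.array([[1.0]])
Q_var, R_var = 1e-4, 1e-4            # process / measurement noise variances

rng = np.random.default_rng(0)

def step(x, u, d=0.0, a=0.0):
    """Advance the plant one step and emit the (possibly attacked) measurement."""
    w = rng.normal(0.0, np.sqrt(Q_var), size=(2, 1))
    v = rng.normal(0.0, np.sqrt(R_var), size=(1, 1))
    x_next = A @ x + B * u + E * d + w
    y = C @ x + F * a + v
    return x_next, y

x = np.zeros((2, 1))
for k in range(50):
    x, y = step(x, u=1.0)            # constant input, no disturbance or attack
```

Setting `d` or `a` nonzero in `step` injects a disturbance or an attack through the $E$ and $F$ channels, respectively.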
For complex deployments, the system can be modeled as multiple interconnected subsystems:
$$x_{k+1}^{(i)} = A^{(i)} x_k^{(i)} + \sum_{j \in \mathcal{N}_i} A^{(ij)} x_k^{(j)} + B^{(i)} u_k^{(i)} + w_k^{(i)}$$
Here, $\mathcal{N}_i$ denotes the set of subsystems interacting with subsystem $i$, and $A^{(ij)}$ captures coupling dynamics. In matrix notation, this yields a block-sparse system structure, which has implications for observability and decentralized detection.

3.3. State Observability and Sensor Configuration

The ability to reconstruct $x_k$ from outputs $\{y_i\}_{i=0}^{k}$ is essential for anomaly detection. Observability is defined through the rank condition of the observability matrix:
$$\mathcal{O} = \begin{bmatrix} C \\ CA \\ CA^2 \\ \vdots \\ CA^{n-1} \end{bmatrix}$$
If $\operatorname{rank}(\mathcal{O}) = n$, the system is observable. Otherwise, some internal modes are hidden from measurement and can be exploited by an attacker without being detected. In practice, observability is not guaranteed globally but can be enhanced via sensor placement strategies and input design.
For systems with limited sensors, reduced-order or sliding mode observers may be used. However, in this study we assume full observability and design a standard full-order Luenberger observer for its simplicity and well-understood convergence properties.
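The rank condition is straightforward to check numerically; the sketch below uses an assumed example pair $(A, C)$, not the paper's plant.

```python
import numpy as np

# Rank test of O = [C; CA; ...; CA^{n-1}] for an assumed 2-state pair (A, C).
A = np.array([[0.95, 0.10],
              [0.00, 0.90]])
C = np.array([[1.0, 0.0]])
n = A.shape[0]

# Stack C, CA, ..., CA^{n-1} and check full column rank.
O = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(n)])
observable = (np.linalg.matrix_rank(O) == n)
print(observable)   # True: the second state is seen through the 0.10 coupling
```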

3.4. Design of Secure State Observers

To detect deviations from expected behavior, a Luenberger observer is employed:
$$\hat{x}_{k+1} = A \hat{x}_k + B u_k + L (y_k - \hat{y}_k), \qquad \hat{y}_k = C \hat{x}_k$$
The observer corrects its internal prediction using the innovation term $y_k - \hat{y}_k$. The gain $L$ determines the rate and stability of error convergence.
The estimation error $e_k = x_k - \hat{x}_k$ evolves as:
$$e_{k+1} = (A - LC) e_k + E d_k + w_k - L v_k - L F a_k$$
In the absence of disturbances and attacks, the error dynamics reduce to $e_{k+1} = (A - LC) e_k$, and stability is achieved when all eigenvalues of $(A - LC)$ lie strictly within the unit circle in the complex plane.
In practical conditions, bounded disturbances cause the error to fluctuate within a noise-correlated envelope. The key insight is that under normal operation, $r_k = C e_k + v_k$ remains zero-mean and bounded, whereas in the presence of attacks the $LFa_k$ term drives $e_k$, and hence $r_k$, into a statistically distinct regime.

Robust Observer Design Considerations

Robust observer design involves selecting $L$ not only to stabilize the error dynamics but also to minimize the sensitivity to noise while maximizing attack detectability. Techniques from $H_\infty$ estimation, LMI optimization, and fault-tolerant observer theory can be used to formulate this tradeoff. For instance, one may minimize the worst-case gain from disturbance input $d_k$ to residual output $r_k$:
$$\min_L \; \sup_{d_k \neq 0} \frac{\|r_k\|}{\|d_k\|}$$
Such robust formulations offer better real-world performance, especially in systems with poor SNR or adversarial environments.
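One concrete way to place the eigenvalues of $(A - LC)$, short of a full LMI design, is pole placement on the dual system; the sketch below uses assumed matrices and poles and is not the paper's design procedure.

```python
import numpy as np
from scipy.signal import place_poles

# Observer gain via duality: eig(A - L C) = eig(A^T - C^T L^T), so state-feedback
# pole placement on (A^T, C^T) yields L. Matrices and poles are assumed values.
A = np.array([[0.95, 0.10],
              [0.00, 0.90]])
C = np.array([[1.0, 0.0]])
desired_poles = [0.5, 0.4]                 # strictly inside the unit circle
L = place_poles(A.T, C.T, desired_poles).gain_matrix.T

closed_loop_eigs = np.linalg.eigvals(A - L @ C)
print(np.sort(closed_loop_eigs.real))      # approximately [0.4, 0.5]
```

Faster poles (closer to the origin) speed up convergence but amplify measurement noise in the residual, which is exactly the tradeoff the robust formulation above makes explicit.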

3.5. Modeling Replay and Other Attack Types

We focus primarily on replay attacks, but the framework generalizes to several canonical ICS attack models:
  • Replay Attack: $y_k^a = y_{k-T_r}$; historical data are re-sent to hide deviations.
  • Bias Injection: $y_k^a = y_k + b$; the attacker adds a persistent offset to mislead the controller.
  • DoS Attack: $y_k^a$ is missing; the communication channel is jammed or blocked.
  • False Data Injection: $y_k^a = y_k + f_k$; dynamic corruption injected via a compromised sensor.
The proposed observer-based mechanism can detect all of these, provided that the corruption leads to an observable deviation in $r_k$.
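These attack models can be phrased as simple transformations of a recorded measurement stream. The helper functions below are our own illustrative sketch, not part of the paper's implementation.

```python
import numpy as np

# Attack models as transformations of a recorded measurement stream y[0..N].
def replay(y, k, T_r):
    """Replay: y_k^a = y_{k - T_r} once enough history has been captured."""
    return y[k - T_r] if k >= T_r else y[k]

def bias_injection(y, k, b):
    """Bias injection: y_k^a = y_k + b (persistent offset)."""
    return y[k] + b

def false_data_injection(y, k, f_k):
    """False data injection: y_k^a = y_k + f_k (time-varying corruption)."""
    return y[k] + f_k

y = np.linspace(0.0, 1.0, 11)          # dummy measurement record
print(replay(y, 8, T_r=5))             # resends y[3]
```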

3.6. Residual Generation and Detection Algorithm

The residual signal is defined as:
$$r_k = y_k - C \hat{x}_k = C e_k + v_k + F a_k$$
Under attack-free conditions:
$$\mathbb{E}[r_k] = 0, \qquad \operatorname{Var}(r_k) = C \Sigma_e C^T + R$$
where $\Sigma_e$ is the steady-state covariance of $e_k$. This provides residual-based detection tests for deviations in mean or variance from these nominal statistics.
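The steady-state covariance $\Sigma_e$ follows from the discrete Lyapunov equation of the error dynamics and can be computed numerically; the system and gain below are assumed values for the sketch.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Solve Sigma = (A - LC) Sigma (A - LC)^T + Q + L R L^T for an assumed system,
# then form the nominal residual covariance C Sigma C^T + R.
A = np.array([[0.95, 0.10],
              [0.00, 0.90]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5], [0.2]])               # assumed stabilizing gain
Q = 1e-4 * np.eye(2)                       # process-noise covariance
R = 1e-4 * np.eye(1)                       # measurement-noise covariance

Acl = A - L @ C                            # must be Schur stable
Sigma_e = solve_discrete_lyapunov(Acl, Q + L @ R @ L.T)
Var_r = C @ Sigma_e @ C.T + R              # nominal Var(r_k)
```

`Var_r` is precisely the quantity from which detection thresholds on the residual are calibrated.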
The complete detection logic is summarized in Algorithm 1. The algorithm operates in a real-time loop for each time step k. It begins by predicting the next system state based on the current estimate and control input (Line 3). It then calculates the residual by comparing the actual sensor measurement y k with the predicted output y ^ k (Line 5). This residual is used to correct the state estimate (Line 6). Finally, a two-level check is performed: the instantaneous residual | r k | is compared against a static threshold τ , and the cumulative sum of residuals, S k , is compared against a dynamic threshold H. An alarm is triggered if either threshold is breached (Lines 8–10), allowing the system to detect both large, abrupt attacks and small, persistent ones.
Algorithm 1 Observer with Real-Time Residual Monitoring
Require: $A, B, C, L, \tau, H, \omega$, initial estimate $\hat{x}_0$
Ensure: Residual signal $r_k$, anomaly alert
  1: Initialize: $\hat{x}_0$, $S_0 \leftarrow 0$
  2: for $k = 0$ to $N$ do
  3:    Predict: $\hat{x}_{k+1} \leftarrow A \hat{x}_k + B u_k$
  4:    Output estimate: $\hat{y}_k \leftarrow C \hat{x}_k$
  5:    Residual: $r_k \leftarrow y_k - \hat{y}_k$
  6:    Correction: $\hat{x}_{k+1} \leftarrow \hat{x}_{k+1} + L r_k$
  7:    CUSUM update: $S_k \leftarrow \max(0, S_{k-1} + |r_k| - \omega)$
  8:    if $|r_k| > \tau$ or $S_k > H$ then
  9:       Raise alert: Anomaly detected
 10:    else
 11:       Continue monitoring
 12:    end if
 13: end for
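Algorithm 1 maps directly onto a short monitoring routine. In the sketch below, the scalar plant, noise levels, thresholds, and the bias attack injected at step 60 are all assumed values chosen only to exercise the two checks; they are not the paper's simulation parameters.

```python
import numpy as np

def observer_monitor(A, B, C, L, y_seq, u_seq, tau, H, omega, x0):
    """Algorithm 1: predict, form residual, correct, then apply the
    static-threshold and CUSUM checks; returns alarm time steps."""
    x_hat = x0.copy().astype(float)
    S, alarms = 0.0, []
    for k, (y, u) in enumerate(zip(y_seq, u_seq)):
        y_hat = (C @ x_hat).item()             # output estimate
        r = y - y_hat                          # residual r_k
        x_hat = A @ x_hat + B * u + L * r      # predict + correct
        S = max(0.0, S + abs(r) - omega)       # CUSUM update
        if abs(r) > tau or S > H:
            alarms.append(k)                   # anomaly detected
    return alarms

# Assumed scalar plant; a bias of 0.5 is injected into y from step 60 onward.
rng = np.random.default_rng(1)
A = np.array([[0.9]]); B = np.array([[1.0]])
C = np.array([[1.0]]); L = np.array([[0.5]])
x = np.zeros((1, 1)); ys, us = [], []
for k in range(100):
    y = (C @ x).item() + rng.normal(0.0, 0.01) + (0.5 if k >= 60 else 0.0)
    ys.append(y); us.append(0.1)
    x = A @ x + B * 0.1 + rng.normal(0.0, 0.01, size=(1, 1))

alarms = observer_monitor(A, B, C, L, ys, us, tau=0.1, H=0.3, omega=0.05,
                          x0=np.zeros((1, 1)))
print(alarms[0])   # 60: first alarm at attack onset, none before
```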

4. Theoretical Analysis and Performance Evaluation

4.1. Overview and Motivation

Robust detection of cyber-attacks in Industrial Control Systems (ICSs) hinges on a rigorous understanding of the theoretical behavior of the system under both nominal and adversarial conditions. This chapter presents an in-depth analysis of the state estimation error dynamics, residual behavior, and attack detectability for the observer-based framework introduced previously. We systematically analyze the stability of the error dynamics, the statistical properties of the residual signal, and the response of the system under various attack scenarios. Furthermore, we evaluate the detection performance using statistical metrics such as false alarm rate, detection delay, and missed detection probability, and highlight how different system parameters influence detection sensitivity and robustness.

4.2. Estimation Error Dynamics and Stability Analysis

Consider the estimation error defined as $e_k = x_k - \hat{x}_k$. The evolution of $e_k$ is governed by the recursive difference equation:
$$e_{k+1} = (A - LC) e_k + E d_k + w_k - L v_k - L F a_k$$
In the absence of attacks ($a_k = 0$) and disturbances ($d_k = 0$), the system reduces to:
$$e_{k+1} = (A - LC) e_k + w_k - L v_k$$
We assume that the gain $L$ is chosen such that all eigenvalues of $(A - LC)$ lie strictly inside the unit circle. Then, the homogeneous part of the system is asymptotically stable. The particular solution of the inhomogeneous system remains bounded if the noise terms are bounded in variance.
Define the covariance of the estimation error as:
$$\Sigma_k = \mathbb{E}[e_k e_k^T]$$
Then, under stationary noise conditions, $\Sigma_k$ converges to a steady-state solution $\Sigma$ that satisfies the discrete-time Lyapunov equation:
$$\Sigma = (A - LC)\, \Sigma\, (A - LC)^T + Q + L R L^T$$
This equation establishes the statistical envelope within which the error $e_k$ fluctuates in normal operation. As a consequence, the residual $r_k = C e_k + v_k$ also has zero mean and bounded covariance.

4.3. Residual Analysis Under Normal Conditions

In normal operation, the residual signal reflects the deviation between actual and predicted output:
$$r_k = y_k - C \hat{x}_k = C e_k + v_k$$
Assuming $e_k$ and $v_k$ are uncorrelated Gaussian random variables, the residual $r_k$ is also Gaussian with:
$$\mathbb{E}[r_k] = 0, \qquad \operatorname{Cov}(r_k) = C \Sigma C^T + R$$
This allows one to construct probabilistic bounds such as:
$$P(\|r_k\|_2 > \tau) \le \epsilon$$
where $\tau$ is chosen based on a desired false alarm probability $\epsilon$, assuming Gaussianity.
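For a scalar Gaussian residual, this bound gives an explicit threshold from the Gaussian quantile function; the values of $\sigma$ and $\epsilon$ below are assumed numbers for illustration.

```python
from scipy.stats import norm

# Pick tau so that P(|r_k| > tau) = epsilon for r_k ~ N(0, sigma^2).
sigma = 0.02          # assumed nominal residual standard deviation
epsilon = 0.01        # assumed per-sample false-alarm probability
tau = sigma * norm.ppf(1.0 - epsilon / 2.0)
print(round(tau, 4))  # 0.0515, i.e. roughly 2.58 sigma
```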

4.4. Residual Behavior Under Replay Attacks

During a replay attack, the controller receives outdated measurements:
$$y_k^a = y_{k-T_r} = C x_{k-T_r} + v_{k-T_r}$$
The observer, however, uses the current estimate $\hat{x}_k$, resulting in the residual:
$$r_k^a = y_k^a - C \hat{x}_k = C (x_{k-T_r} - \hat{x}_k) + v_{k-T_r}$$
Assuming the state evolves significantly over $T_r$ time steps, the quantity $x_{k-T_r} - \hat{x}_k$ becomes large, leading to a detectable change in $r_k^a$. Furthermore, the residual now has nonzero mean:
$$\mathbb{E}[r_k^a] = C\left(\mathbb{E}[x_{k-T_r}] - \mathbb{E}[\hat{x}_k]\right) \neq 0$$
This shift from a zero-mean distribution is the statistical basis for detection.
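This mean shift is easy to reproduce in simulation: record a window of clean outputs, replay it while the true state keeps drifting, and compare residual means before and during the replay. The scalar system and all parameters below are assumed toy values.

```python
import numpy as np

# Replay demo on an assumed scalar system: the true state drifts upward while
# a 40-step-old measurement window is replayed, shifting the residual mean.
rng = np.random.default_rng(2)
A, C, L, u = 0.98, 1.0, 0.5, 0.2
x, x_hat = 0.0, 0.0
record, res_normal, res_replay = [], [], []
for k in range(80):
    y_true = C * x + rng.normal(0.0, 0.01)
    record.append(y_true)
    y = record[k - 40] if k >= 40 else y_true    # replay with T_r = 40
    r = y - C * x_hat
    x_hat = A * x_hat + u + L * r                # observer update
    (res_replay if k >= 40 else res_normal).append(r)
    x = A * x + u + rng.normal(0.0, 0.01)        # true state keeps evolving

print(abs(np.mean(res_normal)) < 0.05)   # True: residual near zero-mean
print(abs(np.mean(res_replay)) > 0.10)   # True: clear mean shift under replay
```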

4.5. Detection Algorithms and Performance Metrics

We consider two standard detection schemes:
Static Threshold Detection: Trigger an alarm when $\|r_k\|_2 > \tau$. The threshold $\tau$ is chosen using the distribution of $r_k$ under nominal conditions (e.g., $\tau = 3\sigma$ for Gaussian noise). The False Alarm Rate (FAR) is given by:
$$\mathrm{FAR} = P(\|r_k\| > \tau \mid H_0) = 2\,\Phi\!\left(-\frac{\tau}{\sigma}\right)$$
CUSUM Detection: To detect small but persistent shifts in the mean of the residual signal, which may be missed by simple static thresholding, we employ the Cumulative Sum (CUSUM) algorithm. CUSUM is a sequential analysis technique that accumulates deviations from a target value over time. It is particularly effective for identifying gradually growing or stealthy attacks. The CUSUM statistic, S k , is defined as:
S k = max ( 0 , S k 1 + | r k | ω )
hlAn alarm is triggered if S k exceeds a predefined threshold H. The parameter ω (drift) is a slack parameter, typically chosen based on the expected noise level, which allows the algorithm to ignore minor fluctuations while remaining sensitive to true changes in the mean.
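As an illustration, the following Python sketch implements the CUSUM recursion above on synthetic residuals; the noise level, shift size, drift $\omega$, and threshold $H$ are illustrative choices, not values from the paper:

```python
import numpy as np

def cusum_detect(residuals, omega, H):
    """One-sided CUSUM on |r_k|: S_k = max(0, S_{k-1} + |r_k| - omega).
    Returns the CUSUM trajectory and the index of the first alarm (or None)."""
    S, traj, alarm = 0.0, [], None
    for k, r in enumerate(residuals):
        S = max(0.0, S + abs(r) - omega)
        traj.append(S)
        if alarm is None and S > H:
            alarm = k
    return np.array(traj), alarm

# Synthetic residuals: noise only, then a small persistent mean shift at k = 50.
# Most post-attack samples stay below a 3-sigma static bound (0.3 here),
# but CUSUM accumulates the drift and eventually raises an alarm.
rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.1, 100)
r[50:] += 0.15
traj, alarm = cusum_detect(r, omega=0.1, H=1.0)
print("first CUSUM alarm at k =", alarm)
```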

4.6. Detection Delay and Sensitivity Analysis

Define the detection delay D as the expected number of samples between the start of an attack and its detection. In CUSUM detection, the Average Detection Delay (ADD) under an attack with residual mean shift μ a is:
$$ \mathrm{ADD} \approx \frac{H}{\mu_a - \omega} $$
Hence, attacks that induce a small μ a are harder to detect and require longer monitoring windows. The tradeoff between detection sensitivity and false alarms is governed by the gap between μ a and ω .
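This approximation can be checked empirically. The Monte Carlo sketch below (the mean shift $\mu_a$, drift $\omega$, threshold $H$, and residual noise level are illustrative assumptions) averages the first-passage time of the CUSUM statistic under a persistent attack:

```python
import numpy as np

# Monte Carlo check of ADD ~ H / (mu_a - omega) for a persistent mean shift.
rng = np.random.default_rng(1)
H, omega, mu_a, sigma = 2.0, 0.1, 0.5, 0.05

delays = []
for _ in range(500):
    S, k = 0.0, 0
    while S <= H:                     # attack active from k = 0
        rk = rng.normal(mu_a, sigma)  # residual with shifted mean
        S = max(0.0, S + abs(rk) - omega)
        k += 1
    delays.append(k)

print(f"empirical ADD = {np.mean(delays):.2f}, "
      f"formula H/(mu_a - omega) = {H / (mu_a - omega):.2f}")
```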

4.7. Robustness Considerations

In practice, model uncertainties, time delays, and noise mismatch can impair detection reliability. The following factors affect detection performance:
  • Model mismatch: Incorrect matrices A, B, or C distort observer predictions.
  • Parameter tuning: Thresholds τ and H and drift term ω must be calibrated based on historical data.
  • Sensor placement: Redundant and strategically located sensors improve observability and residual quality.
  • Correlated noise: Assumptions of Gaussian i.i.d. noise may not hold in practice.
Techniques such as adaptive thresholds, robust observers, and Kalman filtering with covariance adaptation can be used to mitigate these issues.
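As one example of these mitigations, an adaptive threshold can track slow changes in the residual statistics while freezing its estimate when a sample is flagged, so that an attack does not inflate the bound. The sketch below is illustrative only; the function name, smoothing factor, and warmup length are assumptions, not part of the paper's method:

```python
import numpy as np

def adaptive_threshold(residuals, alpha=0.02, n_sigma=3.0, warmup=20):
    """Floating n-sigma bound from an EWMA estimate of the residual variance.
    A sketch of the adaptive-threshold idea; alpha and warmup are assumptions."""
    var = float(np.var(residuals[:warmup]))   # seed the variance estimate
    taus, flags = [], []
    for k, r in enumerate(residuals):
        tau = n_sigma * np.sqrt(var)
        taus.append(tau)
        flagged = k >= warmup and abs(r) > tau
        flags.append(flagged)
        if not flagged:                       # adapt only on non-anomalous samples
            var = (1 - alpha) * var + alpha * r * r
    return np.array(taus), flags

rng = np.random.default_rng(2)
r = rng.normal(0.0, 0.1, 200)
r[120:] += 1.0                                # abrupt sensor anomaly at k = 120
taus, flags = adaptive_threshold(r)
print("flagged at k = 120:", flags[120])
```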

5. Simulation Verification

To empirically validate the proposed defense mechanism and rigorously assess its performance, a comprehensive simulation was conducted using MATLAB/Simulink. The experiment is designed to evaluate the Luenberger observer’s ability to identify common data integrity attacks within an Industrial Control System (ICS) environment, analyzing its detection delay, robustness, and limitations compared to an optimal state estimator.

5.1. Simulation Setup

The simulation environment was configured to closely mirror a realistic control process under attack.
  • System and Observer Initialization: The simulation was run for 100 discrete time steps. The initial state of the system was set to $x_0 = [0, 0]^T$, and the observer's initial estimate was also initialized to $\hat{x}_0 = [0, 0]^T$ to simulate a scenario where the observer starts with perfect knowledge, isolating the effects of subsequent attacks. A constant control input $u_k = 0.5$ was applied throughout the simulation to ensure persistent system excitation.
  • Noise and Process Parameters: The system dynamics were subject to zero-mean Gaussian process noise $w_k$ and measurement noise $v_k$, with covariance matrices $Q = 0.001 \cdot I$ and $R = 0.01 \cdot I$, respectively. These values represent typical levels of process disturbances and sensor inaccuracies in industrial settings.
  • Detection Threshold: Anomaly detection relies on a residual-based method. An attack is flagged if the absolute magnitude of the residual, $|r_k|$, exceeds a predefined threshold $\tau$. To balance detection sensitivity with a low false alarm rate, the threshold was determined from the statistical properties of the residual during normal (attack-free) operation and set to $\tau = 3\sigma_r$, where $\sigma_r$ is the standard deviation of the residual under noise, effectively implementing a 3-sigma rule. This keeps the probability of a false positive in any given time step below 0.3%.

5.2. System Simulation and Defense Model

The industrial process is modeled as a discrete-time Linear Time-Invariant (LTI) system, described by the state-space equations:
$$ x_{k+1} = Ax_k + Bu_k + w_k, \qquad y_k = Cx_k + v_k $$
where x k R 2 is the system state, u k R is the control input, and y k R is the sensor measurement. The system parameters were defined as:
$$ A = \begin{bmatrix} 1.0 & 0.1 \\ 0 & 1.0 \end{bmatrix}, \quad B = \begin{bmatrix} 0 \\ 0.1 \end{bmatrix}, \quad C = \begin{bmatrix} 1 & 0 \end{bmatrix} $$
The primary defense mechanism is a Luenberger observer, designed with a fixed gain of $L = [0.8, 0.2]^T$. The observer detects anomalies by analyzing the residual signal $r_k = y_k - \hat{y}_k$, where $\hat{y}_k = C\hat{x}_k$ is the estimated output. The key characteristic of this observer is its static nature; its gain does not change in response to measurement data. For comparison, a standard Kalman filter was also simulated. As an optimal state estimator, the Kalman filter dynamically adjusts its gain at each time step to minimize the estimated state error covariance. This fundamental difference in design philosophy directly impacts their performance as attack detectors.
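To make the setup concrete, a minimal Python re-implementation of the nominal (attack-free) loop is sketched below. It mirrors the MATLAB/Simulink configuration described above, using the same matrices, observer gain, and noise levels; the random seed is an arbitrary choice:

```python
import numpy as np

# Nominal run of the plant and Luenberger observer from Section 5.2.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.8], [0.2]])
u, Q, R = 0.5, 0.001, 0.01
rng = np.random.default_rng(42)

x = np.zeros((2, 1))       # true state x_k
xhat = np.zeros((2, 1))    # observer estimate x_hat_k
residuals = []
for k in range(100):
    y = C @ x + rng.normal(0.0, np.sqrt(R))        # noisy measurement y_k
    r = (y - C @ xhat).item()                      # residual r_k = y_k - C x_hat_k
    residuals.append(r)
    xhat = A @ xhat + B * u + L * r                # observer: prediction + correction
    x = A @ x + B * u + rng.normal(0.0, np.sqrt(Q), size=(2, 1))

print(f"mean |r_k| over the run = {np.mean(np.abs(residuals)):.4f}")
```

Under these nominal conditions the residual stays at the noise level, consistent with the statistical envelope derived in Section 4.3.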

5.3. Attack Scenarios and Detection Analysis

To comprehensively evaluate the observer’s performance, three distinct data integrity attacks were simulated, each launched between time steps k = 40 and k = 60 .

5.3.1. False Data Injection (FDI) Attack

In this scenario, the attacker injects a constant malicious offset, $f_k = 2.0$, into the sensor measurements, making the compromised output $y_k^a = y_k + f_k$. This attack directly corrupts the data received by the controller and observer.
Results and Analysis: As shown in Figure 1 and summarized in Table 1, the Luenberger observer’s residual exhibits an immediate and sharp deviation from zero the moment the FDI attack begins at k = 40. The attack is successfully detected, crossing the detection threshold τ in the very first step of the attack (Detection Delay = 1). Because the observer gain L is fixed, it continuously expects measurements consistent with the original system model, treating the injected offset as a persistent anomaly. This results in a sustained alarm throughout the attack period. In contrast, while the Kalman filter’s residual initially spikes, it quickly diminishes. This occurs because the Kalman filter, designed for optimal estimation in the presence of noise, adapts its state estimate to what it perceives as new valid measurements. It effectively “learns” the attack’s offset, causing its residual to fall back within the normal bounds, thereby failing to provide a continuous alarm.
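The onset of this scenario can be reproduced with a short Python sketch (the threshold value approximates the 3-sigma rule of Section 5.1; exact residual trajectories depend on the noise realization):

```python
import numpy as np

# FDI sketch: a constant offset f = 2.0 is added to the sensor during
# k in [40, 60). tau = 0.4 approximates the 3-sigma threshold.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.8], [0.2]])
u, tau, f = 0.5, 0.4, 2.0
rng = np.random.default_rng(7)

x = np.zeros((2, 1)); xhat = np.zeros((2, 1))
alarms = []
for k in range(100):
    y = (C @ x).item() + rng.normal(0.0, 0.1)      # true measurement
    if 40 <= k < 60:
        y += f                                     # attacker's constant offset
    r = y - (C @ xhat).item()
    alarms.append(abs(r) > tau)
    xhat = A @ xhat + B * u + L * r
    x = A @ x + B * u + rng.normal(0.0, np.sqrt(0.001), size=(2, 1))

print("alarm raised at attack onset (k = 40):", alarms[40])
```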

5.3.2. Replay Attack

The replay attack involves recording legitimate sensor data and re-transmitting it later. In the simulation, the current sensor output $y_k$ is replaced by a past measurement, $y_{k-T_r}$, where the delay $T_r$ was set to 20 steps. This attack is stealthy because the replayed values are individually valid, just temporally incorrect.
Results and Analysis: The simulation results in Figure 2, and quantified in Table 1, highlight the observer’s effectiveness against replay attacks. Although the replayed data point y k 20 was valid when recorded, the system’s state has evolved due to the control input u k and internal dynamics. Consequently, the stale measurement does not align with the observer’s predicted output y ^ k , which is based on the estimated current state. This temporal inconsistency causes the residual r k to grow significantly, quickly crossing the detection threshold. The state estimate from the Luenberger observer visibly diverges from the true state, demonstrating that any attack breaking the temporal consistency of the system dynamics can be reliably detected.
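The temporal inconsistency described above is easy to reproduce: because the state grows under the constant input, a 20-step-old measurement is far from the observer's current prediction. The sketch below (threshold and seed are illustrative) records the residual at the replay onset:

```python
import numpy as np

# Replay sketch: from k = 40 the observer receives y_{k-20} instead of y_k.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.8], [0.2]])
u, tau, Tr = 0.5, 0.4, 20
rng = np.random.default_rng(11)

x = np.zeros((2, 1)); xhat = np.zeros((2, 1))
y_hist, onset_residual = [], None
for k in range(100):
    y = (C @ x).item() + rng.normal(0.0, 0.1)
    y_hist.append(y)
    y_recv = y_hist[k - Tr] if 40 <= k < 60 else y   # replayed stale data
    r = y_recv - (C @ xhat).item()
    if k == 40:
        onset_residual = abs(r)
    xhat = A @ xhat + B * u + L * r
    x = A @ x + B * u + rng.normal(0.0, np.sqrt(0.001), size=(2, 1))

print(f"|r| at replay onset = {onset_residual:.2f} (threshold {tau})")
```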

5.3.3. Covert Attack

The covert attack is a sophisticated scenario where the attacker manipulates both the control input u k and the sensor output y k simultaneously. The goal is to alter the system’s physical state while ensuring the manipulated output y k a appears consistent with the manipulated input u k a , thereby keeping the observer’s residual low.
Results and Analysis: Figure 3 illustrates the challenge posed by covert attacks, with the outcome noted in Table 1. By design, the attacker crafts the sensor signal to perfectly match the expected output of the compromised system dynamics. As a result, the residuals of both the Luenberger observer and the Kalman filter remain close to zero, and the attack goes completely undetected by the residual-based method. Both observers are successfully deceived into tracking a false state that does not correspond to the true physical state of the system, which diverges dangerously due to the hidden manipulation of the control input. This result underscores a fundamental limitation: an observer can only detect attacks that create a mismatch between the received measurements and the mathematical model it relies on.
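The evasion mechanism can be sketched directly: by linearity, an attacker who injects $a_k$ into the actuator and subtracts its modeled effect from the sensor leaves the residual statistically unchanged while the true state drifts. The attacker model below is an illustrative construction consistent with the scenario description, not the paper's exact attack signal:

```python
import numpy as np

# Covert-attack sketch: input injection a_k plus sensor compensation C @ xa.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.8], [0.2]])
u = 0.5
rng = np.random.default_rng(3)

x = np.zeros((2, 1)); xhat = np.zeros((2, 1))
xa = np.zeros((2, 1))                  # attacker's model of the injected deviation
max_res = 0.0
for k in range(100):
    a = 1.0 if 40 <= k < 60 else 0.0   # covert actuator injection
    y = (C @ x).item() + rng.normal(0.0, 0.1)
    y_a = y - (C @ xa).item()          # hide the injection from the observer
    r = y_a - (C @ xhat).item()
    max_res = max(max_res, abs(r))
    xhat = A @ xhat + B * u + L * r
    x = A @ x + B * (u + a) + rng.normal(0.0, np.sqrt(0.001), size=(2, 1))
    xa = A @ xa + B * a                # propagate the deviation model

print(f"max |r| = {max_res:.3f}; hidden state drift xa[0] = {xa[0, 0]:.2f}")
```

The residual never leaves the noise envelope even though the hidden drift grows, matching the undetectability result discussed in Section 5.4.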

5.4. Overall Performance and Discussion

The simulation study validates that the Luenberger observer is a lightweight and effective mechanism for detecting data integrity attacks like FDI and replay in industrial control systems. The superior performance against these attacks compared to the standard Kalman filter is a key finding. As discussed in Section 2.2, adaptive estimators like the Kalman filter can be deceived by persistent attacks because they treat the malicious data as a new normal, adjusting their state estimate accordingly. Our results in Figure 1 provide clear empirical evidence of this vulnerability. In contrast, the fixed-gain Luenberger observer continuously generates a large residual, demonstrating its robustness for security monitoring where the goal is to detect deviations from a fixed, trusted model, not adapt to them.
However, the experiment also highlights a critical limitation shared by many model-based detectors: its vulnerability to well-designed covert attacks (Figure 3). This finding is consistent with research on undetectable attack spaces [13], which shows that any observer-based method can be bypassed if the attacker has sufficient knowledge of the system model to manipulate both sensor and actuator signals simultaneously. This underscores that our proposed method is not a panacea but should be considered a foundational layer in a defense-in-depth strategy.
The choice of observer gain $L$ and detection threshold $\tau$ also presents a fundamental trade-off. A higher gain leads to faster error convergence and quicker detection but increases sensitivity to measurement noise, potentially raising the false alarm rate. Conversely, a lower gain provides smoother estimation but may result in a longer detection delay. These parameters must be carefully calibrated for the specific process to balance detection sensitivity and operational stability.

6. Conclusions

This paper proposed and validated an endogenous security defense mechanism for industrial control systems based on a Luenberger observer and residual analysis. By leveraging a mathematical model of the system’s physical dynamics, the method provides a lightweight and real-time approach to detecting data integrity attacks. The theoretical analysis and simulation results confirmed that replay attacks induce a significant, detectable deviation in the observer’s residual signal, allowing for detection with minimal delay and high accuracy. This model-based approach does not rely on attack signatures and is computationally efficient, making it well-suited for deployment on resource-constrained devices within an ICS architecture.

Limitations and Future Work

While this foundational study demonstrates the viability of using a non-adaptive observer for security, we acknowledge several limitations that frame important directions for future research.
First, the proposed method is, by design, vulnerable to sophisticated covert attacks, as demonstrated in our simulations. This is a fundamental limitation of any single observer-based approach. Future work should focus on integrating this lightweight detector into a multi-layered defense strategy, potentially combined with methods like moving target defense or cryptographic verification to address such advanced threats.
Second, this study relied on several simplifying assumptions, namely that the system is linear and fully observable, and that observer gains and detection thresholds were manually tuned. These choices were made to establish a clear and tractable proof-of-concept. A critical next step is to extend this approach to nonlinear systems using techniques like extended or unscented observers. Furthermore, robust and optimal design methods, such as LMI optimization or H-infinity techniques, should be investigated to systematically tune the observer gain for improved performance and disturbance rejection.
Finally, the validation was conducted entirely within a MATLAB simulation environment. While this provides a controlled and reproducible setting, the ultimate test requires real-world evaluation. Therefore, future work must include Hardware-in-the-Loop (HIL) testing and validation on a physical ICS testbed to assess the method’s performance and robustness under practical operational conditions.
In addition, a comprehensive quantitative comparison with other state-of-the-art methods on large-scale datasets remains to be conducted. This would involve evaluating performance using standard metrics such as detection rate, accuracy, precision, and recall to rigorously benchmark our approach against established techniques.

Author Contributions

Conceptualization, L.G.; methodology, L.G.; validation, C.T. and P.C.; formal analysis, C.T.; investigation, L.G.; resources, L.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key R&D Program of China (2022YFB3104300).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Requests to access the datasets should be directed to 24210240304@m.fudan.edu.cn.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Drias, Z.; Serhrouchni, A.; Vogel, O. Analysis of cyber security for industrial control systems. In Proceedings of the 2015 International Conference on Cyber Security of Smart Cities, Industrial Control System and Communications (SSIC), Shanghai, China, 5–7 August 2015; pp. 1–8. [Google Scholar]
  2. Stouffer, K.; Falco, J.; Scarfone, K. Guide to industrial control systems (ICS) security. Nist Spec. Publ. 2011, 800, 16. [Google Scholar]
  3. Cárdenas, A.A.; Amin, S.; Sastry, S. Research challenges for the security of control systems. HotSec 2008, 5, 1158. [Google Scholar]
  4. Koay, A.M.; Ko, R.K.L.; Hettema, H.; Radke, K. Machine learning in industrial control system (ICS) security: Current landscape, opportunities and challenges. J. Intell. Inf. Syst. 2023, 60, 377–405. [Google Scholar] [CrossRef]
  5. Moreno Escobar, J.J.; Morales Matamoros, O.; Tejeida Padilla, R.; Lina Reyes, I.; Quintana Espinosa, H. A comprehensive review on smart grids: Challenges and opportunities. Sensors 2021, 21, 6978. [Google Scholar] [CrossRef]
  6. Langner, R. Stuxnet: Dissecting a cyberwarfare weapon. IEEE Secur. Priv. Mag. 2011, 9, 49–51. [Google Scholar] [CrossRef]
  7. Yu, Y.; Yang, W.; Ding, W.; Zhou, J. Reinforcement learning solution for cyber-physical systems security against replay attacks. IEEE Trans. Inf. Forensics Secur. 2023, 18, 2583–2595. [Google Scholar] [CrossRef]
  8. Tian, Y.; Nogales, A.F.R. A Survey on Data Integrity Attacks and DDoS Attacks in Cloud Computing. In Proceedings of the 2023 IEEE 13th Annual Computing and Communication Workshop and Conference (CCWC), Las Vegas, NV, USA, 8–11 March 2023; pp. 788–794. [Google Scholar]
  9. Asiri, M.; Saxena, N.; Gjomemo, R.; Burnap, P. Understanding indicators of compromise against cyber-attacks in industrial control systems: A security perspective. ACM Trans.-Cyber-Phys. Syst. 2023, 7, 1–33. [Google Scholar] [CrossRef]
  10. Alsabbagh, W.; Langendörfer, P. Security of programmable logic controllers and related systems: Today and Tomorrow. IEEE Open J. Ind. Electron. Soc. 2023, 4, 659–693. [Google Scholar] [CrossRef]
  11. Iyer, K.I. From Signatures to Behavior: Evolving Strategies for Next-Generation Intrusion Detection. Eur. J. Adv. Eng. Technol. 2021, 8, 165–171. [Google Scholar]
  12. Wu, J. Cyberspace endogenous safety and security. Engineering 2022, 15, 179–185. [Google Scholar] [CrossRef]
  13. Pasqualetti, F.; Dörfler, F.; Bullo, F. Attack detection and identification in cyber-physical systems. IEEE Trans. Autom. Control. 2013, 58, 2715–2729. [Google Scholar] [CrossRef]
  14. Luenberger, D.G. Observing the state of a linear system. IEEE Trans. Mil. Electron. 1964, 8, 74–80. [Google Scholar] [CrossRef]
  15. Niazi, M.U.B.; Cao, J.; Sun, X.; Das, A.; Johansson, K.H. Learning-based design of Luenberger observers for autonomous nonlinear systems. arXiv 2022, arXiv:2210.01476. [Google Scholar]
  16. Specht, F.; Otto, J. Efficient Machine Learning-Based Security Monitoring and Cyberattack Classification of Encrypted Network Traffic in Industrial Control Systems. In Proceedings of the 2024 IEEE 29th International Conference on Emerging Technologies and Factory Automation (ETFA), Padova, Italy, 10–13 September 2024; pp. 1–7. [Google Scholar]
  17. Uysal, E.; Kaya, O.; Ephremides, A.; Gross, J.; Codreanu, M.; Popovski, P.; Assaad, M.; Liva, G.; Munari, A.; Soret, B.; et al. Semantic communications in networked systems: A data significance perspective. IEEE Netw. 2022, 36, 233–240. [Google Scholar] [CrossRef]
  18. Al-Muntaser, B.; Mohamed, M.A.; Tuama, A.Y. Real-Time Intrusion Detection of Insider Threats in Industrial Control System Workstations Through File Integrity Monitoring. Int. J. Adv. Comput. Sci. Appl. 2023, 14, 326–333. [Google Scholar] [CrossRef]
  19. Çelebi, M.; Özbilen, A.; Yavanoğlu, U. A comprehensive survey on deep packet inspection for advanced network traffic analysis: Issues and challenges. Niğde Ömer Halisdemir Üniv. Mühendis. Bilim. Derg. 2023, 12, 1–29. [Google Scholar] [CrossRef]
  20. Guan, P.; Iqbal, N.; Davenport, M.A.; Masood, M. Solving inverse problems with model mismatch using untrained neural networks within model-based architectures. arXiv 2024, arXiv:2403.04847. [Google Scholar] [CrossRef]
  21. Yan, J.; Tang, Y.; Tang, B.; He, H.; Sun, Y. Power grid resilience against false data injection attacks. In Proceedings of the 2016 IEEE Power and Energy Society General Meeting (PESGM), Boston, MA, USA, 17–21 July 2016; pp. 1–5. [Google Scholar]
  22. Kim, M.; Jeon, S.; Cho, J.; Gong, S. Data-Driven ICS Network Simulation for Synthetic Data Generation. Electronics 2024, 13, 1920. [Google Scholar] [CrossRef]
  23. Ali, B.S.; Ullah, I.; Al Shloul, T.; Khan, I.A.; Khan, I.; Ghadi, Y.Y.; Abdusalomov, A.; Nasimov, R.; Ouahada, K.; Hamam, H. ICS-IDS: Application of big data analysis in AI-based intrusion detection systems to identify cyberattacks in ICS networks. J. Supercomput. 2024, 80, 7876–7905. [Google Scholar] [CrossRef]
  24. Wang, J.; Zhang, W.; Shi, Y.; Duan, S.; Liu, J. Industrial big data analytics: Challenges, methodologies, and applications. arXiv 2018, arXiv:1807.01016. [Google Scholar] [CrossRef]
  25. Ahmad, S.; Ahmed, H. Robust intrusion detection for resilience enhancement of industrial control systems: An extended state observer approach. IEEE Trans. Ind. Appl. 2023, 59, 7735–7743. [Google Scholar] [CrossRef]
  26. Kordestani, M.; Saif, M. Observer-based attack detection and mitigation for cyberphysical systems: A review. IEEE Syst. Man, Cybern. Mag. 2021, 7, 35–60. [Google Scholar] [CrossRef]
  27. Kim, T.; Shim, H.; Cho, D.D. Distributed Luenberger observer design. In Proceedings of the 2016 IEEE 55th Conference on Decision and Control (CDC), Las Vegas, NV, USA, 12–14 December 2016; pp. 6928–6933. [Google Scholar]
  28. Cong, X.; Zhu, H.; Cui, W.; Zhao, G.; Yu, Z. Critical Observability of Stochastic Discrete Event Systems Under Intermittent Loss of Observations. Mathematics 2025, 13, 1426. [Google Scholar] [CrossRef]
Figure 1. State estimation and residual analysis under a False Data Injection (FDI) attack. The top plot shows the true state versus the Luenberger and Kalman estimates. The bottom plot shows the residuals; the Luenberger residual (blue) shows a clear and sustained violation of the detection threshold (red dashed line), while the Kalman residual (green) quickly returns to normal.
Figure 2. State estimation and residual analysis under a replay attack. The Luenberger observer’s state estimate diverges significantly from the true state during the attack. Its residual (blue, bottom plot) sharply increases and remains above the detection threshold, ensuring reliable detection.
Figure 3. State estimation and residual analysis under a covert attack. Despite the true physical state (black) diverging, both the Luenberger and Kalman estimates track a false state. The corresponding residuals (bottom plot) remain near zero, demonstrating the successful evasion of the detection mechanism.
Table 1. Quantitative performance summary of the Luenberger observer.
Attack Scenario              Detection Outcome    Detection Delay (Time Steps)
False Data Injection (FDI)   Success              1
Replay Attack                Success              1
Covert Attack                Fail                 N/A 1
1 N/A: Not Applicable. The detection delay is not applicable because the covert attack was not detected by the system.
