1. Introduction
Digital forensics refers to the discipline concerned with the identification, preservation, examination, and presentation of digital evidence while ensuring that its integrity is maintained throughout the investigative process. As quantum computing technologies continue to advance, conventional forensic methods may become increasingly vulnerable, highlighting the need to explore quantum-based approaches to digital forensics. Although both traditional and quantum forensic models aim to protect the integrity, authenticity, and confidentiality of digital evidence, the techniques they employ and their resilience to emerging threats differ significantly.
Digital forensics is a specialized area within forensic science dedicated to the identification, collection, examination, and documentation of electronic data. It plays a critical role in modern law enforcement investigations because digital evidence is present in the majority of criminal activities. The primary objective of digital forensics is to retrieve information from digital sources, transform that information into useful investigative insights, and present the results as evidence in legal proceedings. To maintain admissibility in court, all investigative steps must adhere to established and reliable forensic procedures [
1].
As quantum computing continues to progress rapidly, many of the cryptographic techniques currently used in digital forensics face increasing risks from quantum-enabled attacks. Public-key schemes such as RSA rest on factoring assumptions that Shor's algorithm can break efficiently, while symmetric primitives such as AES and SHA-256 see their effective security margins reduced by Grover's algorithm [
2]. In response to these emerging threats, quantum cryptography—particularly technologies such as Quantum Key Distribution (QKD) and Quantum Digital Signatures (QDS)—has been proposed as a potential solution for protecting forensic investigations in the post-quantum era [
3].
Digital forensic science plays an essential role in both law enforcement and cybersecurity by ensuring that electronic evidence remains authentic, protected from tampering, and admissible in legal proceedings [
4]. However, the increasing capabilities of quantum computers raise concerns that existing forensic methods relying on conventional cryptographic protections may no longer remain effective. In this context, quantum entanglement—a fundamental concept in quantum mechanics—presents new opportunities to strengthen evidence authentication and integrity verification within forensic investigations [
5].
Several research organizations and governmental bodies have emphasized the growing importance of Quantum Digital Forensics (QDF). For instance, the National Institute of Standards and Technology (NIST) has highlighted the necessity of adopting post-quantum cryptographic methods to secure digital evidence against emerging cyber threats [
6]. In a similar direction, the European Telecommunications Standards Institute (ETSI) has advocated the development of quantum-resistant cryptographic protocols to ensure the long-term protection of forensic data and records [
7].
The study of quantum phenomena has given rise to a specialized area of physics known as quantum mechanics, which examines the behavior of particles at the subatomic scale. In this framework, physical entities exhibit both wave-like and particle-like characteristics. A central principle of quantum mechanics is that many processes in the universe follow probabilistic rather than deterministic rules. Quantum computing applies these physical principles to perform calculations with remarkable efficiency. By exploiting quantum mechanical properties, these systems can address computational problems that are extremely difficult for classical computers to solve [
8]. Unlike classical machines that process information using binary bits represented as 0 or 1, quantum computers operate with quantum bits, or qubits. Due to the property of superposition, a qubit can represent multiple states simultaneously, enabling new computational capabilities beyond those of traditional computing systems.
The growing sophistication of digital technologies has significantly transformed modern life while simultaneously creating new challenges for information security. As technological systems continue to evolve, the digital environment expands accordingly, requiring the field of digital forensics to adapt and advance in response to these changes.
Many researchers in the digital forensics community anticipate that quantum computing technologies may become widely accessible within the next two decades. As progress in this area continues, the potential for cybercrime may also increase. Detecting and apprehending cybercriminals is already a complex task, and obtaining legal convictions can be even more difficult when standardized digital investigation procedures are lacking. In quantum environments, the act of observing or measuring quantum data can alter its state, making it challenging to determine whether changes in the data are the result of malicious manipulation or simply a consequence of the measurement process itself. Additionally, the capability of quantum computers to compromise widely used encryption methods raises serious concerns regarding the protection and integrity of digital evidence that relies on such techniques for storage or transmission. In response to these developments, several new digital forensic investigation models have recently been proposed to address the challenges introduced by the rapid advancement of quantum computing technologies.
Table 1 presents a comparison between traditional digital forensics and quantum digital forensics with respect to the key security aspects of integrity, authenticity, and confidentiality.
This paper aims to develop a hybrid quantum–classical forensic framework that enhances integrity verification, authenticity validation, and confidentiality protection in digital evidence management. A key focus will be on developing entanglement-assisted hashing and quantum key distribution mechanisms that operate reliably under noisy intermediate-scale quantum (NISQ) conditions. The study will further explore explainable quantum verification models capable of tracing tampering and chain-of-custody operations at the quantum state level. This hybrid approach is particularly advantageous because it leverages quantum parallelism for evidence authentication while relying on classical post-processing to mitigate current hardware limitations and ensure forensic admissibility.
2. Related Work
Quantum hash functions have recently gained attention as potential solutions for strengthening data integrity in the post-quantum computing landscape. Hou et al. proposed a quantum hashing approach based on Controlled Alternate Lively Quantum Walks designed to improve resistance to hash collisions [
9]. Their framework utilized the inherent unpredictability of quantum mechanics to produce secure hash outputs and demonstrated improved resilience against simulated quantum attack models compared with conventional hashing methods. However, the approach also faced challenges, particularly in accurately modeling quantum walk dynamics and ensuring the stability of the algorithm. Although the reported outcomes were promising, the evaluation relied mainly on simulation experiments and did not include implementation on physical quantum hardware. In a related study, Ziatdinov (2016) introduced quantum hashing techniques derived from graph structures and quantum walk mechanisms, emphasizing the distinctive properties of graph-based quantum outputs [
10]. Despite its theoretical contributions, transforming these conceptual designs into efficient and practical quantum algorithms remained challenging, and the work included limited experimental validation.
Furthermore, some existing studies emphasize that classical mechanisms may become increasingly vulnerable in the presence of quantum-capable adversaries, highlighting the need for alternative approaches that ensure evidence integrity, authenticity, and confidentiality in a post-quantum environment. For instance, Nguyen and Costa (2022) discuss how emerging quantum threats necessitate the evolution of forensic practices to maintain evidentiary reliability [
11].
In response to these challenges, research has explored the impact of quantum computation on classical hashing mechanisms. Studies have shown that quantum algorithms can influence collision resistance and structural properties of hash functions, raising concerns about their long-term reliability. Hamlin and Song (2019) analyze the behavior of iterated hash constructions under quantum adversaries [
12], while Garcia-Escartin et al. (2021) demonstrate how quantum collision-finding techniques can weaken traditional hash properties [
13]. In parallel, alternative approaches such as quantum hashing have been proposed, where Ablayev and Ablayev (2014) introduce constructions based on ε-universal hashing to enhance security using quantum principles [
14].
Quantum Digital Signatures (QDS) have become a key element in the development of secure communication systems designed for the post-quantum era. Early work by D. Gottesman and I. Chuang introduced a QDS framework that relied on distributing non-repudiable and unforgeable quantum states through quantum communication channels [
15]. Their study demonstrated that quantum-based signature mechanisms could potentially provide stronger security guarantees than classical digital signature schemes. Nevertheless, the practical realization of such systems remains difficult due to the fragile nature of quantum states and the limitations imposed by the no-cloning theorem. At the time, the requirement for highly reliable quantum channels and the absence of experimental demonstrations represented major challenges. Despite these limitations, their pioneering research established an important foundation for subsequent developments in QDS technologies. Building on this concept, X. Lu and D.-G. Feng proposed a protocol utilizing quantum one-way functions to enhance authentication and support non-repudiation [
16]. Simulation results suggested performance improvements compared with traditional cryptographic approaches, although implementing the protocol on existing quantum hardware continues to be a significant challenge. Although experimental validation has not yet been achieved, their work contributed valuable theoretical progress toward the realization of practical QDS systems.
The importance of entanglement in achieving computational advantages over classical systems has been established by Jozsa and Linden (2003), providing the theoretical foundation for many quantum-based security protocols [
17]. QDS schemes have been developed to provide authentication, non-repudiation, and message integrity. Early experimental work by Clarke et al. (2013) demonstrated the feasibility of QDS using phase-encoded quantum states [
18], while more recent approaches have focused on improving scalability and practicality. For example, Cid et al. (2023) propose a hybrid quantum-assisted signature scheme capable of handling arbitrary message lengths [
19].
Recent experimental advancements have demonstrated the feasibility of deploying quantum authentication protocols in real-world environments. Chapman et al. (2024) implemented quantum digital signatures over an entanglement-based campus network, highlighting the potential for integration into operational systems [
20]. Additionally, high-dimensional quantum signature schemes based on entanglement swapping and super-dense coding have been proposed to enhance both efficiency and security, as demonstrated by Aktaş and Yilmaz (2023) [
21].
Within Quantum Key Distribution (QKD), several protocols have been proposed to strengthen communication security against potential quantum attacks. Ahammed and Kadir introduced a hybrid QKD architecture that combines entanglement assistance with quantum teleportation and employs Orthogonal Frequency Division Multiplexing (OFDM) modulation. Their architecture was evaluated through simulations and compared with the BB84 protocol under various attack conditions [
22]. The proposed system demonstrated strong performance, achieving high secure key rates (SKRs) and maintaining low quantum bit error rates (QBERs), which indicates resilience to both quantum noise and classical interference. However, practical deployment remains challenging due to issues such as the absence of reliable quantum memory as well as the effects of phase noise and decoherence. In another study, Tagliavacche et al. experimentally implemented a frequency-bin entanglement-based QKD scheme incorporating active phase-drift compensation over a 26 km optical fiber link [
23]. By employing wavelength multiplexing together with a specially designed Mach–Zehnder interferometer, the system achieved fidelity levels exceeding 0.975. Nevertheless, the setup faced limitations related to environmental instability, system complexity, and reliance on high-quality photon sources. Earlier theoretical work by Curty, Lewenstein, and Lütkenhaus emphasized the importance of entanglement for achieving unconditional security in QKD systems and introduced witness operators derived from measurable quantum correlations [
24]. Applied to both four-state and six-state protocols, their approach demonstrated the possibility of certifying security even under higher error thresholds. Although their study lacked experimental validation and included only limited modeling of noise effects, it provided a strong theoretical link between entanglement conditions and the security guarantees of quantum communication protocols.
In the context of secure communication, quantum network models have also been explored to improve data transmission reliability and security. Shang et al. (2015) introduced a quantum network coding approach based on teleportation, illustrating how quantum communication techniques can support efficient and secure information exchange [
25].
3. Research Plan and Methodological Overview
This section presents the research plan and the strategy used to answer the central research question:
How can quantum entanglement be utilized to strengthen the integrity, authenticity, and confidentiality of digital forensic evidence?
Unlike prior studies that examined these security properties in isolation, this research proposes a unified forensic framework that integrates integrity, authenticity, and confidentiality within a single experimental architecture. The framework is explicitly evaluated under NISQ noise models, enabling a realistic assessment of quantum forensic reliability.
While the National Institute of Standards and Technology (NIST) has standardized several post-quantum cryptographic algorithms designed to resist attacks from future quantum computers, these approaches operate entirely within classical computational frameworks and rely on mathematical hardness assumptions such as lattice problems or hash-based constructions. In contrast, the framework proposed in this study explores a different paradigm in which security mechanisms are derived from quantum physical properties such as entanglement and nonlocal correlations. As a result, the proposed system should not be interpreted as a computationally efficient alternative to post-quantum cryptography. Instead, it represents an experimental hybrid quantum–classical model investigating whether physics-based mechanisms can enhance forensic evidence verification and confidentiality in future quantum-enabled infrastructures.
3.1. Distinction from Existing Research
This research differs from existing studies in several key aspects:
Integrated Scope:
Previous work focused on individual mechanisms, such as quantum hashing or signature transmission. In contrast, this study integrates hashing, authentication, and key distribution into a single, coherent forensic pipeline.
Forensic-Centered Design:
Unlike experiments conducted primarily in communication or physics contexts, this work situates quantum techniques directly within digital forensic evidence management and introduces forensic-specific metrics, including tamper-detection thresholds and custody verification logs.
Noise-Aware Experimentation:
While many prior studies assumed ideal, noise-free environments, this research evaluates quantum hashing, CHSH testing, and BBM92 QKD under realistic decoherence and measurement noise, reflecting the conditions of near-term quantum devices.
Hybrid Post-Processing:
Earlier experiments often ended at key generation or quantum measurement. This framework extends the pipeline with classical post-processing, including error reconciliation, privacy amplification, and HMAC-based chain-of-custody tracking, demonstrating practical forensic integration.
3.2. Operational Phases of the Research Strategy
The research strategy is implemented through three operational phases, each corresponding to a core forensic property.
An entanglement-assisted quantum hashing mechanism is implemented to detect evidence tampering. Quantum circuits encode forensic data into entangled states, which are then measured to generate quantum digests. Controlled modifications to the evidence file simulate tampering, and the Hamming distance between the original and altered digests quantifies sensitivity to data alteration, demonstrating an enhanced avalanche effect.
Authenticity is verified using the CHSH inequality, which quantifies nonlocal correlations between entangled qubits. By measuring the S-value, the framework determines whether the observed correlations exceed the classical bound of 2, applying an authentication threshold of S > 2.2 to provide a margin against noise, and thereby confirms genuine quantum entanglement and authentic evidence origin. This phase extends prior CHSH-based studies by incorporating noise tolerance and forensic authentication criteria.
Confidentiality is enforced through a BBM92-based QKD protocol that generates symmetric encryption keys from entangled measurements. These keys encrypt forensic evidence, while classical modules handle error reconciliation, privacy amplification, and HMAC-based custody logging. This phase ensures both confidentiality and traceability of digital evidence throughout its lifecycle.
3.3. Evaluation Metrics
The effectiveness of the proposed framework is evaluated using the following metrics:
Hamming Distance—measures integrity and tamper sensitivity,
CHSH S-value—validates authenticity through nonlocal correlations,
Quantum Bit Error Rate (QBER)—assesses confidentiality and key reliability.
Together, these metrics demonstrate how quantum entanglement contributes to improved forensic robustness under practical conditions.
4. The Experiment and Test Strategy
The study conducted all experimental phases—covering integrity, authenticity, and confidentiality—in a Jupyter Notebook environment using IBM Qiskit (v0.45.1). The experiments ran on the Qiskit AerSimulator (v0.13.1), which provides a realistic emulation of quantum hardware performance under Noisy Intermediate-Scale Quantum (NISQ) conditions. The study selected this simulator because it supports configurable noise models, gate errors, and readout errors, enabling controlled testing of entanglement behavior in forensic applications. The framework codes, executes, and analyzes each experiment in Jupyter (v6.5.4) to enable step-by-step visualization, parameter calibration, and real-time result interpretation.
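To illustrate how such configurable noise can be set up in the Aer backend, the following minimal Python sketch builds a NISQ-style noise model with depolarizing gate errors and symmetric readout errors. The specific error rate (0.003, within the 0.002–0.005 range reported later) and the affected gate set are illustrative assumptions rather than the exact calibration used in the experiments.

```python
# Hedged sketch: configuring a NISQ-style noise model for the Qiskit AerSimulator.
# The 0.003 error rate and the gate list are assumptions chosen for illustration.
from qiskit_aer import AerSimulator
from qiskit_aer.noise import NoiseModel, ReadoutError, depolarizing_error

noise_model = NoiseModel()
# Depolarizing errors on the single- and two-qubit gates used by the forensic circuits.
noise_model.add_all_qubit_quantum_error(depolarizing_error(0.003, 1), ["h", "ry"])
noise_model.add_all_qubit_quantum_error(depolarizing_error(0.003, 2), ["cx"])
# Symmetric readout error: each measured bit flips with probability 0.003.
noise_model.add_all_qubit_readout_error(ReadoutError([[0.997, 0.003], [0.003, 0.997]]))

noisy_sim = AerSimulator(noise_model=noise_model)  # omit noise_model for the ideal baseline
```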
The experimental evaluation in this study was conducted using the IBM Qiskit AerSimulator, which enables controlled validation of quantum circuits in a simulated environment. While current quantum hardware platforms provide access to real quantum processors, practical limitations such as qubit coherence times, gate error rates, and restricted circuit depth can significantly influence experimental outcomes. The entanglement-based protocols presented in this work—such as CHSH-based authentication and BBM92 key distribution—require reliable multi-qubit entanglement, which remains challenging on current noisy intermediate-scale quantum (NISQ) devices.
Nevertheless, the proposed circuits are compatible with existing IBM Quantum architectures and could be executed on real backends with appropriate optimization techniques such as circuit transpilation and error mitigation. In practical forensic deployments, the framework would operate across quantum communication channels connecting evidence acquisition devices, verification systems, and secure storage nodes. Future research will focus on implementing the proposed protocols on real quantum hardware and evaluating their robustness under realistic noise conditions and network constraints.
Additionally, to clarify the operational context of the proposed framework, it is useful to consider its role within a typical digital forensic investigation workflow. During the evidence acquisition stage, digital data collected from a suspect device is typically accompanied by cryptographic hash values that allow investigators to verify evidence integrity. In the proposed approach, an entanglement-assisted hashing mechanism can be used alongside traditional hashing algorithms to generate an additional verification digest. During evidence transfer between investigative units or forensic laboratories, entanglement-based authentication protocols can provide additional assurance that the transmitted evidence has not been modified. Finally, in the chain-of-custody documentation process, each evidence handling event is recorded in forensic management systems, where entries are commonly protected using cryptographic message authentication techniques such as HMAC-SHA256. In the proposed framework, quantum key distribution can be used to securely generate and distribute the keys used for these authentication mechanisms. This integration demonstrates how the proposed quantum-based techniques can complement existing forensic practices while strengthening the security of evidence handling procedures.
4.1. Ensuring Integrity
The first phase of this research directly addresses the integrity component of the central research question by demonstrating how quantum entanglement can be operationalized to detect tampering in digital forensic evidence. This phase implements an entanglement-assisted quantum hashing model in IBM Qiskit, using pairs of entangled qubits to generate unique quantum digests that represent forensic data states.
In contrast to classical hashing algorithms such as SHA-256, which rely on deterministic bitwise transformations, the quantum hash leverages the superposition and interference of entangled states to encode evidence with inherently tamper-sensitive characteristics. Any unauthorized modification to the evidence alters the quantum state collapse pattern, producing a distinct digest that can be quantified using Hamming distance.
Although quantum measurements collapse superposed states into classical outcomes, the proposed hashing mechanism derives its tamper sensitivity from the quantum transformation that occurs before measurement. During circuit execution, the encoded data evolve through superposition, entanglement, and interference across multiple qubits. Even small modifications to the input data alter the interference patterns and correlation structure of the quantum system. When the circuit is measured, these altered probability distributions collapse into different classical bit strings, producing measurable divergence between digests. The classical comparison of the resulting binary outputs therefore serves only as a detection mechanism, while the amplification of tampering effects originates from the quantum state evolution that precedes measurement.
Additionally, even though quantum measurements are inherently probabilistic, the proposed hashing mechanism ensures reliable and repeatable outputs through controlled circuit execution and classical aggregation. Each quantum circuit is executed using identical gate configurations, fixed initialization parameters, and deterministic random seeds. The circuit is then evaluated over multiple measurement shots to obtain a stable statistical distribution of outcomes. These measurement results are aggregated and converted into a deterministic binary digest using classical post-processing. Because the circuit structure and encoding parameters remain constant for identical inputs, repeated executions produce consistent digests while still preserving high sensitivity to data modifications. This hybrid strategy allows the system to benefit from quantum probabilistic behavior while maintaining the repeatability required for forensic integrity verification.
The experiment follows three steps:
Quantum Digest Generation: The original forensic file is encoded into qubit registers, entangled using Hadamard and CNOT operations, and measured to produce a baseline digest.
Tamper Simulation: Controlled alterations are introduced into the evidence file to simulate unauthorized modification.
Digest Comparison: A second digest is computed from the tampered file, and the bit-level difference between the two is calculated to evaluate sensitivity.
The experiment begins by preparing two-qubit entangled pairs that represent correlated forensic data. A Hadamard (H) gate is applied to the first qubit to create superposition, followed by a CNOT gate to entangle both qubits. The digital evidence file’s data is converted to binary and encoded into the quantum circuit via parameterized rotations. After entanglement and transformation, the qubits are measured to produce a quantum digest.
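For concreteness, the following is a minimal, hedged Qiskit sketch of such a digest circuit, run on the AerSimulator. The register size, the mapping of data bits to rotation angles, and the majority-vote aggregation of measurement shots are illustrative assumptions rather than the exact parameters used in the study.

```python
# Hedged sketch of an entanglement-assisted digest circuit; parameters are illustrative.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator
import math

def quantum_digest(data: bytes, n_qubits: int = 8, shots: int = 4096, seed: int = 42) -> str:
    """Encode a data block into rotations on an entangled register and return a bit-string digest."""
    bits = ''.join(f'{b:08b}' for b in data)
    qc = QuantumCircuit(n_qubits, n_qubits)
    qc.h(0)                                   # superposition on the first qubit
    for q in range(n_qubits - 1):
        qc.cx(q, q + 1)                       # chain of CNOTs entangles the register
    for q in range(n_qubits):                 # map classical bits to parameterized rotations
        chunk = bits[q::n_qubits] or '0'
        qc.ry(int(chunk, 2) / (2 ** len(chunk)) * math.pi, q)
    qc.measure(range(n_qubits), range(n_qubits))
    sim = AerSimulator(seed_simulator=seed)
    counts = sim.run(transpile(qc, sim), shots=shots).result().get_counts()
    digest = ''                               # deterministic aggregation: per-bit majority vote
    for pos in range(n_qubits):
        ones = sum(c for outcome, c in counts.items() if outcome[pos] == '1')
        digest += '1' if ones * 2 >= shots else '0'
    return digest
```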
The test plan proceeds in three primary stages:
Baseline Digest Generation: The original forensic file is hashed using the entanglement-based circuit to produce a reference digest.
Tamper Injection: A modified version of the same file is introduced with controlled changes (e.g., single-byte or bit-level alterations) to simulate evidence tampering.
Comparison and Evaluation: Both digests are extracted, converted into binary representations, and compared using Hamming distance to quantify divergence.
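The divergence measure used in the final stage reduces to a bit-level comparison. A minimal helper is sketched below, assuming both digests are equal-length binary strings such as those returned by the digest sketch above.

```python
def hamming_distance(digest_a: str, digest_b: str) -> int:
    """Count differing bit positions between two equal-length binary digests."""
    if len(digest_a) != len(digest_b):
        raise ValueError("digests must have equal length")
    return sum(a != b for a, b in zip(digest_a, digest_b))

# Example: divergence between the baseline digest and the digest of a tampered block.
# divergence = hamming_distance(quantum_digest(original_block), quantum_digest(tampered_block))
```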
Each test cycle ran under varying noise parameters to assess the model’s reliability in both ideal and noisy conditions. The experiment uses multiple random seeds to ensure repeatability and averages the results across iterations to reduce stochastic bias caused by quantum randomness.
This test plan provides a repeatable and transparent framework for evaluating the impact of quantum entanglement on forensic integrity. By combining simulated noise with deterministic comparison metrics, the approach enables quantitative validation of tamper sensitivity and the statistical robustness of the proposed entanglement-hash model.
Furthermore, the integrity framework is designed to be hardware-agnostic and deployable on both simulated and real quantum backends. By incorporating controlled noise models, the research evaluates how decoherence affects digest stability, thereby ensuring the proposed model remains realistic under Noisy Intermediate-Scale Quantum (NISQ) conditions.
From a cryptographic perspective, the collision resistance of the proposed quantum hashing mechanism differs fundamentally from that of classical hash functions such as SHA-256. Classical hashing algorithms rely on deterministic mathematical transformations and computational complexity assumptions to prevent collisions. In contrast, the entanglement-assisted quantum hashing approach derives its sensitivity from quantum superposition and entanglement correlations, where small input variations alter the quantum measurement outcomes that generate the digest. When applied to larger forensic datasets, the hashing procedure can be extended using block-based encoding in which evidence data is processed in segments through the quantum circuit. In such configurations, alterations in any block propagate through the entanglement structure and produce measurable divergence in the resulting digests. While the present study focuses on demonstrating tamper detection and avalanche behavior, future work will examine the collision resistance and scalability of this approach under high-volume forensic workloads and compare its performance with established classical hashing standards.
To address the practical limitations of quantum hardware, the proposed entanglement-assisted hashing mechanism adopts a hybrid quantum–classical encoding approach rather than attempting to store the entire forensic evidence file in quantum memory. The macroscopic evidence file is first processed classically and converted into a binary representation. Because current quantum systems operate with a limited number of qubits, the data are partitioned into fixed-size blocks suitable for circuit-level encoding. Each block is then mapped to a quantum state through parameterized rotation gates, where classical bits determine the rotation angles applied to qubits in the circuit. Entanglement is subsequently introduced using Hadamard and CNOT operations, allowing correlations between qubits to influence the resulting measurement outcomes. After circuit execution, the qubits are measured to generate quantum digests representing each encoded block. These digests are concatenated and processed classically to produce the final hash value. By restricting quantum processing to compact encoded representations rather than the full evidence file, the framework avoids computational bottlenecks and remains compatible with the resource constraints of Noisy Intermediate-Scale Quantum (NISQ) devices while still benefiting from the tamper sensitivity introduced by entanglement.
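A hedged illustration of this block-based extension is given below. It reuses the quantum_digest sketch shown earlier and treats the block size as an assumed parameter; the exact partitioning and concatenation scheme of the study may differ.

```python
def blockwise_quantum_hash(path: str, block_size: int = 64) -> str:
    """Partition an evidence file into fixed-size blocks, digest each block, and concatenate."""
    with open(path, "rb") as f:
        data = f.read()
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    return ''.join(quantum_digest(block) for block in blocks)  # reuses the earlier sketch
```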
Moreover, it is important to note that the proposed encoding strategy is not intended to load entire gigabyte-scale forensic disk images directly into quantum states. Such direct mapping would be impractical under current Noisy Intermediate-Scale Quantum (NISQ) constraints due to limited qubit availability, circuit depth restrictions, state-preparation overhead, and computational cost. Instead, the framework adopts a hybrid quantum–classical design in which the evidence is first processed classically and reduced to fixed-size blocks, sampled segments, or compact feature representations suitable for circuit-level encoding. The quantum component is then applied only to these selected representations, enabling evaluation of entanglement-assisted tamper sensitivity without requiring full-file quantum storage. Consequently, the present study should be interpreted as a scalability-aware proof-of-concept, while full support for large forensic datasets remains a direction for future research requiring advances in qubit resources, encoding efficiency, and quantum–classical co-processing.
Overall, this phase provides experimental validation that quantum entanglement can be directly applied to preserve forensic integrity, thereby supporting the broader objective of establishing a tamper-evident, non-repudiable digital evidence management system.
4.2. Guarantee Authenticity
The second phase of this research addresses the verification of the entangled communication channel used in the forensic framework. In this context, the objective is not to authenticate the human author or legal origin of a digital file in the traditional cryptographic sense. Instead, this phase focuses on verifying that the quantum states used during the forensic verification process originate from genuine entanglement rather than classical correlations or simulated signals.
To achieve this, the framework applies the Clauser–Horne–Shimony–Holt (CHSH) inequality, a widely used method for testing nonlocal correlations between entangled particles. The CHSH test evaluates whether the measured correlations between two qubits exceed the classical limit, thereby confirming the presence of quantum entanglement. In the context of this study, the CHSH result serves as a physical-layer verification mechanism, ensuring that the entangled states used in the forensic workflow have not been replaced, forged, or disrupted during transmission or measurement.
The experiment was implemented using IBM Qiskit with the AerSimulator under configurable NISQ noise conditions. A pair of qubits was first entangled using a Hadamard gate followed by a controlled-NOT (CNOT) operation, producing a Bell-state configuration. The qubits were then measured using two alternative measurement bases for each qubit. These bases correspond to measurement settings A0, A1 for the first qubit and B0, B1 for the second.
To obtain optimal measurement settings, a calibration process was conducted to determine the angles that maximize the CHSH violation. The calibration identified measurement angles of A = (0.0, 1.611) radians and B = (0.835, −0.835) radians as producing the strongest nonlocal correlations within the simulated environment.
Using these measurement bases, repeated circuit executions were performed to calculate the four expectation values required for the CHSH parameter: S = E00 + E01 + E10 − E11.
In classical systems, the CHSH inequality is bounded by: S ≤ 2.
Quantum mechanics predicts that entangled states can exceed this bound, reaching a theoretical maximum of 2√2 ≈ 2.828.
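A hedged Qiskit sketch of this estimation procedure is shown below. It uses the calibrated angles reported above, while the circuit construction, shot count, and sign conventions are illustrative assumptions.

```python
# Hedged sketch: estimating the CHSH S-value for a Bell pair on the AerSimulator.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator
import itertools

A = (0.0, 1.611)      # calibrated measurement angles for the first qubit (radians)
B = (0.835, -0.835)   # calibrated measurement angles for the second qubit (radians)

def correlation(theta_a: float, theta_b: float, shots: int = 8192) -> float:
    """Estimate E(a, b) by rotating each half of a Bell pair into its measurement basis."""
    qc = QuantumCircuit(2, 2)
    qc.h(0)
    qc.cx(0, 1)                    # prepare the Bell state
    qc.ry(-theta_a, 0)             # rotate qubit 0 into measurement basis a
    qc.ry(-theta_b, 1)             # rotate qubit 1 into measurement basis b
    qc.measure([0, 1], [0, 1])
    sim = AerSimulator()
    counts = sim.run(transpile(qc, sim), shots=shots).result().get_counts()
    # +1 for correlated outcomes (00, 11), -1 for anti-correlated outcomes (01, 10)
    return sum((1 if k in ('00', '11') else -1) * v for k, v in counts.items()) / shots

E = {(i, j): correlation(A[i], B[j]) for i, j in itertools.product(range(2), repeat=2)}
S = E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]
print(f"CHSH S-value: {S:.3f}")   # values above 2 indicate nonlocal correlations
```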
The experiments were repeated across multiple runs with varying noise parameters and random seeds to ensure statistical reliability. Each run used several thousand measurement shots to stabilize the correlation estimates. The resulting measurements produced consistent S-values above the classical bound, confirming the presence of nonlocal correlations.
Within the forensic framework, this CHSH verification step functions as a trust indicator for the quantum verification channel. If the observed correlations remain above the classical threshold, the system confirms that the quantum states used in the verification process maintain genuine entanglement. If the S-value falls below the threshold, the system treats the channel as compromised or invalid.
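Expressed as a decision rule, this trust indicator is a simple threshold check; the helper below is a sketch using the 2.2 threshold adopted by the framework.

```python
def entanglement_channel_trusted(s_value: float, threshold: float = 2.2) -> bool:
    """Physical-layer trust decision: accept the verification channel only when the
    measured CHSH value exceeds the forensic authentication threshold."""
    return s_value > threshold

# Example: 2.824 is accepted, while 1.95 (classically explainable) is rejected.
```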
It is important to emphasize that this mechanism does not authenticate the human creator or legal owner of a digital file. Instead, it validates the genuine quantum origin and integrity of the entangled verification process itself. Formal attribution of authorship or legal identity remains the responsibility of complementary classical mechanisms, such as digital signatures, public key infrastructure (PKI), or certified forensic custody documentation.
By integrating CHSH-based entanglement validation into the forensic pipeline, this phase provides an additional physics-based layer of trust that complements classical identity verification mechanisms while ensuring that the underlying quantum processes used in the framework remain authentic and uncompromised.
4.3. Protect Confidentiality and Chain of Custody
The third phase of this research addresses the confidentiality aspect of the central research question by applying quantum entanglement to secure the exchange and storage of digital forensic evidence. In digital forensics, confidentiality ensures that evidence is accessible only to authorized investigators throughout the collection, transmission, and archival processes. In this study, confidentiality is established through an entanglement-based Quantum Key Distribution (QKD) mechanism implemented in IBM Qiskit, derived from the BBM92 protocol—a variant of the E91 entanglement model.
The BBM92 protocol enables two endpoints to generate a shared secret encryption key by making correlated measurements on entangled photon pairs. Because any interception or measurement by an unauthorized party disturbs the entanglement, the protocol inherently detects eavesdropping. When integrated into a forensic workflow, this mechanism ensures that encryption keys—and thus the evidence itself—remain unexposed and verifiable throughout the chain of custody.
Implementing BBM92-based quantum key distribution in operational forensic environments would require dedicated quantum communication infrastructure between participating institutions. In practice, this could involve fiber-optic quantum channels linking forensic laboratories, trusted-node quantum networks connecting government facilities, or satellite-assisted quantum communication for inter-jurisdictional evidence transfer. Such infrastructure is currently under development in several national quantum communication initiatives and is primarily deployed in research or governmental secure communication networks. Consequently, the implementation presented in this study should be interpreted as a protocol-level proof-of-concept for integrating entanglement-based confidentiality into forensic workflows, rather than as an immediate deployment-ready system. Future forensic applications of this framework would depend on the maturation and availability of scalable quantum communication infrastructure.
The study implements the confidentiality experiment using IBM Qiskit’s AerSimulator, with entanglement-based key generation and classical post-processing. The workflow consisted of four main stages:
Entanglement Generation: Hadamard and CNOT gates entangle qubit pairs to simulate photon-pair correlations similar to those in the BBM92 model.
Measurement and Key Sifting: Measurements of the entangled qubits occur in complementary bases (Z and X), and the protocol retains only the cases where both endpoints choose the same basis as raw key bits.
Error Reconciliation and Privacy Amplification: The protocol corrects bit mismatches using a six-pass CASCADE reconciliation process, followed by privacy amplification using SHAKE-256 to compress the reconciled bits into a secure final key.
Custody Logging and Encryption: The framework uses the final key to encrypt a sample forensic file via XOR encryption and appends each encryption event to a cryptographically signed custody log using HMAC-SHA256 to ensure traceability.
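A minimal sketch of stages 1 and 2, together with a simplified stand-in for stage 3, is given below using Qiskit's AerSimulator. The per-round basis handling, the omission of CASCADE reconciliation, and the 128-bit SHAKE-256 key length are simplifying assumptions rather than the exact protocol configuration.

```python
# Hedged sketch of BBM92-style entanglement generation, sifting, and key compression.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator
import hashlib, random

def bbm92_round(basis_a: str, basis_b: str, sim: AerSimulator) -> tuple:
    """Share one Bell pair and measure each half in the chosen basis ('Z' or 'X')."""
    qc = QuantumCircuit(2, 2)
    qc.h(0)
    qc.cx(0, 1)                    # entangled pair shared by the two endpoints
    if basis_a == 'X':
        qc.h(0)                    # rotate qubit 0 into the X basis before measuring
    if basis_b == 'X':
        qc.h(1)
    qc.measure([0, 1], [0, 1])
    shot = sim.run(transpile(qc, sim), shots=1, memory=True).result().get_memory()[0]
    return int(shot[1]), int(shot[0])   # Qiskit lists classical bits as c1 c0

sim = AerSimulator()
raw_a, raw_b = [], []
for _ in range(200):                     # sifting: keep only rounds with matching bases
    ba, bb = random.choice('ZX'), random.choice('ZX')
    bit_a, bit_b = bbm92_round(ba, bb, sim)
    if ba == bb:
        raw_a.append(bit_a)
        raw_b.append(bit_b)

qber = sum(a != b for a, b in zip(raw_a, raw_b)) / max(len(raw_a), 1)
# Simplified privacy amplification via SHAKE-256; CASCADE reconciliation is omitted here.
final_key = hashlib.shake_256(bytes(raw_a)).digest(16)   # 128-bit session key
print(f"sifted bits: {len(raw_a)}, QBER: {qber:.3f}, key: {final_key.hex()}")
```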
The testing procedure assessed the protocol under both ideal and noisy environments. The evaluation included measuring the Quantum Bit Error Rate (QBER) before and after the reconciliation stage, analyzing the proportion of key bits retained following advantage distillation, and determining the final key length produced after the privacy amplification process.
We compared the results to experimental benchmarks reported in related studies, such as Ahammed and Kadir’s hybrid entanglement-assisted QKD model [
22] and Wang et al.’s 26 km fiber implementation [
26], to validate consistency and demonstrate forensic applicability.
Although quantum key distribution is traditionally used to secure communication channels, its role in this framework is to protect the transfer and controlled access of digital forensic evidence throughout the chain of custody rather than the long-term storage of the evidence itself. Digital evidence frequently moves between acquisition systems, forensic laboratories, investigators, legal authorities, and archival repositories. Each transfer introduces confidentiality risks that must be mitigated to preserve evidentiary integrity and privacy. By generating encryption keys through the BBM92 entanglement-based protocol, the framework ensures that any interception or measurement attempt during evidence transmission produces detectable disturbances in the quantum channel. The resulting keys are then used to encrypt evidence files and to authenticate custody logs, linking each transfer event to a verifiable forensic record. In this way, quantum key distribution complements classical archival protections by securing the dynamic transfer stages of the forensic lifecycle.
Unlike prior experiments, this implementation extends beyond raw key generation to include chain-of-custody integration, linking each encryption event to an auditable, cryptographically signed log. This added layer transforms QKD from a communication security protocol into a forensic confidentiality mechanism, bridging cryptographic and evidentiary assurance.
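To illustrate this custody integration, the sketch below XOR-encrypts an evidence buffer with the session key and signs each custody entry with HMAC-SHA256. The log fields, event names, and the placeholder key are assumptions introduced for illustration only.

```python
# Hedged sketch of stage 4: XOR encryption plus an HMAC-SHA256 signed custody log.
import hashlib, hmac, json, time

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """XOR the evidence bytes with the (repeated) session key from the QKD stage."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def append_custody_entry(log: list, event: str, evidence_digest: str, key: bytes) -> dict:
    """Record an evidence-handling event and sign the entry with HMAC-SHA256 under the key."""
    entry = {"timestamp": time.time(), "event": event, "evidence_sha256": evidence_digest}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hmac"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    log.append(entry)
    return entry

# Hypothetical usage with a placeholder 128-bit key standing in for the BBM92 output.
session_key = bytes(16)                                # placeholder, not a real QKD key
evidence = b"sample forensic evidence"                 # placeholder for the evidence file bytes
ciphertext = xor_encrypt(evidence, session_key)
custody_log = []
append_custody_entry(custody_log, "encrypt-and-transfer",
                     hashlib.sha256(evidence).hexdigest(), session_key)
```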
5. Findings
5.1. Integrity
The final results of the entanglement-assisted integrity experiment confirm the practicality of using quantum entanglement for tamper detection in digital forensics. The Qiskit-based implementation generated unique digests for both the original and altered evidence samples, demonstrating the entanglement-hash model’s ability to detect even minor data modifications.
In a representative test case, the Hamming distance between the original and tampered quantum digests was 113 out of 256 bits, corresponding to a 44.1% bit-level divergence rate. This result demonstrates a strong avalanche response, indicating that the quantum hashing process effectively transforms minimal input alterations into substantial output deviations. Although this value is slightly below the ideal 50% avalanche benchmark typically associated with highly optimized classical cryptographic hash functions, it still reflects significant sensitivity to input modifications. Such behavior is a desirable property for integrity verification in digital forensic workflows, particularly when considering the operational constraints of Noisy Intermediate-Scale Quantum (NISQ) simulation environments. The detailed experimental results are presented in
Table 2.
Importantly, the divergence values remained consistent across multiple experimental runs, even when gate noise and readout noise were introduced in the simulation. Additional test runs examined system behavior under NISQ-level noise parameters (gate and readout error rates of 0.002 to 0.005). Across these configurations, digest divergence values remained consistent, with only ±2% fluctuation in Hamming distance. This demonstrates that the entanglement-hash model maintains stable performance under realistic hardware noise, confirming its suitability for near-term quantum environments.
Control experiments were also conducted using identical evidence inputs. In these cases, the resulting digests produced zero divergence, confirming that the hashing mechanism generates consistent outputs when the input data remain unchanged. This behavior demonstrates that the framework satisfies the fundamental forensic requirement of repeatability, ensuring that identical evidence consistently produces identical digests.
This control experiment produced zero divergence between digests generated from identical evidence inputs. This result does not imply that individual quantum measurements were perfectly identical under noisy simulation conditions. Instead, the digest generation process relies on multi-shot circuit execution and classical aggregation of measurement outcomes. Each circuit was evaluated over thousands of measurement shots, allowing the system to estimate a stable probability distribution for the measurement results. The final binary digest was derived from this aggregated distribution rather than from a single measurement outcome. As a result, while noise may introduce variation in individual shots, the dominant measurement pattern remains statistically consistent for identical inputs, producing identical digests and therefore zero divergence in the control case.
As seen in
Table 2, all tampered test cases produced Hamming-distance values above the established detection threshold of 100 bits, confirming the system’s ability to distinguish modified data from original evidence. The control experiment, which processed identical inputs, yielded zero divergence, verifying the hashing function’s internal consistency and accuracy.
These outcomes provide empirical validation that quantum entanglement can be leveraged to enhance the integrity of digital forensic evidence. The experiment confirms that quantum hashing not only detects unauthorized alterations with high precision but also remains resilient to quantum noise—making it a practical, tamper-evident mechanism for forensic integrity assurance in hybrid quantum–classical systems.
We have to acknowledge that the measured Hamming distance of the quantum digest demonstrates strong sensitivity to small input modifications, which is a desirable property also observed in classical hash functions such as SHA-256. However, the present work does not perform a direct experimental comparison between the quantum hashing mechanism and classical hashing algorithms on identical inputs.
5.2. Authenticity
The results of the CHSH-based authenticity experiment confirm strong quantum correlations in the simulated forensic environment. With the calibrated angles A = (0.0, 1.611) and B = (0.835, −0.835) radians, the system produced the following correlation coefficients: E00 = 0.674, E01 = 0.662, E10 = 0.726, and E11 = −0.762.
From these values, the calculated CHSH
S-value is 2.824, clearly surpassing the classical bound of 2.0 and exceeding the predefined authenticity threshold of
S > 2.2. Multiple runs at varying noise levels yielded consistent outcomes, with
S-values ranging from 2.79 to 2.82, demonstrating the model’s robustness to minor decoherence and readout error fluctuations. The detailed experimental outcomes are shown in
Table 3.
In practical forensic deployments, environmental noise and measurement uncertainty may occasionally produce intermediate results where the CHSH parameter lies between the classical limit and the defined authentication threshold (i.e., 2 < S < 2.2). In such situations, additional CHSH measurements with larger sampling sizes can be performed to increase statistical confidence and determine whether the observed correlations represent genuine entanglement or noise-induced fluctuations.
The average S-value of 2.806 across all runs provides quantitative evidence of nonlocal correlation, confirming that the entangled qubits maintain coherent relationships despite controlled quantum noise. This result reinforces the conclusion that entanglement can serve as a reliable authenticity metric for forensic verification.
In the context of forensic authentication, the CHSH parameter can be interpreted as a decision metric for verifying the presence of genuine quantum entanglement between communicating parties. According to the CHSH inequality, classical systems satisfy the bound S ≤ 2, whereas quantum entanglement can produce values up to 2√2 ≈ 2.828. When the measured S-value exceeds the classical bound, the observed correlations cannot be reproduced by classical processes alone, indicating that the underlying quantum states maintain authentic entanglement within the verification channel. In practical implementations, measurement noise and statistical fluctuations may affect the observed S-value; therefore, a decision threshold slightly above the classical bound may be adopted to ensure reliable authentication. For example, a threshold such as S > 2.2 provides a conservative margin that distinguishes genuine entanglement from classical correlations while accounting for experimental imperfections. In this framework, the CHSH test provides physical-layer validation of the entangled verification process rather than legal attribution of evidence origin. Legal attribution and evidentiary provenance remain the responsibility of classical authentication mechanisms such as digital signatures, PKI-based identity verification, and certified chain-of-custody documentation. Repeated measurements and statistical analysis can further reduce the risk of false positives and false negatives in practical deployments.
These findings confirm that quantum entanglement can be effectively leveraged to ensure authenticity in digital forensic systems. By directly linking entanglement validation to forensic verification, this research establishes a foundation for quantum-secure audit trails that resist forgery, interception, and classical cryptographic compromise.
5.3. Confidentiality
The results of the BBM92 confidentiality experiment confirm that entanglement-based key generation can provide secure, verifiable evidence protection while maintaining low error rates under NISQ conditions. The experiment produced an initial raw key of 9000 bits, with a Quantum Bit Error Rate (QBER) of 1.522%. After advantage distillation (block size m = 3), 2867 bits were retained, reducing the QBER to 0.035%. Following six-pass CASCADE reconciliation, the residual QBER dropped to 0.000%, with 272 parity leaks recorded, and a final 128-bit key was generated through privacy amplification.
The framework uses the final key for encryption and records it in an authenticated chain-of-custody log. Any deviation from the expected key sequence or log signature automatically rejects the session, demonstrating built-in tamper detection within the confidentiality workflow. The detailed results of each processing stage are presented in
Table 4.
The negligible post-reconciliation QBER and successful encryption-log verification confirm that the quantum confidentiality model provides both security and traceability.
Moreover, unlike classical encryption, where key secrecy relies on computational hardness, the BBM92-based approach provides information-theoretic security, ensuring that any eavesdropping attempt introduces detectable disturbance.
Although the BBM92 protocol in this study produced a final 128-bit key from an initial 9000-bit raw key, it is important to consider how such keys would be used in practical forensic environments. In most secure communication architectures, quantum key distribution is not used to encrypt large data volumes directly. Instead, QKD is employed to generate short, highly secure symmetric keys that are subsequently used with classical encryption algorithms to protect bulk data transmission. In a digital forensic context, the generated quantum keys could therefore be used to secure communication channels between investigators, forensic laboratories, and evidence storage systems. This approach allows large forensic datasets to be transferred securely using conventional encryption mechanisms while benefiting from the strong security guarantees provided by quantum key distribution. Future work will investigate the throughput and scalability of the proposed framework in distributed forensic environments involving multiple investigative agencies.
Overall, these findings empirically validate that quantum entanglement can be leveraged to ensure confidentiality and chain-of-custody protection in digital forensics. The combination of entanglement-based key generation and authenticated custody logging provides a complete, tamper-evident confidentiality layer suitable for integration into hybrid quantum–classical forensic frameworks.
We have to acknowledge that the BBM92-based key generation process was evaluated to demonstrate the feasibility of secure key establishment within the proposed framework. While metrics such as quantum bit error rate (QBER) and final key length were analyzed, detailed performance comparisons with other QKD implementations or classical key-exchange protocols are outside the scope of this study.