Laws
  • Article
  • Open Access

15 January 2026

The Collingridge Dilemma and Its Implications for Regulating Financial and Economic Crime (FEC) in the United Kingdom: Navigating the Tension Between Innovation and Control

Centre for Resilient Business and Society (CRBS), Coventry University, Coventry CV1 5FB, UK

Abstract

The capacity of the United Kingdom (UK) to prosecute technology-enabled financial and economic crime (FEC) is increasingly shaped by the Collingridge dilemma. Although the dilemma has been broadly conceptualized in technology governance, its application to prosecutorial and enforcement practice, evidentiary standards, and the attribution of criminal liability remains uncharted scholarly territory. Through a socio-legal mixed-methods design combining doctrinal analysis, case studies, and comparative analysis, the paper shows how the dilemma’s two horns (early epistemic uncertainty and late institutional inertia) manifest in criminal law and regulatory contexts. The paper finds that, like the European Union and the United States, the UK criminal enforcement ecosystem exhibits both horns across the cryptocurrency, algorithmic trading, artificial intelligence (AI), and fintech domains. By integrating supplementary theories such as responsive regulation, the precautionary principle, and technological momentum, the study advances a socio-legal framework that explains enforcement inertia and doctrinal gaps in liability attribution for emerging technologies. The paper demonstrates how epistemic uncertainty and institutional entrenchment shape enforcement outcomes and proposes adaptive strategies for anticipatory governance, including technology-literate capacity building, anticipatory legal reform, and data-driven public-private coordination. These recommendations balance ex-ante legal clarity (reducing uncertainty) with ex-post enforcement agility (overcoming entrenchment) to provide a normative framework for navigating the Collingridge dilemma in FEC prosecution.

1. Introduction

The prosecution and regulation of financial and economic crime (FEC) in the United Kingdom (UK) face increasing complexity due to rapid technological advancements and evolving criminal methodologies such as cryptocurrency-enabled money laundering, algorithmic market manipulation, and artificial intelligence (AI)-driven fraud (Gilmour 2021; Lui and Ryder 2021; Caldwell et al. 2020). These innovations have evolved faster than legal adaptation, creating enforcement gaps in investigation, attribution, and prosecution (Crown Prosecution Service 2025).
This lag in enforcement is evident in the UK’s reliance on legacy statutes, such as the Proceeds of Crime Act 2002 (POCA), the Fraud Act 2006, and the Money Laundering Regulations 2017 (MLRs), which struggle to address technology-mediated FEC behaviors. Fraud remains the most prevalent offence, accounting for 41% of all crimes in 2024 (ONS 2025; National Crime Agency 2025b). The National Crime Agency (NCA) estimates that over £100 billion is laundered annually through UK-registered structures, often through complex corporate and digital channels, while enforcement outcomes remain uneven across offences and venues (NCA 2025). Criminals increasingly exploit crypto-assets and digital infrastructures to launder criminal proceeds at speed and scale (Crown Prosecution Service 2025; NCA 2024).
At the heart of these challenges lies what Collingridge identified as a fundamental dilemma in technology regulation: the “control dilemma” or “pacing problem” that confronts authorities seeking to govern technological innovation (Collingridge 1980). The dilemma posits that early in a technology’s lifecycle, when intervention or control is most feasible and least costly, insufficient information, limited understanding, and uncertainty about the technology’s implications inhibit appropriate action or response.
Conversely, once a technology’s impacts become clear, technical complexity and the entrenched interests embedded in economic, social, and political investments make control increasingly difficult and expensive (Collingridge 1980; Liebert and Schmidt 2010). Although widely discussed in technology governance, the dilemma’s application to criminal law and prosecutorial practice remains analytically underdeveloped in the UK (Ruof 2023; Moses 2013; Coffee 2021).
This paper applies the Collingridge dilemma to the UK’s FEC enforcement ecosystem, showing how epistemic uncertainty and institutional inertia manifest in statutory interpretation, evidentiary standards, and cross-agency coordination. Prosecutors face high evidentiary thresholds when relying on artefacts such as on-chain traces and system logs, while courts and regulators must balance fairness, reliability, and regulatory clarity (Dal Bó 2006; Kwak 2014; FCA 2018).
The paper engages with debates on adaptive regulation, including the effectiveness of regulatory sandboxes and concerns over selection bias and regulatory capture (Kálmán 2025; Dal Bó 2006; Kwak 2014). It also examines operational challenges in AI-driven enforcement and gaps in fintech oversight (Maple et al. 2023; Alan Turing Institute 2025; Law Commission 2025a; Yeung 2018; Langley and Leyshon 2023). These tensions reflect broader debates on anticipatory versus reactive regulation (Ferran 2023; Gov.UK 2022).
To mitigate these challenges, the paper explores how the Collingridge dilemma affects legislative frameworks, investigative practices, and prosecutorial strategies (Genus and Stirling 2018; Demos Helsinki 2022). It addresses the following questions: 1. How do epistemic uncertainty and institutional inertia manifest in UK doctrinal and institutional practice across cryptocurrency, algorithmic trading, AI, and fintech platforms? 2. What evidentiary and liability challenges arise in prosecuting technology-enabled FEC, and how have UK courts and agencies responded? 3. What adaptive legal and institutional strategies are emerging to mitigate the pacing problem, and where are reforms most needed?
Using a socio-legal mixed-methods design, the paper makes three contributions: (i) it operationalizes the Collingridge dilemma within criminal law, identifying doctrinal bottlenecks; (ii) it clarifies where UK law supports adaptation and where legislative updates are needed; and (iii) it distils practical strategies such as data-sharing routines and crypto asset tracing standards informed by EU regulatory models (e.g., MiCA) (Genus and Stirling 2018; Demos Helsinki 2022; Collingridge 1980).
The paper concludes that addressing the Collingridge dilemma is essential for effective FEC enforcement. The UK’s global financial role and multi-agency architecture position it as a reference point for international regulatory discourse (Home Office 2025a; O’Reilly 2025). The dilemma’s constraints can be managed through anticipatory legal tools, institutional capacity building, and modular coordination that converts regulatory insight into courtroom-ready evidence.
The rest of the paper is organised as follows. The next sub-section of Section 1 briefly surveys relevant scholarship in technology governance, regulation, and criminology, highlighting the relevant gap in the literature, and presents a conceptual framework that operationalizes the Collingridge dilemma. Section 2 outlines the methods used in the paper. Section 3 presents findings across the four technological domains (crypto assets, algorithmic trading, fintech platforms, and AI), integrating doctrinal developments and case illustrations. Section 4 offers recommendations for adaptive governance in prosecution. Section 5 concludes the paper.

Brief Literature Review and Theoretical Framework

The Collingridge dilemma highlights a persistent challenge in technology governance: early-stage uncertainty limits effective intervention, while late-stage entrenchment impedes reform (Collingridge 1980). This paradox is particularly pronounced in FEC enforcement, where rapidly evolving technologies such as cryptocurrency and AI outpace the development of legal and regulatory frameworks (Bank for International Settlements & FSI 2024; Coffee 2021; World Economic Forum 2025). The resulting pacing problem creates enforcement blind spots that are readily exploited by offenders (Marchant et al. 2011).
Scholarship on adaptive regulation and anticipatory governance offers conceptual tools for addressing these challenges (Barben et al. 2008). These approaches advocate flexible, forward-looking regulatory strategies designed to evolve alongside technological change. However, their application within criminal law enforcement, particularly in prosecutorial decision-making and evidentiary standards, remains underdeveloped. Mechanisms such as regulatory sandboxes (Zetzsche et al. 2017) and responsive regulation theory (Ayres and Braithwaite 1992) propose iterative, risk-based oversight. Yet, their translation into enforcement practice is limited, particularly in contexts that require legal certainty and procedural rigor.
Criminological frameworks offer additional, underutilized insights into offender behavior and opportunity structures. Routine activity theory (Cohen and Felson 1979) and crime script analysis (Cornish 1994) illuminate how technological affordances shape criminal decision-making. These theories can inform the design of adaptive interventions by identifying patterns and vulnerabilities in offender routines. Despite their relevance, they are rarely integrated into technology governance or enforcement strategies. This omission represents a missed opportunity for interdisciplinary innovation.
This paper addresses these gaps by integrating criminological and regulatory theory, conducting empirical case studies of UK enforcement practices, and offering policy recommendations grounded in implementation realities. It contributes to the literature by operationalizing the Collingridge dilemma within financial crime prosecution and proposing an adaptive governance framework responsive to both technological evolution and institutional constraints (Collingridge 1980; Liebert and Schmidt 2010).
The study is grounded in the Collingridge dilemma (Collingridge 1980), which highlights the paradox of technology governance: early in a technology’s lifecycle, regulators lack sufficient information to intervene effectively, while later, when consequences are clearer, entrenched interests and systemic dependencies make intervention difficult. This dilemma is particularly salient in FEC enforcement, where emerging technologies rapidly evolve and outpace legal adaptation.
To deepen this theoretical foundation, the study draws on two complementary theories. Technological momentum (Hughes 1994), a theory in science and technology studies, explains how technologies become resistant to change as they embed within institutional and infrastructural systems. Responsive regulation (Ayres and Braithwaite 1992) advocates flexible, graduated enforcement strategies that adapt to evolving risks. Together, these theories help explain why financial regulators struggle to keep pace with technological innovation and why adaptive strategies are necessary.
Building on these theoretical insights, the conceptual framework identifies four dimensions of regulatory challenge in FEC enforcement: (1) Temporal lag between innovation and legal understanding, as seen in delayed responses to crypto asset misuse (Zetzsche et al. 2017); (2) Technical complexity requiring specialized oversight, exemplified by difficulties in prosecuting algorithmic trading manipulation (Kirilenko and Lo 2013); (3) Jurisdictional fragmentation in cross-border technologies, illustrated by enforcement limitations against global platforms like Binance (FCA 2021b); (4) Economic integration that entrenches technological systems, such as mobile payment infrastructures complicating fraud detection (Kliewer 2022).
These dimensions are not isolated; they interact with institutional responses such as reactive prohibition, adaptive regulation, technology-neutral principles, and anticipatory governance (Barben et al. 2008). These responses form feedback loops with criminal adaptation, in which enforcement influences innovation and vice versa. The conceptual framework, thus, operationalizes theoretical principles by providing a structured lens for the analysis of how UK enforcement agencies navigate the governance of emerging financial technologies under conditions of uncertainty and systemic inertia.
The doctrinal foundations are set out in Section 3.1.3, aligning property classification (R v Kelly 1999; AA v Persons Unknown 2019; Property (Digital Assets etc) Act 2025), inferential intent (R v Woollin 1999), and corporate attribution (ECCTA 2023 s.196/s.199) with the Collingridge framing.

2. Methods

2.1. Research Design

This study employs a mixed-methods approach, combining doctrinal legal analysis with empirical case studies and brief comparative analysis. It is grounded in socio-legal research that focuses on the interaction between legal frameworks, social contexts, technological developments, and enforcement practices. The integration of doctrinal and empirical methods enables a nuanced understanding of how legal norms operate within technological, regulatory and enforcement contexts (Banakar and Travers 2005; Cane and Kritzer 2010).

2.2. Data Collection and Analysis

Primary legal sources used in the paper include statutory provisions, appellate case law, regulatory guidance, and parliamentary materials. Secondary sources were identified through a comprehensive literature review across Westlaw, LexisNexis, Google Scholar, and JSTOR. Key search terms used include “Collingridge dilemma,” “financial crime and economic crime,” “technology governance,” and “cryptocurrency regulation” (Zetzsche et al. 2017; Moses 2013).
Although quantitative data were limited, the study utilized enforcement statistics and agency reports, including those from the Financial Conduct Authority (FCA), the National Crime Agency (NCA), and the Serious Fraud Office (SFO). These included metrics such as the number of registered crypto firms, asset recovery figures, and operational statistics from the Joint Money Laundering Intelligence Taskforce (JMLIT) (e.g., 1200 operations, £250 million seized). These data points were used to contextualize the doctrinal analysis, support case study selection, and triangulate enforcement trends.

2.2.1. Criteria for Case Selection

Case selection criteria were based on the logic of theoretical sampling and replication, which prioritizes cases chosen for their potential to illuminate and refine emergent constructs rather than for statistical representativeness (Eisenhardt 1989; Nair et al. 2023). The paper therefore selected cases through purposive sampling (Bouncken et al. 2025; Emmel 2013; Seawright and Gerring 2008) to ensure conceptual clarity and diversity.
The approach was guided by inclusion and exclusion criteria designed to capture cases that meaningfully illustrate the Collingridge dilemma in FEC enforcement contexts. The inclusion criteria are as follows: (1) Regulatory uncertainty, relating to gaps in statutory clarity or enforcement capacity under legacy statutes such as POCA, as evidenced by appellate challenges. (2) Prosecutorial significance, demonstrated by high-profile outcomes involving FEC agencies (e.g., CPS/SFO prosecutions or FCA interventions), including cases that have shaped enforcement practice or judicial interpretation (e.g., AA v Persons Unknown 2019; D’Aloia 2024). (3) Technological novelty, characterized by representation of the dilemma’s stages (e.g., early uncertainty in Bitcoin’s 2009–2014 phase versus late rigidity in the opacity of AI from 2020 onwards); cases involving emerging technologies such as cryptocurrency, algorithmic trading, and AI are applicable (Emmel 2013; Seawright and Gerring 2008). (4) Availability of relevant court judgments, regulatory decisions, or credible investigative journalism that allows doctrinal analysis. (5) Evidence of regulatory lag, in terms of measurable delay between technology emergence and regulatory or prosecutorial response, or entrenched complexity hindering late-stage intervention.
Every case study or court judgement featured in this paper meets two or more of the above criteria. This approach reflects established practices in case study methodology (Yin 2018) and socio-legal empirical research (Banakar and Travers 2005; Dobinson and Johns 2017). Table 1 shows how the three main technology cases met these criteria and their relationship with the Collingridge dilemma.
Table 1. Inclusion criteria of three case studies.
The exclusion criteria are as follows: (1) Traditional fraud cases involving well-established methods, such as cheque fraud, that lack technological mediation. (2) Cases lacking public records, involving sealed proceedings, or relying solely on media speculation without authoritative sources. (3) Non-technology-mediated crimes, where technology was incidental rather than integral to the commission of the offense or to detection challenges. (4) Cases without a UK nexus or significant lessons for the UK. None of the selected cases fell within these exclusion criteria. The selection criteria balance theoretical relevance with empirical accessibility (Eisenhardt and Graebner 2007).
In this regard, relevant cases were purposively selected to represent diverse technology categories, such as cryptocurrency, algorithmic trading, and AI applications, and to illustrate distinct manifestations of the Collingridge dilemma (Collingridge 1980). These included legally and institutionally significant enforcement actions and judicial decisions. The doctrinal analysis involved close examination of legal texts and case law, focusing on statutory interpretation, judicial reasoning, and prosecutorial strategies in the context of FEC (Hutchinson and Duncan 2012; Caldwell et al. 2020).
While the study is UK-centric, examining domestic statutes, case law, and enforcement practices, it incorporates comparative elements from the European Union (EU) and United States (US) where these inform UK regulatory evolution or provide contrasting approaches to the pacing problem. However, this is not a comprehensive comparative study; rather, EU and US examples serve as reference points illuminating alternative regulatory strategies. To enhance rigor, the comparative analysis employs a structured analytical matrix (see Table 2) to assess three dimensions across the UK, EU, and US: adaptability, predictability, and regulatory model.
Table 2. Comparative analysis of regulatory model, adaptability and predictability.

2.2.2. Boundaries of Theoretical Applicability

The analysis was anchored in the Collingridge dilemma (Collingridge 1980), which is operationalized where three assumptions or conditions jointly apply, thereby structuring its Horn 1 and Horn 2. First, a knowledge (information) assumption holds in the early phase, where regulators and prosecutors face genuine epistemic uncertainty about a technology’s likely pathways of impact and the feasible levers of control. This corresponds to Horn 1 in Collingridge’s original formulation and has remained the cornerstone of later reconstructions (Collingridge 1980; Liebert and Schmidt 2010).
Second, a power/actor (entrenchment) assumption holds in the later phase: once markets, infrastructures, and organizational routines have adapted around the technology, changing course becomes costly and politically difficult. Economic dependencies, technical complexity, and jurisdictional fragmentation impede control. This entrenchment logic characterizes institutional inertia, corresponds to Horn 2, and is central both to the original Collingridge (1980) text and to analytic restatements that frame it as a “power” or “actor” problem (Collingridge 1980; Liebert and Schmidt 2010).
Third, a temporal (pacing) assumption holds across both horns: there is a measurable lag between technological deployment (including harmful uses) and institutional response, with regulatory reaction trailing technology deployment and criminal exploitation. This timing axis makes the two horns actionable; it is not a separate “third horn” but the operational condition under which the horns are manifested. Contemporary policy discourse often expresses this timing axis as the pacing problem, that is, exponential technological change versus incremental legal and institutional change (Downes 2009; Thierer 2018).
Essentially, these three assumptions provide a faithful restatement of Collingridge’s original double-bind (i.e., “when change is easy, the need for it cannot be foreseen; when the need for change is apparent, change has become expensive, difficult, and time-consuming”), while providing a transparent rubric for critical assessment in the present context (Collingridge 1980; Liebert and Schmidt 2010).
The dilemma does not, however, explain all regulatory failures. For instance, where enforcement failure is primarily driven by influence over regulators, regulatory capture provides superior explanatory power to Collingridge’s knowledge-power timing structure. Regulatory capture (Stigler 1971; Bernstein 1955) occurs when regulators align with industry interests, undermining public interest and eroding the impartiality these frameworks presuppose.
Where fragmentation and overlapping authorities impede action despite agencies possessing the requisite knowledge, the dominant problem is coordination, not ignorance or entrenchment. The U.S. Government Accountability Office has documented these structural frictions in the financial regulatory system (general architecture and, more recently, crypto-specific oversight), underscoring that gaps can persist irrespective of individual agencies’ expertise (U.S. Government Accountability Office 2023; GAO 2016).
In the same vein, where enforcement shortfalls are driven by inputs such as budget, workforce, and tooling, rather than by knowledge or authority, resource constraints are the root cause and require attention in their own right. In the UK, the government’s Economic Crime (Anti-Money Laundering) Levy report 2023–24 details targeted investments in investigators, technology, and suspicious activity reporting capacity (HM Treasury 2025); the National Economic Crime Centre’s annual report 2023–24 also underscores the need for system-level resourcing and partnerships to scale outcomes (NCA 2024).
Similarly, where regulators explicitly choose strategic forbearance/deliberate non-intervention or experimentation, the timing of intervention reflects deliberate policy design, not Collingridge constraints. In the UK, the FCA’s regulatory sandbox (first opened in 2016 and maintained as part of the FCA’s innovation services) provides a controlled environment for firms to test innovative products and services under regulatory oversight before wider market deployment (FCA 2025a; Debevoise & Plimpton 2018).
Accordingly, this paper deploys the Collingridge dilemma where the knowledge problem (early phase), the power/entrenchment problem (later phase), and a demonstrable pacing lag together structure the regulatory dilemma. Where outcomes are better explained by regulatory capture, institutional fragmentation/coordination failure, capability gaps/resource constraints, or strategic forbearance/deliberate non-intervention, the analysis invokes those alternative frameworks. This positioning keeps temporal lag central while preserving the dilemma’s canonical two horns and the three underlying assumptions (temporal, knowledge, power/actor) (Liebert and Schmidt 2010; Collingridge 1980).
Furthermore, while the Collingridge dilemma provides a robust analytical lens for understanding temporal dynamics in technology regulation, its explanatory power is enhanced when supplemented by alternative theoretical frameworks that address dimensions beyond early uncertainty and late rigidity. The frameworks used in this paper include responsive regulation, which conceptualizes an enforcement pyramid that begins with persuasion and escalates to punitive measures as needed (Ayres and Braithwaite 1992), and smart regulation, which advocates a flexible mix of instruments and actors, such as government, industry, and third parties, for efficient compliance (Gunningham et al. 1998). Responsive regulation addresses Collingridge’s control problem by creating feedback loops: regulatory learning occurs through graduated enforcement, which progressively reduces epistemic uncertainty. For example, the FCA’s regulatory sandbox embodies responsive principles, permitting controlled experimentation with proportionate safeguards.
However, responsive regulation assumes that enforcers possess escalation capacity (Ayres and Braithwaite 1992). For technology-enabled FEC, this assumption fails at several points. First, the pyramid’s base (persuasion, guidance) requires technical expertise to diagnose non-compliance, precisely the knowledge deficit that characterizes Horn 1. Second, mid-level enforcement (civil penalties, enforceable undertakings) assumes jurisdictional reach, yet crypto assets usually operate transnationally, beyond unilateral regulatory grasp. Third, the pyramid’s apex (criminal prosecution, license revocation) faces Horn 2 constraints: once platforms are entrenched with millions of users (e.g., Binance), revocation becomes economically disruptive and practically difficult to enforce.
The UK’s crypto enforcement trajectory does not strictly follow a responsive pyramid. For instance, the FCA issued consumer warnings on crypto assets around 2017 (FCA 2017a), then imposed registration requirements from 2020 (FCA 2019c), and later imposed restrictions on Binance in 2021 with the declaration that Binance Markets Limited was not permitted to undertake any regulated activity in the UK without prior written consent (FCA 2021a). But these actions targeted different entities rather than escalating enforcement against a single firm. This pattern suggests responsive regulation has limited applicability in fast-moving technology markets where innovation and new entrants continue to emerge.
The analysis also drew on criminological theory to understand opportunity structures and decision-making, including routine activity theory (Cohen and Felson 1979), which focuses on the convergence of motivated offenders, suitable targets, and absence of capable guardians, and rational choice perspectives on offender decision-making (Cornish 1994). It also utilized technology governance frameworks such as anticipatory governance, which emphasizes foresight, stakeholder engagement, and institutional integration (Barben et al. 2008), and responsible innovation, which emphasizes anticipation, reflexivity, inclusive engagement, and responsive action (Stilgoe et al. 2013).
The analysis further considered the usefulness of the precautionary principle, which holds that where there are threats of serious or irreversible harm, lack of full scientific certainty should not be used as a reason for postponing cost-effective measures to prevent environmental degradation (United Nations 1992, Principle 15). Whereas Collingridge focuses on the timing of control, the precautionary principle prescribes action under uncertainty. Applied to FEC, the precautionary principle would, for instance, justify early crypto-asset regulation based on potential harm, even without empirical evidence of exploitation. This illustrates how the precautionary principle operates normatively (how regulators should act) while Collingridge operates descriptively (how they do act). The precautionary principle also introduces distributional questions absent from the Collingridge dilemma. For instance, it asks: who bears the burden of proof? In pharmaceutical regulation, manufacturers must prove safety before market authorization. Transposing this to fintech and FEC would require technology developers to demonstrate crime-resistance before deployment, a standard no jurisdiction has systematically adopted.
Risk-based regulation is another theoretical dimension, which calibrates the intensity of enforcement according to the likelihood and severity of harm (Black 2005). In contrast, responsive regulation emphasizes dynamic adaptation to the behavior of regulated entities and contextual factors such as risk changes, institutional environments, and regulatory capacity (Baldwin and Black 2008). The FCA’s risk-based supervision explicitly prioritizes firms and products by potential consumer detriment and market integrity threats. This framework addresses a limitation of Collingridge: the dilemma presents a binary choice (early intervention versus late control), whereas risk-based approaches enable graduated responses. Applied to crypto assets, risk-based regulation would stratify oversight as follows: high-touch supervision for systemically important exchanges, principles-based guidance for small wallets, and outright prohibition for privacy coin services that facilitate laundering. This granularity is obscured by the structural framing of the Collingridge dilemma.
However, risk-based regulation faces its own temporal challenges: risk assessments require data, creating a delayed response that is structurally akin to Horn 1. Risk-based frameworks also struggle with uncertainties beyond foreseeable scenarios, often referred to as “unknown unknowns” (Diebold et al. 2010; Rumsfeld 2002). AI-enabled FEC presents risks that can hardly be statistically modeled because they have not yet materialized. In the same vein, algorithmic systems may exhibit emergent behaviors unpredictable from design specifications. Here, risk-based regulation, though more granular, converges with the Collingridge dilemma: both approaches grapple with inherent uncertainties that statistical risk assessments alone cannot fully resolve or eliminate.
In terms of theoretical integration in FEC enforcement, the above supplementary frameworks converge with Collingridge at specific points. For example, responsive regulation provides feedback mechanisms that reduce uncertainty incrementally, yet it assumes jurisdictional reach and escalation capacity often lacking in transnational technology markets. The precautionary principle addresses how regulators should respond to Horn 1 (act despite uncertainty) but does not resolve the entrenchment problem of Horn 2. Risk-based regulation provides a graduated alternative to binary intervention timing but reintroduces temporal lag through assessment requirements.
This paper retains the Collingridge dilemma as its primary analytical framework because it uniquely captures the temporal dimension of technology governance that defines FEC enforcement challenges. However, the analysis is strengthened by recognizing where precautionary reasoning could have guided earlier intervention (but did not), where risk-based logics could have enabled graduated responses (but lacked data), and where responsive principles did generate adaptive learning (e.g., FCA sandbox).
The boundaries of theoretical applicability of the Collingridge dilemma are anchored in contexts where technological evolution creates enforcement lag and doctrinal ambiguity. This ensures findings remain relevant to technology-mediated FEC rather than general criminal law. By establishing these boundaries, the paper clarifies its analytical domain while acknowledging that the Collingridge dilemma interacts with but does not replace other explanatory frameworks for regulatory performance in technology governance.

2.3. Limitations and Ethical Considerations

The binary structure of the Collingridge dilemma—that is, early intervention (Horn 1: when information is scarce) versus late intervention (Horn 2: when inertia dominates)—may oversimplify gradual regulatory adaptation. This structure risks obscuring intermediate adaptive strategies that do not neatly fit either horn (Liebert and Schmidt 2010). To address the Collingridge dilemma’s binary limits, the dilemma was supplemented, in this paper, by theoretical imperatives such as the precautionary principle (for early uncertainty, e.g., anticipatory strict regulation of high-risk AI under the EU Artificial Intelligence Act (EUR Lex 2024b)) and risk-based regulation (for late entrenchment, e.g., tiered oversight in MiFID II). These enable hybrid analysis where precautionary tools mitigate epistemic gaps in FEC (e.g., proactive crypto tracing), while risk-based escalation counters inertia, as seen in JMLIT’s graduated responses (National Crime Agency 2025a).
The binary framing, nonetheless, retains analytical utility. Even if pure Horn 1 or Horn 2 conditions are rarely observed in isolation, the directional tension between early uncertainty and late inertia remains observable. The paper’s case studies (e.g., OneCoin, Wirecard, Binance) demonstrate that interim adaptive measures do not eliminate these tensions; rather, they manage them. The analytical task, therefore, is not to replace Collingridge but to situate it within a richer theoretical ecosystem that includes, but is not limited to, the precautionary principle, risk-based regulation, and responsive regulation as supplementary rather than competing frameworks (see Section 2.2 for details of these theoretical imperatives).
Data limitations in the paper include heavy reliance on secondary literature due to restricted access to granular CPS/SFO datasets, and limited quantitative data on enforcement outcomes for technology-mediated crimes such as AI-enabled fraud. To mitigate these limitations, triangulation was achieved by integrating doctrinal legal analysis, empirical case studies (e.g., OneCoin, Wirecard), and comparative regulatory models (e.g., EU Markets in Crypto Assets, US enforcement-led adaptation). This approach enhanced analytical depth.
The paper adheres to academic ethical standards, ensuring confidentiality, transparency, and alignment with public interest. No non-public identifiable personal data was used; all sources were publicly available. Internal validity was supported through methodological triangulation and peer review, while external validity was ensured through comparative jurisdictional analysis (Marchant et al. 2011). Reflexivity was maintained throughout to mitigate interpretive bias and acknowledge the limitations of applying static theoretical models to dynamic enforcement contexts.

3. Findings and Discussions

3.1. Landscape of Financial and Economic Crime (FEC) in the UK

3.1.1. Scale of Financial Crime

Financial and Economic Crime (FEC) represents a significant challenge for the UK’s economy and legal system. Though estimates vary, the scale of FEC is consistently described as systemic. Spotlight on Corruption (2022) places the annual cost at approximately £290 billion, equivalent to 14.5% of GDP, while evidence submitted to the All-Party Parliamentary Group on Anti-Corruption and Responsible Tax suggests that the total economic impact could reach £350 billion annually, or 17.5% of GDP, when accounting for direct losses, enforcement costs, and wider economic distortions (APPG on Anti-Corruption & Responsible Tax 2022).
Furthermore, the NCA reports that more than £100 billion is laundered through or within the UK annually, which is facilitated by vulnerabilities in corporate structures and professional services (NCA 2025). Although these figures are subject to methodological uncertainty—given the inherent difficulty of quantifying illicit financial flows (Collin 2019)—they underscore the pervasive nature of economic crime and its classification as a national economic security threat rather than a discrete criminal justice issue. This magnitude of harm reinforces the argument that FEC is not only a legal problem but also a structural economic risk that requires adaptive governance and coordinated enforcement strategies.
The Home Office’s Economic Crime Plan 2 outcomes report (2025) recorded 6845 prosecutions and 3756 convictions for money laundering offences in the year ending December 2024, which provides further context to enforcement efforts (Home Office 2025c).
The CPS Economic Crime Strategy 2025: Final Progress Report records 25,665 fraud and forgery prosecutions between April 2021 and December 2024, with an 85.4% conviction rate in the first three quarters of FY 2024/25. Between 2019 and 2024, confiscation orders recovered £450 million, returning £88 million to victims. Yet evidentiary complexities persist, as fraud now accounts for 43% of all estimated crime (Crown Prosecution Service 2025).
The UK’s Economic Crime Plan 2019–2022 identifies several priority areas including money laundering, fraud, corruption, sanctions evasion, and terrorist financing (HM Treasury and Home Office 2019a). Each of these areas has been significantly impacted by technological developments (with around 80% of fraud in the UK cyber-enabled) (NECC 2024, p. 4), which create new opportunities for criminal exploitation while simultaneously offering new tools for detection and prevention. The CPS also flags legislative uncertainty around AI-facilitated offending, including gaps in conspiracy law where interaction with AI systems falls outside statutory definitions (Crown Prosecution Service 2025).
The complexity of modern FEC is exemplified by cases such as the Global Laundromat, where more than $20 billion was laundered through UK companies using sophisticated shell company structures and correspondent banking relationships (OCCRP 2014). Such cases demonstrate how technological capabilities enable criminals to exploit regulatory gaps and jurisdictional boundaries.
Complementing the above metrics, the SFO’s Annual Report 2024–25 highlights technological adaptation in disclosure management, noting that technology-assisted review (TAR) accelerated disclosure by up to 40%, while the Business Plan 2025–26 commits to full TAR rollout and modern case management (SFO 2025). These innovations mitigate disclosure burdens but leave unresolved questions of algorithmic evidence admissibility and corporate attribution. Altogether, the data reveal a high-volume, tech-enabled enforcement landscape that remains doctrinally unsettled in addressing AI-mediated misconduct, particularly mens rea inference and evidential standards for algorithmic outputs.
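For readers unfamiliar with how technology-assisted review operates, the following deliberately simplified sketch ranks documents by weighted keyword hits so that reviewers see the likeliest-relevant material first. It is purely illustrative: the term weights and documents are invented, and production TAR systems (including the SFO’s) rely on far more sophisticated supervised machine learning trained on reviewer decisions rather than fixed keyword lists.

```python
# Illustrative sketch only: a toy relevance-ranking pass of the kind that
# underpins technology-assisted review (TAR). Term weights and documents
# are invented; real TAR tooling learns relevance from reviewer decisions.

from collections import Counter

def score(doc_tokens, term_weights):
    """Sum the weights of flagged terms appearing in a document."""
    counts = Counter(doc_tokens)
    return sum(w * counts[t] for t, w in term_weights.items())

term_weights = {"invoice": 1.0, "offshore": 2.0, "nominee": 2.5, "transfer": 0.5}

documents = {
    "doc_a": "quarterly invoice transfer to offshore nominee account".split(),
    "doc_b": "minutes of the annual general meeting".split(),
    "doc_c": "invoice invoice transfer".split(),
}

# Rank documents so human reviewers triage the likeliest-relevant first.
ranked = sorted(documents, key=lambda d: score(documents[d], term_weights), reverse=True)
print(ranked)  # doc_a outranks doc_c, which outranks doc_b
```

Even this toy version conveys why TAR compresses disclosure timelines: prioritization lets scarce human review effort concentrate where relevance is most probable, rather than proceeding linearly through the corpus.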

3.1.2. Technological Transformation of FEC

The digitization of financial services has fundamentally changed both the commission and detection of FEC. Cryptocurrency exemplifies the first horn of the Collingridge dilemma (early uncertainty about criminal misuse), while its later adoption and obfuscation tools reflect the second horn (entrenched complexity). Although cryptocurrency technologies offer legitimate benefits, they have created new avenues for crimes such as money laundering, tax crime, and fraud (Fanusie and Robinson 2018). In particular, the pseudo-anonymous nature of many cryptocurrency transactions, together with the global and decentralized nature of cryptocurrency networks, presents critical constraints for traditional FEC investigation techniques.
Algorithmic trading technologies have fostered new forms of market manipulation such as spoofing and layering. These technologies illustrate both horns of the dilemma: initial regulatory unfamiliarity and later evidentiary complexity in proving intent. Such strategies can be executed at speeds and scales impossible for human traders (Kirilenko and Lo 2013). Prosecutors need to be equipped with sophisticated understanding of both market microstructure and algorithmic implementation, capabilities that traditional law enforcement agencies have struggled to develop (Kirilenko and Lo 2013).
Furthermore, AI and machine learning technologies present both opportunities and challenges for financial crime. AI’s opacity and autonomous behavior complicate liability attribution, reinforcing the Collingridge dilemma’s doctrinal implications. Whereas these technologies enable more sophisticated fraud detection and anti-money laundering systems, they also enable criminals to develop more sophisticated attacks, including deep-fake technologies for identity fraud and adversarial machine learning techniques to evade detection systems (Caldwell et al. 2020; FATF 2025).

3.1.3. Regulatory and Enforcement Framework

The UK’s regulatory response to FEC is characterized by a multi-agency architecture with overlapping mandates. The Financial Conduct Authority (FCA) serves as the principal conduct regulator for financial services, while the Prudential Regulation Authority (PRA) supervises prudential soundness (Bank of England PRA 2025; HM Treasury and Home Office 2025; FCA n.d.). Law enforcement responsibilities are distributed across agencies such as the NCA, FCA, SFO, CPS, HM Revenue and Customs (HMRC), and police forces, with the City of London Police as National Lead Force for fraud (City of London Police n.d.; HMRC 2025).
As highlighted in the NCA’s National Strategic Assessment, this fragmentation of institutions in the FEC ecosystem creates coordination challenges in technologically complex cases (NCA 2025). Although there are some joint tasking and intelligence workflows across these institutions, the NCA and HM Inspectorate of Constabulary and Fire and Rescue Services (HMICFRS) have acknowledged the need for improved strategic governance, intelligence sharing, and multi-agency tasking to address serious and organised crime effectively (National Crime Agency 2025b; HMICFRS 2018).
These institutions rely on several legal instruments to control early or late realization of technology-mediated FEC, including the following: The (Fraud Act 2006), which defines offences such as fraud by false representation and abuse of position; (Terrorism Act 2000), and (Anti-Terrorism, Crime and Security Act 2001), which deal with terrorism offences; POCA, which governs prosecution, asset recovery and confiscation; and the (Money Laundering Regulations 2017), which impose AML obligations on regulated entities (Money Laundering Regulations 2017). Other primary legal instruments in the UK’s FEC regulatory and enforcement ecosystem include: (Financial Services and Markets Act 2000; Sanctions and Anti-Money Laundering Act 2018; Serious Crime Act 2015; Bribery Act 2010; Criminal Finances Act 2017; Economic Crime (Transparency and Enforcement) Act (ECTEA 2022); Economic Crime and Corporate Transparency Act (ECCTA 2023); and Sentencing Bill 2025).
These instruments have provided important legal infrastructure for regulating technology-enabled FEC including powers for crypto-asset seizure and forfeiture (HM Government 2024a). Nonetheless, fragmentation across agencies, uneven technical capacity, and cross-border dependence often vitiate enforcement in complex, data-intensive cases (HMICFRS 2018; National Crime Agency 2025b). These difficulties are magnified where innovative technologies challenge attribution (crypto), intent inference (algorithms), perimeter clarity (platforms), and evidentiary explainability (AI) (Crown Prosecution Service 2022; Kirilenko and Lo 2013; Yeung 2018). Thus, though these statutes provide a robust foundation, the emergence of AI and digital financial technologies has exposed doctrinal and operational gaps that challenge traditional enforcement paradigms.
Technological Innovation and Criminal Liability
Legislative debates in the UK Parliament and EU institutions have begun to explore these doctrinal gaps. The Joint Committee on Human Rights (UK Parliament 2025b) launched a 2025 inquiry into the regulation of AI, raising critical questions about liability attribution in AI development and deployment, and whether existing legal frameworks sufficiently safeguard human rights in the context of agentic AI systems. The inquiry also considers whether private actors should be held to the same human rights standards as public bodies, and whether different AI technologies warrant distinct regulatory approaches.
The discussion paper of UK Law Commission 2025 equally explores evidentiary standards and liability attribution in AI-assisted prosecutions, with emphasis on the need for legal reform to keep pace with technological change (Law Commission 2025a). At the European level, the European Union (EU) Artificial Intelligence Act (EUR Lex 2024b) has introduced a risk-based regulatory framework for AI systems, including obligations for high-risk applications. However, it has stopped short of establishing a comprehensive criminal liability regime (European Parliament 2024).
Traditional criminal law doctrines such as mens rea (intent) and actus reus (conduct) are increasingly strained when applied to autonomous systems (Caldwell et al. 2020). Statutes such as the (Computer Misuse Act 1990) and (Fraud Act 2006) presuppose human agency, which may not be directly attributable in cases involving machine learning systems that evolve independently of their programmers. Scholars such as Yeung (2018) and Pagallo (2013) advocate for models of functional liability and algorithmic accountability, which assign responsibility based on control over system design, deployment, and oversight, rather than direct intent. These debates underscore the need to clarify criminal liability attribution in cases where AI systems autonomously generate harmful outcomes.
Recent Legal Developments and Anticipatory Legal Reform
Recent UK case law underscores the importance of adapting legal frameworks to technological realities. In (AA v Persons Unknown 2019), the High Court held that Bitcoin constitutes property under English law. The court granted interim proprietary relief and allowed service out of jurisdiction, reflecting the emergent adaptability of English law to innovative digital assets (Society for Computers and Law 2020). Similarly, in (D’Aloia 2024), the High Court recognized cryptocurrencies as property and allowed service via blockchain in a £2.5 million fraud case, highlighting the evidentiary and jurisdictional challenges of digital asset recovery (Hill 2024) [See Section 3.1.3, particularly with respect to “Property, Mens Rea, and Corporate Attribution in Digital Contexts”, for the doctrinal implications (tracing and evidential rigor)]. Conversely, in (Ayinde v London Borough of Haringey and Al-Haroun v Qatar National Bank 2025), the High Court criticized the use of AI-generated fictitious case law in legal submissions, which raises concerns about the integrity of AI-assisted litigation.
The SFO’s implementation of Technology Assisted Review (TAR) to facilitate large-scale document disclosure in complex fraud investigations, however, signifies a strategic institutional response to the challenges posed by data-intensive legal processes. This adoption does reflect a broader commitment to leveraging AI to enhance investigative efficiency and compliance with disclosure obligations (Tan et al. 2023; UK Parliament 2025c).
Integrating these perspectives into the Collingridge dilemma underscores the need for anticipatory legal reform. As AI systems become increasingly embedded in financial services, the legal system must evolve to address not only technological complexity but also foundational questions of culpability, justice, and regulatory legitimacy.
Against these recent developments, a doctrinal synthesis to clarify how actus reus and mens rea converge in digital contexts, and how corporate attribution now operates for FEC is provided below.
Property, Mens Rea, and Corporate Attribution in Digital Contexts
The convergence of actus reus and mens rea in technology-enabled offences can be evinced through three strands: Property classification, inferential intent, and corporate liability frameworks. It must, however, be recognized that this convergence is partially emergent rather than fully settled.
In terms of property, (R v Kelly 1999) established that items subjected to lawful skill may constitute “property”. This principle has been extended to crypto assets by (AA v Persons Unknown 2019), which recognized Bitcoin as property capable of supporting proprietary relief. Residual classification uncertainty has been addressed by the (Property (Digital Assets etc) Act 2025 (c.29)), which received Royal Assent on 2 December 2025 and confirms that a thing is not prevented from attracting personal property rights merely because it is neither a thing in possession nor a thing in action (R v Kelly 1999; AA v Persons Unknown 2019; Property (Digital Assets etc) Act 2025).
With respect to inferential intent, UK tribunals have accepted pattern-based inferences of manipulative purpose in algorithmic trading. For instance, in (Gonzalez, Sheth and Urra v FCA 2025), the Upper Tribunal upheld sanctions for spoofing on the basis of repeated large orders placed without genuine execution intent, applying the balance of probabilities standard under FSMA/Market Abuse Regulation (MAR). This approach is conceptually analogous to, though procedurally distinct from, the common-law position in (R v Woollin 1999), which permits juries to find criminal intent where a consequence was virtually certain and appreciated by the actor. Any transposition of pattern-based inference into criminal prosecutions must therefore be expressly justified against the beyond-reasonable-doubt standard and evidential rules (Gonzalez, Sheth and Urra v FCA 2025; R v Woollin 1999).
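The pattern-based reasoning described above can be illustrated, in deliberately simplified form, by a metric such as a cancel-to-fill ratio: a trader who repeatedly submits large orders that are cancelled rather than executed leaves a quantifiable footprint from which manipulative purpose may be inferred. The sketch below is illustrative only; the order records and the flagging threshold are invented, and real forensic analysis draws on far richer features (order timing, book imbalance, cross-venue activity) before any inference of purpose is put to a tribunal.

```python
# Illustrative sketch only: a toy pattern-based proxy for manipulative
# purpose -- the share of submitted volume cancelled before execution.
# Records and the 0.9 threshold are invented for illustration.

def cancel_ratio(orders):
    """Fraction of submitted volume cancelled before execution."""
    cancelled = sum(o["qty"] for o in orders if o["status"] == "cancelled")
    total = sum(o["qty"] for o in orders)
    return cancelled / total if total else 0.0

orders = [
    {"qty": 500, "status": "cancelled"},  # large order, withdrawn
    {"qty": 400, "status": "cancelled"},  # large order, withdrawn
    {"qty": 10, "status": "filled"},      # small genuine execution
]

ratio = cancel_ratio(orders)
flagged = ratio > 0.9  # invented threshold: escalate for human review only
print(round(ratio, 3), flagged)
```

The point of the sketch is doctrinal rather than computational: such a metric yields circumstantial evidence that may satisfy a civil balance-of-probabilities standard, but, as noted above, its transposition into criminal proceedings requires separate justification against the beyond-reasonable-doubt standard.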
On corporate liability, statute has evolved along two tracks. On one hand, (ECCTA 2023 s.196) places corporate attribution for specified economic crimes on a statutory footing by introducing a “senior manager” test. This provides an alternative route to attribution beyond the traditional directing-mind test. On the other hand, (ECCTA 2023 s.199) creates a distinct failure to prevent fraud offence for large organizations with a “reasonable procedures” defense. Both reforms cohere with Meridian Global Funds Management Asia Ltd. v Securities Commission 1995 on rules of attribution, but operate independently of it (ECCTA 2023, s.196; ECCTA 2023, s.199; Meridian 1995).
Finally, the D’Aloia litigation shows both the potential and the practical limits of English law’s engagement with digital assets. (D’Aloia 2022) authorized service by non-fungible token (NFT), an instance of judicial pragmatism, while (D’Aloia 2024) engaged with the tracing of USDT (Tether’s cryptocurrency stablecoin) but dismissed the claim because the blockchain tracing analysis was evidentially deficient. Legal possibility, in other words, does not guarantee practical recovery (D’Aloia 2022, 2024). These nuances demonstrate the uncertainties of technological development in the criminal law context, highlighting the first horn of the Collingridge dilemma.

3.1.4. Case Duration and Asset Recovery Challenges

Cryptocurrency-related cases often involve complex asset tracing across decentralized networks and international jurisdictions. Although blockchain transparency can facilitate investigations, the use of mixers, privacy coins, and unregulated exchanges complicates recovery. Case durations are often prolonged due to the need for technical expertise and international cooperation (BIS 2023; Brownstein 2023).
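Why mixers frustrate tracing can be illustrated with a deliberately simplified transaction-graph traversal: funds can be followed hop by hop across public ledger entries until they enter a mixing pool, at which point the one-to-one link between input and output addresses is lost. The addresses and edges below are invented, and real chain-analysis tools rely on clustering heuristics and probabilistic attribution rather than this naive traversal.

```python
# Illustrative sketch only: a toy transaction graph showing why tracing
# stalls at a mixing service. Addresses and edges are invented; real
# chain-analysis applies clustering heuristics and probabilistic scoring.

from collections import deque

# address -> list of addresses it sent funds to
tx_graph = {
    "victim_wallet": ["hop1"],
    "hop1": ["hop2"],
    "hop2": ["mixer"],
    "mixer": ["outA", "outB", "outC"],  # many-to-many pool: 1:1 link is lost
}

MIXERS = {"mixer"}

def trace(start):
    """Follow funds hop by hop until the trail reaches a mixer."""
    path, queue, seen = [], deque([start]), set()
    while queue:
        addr = queue.popleft()
        if addr in seen:
            continue
        seen.add(addr)
        path.append(addr)
        if addr in MIXERS:
            break  # beyond this point, outputs cannot be tied to the input
        queue.extend(tx_graph.get(addr, []))
    return path

print(trace("victim_wallet"))
```

The traversal recovers the pre-mixer hops but cannot say which of the pooled outputs carries the victim’s funds; this is precisely the evidential gap that deficient tracing analyses, such as that criticized in (D’Aloia 2024), fail to bridge.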
Algorithmic trading cases involving spoofing or layering require extensive forensic analysis of high-frequency trading data and expert testimony due to the complexity of automated systems (Leonard et al. 2024). It is especially challenging to establish manipulative intent as this often depends on patterns inferred from order and trade data rather than direct evidence (Practical Law et al. 2019). Investigations are typically prolonged as regulators must differentiate between legitimate order cancellations and deceptive practices designed to mislead the market. Even when liability is established, remedies often include civil penalties and sometimes restitution, though practical recovery is limited (Cartea et al. 2020; FCA 2025b).
In contrast, fintech-related fraud, such as peer-to-peer lending scams and mobile payment fraud, poses distinct challenges in quickly tracing funds that have been moved through digital platforms. Even though digital records can aid recovery, the speed of transactions and lack of traditional oversight can hinder enforcement. Case durations in fintech fraud tend to be shorter than those involving algorithmic systems due to lower technical barriers (Kliewer 2022). Furthermore, AI-enabled financial crimes are emerging and doctrinally complex. It is legally and technically challenging to establish liability and causation when autonomous systems are involved. These cases are usually protracted, with minimal asset recovery because of the complexity of the offenses and lack of established legal frameworks (Law Commission 2025b; Caldwell et al. 2020).

3.2. Manifestations of the Collingridge Dilemma in FEC

3.2.1. Cryptocurrency and Digital Assets

The regulation and prosecution of cryptocurrency-related financial crime illustrate the Collingridge dilemma in its clearest form: the tension between early-stage regulatory uncertainty and late-stage institutional inertia. In the initial phase of Bitcoin’s emergence in 2009, UK regulatory and enforcement bodies lacked the technical capacity and legal foresight to anticipate the implications of decentralized, pseudo-anonymous, and cross-border digital assets for financial crime. This regulatory lag was not unique to the UK. As Walker (2021) observes, Bitcoin’s design was intentionally decentralized to avoid regulation, and global regulators, including those in the UK, struggled for many years to understand and respond to its implications for financial oversight and criminal misuse.
During this initial period, UK authorities appear to have adopted a non-interventionist stance. This was reflected in the Bank of England’s 2014 assessment, which concluded that digital currencies posed minimal risk to financial stability, while only cautiously acknowledging their potential misuse for money laundering (Bank of England 2014). The report’s limited engagement with criminal exploitation risks demonstrates the first horn of the Collingridge dilemma: limited information and understanding of the technology’s societal and legal implications constrained the development of a proactive regulatory framework or response.
As adoption of cryptocurrencies increased and criminal exploitation became more visible, the limitations of existing enforcement mechanisms became increasingly apparent. The NCA reported a significant rise in the use of cryptocurrencies for laundering proceeds of fraud and cybercrime, which has been facilitated by privacy-enhancing technologies such as mixers, privacy coins, and decentralized exchanges (NCA 2025). These developments have complicated asset tracking and have undermined the effectiveness of conventional investigative tools.
Prosecution of cryptocurrency-related financial crime presents substantial evidentiary and procedural challenges. Law enforcement agencies and prosecutors often struggle to establish definitive links between digital asset transactions and criminal proceeds due to the pseudo-anonymous nature of blockchain systems and the use of obfuscation tools. The decentralized nature of cryptocurrencies, together with the inability to compel disclosure of private keys, significantly hinders attribution and asset recovery in criminal investigations (U.S. Government Accountability Office 2023; Balthazor 2019). According to the NCA (2025), criminals increasingly exploit these technologies to launder illicit funds, complicating attribution and asset recovery efforts. The Crown Prosecution Service (CPS) has issued guidance on managing digital evidence, emphasizing the importance of early collaboration with investigators, forensic imaging, and the use of algorithmic tools to manage large volumes of data (Crown Prosecution Service 2022).
Legal precedents such as (R v E 2018) and (R v Bater-James 2020) have shaped the evidentiary standards for digital disclosure. In (R v E 2018), the Court of Appeal overturned a trial judge’s decision to stay proceedings due to the police’s failure to seize and examine a complainant’s mobile phone. The judgment clarified that seizure of digital devices must be based on a “reasonable line of enquiry,” not a blanket presumption, and that failure to obtain such evidence does not automatically render a trial unfair.
Building on insights from (R v E 2018), (R v Bater-James 2020) established four guiding principles for digital disclosure: (1) digital material should only be reviewed if it aligns with a reasonable line of enquiry; (2) investigators must adopt a staged and proportionate approach; (3) complainants must be kept informed about the scope and duration of any device examination; and (4) disclosure must meet the legal test and respect privacy rights. The judgements in these cases underscore the need for transparency, proportionality, and technical competence in prosecutorial strategy, particularly in cases that involve complex digital ecosystems such as cryptocurrency (Crown Prosecution Service 2022; NCA 2025).
In response to these challenges, the FCA has taken up supervisory responsibility for crypto asset businesses pursuant to the amended (Money Laundering Regulations 2017) (MLRs), which align with the EU’s Fifth Money Laundering Directive (5AMLD) (FCA 2024). Although this regulatory shift sought to enhance oversight, its implementation has proven complex: many firms have found compliance burdensome, resulting in market consolidation and the proliferation of unregulated alternatives. This dynamic reflects the second horn of the dilemma, wherein regulatory intervention is constrained by the inertia of established market structures and behavioral norms, complicating efforts to implement effective oversight of financial and economic crime. Thus, delayed regulatory intervention confronts entrenched interests and established practices that make effective control of this crime difficult. For the property/mens rea foundations of digital-asset offences, see Section Property, Mens Rea, and Corporate Attribution in Digital Contexts.

3.2.2. Algorithmic Trading and Market Manipulation

High-frequency and algorithmic trading technologies present another important illustration of the impact of the Collingridge dilemma on the prosecution of FEC. The prosecution of crimes linked to these technologies, including market manipulation, shows the evidentiary and doctrinal complexities that arise when legal systems confront technology-mediated financial crime. These trading technologies, which enable the execution of market orders in microseconds, became increasingly prevalent in UK markets throughout the 2000s and 2010s (FCA 2018).
However, regulatory responses during this formative period were constrained by limited institutional understanding of the implications such technologies posed for market integrity and abuse. The predecessor of FCA, the Financial Services Authority, focused primarily on traditional forms of market manipulation while algorithmic trading grew rapidly in volume and sophistication (Government Office for Science 2012).
The prosecution of algorithmic market manipulation presents complex legal challenges, particularly in establishing criminal intent within technologically mediated environments. A notable example is the case of the United States Commodity Futures Trading Commission (CFTC) against Navinder Singh Sarao, where the defendant was extradited from the UK and pleaded guilty in 2016 to charges of wire fraud and spoofing before the U.S. District Court for the Northern District of Illinois (U.S. Department of Justice 2016). Sarao admitted to using an automated trading program to place thousands of large-volume orders, which he had no intention to execute—commonly referred to as “spoof orders”—to create false market signals and manipulate the price of E-mini S&P 500 futures contracts (CFTC 2015; U.S. Department of Justice 2016).
Although the UK High Court in Sarao against The Government of the United States of America upheld his extradition, it did not adjudicate the substantive criminal allegations (High Court of England and Wales 2016). The case is instructive in demonstrating the evidentiary clarity that can arise when direct communications and admissions establish mens rea. As algorithmic agents continue to operate at speeds and scales beyond human oversight, distinguishing between deliberate manipulation and unintended system behavior poses significant evidentiary and doctrinal challenges for courts.
Further jurisprudence has reinforced the evidentiary and interpretive challenges in algorithmic manipulation. In (FCA v Da Vinci Invest Ltd. 2015), the High Court held that layering (i.e., a form of algorithmic spoofing) constituted market abuse pursuant to the FSMA. The Court implemented an objective standard, holding that intent was not a necessary element to establish liability pursuant to the FSMA’s civil market abuse regime. This doctrinal stance facilitates regulatory enforcement but complicates the criminal prosecution of algorithmic actors, where intent is a central requirement.
More recently, in (Gonzalez, Sheth and Urra v FCA 2025), the Upper Tribunal upheld the decision of the FCA to fine and prohibit three traders for engaging in spoofing within Italian bond futures markets. The Tribunal accepted that trading patterns and forensic data analysis could serve as sufficient proxies for intent, even in the absence of direct evidence such as internal communications or confessions. This decision reflects a growing judicial willingness to infer culpability from circumstantial and behavioral evidence in technologically complex cases.
Taken together, these cases highlight the limitations of traditional legal frameworks in addressing the emergent risks posed by algorithmic trading. Prosecutors must demonstrate not only that market manipulation occurred, but also that the manipulation resulted from intentional programming rather than algorithmic error or unintended behavior (Nitschke 2018). This requirement for technical expertise has created capacity constraints in both the investigation and prosecution of algorithmic crimes associated with FEC.
The technical complexity of algorithmic trading systems creates evidentiary challenges, which existing legal frameworks struggle to address. In response to these challenges, the UK has implemented the Markets in Financial Instruments Directive II (MiFID II) (Directive 2014), which has introduced enhanced oversight requirements for algorithmic trading systems (Financial Conduct Authority 2023). Although these reforms aim to improve oversight and mitigate systemic risk, they also impose significant compliance burdens. Smaller firms that lack the resources to meet these obligations have increasingly been frustrated, which results in a concentration of algorithmic trading among larger firms with greater regulatory capacity (FCA 2018).

3.2.3. Fintech Innovation and Regulatory Gaps

The proliferation of financial technology (fintech) firms has introduced new vectors for financial and economic crime while simultaneously exposing structural limitations in the regulatory architecture of the UK. Innovations such as peer-to-peer (P2P) lending platforms, mobile payment systems, and algorithm-driven investment services (e.g., robo-advisors) have disrupted traditional financial intermediation models. However, these developments have also outpaced the evolution of legal and regulatory frameworks, which creates enforcement blind spots and jurisdictional ambiguities.
The collapse of London Capital & Finance (LCF) in 2019 exemplifies the regulatory challenges posed by fintech innovation. LCF operated a P2P-style investment scheme that raised over £230 million from retail investors through the issuance of mini-bonds, much of which was subsequently lost through poor lending decisions and potential fraud (FCA 2019a). The case revealed regulatory gaps in the oversight of P2P lending platforms, which fell outside traditional banking regulation while offering bank-like services to consumers, thereby enabling the evasion of scrutiny typically applied to deposit-taking institutions or collective investment schemes.
In response, the FCA introduced enhanced regulatory requirements for P2P lending platforms in 2020. These reforms included mandatory appropriateness assessments, investor classification protocols, and investment limits for retail clients. Although these measures sought to strengthen consumer protection, they also imposed significant compliance burdens. Many smaller platforms exited the market, resulting in increased concentration and reduced access to alternative credit for underserved borrowers (FCA 2019a, 2019b). The FCA’s response to the LCF collapse demonstrated the challenges of addressing regulatory gaps only after problems become apparent.
Legislative reforms have strengthened enforcement capabilities. The (ECCTA 2023) has introduced expanded powers for seizure and forfeiture of crypto assets, enhanced identity verification requirements for company officers, and created new offences such as failure to prevent fraud. These measures seek to close evidential and enforcement gaps in the fight against FEC (Home Office et al. 2024).
These developments underscore the manifestation of the Collingridge dilemma in the fintech context. Early regulatory inaction, driven by limited understanding of emerging business models, has been followed by reactive enforcement efforts constrained by institutional inertia and legal ambiguity. The result is a fragmented regulatory landscape in which innovation often outpaces oversight and enforcement agencies struggle to adapt legacy frameworks to emerging risks, despite the presence of coordination mechanisms such as the JMLIT and the Economic Crime Strategic Board (ECSB), which seek to bridge institutional gaps.

3.2.4. Artificial Intelligence (AI) and Machine Learning Applications

The integration of AI and machine learning technologies into financial services has presented emerging manifestations of the Collingridge dilemma. These technologies enhance fraud detection, automate compliance processes, and improve transaction monitoring (Home Office 2025a; O’Reilly 2025). However, they also create new opportunities for criminal exploitation, particularly in the context of financial crime.
AI-enabled FEC demonstrates the Collingridge dilemma in its most dynamic and complex form. In the early stages of AI deployment, regulators and prosecutors lacked technical expertise to anticipate the implications of autonomous decision-making systems. As these systems have advanced, with the ability to generate synthetic identities, manipulate digital or electronic evidence, and execute high-frequency fraud, the challenge of implementing effective legal controls has significantly grown (Home Office 2025c; O’Reilly 2025; Department for Science, Innovation & Technology 2025). AI-powered fraud detection systems can now identify suspicious transaction patterns with greater speed and accuracy than traditional rule-based approaches. However, these technologies also enable criminals to devise more sophisticated attacks. For instance, adversarial machine learning techniques can be used to bypass detection mechanisms, and deepfake technologies facilitate more convincing identity fraud (Caldwell et al. 2020; FATF 2025).
Recent assessments by the Alan Turing Institute’s Centre for Emerging Technology and Security (CETaS) highlight the UK’s limited institutional capacity to address AI-enabled crime. The report identifies a significant acceleration in AI-driven FEC, including phishing, impersonation scams, and synthetic fraud. It calls for the establishment of a dedicated AI Crime Taskforce within the National Crime Agency (Burton et al. 2025, p. 4). The report underscores the need for law enforcement to adopt AI tools to both detect and proactively disrupt criminal networks.
From a prosecutorial standpoint, AI-enabled financial crime presents unique evidentiary and doctrinal challenges. Establishing criminal intent when algorithms make autonomous decisions raises fundamental questions about responsibility and causation. If an AI system autonomously develops strategies that constitute market manipulation, determining criminal liability requires analysis of both the system’s programming and its emergent behavior (Law Commission 2025a; Kroll 2025).
Thus, traditional legal doctrines, anchored on human agency and intent, struggle to accommodate scenarios in which harm is occasioned by algorithmic outputs that were neither directly programmed nor foreseen by human actors. As Dsouza (2020) and the Law Commission (2025a) have noted, although current legal frameworks may suffice for narrow AI applications, they are increasingly strained by systems capable of adaptive, unsupervised learning and emergent behavior.
The evidentiary burden is further complicated by the opacity of many AI systems. Black-box algorithms, particularly those based on deep learning architectures, frustrate forensic scrutiny and challenge the admissibility and reliability of digital evidence. This has prompted calls for the development of comprehensive explainable AI (XAI) standards and the incorporation of algorithmic accountability mechanisms into criminal procedure (Dsouza 2020).
The current UK legal framework provides limited remedies for these challenges. For instance, the (Computer Misuse Act 1990) and the (Fraud Act 2006) focus on human intent and action, concepts that become ambiguous when applied to AI systems with varying degrees of autonomy. This creates uncertainty for both prosecutors and defense counsel about how existing laws apply to AI-enabled crimes.
Despite these challenges, AI also offers significant potential for enhancing financial crime enforcement. Financial institutions are increasingly deploying AI to reduce false positives in AML systems, automate Know Your Customer (KYC) processes, and detect anomalous transaction patterns in real time (Kroll 2025). However, the dual-use nature of AI—where the same tools can be used for both compliance and criminal purposes—necessitates a regulatory approach that is both technologically literate and anticipatory.
Therefore, the application of AI in financial services demonstrates the temporal and technical dimensions of the Collingridge dilemma. Early regulatory inaction, driven by uncertainty and limited foresight, has given way to enforcement challenges exacerbated by the complexity and opacity of AI systems. Addressing these challenges requires doctrinal innovation alongside institutional investment in technical expertise, cross-sector collaboration, and the development of clear legal standards that can accommodate the evolving nature of machine agency.
Corporate and Strict Liability in AI-FEC
A rigorous framework for corporate criminal attribution in AI-mediated FEC in the UK operates along two distinct statutory paths introduced by the (ECCTA 2023).
Firstly, attribution via senior managers (ECCTA 2023, s.196): Where a senior manager acting within actual or apparent authority commits a scheduled economic crime (for instance, fraud configured, deployed, or overseen through an AI system), the manager’s mens rea is attributed to the company, thereby rendering the organization criminally liable (ECCTA 2023, s.196). Section 196 extends the traditional “identification doctrine” by replacing the narrow “directing mind and will” test with a broader “senior manager” standard for offences listed in Schedule 12, including fraud, false accounting, money laundering, and market abuse offences.
Secondly, organizational failure to prevent fraud (ECCTA 2023, s.199), creates a strict liability corporate offence for large organizations where an associated person commits a listed fraud offence intending to benefit the organization (or a client to whom the organization provides services), unless the organization proves it had reasonable prevention procedures in place.
In AI settings, section 199 addresses organizational failures where employees or agents misuse AI systems, or where AI governance controls (including model-risk management frameworks, access controls, audit trails, and monitoring for drift or unauthorized override) were not reasonable in the circumstances (Home Office 2025b).
These statutory routes are complementary rather than alternative: Section 196 attributes a human manager’s criminal conduct to the corporate body through an extended identification mechanism, while section 199 imposes direct organizational liability for systems failures, with a statutory defense that is contingent upon demonstrating reasonable procedures. Neither provision eliminates the common law identification doctrine, which continues to apply to non-scheduled offences and organisations falling outside section 199’s “large organization” threshold.
Property classification no longer impedes establishing actus reus where AI-mediated conduct concerns crypto assets or other digital items. The (Property (Digital Assets etc) Act 2025 (c.29)), which received Royal Assent on 2 December 2025, confirms that a thing is not prevented from attracting personal property rights merely because it fits neither the traditional category of chose in possession nor chose in action (Property (Digital Assets etc) Act 2025). The Act deliberately leaves boundary-setting to incremental judicial development, thereby avoiding premature codification that might constrain adaptation to emerging asset classes.
Evidential standards present formidable challenges in AI-mediated prosecutions. The admissibility and probative weight of software-generated evidence and model outputs turn critically on transparency, auditability, and demonstrable reliability. These considerations were highlighted by the Ministry of Justice’s January 2025 call for evidence on computer-generated evidence following the Post Office Horizon scandal (Criminal Cases Review Commission n.d.), and by the Crown Prosecution Service’s commitment to human-in-the-loop oversight for prosecutorial AI use (Ministry of Justice 2025; Crown Prosecution Service 2025). Where AI systems function as “black boxes” lacking explainability, prosecutors face substantial hurdles in satisfying foundational evidential requirements, particularly where algorithmic outputs constitute core elements of the prosecution case rather than peripheral administrative evidence (Ministry of Justice 2025).
Finally, regulatory precedents on pattern-based inference warrant careful contextualization. While UK regulatory proceedings under the Market Abuse Regulation have accepted pattern-based inferences of manipulative intent in algorithmic trading contexts, applying civil standards of proof on the balance of probabilities (Financial Conduct Authority 2025), the translation of such reasoning to criminal prosecution remains conceptually plausible but operationally untested. Criminal trials demand proof beyond reasonable doubt, apply stricter evidential rules, and require demonstration of subjective mens rea rather than objective market impact. Regulatory decisions, therefore, constitute persuasive but non-binding analogies in criminal contexts.

3.3. Case Studies in Regulatory and Enforcement Responses

3.3.1. The OneCoin Fraud Case

Promoted as a revolutionary digital currency, OneCoin operated as a global Ponzi scheme that defrauded investors of over $4 billion between 2014 and 2016. The scheme’s co-founder, Ruja Ignatova, has been a fugitive since 2017 (Federal Bureau of Investigation n.d.), while the U.S. Department of Justice has prosecuted several senior figures associated with the scheme, including its Head of Legal and Compliance, who pleaded guilty in 2023 (U.S. Attorney’s Office 2023). This underscores the scale and sophistication of the fraud.
In OneCoin’s early operations, UK regulatory authorities struggled to assess its legitimacy due to the novelty of cryptocurrency technologies and the absence of a clear legal framework. The FCA initially issued a warning about OneCoin in 2016 but withdrew the notice in 2017, citing jurisdictional limitations (Bartlett 2020), although it was also alleged that the withdrawal followed legal pressure from the scheme’s representatives (Hamacher 2020).
As the fraudulent nature of OneCoin became increasingly apparent, UK victims sought redress through civil litigation. In 2024, the High Court issued worldwide freezing orders targeting Ignatova and her associates, following claims by over 400 UK-based OneCoin investors (Alecci 2024; Kirk 2024).
Despite the scale of the fraud, UK prosecutors have not brought criminal charges against OneCoin’s principals. Rather, enforcement efforts have focused on asset recovery and civil remedies. Criminal proceedings have been led primarily by U.S. authorities, who indicted Ignatova in 2017 and secured a 20-year sentence for her business partner, Karl Greenwood (Rahman 2023).
These legal actions reflect the transnational nature of the OneCoin cryptocurrency fraud and exemplify how the Collingridge dilemma impacts complex transnational financial crime cases. Specifically, the case demonstrates the first horn of the dilemma, where limited information and institutional uncertainty constrain early regulatory intervention in emerging technologies.

3.3.2. The Wirecard Scandal and Implications

The collapse of Wirecard AG, which was once a flagship European fintech firm, exposed critical regulatory vulnerabilities in the oversight of cross-border financial technology operations. Although the scandal originated in Germany, its ramifications for the UK were substantial due to the involvement of Wirecard Card Solutions Limited (WCSL), a subsidiary authorized by the FCA to issue e-money and provide payment services (FCA 2020). Wirecard’s rapid expansion and opaque corporate structure demonstrate the first horn of the Collingridge dilemma: early regulatory restraint stemming from limited understanding of emerging fintech models.
When Wirecard’s €1.9 billion accounting fraud was exposed in 2020, the FCA suspended WCSL’s operations to safeguard consumer funds (FCA 2020; Dias et al. 2021; Nagarajan 2020). However, the FCA’s enforcement capacity was hindered by jurisdictional fragmentation and WCSL’s operational dependence on its German parent company. The scandal revealed systemic deficiencies in cross-border supervisory coordination, as UK regulators lacked adequate visibility into the risks embedded in Wirecard’s global infrastructure.
Whereas German authorities led criminal investigations, the UK response remained confined to regulatory measures, highlighting the limitations of traditional enforcement mechanisms in transnational fintech cases. The Wirecard episode underscores the urgent need for anticipatory governance frameworks and strong international regulatory cooperation. The lack of effective cross-border supervisory protocols on FEC, such as those promoted by the European Securities and Markets Authority (ESMA) or the International Organization of Securities Commissions (IOSCO), contributed to fragmented oversight and delayed enforcement (Yusuf 2020).
The Wirecard case also elucidates an underappreciated aspect of the Collingridge dilemma: regulatory dependence on foreign supervisors creates secondary epistemic uncertainty. The FCA’s initial restraint was rational given BaFin’s ostensible oversight. However, BaFin itself exhibited Horn 1 characteristics, failing to scrutinize Wirecard’s claims adequately and even prosecuting journalists who exposed the fraud. This cascading uncertainty, in which UK regulators deferred to German regulators who themselves lacked information, shows how the dilemma operates at meta-levels in integrated financial markets.
The aftermath underscores the need for anticipatory governance mechanisms that do not rely solely on national regulators’ expertise. Proposals for European Banking Authority supervision of cross-border fintech firms respond directly to Wirecard’s jurisdictional arbitrage, but implementation lags adoption, which is a microcosm of the Collingridge dilemma at the supranational level.

3.3.3. Cryptocurrency Exchange Enforcement Actions

The enforcement actions against unregistered cryptocurrency exchanges provide insights into regulatory approaches to addressing the Collingridge dilemma. In 2021, the FCA banned Binance Markets Limited, a UK-based affiliate of the global Binance Group, from conducting regulated activities in the UK (Rahman and Schmidt 2025). The FCA cited the failure of Binance to meet AML standards and registration requirements pursuant to the 5AMLD (FCA 2021b). Although the FCA does not directly regulate cryptocurrencies such as Bitcoin or Ether, it oversees crypto asset derivatives and firms offering related financial services. The FCA’s intervention was driven by concerns over investor protection and systemic risks associated with unregulated crypto asset platforms.
This case illustrates the second horn of the Collingridge dilemma, where technological complexity and the entrenched market position of the regulated entity undermine regulatory action that is already delayed. Binance’s decentralized global structure and its continued service provision to UK users via offshore platforms undermined the FCA’s enforcement capacity, revealing the jurisdictional limitations of domestic regulation. In addition, the FCA’s investigation highlighted the resource constraints faced by national regulators in scrutinizing sophisticated digital platforms, especially in the absence of harmonized international standards.

3.4. Adaptive Strategies

3.4.1. The Financial Conduct Authority’s Approach

The FCA has adopted a range of adaptive strategies to address the regulatory challenges posed by emerging financial technologies. These strategies reflect an effort to navigate the Collingridge dilemma, whereby early regulatory uncertainty and later institutional inertia complicate enforcement. This dilemma highlights the difficulty of regulating technologies whose societal impacts are not fully understood during early development yet become entrenched and resistant to change once widely adopted.
One of the FCA’s flagship initiatives is the regulatory sandbox, launched in 2016 to facilitate controlled experimentation with innovative financial products and services. The sandbox allows companies to operate under modified regulatory conditions, subject to appropriate consumer protection safeguards. This enables early engagement with emerging technologies while mitigating potential harm through structured oversight (FCA 2025a). Even though the sandbox has supported innovation in areas such as robo-advice and P2P lending, publicly available FCA reports offer limited systematic insights into its effectiveness in identifying vulnerabilities to FEC. This implies a gap between regulatory experimentation and enforcement intelligence (FCA 2017b; Schilling de Carvalho 2022).
In the same vein, the FCA has adopted a guidance-based regulatory model for cryptocurrency-related activities. Instead of imposing prescriptive rules, the FCA provides interpretive guidance under the MLRs, which seeks to preserve regulatory flexibility while providing clarity to market participants. Enforcement challenges nonetheless persist. By 2022, the FCA had registered 31 crypto firms under the MLRs, yet no confirmed prosecutions under POCA had been reported (FCA 2022) until September 2024, when the first crypto-asset-related criminal prosecution, involving an unregistered crypto ATM operator, was initiated on offences under the MLRs and POCA (Cross 2024). This underscores the difficulty of translating guidance into effective enforcement mechanisms.

3.4.2. Cross-Agency Coordination Mechanisms

In recognition of the fact that FEC increasingly transcends jurisdictional and regulatory boundaries, the UK has developed several cross-agency coordination mechanisms to enhance its enforcement capacity. The JMLIT, for instance, facilitates intelligence sharing among law enforcement agencies, regulators, and private-sector entities. As of 2025, JMLIT has supported over 1200 operations, contributed to 419 arrests, and enabled the seizure and restraint of over £250 million in criminal assets (National Crime Agency 2025a; Nardello & Co. 2025). These figures underline JMLIT’s role in addressing technology-enabled FEC, particularly cyber-enabled typologies such as online payment fraud, money-mule account networks, crypto-asset laundering, and ransomware cash-outs, which demand data-driven, cross-sector analysis (National Crime Agency 2025b).
Despite these successes, coordination challenges persist, which exemplify the second horn of the Collingridge dilemma, where institutional fragmentation and technical disparities impede effective enforcement. Variations in legal authority, technical capacity and operational priorities across agencies generate coverage gaps and inconsistencies, particularly in relation to critical technology platforms such as decentralized finance and blockchain.
In response, the Economic Crime Strategic Board (ECSB) was established to strengthen strategic coordination among government departments and enforcement bodies. Recent updates to the Economic Crime Plan 2 confirm that the ECSB continues to prioritize technical capacity building and cross-agency training, with some improvements in prosecution rates and asset recovery (HM Treasury and Home Office 2019b; Home Office 2025c). These developments demonstrate the importance of a technically literate enforcement ecosystem in responding to technology-enabled FEC.

3.4.3. Recent Legislative and Regulatory Reform Initiatives

The UK government has pursued a series of legislative reforms aimed at addressing structural deficiencies in FEC law exposed by technological innovation. The Economic Crime (Transparency and Enforcement) Act (ECTEA 2022) introduced new powers to combat sanctions evasion and improve beneficial ownership transparency. In particular, Companies House now maintains expanded registers, including the Register of Overseas Entities, to enhance corporate transparency.
Building on this foundation, the Economic Crime and Corporate Transparency Act (ECCTA 2023) introduces further reforms. These include enhanced identity verification powers for Companies House, expanded registrar authority to query and reject information, and new corporate liability provisions. The Act is designed to address the misuse of UK corporate structures in FEC, particularly those involving cryptocurrency and fintech platforms.
However, these legislative efforts are constrained by the Collingridge dilemma. The rapid pace of technological change often outstrips the legislative process, resulting in reactive rather than anticipatory governance. This dynamic fosters a regulatory lag that undermines the effectiveness of statutory reforms and complicates enforcement in fast-evolving digital ecosystems (HM Treasury and Home Office 2025).

3.5. Comparative Perspective

A comparative overview of selected jurisdictions shows that technology-enabled FEC is often addressed by trading off early legal certainty against adaptive capacity as technologies entrench. The brief discussion of the approaches used by the selected jurisdictions (EU, UK and US) below is illustrative rather than exhaustive of regional or global diversity. It nevertheless distils how different governance logics, including enforcement-led adaptation and proactive harmonization, either alleviate or reproduce the horns of the Collingridge dilemma (i.e., early uncertainty and later inertia), and how this provides lessons for UK policymakers seeking to design resilient and forward-looking legal frameworks.
The following rubrics are systematically analyzed across the three jurisdictions: Regulatory model, Adaptability and Predictability. These rubrics are key levers for understanding the Collingridge dilemma’s dynamics and the corresponding regulatory approaches. Table 2 presents a brief analytical matrix for these rubrics, which are analyzed further in Section 3.5.1, Section 3.5.2 and Section 3.5.3.

3.5.1. United States: Manifesting Enforcement-Led Adaptation

The U.S. shows Moderate adaptability and Moderate overall predictability, with mixed signals. The U.S. pursues FEC primarily through “regulation by enforcement”, leveraging existing federal instruments such as the Bank Secrecy Act 1970 and Title 18 provisions on fraud and money laundering. This enforcement posture is operationalized through a multi-agency framework led by the Department of Justice (DOJ), Securities and Exchange Commission (SEC), Commodity Futures Trading Commission (CFTC), Financial Crimes Enforcement Network (FinCEN), and Office of Foreign Assets Control (OFAC) (Gibson Dunn 2025).
Thus, the approach remains case-based and multi-agency as the above agencies collectively set expectations through prosecutions, consent orders and settlements (e.g., the 2023 Binance resolutions), demonstrating responsive disruption at scale. This approach has yielded asset-focused results, exemplified by the June 2025 civil forfeiture complaint that recovered $225.3 million in cryptocurrency linked to investment fraud schemes, which is the largest seizure in U.S. Secret Service history (U.S. Department of Justice 2025; U.S. Secret Service 2025).
The capacity to neutralise harm quickly is evident in record FY 2024 monetary relief at the CFTC and in high-impact SEC outcomes concentrated in fewer but larger actions; enforcement capacity remains high (U.S. Department of Justice 2025; USAO DC 2025).
The Guiding and Establishing National Innovation for U.S. Stablecoins (GENIUS) Act of 2025 (S.1582) introduces a federal framework for payment stablecoins, but its scope does not extend to the broader crypto ecosystem, including decentralized finance (DeFi), non-fungible tokens (NFTs), and non-stablecoin tokens (Congress.gov 2025). Beyond stablecoins, overlapping mandates (SEC, CFTC, DOJ, FinCEN, OFAC, IRS; state regulators) keep predictability moderate and constrain systemic adaptability (GovFacts 2025). Predictability varies by forum because perimeter clarity is often inferred from enforcement posture and judicial interpretation rather than codified ex-ante (CFTC 2024; U.S. Securities and Exchange Commission 2023).
In Collingridge terms, this model is strong on late-stage disruption (Collingridge 1980) but places heavier burdens on prosecutors and defendants to infer standards from case law and multi-agency guidance, and it can complicate ex-ante investment and evidentiary planning for transnational matters (Cornerstone Research 2024; CFTC 2024).
However, this model is reactive and institutionally fragmented. Decentralized networks and extraterritorial platforms continue to frustrate attribution and recovery, while agency guidance often lags behind emergent typologies (Woods et al. 2021).
Broader legislative efforts such as the (Financial Innovation and Technology for the 21st Century Act 2023) and the (Digital Asset Market Clarity Act 2025) aim to provide regulatory clarity, yet even when enacted, they will leave prosecutors largely reliant on legacy criminal statutes and agency guidance to address technology-mediated offences, including those involving AI. In AI, innovation-specific liability remains state-led, with Colorado SB24-205 creating consumer protections against algorithmic discrimination by high-risk AI systems (Colorado General Assembly 2024).
Thus, while the U.S. model demonstrates sophisticated asset tracing and doctrinal flexibility, it also reveals the limitations of enforcement-led adaptation in the face of rapid technological evolution.

3.5.2. European Union: Manifesting Proactive Harmonization

The EU delivers High predictability and Moderate-High adaptability through a single rulebook. Predictability and adaptability are anchored in the EU’s evolving regulatory architecture for FEC. Its instruments include the Markets in Crypto-Assets Regulation (MiCA), which introduces a unified authorization regime for Crypto-Asset Service Providers (CASPs) covering prudential, governance, and disclosure obligations (European Union 2023); the Transfer of Funds Regulation (TFR), which operationalizes the travel rule for crypto-asset transfers by mandating the collection and transmission of originator and beneficiary information (EUR Lex 2023); and the AMLR, which directly imposes applicable AML obligations across Member States (EUR Lex 2024a), with the new Anti-Money Laundering Authority (AMLA) charged with supervising convergence and overseeing high-risk entities from 2028 (AMLA 2025). The EU AI Act adds a risk-based governance layer (EUR Lex 2024b). Together, these establish a framework for crypto-market integrity and AML/CFT in the EU that seeks to reduce interpretive divergence and enhance cross-border evidentiary interoperability. Its effectiveness is, however, temporally constrained: phased implementation risks a regulatory lag relative to fast-evolving obfuscation techniques and DeFi structures.
The Moderate-High Adaptability is further supported by delegated/implementing acts such as Regulatory Technical Standards (RTS) and Implementing Technical Standards (ITS).
The EU’s approach, thus, provides ex-ante legal clarity by codifying the perimeter of digital-asset activity in MiCA and then adapts through delegated/implementing acts. In practical terms, this reduces early-stage uncertainty for firms, supervisors, and courts (clear authorisation thresholds; uniform conduct and market-abuse rules), while treating late-stage change as an ordinary legislative pace rather than an exceptional fix (European Union 2023; European Commission 2023). These dynamics reflect Collingridge’s timing problem (Collingridge 1980; Liebert and Schmidt 2010).

3.5.3. The UK—Manifesting Technology Neutral Principles

The UK achieves Moderate adaptability and Moderate predictability. It implements technology-neutral, principles-based supervision, plus targeted statutory reform under (ECCTA 2023) that tightens attribution, identity-verification and failure-to-prevent contours. Uncertainty is turned into supervised experimentation via the FCA’s Regulatory Sandbox, which enables iterative learning (Debevoise & Plimpton 2018; FCA 2025a; The National Archives 2023). Lessons are then consolidated doctrinally where experience reveals gaps, as operationalised in (ECCTA 2023), which augments crypto-asset search, seizure, and civil recovery powers within POCA.
Clear AML/CTF baselines stem from the MLRs 2017, with the FCA designated AML/CTF supervisor for UK crypto asset businesses since 10 January 2020 (FCA 2025a). At the same time, a multi-agency architecture (FCA, NCA, SFO, CPS, HMRC, City of London Police) and PPP coordination via JMLIT sustain the Collingridge power/timing constraints.
The FCA’s First Supervisory Notice restricting Binance Markets Limited and mandating consumer warnings exemplifies targeted intervention rather than ex-ante codification (FCA 2021a). This pattern maps onto the Collingridge dilemma’s double bind (Collingridge 1980; Liebert and Schmidt 2010; Nordmann 2010). Iterative learning preserves agility without sacrificing legal coherence: prosecutors gain practical insight from sandboxed telemetry, courts receive clearer statutory anchors, and firms obtain process-based predictability through consultation practices. The limitation is uneven interoperability, as principles-based supervision can yield case-by-case nuance that is harder to port across borders than codified EU rules, making mutual recognition and evidence portability more bespoke.

3.5.4. Comparative Insights from the UK, EU and US

The model logic is that the EU codifies, the UK tests and codifies later, and the US enforces and then signals. Each resolves early uncertainty and late entrenchment differently: the EU emphasizes uniformity (predictability), the UK learning and targeted repair (balanced agility), and the US deterrence and asset disruption (speed and scale). The three jurisdictions illustrate Collingridge’s double bind in practice (Collingridge 1980; Liebert and Schmidt 2010; Nordmann 2010). The EU’s harmonized regime reduces epistemic uncertainty (High predictability). The EU legal framework offers a blueprint for structured stakeholder engagement, but it also shows the need for agile legislative mechanisms capable of adapting to rapid technological change and criminal innovation (Jones Day 2025).
The U.S. demonstrates that enforcement-first strategies can recover assets and deter misconduct without delivering rule clarity across broader markets (predictability is Medium, with stability confined to the stablecoin slice). The UK sits between these poles: Moderate/Moderate outcomes emerge from sandbox-enabled learning and targeted statutory powers, but fragmentation persists.
Essentially, the UK’s evolving regulatory framework, anchored by statutes such as ECCTA 2023 and ongoing cryptocurrency consultations, must balance proactive engagement with the risk of enacting regulation that may become obsolete or unaligned with future technological developments. This comparative insight shows that early legal clarity (Garrett 2014), inter-agency coordination, structured experimentation, legal adaptability, and the development of bespoke regulatory tools are essential to navigating the Collingridge dilemma in the context of FEC enforcement and digital asset governance. These measures reduce reliance on post-hoc deterrence in favour of ex-ante regulatory intervention.

3.6. Implications for Legal Practice and Enforcement

3.6.1. Prosecutorial Strategy Adaptations

The Collingridge dilemma necessitates a strategic recalibration of prosecutorial approaches to FEC. Traditional methods—anchored in precedent and conventional evidentiary standards—are increasingly inadequate for addressing offences that have been enabled by emerging technologies. Prosecutors are now required to have interdisciplinary competencies, including blockchain analytics, forensic data recovery, and algorithmic tracing, to navigate complex digital ecosystems and meet evolving evidentiary thresholds (Law Commission 2025a).
The SFO has initiated institutional adaptations to meet these demands, including partnerships with technology firms and “trialing the use of [TAR], utilizing AI, on a live criminal case”, which has “demonstrated that TAR could help meet legal disclosure obligations more efficiently” (UK Parliament 2025c). The SFO’s 2025–26 Business Plan confirms the opportunities for using emerging technologies such as AI in investigations (Serious Fraud Office 2025). These developments reflect a broader institutional shift toward technologically literate prosecution, which aligns with anticipatory governance principles and reinforces the need for doctrinal and operational agility in the face of rapid technological change.

3.6.2. Judicial Education and Adaptation

In the face of increasing technological complexity, the judiciary is confronted with capacity challenges in interpreting technical evidence. Judges and juries are increasingly required to adjudicate cases involving algorithmic systems, blockchain traces, and AI-generated artefacts, which are domains that often exceed conventional legal training. Although the UK’s Judicial College trained over 24,000 judicial office holders in 2023–24, its curriculum remains largely doctrinal and procedural, with limited integration of deep technical modules on emerging technologies (Judicial College 2024). This gap risks epistemic asymmetry in courtrooms, where expert evidence may be accepted uncritically or misunderstood.
To mitigate this, UK courts retain discretion under Civil Procedure Rules Part 35 to appoint expert assessors in complex technical cases. Rule 35.14 permits the use of assessors to assist the court on matters requiring specialized knowledge, offering a mechanism to enhance judicial understanding without compromising independence (Ministry of Justice 2021). Innovative practices such as appointing neutral technical assessors, particularly in cases involving AI, cryptography, or financial algorithms, could strengthen evidentiary scrutiny and procedural fairness. Even so, their use remains sporadic, which underscores the need for institutional investment in judicial technical literacy and procedural innovation.

3.6.3. Defense Bar Adaptations

Defense practitioners face mounting challenges in representing clients accused of technology-enabled FEC. Effective advocacy increasingly demands fluency not only in the alleged technological conduct, such as blockchain misuse or algorithmic manipulation, but also in the forensic methodologies employed by prosecutors. This burden is exacerbated by disparities in access to technical expertise, particularly for defendants with limited financial resources. The Independent Review of Disclosure and Fraud Offences (Home Office 2025d) highlights the systemic risks posed by digital disclosure, including data overload, late access, and the inability to independently verify forensic outputs. These constraints undermine procedural fairness and increase the risk of miscarriages of justice.
These developments underscore the imperative for systemic legal adaptation across prosecution, adjudication, and defense. Ensuring procedural fairness and institutional resilience in the face of technological innovation and disruption requires both legal reform and investment in technical capacity and interdisciplinary training.

3.7. Emerging Challenges

3.7.1. Emerging Technologies

Emerging technologies continue to reshape FEC enforcement, intensifying the Collingridge dilemma. For instance, quantum computing, though nascent, poses a significant threat to current cryptographic standards: its potential to break encryption could undermine the security of financial systems and digital assets (Weinberg and Faccia 2024).
According to projections by the National Institute of Standards and Technology (NIST) and the European Union Agency for Cybersecurity (ENISA), quantum computing is expected to pose practical cryptographic threats by the early to mid-2030s. This necessitates pre-emptive or anticipatory regulatory and technological countermeasures (Weinberg and Faccia 2024). These agencies have initiated post-quantum cryptography standards to mitigate future vulnerabilities (NIST 2024; ENISA 2021).
Central Bank Digital Currencies (CBDCs) provide capabilities for enhanced traceability and fraud prevention. However, they also introduce systemic risks, privacy concerns, and cybersecurity vulnerabilities. Their implementation demands innovative regulatory frameworks and cross-border coordination to mitigate FEC risks (Murphy et al. 2024).
In addition, DeFi protocols present significant enforcement challenges. Despite claims of decentralization, most DeFi platforms retain centralized governance features while facilitating pseudonymous, cross-border transactions. The 2023 risk assessment by the U.S. Treasury highlights widespread AML/CFT non-compliance among DeFi services, exploited by ransomware actors and fraudsters (U.S. Treasury 2023). Comparative regulatory responses to DeFi highlight the UK’s need for agile frameworks. Singapore’s Monetary Authority has implemented licensing under the Payment Services Act, while the EU’s MiCA regulation imposes AML/CFT obligations on DeFi platforms. These models offer structured oversight that balances innovation with enforcement (MAS 2025).
Furthermore, AI technologies illustrate the dual-use dilemma. Even though AI enhances fraud detection through real-time anomaly analysis and behavioral modelling, it also enables sophisticated fraud schemes, including deepfakes and synthetic identities. The rapid evolution of AI-driven fraud necessitates adaptive legal and technical responses (Kroll 2025).

3.7.2. Systemic Risk and Long-Term Considerations

The convergence of emerging technologies creates systemic risks that exceed the sum of their parts. The WEF identifies digital interdependencies, shared model vulnerabilities, and regulatory gaps as key sources of systemic risk in financial services (WEF 2025). Systemic risks from emerging technologies are compounded by enforcement fragmentation. As noted in Section 3.6, prosecutorial and judicial capacity gaps hinder coordinated responses. This fragmentation intensifies vulnerabilities in cross-border FEC enforcement, particularly where technologies such as AI and DeFi operate beyond traditional jurisdictional boundaries.
A persistent challenge in FEC enforcement is the lack of institutional mechanisms for strategic foresight and horizon scanning. Despite the increasing complexity of financial technologies, enforcement agencies often lack the capacity to anticipate technological trajectories and associated risks. Ekblom (2022) posits that horizon scanning must be systematically integrated into security and crime prevention strategies to avoid reactive enforcement and regulatory lag. This foresight gap significantly contributes to delayed responses and fragmented enforcement. Woods (2022) complements this view by proposing a qualitative approach for strategic risk prioritization.
Furthermore, adaptive governance frameworks, characterized by modular regulation, iterative learning and stakeholder engagement, are not fully developed in the UK enforcement ecosystem (Bennear and Wiener 2019; Simon 2024). The absence of institutional agility and feedback loops impedes timely recalibration of enforcement strategies in response to evolving threats.
Another long-term strategic challenge in FEC enforcement lies in the structural limitations of international cooperation. Treaty-based mechanisms, though foundational, are usually too slow and rigid to keep pace with the rapid evolution of technological innovation. This rigidity hinders the development of adaptive, cross-border enforcement strategies. The absence of dynamic, interoperable regulatory standards and real-time intelligence-sharing frameworks reflects a deeper governance gap; one that undermines the effectiveness of national and global enforcement regimes in responding to emerging threats with agility and foresight (FATF 2023; Goldbarsht and Harris 2024).

4. Recommendations for Reform

UK enforcement is consistently confronted with the Collingridge dilemma: early-stage epistemic uncertainty hinders timely intervention, while late-stage institutional inertia slows disruption once harms scale (Demos Helsinki 2022; Fisher 2025). A credible response must be technology-neutral, combining early legal clarity with late-stage disruption and recovery, and must convert private telemetry (e.g., operational data, privately held insights, real-time signals from transaction monitoring systems) into courtroom-ready evidence. The following measures target the frictions identified in Section 3.1, Section 3.2, Section 3.3, Section 3.4, Section 3.5, Section 3.6 and Section 3.7 above. These recommendations are grounded in a normative commitment to proportionality, accountability, and the rule of law.

4.1. Legal Framework Reforms

The UK’s legal framework must evolve to address doctrinal gaps exposed by AI, blockchain, and decentralized systems. To ensure early legal clarity on FEC in relation to technological innovation, the Computer Misuse Act 1990 should be modernized in the short (up to 2 years) to medium (2 to 5 years) term with a narrow public-interest defense for bona fide security research, bounded by intent, proportionality, and coordinated disclosure. This would reduce the chill on converting technical signals into admissible proof while preserving core offences (CyberUp Campaign 2025).
Late-stage entrenchment requires targeted disruption and asset-recovery powers. Accordingly, in the short to medium term, building on ECCTA 2023, Government should prioritize scalable fraud-neutralization tools (e.g., rapid suspension, agile confiscation) subject to proportionality safeguards. In the same vein, corporate attribution and failure-to-prevent offences should be strengthened to reinforce accountability where platform-like intermediaries facilitate offending (Home Office et al. 2024; TRM 2023).
Given AI’s evidentiary opacity, Parliament should, in the short term, clarify liability and evidentiary standards on a technology-neutral basis, allocating responsibility by control over design, deployment, and oversight, and specifying disclosure duties where model decisions are relied upon in prosecution or defense.
This clarification can be achieved by establishing technology-neutral standards and assigning responsibility by control over design and deployment. For explainable AI (XAI), auditable logs should be mandated under the CPS Disclosure Manual (Ch. 30), enhancing admissibility in line with the proportionality approach in R v Bater-James (2020); treating model explainability reports as “reasonable lines of enquiry,” for example, could reduce black-box challenges in trials.
Under the CPS Disclosure Manual, digital material must be handled transparently and proportionately, with documented strategies and audit trails for searches (Crown Prosecution Service 2022). The Attorney General’s Guidelines on Disclosure reinforce this by requiring early disclosure management documents in Crown Court cases (Attorney General’s Office 2024). In (R v Bater-James 2020), the Court of Appeal held that access to digital records must be incremental and based on “reasonable lines of enquiry,” thus, rejecting speculative trawls. Where algorithmic evidence is used, CPS guidance on expert witnesses obliges experts to record, retain, and reveal materials that might affect reliability (Crown Prosecution Service 2022), consistent with Criminal Procedure Rules 2025 Part 19 and Criminal Practice Directions 2023, which require transparent methodology (The National Archives 2025; Judiciary of England and Wales 2024, chap. 7). Explainability reports, therefore, constitute a reasonable line of enquiry and, though not guaranteeing admissibility, can mitigate “black-box” challenges by enabling proportional scrutiny of reliability and weight.
This implies that legal responsibility should be assigned to those who exercise meaningful control over the AI system, whether in its creation, operational use, or governance (UK Parliament 2025a; Manheim and Homewood 2025).
At the same time, corporate liability reforms should reflect the role of technology platforms in facilitating FEC. The Law Commission’s recommendation on statutory duties for digital service providers, including due diligence and reporting obligations (Law Commission 2025a), should be implemented fully in the medium to long term (5 years and above).

4.2. Institutional and Regulatory Adaptations

Addressing the Collingridge dilemma requires institutional reforms that reflect the complexity and transnational nature of technology-enabled FEC. Though the UK’s current enforcement architecture is robust for traditional offences, it lacks the agility and technical depth required for emerging threats. In the short term, enforcement bodies (e.g., NCA, HMRC) need structures that shorten the discovery-to-disruption cycle. A dedicated financial-technology crime unit (comprising technical analysts, financial investigators, and prosecutors) is needed within the UK’s serious and organised crime framework. This should be supported by modern tooling and disclosure workflows already used in complex fraud cases (Nardello & Co. 2025).
To enhance coordination, JMLIT should, in the long term, evolve into near-real-time intelligence cells involving major banks, fintechs, and registered crypto asset firms, with safeguards for purpose-limitation and audit. In support, horizon-scanning and red-team functions should be strengthened to better anticipate the shift from uncertainty to entrenchment across crypto assets, algorithmic trading, and AI (Nardello & Co. 2025).
In the medium term, private telemetry needs to be converted into evidence through FATF-consistent, near-real-time information-sharing across key institutions such as banks, fintechs, VASPs, and law enforcement (FATF Recommendation 2). This should interoperate with the “travel rule” for cross-border crypto-transfers, with strict privacy and audit protocols (FCA 2023).
Furthermore, in the medium term, payment-systems oversight should impose technology-neutral duties on PSPs to detect and block crimes such as AI-enabled impersonation and synthetic-identity fraud. This should be synchronized with proportionate reimbursement and dispute-resolution mechanisms. For crypto crime, supervisors should focus on the operational effectiveness of AML controls and ECCTA’s crypto asset toolkit (HM Government 2024a).
Even though the UK’s regulatory sandbox appears successful in promoting innovation, it must, in the short term, systematically integrate FEC risk assessments into its testing protocols. The Home Office and UK Finance pilot on peer-to-peer data sharing demonstrates the feasibility of embedding FEC enforcement considerations into sandbox environments (ICO 2024).
In the short term, fintech firms should be subject to enhanced due diligence, including mandatory suspicious activity reporting and principal vetting. Additionally, international coordination needs to be strengthened through bilateral agreements and shared intelligence platforms. The UK’s use of Overseas Production Orders (Crime Act 2019; Meerza and Cassidy-Taylor 2022) and active participation in FATF initiatives provide a solid foundation for expanded cooperation (Financial Action Task Force 2022).
In the long term, although JMLIT’s flexibility helps it reach as many participants as needed to enrich debate, insight and information sharing, it may risk inertia under Collingridge’s second horn because it lacks a statutory mandate to counter criminal risks associated with a technology’s early-stage development. The same applies to the ECSB, whose soft-law status limits accountability and legal clarity. Statutory mandates should therefore be considered to empower JMLIT and the ECSB to act more decisively, codifying their powers and duties to enhance their legitimacy, oversight, transparency, sustainability and participation.

4.3. Capacity Building Initiatives

As courts require improved technical competence to deliver justice and uphold the rule of law, judicial authorities, investigators and prosecutors should, from the short to the long term, receive continuous training in technical domains such as crypto recovery, blockchain analytics, AI evidence and explainability, algorithmic forensics, and market microstructure. This builds on ongoing digital-disclosure reforms (Fisher 2025). The Judicial College should expand modules on these areas and more routinely appoint neutral technical assessors under CPR Part 35 (Ministry of Justice 2021).
These measures form part of a technology-neutral package that should create early courtroom-relevant clarity; accelerate disruption and recovery of FEC; institutionalize secure, FATF-aligned information flows; and raise technical competence in investigation and adjudication. At all material times, they should preserve proportionality, transparency, and procedural fairness, ensuring that enforcement remains effective, lawful and legitimate amid rapid technological change.

5. Conclusions

The Collingridge dilemma presents enduring challenges for prosecuting FEC in the UK, particularly as emerging technologies tend to outpace regulatory adaptation (Collingridge 1980; Moses 2013). This paper makes a novel contribution to the literature on FEC enforcement and technology governance by applying the Collingridge dilemma to criminal law contexts, where implications of the dilemma have been underexplored. It reveals the unsettled liability challenges for tech-mediated mens rea and actus reus. The paper integrates doctrinal legal analysis with empirical case studies to bridge theoretical gaps and provides a socio-legal framework for understanding enforcement challenges in the face of rapid technological change.
This paper has demonstrated that early regulatory restraint usually leads to enforcement difficulties once technologies become entrenched and exploited by criminal actors (Coffee 2021; Caldwell et al. 2020). Case studies such as OneCoin, Wirecard, and Binance demonstrate how limited early oversight and fragmented institutional responses impede effective prosecution (FCA 2021b). However, adaptive strategies such as the FCA’s sandbox (Zetzsche et al. 2017), cross-agency coordination through JMLIT, and international regulatory harmonization provide promising avenues for mitigating these constraints and the pacing problem.
The relevance of this study lies in its timely exploration of how emerging technologies such as cryptocurrency, AI, and algorithmic trading disrupt traditional enforcement mechanisms. The case studies demonstrate the real-world consequences of regulatory lag and institutional fragmentation, which underscores the urgency of adaptive legal and policy responses.
The paper recommends that authorities invest in technical capacity (Law Commission 2025b), pursue institutional reforms, and reform legal frameworks to address AI- and crypto-specific liabilities (OECD 2021). These measures must be supported by flexible governance models that evolve with technological change (Barben et al. 2008; Stilgoe et al. 2013; Law Commission 2025b).
Although the Collingridge dilemma has proven to be analytically valuable in framing the pacing problem in FEC enforcement, its binary structure could benefit from further theoretical refinement. Future research should explore how hybrid or iterative regulatory adaptations can be integrated into the dilemma’s framework to reflect more nuanced enforcement realities (Liebert and Schmidt 2010). Moreover, in-depth comparative studies across jurisdictions and sectors could reveal how complementary models, such as responsive regulation or technological momentum, interact with the dilemma to shape legal outcomes.
Furthermore, precautionary and risk-based frameworks are underutilized in FEC enforcement. Research should examine how these frameworks can be operationalized to anticipate technological threats and guide legal reform (Arner et al. 2017). These conceptual and empirical gaps, if addressed, will enhance the model’s explanatory power and support the development of resilient global legal standards for the prosecution and regulation of FEC, which can adapt to rapid technological innovation while maintaining procedural integrity and the rule of law.
In the end, this paper does not seek to eliminate the tension between innovation and regulatory control, but to effectively manage it through adaptive, collaborative, and empirically informed socio-legal approaches that safeguard financial and economic integrity while enabling responsible innovation.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

No proprietary data were used. Most of the materials can be accessed online, except for a few books.

Acknowledgments

I am very grateful to the Laws editorial team at MDPI for the APC waiver and for editorial support.

Conflicts of Interest

The author declares no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
FCA: Financial Conduct Authority
NCA: National Crime Agency
SFO: Serious Fraud Office
PRA: Prudential Regulation Authority
CPS: Crown Prosecution Service
HMRC: HM Revenue & Customs
HMICFRS: HM Inspectorate of Constabulary and Fire & Rescue Services
ESMA: European Securities and Markets Authority
AMLA: Anti-Money Laundering Authority
AMLR: Anti-Money Laundering Regulation
ECCTA: Economic Crime and Corporate Transparency Act
JMLIT: Joint Money Laundering Intelligence Taskforce
AI: Artificial Intelligence
FATF: Financial Action Task Force
KYC: Know Your Customer
NFT: Non-Fungible Tokens
DeFi: Decentralized Finance
ECTEA: Economic Crime (Transparency and Enforcement) Act
MLRs: Money Laundering Regulations
NECC: National Economic Crime Centre
POCA: Proceeds of Crime Act
UK: United Kingdom
FinCEN: Financial Crimes Enforcement Network
CASPs: Crypto-Asset Service Providers
ECSB: Economic Crime Strategic Board

References

  1. AA v Persons Unknown, [2019] EWHC 3556 (Comm). 2019. Available online: https://www.judiciary.uk/wp-content/uploads/2022/07/AA-v-Persons-Unknown-summary-case-note-SB-amended-1.pdf (accessed on 5 December 2025).
  2. Alan Turing Institute. 2025. UK Law Enforcement Inadequately Equipped to Tackle AI-Enabled Crime. Available online: https://www.turing.ac.uk/news/uk-law-enforcement-inadequately-equipped-tackle-ai-enabled-crime (accessed on 10 December 2025).
  3. Alecci, Scilla. 2024. A UK Court Ordered a Global Asset Freeze for the ‘Cryptoqueen’ and Her OneCoin Associates. International Consortium of Investigative Journalists (ICIJ). Available online: https://www.icij.org/news/2024/08/a-uk-court-ordered-a-global-asset-freeze-for-the-cryptoqueen-and-her-onecoin-associates/ (accessed on 4 July 2025).
  4. AMLA. 2025. Work Programme 2025: From Vision to Action. European Anti-Money Laundering Authority. Available online: https://www.amla.europa.eu/document/download/b78bee2f-16b9-4742-a3a1-23e7aad394ab_en?filename=AMLA_Work_Programme_July%202025_0.pdf (accessed on 4 July 2025).
  5. Anti-Terrorism, Crime and Security Act. 2001. c. 24. Available online: https://www.legislation.gov.uk/ukpga/2001/24/contents (accessed on 25 May 2025).
  6. APPG on Anti-Corruption & Responsible Tax. 2022. Written Evidence Submitted to the UK Parliament. Available online: https://committees.parliament.uk/writtenevidence/127415/html/ (accessed on 4 July 2025).
  7. Arner, Douglas W., Jànos Barberis, and Ross P. Buckley. 2017. FinTech, RegTech, and the reconceptualization of financial regulation. Northwestern Journal of International Law & Business 37: 371–413. [Google Scholar]
  8. Attorney General’s Office. 2024. Attorney General’s Guidelines on Disclosure. GOV.UK. Available online: https://assets.publishing.service.gov.uk/media/65e1ab9d2f2b3b00117cd803/Attorney_General_s_Guidelines_on_Disclosure_-_2024.pdf (accessed on 8 July 2025).
  9. Ayinde v London Borough of Haringey and Al-Haroun v Qatar National Bank [2025] EWHC 1383 (Admin). 2025. Available online: https://www.judiciary.uk/wp-content/uploads/2025/06/Ayinde-v-London-Borough-of-Haringey-and-Al-Haroun-v-Qatar-National-Bank.pdf (accessed on 8 July 2025).
  10. Ayres, Ian, and John Braithwaite. 1992. Responsive Regulation: Transcending the Deregulation Debate. Oxford: Oxford University Press. Available online: https://johnbraithwaite.com/wp-content/uploads/2016/06/Responsive-Regulation-Transce.pdf (accessed on 12 June 2025).
  11. Baldwin, Robert, and Julia Black. 2008. Really responsive regulation. Modern Law Review 71: 59–94. [Google Scholar] [CrossRef]
  12. Balthazor, Andrew W. 2019. The challenges of cryptocurrency asset recovery. FIU Law Review 13: 1207–234. [Google Scholar] [CrossRef]
  13. Banakar, Reza, and Max Travers. 2005. Theory and Method in Socio-Legal Research. Oxford: Hart Publishing. [Google Scholar]
  14. Bank for International Settlements & FSI. 2024. Regulating AI in the Financial Sector: Recent Developments and Main Trends (FSI Insights No. 63). Available online: https://www.bis.org/fsi/publ/insights63.htm (accessed on 12 June 2025).
  15. Bank of England. 2014. Quarterly Bulletin 2014 Q3. Volume 54, No. 3. pp. 262–79. Available online: https://www.bankofengland.co.uk/-/media/boe/files/quarterly-bulletin/2014/quarterly-bulletin-2014-q3.pdf (accessed on 12 June 2025).
  16. Bank of England PRA. 2025. Prudential Regulation Authority Annual Report. Available online: https://assets.publishing.service.gov.uk/media/685aa557454906840a44d606/PRA_Annual_Report_2024-25.pdf (accessed on 10 December 2025).
  17. Barben, Daniel, Erik Fisher, Cynthia Selin, and David H. Guston. 2008. Anticipatory governance of nanotechnology: Foresight, engagement, and integration. In The Handbook of Science and Technology Studies, 3rd ed. Edited by Edward J. Hackett, Olga Amsterdamska, Michael E. Lynch and Judy Wajcman. Cambridge: MIT Press, pp. 979–1000. [Google Scholar]
  18. Bartlett, Judy. 2020. Missing Cryptoqueen: Why Did the FCA Drop Its Warning About the OneCoin Scam? BBC News. Available online: https://www.bbc.co.uk/news/technology-53721017 (accessed on 12 June 2025).
  19. Bennear, Lori S., and Jonathan B. Wiener. 2019. Adaptive Regulation: Instrument Choice for Policy Learning over Time. Harvard Kennedy School Working Paper. Available online: https://www.hks.harvard.edu/sites/default/files/centers/mrcbg/files/Regulation%20-%20adaptive%20reg%20-%20Bennear%20Wiener%20on%20Adaptive%20Reg%20Instrum%20Choice%202019%2002%2012%20clean.pdf (accessed on 12 June 2025).
  20. Bernstein, Marver H. 1955. Regulating Business by Independent Commission. Princeton: Princeton University Press. [Google Scholar]
  21. BIS. 2023. Financial Stability Risks from Cryptoassets in Emerging Market Economies (BIS Papers No. 138). Available online: https://www.bis.org/publ/bppdf/bispap138.pdf (accessed on 12 June 2025).
  22. Black, Julia. 2005. The emergence of risk-based regulation and the new public risk management in the United Kingdom. Public Law 10: 512–548. [Google Scholar]
  23. Bouncken, Ricarda, Wojciech Czakon, and Florian Schmitt. 2025. Purposeful sampling and saturation in qualitative research methodologies: Recommendations and review. Review of Managerial Science. [Google Scholar] [CrossRef]
  24. Bribery Act. 2010. UK Public General Acts 2010 c. 23. Available online: https://www.legislation.gov.uk/ukpga/2010/23/contents (accessed on 15 December 2025).
  25. Brownstein, Greg. 2023. Three Challenges in Cryptocurrency Regulation. Available online: https://www.atlanticcouncil.org/blogs/econographics/three-challenges-in-cryptocurrency-regulation/ (accessed on 12 June 2025).
  26. Burton, Joe, Ardi Janjeva, Simon Moseley, and Alice. 2025. AI and Serious Online Crime. Alan Turing Institute Research Report. Available online: https://cetas.turing.ac.uk/publications/ai-and-serious-online-crime (accessed on 10 December 2025).
  27. Caldwell, Matthew, Jerone TA Andrews, Thomas Tanay, and Lewis D. Griffin. 2020. AI-enabled future crime. Crime Science 9: 14. [Google Scholar] [CrossRef]
  28. Cane, Peter, and Herbert M. Kritzer, eds. 2010. The Oxford Handbook of Empirical Legal Research. Oxford: Oxford University Press. [Google Scholar]
  29. Cartea, Álvaro, Sebastian Jaimungal, and Yixuan Wang. 2020. Spoofing and Price Manipulation in Order Driven Markets. Oxford-Man Institute of Quantitative Finance. Available online: https://www.oxford-man.ox.ac.uk/wp-content/uploads/2020/05/Spoofing-and-Price-Manipulation-in-Order-Driven-Markets.pdf (accessed on 14 May 2025).
  30. CFTC. 2015. CFTC Charges U.K. Resident Navinder Singh Sarao and His Company with Price Manipulation and Spoofing. Available online: https://www.cftc.gov/PressRoom/PressReleases/7156-15 (accessed on 14 May 2025).
  31. CFTC. 2024. CFTC Releases FY 2024 Enforcement Results (Release No. 9011-24). Available online: https://www.cftc.gov/PressRoom/PressReleases/9011-24 (accessed on 5 December 2025).
  32. City of London Police. n.d. National Lead Force. Available online: https://www.cityoflondon.police.uk/police-forces/city-of-london-police/areas/city-of-london/about-us/about-us/national-lead-force/ (accessed on 12 June 2025).
  33. Coffee, John C., Jr. 2021. Corporate Crime and Punishment: The Crisis of Underenforcement. Oakland: Berrett-Koehler Publishers. [Google Scholar]
  34. Cohen, Lawrence E., and Marcus Felson. 1979. Social change and crime rate trends: A routine activity approach. American Sociological Review 44: 588–608. [Google Scholar] [CrossRef]
  35. Collin, Matthew. 2019. Illicit financial flows: Concepts, measurement, and evidence. The World Bank Research Observer 35: 44–86. [Google Scholar] [CrossRef]
  36. Collingridge, David. 1980. The Social Control of Technology. London: Frances Pinter. [Google Scholar]
  37. Colorado General Assembly. 2024. Senate Bill 24-205 (Concerning Consumer Protections in Interactions with Artificial Intelligence Systems), Enrolled. Available online: https://leg.colorado.gov/bills/sb24-205 (accessed on 12 June 2025).
  38. Computer Misuse Act 1990, c. 18. 1990. Available online: https://www.legislation.gov.uk/ukpga/1990/18/contents (accessed on 12 June 2025).
  39. Congress.gov. 2025. S.1582, GENIUS Act (Public Law 119-27). July 18. Available online: https://www.congress.gov/bill/119th-congress/senate-bill/1582/text (accessed on 10 December 2025).
  40. Cornerstone Research. 2024. SEC Cryptocurrency Enforcement—2024 Update. Available online: https://www.cornerstone.com/wp-content/uploads/2025/01/SEC-Cryptocurrency-Enforcement-2024-Update.pdf (accessed on 12 June 2025).
  41. Cornish, Derek. 1994. The procedural analysis of offending and its relevance for situational prevention. In Crime Prevention Studies. Edited by Clarke Ronald. Sydney: Criminal Justice Press, vol. 3, pp. 151–96. [Google Scholar]
  42. Crime (Overseas Production Orders) Act 2019 (Crime Act) c.5. 2019. Available online: https://www.legislation.gov.uk/ukpga/2019/5/contents (accessed on 12 June 2025).
  43. Criminal Cases Review Commission. n.d. Post Office ‘Horizon’ Cases. Available online: https://ccrc.gov.uk/post-office-horizon-cases/ (accessed on 19 December 2025).
  44. Criminal Finances Act 2017, c. 22. 2017. Available online: https://www.legislation.gov.uk/ukpga/2017/22/contents (accessed on 12 June 2025).
  45. Cross, Michael. 2024. FCA Charges Alleged Operator of Multiple Crypto ATMs in First Prosecution of Unregistered Crypto Asset. Law Gazette. Available online: https://www.lawgazette.co.uk/news/alleged-atm-operator-charged-in-cryptoasset-first/5120793.article (accessed on 8 June 2025).
  46. Crown Prosecution Service. 2022. Disclosure Manual: Chapter 30—Digital Material. Crown Prosecution Service. Available online: https://www.cps.gov.uk/prosecution-guidance/disclosure-manual-chapter-30-digital-material (accessed on 8 June 2025).
  47. Crown Prosecution Service. 2025. Economic Crime Strategy 2025—Final Progress Report (May). Available online: https://www.cps.gov.uk/publication/economic-crime-strategy-2025-final-progress-report-may-2025 (accessed on 10 December 2025).
  48. CyberUp Campaign. 2025. Briefing on the Crime and Policing Bill: A Critical Opportunity to Improve National Resilience Against Cyberattacks and Grow the Domestic Cyber Security Sector. Available online: https://bills.parliament.uk/publications/60406/documents/6385 (accessed on 5 June 2025).
  49. Dal Bó, Ernesto. 2006. Regulatory capture: A review. Oxford Review of Economic Policy 22: 203–25. [Google Scholar] [CrossRef]
  50. D’Aloia v Persons Unknown & Ors, [2022] EWHC 1723 (Ch). 2022. Available online: https://www.giambronelaw.com/cms/document/d-aloia-v-person-unknown-final-approved-judgment_24.06.2022.pdf (accessed on 10 December 2025).
  51. D’Aloia v Persons Unknown & Ors, [2024] EWHC 2342 (Ch). 2024. Available online: https://www.rahmanravelli.co.uk/assets/Uploads/7fb6021a96/Fabrizio-DAloia-v-Persons-Unknown-Category-A-Ors.pdf (accessed on 5 June 2025).
  52. Debevoise & Plimpton. 2018. Thinking Inside the Box: The UK FCA Sandbox, a Playground for Innovation. Available online: https://www.debevoise.com/-/media/files/insights/publications/2018/03/20180306_thinking_inside_the_box_client_update.pdf (accessed on 8 June 2025).
  53. Demos Helsinki. 2022. What Is the Collingridge Dilemma and Why Is It Important for Tech Policy? Available online: https://demoshelsinki.fi/what-is-the-collingridge-dilemma-tech-policy/ (accessed on 8 June 2025).
  54. Department for Science, Innovation & Technology. 2025. Trusted Third-Party AI Assurance Roadmap. GOV.UK. Available online: https://www.gov.uk/government/publications/trusted-third-party-ai-assurance-roadmap/trusted-third-party-ai-assurance-roadmap (accessed on 10 December 2025).
  55. Dias, Cristina, Kristina Grigaite, Marcel Magnus, R. Segall, G. Gotti, and K. Komazec. 2021. Update on Wirecard Case: Public Hearing. European Parliament. Available online: https://www.europarl.europa.eu/RegData/etudes/BRIE/2021/659639/IPOL_BRI%282021%29659639_EN.pdf (accessed on 8 June 2025).
  56. Diebold, Francis X., Neil A. Doherty, and Richard J. Herring, eds. 2010. The Known, the Unknown, and the Unknowable in Financial Risk Management: Measurement and Theory Advancing Practice. Princeton: Princeton University Press, pp. 145–63. [Google Scholar]
  57. Digital Asset Market Clarity Act. 2025. H.R. 3633. 119th Congress (2025–2026). Available online: https://www.congress.gov/bill/119th-congress/house-bill/3633 (accessed on 10 December 2025).
  58. Directive 2014/65/EU of the European Parliament and of the Council of 15 May 2014 on markets in financial instruments and amending Directive 2002/92/EC and Directive 2011/61/EU (recast) Text with EEA relevance. 2014. OJ L 173, 12.6.2014. pp. 349–496. Available online: https://eur-lex.europa.eu/eli/dir/2014/65/oj/eng (accessed on 5 June 2025).
  59. Dobinson, Ian, and Francis Johns. 2017. Legal research as qualitative research. In Research Methods for Law. Edited by Mike McConville and Wing H. Chui. Edinburgh: Edinburgh University Press, pp. 18–47. [Google Scholar]
  60. Downes, Larry. 2009. The Laws of Disruption: Harnessing the New Forces That Govern Life and Business in the Digital Age. New York: Basic Books. [Google Scholar]
  61. Dsouza, Mark. 2020. Don’t panic: Artificial intelligence and criminal law 101. In Artificial Intelligence and the Law, 1st ed. Milton Park: Routledge. [Google Scholar]
  62. ECCTA. 2023. Economic Crime and Corporate Transparency Act 2023, UK Public General Acts 2023 c. 56. Available online: https://www.legislation.gov.uk/ukpga/2023/56/section/196 (accessed on 13 July 2025).
  63. ECTEA. 2022. Economic Crime (Transparency and Enforcement) Act 2022, c. 10 (UK). Available online: https://www.legislation.gov.uk/ukpga/2022/10/contents (accessed on 6 June 2025).
  64. Eisenhardt, Kathleen M. 1989. Building theories from case study research. Academy of Management Review 14: 532–50. [Google Scholar] [CrossRef]
  65. Eisenhardt, Kathleen M., and Melissa E. Graebner. 2007. Theory building from cases: Opportunities and challenges. Academy of Management Journal 50: 25–32. [Google Scholar] [CrossRef]
  66. Ekblom, Paul. 2022. Facing the Future: The Role of Horizon-Scanning in Helping Security Keep Up with the Changes to Come. In The Handbook of Security. Edited by Martin Gill. Cham: Springer, pp. 821–45. Available online: https://link.springer.com/chapter/10.1007/978-3-030-91735-7_38 (accessed on 5 August 2025).
  67. Emmel, Nick. 2013. Sampling and Choosing Cases in Qualitative Research: A Realist Approach. Thousand Oaks: SAGE Publications. [Google Scholar] [CrossRef]
  68. ENISA. 2021. Post-Quantum Cryptography: Current State and Quantum Mitigation. Available online: https://www.enisa.europa.eu/publications/post-quantum-cryptography-current-state-and-quantum-mitigation (accessed on 14 May 2025).
  69. EUR Lex. 2023. Regulation (EU) 2023/1113 (TFR). Available online: https://eur-lex.europa.eu/eli/reg/2023/1113/oj/eng (accessed on 6 December 2025).
  70. EUR Lex. 2024a. Regulation (EU) 2024/1624 (AMLR). Available online: https://eur-lex.europa.eu/eli/reg/2024/1624/oj/eng (accessed on 6 December 2025).
  71. EUR Lex. 2024b. Regulation (EU) 2024/1689 (AI Act). Available online: https://eur-lex.europa.eu/eli/reg/2024/1689/oj/eng (accessed on 6 December 2025).
  72. European Commission. 2023. Commission Delegated Regulation … Supplementing Regulation (EU) 2023/1114 of the European Parliament and of the Council by Specifying Certain Criteria for Classifying Asset-Referenced Tokens and e-Money Tokens as Significant (Ares (2023)7582240). Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=intcom:Ares%282023%297582240 (accessed on 6 December 2025).
  73. European Parliament. 2024. Artificial Intelligence Act. Available online: https://www.europarl.europa.eu/RegData/etudes/BRIE/2021/698792/EPRS_BRI(2021)698792_EN.pdf (accessed on 3 August 2025).
  74. European Union. 2023. Regulation (EU) 2023/1114 of the European Parliament and of the Council of 31 May 2023 on Markets in Crypto-Assets [MiCA], and Amending Regulations (EU) No 1093/2010 and (EU) No 1095/2010 and Directives 2013/36/EU and (EU) 2019/1937 (Text with EEA Relevance) PE/54/2022/REV/1 OJ L 150, 9.6.2023. pp. 40–205. Available online: https://eur-lex.europa.eu/eli/reg/2023/1114/oj/eng (accessed on 10 December 2025).
  75. Fanusie, Yaya J., and Tom Robinson. 2018. Bitcoin Laundering: An Analysis of Illicit Flows into Digital Currency Services. Foundation for Defense of Democracies. Available online: https://www.fdd.org/wp-content/uploads/2018/01/MEMO_Bitcoin_Laundering.pdf (accessed on 3 August 2025).
  76. FATF. 2023. Annual Report 2023–2024. FATF. Available online: https://www.fatf-gafi.org/content/dam/fatf-gafi/annual-reports/FATF-AR-2023-2024.pdf.coredownload.pdf (accessed on 3 August 2025).
  77. FATF. 2025. Guidance on Financial Inclusion and Anti-Money Laundering and Terrorist Financing Measures. Available online: https://www.fatf-gafi.org/en/publications/Financialinclusionandnpoissues/guidance-financial-inclusion-aml-tf-measures.html (accessed on 4 September 2025).
  78. FCA. 2017a. Consumer Warning About the Risks of Investing in Cryptocurrency CFDs. FCA. Available online: https://www.fca.org.uk/news/news-stories/consumer-warning-about-risks-investing-cryptocurrency-cfds (accessed on 3 August 2025).
  79. FCA. 2017b. Regulatory Sandbox Lessons Learned Report. Available online: https://www.fca.org.uk/publications/research/regulatory-sandbox-lessons-learned-report (accessed on 3 August 2025).
  80. FCA. 2018. Algorithmic Trading Compliance in Wholesale Markets: Multi-Firm Review. Available online: https://www.fca.org.uk/publication/multi-firm-reviews/algorithmic-trading-compliance-wholesale-markets.pdf (accessed on 3 August 2025).
  81. FCA. 2019a. FCA to Supervise Cryptoasset Businesses Under the AML/CFT Regime from 10 January 2020. FCA. Available online: https://www.ashfords.co.uk/insights/articles/fca-to-supervise-cryptoasset-businesses-under-the-amlcft-regime-from-10-january-2020 (accessed on 3 August 2025).
  82. FCA. 2019b. Independent Investigation into London Capital & Finance. Available online: https://www.fca.org.uk/transparency/independent-investigation-london-capital-finance (accessed on 3 August 2025).
  83. FCA. 2019c. PS19/14: Loan-Based (‘Peer-to-Peer’) and Investment-Based Crowdfunding Platforms: Feedback to CP18/20 and Final Rules. Available online: https://www.fca.org.uk/publications/policy-statements/ps19-14-loan-based-peer-to-peer-investment-based-crowdfunding-platforms-feedback-final-rules (accessed on 3 August 2025).
  84. FCA. 2020. Wirecard Can Resume Regulated Activity. Available online: https://www.fca.org.uk/news/news-stories/requirements-imposed-wirecard-authorisation (accessed on 3 August 2025).
  85. FCA. 2021a. Consumer Warning on Binance Markets Limited and the Binance Group. Available online: https://www.fca.org.uk/news/news-stories/consumer-warning-binance-markets-limited-and-binance-group (accessed on 3 August 2025).
  86. FCA. 2021b. First Supervisory Notice: Binance Markets Limited. FCA. Available online: https://www.fca.org.uk/publication/supervisory-notices/first-supervisory-notice-binance-markets-limited.pdf (accessed on 3 August 2025).
  87. FCA. 2022. FCA Regulated Crypto Companies—February 2022. Financial Conduct Authority. Available online: https://www.fca.org.uk/freedom-information/fca-regulated-crypto-companies-february-2022 (accessed on 3 August 2025).
  88. FCA. 2023. FCA Sets Out Expectations for UK Cryptoasset Businesses Complying with the Travel Rule. Available online: https://www.fca.org.uk/news/statements/fca-sets-out-expectations-uk-cryptoasset-businesses-complying-travel-rule (accessed on 3 August 2025).
  89. FCA. 2024. Cryptoassets: AML/CTF Regime. Available online: https://www.fca.org.uk/firms/financial-crime/cryptoassets-aml-ctf-regime (accessed on 3 August 2025).
  90. FCA. 2025a. PS25/5: Our Enforcement Guide and Greater Transparency of Our Enforcement Investigations. Available online: https://www.fca.org.uk/publications/policy-statements/ps25-5-enforcement-guide-greater-transparency-enforcement-investigations (accessed on 10 December 2025).
  91. FCA. 2025b. Regulatory Sandbox. Available online: https://www.fca.org.uk/firms/innovation/regulatory-sandbox (accessed on 10 December 2025).
  92. FCA. n.d. Financial Conduct Authority—GOV.UK. UK Government. Available online: https://www.gov.uk/government/organisations/financial-conduct-authority (accessed on 3 August 2025).
  93. FCA v Da Vinci Invest Ltd., [2015] EWHC 2401 (Ch). 2015. Available online: https://www.casemine.com/judgement/uk/5a8ff7bd60d03e7f57eb1afe (accessed on 26 May 2025).
  94. Federal Bureau of Investigation. n.d. Ruja Ignatova: FBI Ten Most Wanted Fugitive. FBI. Available online: https://www.fbi.gov/wanted/topten/ruja-ignatova/@@download.pdf (accessed on 3 August 2025).
  95. Ferran, Eilís. 2023. International Competitiveness and Financial Regulators’ Mandates: Coming Around Again in the UK. Journal of Financial Regulation 9: 30–54. [Google Scholar] [CrossRef]
  96. Financial Action Task Force (FATF). 2022. United Kingdom: Member Since 1990. Available online: https://www.fatf-gafi.org/en/countries/detail/united-kingdom.html (accessed on 26 May 2025).
  97. Financial Conduct Authority. 2023. Regulation of Markets in Financial Instruments. FCA. Available online: https://www.fca.org.uk/markets/regulation-markets-financial-instruments (accessed on 26 May 2025).
  98. Financial Conduct Authority. 2025. Market Abuse Regulation (UK MAR) Overview. Available online: https://www.fca.org.uk/markets/market-abuse/regulation (accessed on 10 December 2025).
  99. Financial Innovation and Technology for the 21st Century Act (FIT21, H.R. 4763). 2023. 118th Congress (2023–2024). Available online: https://www.congress.gov/bill/118th-congress/house-bill/4763 (accessed on 10 December 2025).
  100. Financial Services and Markets Act 2000, c.8. 2000. Available online: https://www.legislation.gov.uk/ukpga/2000/8/contents (accessed on 26 May 2025).
  101. Fisher, Jonathan. 2025. Disclosure in the Digital Age: Independent Review of Disclosure and Fraud Offences. Available online: https://www.gov.uk/government/publications/independent-review-of-disclosure-and-fraud-offences (accessed on 10 December 2025).
  102. Fraud Act 2006, c. 35. 2006. Available online: https://www.legislation.gov.uk/ukpga/2006/35/contents (accessed on 5 June 2025).
  103. GAO. 2016. Financial Regulation: Complex and Fragmented Structure Could Be Streamlined to Improve Effectiveness (GAO-16-175). Available online: https://www.gao.gov/products/gao-16-175 (accessed on 26 May 2025).
  104. Garrett, Brandon L. 2014. Too Big to Jail: How Prosecutors Compromise with Corporations. Cambridge, MA: Harvard University Press. Available online: https://www.hup.harvard.edu/books/9780674659919 (accessed on 15 July 2025).
  105. Genus, Audley, and Andy Stirling. 2018. Collingridge and the dilemma of control: Towards responsible and accountable innovation. Research Policy 47: 61–69. [Google Scholar] [CrossRef]
  106. Gibson Dunn. 2025. Digital Assets: U.S. Regulatory and Enforcement Considerations in the New Administration [Webcast Slides]. January 22. Available online: https://www.gibsondunn.com/wp-content/uploads/2025/01/WebcastSlides-Crypto-Regulation-and-Enforcement-Developments-Trends-and-Expectations-for-the-New-Administration-22-JAN-2025.pdf (accessed on 10 December 2025).
  107. Gilmour, Paul Michael. 2021. Exploring the barriers to policing financial crime in England and Wales. Policing: A Journal of Policy and Practice 15: 1507–1521. [Google Scholar] [CrossRef]
  108. Goldbarsht, Doron, and Hannah Harris. 2024. Enhancing integrity in the implementation of FATF recommendations: Robust Governance Frameworks to Combat Financial Crime in an Age of Intergovernmental Rulemaking. In Financial crime, law and governance: Navigating Challenges in Different Contexts. Edited by Doron Goldbarsht and Louis de Koker. Cham: Springer, pp. 141–67. [Google Scholar] [CrossRef]
  109. Gonzalez, Sheth, and Urra v Financial Conduct Authority, [2025] UKUT 00214 (TCC). 2025. Available online: https://www.gov.uk/tax-and-chancery-tribunal-decisions/1-mr-jorge-lopez-gonzalez-2-mr-poojan-sheth-3-mr-diego-urra-v-the-financial-conduct-authority-2025-ukut-00214-tcc (accessed on 10 December 2025).
  110. Government Office for Science. 2012. Foresight: The Future of Computer Trading in Financial Markets—An International Perspective (Final Project Report). The Government Office for Science. Available online: https://assets.publishing.service.gov.uk/media/5a7498e3e5274a44083b7f4a/12-1086-future-of-computer-trading-in-financial-markets-report.pdf (accessed on 3 June 2025).
  111. GovFacts. 2025. Bitcoin and Cryptocurrency Regulation in the United States. Available online: https://govfacts.org/money/investing-retirement/cryptocurrency-digital-assets/bitcoin-and-cryptocurrency-regulation-in-the-united-states/ (accessed on 10 December 2025).
  112. Gov.UK. 2022. Closing the Gap: Literature Review of Existing Papers Addressing the Theme of Innovation and Regulation (Annex B). Available online: https://www.gov.uk/government/publications/closing-the-gap-getting-from-principles-to-practice-for-innovation-friendly-regulation/closing-the-gap-literature-review-of-existing-papers-addressing-the-theme-of-innovation-and-regulation-annex-b (accessed on 3 June 2025).
  113. Gunningham, Neil, Peter Grabosky, and Darren Sinclair. 1998. Smart Regulation: Designing Environmental Policy. Oxford: Oxford University Press. [Google Scholar]
  114. Hamacher, Adriana. 2020. OneCoin Lawyers Persuaded UK’s FCA to Take Down Scam Warning. Decrypt. Available online: https://decrypt.co/37912/onecoin-lawyers-persuaded-uks-fca-to-take-down-scam-warning (accessed on 3 June 2025).
  115. High Court of England and Wales. 2016. Sarao v The Government of the United States of America [2016] EWHC 2737 (Admin). Available online: https://www.bailii.org/ew/cases/EWHC/Admin/2016/2737.html (accessed on 3 June 2025).
  116. Hill, Charlotte. 2024. Landmark Ruling on Digital Asset Fraud. Law Gazette. Available online: https://www.lawgazette.co.uk/practice-points/landmark-ruling-on-digital-asset-fraud/5121507.article (accessed on 13 June 2025).
  117. HM Government. 2024a. Economic Crime and Corporate Transparency Act: Cryptoassets Legislation. Available online: https://www.gov.uk/government/publications/economic-crime-and-corporate-transparency-act-2023-factsheets/economic-crime-and-corporate-transparency-act-cryptoassets-legislation (accessed on 24 May 2025).
  118. HMICFRS. 2018. The National Tasking, Coordination and Governance of the Response to Serious and Organised Crime. Available online: https://assets-hmicfrs.justiceinspectorates.gov.uk/uploads/national-tasking-coordination-and-governance-of-the-response-to-serious-and-organised-crime.pdf (accessed on 24 May 2025).
  119. HMRC. 2025. HMRC Economic Crime Supervision Annual Assessment Report: 2022 to 2024. GOV.UK. Available online: https://www.gov.uk/government/publications/hmrc-economic-crime-supervision-annual-assessment-report-2022-to-2024 (accessed on 10 December 2025).
  120. HM Treasury. 2025. Economic Crime Levy Report 2023–24. Available online: https://www.gov.uk/government/publications/economic-crime-levy-report-2023-24 (accessed on 10 December 2025).
  121. HM Treasury, and Home Office. 2019a. Economic Crime Plan 2019 to 2022. GOV.UK. Available online: https://www.gov.uk/government/publications/economic-crime-plan-2019-to-2022 (accessed on 24 May 2025).
  122. HM Treasury, and Home Office. 2019b. Economic Crime Strategic Board January 2019 Agenda and Minutes. GOV.UK. Available online: https://www.gov.uk/government/publications/economic-crime-strategic-board-minutes-and-agenda-january-2019/economic-crime-strategic-board-january-2019-agenda-and-minutes (accessed on 24 May 2025).
  123. HM Treasury, and Home Office. 2025. National Risk Assessment of Money Laundering and Terrorist Financing 2025. UK Government. Available online: https://assets.publishing.service.gov.uk/media/6877be59760bf6cedaf5bd4f/National_Risk_Assessment_of_Money_Laundering_and_Terrorist_Financing_2025_FINAL.pdf (accessed on 10 December 2025).
  124. Home Office. 2025a. AI to help police catch criminals before they strike [Press release]. Available online: https://www.gov.uk/government/news/ai-to-help-police-catch-criminals-before-they-strike (accessed on 10 December 2025).
  125. Home Office. 2025b. Economic Crime and Corporate Transparency Act 2023: Guidance to Organisations on the Offence of Failure to Prevent Fraud (Commencement 1 September 2025). Available online: https://www.gov.uk/government/publications/offence-of-failure-to-prevent-fraud-introduced-by-eccta/economic-crime-and-corporate-transparency-act-2023-guidance-to-organisations-on-the-offence-of-failure-to-prevent-fraud-accessible-version (accessed on 10 December 2025).
  126. Home Office. 2025c. Economic Crime Plan 2: Outcomes Progress Report. Available online: https://www.gov.uk/government/publications/economic-crime-plan-2-outcomes-progress-report/economic-crime-plan-2-outcomes-progress-report (accessed on 10 December 2025).
  127. Home Office. 2025d. Independent Review of Disclosure and Fraud Offences. GOV.UK. Available online: https://www.gov.uk/government/collections/independent-review-of-disclosure-and-fraud-offences (accessed on 10 December 2025).
  128. Home Office, Serious Fraud Office, HM Treasury, Department for Business and Trade, and Ministry of Justice and Companies House. 2024. Economic Crime and Corporate Transparency Act 2023: Factsheets. Available online: https://www.gov.uk/government/publications/economic-crime-and-corporate-transparency-act-2023-factsheets (accessed on 16 July 2025).
  129. Hughes, Thomas P. 1994. Technological momentum. In Does Technology Drive History? The Dilemma of Technological Determinism. Edited by Merritt Roe Smith and Leo Marx. Cambridge: MIT Press, pp. 101–13. [Google Scholar]
  130. Hutchinson, Terry, and Nigel Duncan. 2012. Defining and Describing What We Do: Doctrinal Legal Research. Deakin Law Review 17: 83–120. [Google Scholar] [CrossRef]
  131. ICO. 2024. Regulatory Sandbox Final Report: Financial Institutions: Home Office and UK Finance. Available online: https://ico.org.uk/media2/for-organisations/documents/4028120/home-office-uk-finance-financial-institutions-regulatory-sandbox-final-report-v1-0.pdf (accessed on 15 June 2025).
  132. Jones Day. 2025. Crypto-Assets, CASPs, and AML/CFT Compliance: The New European Regulatory Landscape Under MiCA and AMLR. Jones Day Insights. Available online: https://www.jonesday.com/en/insights/2025/07/crypto-assets-casps-and-amlcft-compliance-the-new-european-regulatory-landscape-under-mica-and-amlr (accessed on 10 December 2025).
  133. Judicial College. 2024. Judicial College Activities Report 2023–2024. Judiciary UK. Available online: https://www.judiciary.uk/wp-content/uploads/2025/01/Judicial-College-Activities-Report-2023-2024.pdf (accessed on 10 December 2025).
  134. Judiciary of England and Wales. 2024. Criminal Practice Directions 2023. Available online: https://www.judiciary.uk/wp-content/uploads/2024/07/Criminal-Practice-Directions-2023-as-amended-July-2024.pdf (accessed on 15 June 2025).
  135. Kálmán, János. 2025. The Role of Regulatory Sandboxes in FinTech Innovation: A Comparative Case Study of the UK, Singapore, and Hungary. FinTech 4: 26. [Google Scholar] [CrossRef]
  136. Kirilenko, Andrei A., and Andrew W. Lo. 2013. Moore’s law versus Murphy’s law: Algorithmic trading and its discontents. Journal of Economic Perspectives 27: 51–72. [Google Scholar] [CrossRef]
  137. Kirk, Tristan. 2024. ‘Cryptoqueen’ Ruja Ignatova Accused of $4.5 Billion Scam Hit with Global Asset Freeze. Evening Standard. Available online: https://www.standard.co.uk/news/crime/cryptoqueen-fbi-london-high-court-ruja-ignatova-asset-freeze-b1175402.html (accessed on 10 December 2025).
  138. Kliewer, Andrew. 2022. The P2P Fraud Conundrum. The Regulatory Review. Available online: https://www.theregreview.org/2022/09/01/kliewer-the-p2p-fraud-conundrum/ (accessed on 15 June 2025).
  139. Kroll. 2025. Financial Crime Report: AI and Geopolitical Uncertainty Fuel a New Wave of Financial Crime. Available online: https://www.kroll.com/en/publications/financial-crime-report-2025 (accessed on 10 December 2025).
  140. Kwak, James. 2014. Cultural capture and the financial crisis. In Preventing Regulatory Capture: Special Interest Influence and How to Limit It. Edited by Daniel Carpenter and David A. Moss. Cambridge: Cambridge University Press, pp. 71–98. [Google Scholar]
  141. Langley, Paul, and Andrew Leyshon. 2023. Fintech platform regulation: Regulating with/against platforms in the UK and China. Cambridge Journal of Regions, Economy and Society 16: 257–68. [Google Scholar] [CrossRef]
  142. Law Commission. 2025a. Artificial Intelligence and the Law: A Discussion Paper. UK Law Commission. Available online: https://lawcom.gov.uk/publication/artificial-intelligence-and-the-law-a-discussion-paper/ (accessed on 10 December 2025).
  143. Law Commission. 2025b. The Law Commission’s Annual Report 2024–25 and Business Plan 2025–2026. Available online: https://lawcom.gov.uk/news/the-law-commissions-annual-report-2024-25-and-business-plan-2025-2026/ (accessed on 10 December 2025).
  144. Leonard, Greg, Marlene Haas, and Oliver Pegden. 2024. Order and trade data analysis in recent spoofing investigations. Journal of Financial Compliance 7: 346–352. [Google Scholar] [CrossRef]
  145. Liebert, Wolfgang, and Jan C. Schmidt. 2010. Collingridge’s dilemma and technoscience: An attempt to provide a clarification from the perspective of the philosophy of science. Poiesis & Praxis 7: 55–71. [Google Scholar] [CrossRef]
  146. Lui, Alison, and Nicholas Ryder, eds. 2021. FinTech, Artificial Intelligence and the Law: Regulation and Crime Prevention. Milton Park: Routledge. [Google Scholar] [CrossRef]
  147. Manheim, David, and Aidan Homewood. 2025. Limits of safe AI deployment: Differentiating oversight and control. arXiv arXiv:2507.03525v1. [Google Scholar] [CrossRef]
  148. Maple, Carsten, Lukasz Szpruch, Gregory Epiphaniou, Kalina Staykova, Simran Singh, William Penwarden, Yisi Wen, Zijian Wang, Jagdish Hariharan, and Pavle Avramovic. 2023. The AI Revolution: Opportunities and Challenges for the Finance Sector. The Alan Turing Institute & Financial Conduct Authority. Available online: https://www.turing.ac.uk/news/publications/ai-revolution-opportunities-and-challenges-finance-sector (accessed on 20 July 2025).
  149. Marchant, Gary E., Braden R. Allenby, and Joseph R. Herkert, eds. 2011. The Growing Gap Between Emerging Technologies and Legal-Ethical Oversight: The Pacing Problem. Dordrecht: Springer. [Google Scholar] [CrossRef]
  150. MAS. 2025. Guidelines on Licensing for Payment Service Providers (PS-G01). Available online: https://www.mas.gov.sg/regulation/guidelines/ps-g01-guidelines-on-licensing-for-payment-service-providers (accessed on 13 August 2025).
  151. Meerza, Zulfi, and Francesca Cassidy-Taylor. 2022. Overseas Production Orders. Rahman Ravelli. Available online: https://www.rahmanravelli.co.uk/expertise/multi-agency-and-multi-jurisdictional-investigations/articles/overseas-production-orders/ (accessed on 17 June 2025).
  152. Meridian Global Funds Management Asia Ltd. v Securities Commission, [1995] 2 AC 500 (PC). 1995. Available online: https://www.uniset.ca/other/cs2/19952AC500.html (accessed on 17 June 2025).
  153. Ministry of Justice. 2021. Civil Procedure Rules Part 35—Experts and Assessors. Available online: https://www.justice.gov.uk/courts/procedure-rules/civil/rules/part35 (accessed on 17 June 2025).
  154. Ministry of Justice. 2025. Use of Evidence Generated by Software in Criminal Proceedings: Call for Evidence. Available online: https://www.gov.uk/government/calls-for-evidence/use-of-evidence-generated-by-software-in-criminal-proceedings/use-of-evidence-generated-by-software-in-criminal-proceedings-call-for-evidence (accessed on 10 December 2025).
  155. Money Laundering, Terrorist Financing and Transfer of Funds (Information on the Payer) Regulations 2017/692. 2017. Available online: https://www.legislation.gov.uk/uksi/2017/692/contents (accessed on 17 June 2025).
  156. Moses, Lyria Bennett. 2013. How to think about law, regulation and technology: Problems with ‘technology’ as a regulatory target. Law, Innovation and Technology 5: 1–20. [Google Scholar] [CrossRef]
  157. Murphy, Kieran P., Tao Sun, Yong Sarah Zhou, Natsuki Tsuda, Nicolas Zhang, Victor Budau, Frankosiligi Solomon, Kathleen Kao, Morana Vucinic, and Kristina Miggiani. 2024. Central Bank Digital Currency Data Use and Privacy Protection. IMF Fintech Notes. Washington, DC: International Monetary Fund, vol. 2024. [Google Scholar] [CrossRef]
  158. Nagarajan, Shalini. 2020. How Wirecard Went from Analyst Darling to a $2.2 Billion Accounting Scandal—And Cost SoftBank Hundreds of Millions in the Process. Business Insider. June 24. Available online: https://markets.businessinsider.com/news/stocks/wirecard-timeline-what-you-need-to-know-2bn-fintech-scandal-2020-6-1029337346 (accessed on 17 June 2025).
  159. Nair, Lakshmi Balachandran, Michael Gibbert, and Bareerah Hafeez Hoorani. 2023. Do it again. In Combining Case Study Designs for Theory Building: A New Sourcebook for Rigorous Social Science Researchers. Cambridge: Cambridge University Press, pp. 50–77. [Google Scholar] [CrossRef]
  160. Nardello & Co. 2025. A Decade of Partnership in the Fight Against Financial Crime: The Joint Money Laundering Intelligence Taskforce. Available online: https://nardelloandco.com/our-insights/article/a-decade-of-partnership-in-the-fight-against-financial-crime-the-joint-money-laundering-intelligence-taskforce-ten-years-on/ (accessed on 10 December 2025).
  161. National Crime Agency. 2025a. 10-year Anniversary of the UK’s Public-Private Partnerships. Available online: https://nationalcrimeagency.gov.uk/news/10-year-anniversary-of-the-uks-public-private-partnerships (accessed on 10 December 2025).
  162. National Crime Agency. 2025b. National Strategic Assessment of Serious and Organised Crime. Available online: https://nationalcrimeagency.gov.uk/threats-2025/nsa-fraud-2025 (accessed on 10 December 2025).
  163. NCA. 2024. Operation Destabilise: NCA Disrupts Multi-Billion Russian Money Laundering Networks with Links to Drugs, Ransomware and Espionage Resulting in 84 Arrests. Available online: https://nationalcrimeagency.gov.uk/news/operation-destabilise-nca-disrupts-multi-billion-russian-money-laundering-networks-with-links-to-drugs-ransomware-and-espionage-resulting-in-84-arrests (accessed on 17 June 2025).
  164. NCA. 2025. Illicit Finance. National Strategic Assessment 2025. Available online: https://www.nationalcrimeagency.gov.uk/threats-2025/nsa-illicit-finance-2025 (accessed on 10 December 2025).
  165. NECC. 2024. National Economic Crime Centre Annual Report 2023–2024. National Crime Agency. Available online: http://nationalcrimeagency.gov.uk/who-we-are/publications/730-national-economic-crime-centre-annual-report-2023-2024 (accessed on 10 December 2025).
  166. NIST. 2024. NIST Releases First 3 Finalized Post-Quantum Encryption Standards. Available online: https://www.nist.gov/news-events/news/2024/08/nist-releases-first-3-finalized-post-quantum-encryption-standards (accessed on 8 July 2025).
  167. Nitschke, Florian. 2018. Algorithmic Trading Under MiFID II: Increased Regulatory Expectations and Annual Self-Assessment. Kroll. Available online: https://www.kroll.com/en/publications/financial-compliance-regulation/algorithmic-trading-under-mifid-ii (accessed on 8 July 2025).
  168. Nordmann, Alfred. 2010. A forensics of wishing: Technology assessment in the age of technoscience. Poiesis & Praxis 7: 5–15. [Google Scholar] [CrossRef]
  169. OCCRP. 2014. The Russian Laundromat. Available online: https://www.occrp.org/en/laundromat/ (accessed on 8 July 2025).
  170. OECD. 2021. State of Implementation of the OECD AI Principles: Insights from National AI Policies. Available online: https://www.oecd.org/en/publications/state-of-implementation-of-the-oecd-ai-principles_1cd40c44-en.html (accessed on 8 July 2025).
  171. ONS. 2025. Crime in England and Wales: Year Ending September 2024. Available online: https://www.ons.gov.uk/releases/crimeinenglandandwalesyearendingseptember2024 (accessed on 8 July 2025).
  172. O’Reilly, Gary. 2025. HMCTS Is Accelerating the Responsible Adoption of Artificial Intelligence (AI) to Transform the Courts and Tribunals. Inside HMCTS. Available online: https://insidehmcts.blog.gov.uk/2025/09/03/hmcts-is-accelerating-the-responsible-adoption-of-artificial-intelligence-ai-to-transform-the-courts-and-tribunals/ (accessed on 10 December 2025).
  173. Pagallo, Ugo. 2013. The Laws of Robots: Crimes, Contracts, and Torts. Dordrecht: Springer. [Google Scholar]
  174. Practical Law, Aaron Stephens, Zach Fardon, Katherine Kirkpatrick, Michael Watling, Matthew Wissa, Margaret Nettesheim, and King & Spalding LLP. 2019. “Spoofing”: US Law and Enforcement. Atlanta: King & Spalding LLP. Available online: https://www.kslaw.com/attachments/000/007/109/original/Spoofing_US_Law_and_Enforcement.pdf?1564767398 (accessed on 10 June 2025).
  175. Proceeds of Crime Act 2002, c. 29. 2002. Available online: https://www.legislation.gov.uk/ukpga/2002/29/contents (accessed on 10 June 2025).
  176. Property (Digital Assets etc) Act 2025, c. 29 (UK). 2025. Available online: https://www.legislation.gov.uk/ukpga/2025/29 (accessed on 10 December 2025).
  177. Rahman, Syedur. 2023. The OneCoin Group Action Claim. Available online: https://www.rahmanravelli.co.uk/expertise/cryptocurrency/articles/the-onecoin-group-action-claim/ (accessed on 10 June 2025).
  178. Rahman, Syedur, and Ulrich Schmidt. 2025. Crypto and FCA Enforcement. Rahman Ravelli, May 20. Available online: https://www.rahmanravelli.co.uk/expertise/cryptocurrency/articles/crypto-and-fca-enforcement/ (accessed on 10 December 2025).
  179. Rumsfeld, Donald H. 2002. Transcript: Department of Defense News Briefing. U.S. Department of Defense, February 12. Available online: https://usinfo.org/wf-archive/2002/020212/epf202.htm (accessed on 10 June 2025).
  180. Ruof, Christopher. 2023. The information problem under fintech. In Regulating Financial Innovation. Cham: Palgrave Macmillan, pp. 187–218. [Google Scholar] [CrossRef]
  181. R v Bater-James [2020] EWCA Crim. 2020. Available online: https://www.casemine.com/judgement/uk/5ef58e5f2c94e07a569c1a72 (accessed on 14 June 2025).
  182. R v E [2018] EWCA Crim 2426. 2018. Available online: https://www.cps.gov.uk/sites/default/files/documents/publications/R-v-E-2018-EWCA-2426-Crim.pdf (accessed on 14 June 2025).
  183. R v Kelly & Anor, [1999] QB 621 (CA). 1999. Available online: https://www.e-lawresources.co.uk/r-v-kelly-anor-1999 (accessed on 6 December 2025).
  184. R v Woollin, [1999] 1 AC 82 (HL). 1999. Available online: https://publications.parliament.uk/pa/ld199798/ldjudgmt/jd980722/wool.htm (accessed on 6 December 2025).
  185. Salmon, John, Sébastien Gros, Jeffrey Greenbaum, Elisabetta Zeppieri, and Richard Reimer. 2025. The EU’s Markets in Crypto-Assets (MiCA) Regulation—A Status Update. Hogan Lovells, February 20. Available online: https://www.hoganlovells.com/en/publications/the-eus-markets-in-crypto-assets-mica-regulation-a-status-update (accessed on 10 December 2025).
  186. Sanctions and Anti-Money Laundering Act 2018, c. 13. 2018. Available online: https://www.legislation.gov.uk/ukpga/2018/13/contents (accessed on 14 June 2025).
  187. Schilling de Carvalho, Pedro. 2022. Retaining influence in post-Brexit international financial regulation: Lessons from the UK’s FinTech framework. Journal of Financial Regulation 8: 104–31. [Google Scholar] [CrossRef]
  188. Seawright, Jason, and John Gerring. 2008. Case selection techniques in case study research: A menu of qualitative and quantitative options. Political Research Quarterly 61: 294–308. [Google Scholar] [CrossRef]
  189. Sentencing Bill 2025 [Policy Paper]. 2025. GOV.UK. Available online: https://www.gov.uk/government/publications/sentencing-bill-2025 (accessed on 10 December 2025).
  190. Serious Crime Act 2015, c. 9. 2015. Available online: https://www.legislation.gov.uk/ukpga/2015/9/contents (accessed on 14 June 2025).
  191. Serious Fraud Office. 2025. Serious Fraud Office Sets Out Next Steps in Ambitious Plan. GOV.UK. Available online: https://www.gov.uk/government/news/serious-fraud-office-sets-out-next-steps-in-ambitious-plan (accessed on 10 December 2025).
  192. SFO. 2025. SFO Annual Report and Accounts 2024–25. UK Government. Available online: https://www.gov.uk/government/publications/sfo-annual-report-and-accounts-2024-25 (accessed on 10 December 2025).
  193. Simon, Manon. 2024. Adaptive governance as a normative and analytical framework. In Learning from Weather Modification Law for the Governance of Regional Solar Radiation Management. Singapore: Springer, pp. 43–63. [Google Scholar] [CrossRef]
  194. Society for Computers and Law. 2020. AA v Persons Unknown: Unmasking the Proprietary Status of Cryptoassets. Available online: https://www.scl.org/10799-aa-v-persons-unknown-unmasking-the-proprietary-status-of-cryptoassets/ (accessed on 3 August 2025).
  195. Spotlight on Corruption. 2022. Government Spends Equivalent of Just 0.042% of GDP on Fighting Economic Crime—New Analysis. January 24. Available online: https://www.spotlightcorruption.org/press-release-government-spends-equivalent-of-just-0-042-of-gdp-on-fighting-economic-crime-new-analysis/ (accessed on 3 August 2025).
  196. Stigler, George J. 1971. The theory of economic regulation. The Bell Journal of Economics and Management Science 2: 3–21. [Google Scholar] [CrossRef]
  197. Stilgoe, Jack, Richard Owen, and Phil Macnaghten. 2013. Developing a framework for responsible innovation. Research Policy 42: 1568–80. [Google Scholar] [CrossRef]
  198. Tan, Evrim, Maxime Petit Jean, Anthony Simonofski, Thomas Tombal, Bjorn Kleizen, Mathias Sabbe, Lucas Bechoux, and Pauline Willem. 2023. Artificial intelligence and algorithmic decisions in fraud detection: An interpretive structural model. Data & Policy 5: E25. [Google Scholar] [CrossRef]
  199. Terrorism Act 2000, c. 11. 2000. Available online: https://www.legislation.gov.uk/ukpga/2000/11/contents (accessed on 5 June 2025).
  200. The National Archives. 2023. Economic Crime and Corporate Transparency Act 2023 (c. 56). Legislation.gov.uk. Available online: https://www.legislation.gov.uk/ukpga/2023/56/contents (accessed on 3 July 2025).
  201. The National Archives. 2025. Criminal Procedure Rules 2025: Part 19—Expert Evidence (SI 2025/909). Available online: https://www.legislation.gov.uk/uksi/2025/909 (accessed on 10 December 2025).
  202. Thierer, Adam. 2018. The Pacing Problem and the Future of Technology Regulation. Mercatus Center. George Mason University, August 10. Available online: https://www.mercatus.org/economic-insights/expert-commentary/pacing-problem-and-future-technology-regulation (accessed on 3 July 2025).
  203. TRM. 2023. Cryptoasset Seizure Powers Expanded in the UK. Insights. Available online: https://www.trmlabs.com/resources/blog/cryptoasset-seizure-powers-expanded-in-the-uk (accessed on 3 July 2025).
  204. UK Parliament. 2025a. Artificial Intelligence (Regulation) Bill [HL]. Available online: https://bills.parliament.uk/bills/3942 (accessed on 10 December 2025).
  205. UK Parliament. 2025b. Joint Committee on Human Rights: Inquiry into Human Rights and the Regulation of AI. Available online: https://committees.parliament.uk/committee/93/human-rights-joint-committee/news/208676 (accessed on 10 December 2025).
  206. UK Parliament. 2025c. Serious Fraud Office: Artificial Intelligence [Written question UIN 26405]. Available online: https://questions-statements.parliament.uk/written-questions/detail/2025-01-27/26405 (accessed on 10 December 2025).
  207. United Nations. 1992. Rio Declaration on Environment and Development (U.N. Doc. A/CONF.151/26 (Vol. I)). United Nations. Available online: https://www.un.org/en/development/desa/population/migration/generalassembly/docs/globalcompact/A_CONF.151_26_Vol.I_Declaration.pdf (accessed on 17 June 2025).
  208. USAO DC. 2025. Largest Ever Seizure of Funds Related to Crypto Confidence Scams. Available online: https://www.justice.gov/usao-dc/pr/largest-ever-seizure-funds-related-crypto-confidence-scams (accessed on 10 December 2025).
  209. U.S. Attorney’s Office. 2023. Head of Legal and Compliance for Multibillion-Dollar Cryptocurrency Pyramid Scheme “OneCoin” Pleads Guilty. United States Department of Justice. Available online: https://www.justice.gov/usao-sdny/pr/head-legal-and-compliance-multibillion-dollar-cryptocurrency-pyramid-scheme-onecoin (accessed on 25 June 2025).
  210. U.S. Department of Justice. 2016. Futures Trader Pleads Guilty to Illegally Manipulating the Futures Market in Connection with 2010 “Flash Crash”. Available online: https://www.justice.gov/archives/opa/pr/futures-trader-pleads-guilty-illegally-manipulating-futures-market-connection-2010-flash (accessed on 25 June 2025).
  211. U.S. Department of Justice. 2025. United States Files Civil Forfeiture Complaint Against $225M in Funds Involved in Cryptocurrency Investment Fraud Money Laundering [Press Release]. Available online: https://www.justice.gov/opa/pr/united-states-files-civil-forfeiture-complaint-against-225m-funds-involved-cryptocurrency (accessed on 10 December 2025).
  212. U.S. Government Accountability Office. 2023. Blockchain in Finance: Legislative and Regulatory Actions Are Needed to Ensure Comprehensive Oversight of Crypto Assets (GAO-23-105346). Available online: https://www.gao.gov/assets/gao-23-105346.pdf (accessed on 25 June 2025).
  213. U.S. Secret Service. 2025. Largest Ever Seizure of Funds Related to Crypto Confidence Scams [News Release]. Available online: https://www.secretservice.gov/newsroom/releases/2025/06/largest-ever-seizure-funds-related-crypto-confidence-scams (accessed on 10 December 2025).
  214. U.S. Securities and Exchange Commission. 2023. SEC Announces Enforcement Results for Fiscal Year 2023 (Press Release No. 2023-234). Available online: https://www.sec.gov/newsroom/press-releases/2023-234 (accessed on 25 June 2025).
  215. U.S. Treasury. 2023. Illicit Finance Risk Assessment of Decentralized Finance. Available online: https://home.treasury.gov/system/files/136/DeFi-Risk-Full-Review.pdf (accessed on 10 December 2025).
  216. Walker, Martin C. W. 2021. Designed to Avoid Regulation—The Real Roots of Bitcoin. LSE. Available online: https://blogs.lse.ac.uk/businessreview/2021/09/28/designed-to-avoid-regulation-the-real-roots-of-bitcoin/ (accessed on 25 June 2025).
  217. WEF. 2025. Pushing Through Undercurrents: Sectoral and Regional Forces Influencing Technology-Driven Systemic Risk. Available online: https://www.weforum.org/publications/pushing-through-undercurrents-sectoral-and-regional-forces-influencing-technology-driven-systemic-risk-and-resulting-mitigation-opportunities/ (accessed on 10 December 2025).
  218. Weinberg, A. I., and A. Faccia. 2024. Quantum algorithms: A new frontier in financial crime prevention. arXiv. [Google Scholar] [CrossRef]
  219. Woods, Dulani, John S. Hollywood, Jeremy D. Barnum, Danielle Fenimore, Michael J. D. Vermeer, and Brian A. Jackson. 2021. Cryptocurrency and Blockchain Needs for Law Enforcement. RAND Corporation. Available online: https://www.rand.org/pubs/research_reports/RRA108-17.html (accessed on 25 June 2025).
  220. Woods, Rodney B. 2022. Beyond traditional risk management: Integrating horizon scanning and strategic risk prioritization. In Regent Research Roundtables Proceedings. Virginia Beach: Regent University, pp. 209–17. Available online: https://cdn.regent.edu/wp-content/uploads/2022/10/Regent-Research-Roundtables-2022-Strategic-Foresight-Woods.pdf (accessed on 25 June 2025).
  221. World Economic Forum. 2025. Artificial Intelligence in Financial Services. Available online: https://reports.weforum.org/docs/WEF_Artificial_Intelligence_in_Financial_Services_2025.pdf (accessed on 10 December 2025).
  222. Yeung, Karen. 2018. Algorithmic regulation: A critical interrogation. Regulation & Governance 12: 505–23. [Google Scholar]
  223. Yin, Robert K. 2018. Case Study Research and Applications: Design and Methods, 6th ed. Thousand Oaks: Sage Publications. [Google Scholar]
  224. Yusuf, Zara. 2020. The Wirecard Scandal: What Does This Mean for the Fintech Sector? September 14. Available online: https://thestudentlawyer.com/2020/09/14/55742/ (accessed on 25 June 2025).
  225. Zetzsche, Dirk A., Ross P. Buckley, Janos N. Barberis, and Douglas W. Arner. 2017. Regulating a revolution: From regulatory sandboxes to smart regulation. Fordham Journal of Corporate & Financial Law 23: 31–103. [Google Scholar]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
