Search Results (563)

Search Parameters:
Keywords = chain rule

20 pages, 1865 KB  
Article
Loop-Constrained Connectivity Calculation for Planar Multi-Loop Mechanisms: Base–End-Effector Localization and Functional-Constraint Screening
by Xiaoxiong Li and Huafeng Ding
Machines 2026, 14(4), 455; https://doi.org/10.3390/machines14040455 - 20 Apr 2026
Abstract
Planar multi-loop mechanisms often generate a large number of non-isomorphic candidate topological graphs during automatic synthesis, making it difficult to efficiently identify configurations that satisfy engineering-oriented functional requirements. To address this issue, a loop-constrained connectivity calculation method and a connectivity-based localization and screening procedure are proposed. The proposed connectivity calculation is directly formulated for general planar non-fractionated kinematic chains (NFKCs), including those with multiple joints. For planar fractionated kinematic chains (FKCs), however, the present method is not applied directly at the full-system level, but only to decomposed non-fractionated subchains after system-level decomposition. Starting from a structurally admissible set of candidate topological graphs, a connectivity matrix is established for automatic localization of the base and the end-effector (EE). Functional screening is then performed by combining the connectivity criterion with object-oriented rules on hydraulic driving-pair arrangement and driving-redundancy patterns. The method was validated using the 10-link, 3-DOF single-joint equivalent of the KC1 subchain of a mine scaler manipulator arm. Under the prescribed structural and functional constraints, 249 admissible configurations were obtained. The results indicate that the proposed method provides an effective basis for application-oriented topological screening and subsequent dimensional synthesis. Full article
(This article belongs to the Section Machine Design and Theory)

26 pages, 639 KB  
Article
Advancing Life Cycle Assessment of Pasture-Based Beef Systems: A High-Resolution Cradle-to-Grave Framework for Global Benchmarking
by Rodolfo Bongiovanni, Leticia Tuninetti, Javier Echazarreta, Ana Muzlera Klappenbach, Javier Lozano, Leonel Alisio and Mariano Avilés
Sustainability 2026, 18(8), 3930; https://doi.org/10.3390/su18083930 - 15 Apr 2026
Viewed by 232
Abstract
Beef production is widely recognized as a significant contributor to global greenhouse gas emissions, making robust and transparent environmental assessments essential for advancing sustainability within supply chains. This study applies a comprehensive cradle-to-grave Life Cycle Assessment (LCA) to evaluate the environmental performance of beef destined for export, following ISO 14040, ISO 14044 and ISO 14067 standards and the Product Category Rules for meat of mammals. Sixteen impact categories were quantified for 1 kg of vacuum-packed beef using detailed primary data from a pasture-based production system and a representative processing facility. The total climate change impact was 3.27 × 10¹ kg CO₂eq, with enteric methane and feed production jointly responsible for over 70% of overall impacts. Slaughtering and distribution were associated mainly with fossil energy use and ozone depletion, while soil carbon sequestration partially offset biogenic emissions. The results were consistent with international benchmarks, highlighting the environmental advantages of pasture-based systems, low fertilizer use, and stable land management. Key hotspots were identified in animal growth, feed efficiency, and manure management, with logistics also contributing notably. Overall, the study provides a high-resolution environmental baseline that can support Environmental Product Declarations and guide targeted mitigation strategies across beef supply chains. While the results are derived from a specific pasture-based production system, the study is positioned as a case-study-based application of a high-resolution LCA framework, illustrating how detailed inventories can support environmental benchmarking and hotspot identification without implying statistical representativeness of all beef production systems. Full article

11 pages, 1719 KB  
Article
Investigations of the α-Olefin Polymerization Process Using the Classic α-Diimine Nickel Catalyst
by Ying Wang, Jingjing Lai, Zhihui Song, Rong Gao, Qingqiang Gou, Bingyi Li, Gang Zheng, Randi Zhang, Qiang Yue and Yuanning Gu
Polymers 2026, 18(8), 961; https://doi.org/10.3390/polym18080961 - 15 Apr 2026
Viewed by 236
Abstract
This work provides a comprehensive exploration of α-olefin polymerization characteristics catalyzed by the classic α-diimine Ni catalyst. The polymerization process exhibited quasi-living behaviour, and a reaction kinetic model for the monomer coordination–insertion process was established. It was observed that the reaction exhibits living polymerization features during the first 10 min, and the coordination–insertion rate constant was determined to be 1.08 L·mol⁻¹·s⁻¹ at 30 °C. The effects of factors including co-catalyst amount, monomer concentration, polymerization temperature and monomer type on the molecular weight, molecular weight distribution and chain structure of poly(α-olefin)s were clarified. The co-catalyst (methylaluminoxane) primarily served to activate the catalyst without inducing a chain transfer effect, suggesting that chain stagnation is likely the primary cause of the deviation from typical living polymerization behaviour. Based on temperature-controlled experiments, the activation energy for the coordination–insertion reaction was calculated to be 28.40 kJ·mol⁻¹ through GPC curve analysis. The kinetic model established in this study, along with the revealed chain branching rules, provides a theoretical foundation for the design of poly(α-olefin)s with novel structures and functions. Full article
(This article belongs to the Section Polymer Chemistry)
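The activation energy quoted above (28.40 kJ·mol⁻¹) is the kind of quantity obtained from an Arrhenius fit across temperature-controlled runs. The abstract does not describe the paper's actual fitting procedure, but a minimal two-temperature estimate looks like this (the second rate constant here is hypothetical, chosen only for illustration):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def activation_energy(k1, T1, k2, T2):
    """Two-point Arrhenius estimate of Ea (J/mol) from rate constants
    k1, k2 measured at absolute temperatures T1, T2 (K):
    ln(k2/k1) = (Ea/R) * (1/T1 - 1/T2)."""
    return R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)

# Illustrative: k = 1.08 L/(mol*s) at 30 C (from the abstract) plus a
# hypothetical second measurement at 40 C; the ratio fixes the slope
# of the Arrhenius line.
ea = activation_energy(k1=1.08, T1=303.15, k2=1.55, T2=313.15)
```

In practice one fits ln k against 1/T over several temperatures rather than using just two points, which is presumably closer to what the GPC-based analysis in the paper does.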

22 pages, 4784 KB  
Article
Comparative Study on Continuous and Discrete Design Optimization for the Fairlead Chain Stopper of Large-Scale Floating Offshore Wind Turbines
by Min-Seok Cheong and Chang-Yong Song
Energies 2026, 19(8), 1893; https://doi.org/10.3390/en19081893 - 14 Apr 2026
Viewed by 307
Abstract
This study presents a comparative investigation of continuous and discrete design optimization for the fairlead chain stopper of large-scale 10 MW floating offshore wind turbines. The fairlead chain stopper plays a key role in ensuring mooring integrity, rapid port evacuation, and efficient maintenance under extreme weather conditions driven by global warming. The objective is to minimize structural weight while maintaining safety in accordance with the international classification rules of Det Norske Veritas. Three representative design load scenarios covering mooring and towing conditions are defined, and finite element analysis confirmed that the baseline design satisfies allowable stress limits. In the optimization stage, the thicknesses of nine principal components are selected as design variables. Continuous and discrete formulations are solved using particle swarm optimization, a non-dominated sorting genetic algorithm, and an evolutionary algorithm, and their convergence behavior and computational efficiency are compared. The results show that discrete optimization, which reflects actual manufacturing plate thicknesses, achieves nearly the same weight reduction as the continuous approach while offering superior practical applicability. Among the three techniques, the evolutionary algorithm provided the best convergence characteristics and attained up to 3.73 percent weight reduction. The proposed comparative methodology offers a useful guideline for rational weight-efficient design of core mooring equipment on large floating offshore wind power platforms. Full article
(This article belongs to the Special Issue Latest Challenges in Wind Turbine Maintenance, Operation, and Safety)

32 pages, 2407 KB  
Article
Continuous-Time Scheduling of Berths and Onshore Power Supply in Cold-Chain Logistics: A Chance-Constrained Stochastic Programming Model and RL-ALNS Algorithm
by Zheyin Zhao and Jin Zhu
Mathematics 2026, 14(8), 1292; https://doi.org/10.3390/math14081292 - 13 Apr 2026
Viewed by 162
Abstract
Amid tightening emission rules and growing cold-chain demand, ports face complex multi-objective scheduling under dual uncertainties in vessel arrivals and operations. This work develops a multi-objective chance-constrained stochastic MILP model for joint berth, QC, and OPS scheduling. Heavy-tailed operational delays are managed via chance constraints, converting Weibull distributions to time buffers, while convex formulations allow piecewise cargo damage penalties to be computed linearly. A reinforcement learning-based adaptive large neighborhood search (RL-ALNS) algorithm is proposed to solve this NP-hard continuous-time problem, integrating a spatiotemporal decoder and an MDP-based selector to ensure microgrid limits and efficiency. Simulations demonstrate RL-ALNS’s superior Pareto convergence versus conventional heuristics. The model cuts the 95th-percentile tail risk by 46.59% and actual costs by 24.44% under mild delays, compared to deterministic scheduling. Overall, it quantifies the non-linear cost–emission–reliability trade-off, providing a robust tool for port decision-making. Full article
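The abstract says heavy-tailed operational delays are handled by converting Weibull distributions into time buffers via chance constraints. The conversion details are not given, but for a Weibull(k, λ) delay such a buffer reduces to a survival-function quantile; a sketch under that assumption (function and parameter names are illustrative):

```python
import math

def weibull_time_buffer(shape_k, scale_lam, eps):
    """Smallest buffer t with P(delay > t) <= eps when the delay is
    Weibull(shape_k, scale_lam): invert the survival function
    S(t) = exp(-(t/scale_lam)**shape_k) = eps, which gives
    t = scale_lam * (-ln eps)**(1/shape_k)."""
    return scale_lam * (-math.log(eps)) ** (1.0 / shape_k)

# Heavy-tailed case (shape < 1): the buffer grows sharply as the
# allowed violation probability eps shrinks.
buffer_5pct = weibull_time_buffer(shape_k=0.8, scale_lam=2.0, eps=0.05)
buffer_1pct = weibull_time_buffer(shape_k=0.8, scale_lam=2.0, eps=0.01)
```

Padding each deterministic handling time by such a buffer is what lets a chance constraint enter an MILP as an ordinary linear constraint.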
36 pages, 7325 KB  
Article
Intelligent Scheduling of Rail-Guided Shuttle Cars via Deep Reinforcement Learning Integrating Dynamic Graph Neural Networks and Transformer Model
by Fang Zhu and Shanshan Peng
Algorithms 2026, 19(4), 289; https://doi.org/10.3390/a19040289 - 8 Apr 2026
Viewed by 210
Abstract
With the rapid development of e-commerce and smart manufacturing, automated warehouse systems have become critical infrastructure for modern logistics. In China’s vast market, the dynamic scheduling of Rail-Guided Vehicles (RGVs) faces significant challenges due to complex task uncertainties, hierarchical supply chain structures, and real-time collision avoidance requirements. Traditional rule-based methods and static optimization models often fail to adapt to such dynamic environments. To address these issues, this paper proposes a novel hybrid deep reinforcement learning framework integrating a Dynamic Graph Neural Network (DGNN) and a Transformer model. The DGNN captures the spatiotemporal dependencies of the warehouse network topology, while the Transformer mechanism enhances long-range feature extraction for task prioritization. Furthermore, we design a centralized Deep Q-network (DQN) framework with parameterized action spaces to coordinate multiple RGVs collaboratively. While the system manages multiple physical vehicles, the learning architecture employs a single-agent global scheduler to avoid the non-stationarity issues inherent in multi-agent reinforcement learning. Experimental results based on real-world data from a large-scale electronics manufacturing warehouse demonstrate that our method reduces average task completion time by 18.5% and improves system throughput by 22.3% compared to state-of-the-art baselines. The proposed approach demonstrates potential for intelligent warehouse management in dynamic industrial scenarios. Full article

23 pages, 1155 KB  
Review
Evidence-Based Clinical Management of Canine Cognitive Dysfunction Syndrome: Diagnostic Algorithms, Practical Guidelines, Critical Appraisal of Biomarkers and Translational Limitations
by Maurizio Dondi, Ezio Bianchi, Paolo Borghetti, Valentina Buffagni, Rosanna Di Lecce, Giacomo Gnudi, Chiara Guarnieri, Francesca Ravanetti, Roberta Saleri and Attilio Corradi
Animals 2026, 16(7), 1114; https://doi.org/10.3390/ani16071114 - 4 Apr 2026
Viewed by 766
Abstract
Canine Cognitive Dysfunction Syndrome (CCDS) is a progressive neurodegenerative disease affecting older dogs that shares many pathological mechanisms with human Alzheimer’s disease (AD). Although it is common in geriatric dogs, CCDS is often underdiagnosed in veterinary medicine. Both CCDS and AD involve a gradual decline in cognitive functions such as memory, learning and executive abilities. From a pathological perspective, dogs with CCDS show brain changes similar to those seen in AD, including cerebral atrophy, loss of neurons and accumulation of amyloid-beta plaques. CCDS is diagnosed by exclusion, meaning that other medical or neurological conditions that could cause similar behavioural signs must first be ruled out. Clinical evaluation mainly relies on structured questionnaires completed by owners. Magnetic resonance imaging is used to confirm cerebral atrophy and, at the same time, to exclude other brain disorders, such as cerebrovascular accidents and neoplasia. Current research focuses on identifying fluid biomarkers, such as amyloid-beta, neurofilament light chain and glial fibrillary acidic protein, to support an early and objective diagnosis. The most effective management combines pharmacological therapy, targeted nutrition and non-pharmacological strategies, including environmental enrichment and behavioural support. Early intervention, ideally during mild cognitive impairment, is crucial to slow disease progression and maintain quality of life. Full article
(This article belongs to the Special Issue Cognitive Dysfunction and Neurodegenerative Diseases in Dogs and Cats)

33 pages, 6049 KB  
Article
Blockchain-Based Mixed-Node Auction Mechanism
by Xu Liu and Junwu Zhu
Electronics 2026, 15(7), 1516; https://doi.org/10.3390/electronics15071516 - 4 Apr 2026
Viewed by 281
Abstract
Blockchain-based auctions often utilize smart contracts to automate auction rules, with much research focusing on enhancing privacy and fairness through cryptographic techniques. However, the authenticity of external data input into these systems is frequently overlooked. In particular, rational nodes may manipulate bidding data by submitting false types to maximize their utility, compromising market fairness and the reliability of auction outcomes. The aim of this study is to propose an alternative blockchain-based auction mechanism to incentivize nodes to report types honestly. We propose the Mixed-Node Advertising Auction (MNAA) mechanism for digital advertising auctions on blockchain systems. MNAA integrates quasi-linear and value maximization utility models to design allocation and pricing rules that eliminate nodes’ incentives to misreport their types, ensuring the authenticity of data submitted to the auction. To enhance efficiency, MNAA employs state channel technology and off-chain smart contracts, reducing main chain interactions. Theoretical analysis confirms that MNAA incentivizes truthful behavior and ensures security and correctness. Simulation results show that MNAA outperforms Generalized Second Price (GSP), Mixed Bidders with Private Classes (MPR), and Vickrey–Clarke–Groves (VCG) auctions in terms of liquid social welfare (LSW), publisher revenue, and allocation efficiency, while also improving transaction throughput and showing good performance in terms of transaction costs and latency. Full article
(This article belongs to the Special Issue Novel Methods Applied to Security and Privacy Problems, Volume II)

39 pages, 556 KB  
Article
Rent Extraction or Collaborative Financing? Digital Spillovers of Major Customers on Supplier Trade Credit Scale and Quality
by Shang Gao, Feng Ding, Jiaxuan Li and Qiliang Liu
Sustainability 2026, 18(7), 3394; https://doi.org/10.3390/su18073394 - 31 Mar 2026
Viewed by 398
Abstract
Does the digital transformation of major customers foster collaborative financing for upstream suppliers, or does it amplify their bargaining power for rent extraction? This study investigates these competing hypotheses by examining the digital spillovers from major customers to supplier trade credit. Using a unique hand-collected dataset linking Chinese listed suppliers with their top five customers by accounts receivable from 2010 to 2021, we document a “dual enhancement effect”: major customer digitalization significantly increases trade credit scale and improves trade credit quality, effectively rejecting the rent extraction hypothesis. Specifically, trade credit quality is reflected in lower bad debt provision ratios, shorter receivable aging, and lower material default risk. Mechanism tests suggest that improved information transparency and stronger customer market competitiveness are important channels through which digitalization affects supplier trade credit. Cross-sectional analyses show that these effects are more pronounced for non-state-owned or low-asset-specificity suppliers, and for customers with higher asset specificity or lower importance. After ruling out alternative explanations, we further find that this digital spillover strengthens supply chain resilience. Overall, the evidence is more consistent with the collaborative financing view than with the rent extraction view, suggesting that major customer digitalization may help foster more sustainable and cooperative financing relationships within supply chains. Full article
(This article belongs to the Section Economic and Business Aspects of Sustainability)

30 pages, 4413 KB  
Article
Dotsformer: Capturing Chain-Loop Structures for Transformer in Dots-and-Boxes
by Ranran Zhang, Changming Xu, Kuo Wu, Mingze Zheng, Xingcan Liu and Junwei Wang
Appl. Sci. 2026, 16(7), 3395; https://doi.org/10.3390/app16073395 - 31 Mar 2026
Viewed by 392
Abstract
In many board games, AlphaZero has demonstrated superhuman abilities. Dots-and-Boxes is a classic board game with simple rules but requiring skill to win. This paper proposes Dotsformer, which extracts chain-loop structures from the game board. These structures connect distant boxes, providing long-range relational information as input to the Transformer. We employ multiple convolutional kernels to generate Q, K, and V, and incorporate information about the box structure itself into the attention scores. We also incorporate auxiliary training tasks, including an initiative task and a classification task. These tasks determine whether to retain or relinquish the initiative in the current situation, and classify actions into forbidden, conceding, safe, and scoring moves. They provide additional supervisory signals and accelerate learning. The experimental results show that Dotsformer outperforms AlphaZero in both rollout speed and playing strength: it achieved a winning rate of 87.6% and an Elo rating lead of 340 points against the baseline. Additionally, ablation studies verify the effectiveness of each key module. Full article
(This article belongs to the Special Issue Advances in Intelligent Decision-Making Systems)

42 pages, 656 KB  
Article
Operational Resilience Under Carbon Constraints: A Socio-Technical Multi-Agentic Approach to Global Supply Chains
by Rashanjot Kaur, Triparna Kundu, Bhanu Sharma, Kathleen Marshall Park and Eugene Pinsky
Systems 2026, 14(4), 374; https://doi.org/10.3390/systems14040374 - 31 Mar 2026
Viewed by 284
Abstract
High-stakes logistics, defined as supply chains where delays, quality loss, or noncompliance have serious human, safety, financial, or geopolitical consequences, are a prominent case of a broader reality: global supply chains are safety-, cost-, and time-critical socio-technical systems where forecasting quality, vendor coordination, and operational decisions shape service levels and stakeholder welfare. At the same time, decarbonization pressures and the growing use of AI for planning and control introduce new risks and trade-offs across energy, computation, and physical logistics. We develop a multi-agent framework that models supply chain system-of-systems dynamics drawing on (1) supply chain decision functions (shipment planning, sourcing and vendor management), (2) national energy-transition conditions that determine grid carbon intensity, and (3) carbon-aware computation accounting for AI-enabled decision support. Methodologically, we combine predictive analytics, unsupervised segmentation, and a carbon-cost-of-intelligence layer in a scenario-based assessment of how national energy-transition profiles–from Norway to India–affect the intensity of AI compute carbon, meaning the carbon emissions generated by the hardware and data centers required to train and run AI models. We introduce the carbon-adjusted supply chain performance (CASP) metric that integrates physical transport carbon, cold-chain overhead where applicable, and AI compute carbon into a per-package-type performance measure. Our analysis yields three actionable outputs for systems engineering and environmental management: carbon, service, and cost trade-off frontiers; governance levers (sourcing portfolio rules, buffers, and compute policies); and system-level early-warning indicators for disruption amplification. 
This study implements a tool-augmented multi-agent system (orchestrator, risk, and sourcing agents) using AWS Bedrock and Strands Agents, where LLM-based agents orchestrate deterministic analytical engines through structured tool interfaces with adaptive query generation. Theoretically, we extend previous systems-of-systems and sustainable supply chain findings by formalizing package-type-specific carbon–service frontiers and by embedding AI compute carbon into a socio-technical resilience framework. Practically, the CASP benchmark, governance lever analysis, and multi-agent implementation provide decision-makers with concrete tools to compare carriers, routes, and compute strategies across countries while making transparent the trade-offs between service reliability and total carbon. Full article

25 pages, 502 KB  
Article
Digitalising Social Value for Sustainable Urban Regeneration: Governance, Co-Production Gaps and Delivery Burdens in London
by Maria Christina Georgiadou and Jade Rochelle Julien
Sustainability 2026, 18(7), 3303; https://doi.org/10.3390/su18073303 - 28 Mar 2026
Viewed by 362
Abstract
This paper investigates how social value is operationalised in urban regeneration and how digital reporting platforms shape the measurement and governance of social sustainability. Drawing on semi-structured interviews with UK social value professionals and a resident survey conducted within the Elephant and Castle regeneration programme in London, the study examines how platform-based systems translate procurement commitments into auditable performance categories. These systems embed predefined classification schemas, proxy valuation metrics and rule-based validation procedures that structure how outcomes become visible and comparable across projects. The findings indicate that digital reporting platforms enhance oversight and inter-project benchmarking but prioritise outcomes that align with measurable procurement indicators. Employment generation, apprenticeships and local procurement expenditure dominate reported performance, while relational and place-based outcomes, such as trust, belonging and neighbourhood continuity, remain marginal. Reporting requirements generate substantial evidencing burdens across supply chains, may introduce data distortions through proxy-based and threshold-led reporting, and can concentrate engagement at early project stages, limiting sustained community influence and creating technical barriers to participation. The analysis highlights how digital reporting platforms can operate as governance infrastructures within smart city environments, shaping what is prioritised, funded and recognised as credible impact. The findings provide practical insights for the design of more inclusive and proportionate digital accountability systems for sustainable local development. Full article

44 pages, 643 KB  
Article
A Hybrid Multi-Agent System for Early Scam Detection in Crypto-Assets
by Mario Trerotola, Mimmo Parente and Davide Calvaresi
Appl. Sci. 2026, 16(7), 3122; https://doi.org/10.3390/app16073122 - 24 Mar 2026
Viewed by 628
Abstract
The rapid expansion of crypto-asset markets and the introduction of the Markets in Crypto-Assets Regulation (MiCAR) pose novel supervisory challenges. Existing blockchain intelligence platforms focus predominantly on on-chain surveillance, leaving gaps in off-chain documentary due diligence automation. This paper presents a Multi-Agent System (MAS) integrating Large Language Model (LLM) capabilities with rule-based compliance frameworks. The architecture comprises seven specialized agents: a Coordinator Agent for orchestration; data acquisition agents (Searcher, Crawler); three parallel analytical agents—Heuristic Agent (LLM-powered qualitative risk assessment), Compliance Agent (hybrid-AI MiCAR asset classification and regulatory requirement verification), and On-Chain Agent (machine learning-based fraud detection); and a Reconciliator Agent synthesizing findings into unified alerts. Component-level empirical validation on 150 projects indicates 95% output reproducibility (identical alert tier and score deviation 0.05 across five reruns) and 210 s mean latency, providing proof-of-concept evidence for the integrated pipeline. A pilot user evaluation (six researchers/master students and two experts from regulatory authorities) provides preliminary usability evidence and surfaces domain-specific feedback from regulatory-authority experts. The architecture advances proactive regulatory technology by enabling scalable analysis combining off-chain documentary evidence with on-chain forensics. Full article

22 pages, 848 KB  
Article
Digital Specimen Tracking- and ISO 15189-Oriented Risk Management in Anatomic Pathology: A Qualitative Study of Expert Perspectives in Western Austria
by Pius Sommeregger, Natalie Pallua, Bettina Zelger, Riem Kahlil and Johannes Dominikus Pallua
Diagnostics 2026, 16(6), 949; https://doi.org/10.3390/diagnostics16060949 - 23 Mar 2026
Viewed by 398
Abstract
Background: Breakpoints in the pre-examination processes and at organizational interfaces are a significant source of failures in specimen identification and tracking in anatomic pathology. While ISO 15189 emphasizes end-to-end traceability and risk-based quality management, implementing these principles in complex, multi-actor specimen pathways remains challenging. This study explores expert perspectives on specimen process chains, tracking mechanisms, and ISO 15189-oriented quality and risk management in pathology. Methods: We conducted 10 semi-structured expert interviews across three settings. Interviews were audio-recorded, transcribed, pseudonymized, and analyzed using structured qualitative content analysis (Mayring) supported by MAXQDA. A deductive category system derived from the theoretical framework and interview guide comprised six main categories and twelve subcategories. Results: Across 512 coded text segments, participants identified several factors as critical for effective implementation, including: (i) interface management along the specimen pathway, with recurrent vulnerabilities at handovers between operating theater/ward/transport and accessioning; (ii) the central role of barcode-based identification and the need for closed-loop traceability; (iii) the importance of measurable quality indicators and incident learning systems to operationalize risk management; (iv) persistent paper–digital handoffs and heterogeneous IT landscapes that undermine data integrity; (v) the need for clearly assigned responsibilities, training, and SOP governance; and (vi) implementation barriers including resources, change management, and vendor integration, alongside practical enablers such as incremental roll-out and cross-professional governance. 
Conclusions: Experts converge on a pragmatic ISO 15189-aligned roadmap: prioritize interface risks, standardize identifiers and handover rules, define a minimal KPI set for tracking and misidentification events, and reduce paper–digital handoffs through interoperable IT. Future work should quantify baseline error rates and evaluate the impact of digital tracking interventions on patient safety and turnaround times. Full article
(This article belongs to the Section Pathology and Molecular Diagnostics)
23 pages, 7446 KB  
Article
MCMC Correction of Score-Based Diffusion Models for Model Composition
by Anders Sjöberg, Jakob Lindqvist, Magnus Önnheim, Mats Jirstrand and Lennart Svensson
Entropy 2026, 28(3), 351; https://doi.org/10.3390/e28030351 - 20 Mar 2026
Abstract
Diffusion models can be parameterized in terms of either a score or an energy function. The energy parameterization is attractive because it enables sampling procedures such as Markov Chain Monte Carlo (MCMC) that incorporate a Metropolis–Hastings (MH) correction step based on energy differences between proposed samples. Such corrections can significantly improve sampling quality, particularly in the context of model composition, where pre-trained models are combined to generate samples from novel distributions. Score-based diffusion models, on the other hand, are more widely adopted and come with a rich ecosystem of pre-trained models. However, they do not, in general, define an underlying energy function, making MH-based sampling inapplicable. In this work, we address this limitation by retaining the score parameterization and introducing a novel MH-like acceptance rule based on line integration of the score function. This allows the reuse of existing diffusion models while still combining the reverse process with various MCMC techniques, which can be viewed as an instance of annealed MCMC. Through experiments on synthetic and real-world data, we show that our MH-like samplers yield relative improvements of similar magnitude to those observed with energy-based models, without requiring explicit energy parameterization. Full article
(This article belongs to the Section Statistical Physics)
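The core idea behind an MH-like acceptance rule via line integration of the score can be illustrated in one dimension: since the score is s(x) = ∇ log p(x), the log-density difference between a proposal x1 and the current state x0 equals the line integral of the score along a path from x0 to x1. The sketch below is not the authors' implementation; it assumes a symmetric proposal and a known Gaussian score, and the names `score_gaussian`, `log_ratio_via_line_integral`, and `mh_accept_prob` are illustrative.

```python
import numpy as np

def score_gaussian(x, mu=0.0, sigma=1.0):
    """Score of a 1-D Gaussian: d/dx log p(x) = -(x - mu) / sigma**2."""
    return -(x - mu) / sigma**2

def log_ratio_via_line_integral(x0, x1, score, n_steps=64):
    """Estimate log p(x1) - log p(x0) as the line integral of the score
    along the straight path x(t) = x0 + t * (x1 - x0), t in [0, 1],
    using the trapezoidal rule."""
    ts = np.linspace(0.0, 1.0, n_steps + 1)
    xs = x0 + ts * (x1 - x0)
    integrand = score(xs) * (x1 - x0)  # s(x(t)) * dx/dt
    # Trapezoidal quadrature over the uniform grid in t.
    return np.sum((integrand[1:] + integrand[:-1]) / 2.0 * np.diff(ts))

def mh_accept_prob(x0, x1, score):
    """Metropolis acceptance probability for a symmetric proposal,
    using the score line integral in place of an energy difference."""
    log_ratio = log_ratio_via_line_integral(x0, x1, score)
    return min(1.0, float(np.exp(log_ratio)))

# Sanity check against the closed form for N(0, 1):
# log p(x1) - log p(x0) = -(x1**2 - x0**2) / 2.
x0, x1 = 1.5, 0.5
est = log_ratio_via_line_integral(x0, x1, score_gaussian)
exact = -(x1**2 - x0**2) / 2.0
```

Because the Gaussian score is linear in x, the trapezoidal estimate here matches the closed-form log-ratio exactly; for learned score networks the quadrature introduces an approximation error that shrinks with `n_steps`.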