Search Results (532)

Search Parameters:
Keywords = model lifecycle management

21 pages, 3270 KB  
Article
Reliability Case Study of COTS Storage on the Jilin-1 KF Satellite: On-Board Operations, Failure Analysis, and Closed-Loop Management
by Chunjuan Zhao, Jianan Pan, Hongwei Sun, Xiaoming Li, Kai Xu, Yang Zhao and Lei Zhang
Aerospace 2026, 13(2), 116; https://doi.org/10.3390/aerospace13020116 (registering DOI) - 24 Jan 2026
Abstract
In recent years, the rapid development of commercial satellite projects, such as low-Earth orbit (LEO) communication and remote sensing constellations, has driven the satellite industry toward low-cost, rapid development, and large-scale deployment. Commercial off-the-shelf (COTS) components have been widely adopted across various commercial satellite platforms due to their advantages of low cost, high performance, and plug-and-play availability. However, the space environment is complex and hostile. COTS components were not originally designed for such conditions, and they often lack systematically flight-verified protective frameworks, making their reliability issues a core bottleneck limiting their extensive application in critical missions. This paper focuses on COTS solid-state drives (SSDs) onboard the Jilin-1 KF satellite and presents a full-lifecycle reliability practice covering component selection, system design, on-orbit operation, and failure feedback. The core contribution lies in proposing a full-lifecycle methodology that integrates proactive design—including multi-module redundancy architecture and targeted environmental stress screening—with on-orbit data monitoring and failure cause analysis. Through fault tree analysis, on-orbit data mining, and statistical analysis, it was found that SSD failures show a significant correlation with high-energy particle radiation in the South Atlantic Anomaly region. Building on this key spatial correlation, the on-orbit failure mode was successfully reproduced via proton irradiation experiments, confirming the mechanism of radiation-induced SSD damage and providing a basis for subsequent model development and management decisions. The study demonstrates that although individual COTS SSDs exhibit a certain failure rate, reasonable design, protection, and testing can enhance the on-orbit survivability of storage systems using COTS components. 
More broadly, by providing a validated closed-loop paradigm—encompassing design, flight verification and feedback, and iterative improvement—we enable the reliable use of COTS components in future cost-sensitive, high-performance satellite missions, adopting system-level solutions to balance cost and reliability without being confined to expensive radiation-hardened products. Full article
(This article belongs to the Section Astronautics & Space Science)
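The payoff of the multi-module redundancy architecture credited above can be made concrete with the standard parallel-reliability identity; the per-SSD failure probability below is a hypothetical number for illustration, not one reported by the paper.

```python
def parallel_reliability(p_fail: float, n_modules: int) -> float:
    """Probability that at least one of n independent redundant modules
    survives, given each fails independently with probability p_fail."""
    return 1.0 - p_fail ** n_modules

# Hypothetical per-SSD failure probability over the mission (illustrative only):
p = 0.10
for n in (1, 2, 3):
    print(n, parallel_reliability(p, n))
```

Each added module multiplies the residual failure probability by p_fail, which is why modest redundancy can tolerate the nontrivial failure rates of individual COTS parts.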
62 pages, 4036 KB  
Systematic Review
Quantization of Deep Neural Networks for Medical Image Analysis: A Systematic Review and Meta-Analysis
by Edgar Fabián Rivera-Guzmán, Luis Fernando Guerrero-Vásquez and Vladimir Espartaco Robles-Bykbaev
Technologies 2026, 14(1), 76; https://doi.org/10.3390/technologies14010076 (registering DOI) - 22 Jan 2026
Abstract
Neural network quantization has become established as a key strategy for transitioning medical imaging models from research environments to clinical devices and resource-constrained edge platforms; however, the available evidence remains fragmented and focused on highly heterogeneous use cases. This study presents a systematic review of 72 studies on quantization applied to medical images, following PRISMA guidelines, with the aim of characterizing the relationship among quantization technique, network architecture, imaging modality, and execution environment, as well as their impact on latency, memory footprint, and clinical deployment. Based on a structured variable matrix, we analyze—through tailored visualizations—usage patterns of Post-Training Quantization (PTQ), Quantization-Aware Training (QAT), mixed precision, and binary/low-bit schemes across frameworks such as PyTorch V 2.6.0, TensorFlow 2.19.0, and TensorFlow Lite, executed on server-class GPUs, edge/embedded devices, and specialized hardware. The results reveal a strong concentration of evidence in PyTorch/TensorFlow pipelines using INT8 or mixed precision on GPUs and edge platforms, contrasted with limited attention to PACS/RIS interoperability, model lifecycle management, energy consumption, cost, and regulatory traceability. We conclude that, although quantization can approximate real-time performance and reduce memory footprint, its clinical adoption remains constrained by integration challenges, model governance requirements, and the maturity of the hardware–software ecosystem. Full article
(This article belongs to the Special Issue Application of Artificial Intelligence in Medical Image Analysis)
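As a rough illustration of the INT8 post-training quantization that dominates the surveyed evidence, a minimal symmetric per-tensor scheme can be sketched in a few lines; the weight matrix and the error bound are invented for illustration and correspond to no particular surveyed pipeline.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Symmetric per-tensor INT8 quantization: w is approximated by scale * q."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)  # stand-in for a layer weight
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# INT8 storage is 4x smaller than float32, while the round-trip error stays small.
rel_err = np.linalg.norm(w - w_hat) / np.linalg.norm(w)
print(f"relative error: {rel_err:.4f}")
```

PTQ applies this kind of mapping after training; QAT instead simulates the rounding during training so the network learns to compensate for it.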
25 pages, 3014 KB  
Article
MIO-BDT: Construction of Basic Models and Formal Verification of Building Digital Twins That Supports Multiple Interactive Objects
by Rongwei Zou, Qiliang Yang, Qizhen Zhou, Chao Mou and Zhiwei Zhang
Smart Cities 2026, 9(1), 16; https://doi.org/10.3390/smartcities9010016 - 20 Jan 2026
Abstract
As a high-fidelity digital mapping of the physical built environment, the Building Digital Twin (BDT) relies on physical–virtual interaction as a core enabler for lifecycle management. However, existing BDT conceptual models predominantly focus on unidirectional or single-threaded physical–virtual interactions, neglecting the dynamic, concurrent exchanges among multiple digital twins and human users. To overcome this limitation, the Multi-Interactive-Object BDT (MIO-BDT) framework is proposed. The central hypothesis is that explicitly modeling concurrent, multi-party interactions within a formalized conceptual structure can address a key representational gap in current BDT paradigms. The work pursues two testable objectives: (1) to formally define the components, relationships, and rules of the MIO-BDT framework and (2) to validate through a representative use case that the framework can model complex interaction scenarios that are inadequately supported by existing approaches. A systematic analysis of the state of the art is first conducted to ground the framework’s design. The MIO-BDT is then elaborated at both the system level (supporting dynamic interactions among twins, users, and physical entities) and the component level (integrating visual, physical, and interaction sub-models). Formal modeling and verification demonstrate that the framework is logically consistent and deadlock-free and effectively coordinates multi-entity data flows. These findings confirm that the MIO-BDT framework provides enhanced representational capacity, structural clarity for system design, and a unified model for diverse interaction types, thereby establishing a validated conceptual foundation for next-generation, interaction-aware BDT systems. Full article
25 pages, 1313 KB  
Article
How Does Digital Intelligence Empower Green Transformation in Manufacturing Companies? A Case Study Based on FAW-Volkswagen
by Chaohui Zhang and Yuhong Xu
Sustainability 2026, 18(2), 1045; https://doi.org/10.3390/su18021045 - 20 Jan 2026
Abstract
Despite the immense potential of digital intelligence technologies to enhance corporate profitability, manufacturing enterprises often face the “digital–green paradox”, whereby companies that invest in digital and intelligent transformation see their energy consumption increase rather than their green transition advance. To provide reasonable transformation solutions for manufacturers still caught in this paradox, this paper adopts a single-case study approach from a product lifecycle perspective. Focusing on FAW-Volkswagen—a manufacturing enterprise demonstrating outstanding performance in digital-intelligent green transformation—this study conducts an in-depth investigation into the stage characteristics and underlying mechanisms of that transformation. The results show the following: (1) The digital-intelligent green transformation of manufacturing enterprises is an iterative process evolving from “green design, low-carbon production, intelligent service to enterprise spiral value-added”, with distinct digital-intelligent empowerment models at each stage. (2) By leveraging digital-intelligent technologies, manufacturing enterprises can build a multi-tiered “internal-external dual circulation” green development system encompassing the “enterprise—industrial chain—full ecosystem,” driving comprehensive green upgrades across the entire industry and ecosystem. This paper reveals the intrinsic mechanisms through which digital-intelligent technologies facilitate manufacturing enterprises’ green transformation. It expands and enriches the research context and theoretical implications of product lifecycle management, offering management insights and strategic references for other enterprises pursuing green transformation and upgrading pathways in the digital-intelligent economy era. Full article
33 pages, 7152 KB  
Article
DRADG: A Dynamic Risk-Adaptive Data Governance Framework for Modern Digital Ecosystems
by Jihane Gharib and Youssef Gahi
Information 2026, 17(1), 102; https://doi.org/10.3390/info17010102 - 19 Jan 2026
Abstract
In today’s volatile digital environments, conventional data governance practices fail to adequately address the dynamic, context-sensitive, and risk-prone nature of data use. This paper introduces DRADG (Dynamic Risk-Adaptive Data Governance), a new paradigm that unites risk-aware decision-making with adaptive data governance mechanisms to enhance resilience, compliance, and trust in complex data environments. Drawing on the convergence of existing data governance models, risk management best practices (DAMA-DMBOK, NIST, and ISO 31000), and real-world enterprise experience, this framework provides a modular, expandable approach to dynamically aligning governance strategy with evolving contextual factors and threats in data management. The conceptual contribution is a multi-layered paradigm that combines static policies with dynamic risk indicators through data sensitivity categorization, contextual risk scoring, and feedback loops that adapt continuously. The technical contribution is a governance-risk matrix mapping data lifecycle stages (acquisition, storage, use, sharing, and archival) to corresponding risk mitigation mechanisms. This is embedded through a semi-automated rules-based engine capable of modifying governance controls based on predetermined thresholds and evolving data contexts. Validation was obtained through simulation-based exercises in cross-border data sharing, regulatory adherence, and cloud-based data management. Findings indicate that DRADG enhances governance responsiveness, reduces exposure to compliance risks, and provides a basis for sustainable data accountability. The research concludes by providing guidelines for implementation and avenues for future research in AI-driven governance automation and policy learning. DRADG sets a precedent for imbuing intelligence and responsiveness at the heart of data governance operations of modern-day digital enterprises. Full article
(This article belongs to the Special Issue Information Management and Decision-Making)
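The governance-risk matrix described above, in which a contextual risk score above a predetermined threshold escalates the controls attached to each lifecycle stage, might be sketched as follows. The stage names follow the lifecycle stages listed in the abstract, but the controls, scoring formula, and threshold are invented placeholders, not DRADG's own.

```python
# Illustrative rules-based governance engine: each lifecycle stage maps to
# baseline controls; a contextual risk score past a threshold escalates them.
BASELINE = {
    "acquisition": ["schema validation"],
    "storage":     ["encryption at rest"],
    "use":         ["role-based access"],
    "sharing":     ["data-sharing agreement"],
    "archival":    ["retention policy"],
}
ESCALATED = {
    "acquisition": ["schema validation", "provenance capture"],
    "storage":     ["encryption at rest", "key rotation"],
    "use":         ["role-based access", "purpose-binding audit"],
    "sharing":     ["data-sharing agreement", "cross-border review"],
    "archival":    ["retention policy", "legal hold check"],
}

def risk_score(sensitivity: int, exposure: int) -> float:
    """Toy contextual score in [0, 1], from 1-5 sensitivity and exposure ratings."""
    return (sensitivity * exposure) / 25.0

def controls(stage: str, sensitivity: int, exposure: int,
             threshold: float = 0.5) -> list[str]:
    """Select the control set for one lifecycle stage given current context."""
    return (ESCALATED if risk_score(sensitivity, exposure) >= threshold
            else BASELINE)[stage]

print(controls("sharing", sensitivity=5, exposure=4))  # high risk: escalated
print(controls("storage", sensitivity=2, exposure=2))  # low risk: baseline
```

A feedback loop in this picture would simply re-evaluate `controls(...)` whenever sensitivity or exposure ratings change.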
33 pages, 326 KB  
Article
Intelligent Risk Identification in Construction Projects: A Case Study of an AI-Based Framework
by Kristijan Vilibić, Zvonko Sigmund and Ivica Završki
Buildings 2026, 16(2), 409; https://doi.org/10.3390/buildings16020409 - 19 Jan 2026
Abstract
Risk management in large-scale construction projects is a critical yet complex process influenced by financial, safety, environmental, scheduling, and regulatory uncertainties. Effective risk management contributes directly to project optimization by minimizing disruptions, controlling costs, and enhancing decision-making efficiency. Early identification and mitigation of risks allow resources to be allocated where they have the greatest effect, thereby optimizing overall project outcomes. However, conventional methods such as expert judgment and probabilistic modeling often struggle to process extensive datasets and complex interdependencies among risk factors. This study explores the potential of an AI-based framework for risk identification, utilizing artificial intelligence to analyze project documentation and generate a preliminary set of identified risks. The proposed methodology is implemented on the ‘Trg pravde’ judicial infrastructure project in Zagreb, Croatia, applying AI models (GPT-5, Gemini 2.5, Sonnet 4.5) to identify phase-specific risks throughout the project lifecycle. The approach aims to improve the efficiency of risk identification, reduce human bias, and align with established project management methodologies such as PM2. Initial findings suggest that the use of AI may broaden the range of identified risks and support more structured risk analysis, indicating its potential value as a complementary tool in risk management processes. However, human expertise remains crucial for prioritization, contextual interpretation, and mitigation. The study demonstrates that AI augments, rather than replaces, traditional risk management practices, enabling more proactive and data-driven decision-making in construction projects. Full article
(This article belongs to the Special Issue Applying Artificial Intelligence in Construction Management)
33 pages, 1706 KB  
Article
Codify, Condition, Capacitate: Expert Perspectives on Institution-First Blockchain–BIM Governance for PPP Transparency in Nigeria
by Akila Pramodh Rathnasinghe, Ashen Dilruksha Rahubadda, Kenneth Arinze Ede and Barry Gledson
FinTech 2026, 5(1), 10; https://doi.org/10.3390/fintech5010010 - 16 Jan 2026
Abstract
Road infrastructure underpins Nigeria’s economic competitiveness, yet Public–Private Partnership (PPP) performance is constrained not by inadequate legislation but by persistent weaknesses in enforcement and governance. Transparency deficits across procurement, design management, certification, and toll-revenue reporting have produced chronic delays, cost overruns, and declining public trust. This study offers the first empirical investigation of blockchain–Building Information Modelling (BIM) integration as a transparency-enhancing mechanism within Nigeria’s PPP road sector, focusing on Lagos State. Using a qualitative design, ten semi-structured interviews with stakeholders across the PPP lifecycle were thematically analysed to diagnose systemic governance weaknesses and assess the contextual feasibility of digital innovations. Findings reveal entrenched opacity rooted in weak enforcement, discretionary decision-making, and informal communication practices—including biased bidder evaluations, undocumented design alterations, manipulated certifications, and toll-revenue inconsistencies. While respondents recognised BIM’s potential to centralise project information and blockchain’s capacity for immutable records and smart-contract automation, they consistently emphasised that technological benefits cannot be realised absent credible institutional foundations. The study advances an original theoretical contribution: the Codify–Condition–Capacitate framework, which explains the institutional preconditions under which digital governance tools can improve transparency. This framework argues that effectiveness depends on: codifying digital standards and legal recognition; conditioning enforcement mechanisms to reduce discretionary authority; and capacitating institutions through targeted training and phased pilots. 
The research generates significant practical implications for policymakers in Nigeria and comparable developing contexts seeking institution-aligned digital transformation. Methodological rigour was ensured through purposive sampling, thematic saturation assessment, and documented analytical trails. Full article
25 pages, 2452 KB  
Article
Predicting GPU Training Energy Consumption in Data Centers Using Task Metadata via Symbolic Regression
by Xiao Liao, Yiqian Li, Shaofeng Zhang, Xianzheng Wei and Jinlong Hu
Energies 2026, 19(2), 448; https://doi.org/10.3390/en19020448 - 16 Jan 2026
Abstract
With the rapid advancement of artificial intelligence (AI) technology, training deep neural networks has become a core computational task that consumes significant energy in data centers. Researchers often employ various methods to estimate the energy usage of data center clusters or servers to enhance energy management and conservation efforts. However, accurately predicting the energy consumption and carbon footprint of a specific AI task throughout its entire lifecycle before execution remains challenging. In this paper, we explore the energy consumption characteristics of AI model training tasks and propose a simple yet effective method for predicting neural network training energy consumption. This approach leverages training task metadata and applies genetic programming-based symbolic regression to forecast energy consumption prior to executing training tasks, distinguishing it from time series forecasting of data center energy consumption. We have developed an AI training energy consumption environment using the A800 GPU and models from the ResNet{18, 34, 50, 101}, VGG16, MobileNet, ViT, and BERT families to collect data for experimentation and analysis. The experimental analysis of energy consumption reveals that the consumption curve exhibits waveform characteristics resembling square waves, with distinct peaks and valleys. The prediction experiments demonstrate that the proposed method performs well, achieving mean relative errors (MRE) of 2.67% for valley energy, 8.42% for valley duration, 5.16% for peak power, and 3.64% for peak duration. Our findings indicate that, within a specific data center, the energy consumption of AI training tasks follows a predictable pattern. Furthermore, our proposed method enables accurate prediction and calculation of power load before model training begins, without requiring extensive historical energy consumption data. 
This capability facilitates optimized energy-saving scheduling in data centers in advance, thereby advancing the vision of green AI. Full article
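The square-wave consumption profile reported above implies that total training energy can be reconstructed from exactly the quantities the method predicts per cycle: valley energy, peak power, and peak duration. A back-of-envelope sketch of that identity follows; the epoch count and power figures are hypothetical, not values from the paper.

```python
def training_energy_wh(n_cycles: int, valley_energy_wh: float,
                       peak_power_w: float, peak_duration_s: float) -> float:
    """Total energy of a square-wave profile: each cycle contributes the
    valley's energy plus the peak's power integrated over its duration."""
    peak_energy_wh = peak_power_w * peak_duration_s / 3600.0
    return n_cycles * (valley_energy_wh + peak_energy_wh)

# Hypothetical numbers (not from the paper): 90 epochs, 1 Wh valley energy,
# 300 W peak power, and 120 s of peak per epoch.
print(f"{training_energy_wh(90, 1.0, 300.0, 120.0):.1f} Wh")
```

Because the predicted quantities enter this sum multiplicatively, the per-quantity MREs quoted above bound the error of the total energy estimate in a straightforward way.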
64 pages, 10763 KB  
Review
The State of HBIM in Digital Heritage: A Critical and Bibliometric Assessment of Six Emerging Frontiers (2015–2025)
by Fabrizio Banfi and Wanqin Liu
Appl. Sci. 2026, 16(2), 906; https://doi.org/10.3390/app16020906 - 15 Jan 2026
Abstract
After nearly two decades of developments in Historic/Heritage Building Information Modeling (HBIM), the field has reached a stage of maturity that calls for a critical reassessment of its evolution, achievements, and remaining challenges. Digital representation has become a central component of contemporary heritage conservation, enabling advanced methods for analysis, management, and communication. This review examines the maturation of HBIM as a comprehensive framework that integrates extended reality (XR), artificial intelligence (AI), machine learning (ML), semantic segmentation and Digital Twin (DT). Six major research domains that have shaped recent progress are outlined: (1) the application of HBIM to restoration and conservation workflows; (2) the expansion of public engagement through XR, virtual museums, and serious games; (3) the stratigraphic documentation of building archaeology, historical phases, and material decay; (4) data-exchange mechanisms and interoperability with open formats and Common Data Environments (CDEs); (5) strategies for modeling geometric and semantic complexity using traditional, applied, and AI-driven approaches; and (6) the emergence of heritage DT as dynamic, semantically enriched systems integrating real-time and lifecycle data. A comparative assessment of international case studies and bibliometric trends (2015–2025) illustrates how HBIM is transforming proactive and data-informed conservation practice. The review concludes by identifying persistent gaps and outlining strategic directions for the next phase of research and implementation. Full article
22 pages, 1943 KB  
Article
Repairing the Urban Metabolism: A Dynamic Life-Cycle and HJB Optimization Model for Resolving Spatio-Temporal Conflicts in Shared Parking Systems
by Jiangfeng Li, Jianlong Xiang, Fujian Chen, Longxin Zeng, Haiquan Wang, Yujie Li and Zhongyi Zhai
Systems 2026, 14(1), 91; https://doi.org/10.3390/systems14010091 - 14 Jan 2026
Abstract
Urban shared parking systems represent a complex socio-technical challenge. Despite vast potential, utilization remains persistently low (<15%), revealing a critical policy failure. To address this, this study develops a dynamic system framework based on Life-Cycle Cost (LCC) and Hamilton-Jacobi-Bellman (HJB) optimization to analyze and calibrate the key policy levers influencing owner participation timing (T*). The model, resolved using finite difference methods, captures the system’s non-linear threshold effects by simulating critical system parameters, including system instability (price volatility, σp), internal friction (management fee, wggt), and demand signals (transaction ratio, Q). Simulations reveal extreme non-linear system responses: a 100% increase in system instability (σp) delays participation by 325.5%. More critically, a 100% surge in internal friction (management fees) delays T* by 492% and triggers a 95% revenue collapse—demonstrating the risk of systemic collapse. Conversely, a 20% rise in the demand signal (Q) advances T* by 100% (immediate participation), indicating the system can be rapidly shifted to a new equilibrium by activating positive feedback loops. These findings support a sequenced calibration strategy: regulators must first manage instability via price stabilization, then counteract high friction with subsidies (e.g., 60%), and amplify demand loops. The LCC framework provides a novel dynamic decision support system for calibrating complex urban transportation systems, offering policymakers a tool for scenario testing to accelerate policy adoption and alleviate urban congestion. Full article
(This article belongs to the Section Complex Systems and Cybernetics)
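The HJB timing problem above can be pictured as an optimal-stopping recursion on a discretized time grid, a simplified discrete-time stand-in for the paper's finite-difference solution. The revenue dynamics, friction term, and discounting below are invented for illustration; they are not the paper's calibrated model of σp, wggt, or Q.

```python
# Backward induction for the owner's optimal participation time T*: at each
# period the owner compares the value of joining now (collecting demand-driven
# revenue q net of the management fee w) against waiting one more period.
def optimal_entry_time(q: float, w: float, horizon: int = 120,
                       growth: float = 0.01, discount: float = 0.99) -> int:
    """First period at which joining beats waiting (horizon if it never does)."""
    jv_next = 0.0     # value of already participating, seen from t+1
    value_next = 0.0  # value of the optimal policy, seen from t+1
    t_star = horizon
    for t in reversed(range(horizon)):
        payoff = q * (1 + growth) ** t - w   # per-period net revenue
        jv = payoff + discount * jv_next     # enter now: collect the stream
        wait = discount * value_next         # delay entry by one period
        if jv >= wait and jv > 0:
            value, t_star = jv, t            # stopping region: participate
        else:
            value = wait                     # continuation region: keep waiting
        jv_next, value_next = jv, value
    return t_star

early = optimal_entry_time(q=1.2, w=0.8)  # strong demand signal: join at once
late = optimal_entry_time(q=0.5, w=0.8)   # weak demand: friction delays entry
print(early, late)
```

Even this toy recursion reproduces the qualitative lever effects reported above: raising the friction w or weakening the demand signal q pushes T* later.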
23 pages, 5292 KB  
Article
Research on Rapid 3D Model Reconstruction Based on 3D Gaussian Splatting for Power Scenarios
by Huanruo Qi, Yi Zhou, Chen Chen, Lu Zhang, Peipei He, Xiangyang Yan and Mengqi Zhai
Sustainability 2026, 18(2), 726; https://doi.org/10.3390/su18020726 - 10 Jan 2026
Abstract
As core infrastructure of power transmission networks, power towers require high-precision 3D models, which are critical for intelligent inspection and digital twin applications of power transmission lines. Traditional reconstruction methods, such as LiDAR scanning and oblique photogrammetry, suffer from issues including high operational risks, low modeling efficiency, and loss of fine details. To address these limitations, this paper proposes a 3D Gaussian Splatting (3DGS)-based method for power tower 3D reconstruction to enhance reconstruction efficiency and detail preservation capability. First, a multi-view data acquisition scheme combining “unmanned aerial vehicle + oblique photogrammetry” was designed, in which RGB images captured by Unmanned Aerial Vehicle (UAV) platforms serve as the primary input for 3D reconstruction. Second, a sparse point cloud was generated via Structure from Motion. Finally, based on 3DGS, Gaussian model initialization, differentiable rendering, and adaptive density control were performed to produce high-precision 3D models of power towers. Using two typical power tower types as experimental subjects, the method was compared with the oblique photogrammetry + ContextCapture method. Experimental results demonstrate that 3DGS not only achieves high model completeness (with the reconstructed model nearly indistinguishable from the original images) but also excels in preserving fine details such as angle steels and cables. Additionally, the final modeling time is reduced by over 70% compared to traditional oblique photogrammetry. 3DGS enables efficient and high-precision reconstruction of power tower 3D models, providing a reliable technical foundation for digital twin applications in power transmission lines. 
By significantly improving reconstruction efficiency and reducing operational costs, the proposed method supports sustainable power infrastructure inspection, asset lifecycle management, and energy-efficient digital twin applications. Full article
25 pages, 2007 KB  
Article
Symmetric–Asymmetric Security Synergy: A Quantum-Resilient Hybrid Blockchain Framework for Incognito IoT Data Sharing
by Chimeremma Sandra Amadi, Simeon Okechukwu Ajakwe and Taesoo Jun
Symmetry 2026, 18(1), 142; https://doi.org/10.3390/sym18010142 - 10 Jan 2026
Abstract
Secure and auditable data sharing in large-scale Internet of Things (IoT) environments remains a significant challenge due to weak trust coordination, limited scalability, and susceptibility to emerging quantum attacks. This study introduces a hybrid blockchain-based framework that integrates post-quantum cryptography with intelligent anomaly detection to ensure end-to-end data integrity and resilience. The proposed system utilizes Hyperledger Fabric for permissioned device lifecycle management and Ethereum for public auditability of encrypted telemetry, thereby providing both private control and transparent verification. Device identities are established using quantum-entropy-seeded credentials and safeguarded with lattice-based encryption to withstand quantum adversaries. A convolutional long short-term memory (CNN–LSTM) model continuously monitors device behavior, facilitating real-time trust scoring and autonomous revocation via smart contract triggers. Experimental results demonstrate 97.4% anomaly detection accuracy and a 0.968 F1-score, supporting up to 1000 transactions per second with cross-chain latency below 6 s. These findings indicate that the proposed architecture delivers scalable, quantum-resilient, and computationally efficient data sharing suitable for mission-critical IoT deployments. Full article
(This article belongs to the Special Issue Applications Based on Symmetry in Quantum Computing)
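The real-time trust scoring and autonomous revocation loop described above can be caricatured outside any blockchain: an exponentially weighted trust score fed by per-window detector outputs, with a smart-contract-style trigger once the score crosses a threshold. The weights, threshold, and score sequence below are illustrative assumptions, not parameters from the paper.

```python
class DeviceTrust:
    """Toy trust ledger: an EWMA over anomaly scores in [0, 1]; a device is
    revoked once its trust falls below a fixed threshold, standing in for
    the smart-contract trigger described in the paper."""

    def __init__(self, alpha: float = 0.3, threshold: float = 0.5):
        self.alpha, self.threshold = alpha, threshold
        self.trust, self.revoked = 1.0, False

    def observe(self, anomaly_score: float) -> bool:
        """Feed one detector output (1.0 = certainly anomalous); return status."""
        if self.revoked:
            return True
        self.trust = (1 - self.alpha) * self.trust + self.alpha * (1 - anomaly_score)
        if self.trust < self.threshold:
            self.revoked = True  # irreversible, like an on-chain revocation
        return self.revoked

dev = DeviceTrust()
for s in [0.0, 0.1, 0.9, 0.95, 1.0, 1.0]:  # behaviour turns anomalous
    revoked = dev.observe(s)
print(dev.trust, revoked)
```

In the paper's architecture, the scores would come from the CNN-LSTM detector and the revocation would be executed by a smart contract rather than a flag.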
29 pages, 4853 KB  
Article
ROS 2-Based Architecture for Autonomous Driving Systems: Design and Implementation
by Andrea Bonci, Federico Brunella, Matteo Colletta, Alessandro Di Biase, Aldo Franco Dragoni and Angjelo Libofsha
Sensors 2026, 26(2), 463; https://doi.org/10.3390/s26020463 - 10 Jan 2026
Viewed by 461
Abstract
Interest in the adoption of autonomous vehicles (AVs) continues to grow. It is essential to design new software architectures that meet stringent real-time, safety, and scalability requirements while integrating heterogeneous hardware and software solutions from different vendors and developers. This paper presents a lightweight, modular, and scalable architecture grounded in Service-Oriented Architecture (SOA) principles and implemented in ROS 2 (Robot Operating System 2). The proposed design leverages ROS 2’s Data Distribution Service (DDS)-based Quality-of-Service model to provide reliable communication, structured lifecycle management, and fault containment across distributed compute nodes. The architecture is organized into Perception, Planning, and Control layers with decoupled sensor access paths to satisfy heterogeneous frequency and hardware constraints. The decision-making core follows an event-driven policy that prioritizes fresh updates without enforcing global synchronization, applying zero-order hold where inputs are not refreshed. The architecture was validated on a 1:10-scale autonomous vehicle operating on a city-like track. The test environment covered canonical urban scenarios (lane-keeping, obstacle avoidance, traffic-sign recognition, intersections, overtaking, parking, and pedestrian interaction), with absolute positioning provided by an indoor GPS (Global Positioning System) localization setup. This work shows that the end-to-end Perception–Planning pipeline consistently met worst-case deadlines, yielding deterministic behaviour even under stress. The proposed architecture can be deemed compliant with real-time application standards for our use case on the 1:10 test vehicle, providing a robust foundation for deployment and further refinement. Full article
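The zero-order-hold policy mentioned in the abstract can be illustrated with a minimal sketch. This is an assumed simplification, not the paper's code: each input channel keeps its last published sample, and the decision step simply reads the freshest available value per channel instead of waiting for a globally synchronized update.

```python
# Minimal sketch (assumed, not from the paper) of an event-driven
# zero-order-hold input bus: the decision step always reads the freshest
# sample per channel; channels not refreshed since the last cycle simply
# retain (hold) their previous value.

from dataclasses import dataclass, field


@dataclass
class ZeroOrderHoldBus:
    """Holds the latest sample per channel; stale channels keep old values."""
    latest: dict = field(default_factory=dict)

    def publish(self, channel: str, value) -> None:
        self.latest[channel] = value

    def snapshot(self) -> dict:
        # No global synchronization: the consumer reads whatever each
        # channel last published (zero-order hold for stale inputs).
        return dict(self.latest)


bus = ZeroOrderHoldBus()
bus.publish("lane_offset", 0.12)   # perception update
bus.publish("obstacle_dist", 8.5)  # sensor update
cycle_1 = bus.snapshot()

bus.publish("lane_offset", 0.10)   # only one channel refreshed this cycle
cycle_2 = bus.snapshot()           # obstacle_dist is held at 8.5
```

The design choice this illustrates is the one the abstract names: fresh updates are consumed immediately, while unrefreshed channels are held rather than blocking the pipeline, which is what keeps worst-case deadlines bounded.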
(This article belongs to the Special Issue Sensors and Sensor Fusion for Decision Making for Autonomous Driving)
12 pages, 279 KB  
Perspective
Energy Demand, Infrastructure Needs and Environmental Impacts of Cryptocurrency Mining and Artificial Intelligence: A Comparative Perspective
by Marian Cătălin Voica, Mirela Panait and Ștefan Virgil Iacob
Energies 2026, 19(2), 338; https://doi.org/10.3390/en19020338 - 9 Jan 2026
Viewed by 353
Abstract
This perspective paper surveys current developments in energy consumption and environmental impact across two major digital industries: cryptocurrency mining and artificial intelligence (AI). It applies a comparative analytical framework combining life-cycle assessment principles with high-resolution grid modeling to explore the energy impacts reported in academic and industry data. Although both sectors convert energy into digital value, they operate according to very different logics. Cryptocurrency mining relies on specialized hardware (application-specific integrated circuits) and seeks cheap energy; because miners can shut down quickly at peak times, they can function as “virtual batteries” for the network, and their hardware efficiency continues to improve. AI, by contrast, is a far more rigid emerging energy consumer: it requires high-quality, uninterrupted power and advanced infrastructure for high-performance Graphics Processing Units (GPUs). Both the training and inference stages generate massive consumption that is difficult to quantify, and AI data centers place great pressure on the electricity grid. Consequently, the transition from mining to AI is limited by differences in infrastructure, with access to electrical capacity being the only readily reusable asset. Competition between the two industries can fragment the energy grid, as AI tends to monopolize high-quality energy, and how states manage this imbalance will shape the energy and digital security of the next decade. Full article
29 pages, 1499 KB  
Article
An Interoperable User-Centred Digital Twin Framework for Sustainable Energy System Management
by Aleeza Adeel, Mark Apperley and Timothy Gordon Walmsley
Energies 2026, 19(2), 333; https://doi.org/10.3390/en19020333 - 9 Jan 2026
Viewed by 375
Abstract
This paper presents an Interoperable User-Centred Digital Twin (I-UCDT) framework for sustainable energy system management, addressing the growing complexity of energy generation, storage, demand, and grid interaction across industrial and community-scale systems. The proposed framework provides a unified environment for the visual representation and management of interconnected energy components, supporting informed decision-making among diverse stakeholder groups. The I-UCDT framework adopts a modular plug-and-play architecture based on the Functional Mock-up Interface (FMI) standard, enabling scalable and interoperable integration of heterogeneous energy models from platforms such as Modelica, MATLAB/Simulink, and EnergyPlus. A standardised data layer processes and structures raw model inputs, while an interactive visualisation layer translates complex energy flows into intuitive, user-accessible insights. By applying human–computer interaction principles, the framework reduces cognitive load and enables users with varying technical backgrounds to explore supply–demand balancing, decarbonisation pathways, and optimisation strategies. It supports the full lifecycle of energy system design, planning, and operation, offering flexibility for both industrial and community-scale applications. A case study demonstrates the framework’s potential to enhance transparency, usability, and energy efficiency. Overall, this work advances digital twin research for energy systems by combining technical interoperability with explicitly formalised user-centred design characteristics (C1–C10) to promote flexible and sustainable energy system management. Full article
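The plug-and-play idea behind an FMI-style data layer can be sketched as a fixed-step co-simulation master loop. All class and method names below are assumptions for illustration (stubs standing in for imported Functional Mock-up Units), not the framework's actual API: heterogeneous models expose the same step interface, so one loop can exchange a standardised set of signals and advance every model together.

```python
# Illustrative sketch (all names are assumptions) of an FMI-style
# plug-and-play data layer: heterogeneous models expose the same step
# interface, and a fixed-step master loop exchanges a standardised
# signal dictionary between them before advancing each model.

class FMUStub:
    """Stand-in for an imported Functional Mock-up Unit."""

    def __init__(self, name, dynamics):
        self.name, self.dynamics, self.output = name, dynamics, 0.0

    def do_step(self, inputs: dict, dt: float) -> None:
        self.output = self.dynamics(inputs, dt)


def co_simulate(models, steps, dt=1.0):
    """Fixed-step master algorithm: exchange outputs, then step every model."""
    history = []
    for _ in range(steps):
        signals = {m.name: m.output for m in models}  # standardised data layer
        for m in models:
            m.do_step(signals, dt)
        history.append({m.name: m.output for m in models})
    return history


# Toy system: a constant PV source feeding a battery that tracks its supply.
pv = FMUStub("pv", lambda s, dt: 5.0)                     # constant 5 kW
battery = FMUStub("battery", lambda s, dt: s["pv"] * dt)  # stores last PV step
trace = co_simulate([pv, battery], steps=3)
```

Because every model hides its internals behind the same `do_step` interface, a Modelica, MATLAB/Simulink, or EnergyPlus export could in principle be swapped in without changing the master loop, which is the interoperability property the FMI standard is designed to provide.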