Search Results (45)

Search Parameters:
Keywords = interoperability model for test cases

44 pages, 2081 KB  
Systematic Review
Digital Twins Across the Asset Lifecycle: Technical, Organisational, Economic, and Regulatory Challenges
by Kangxing Dong and Taofeeq Durojaye Moshood
Buildings 2026, 16(5), 1084; https://doi.org/10.3390/buildings16051084 - 9 Mar 2026
Viewed by 329
Abstract
The construction industry faces persistent challenges in productivity, efficiency, and sustainability. Digital twin (DT) technology has emerged as a promising pathway for lifecycle optimisation, yet its construction adoption remains limited. Key barriers include fragmentation across project phases, weak data continuity at handover, and conceptual ambiguity between DT and Building Information Modelling (BIM). This systematic literature review analyses 160 peer-reviewed studies (2018–2026) selected from 463 Scopus records using a PRISMA-guided process and inter-rater reliability testing (Cohen’s κ = 0.83). The review clarifies that DTs extend beyond BIM in three ways: they enable bidirectional, automated physical-digital data exchange; integrate heterogeneous real-time sources such as IoT sensors and operational systems; and maintain lifecycle continuity from design through to end-of-life. Select advanced implementations report notable performance gains. These include rework and logistics reductions of up to 80%, cost savings of approximately 5%, schedule acceleration of around two months, energy reductions of 15–30%, and maintenance cost reductions of 10–25%. These figures reflect case-level outcomes from high-performing pilots and should not be read as typical industry benchmarks. Broader adoption remains constrained by interoperability gaps, data quality challenges, digital maturity deficits, misaligned stakeholder incentives, and paper-based regulatory environments. DTs represent a socio-technical transformation, not a standalone technology upgrade. Realising their potential requires coordinated progress in standards development, governance frameworks, collaborative delivery models, and workforce capability. Future research should focus on scalable interoperability, longitudinal lifecycle value validation, human-centred adoption strategies, and sustainability assessment methods to support evidence-based diffusion of DTs in the built environment.

27 pages, 8178 KB  
Article
A Hierarchical, Modular Interface and Verification Architecture for Mission Planning in Swarm Unmanned Surface Vehicles: Simulation and Sea-Trial Validation
by Hee-Mun Park, Jin-Hyeon Sung, Hong-Sun Park, Yeong-Hyun Lim, Joono Sur and Kyung-Min Seo
J. Mar. Sci. Eng. 2026, 14(3), 302; https://doi.org/10.3390/jmse14030302 - 4 Feb 2026
Viewed by 469
Abstract
Swarm Unmanned Surface Vehicles (SUSV) must sustain real-time coordination as fleet scale and formation change. We propose a hierarchical and modular architecture that decouples mission-planning algorithms from interface evolution, composed of an Interface Adapter System (IAS), a Mission Execution System (MES), and an Interoperation Integration Verification System (IIVS). The IAS standardizes and integrates data from diverse sensors and subsystems through modular adapters, facilitating flexible subsystem integration. The MES employs a discrete-event system specification (DEVS)-based modeling approach, providing independent and efficient mission execution capability without necessitating interface modifications. The IIVS utilizes LabVIEW-based analytical methods to verify and validate subsystem interoperability continuously, enabling rapid and reliable scaling. The architecture was implemented in an operational SUSV program and evaluated through simulation experiments and sea trials. During scale-up from 10 to 20 USVs, mission-cycle deadlines (≤250 ms) were met in 99% of cases, while integration lead time for scale-up decreased by about 80%. Message-level tests confirmed robust interoperation under increased load, and algorithm-level tests showed stable plan re-computation under dynamic tasking. These results indicate improved interoperability, scalability, and reliability, offering a practical blueprint for mission planning in maritime swarms.

18 pages, 12523 KB  
Article
Automatic Generation of NGSI-LD Data Models from RDF Ontologies: Developmental Studies of Children and Adolescents Use Case
by Franc Drobnič, Gregor Starc, Gregor Jurak, Andrej Kos and Matevž Pustišek
Appl. Sci. 2026, 16(2), 992; https://doi.org/10.3390/app16020992 - 19 Jan 2026
Viewed by 322
Abstract
In the era of ever-greater data production and collection, public health research is often limited by the scarcity of data. To improve this, we propose data sharing in the form of Data Spaces, which provide the technical, business, and legal conditions for easier and trustworthy data exchange among all participants. The data must be described in a commonly understandable way, which can be ensured by machine-readable ontologies. We compared the semantic interoperability technologies used in the European Data Spaces initiatives and adopted them in our use case of physical development in children and youth. We propose an ontology describing data from the Analysis of Children’s Development in Slovenia (ACDSi) study in the Resource Description Framework (RDF) format and a corresponding Next Generation Service Interfaces-Linked Data (NGSI-LD) data model. For this purpose, we have developed a tool that generates an NGSI-LD data model from the information in an ontology in RDF format. The tool builds on the standard’s declaration that the NGSI-LD information model follows the graph structure of RDF, which makes such a translation feasible. The source RDF ontology is analyzed using the standardized SPARQL Protocol and RDF Query Language (SPARQL), specifically Property Path queries. The NGSI-LD data model is then generated from the definitions collected in the analysis. The translation has been verified on the Smart Applications REFerence (SAREF) ontology SAREF4BLDG and its corresponding Smart Data Models (52 models at the time). The generated artifacts have been tested on a Context Broker reference implementation. The tool supports basic ontology structures; translating more complex structures requires further development.
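To picture the analysis step the abstract describes, here is a minimal sketch (not the authors' tool) that uses rdflib to run a SPARQL Property Path query over an RDF ontology and collect, per class, the properties that an NGSI-LD-style data model would list; the ontology file name, the query shape, and the output structure are illustrative assumptions.

```python
# Minimal sketch, assuming rdflib: analyze an RDF ontology with a SPARQL
# Property Path query and collect, per class, the properties whose domain is
# the class or any of its superclasses. The ontology file name and the
# output shape are illustrative assumptions, not the paper's tool.
import json
from rdflib import Graph

g = Graph()
g.parse("acdsi_ontology.ttl", format="turtle")  # hypothetical ontology file

query = """
PREFIX owl:  <http://www.w3.org/2002/07/owl#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?cls ?prop WHERE {
    ?cls a owl:Class .
    # Property Path: the property's domain is ?cls or one of its superclasses.
    ?prop rdfs:domain/^rdfs:subClassOf* ?cls .
}
"""

models = {}
for cls, prop in g.query(query):
    entry = models.setdefault(str(cls), {"entityType": str(cls), "properties": []})
    entry["properties"].append(str(prop))

# Skeleton from which an NGSI-LD data model document could be emitted.
print(json.dumps(models, indent=2))
```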

32 pages, 5439 KB  
Article
Architectural and Structural Interoperability in the BIM Design Workflow
by Piero Colajanni, Laura Inzerillo, Alessandro Pisciotta, Francesco Acuto, Konstantinos Mantalovas and Gaetano Di Mino
Buildings 2025, 15(24), 4540; https://doi.org/10.3390/buildings15244540 - 16 Dec 2025
Viewed by 1537
Abstract
Achieving reliable interoperability between architectural and structural models remains one of the main challenges in BIM-based design workflows. Despite the widespread adoption of Building Information Modeling, the automatic transfer of information between modeling software and FEM analysis tools continues to generate inconsistencies, information loss, and the need for manual interventions. This study examines these issues through the case study of a reinforced-concrete residential building located in Palermo, used to evaluate BIM-to-FEM exchanges between Revit®, Robot Structural Analysis®, PRO_SAP®, and JASP®. The interoperability tests highlight significant limitations in both native and IFC-based workflows. The direct Revit–Robot link ensures good geometric consistency but still requires manual correction of analytical axes, connections, and boundary conditions. Indirect transfers via IFC exhibit greater instability: both IFC2x3 Coordination View 2.0 and IFC4 Reference View show difficulties in correctly interpreting structural elements and do not adequately preserve analytical relationships, resulting in unconnected slabs, disconnected nodes, and missing constraint information. In PRO_SAP®, several elements are also absent after IFC import. To address these issues, the study proposes a workflow based on the integration of Revit® and JASP® aimed at generating a reliable federated model. This model was further validated in Navisworks®, Solibri Anywhere®, BIM Vision®, and Enscape® to assess its correct interpretation across different software environments. This approach enhances interdisciplinary coordination, supports clash detection, facilitates immersive VR-based review, and centralizes architectural, structural, and MEP models into a unified environment. The results show that structured workflows and careful management of native and IFC transfers significantly improve model reliability and reduce design inconsistencies.

35 pages, 2154 KB  
Article
Real-Time Digital Twins for Building Energy Optimization Through Blind Control: Functional Mock-Up Units, Docker Container-Based Simulation, and Surrogate Models
by Cristina Nuevo-Gallardo, Iker Landa del Barrio, Markel Flores Iglesias, Juan B. Echeverría Trueba and Carlos Fernández Bandera
Appl. Sci. 2025, 15(24), 12888; https://doi.org/10.3390/app152412888 - 6 Dec 2025
Cited by 1 | Viewed by 993
Abstract
The transition toward energy-efficient and smart buildings requires Digital Twins (DTs) that can couple real-time data with physics-based Building Energy Models (BEMs) for predictive and adaptive operation. Yet, despite rapid digitalisation, there remains a lack of practical guidance and real-world implementations demonstrating how calibrated BEMs can be effectively integrated into Building Management Systems (BMSs). This study addresses that gap by presenting a complete and reproducible end-to-end framework for embedding physics-based BEMs into operational DTs using two setups: (i) encapsulation as Functional Mock-up Units (FMUs) and (ii) containerisation via Docker. Both approaches were deployed and tested in a real educational building in Cáceres (Spain), equipped with a LoRaWAN-based sensing and actuation infrastructure. A systematic comparison highlights their respective trade-offs: FMUs offer faster execution but limited weather inputs and higher implementation effort, whereas Docker-based workflows provide full portability, scalability, and native interoperability with Internet of Things (IoT) and BMS architectures. To enable real-time operation, a surrogate modelling framework was embedded within the Docker architecture to replicate the optimisation logic of the calibrated BEM and generate predictive blind control schedules in milliseconds—bypassing simulation overhead and enabling continuous actuation. The combined Docker + surrogate setup achieved 10–15% heating energy savings during winter operation without any HVAC retrofit. Beyond the case study, this work provides a step-by-step, in-depth guideline for practitioners to integrate calibrated BEMs into real-time control loops using existing toolchains. The proposed approach demonstrates how hybrid physics- and data-driven DTs can transform building management into a scalable, energy-efficient, and operationally deployable reality.
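For readers unfamiliar with the FMU route mentioned above, a minimal sketch of running a Functional Mock-up Unit with the fmpy package follows; the FMU file name, simulation horizon, and output variable are placeholder assumptions, not details taken from the study.

```python
# Minimal sketch, assuming fmpy: simulate a co-simulation FMU, as in the
# first of the two integration routes described above. The file name,
# horizon, and output variable name are placeholder assumptions.
from fmpy import simulate_fmu

result = simulate_fmu(
    "building_bem.fmu",           # hypothetical calibrated BEM exported as an FMU
    stop_time=3600.0,             # one hour of simulated operation, in seconds
    output=["zone_temperature"],  # hypothetical monitored variable
)

# 'result' is a structured NumPy array with a 'time' column plus the
# requested outputs, ready to compare against sensor data from the BMS.
print(result["time"][-1], result["zone_temperature"][-1])
```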

27 pages, 1802 KB  
Perspective
Toward Artificial Intelligence in Oncology and Cardiology: A Narrative Review of Systems, Challenges, and Opportunities
by Visar Vela, Ali Yasin Sonay, Perparim Limani, Lukas Graf, Besmira Sabani, Diona Gjermeni, Andi Rroku, Arber Zela, Era Gorica, Hector Rodriguez Cetina Biefer, Uljad Berdica, Euxhen Hasanaj, Adisa Trnjanin, Taulant Muka and Omer Dzemali
J. Clin. Med. 2025, 14(21), 7555; https://doi.org/10.3390/jcm14217555 - 24 Oct 2025
Cited by 1 | Viewed by 2172
Abstract
Background: Artificial intelligence (AI), the overarching field that includes machine learning (ML) and its subfield deep learning (DL), is rapidly transforming clinical research by enabling the analysis of high-dimensional data and automating the output of diagnostic and prognostic tests. As clinical trials become increasingly complex and costly, ML-based approaches (especially DL for image and signal data) offer promising solutions, although they require new approaches in clinical education. Objective: To explore current and emerging AI applications in oncology and cardiology, highlight real-world use cases, and discuss the challenges and future directions for responsible AI adoption. Methods: This narrative review summarizes various aspects of AI technology in clinical research, exploring its promise, use cases, and limitations. The review was based on a literature search in PubMed covering publications from 2019 to 2025. Search terms included “artificial intelligence”, “machine learning”, “deep learning”, “oncology”, “cardiology”, “digital twin”, and “AI-ECG”. Preference was given to studies presenting validated or clinically applicable AI tools, while non-English articles, conference abstracts, and gray literature were excluded. Results: AI demonstrates significant potential in improving diagnostic accuracy, facilitating biomarker discovery, and detecting disease at an early stage. In clinical trials, AI improves patient stratification, site selection, and virtual simulations via digital twins. However, challenges remain in data harmonization, model validation, cross-disciplinary training, fairness, explainability, and the robustness of the gold standards against which AI models are built. Conclusions: The integration of AI in clinical research can enhance efficiency, reduce costs, and pave the way towards personalized medicine. Realizing this potential requires robust validation frameworks, transparent model interpretability, and collaborative efforts among clinicians, data scientists, and regulators. Interoperable data systems and cross-disciplinary education will be critical to enabling the integration of scalable, ethical, and trustworthy AI into healthcare.
(This article belongs to the Section Clinical Research Methods)

28 pages, 3015 KB  
Article
Systemic Assessment of IoT Readiness and Economic Impact in Postal Services
by Kristína Kováčiková, Martin Baláž, Martina Kováčiková and Andrej Novák
Systems 2025, 13(10), 910; https://doi.org/10.3390/systems13100910 - 17 Oct 2025
Viewed by 728
Abstract
This research develops and applies the IoTRIM model to assess the economic and operational implications of IoT integration in postal and courier enterprises in Slovakia. Combining a multi-criteria evaluation framework with an extended Cobb–Douglas production function, the analysis captures both readiness levels and their translation into output performance. The IoTRIM assessment reveals heterogeneous distributions of strengths across four strategic and technical pillars, with notable disparities between connectivity, data analytics, and interoperability capacities. Monte Carlo simulations under pessimistic, realistic, and optimistic scenarios highlight divergent digital trajectories among enterprises, with some demonstrating accelerated gains from IoT readiness while others face structural bottlenecks in infrastructure and process integration. Hypothesis testing indicates that while a positive and statistically significant relationship between IoT readiness and output is observed in selected cases, this effect is not universal across all enterprises and scenarios. However, the inclusion of IoT readiness consistently improves the explanatory power of the production function models. The findings underline that digital transformation outcomes depend not only on investment scale but also on systemic absorption capacity, including interoperability, data governance, and organizational alignment. The proposed approach offers both a methodological contribution for measuring digital readiness impacts and practical insights for strategic planning in the postal and courier sector.
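To make the modelling idea concrete, here is a minimal sketch of an extended Cobb–Douglas production function with an IoT-readiness term, evaluated under Monte Carlo scenario draws; all elasticities, inputs, and readiness ranges are invented for illustration and are not the paper's estimates.

```python
# Minimal sketch: extended Cobb-Douglas output with an IoT-readiness term,
#   Y = A * K**alpha * L**beta * R**gamma,
# evaluated under Monte Carlo scenario draws. All numbers are invented for
# illustration; they are not the paper's estimates.
import numpy as np

rng = np.random.default_rng(42)
A, alpha, beta, gamma = 1.0, 0.35, 0.55, 0.10   # assumed elasticities
K, L = 500.0, 120.0                              # assumed capital and labour

scenarios = {
    "pessimistic": (0.2, 0.4),    # assumed IoT-readiness range per scenario
    "realistic":   (0.4, 0.7),
    "optimistic":  (0.7, 0.95),
}

for name, (lo, hi) in scenarios.items():
    R = rng.uniform(lo, hi, size=10_000)          # readiness draws
    Y = A * K**alpha * L**beta * R**gamma         # simulated output
    print(f"{name:>11}: mean Y = {Y.mean():.1f}, 5-95% = "
          f"[{np.percentile(Y, 5):.1f}, {np.percentile(Y, 95):.1f}]")
```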
(This article belongs to the Section Systems Practice in Social Science)

15 pages, 884 KB  
Article
Enhancing Sustainability Through Quality Controlled Energy Data: The Horizon 2020 EnerMaps Project
by Simon Pezzutto, Dario Bottino-Leone and Eric John Wilczynski
Sustainability 2025, 17(17), 7684; https://doi.org/10.3390/su17177684 - 26 Aug 2025
Cited by 1 | Viewed by 1261
Abstract
The Horizon 2020 EnerMaps project addresses the fragmentation and variable reliability of European energy datasets by developing a reproducible quality control (QC) framework aligned with FAIR principles. This research supports sustainability goals by enabling better decision making in energy management, resource optimization, and sustainable policy development. This study applies this framework to an initial inventory of 50 spatially referenced energy datasets, classifying them into three assessment levels and subjecting each level to progressively deeper checks: expert consultation, metadata verification against a customized “DataCite/schema.org” schema, documentation review, completeness analysis, consistency testing via simple linear regressions, comparative descriptive statistics, and community feedback preparation. The results show that all datasets are findable and accessible, yet critical FAIR attributes remain weak: 68% lack explicit licenses and 96% omit terms-of-use statements; methodology descriptions are present in 77% of cases, while quantitative accuracy information appears in only 43%. Completeness screening reveals that more than half of the datasets exhibit over 20% missing values in one or more key dimensions. Consistency analyses nevertheless indicate statistically significant correlations (p < 0.05) for the majority of paired comparisons, supporting basic reliability. By improving the FAIRness (Findable, Accessible, Interoperable, Reusable) of energy data, this study directly contributes to more effective sustainability assessments and interventions. The proposed QC workflow therefore provides a scalable route to improve the transparency, comparability, and reusability of heterogeneous energy data, and its adoption could accelerate open energy modelling and policy analysis across Europe.
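Two of the checks named above, completeness screening and consistency testing, are easy to picture; the sketch below implements them on an invented dataset, with only the 20% missing-value threshold and the p < 0.05 criterion taken from the abstract.

```python
# Minimal sketch of two QC checks described above: completeness screening
# (flagging key dimensions with >20% missing values) and consistency testing
# via simple linear regression (p < 0.05). The dataset and column names are
# invented; only the 20% and 0.05 thresholds come from the abstract.
import numpy as np
import pandas as pd
from scipy.stats import linregress

rng = np.random.default_rng(0)
floor_area = rng.normal(80, 10, 200)
df = pd.DataFrame({
    "floor_area_m2": floor_area,
    "heat_demand_kwh": 1.2 * floor_area + rng.normal(0, 5, 200),
})
df.loc[rng.choice(200, 50, replace=False), "floor_area_m2"] = np.nan

# Completeness: share of missing values per key dimension.
missing_share = df.isna().mean()
print("Over the 20% missing threshold:\n", missing_share[missing_share > 0.20])

# Consistency: a statistically significant slope (p < 0.05) between two
# related indicators supports basic reliability of the paired datasets.
valid = df.dropna()
res = linregress(valid["floor_area_m2"], valid["heat_demand_kwh"])
print(f"slope = {res.slope:.2f}, p-value = {res.pvalue:.4g}")
```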

23 pages, 3864 KB  
Article
Co-Optimization of Market and Grid Stability in High-Penetration Renewable Distribution Systems with Multi-Agent
by Dongli Jia, Zhaoying Ren and Keyan Liu
Energies 2025, 18(12), 3209; https://doi.org/10.3390/en18123209 - 19 Jun 2025
Cited by 5 | Viewed by 1536
Abstract
The large-scale integration of renewable energy and electric vehicles (EVs) into power distribution systems presents complex operational challenges, particularly in coordinating market mechanisms with grid stability requirements. This study proposes a new dispatching method based on dynamic electricity prices to coordinate the relationship between the market and the physical characteristics of the power grid. The proposed approach introduces a multi-agent transaction model incorporating voltage regulation metrics and network loss considerations into market bidding mechanisms. For EV integration, a differentiated scheduling strategy categorizes vehicles based on usage patterns and charging elasticity. The methodological innovations primarily include an enhanced scheduling algorithm for coordinated optimization of renewable energy and energy storage, and a dynamic coordinated optimization method for EV clusters. Implemented on a modified IEEE test system, the framework demonstrates improved voltage stability through price-guided energy storage dispatch, with coordinated strategies effectively balancing peak demand management and renewable energy utilization. Case studies verify the system’s capability to align economic incentives with technical objectives, where time-of-use pricing dynamically regulates storage operations to enhance reactive power support during critical periods. This research establishes a theoretical linkage between electricity market dynamics and grid security constraints, providing system operators with a holistic tool for managing high-renewable penetration networks. By bridging market participation with operational resilience, this work contributes actionable insights for developing interoperable electricity market architectures in energy transition scenarios.
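As a toy illustration of the price-guided storage dispatch mentioned above, the sketch below charges a battery in low-price hours and discharges in high-price hours under simple power and energy limits; the time-of-use prices and storage parameters are invented and do not come from the study.

```python
# Toy sketch of price-guided energy-storage dispatch: charge in cheap hours,
# discharge in expensive ones, within power and energy limits. Prices and
# storage parameters are invented for illustration only.
tou_price = [0.08] * 7 + [0.20] * 4 + [0.12] * 6 + [0.25] * 4 + [0.08] * 3  # EUR/kWh, 24 h
cap_kwh, p_max_kw, soc = 40.0, 10.0, 20.0       # assumed storage parameters

cheap = sorted(tou_price)[len(tou_price) // 3]      # lower-tercile price threshold
dear = sorted(tou_price)[2 * len(tou_price) // 3]   # upper-tercile price threshold

for hour, price in enumerate(tou_price):
    if price <= cheap and soc < cap_kwh:            # charge when cheap
        charge = min(p_max_kw, cap_kwh - soc)
        soc += charge
        action = f"charge {charge:.1f} kWh"
    elif price >= dear and soc > 0:                 # discharge when expensive
        discharge = min(p_max_kw, soc)
        soc -= discharge
        action = f"discharge {discharge:.1f} kWh"
    else:
        action = "idle"
    print(f"h{hour:02d} price={price:.2f} soc={soc:4.1f} -> {action}")
```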

11 pages, 535 KB  
Review
Data-Driven Defragmentation: Achieving Value-Based Sarcoma and Rare Cancer Care Through Integrated Care Pathway Mapping
by Bruno Fuchs and Philip Heesen
J. Pers. Med. 2025, 15(5), 203; https://doi.org/10.3390/jpm15050203 - 19 May 2025
Cited by 1 | Viewed by 1258
Abstract
Sarcomas, a rare and complex group of cancers, require multidisciplinary care across multiple healthcare settings, often leading to delays, redundant testing, and fragmented data. This fragmented care landscape obstructs the implementation of Value-Based Healthcare (VBHC), where care efficiency is tied to measurable patient outcomes. ShapeHub, an interoperable digital platform, aims to streamline sarcoma care by centralizing patient data across providers, akin to a logistics system tracking an item through each stage of delivery. ShapeHub integrates diagnostics, treatment records, and specialist consultations into a unified dataset accessible to all care providers, enabling timely decision-making and reducing diagnostic delays. In a case study within the Swiss Sarcoma Network, ShapeHub has shown substantial impact, improving diagnostic pathways, reducing unplanned surgeries, and optimizing radiotherapy protocols. Through AI-driven natural language processing, Fast Healthcare Interoperability Resources (FHIR), and Health Information Exchanges (HIEs), the platform transforms unstructured records into real-time, actionable insights, enhancing multidisciplinary collaboration and clinical outcomes. By identifying redundancies, ShapeHub also contributes to cost efficiency, benchmarking treatment costs across institutions and optimizing care pathways. This data-driven approach creates a foundation for precision medicine applications, including digital twin technology, to predict treatment responses and personalize care plans. ShapeHub offers a scalable model for managing rare cancers and complex diseases, harmonizing care pathways, improving precision oncology, and transforming VBHC into a reality. This article outlines the potential of ShapeHub to overcome fragmented data barriers and improve patient-centered care.
(This article belongs to the Section Methodology, Drug and Device Discovery)

17 pages, 4337 KB  
Article
Building Information Modeling (BIM)-Based Building Life Cycle Assessment (LCA) Using Industry Foundation Classes (IFC) File Format
by Ksenia Strelets, Daria Zaborova, David Kokaya, Marina Petrochenko and Egor Melekhin
Sustainability 2025, 17(7), 2848; https://doi.org/10.3390/su17072848 - 24 Mar 2025
Cited by 8 | Viewed by 3086
Abstract
In the realm of sustainable construction, Life Cycle Assessment (LCA) plays a key role as a tool for quantifying the environmental impacts of building materials and products. The integration of LCA and Building Information Modeling (BIM) makes it possible to evaluate the environmental performance of buildings at the design stage. This integration can help to improve the LCA process for buildings thanks to the potential for automation and interoperability. The goal of this study is to establish a BIM-based LCA workflow using the Industry Foundation Classes (IFC) open file format. The interoperability of BIM data exchange is achieved by applying IFC. The steps of the assessment process are described in accordance with the LCA phases outlined in the ISO 14040 standard. The impact assessment and results interpretation phases are automated by means of a program code for IFC file processing. The proposed BIM-based LCA is validated for a case study of a BIM model constructed for a three-story educational building. To test the proposed workflow, the global warming potential (GWP) of the building materials and products of the envelope and load-bearing structures at the A1–A3 life cycle stages is calculated. The resulting workflow allows the calculation of negative environmental impacts to remain agile, depending on the goal and scope set.
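A minimal sketch of the IFC-processing idea follows, assuming the ifcopenshell library and an invented table of GWP factors; the quantity take-off is reduced to fixed per-element volumes, where the paper's actual program code would read quantities from the model itself.

```python
# Minimal sketch of the impact-assessment step: read an IFC model with
# ifcopenshell and sum A1-A3 GWP per element class. The file name, GWP
# factors, and fixed per-element volumes are invented placeholders; a real
# take-off would read quantities from the elements' property sets.
import ifcopenshell

GWP_KG_CO2E_PER_M3 = {"IfcWall": 250.0, "IfcSlab": 300.0, "IfcBeam": 320.0}
ASSUMED_VOLUME_M3 = {"IfcWall": 2.5, "IfcSlab": 4.0, "IfcBeam": 0.8}

model = ifcopenshell.open("educational_building.ifc")  # hypothetical model

total = 0.0
for ifc_class, factor in GWP_KG_CO2E_PER_M3.items():
    elements = model.by_type(ifc_class)
    volume = len(elements) * ASSUMED_VOLUME_M3[ifc_class]
    total += volume * factor
    print(f"{ifc_class}: {len(elements)} elements, ~{volume * factor:.0f} kg CO2e")

print(f"Total A1-A3 GWP (rough): {total:.0f} kg CO2e")
```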

34 pages, 12341 KB  
Article
Development and Validation of Digital Twin Behavioural Model for Virtual Commissioning of Cyber-Physical System
by Roman Ruzarovsky, Tibor Horak, Roman Zelník, Richard Skypala, Martin Csekei, Ján Šido, Eduard Nemlaha and Michal Kopcek
Appl. Sci. 2025, 15(5), 2859; https://doi.org/10.3390/app15052859 - 6 Mar 2025
Cited by 8 | Viewed by 4598
Abstract
Modern manufacturing systems are influenced by the growing complexity of mechatronics, control systems, IIoT, and communication technologies integrated into cyber-physical systems. These systems demand flexibility, modularity, and rapid project execution, making digital tools critical for their design. Virtual commissioning, based on digital twins, enables the testing and validation of control systems and designs in virtual environments, reducing risks and accelerating time-to-market. This research explores the development of digital twin models to bridge the gap between simulation and real-world validation. The models identify design flaws, validate the PLC control code, and ensure interoperability across software platforms. A case study involving a modular Festo manufacturing system modelled in Tecnomatix Process Simulate demonstrates the ability of digital twins to detect inefficiencies, such as collision risks, and to validate automation systems virtually. This study highlights the advantages of virtual commissioning for optimizing manufacturing systems. Communication testing showed compatibility across platforms but revealed limitations with certain data types due to software constraints. This research provides practical insights into creating robust digital twin models, improving the flexibility, efficiency, and quality of manufacturing system design. It also offers recommendations to address current challenges in interoperability and system performance.

32 pages, 2442 KB  
Article
Federated Learning System for Dynamic Radio/MEC Resource Allocation and Slicing Control in Open Radio Access Network
by Mario Martínez-Morfa, Carlos Ruiz de Mendoza, Cristina Cervelló-Pastor and Sebastia Sallent-Ribes
Future Internet 2025, 17(3), 106; https://doi.org/10.3390/fi17030106 - 26 Feb 2025
Cited by 2 | Viewed by 2637
Abstract
The evolution of cellular networks from fifth-generation (5G) architectures to beyond 5G (B5G) and sixth-generation (6G) systems necessitates innovative solutions to overcome the limitations of traditional Radio Access Network (RAN) infrastructures. Existing monolithic and proprietary RAN components restrict adaptability, interoperability, and optimal resource utilization, posing challenges in meeting the stringent requirements of next-generation applications. The Open Radio Access Network (O-RAN) and Multi-Access Edge Computing (MEC) have emerged as transformative paradigms, enabling disaggregation, virtualization, and real-time adaptability—which are key to achieving ultra-low latency, enhanced bandwidth efficiency, and intelligent resource management in future cellular systems. This paper presents a Federated Deep Reinforcement Learning (FDRL) framework for dynamic radio and edge computing resource allocation and slicing management in O-RAN environments. An Integer Linear Programming (ILP) model has also been developed; relative to it, the proposed FDRL solution drastically reduces the system response time. Unlike centralized Reinforcement Learning (RL) approaches, the proposed FDRL solution leverages Federated Learning (FL) to optimize performance while preserving data privacy and reducing communication overhead. Comparative evaluations against centralized models demonstrate that the federated approach improves learning efficiency and reduces bandwidth consumption. The system has been rigorously tested across multiple scenarios, including multi-client O-RAN environments and loss-of-synchronization conditions, confirming its resilience in distributed deployments. Additionally, a case study simulating realistic traffic profiles validates the proposed framework’s ability to dynamically manage radio and computational resources, ensuring efficient and adaptive O-RAN slicing for diverse and high-mobility scenarios.
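For orientation, a bare-bones federated averaging round is sketched below in NumPy: each client computes a local update on private data and only model weights are aggregated by the server. The dimensions, synthetic data, and plain FedAvg rule are illustrative assumptions; the paper's FDRL agents are far richer.

```python
# Bare-bones federated averaging (FedAvg): clients train locally and share
# only weight vectors, which the server averages. Dimensions, data, and
# learning rate are illustrative; the paper's FDRL agents are richer.
import numpy as np

rng = np.random.default_rng(7)
n_clients, dim, lr = 4, 8, 0.1
global_w = np.zeros(dim)

def local_update(w, rng):
    """One local gradient step on private (here: synthetic) client data."""
    X = rng.normal(size=(32, dim))
    y = X @ np.ones(dim) + rng.normal(scale=0.1, size=32)
    grad = 2 * X.T @ (X @ w - y) / len(y)       # least-squares gradient
    return w - lr * grad

for rnd in range(5):
    client_ws = [local_update(global_w.copy(), rng) for _ in range(n_clients)]
    global_w = np.mean(client_ws, axis=0)       # server-side FedAvg aggregation
    print(f"round {rnd}: |w - w*| = {np.linalg.norm(global_w - 1):.3f}")
```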
(This article belongs to the Special Issue AI and Security in 5G Cooperative Cognitive Radio Networks)

17 pages, 2115 KB  
Article
Expanding IMO Compendium with NAVTEX Messages for Maritime Single Window
by Changui Lee and Seojeong Lee
J. Mar. Sci. Eng. 2024, 12(12), 2328; https://doi.org/10.3390/jmse12122328 - 19 Dec 2024
Cited by 1 | Viewed by 2636
Abstract
The International Maritime Organization (IMO) introduced the Maritime Service Portfolio (MSP) and Maritime Single Window (MSW) to enhance the digitalization and efficiency of maritime transportation. While the MSP defines 16 maritime services focused on safety, security, efficiency, and environmental protection, the MSW provides a unified digital platform for submitting and processing information related to a ship’s operations. To support the implementation of MSW, the IMO Compendium provides standardized data sets and reference models to enable seamless information exchange across maritime systems. This paper proposes an expansion of the IMO Compendium to integrate the MSP’s maritime safety information service into the MSW environment. The study focuses on the integration of NAVTEX messages, a key source of navigational safety information, by identifying their key attributes and structuring them according to the IHO S-124 standard. A case study demonstrates the feasibility of the proposed data structure by transforming a sample NAVTEX message into the expanded IMO Compendium format and testing its transmission using an open-source MQTT library. This paper provides a structured methodology for integrating NAVTEX messages, effectively bridging legacy systems with modern digital infrastructures and facilitating enhanced interoperability in maritime operations. The proposed data structure will be presented to standardization bodies for further consideration, contributing to ongoing efforts to improve maritime operational efficiency and support digital transformation.
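The transmission test mentioned above can be pictured in a few lines of paho-mqtt; the broker address, topic name, and the S-124-style JSON fields below are invented stand-ins, not the paper's actual data structure.

```python
# Minimal sketch, assuming paho-mqtt and a reachable broker: publish a
# transformed NAVTEX warning over MQTT. Broker, topic, and the S-124-style
# payload fields are invented stand-ins, not the paper's data structure.
import json
import paho.mqtt.publish as publish

warning = {
    "messageType": "NavigationalWarning",   # hypothetical S-124-style fields
    "navtexId": "ZA12",
    "area": "NAVAREA XI",
    "text": "DERELICT VESSEL ADRIFT IN POSITION 34-10N 128-30E.",
}

publish.single(
    topic="msw/msi/navtex",      # hypothetical MSW topic
    payload=json.dumps(warning),
    qos=1,                       # at-least-once delivery for safety data
    hostname="localhost",        # stand-in broker address
)
```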

21 pages, 1023 KB  
Article
Enablers to Digitalization in Agriculture: A Case Study from Italian Field Crop Farms in the Po River Valley, with Insights for Policy Targeting
by Azzurra Giorgio, Laura Priscila Penate Lopez, Danilo Bertoni, Daniele Cavicchioli and Giovanni Ferrazzi
Agriculture 2024, 14(7), 1074; https://doi.org/10.3390/agriculture14071074 - 3 Jul 2024
Cited by 8 | Viewed by 3459
Abstract
The prosperity of the Po River Valley’s quality agri-food system depends on the efficiency of its field crops, which have recently been facing a crisis evidenced by decreasing cultivated areas and stagnating yields. Several factors, including EU policies and climate variability, demand improved use of production factors and adapted business models: the literature shows how digitalization and Agriculture 4.0 can contribute to addressing these challenges. This paper aims to explore the drivers of and barriers to the adoption of digitalization among Po River Valley field crop farms, from a dynamic perspective. Using a case study approach to guarantee adequate consideration of context and conditions, three farms were studied. As one of the main outcomes, several drivers (digital skills, data management practices, and interoperability) were identified that should be at the heart of policies, either as demands made of farmers in exchange for financial contributions or as an “innovation space” offered by EU institutions. Policies should not only support the acquisition of mechanical/digital equipment but also promote the evolution of farmers’ human capital. The framework developed paves the way for future research on the degree of farm digitalization in the same or similar territorial contexts: the identified drivers of digital transition can serve as a basis for survey questionnaires and can be tested for validity.
(This article belongs to the Section Agricultural Economics, Policies and Rural Management)
