Search Results (103)

Search Parameters:
Keywords = RESTful API

21 pages, 848 KB  
Article
Automated Multi-Platform EDI Integration for B2B Retail: A Romanian Case Study on System Architecture, Implementation, and e-Factura Convergence
by Ionut Adrian Tudoroiu, Andrei Cosmin Gheorghe and Emil Mihai Diaconu
Electronics 2026, 15(7), 1475; https://doi.org/10.3390/electronics15071475 - 1 Apr 2026
Viewed by 264
Abstract
The mandatory introduction of Romania’s national e-invoicing system, ANAF e-Factura, in January 2024 has reshaped B2B document exchange in the retail sector, but suppliers still operate in parallel with two proprietary electronic data interchange (EDI) platforms, EDINET and DocProcess, which increases integration complexity. This paper presents the architecture, implementation, and evaluation of a custom Laravel-based B2B platform developed to automate commercial workflows across these three channels. The system supports XML purchase order ingestion and normalization, product identifier resolution, unified order persistence, platform-specific invoice XML generation, and ANAF SPV submission via SmartBill and Oblio REST APIs. A comparative analysis of real production XML documents showed full field-level overlap across 21 invoice data dimensions, with the main differences between systems related to entity identification schemes rather than business information content. During 2025, the platform processed 1247 EDI purchase orders and achieved an 87.30% fully automated processing rate, reaching 94.60% by year-end through progressive product catalog enrichment. The results indicate that ANAF e-Factura is technically capable of covering the core invoice exchange function currently duplicated by proprietary EDI platforms, while their coexistence continues to impose additional integration effort and slows digital transformation, particularly for small and medium-sized suppliers. Full article
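The purchase-order ingestion and identifier-resolution steps this abstract describes can be sketched roughly as below. The element names, order schema, and EAN-to-SKU catalog are invented for illustration; they are not the authors' actual EDINET/DocProcess formats.

```python
# Hypothetical sketch of XML purchase-order normalization with product
# identifier resolution. Schema and catalog values are illustrative only.
import xml.etree.ElementTree as ET

SAMPLE_ORDER = """
<Order number="PO-1001">
  <Line><EAN>5941234567890</EAN><Qty>12</Qty></Line>
  <Line><EAN>5949876543210</EAN><Qty>4</Qty></Line>
</Order>
"""

# Illustrative product catalog: external EAN -> internal SKU.
CATALOG = {"5941234567890": "SKU-001", "5949876543210": "SKU-002"}

def normalize_order(xml_text: str, catalog: dict) -> dict:
    """Parse one purchase order and resolve product identifiers."""
    root = ET.fromstring(xml_text)
    lines = []
    for line in root.findall("Line"):
        ean = line.findtext("EAN")
        lines.append({
            "ean": ean,
            "sku": catalog.get(ean),   # None => needs catalog enrichment
            "qty": int(line.findtext("Qty")),
        })
    return {"order_number": root.get("number"), "lines": lines}

order = normalize_order(SAMPLE_ORDER, CATALOG)
```

Unresolved identifiers (a `None` SKU) would fall out of the automated path, which is consistent with the paper's observation that automation rates rose as the catalog was enriched.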

13 pages, 268 KB  
Article
Hard Gelatin Capsules Compounded and Dispersed in Water in Pediatrics: Real Versus Theoretical Dose Administered
by Romain Paoli-Lombardo, Nicolas Primas, Clémence Tabélé, Ikram Zaddam, Eya Iben Slimene, Pascal Rathelot, Patrice Vanelle, Caroline Castera-Ducros and Christophe Curti
Pharmaceuticals 2026, 19(4), 534; https://doi.org/10.3390/ph19040534 - 25 Mar 2026
Viewed by 296
Abstract
Background: In pediatric practice, dose individualization often requires the manipulation of solid oral dosage forms, such as dispersing capsules in water and administering only part of the volume. Despite its frequent use, this practice is poorly documented and may lead to inaccurate dosing. Objectives: This study aimed to assess the actual dose administered when compounded hard gelatin capsules are dispersed in water and partially withdrawn, and to evaluate the influence of different manipulation protocols on dose recovery. Methods: Ten active pharmaceutical ingredients (APIs) routinely compounded as pediatric hard gelatin capsules were studied. Content uniformity was first verified according to European Pharmacopoeia (EP) requirements. One capsule was dispersed in 2 mL of water, and 1 mL was withdrawn using three protocols: (1) no mixing, (2) gentle manual mixing with immediate sampling, and (3) gentle manual mixing followed by a 10 s resting period before sampling. Drug content in the withdrawn volume was quantified using validated HPLC-UV methods. Results are expressed as the mean percentage of the theoretical dose ± standard deviation. Results: All capsules complied with EP content uniformity criteria. However, partial volume administration resulted in marked and protocol-dependent deviations from the theoretical dose. Without mixing, recovered doses ranged from 17% to 58% of the target dose, with high variability. Gentle mixing improved dose recovery, particularly for APIs forming solutions, such as captopril, thiamine hydrochloride, and clonidine hydrochloride, which achieved values close to 90%. In contrast, APIs forming suspensions consistently resulted in underdosing, even after mixing, with further reductions observed after a short resting period, indicating rapid sedimentation. Conclusions: Fractional administration of dispersed hard gelatin capsules leads to unpredictable and often clinically relevant underdosing, especially for poorly soluble APIs. 
Whenever possible, capsules should be compounded at the prescribed dose, and liquid formulations should be preferred when dose fractionation is required. Full article
(This article belongs to the Section Pharmaceutical Technology)

26 pages, 409 KB  
Article
Unified Data Governance in Heterogeneous Database Environments: An API-Driven Architecture for Multi-Platform Policy Enforcement
by Maryam Abbasi, Paulo Váz, José Silva, Filipe Cardoso, Filipe Sá and Pedro Martins
Data 2026, 11(3), 54; https://doi.org/10.3390/data11030054 - 7 Mar 2026
Viewed by 568
Abstract
Modern organizations increasingly rely on heterogeneous database environments that combine relational, document-oriented, and key-value storage systems to optimize performance for diverse application requirements. However, this technological diversity creates significant challenges for implementing consistent data governance policies, regulatory compliance, and access control across disparate systems. Traditional governance approaches that operate within individual database silos fail to provide unified policy enforcement and create compliance gaps that expose organizations to regulatory and operational risks. This paper presents a novel API-driven architecture that enables unified data governance across heterogeneous database environments without requiring database-specific modifications or vendor lock-in. The proposed framework implements a centralized governance layer that coordinates policy enforcement across PostgreSQL, MongoDB, and Amazon DynamoDB systems through RESTful API interfaces. Key architectural components include differentiated access control through hierarchical API key management, automated compliance workflows for regulatory requirements such as GDPR, real-time audit trail generation, and comprehensive data quality monitoring with automated improvement mechanisms. Comprehensive experimental evaluation demonstrates the framework’s effectiveness across multiple operational dimensions. The system achieved 95.2% accuracy in access control enforcement across different data classification levels, while automated GDPR compliance workflows demonstrated 98.6% success rates with average processing times of 2.9 h. Performance evaluation reveals acceptable overhead characteristics with linear scaling patterns for PostgreSQL operations (R2 = 0.89), consistent sub-20ms response times for MongoDB logging operations, and sustained throughput rates ranging from 38.9 to 142.7 requests per second across the integrated system. 
Data quality improvements ranged from 16.1% to 34.3% across accuracy, completeness, consistency, and timeliness dimensions over a 12-week monitoring period, with accuracy improving by 17.8 percentage points, completeness by 13.2 percentage points, consistency by 19.7 percentage points, and timeliness by 24.5 percentage points. The duplicate detection system achieved 94.6% precision and 95.6% recall across various duplicate types, including cross-database redundancy identification. The results demonstrate that API-driven governance architectures can effectively address the persistent challenges of policy fragmentation in multi-database environments while maintaining operational performance and enabling measurable improvements in data quality and regulatory compliance. The framework provides a practical migration path for organizations seeking to implement comprehensive governance capabilities without replacing existing database infrastructure investments. Full article
(This article belongs to the Section Information Systems and Data Management)
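The "differentiated access control through hierarchical API key management" named in this abstract can be illustrated with a minimal clearance-ordering check. The classification labels and key table are assumptions for the sketch, not the paper's implementation.

```python
# Minimal sketch of classification-based access control behind a central
# governance API. Labels and keys are illustrative assumptions.
CLEARANCE = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

# Hypothetical API keys mapped to their maximum clearance level.
API_KEYS = {
    "key-analyst": "internal",
    "key-dpo": "restricted",
}

def authorize(api_key: str, data_classification: str) -> bool:
    """Allow access only if the key's clearance covers the data's label."""
    level = API_KEYS.get(api_key)
    if level is None:
        return False               # unknown key: deny by default
    return CLEARANCE[level] >= CLEARANCE[data_classification]
```

Centralizing this check in one governance layer, rather than in each database's native ACLs, is what lets the same policy apply uniformly across PostgreSQL, MongoDB, and DynamoDB backends.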

28 pages, 1177 KB  
Article
Context-Aware Code Review Automation: A Retrieval-Augmented Approach
by Büşra İçöz and Göksel Biricik
Appl. Sci. 2026, 16(4), 1875; https://doi.org/10.3390/app16041875 - 13 Feb 2026
Viewed by 910
Abstract
Manual code review is essential for software quality, but often slows down development cycles due to the high time demands on developers. In this study, we propose an automated solution for Python (version 3.13) projects that generates code review comments by combining Large Language Models (LLMs) with Retrieval-Augmented Generation (RAG). To achieve this, we first curated a dataset from GitHub pull requests (PRs) using the GitHub REST Application Programming Interface (API) (version 2022-11-28) and classified comments into semantic categories using a semi-supervised Support Vector Machine (SVM) model. During the review process, our system uses a vector database to retrieve the top-k most relevant historical comments, providing context for a diverse spectrum of open-weights LLMs, including DeepSeek-Coder-33B, Qwen2.5-Coder-32B, Codestral-22B, CodeLlama-13B, Mistral-Instruct-7B, and Phi-3-Mini. We evaluated the system using a multi-step validation that combined standard metrics (BLEU-4, ROUGE-L, cosine similarity) with an LLM-as-a-Judge approach, and verified the results through targeted human review to ensure consistency with expert standards. The findings show that retrieval augmentation improves feedback relevance for larger models, with DeepSeek-Coder’s alignment score increasing by 17.9% at a retrieval depth of k = 3. In contrast, smaller models such as Phi-3-Mini suffered from context collapse, where too much context reduced accuracy. To manage this trade-off, we built a hybrid expert system that routes each task to the most suitable model. Our results indicate that the proposed approach improved performance by 13.2% compared to the zero-shot baseline (k = 0). In addition, our proposed system reduces hallucinations and generates comments that closely align with the standards expected from the experts. Full article
(This article belongs to the Special Issue Artificial Intelligence in Software Engineering)
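The retrieval step described above — fetching the top-k most similar historical comments as context — reduces to a cosine-similarity ranking. The toy vectors and comments below stand in for learned embeddings and a real vector database.

```python
# Sketch of cosine-similarity top-k retrieval over embedded review comments.
# Vectors are toy 3-d values; a real system uses learned embeddings.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

HISTORY = [
    ("avoid mutable default arguments", [0.9, 0.1, 0.0]),
    ("add a docstring to this function", [0.1, 0.9, 0.1]),
    ("close the file handle explicitly", [0.2, 0.1, 0.9]),
]

def top_k(query_vec, history, k=2):
    """Return the k most similar historical comments to the query vector."""
    ranked = sorted(history, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

context = top_k([0.85, 0.2, 0.05], HISTORY, k=2)
```

The paper's finding that small models suffer "context collapse" corresponds to choosing `k` too large for the model: more retrieved comments is not always better.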

26 pages, 5458 KB  
Article
Knowledge-Driven Human-in-the-Loop Decision Support for Student Services Using Active Learning and Large Language Models
by Anil Eyupoglu, Kian Jazayeri and Erbuğ Çelebi
Appl. Sci. 2026, 16(4), 1802; https://doi.org/10.3390/app16041802 - 11 Feb 2026
Viewed by 442
Abstract
This study presents an AI-based, human-in-the-loop decision support system designed for large-scale institutional query routing and response generation. The proposed system combines semantic text classification with large language model-based response generation to assist administrative staff in handling high-volume natural language requests from various system users, while preserving human oversight. Using a dataset of 135,359 real student and staff interactions collected over 15 years, the system was designed, deployed, and evaluated in a live university information portal. The classification component achieved 95.88% accuracy in evaluation and 82.21% staff acceptance in practice, while 94.81% of AI-generated draft responses were adopted with minor edits. Operational evaluation showed a 30.8% reduction in resolution time, a 32.6% decrease in misrouting, and an increase in user satisfaction from 3.6 to 4.9 out of 5. The system is implemented as a modular RESTful API to ensure interoperability with existing Student Information Systems, with analysis code available upon request to support replication in similar resource-constrained environments. The results illustrate how human-in-the-loop AI systems can support improvements in service quality, efficiency, and institutional capacity in resource-constrained environments, providing a transferable applied AI framework for scalable decision support in complex administrative domains. Full article
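The human-in-the-loop routing idea in this abstract can be sketched as a scorer that proposes a department and defers low-confidence cases to staff. The categories and keyword lists below are invented for illustration; the deployed system uses semantic classification, not keyword matching.

```python
# Toy sketch of confidence-gated query routing with a human fallback.
# Departments and keywords are hypothetical.
KEYWORDS = {
    "registrar": {"transcript", "enrollment", "grade"},
    "finance": {"tuition", "invoice", "refund"},
    "it-support": {"password", "portal", "login"},
}

def route(query: str, min_hits: int = 1):
    """Propose a department, or defer to human review below the threshold."""
    words = set(query.lower().split())
    scores = {dept: len(words & kws) for dept, kws in KEYWORDS.items()}
    dept, hits = max(scores.items(), key=lambda kv: kv[1])
    if hits < min_hits:
        return ("human-review", scores)   # keep a person in the loop
    return (dept, scores)

dept, _ = route("I forgot my portal password")
```

The deferral branch is what preserves staff oversight: auto-routing only happens above a confidence floor, mirroring the paper's reported gap between offline accuracy and in-practice staff acceptance.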

13 pages, 1512 KB  
Proceeding Paper
REST API Fuzzing Using API Dependencies and Large Language Models
by Chien-Hung Liu, Shu-Ling Chen and Kuang-Yao Li
Eng. Proc. 2025, 120(1), 42; https://doi.org/10.3390/engproc2025120042 - 3 Feb 2026
Viewed by 631
Abstract
With the widespread adoption of cloud services, ensuring the quality and security of the representational state transfer application programming interface (REST API) has become critical. Among various REST API testing techniques, fuzz testing stands out as a promising approach due to its ability to automatically generate large volumes of random or malformed inputs. To improve test coverage through fuzzing, we developed an enhanced method for generating API sequences and parameter values, building upon the widely used open-source tool RESTler. The approach extends RESTler by incorporating resource-level dependencies between APIs in addition to the existing producer–consumer relationships, enabling the construction of more valid API sequences. It also leverages a large language model to automatically generate parameter values. To further ensure input validity, a feedback loop is introduced to refine invalid inputs using error messages from API responses. Experimental results show that, compared to RESTler, the proposed method increases API coverage and detects more faults on average, demonstrating its effectiveness. Full article
(This article belongs to the Proceedings of 8th International Conference on Knowledge Innovation and Invention)
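The core of the sequence-generation idea above — that a request consuming a resource must come after the request producing it — is a topological ordering over a dependency graph. The endpoints and edges below are illustrative, not RESTler's internal representation.

```python
# Sketch of dependency-aware API sequence construction: order requests so
# every consumer follows its producer. Endpoints are hypothetical.
from graphlib import TopologicalSorter

# Each request maps to the set of requests it depends on (its producers).
DEPENDENCIES = {
    "POST /users": set(),                         # produces user_id
    "POST /users/{id}/orders": {"POST /users"},   # consumes user_id
    "GET /orders/{id}": {"POST /users/{id}/orders"},
    "DELETE /users/{id}": {"POST /users"},
}

sequence = list(TopologicalSorter(DEPENDENCIES).static_order())
```

Adding resource-level edges (as the paper does, on top of producer–consumer edges) simply enriches this graph, so more of the generated sequences are valid before any fuzzed values are injected.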

24 pages, 5682 KB  
Article
An Ontology-Driven Digital Twin for Hotel Front Desk: Real-Time Integration of Wearables and OCC Camera Events via a Property-Defined REST API
by Moises Segura-Cedres, Desiree Manzano-Farray, Carmen Lidia Aguiar-Castillo, Rafael Perez-Jimenez, Vicente Matus Icaza, Eleni Niarchou and Victor Guerra-Yanez
Electronics 2026, 15(3), 567; https://doi.org/10.3390/electronics15030567 - 28 Jan 2026
Viewed by 444
Abstract
This article presents an ontology-driven Digital Twin (DT) for hotel front-desk operations that fuses two real-time data streams: (i) physiological and activity signals from wrist-worn wearables assigned to staff, and (ii) 3D people-positioning and occupancy events captured by reception-area cameras using a proprietary implementation of Optical Camera Communication (OCC). Building on a previously proposed front-desk ontology, the semantic model is extended with positional events, zone semantics, and wearable-derived workload indices to estimate queue state, staff workload, and service demand in real time. A vendor-agnostic, property-based REST API specifies the DT interface in terms of observable properties, including authentication and authorization, idempotent ingestion, timestamp conventions, version negotiation, integrity protection for signed webhooks, rate limiting and backoff, pagination and filtering, and privacy-preserving identifiers, enabling any compliant backend to implement the specification. The proposed layered architecture connects ingestion, spatial reasoning, and decision services to dashboards and key performance indicators (KPIs). This article details the positioning pipeline (calibration, normalized 3D coordinates, zone mapping, and confidence handling), the wearable workload pipeline, and an evaluation protocol covering localization error, zone classification, queue-length estimation, and workload accuracy. The results indicate that a spatially aware, ontology-based DT can support more balanced staff allocation and improved guest experience while remaining technology-agnostic and privacy-conscious. Full article
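Among the API properties listed above, "integrity protection for signed webhooks" has a standard shape: the sender signs the raw request body with a shared secret, and the receiver recomputes and compares in constant time. The header convention and secret below are assumptions for the sketch.

```python
# Sketch of HMAC-signed webhook verification. The shared secret is assumed
# to be provisioned out of band; header naming is illustrative.
import hmac
import hashlib

SECRET = b"demo-shared-secret"

def sign(body: bytes) -> str:
    """Compute the hex HMAC-SHA256 signature the sender would attach."""
    return hmac.new(SECRET, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature_header: str) -> bool:
    """Constant-time comparison prevents timing side channels."""
    return hmac.compare_digest(sign(body), signature_header)

body = b'{"event":"zone_occupancy","zone":"front-desk","count":3}'
sig = sign(body)
```

Because the signature covers the exact bytes received, any tampering with the occupancy payload in transit invalidates it.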

27 pages, 7306 KB  
Article
Design and Implementation of the AquaMIB Unmanned Surface Vehicle for Real-Time GIS-Based Spatial Interpolation and Autonomous Water Quality Monitoring
by Huseyin Duran and Namık Kemal Sonmez
Appl. Sci. 2026, 16(3), 1209; https://doi.org/10.3390/app16031209 - 24 Jan 2026
Viewed by 401
Abstract
This article introduces the design and implementation of an Unmanned Surface Vehicle (USV), named “AquaMIB”, which introduces a novel and integrated approach for real-time and autonomous water quality monitoring in aquatic environments. The system integrates modular hardware and software, combining sensors for temperature, pH, conductivity, dissolved oxygen, and oxidation reduction potential with GPS, LiDAR, a digital compass, communication modules, and a dedicated power unit. Software components include Python on a Raspberry Pi for navigation and control, C on an Atmega 324P for sensing, C++ on an Arduino Uno for remote control, and C#/JavaScript for the web-based control center. Users assign task points, and the USV autonomously navigates, collects data, and transmits it via RESTful API. Field trials showed 96.5% navigation accuracy over 2.2 km, with 66% of task points reached within 3 m. A total of 120 measurements were processed in real time and visualized as GIS-based spatial maps. The system demonstrates a cost-effective, modular solution for aquatic monitoring. The system’s ability to generate real-time GIS maps enables immediate identification of environmental anomalies, transforming raw sensor data into an actionable decision-support tool for aquatic management. Full article
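The "GIS-based spatial interpolation" step — turning scattered task-point measurements into a continuous map — can be sketched with inverse distance weighting (IDW), a common GIS choice; the abstract does not name the exact method used, and the sample readings below are synthetic.

```python
# Sketch of inverse distance weighting (IDW) over point measurements.
# Sample values are synthetic; the paper's actual method may differ.
def idw(x, y, samples, power=2.0):
    """Estimate a value at (x, y) from (xi, yi, value) samples."""
    num = den = 0.0
    for xi, yi, v in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0:
            return v                    # exactly on a measurement point
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Synthetic dissolved-oxygen readings (mg/L) at three task points.
SAMPLES = [(0.0, 0.0, 8.0), (10.0, 0.0, 6.0), (0.0, 10.0, 7.0)]
estimate = idw(5.0, 0.0, SAMPLES)
```

Each of the 120 real-time measurements would contribute one sample point, and the map is the estimate evaluated over a grid of coordinates.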

15 pages, 1613 KB  
Article
Exploring the Cognitive Capabilities of Large Language Models in Autonomous and Swarm Navigation Systems
by Dawid Ewald, Filip Rogowski, Marek Suśniak, Patryk Bartkowiak and Patryk Blumensztajn
Electronics 2026, 15(1), 35; https://doi.org/10.3390/electronics15010035 - 22 Dec 2025
Viewed by 864
Abstract
The rapid evolution of autonomous vehicles necessitates increasingly sophisticated cognitive capabilities to handle complex, unstructured environments. This study explores the cognitive potential of Large Language Models (LLMs) in autonomous navigation and swarm control systems, addressing the limitations of traditional rule-based approaches. The research investigates whether multimodal LLMs, specifically a customized version of LLaVA 7B (Large Language and Vision Assistant), can serve as a central decision-making unit for autonomous vehicles equipped with cameras and distance sensors. The developed prototype integrates a Raspberry Pi module for data acquisition and motor control with a main computational unit running the LLM via the Ollama platform. Communication between modules combines REST API for sensory data transfer and TCP sockets for real-time command exchange. Without fine-tuning, the system relies on advanced prompt engineering and context management to ensure consistent reasoning and structured JSON-based control outputs. Experimental results demonstrate that the model can interpret real-time visual and distance data to generate reliable driving commands and descriptive situational reasoning. These findings suggest that LLMs possess emerging cognitive abilities applicable to real-world robotic navigation and lay the groundwork for future swarm systems capable of cooperative exploration and decision-making in dynamic environments. These insights are particularly valuable for researchers in swarm robotics and developers of edge-AI systems seeking efficient, multimodal navigation solutions. Full article
(This article belongs to the Special Issue Data-Centric Artificial Intelligence: New Methods for Data Processing)
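The "structured JSON-based control outputs" mentioned above imply a validation step: the model's free-form reply must parse and conform to a command schema before any motor command is issued. The field names, action set, and speed range below are assumptions for illustration.

```python
# Sketch of validating an LLM reply against a small control-command schema
# before acting on it. Schema fields and ranges are hypothetical.
import json

ALLOWED_ACTIONS = {"forward", "reverse", "left", "right", "stop"}

def parse_command(reply: str):
    """Return a validated command dict, or None if the reply is unusable."""
    try:
        cmd = json.loads(reply)
    except json.JSONDecodeError:
        return None                       # free text, not structured output
    if not isinstance(cmd, dict) or cmd.get("action") not in ALLOWED_ACTIONS:
        return None
    speed = cmd.get("speed", 0)
    if not isinstance(speed, (int, float)) or not 0 <= speed <= 100:
        return None
    return {"action": cmd["action"], "speed": speed}

ok = parse_command('{"action": "forward", "speed": 40, "reason": "path clear"}')
bad = parse_command("turn left now")
```

Rejecting anything outside the schema is what makes prompt-engineered (rather than fine-tuned) control safe enough to drive actuators.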

27 pages, 1460 KB  
Article
Multimodal Cognitive Architecture with Local Generative AI for Industrial Control of Concrete Plants on Edge Devices
by Fernando Hidalgo-Castelo, Antonio Guerrero-González, Francisco García-Córdova, Francisco Lloret-Abrisqueta and Carlos Torregrosa Bonet
Sensors 2025, 25(24), 7540; https://doi.org/10.3390/s25247540 - 11 Dec 2025
Viewed by 1226
Abstract
Accessing operational information across industrial systems (ERP, MES, SCADA, PLC) in concrete plants requires 15–30 min and specialized knowledge. This work addresses this accessibility gap by developing a conversational AI system that democratizes industrial information access through natural language. A five-layer cognitive architecture was implemented integrating the Mistral-7B model quantized in GGUF Q4_0 format (3.82 GB) on a Raspberry Pi 5, Spanish speech recognition/synthesis, and heterogeneous industrial protocols (OPC UA, MQTT, REST API) across all automation pyramid levels. Experimental validation at Frumecar S.L. (Murcia, Spain) characterized performance, thermal stability, and reliability. Results show response times of 14.19 s (simple queries, SD = 7.56 s), 16.45 s (moderate, SD = 6.40 s), and 23.24 s (complex multilevel, SD = 6.59 s), representing 26–77× improvement over manual methods. The system maintained average temperature of 69.3 °C (peak 79.6 °C), preserving 5.4 °C margin below throttling threshold. Communication latencies averaged 8.93 ms across 10,163 readings (<1% of total latency). During 30 min of autonomous operation, 100% reliability was achieved with 39 successful queries. These findings demonstrate the viability of deploying quantized LLMs on low-cost edge hardware, enabling cognitive democratization of industrial information while ensuring data privacy and cloud independence. Full article

19 pages, 912 KB  
Article
Lightweight Embedded IoT Gateway for Smart Homes Based on an ESP32 Microcontroller
by Filippos Serepas, Ioannis Papias, Konstantinos Christakis, Nikos Dimitropoulos and Vangelis Marinakis
Computers 2025, 14(9), 391; https://doi.org/10.3390/computers14090391 - 16 Sep 2025
Cited by 4 | Viewed by 4977
Abstract
The rapid expansion of the Internet of Things (IoT) demands scalable, efficient, and user-friendly gateway solutions that seamlessly connect resource-constrained edge devices to cloud services. Low-cost, widely available microcontrollers, such as the ESP32 and its ecosystem peers, offer integrated Wi-Fi/Bluetooth connectivity, low power consumption, and a mature developer toolchain at a bill of materials cost of only a few dollars. For smart-home deployments where budgets, energy consumption, and maintainability are critical, these characteristics make MCU-class gateways a pragmatic alternative to single-board computers, enabling always-on local control with minimal overhead. This paper presents the design and implementation of an embedded IoT gateway powered by the ESP32 microcontroller. By using lightweight communication protocols such as Message Queuing Telemetry Transport (MQTT) and REST APIs, the proposed architecture supports local control, distributed intelligence, and secure on-site data storage, all while minimizing dependence on cloud infrastructure. A real-world deployment in an educational building demonstrates the gateway’s capability to monitor energy consumption, execute control commands, and provide an intuitive web-based dashboard with minimal resource overhead. Experimental results confirm that the solution offers strong performance, with RAM usage ranging between 3.6% and 6.8% of available memory (approximately 8.92 KB to 16.9 KB). The initial loading of the single-page application (SPA) results in a temporary RAM spike to 52.4%, which later stabilizes at 50.8%. These findings highlight the ESP32’s ability to serve as a functional IoT gateway with minimal resource demands. Areas for future optimization include improved device discovery mechanisms and enhanced resource management to prolong device longevity. Overall, the gateway represents a cost-effective and vendor-agnostic platform for building resilient and scalable IoT ecosystems. Full article
(This article belongs to the Section Internet of Things (IoT) and Industrial IoT)
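Local MQTT routing on a gateway like the one described rests on topic-filter matching, where `+` matches one level and `#` matches all remaining levels per the MQTT specification. The device topics below are illustrative.

```python
# Sketch of MQTT topic-filter matching for a gateway's local routing table,
# implementing the standard `+` and `#` wildcard rules. Topics are examples.
def topic_matches(pattern: str, topic: str) -> bool:
    p_parts = pattern.split("/")
    t_parts = topic.split("/")
    for i, p in enumerate(p_parts):
        if p == "#":
            return True                  # matches this level and all below
        if i >= len(t_parts):
            return False
        if p != "+" and p != t_parts[i]:
            return False
    return len(p_parts) == len(t_parts)
```

A subscription table keyed by such patterns lets the ESP32 dispatch sensor readings and control commands locally, without a round trip to the cloud.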

16 pages, 2074 KB  
Article
Benchmarking Control Strategies for Multi-Component Degradation (MCD) Detection in Digital Twin (DT) Applications
by Atuahene Kwasi Barimah, Akhtar Jahanzeb, Octavian Niculita, Andrew Cowell and Don McGlinchey
Computers 2025, 14(9), 356; https://doi.org/10.3390/computers14090356 - 29 Aug 2025
Viewed by 996
Abstract
Digital Twins (DTs) have become central to intelligent asset management within Industry 4.0, enabling real-time monitoring, diagnostics, and predictive maintenance. However, implementing Prognostics and Health Management (PHM) strategies within DT frameworks remains a significant challenge, particularly in systems experiencing multi-component degradation (MCD). MCD occurs when several components degrade simultaneously or in interaction, complicating detection and isolation processes. Traditional data-driven fault detection models often require extensive historical degradation data, which is costly, time-consuming, or difficult to obtain in many real-world scenarios. This paper proposes a model-based, control-driven approach to MCD detection, which reduces the need for large training datasets by leveraging reference tracking performance in closed-loop control systems. We benchmark the accuracy of four control strategies—Proportional-Integral (PI), Linear Quadratic Regulator (LQR), Model Predictive Control (MPC), and a hybrid model—within a Digital Twin-enabled hydraulic system testbed comprising multiple components, including pumps, valves, nozzles, and filters. The control strategies are evaluated under various MCD scenarios for their ability to accurately detect and isolate degradation events. Simulation results indicate that the hybrid model consistently outperforms the individual control strategies, achieving an average accuracy of 95.76% under simultaneous pump and nozzle degradation scenarios. The LQR model also demonstrated strong predictive performance, especially in identifying degradation in components such as nozzles and pumps. Also, the sequence and interaction of faults were found to influence detection accuracy, highlighting how the complexities of fault sequences affect the performance of diagnostic strategies. 
This work contributes to PHM and DT research by introducing a scalable, data-efficient methodology for MCD detection that integrates seamlessly into existing DT architectures using containerized RESTful APIs. By shifting from data-dependent to model-informed diagnostics, the proposed approach enhances early fault detection capabilities and reduces deployment timelines for real-world DT-enabled PHM applications. Full article
(This article belongs to the Section Internet of Things (IoT) and Industrial IoT)
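The control-driven detection idea above — degradation shows up as degraded reference tracking in the closed loop — can be illustrated with a toy PI loop on a first-order plant, where a loss of actuator gain inflates recent tracking error. Gains, plant model, and the degradation mechanism are illustrative, not the benchmarked testbed models.

```python
# Toy sketch: PI reference tracking as a degradation indicator. A drop in
# plant gain (simulated degradation) raises sustained tracking error.
def mean_recent_error(steps, plant_gain, kp=0.8, ki=0.3,
                      setpoint=1.0, dt=0.1):
    y = integral = 0.0
    errors = []
    for _ in range(steps):
        e = setpoint - y
        integral += e * dt
        u = kp * e + ki * integral       # PI control law
        y += dt * (plant_gain * u - y)   # first-order plant, unit lag
        errors.append(abs(e))
    return sum(errors[-20:]) / 20        # mean of the recent window

healthy = mean_recent_error(300, plant_gain=1.0)
degraded = mean_recent_error(300, plant_gain=0.3)  # simulated degradation
```

Thresholding such a residual replaces the large historical fault datasets that data-driven detectors need, which is the data-efficiency argument the paper makes.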

20 pages, 1920 KB  
Article
Management of Virtualized Railway Applications
by Ivaylo Atanasov, Evelina Pencheva and Kamelia Nikolova
Information 2025, 16(8), 712; https://doi.org/10.3390/info16080712 - 21 Aug 2025
Cited by 1 | Viewed by 1260
Abstract
Robust, reliable, and secure communications are essential for efficient railway operation and keeping employees and passengers safe. The Future Railway Mobile Communication System (FRMCS) is a global standard aimed at providing innovative, essential, and high-performance communication applications in railway transport. In comparison with the legacy communication system (GSM-R), it provides high data rates, ultra-high reliability, and low latency. The FRMCS architecture will also benefit from cloud computing, following the principles of the cloud-native 5G core network design based on Network Function Virtualization (NFV). In this paper, an approach to the management of virtualized FRMCS applications is presented. First, the key management functionality related to the virtualized FRMCS application is identified based on an analysis of the different use cases. Next, this functionality is synthesized as RESTful services. The communication between application management and the services is designed as Application Programming Interfaces (APIs). The APIs are formally verified by modeling the management states of an FRMCS application instance from different points of view, and it is mathematically proved that the management state models are synchronized in time. The latency introduced by the designed APIs, as a key performance indicator, is evaluated through emulation. Full article
(This article belongs to the Section Information Applications)
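The abstract describes modeling the management states of a virtualized application instance and verifying that state models stay synchronized. A minimal sketch of such a lifecycle state machine is shown below; the state and operation names are illustrative assumptions loosely following ETSI NFV conventions, not taken from the paper itself.

```python
# Illustrative lifecycle state machine for a virtualized application
# instance. State and operation names are assumptions (ETSI NFV style),
# not the paper's actual model.
from enum import Enum

class InstanceState(Enum):
    NOT_INSTANTIATED = "NOT_INSTANTIATED"
    INSTANTIATED = "INSTANTIATED"
    STARTED = "STARTED"
    STOPPED = "STOPPED"

# Allowed transitions keyed by (current state, management operation).
TRANSITIONS = {
    (InstanceState.NOT_INSTANTIATED, "instantiate"): InstanceState.INSTANTIATED,
    (InstanceState.INSTANTIATED, "start"): InstanceState.STARTED,
    (InstanceState.STARTED, "stop"): InstanceState.STOPPED,
    (InstanceState.STOPPED, "start"): InstanceState.STARTED,
    (InstanceState.INSTANTIATED, "terminate"): InstanceState.NOT_INSTANTIATED,
    (InstanceState.STOPPED, "terminate"): InstanceState.NOT_INSTANTIATED,
}

class AppInstance:
    """Tracks one application instance; rejects invalid transitions."""
    def __init__(self):
        self.state = InstanceState.NOT_INSTANTIATED

    def apply(self, operation: str) -> InstanceState:
        key = (self.state, operation)
        if key not in TRANSITIONS:
            raise ValueError(f"{operation!r} not allowed in {self.state.name}")
        self.state = TRANSITIONS[key]
        return self.state

inst = AppInstance()
inst.apply("instantiate")
inst.apply("start")
print(inst.state.name)  # STARTED
```

In a RESTful design, each operation would map to an API call on the instance resource, and formal verification amounts to checking that every reachable (state, operation) pair is covered by the transition table.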
15 pages, 3602 KB  
Article
Remote Monitoring and Energy Grade Evaluation for Water-Based Centrifugal Pumps Based on Browser/Server Architecture
by Shenlong Gao, Mengjiao Zhao, Jingming Liu, Qiang Huang, Yang Liu, Jie Liu and Tie Sun
Processes 2025, 13(8), 2650; https://doi.org/10.3390/pr13082650 - 21 Aug 2025
Cited by 1 | Viewed by 1004
Abstract
This study presents an online evaluation system for the energy efficiency grade of centrifugal pump units using a Browser/Server architecture. The system employs direct calculation and characteristic curve fitting methods to evaluate efficiency, with corrections for viscous fluids. It utilizes Java 20, Spring Boot 2.7.x, HTML5, CSS3, Ajax, and RESTful API technologies for real-time monitoring and evaluation. The system has undergone rigorous testing and full-scale deployment within a petrochemical facility. As demonstrated herein, it delivers exceptional stability and precision, cutting evaluation time substantially while markedly enhancing energy-conservation performance. Full article
(This article belongs to the Section Process Control and Monitoring)
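The "direct calculation" method mentioned in the abstract can be sketched as hydraulic power over shaft power. The grade thresholds below are illustrative placeholders, not the limits from any efficiency standard or from the paper.

```python
# Hedged sketch of direct pump-efficiency calculation and grading.
# Grade thresholds are assumptions for illustration only.
G = 9.81  # gravitational acceleration, m/s^2

def pump_efficiency(density_kg_m3, flow_m3_s, head_m, shaft_power_w):
    """Efficiency as hydraulic power (rho * g * Q * H) over shaft power."""
    hydraulic_power = density_kg_m3 * G * flow_m3_s * head_m
    return hydraulic_power / shaft_power_w

def energy_grade(efficiency):
    """Map efficiency to an illustrative grade (thresholds assumed)."""
    if efficiency >= 0.75:
        return 1
    if efficiency >= 0.65:
        return 2
    return 3

# Example unit: water (1000 kg/m^3), 0.05 m^3/s, 32 m head, 20 kW shaft power.
eta = pump_efficiency(density_kg_m3=1000.0, flow_m3_s=0.05,
                      head_m=32.0, shaft_power_w=20_000.0)
print(round(eta, 4), energy_grade(eta))  # 0.7848 1
```

In the described system such a calculation would run server-side, with the browser polling measured flow, head, and power via the RESTful API.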

36 pages, 6099 KB  
Article
RestRho: A JSON-Based Domain-Specific Language for Designing and Developing RESTful APIs to Validate RhoArchitecture
by Enrique Chavarriaga, Luis Rojas, Francy D. Rodríguez, Kat Sorbello and Francisco Jurado
Future Internet 2025, 17(8), 346; https://doi.org/10.3390/fi17080346 - 31 Jul 2025
Viewed by 3166
Abstract
Domain-Specific Languages with JSON grammar (JSON-DSLs) are specialized programming languages tailored to specific problem domains, offering higher abstraction levels and simplifying software implementation through the JSON standard. RhoArchitecture is an approach for designing and executing JSON-DSLs, incorporating a modular programming model, a JSON-based evaluation engine, and an integrated web development environment. This paper presents RestRho, a RESTful NodeJS server developed using two JSON-DSLs designed with RhoArchitecture: SQLRho and DBRestRho. These languages enable declarative specification of database operations and HTTP requests, respectively, supporting modularity, reuse, and template-based transformations. We validate the RestRho implementation through a dual approach. First, we apply software metrics to assess code quality, maintainability, and complexity. Second, we conduct an empirical study involving 39 final-year computer engineering students, who completed 18 structured tasks and provided feedback via questionnaires. The results demonstrate the tool’s usability, development efficiency, and potential for adoption in web application development. Full article
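To make the idea of a JSON-DSL for REST/database work concrete, here is a minimal sketch of a declarative endpoint specification and a tiny binder for it. The field names and placeholder syntax are illustrative assumptions, not the actual SQLRho or DBRestRho grammar.

```python
import json

# Hypothetical JSON-DSL document in the spirit of DBRestRho: a REST
# route declaratively bound to a parameterised SQL template.
spec = json.loads("""
{
  "route": {"method": "GET", "path": "/users/{id}"},
  "sql": "SELECT id, name FROM users WHERE id = :id"
}
""")

def bind(spec, params):
    """Substitute :name placeholders in the SQL template.
    Illustrative only; a real engine would use driver-level
    parameter binding rather than string substitution."""
    sql = spec["sql"]
    for key, value in params.items():
        sql = sql.replace(f":{key}", repr(value))
    return spec["route"]["method"], spec["route"]["path"], sql

method, path, sql = bind(spec, {"id": 7})
print(method, path)  # GET /users/{id}
print(sql)           # SELECT id, name FROM users WHERE id = 7
```

The point of such a DSL is that the JSON document, not imperative code, carries the endpoint definition, so it can be validated against a schema, transformed by templates, and reused across modules.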
