Search Results (1,423)

Search Parameters:
Keywords = IEC-61499

21 pages, 4270 KB  
Article
Influence of the Shape of Power Supply Waveform on Power Quality and Optical Parameters of Selected Light Sources
by Przemysław Ptak, Tadeusz Lorkowski and Krzysztof Górecki
Energies 2026, 19(9), 2209; https://doi.org/10.3390/en19092209 (registering DOI) - 2 May 2026
Abstract
The article describes the results of research on the power supply quality of selected fluorescent lamps and solid-state light sources powered by voltages with different waveforms and supply voltage values. The power factor, total harmonic distortion (THD) factor, and values of individual harmonics were measured, and their compliance with international standards was assessed. The measurement set-up used and the measurement results obtained with it are described. The experimental results showed that the light sources under consideration did not meet the criteria specified in international standards for the THD factor and the values of individual harmonics, regardless of the shape of the supply voltage waveform. The current THD always exceeded 44%, well above the upper limit of 23% specified in the IEC 61000-3-2:2018 standard. The third-harmonic values also far exceeded 21.6% of the fundamental, the limit specified in that standard. However, it was shown that supplying some light sources with a triangular voltage waveform can increase the illuminance value by up to 28%, while a rectangular voltage waveform increases the power factor and decreases reactive power.
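As an illustrative aside, the THD check this abstract refers to reduces to a short calculation. The sketch below uses only the limit values quoted in the abstract (23% THD, third harmonic at most 21.6% of the fundamental); the harmonic amplitudes are invented for illustration, not measured data.

```python
import math

def thd(harmonics):
    """Total harmonic distortion: RMS of harmonics 2..N over the fundamental.

    `harmonics` maps harmonic order (1 = fundamental) to current amplitude.
    """
    fundamental = harmonics[1]
    distortion = math.sqrt(sum(a * a for n, a in harmonics.items() if n > 1))
    return distortion / fundamental

# Invented amplitudes (arbitrary units), not measured data.
spectrum = {1: 1.00, 3: 0.45, 5: 0.30, 7: 0.15}

thd_pct = 100 * thd(spectrum)                # ~56.1%, above the 23% limit
third_pct = 100 * spectrum[3] / spectrum[1]  # 45.0%, above the 21.6% limit

print(f"THD = {thd_pct:.1f}%  |  3rd harmonic = {third_pct:.1f}% of fundamental")
```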
21 pages, 506 KB  
Article
Cybersecurity Risk Mitigation in Digital Substations Based on a Control Model for Communication Systems: An Experimental Validation
by Oscar A. Tobar-Rosero, Ivar F. Gomez-Pedraza, John E. Candelo-Becerra, Juan D. Grajales-Bustamante and Fredy E. Hoyos
Automation 2026, 7(3), 68; https://doi.org/10.3390/automation7030068 - 30 Apr 2026
Abstract
The increasing digitalization of electrical substations, enabled by IEC 61850-based architectures, has improved operational efficiency while expanding the cyber attack surface. This paper introduces a standards-aligned cybersecurity risk mitigation model specifically designed for digital substations and mapped to representative attack scenarios. The model integrates preventive, detective, and application-level controls derived from NIST SP 800-82r3, IEC 62443, and ISO/IEC 27019, and is validated in a laboratory process-bus environment. A baseline risk assessment identified four high-risk scenarios in the studied digital substation architecture. For validation, a selected subset of controls was experimentally evaluated against two representative attack vectors, namely false data injection (FDI) on GOOSE messages and denial-of-service (DoS) against PTP synchronization. For the remaining scenarios, the post-mitigation effects were reassessed analytically based on control coverage, architectural exposure, and standards-aligned cybersecurity reasoning. The experimental validation demonstrated that both empirically tested high-risk scenarios (FDI on GOOSE and DoS on PTP) were effectively mitigated, reducing their residual risk to moderate and low levels, respectively. For the remaining two scenarios, a post-mitigation analytical reassessment based on control coverage and architectural exposure suggested a consistent risk reduction trend, although without direct experimental confirmation. Under this combined empirical–analytical assessment, the number of high-risk scenarios decreased from four to one, corresponding to a 50% experimentally validated reduction in high-risk exposure, complemented by an analytical reassessment of the remaining scenarios. These results provide quantitative evidence about the effectiveness of the model, even with partial implementation. 
The scientific contribution of this study lies in integrating multistandard cybersecurity requirements into an operational mitigation model tailored to IEC 61850 substations, combined with experimental risk quantification in a realistic process-bus testbed. The proposed model offers practical guidance for utilities and establishes a scalable foundation for advancing cybersecurity in critical power infrastructure.
(This article belongs to the Special Issue Substation Automation, Protection and Control Based on IEC 61850)

44 pages, 856 KB  
Article
A GPT-Based Assessment of Alignment Between Privacy Legal Frameworks & ISO/IEC 27701:2025: A Latin American Case Study
by David Cevallos-Salas, José Estrada-Jiménez and Danny S. Guamán
Technologies 2026, 14(5), 273; https://doi.org/10.3390/technologies14050273 - 30 Apr 2026
Abstract
The 2025 update of the International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) 27701 standard offers a major advantage by enabling organizations to implement a Privacy Information Management System (PIMS) autonomously while maintaining alignment with the General Data Protection Regulation (GDPR). However, it remains unclear to what extent privacy legal frameworks in developing jurisdictions, particularly in Latin American countries, align with this new standard. At the same time, the traditional method for assessing the alignment between privacy legal frameworks and ISO/IEC 27701 continues to rely on manual mapping between the standard’s subclauses and privacy regulatory articles, a process that is time-consuming, costly, and error-prone. More critically, no method exists to quantitatively assess the reliability of such mappings, leaving alignment assessments largely subjective. To address these limitations, this paper proposes a novel method based on an OpenAI Generative Pre-trained Transformer (GPT) combined with a Chain-of-Thought (CoT) reasoning strategy to quantitatively assess the alignment between privacy legal frameworks and ISO/IEC 27701:2025. By leveraging GPT’s logarithmic probabilities (logprobs) and the standard’s subclause definitions as classification categories, the method enables confidence-based evaluation of legal–standard alignment. The proposed method is then applied to analyze the privacy legal frameworks of Paraguay, Chile, Ecuador, México, Colombia, and Perú, examining how effectively they promote the standard’s guidelines. A suitable confidence threshold is then selected by assessing the GDPR and comparing the results with the reference mappings reported in Annex D of the standard. 
Finally, the method identifies the number of compliant subclauses per clause, the regulatory articles influencing the resulting logprobs, and the underlying privacy gaps responsible for reduced alignment across the analyzed privacy legal frameworks. Overall, our results indicate that while Latin American privacy legal frameworks mandate protective measures by promoting suitable operation and continuous improvement of a PIMS, they do not explicitly demand adequate risk management and sufficient preventive safeguards for citizens’ Personally Identifiable Information (PII) in dynamic contexts.
(This article belongs to the Section Information and Communication Technologies)

34 pages, 3896 KB  
Article
A Fuzzy AHP-Based Framework for Assessing Cybersecurity Readiness in Smart Circular Economy Systems Aligned with ISO/IEC 27001
by Seyedeh Azadeh Alavi-Borazjani and Muhammad Noman Shafique
Information 2026, 17(5), 429; https://doi.org/10.3390/info17050429 - 29 Apr 2026
Abstract
The increasing digitalization of smart circular economy (CE) systems intensifies reliance on interconnected cyber-physical infrastructures, thereby increasing exposure to cybersecurity risks that may affect operational continuity and regulatory compliance. This study proposes a Fuzzy Analytical Hierarchy Process (Fuzzy AHP)-based framework to systematically assess cybersecurity readiness in alignment with the ISO/IEC 27001:2022 Information Security Management System (ISMS) standard. The framework adopts a structured three-level hierarchy consisting of seven main criteria and 39 sub-criteria, derived from ISO/IEC 27001:2022 clause-based requirements and Annex A control families, and expanded with an additional regulatory criterion based on the Cyber Resilience Act (CRA) Requirements Standards Mapping. Expert judgments from ten specialists in cybersecurity and digital systems were elicited using linguistic assessments and converted into triangular fuzzy numbers to compute priority weights under uncertainty. The results indicate that ISMS governance and organizational context are the most influential determinants of cybersecurity readiness, followed by regulatory and compliance alignment, operational oversight, and technological controls, while organizational, human, and physical controls play supportive roles. Consistency and sensitivity analyses confirm the robustness and stability of the weighting structure. Overall, the framework provides a standards-aligned decision-support tool for prioritizing cybersecurity readiness in digitally intensive CE environments.
(This article belongs to the Special Issue Digital Technology and Cyber Security)
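The Fuzzy AHP step this abstract describes — linguistic judgments converted to triangular fuzzy numbers (TFNs), then priority weights computed under uncertainty — can be sketched minimally as follows. This is a generic Buckley-style geometric-mean variant with an invented linguistic scale and hypothetical judgments, not the authors' exact procedure.

```python
from math import prod

# One conventional triangular-fuzzy linguistic scale (l, m, u);
# the paper's actual scale may differ.
SCALE = {"equal": (1, 1, 1), "moderate": (2, 3, 4), "strong": (4, 5, 6)}

def recip(t):
    """Reciprocal TFN, used for the lower triangle of the comparison matrix."""
    l, m, u = t
    return (1 / u, 1 / m, 1 / l)

def geo_mean(tfns):
    """Component-wise geometric mean of TFNs (Buckley's method)."""
    n = len(tfns)
    return tuple(prod(t[i] for t in tfns) ** (1 / n) for i in range(3))

def centroid(t):
    """Defuzzify a TFN to a crisp value by its centroid."""
    return sum(t) / 3

# Hypothetical pairwise comparisons for three criteria (C1, C2, C3).
matrix = [
    [SCALE["equal"],           SCALE["moderate"],         SCALE["strong"]],
    [recip(SCALE["moderate"]), SCALE["equal"],            SCALE["moderate"]],
    [recip(SCALE["strong"]),   recip(SCALE["moderate"]),  SCALE["equal"]],
]

crisp = [centroid(geo_mean(row)) for row in matrix]
weights = [c / sum(crisp) for c in crisp]  # normalized priority weights
```

With these judgments, C1 receives the largest weight and C3 the smallest, mirroring how dominant criteria emerge from the pairwise matrix.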
18 pages, 2441 KB  
Article
Energy Measurement Characteristics of Electricity Meters with Different Input Configurations Under IEC 61000-4-19-Based Conducted Disturbances
by Grzegorz Sadkowski and Andrzej Bień
Sensors 2026, 26(9), 2781; https://doi.org/10.3390/s26092781 - 29 Apr 2026
Abstract
The influence of conducted electromagnetic disturbances on the energy measurement error of electricity meters remains insufficiently explored in electromagnetic compatibility (EMC) studies, particularly in the frequency range above the classical harmonic domain. The aim of this study was to investigate the susceptibility of electricity meters with different input configurations to conducted disturbances in the frequency range 1 kHz–150 kHz, including the 2 kHz–150 kHz band covered by IEC 61000-4-19. The novelty of this work lies in the comparative analysis of meters employing a shunt resistor, current transformer, Rogowski coil, Hall-effect sensor, and a digital system based on a Merging Unit and Sampled Values. The tests were performed as a preliminary screening stage of the study, using a test procedure based on IEC 61000-4-19, in which the energy measurement error was determined from the difference between the energy measured by the meter under test and that measured by a reference meter while disturbances were injected into the current circuit. The results showed that, for most of the tested electronic meters, the influence of the disturbances at the applied 1 A level was limited. For the tested Class B meters, the observed maximum error deviations remained below 1%, while the largest deviation under continuous-wave disturbance was observed for the Merging Unit + SV system. The highest immunity to amplitude-modulated disturbances was found for the shunt-based meter and the Rogowski coil-based meter. In none of the investigated cases were large error deviations on the order of several percent or several tens of percent observed. The obtained results indicate that, under the applied test conditions, conducted disturbances in the investigated frequency range did not cause significant deterioration of the metrological performance of most of the analyzed electronic meters. 
However, the results should be interpreted as a comparative assessment under modified IEC-based test conditions, rather than as a full compliance evaluation at the normative disturbance levels for directly connected meters.
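For orientation, the energy measurement error the abstract refers to is a simple relative difference between the meter under test and the reference meter; a minimal sketch with hypothetical energy readings:

```python
def energy_error_percent(e_dut, e_ref):
    """Relative energy measurement error (%) of the meter under test (DUT)
    against a reference meter, in the spirit of the IEC 61000-4-19-based
    procedure the abstract describes."""
    return 100 * (e_dut - e_ref) / e_ref

# Hypothetical energies (Wh) registered during one disturbance run.
err = energy_error_percent(1003.2, 1000.0)  # ~0.32%, within the <1% band reported
```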
23 pages, 416 KB  
Article
Formal Integration of ISO/IEC Digital Twin Standards: A Layered Compliance Model with Uncertainty Quantification
by George Balan, Elena Serea, Alexandru Sălceanu and Dorin-Dumitru Lucache
Mathematics 2026, 14(9), 1425; https://doi.org/10.3390/math14091425 - 23 Apr 2026
Abstract
Digital Twin (DT) implementations in electrical and industrial systems are governed by fragmented ISO/IEC and IEC standards spanning terminology, architecture, interoperability, lifecycle management, and cybersecurity. This paper proposes a mathematical framework that integrates these standards into a unified compliance model. A layered DT architecture is defined as a finite set of functional abstractions, and standards are linked to layers through a multivalued mapping and an incidence matrix. Traceability, interoperability, fidelity, and security/governance indicators are normalized and aggregated through a bounded weighted functional to obtain a deterministic compliance score. The model is then extended by treating selected indicators as random variables, which enables probabilistic maturity classification and Monte Carlo-based robustness analysis. The resulting functional is bounded, monotone, and stable under bounded perturbations. Numerical experiments on a synthetic portfolio illustrate deterministic scoring and uncertainty effects. The framework provides a proof-of-concept basis for structured DT compliance assessment across heterogeneous electrical systems; however, broader empirical validation is still required before operational deployment.
(This article belongs to the Special Issue Mathematical Applications in Electrical Engineering, 2nd Edition)
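The bounded weighted functional and Monte Carlo robustness analysis the abstract describes can be illustrated with a generic convex combination of normalized indicators. The weights, indicator values, and the ±0.05 perturbation band below are invented for illustration and stand in for the paper's actual functional.

```python
import random

def compliance_score(indicators, weights):
    """Bounded weighted aggregation of normalized indicators in [0, 1];
    a generic convex combination standing in for the paper's functional."""
    return sum(w * x for w, x in zip(weights, indicators))

# Invented indicator values (traceability, interoperability, fidelity,
# security/governance) and invented weights summing to 1.
weights = [0.3, 0.3, 0.2, 0.2]
point = [0.8, 0.7, 0.9, 0.6]
score = compliance_score(point, weights)  # deterministic score: 0.75

# Monte Carlo robustness: perturb each indicator uniformly within +/-0.05
# (clipped to [0, 1]) and observe the spread of the resulting scores.
random.seed(0)
samples = [
    compliance_score(
        [min(1.0, max(0.0, x + random.uniform(-0.05, 0.05))) for x in point],
        weights,
    )
    for _ in range(10_000)
]
mean = sum(samples) / len(samples)  # stays close to the deterministic score
spread = max(samples) - min(samples)  # bounded by the weighted perturbation band
```

Because the functional is a convex combination, bounded input perturbations translate into bounded score perturbations, which is the stability property the abstract claims.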
18 pages, 9312 KB  
Article
Load-Predictive Pitch Control Strategy for Wind Turbines Under Turbulent Wind Conditions Based on Long Short-Term Memory Neural Networks
by Daorina Bao, Peng Li, Jun Zhang, Zhongyu Shi, Yongshui Luo, Xiaohu Ao, Ruijun Cui and Xiaodong Guo
Energies 2026, 19(9), 2044; https://doi.org/10.3390/en19092044 - 23 Apr 2026
Abstract
Under turbulent wind conditions, rapid wind speed fluctuations can markedly increase the fatigue loads borne by wind turbine blades and towers. In practice, conventional PID pitch control based on speed feedback often struggles to deliver satisfactory load mitigation, mainly because the wind turbine system is highly nonlinear, strongly coupled, and subject to time-delay effects. To overcome these limitations, this paper proposes a load-predictive pitch control strategy built on a Long Short-Term Memory (LSTM) network. Specifically, the LSTM model is first employed to predict the hub-fixed tilt and yaw moments ahead of time. These predicted values are then introduced as feedforward signals and combined with the conventional speed-based pitch control signal as well as a proportional feedback term. After that, the inverse Coleman transformation is used to generate the individual pitch commands for each blade. To verify the effectiveness of the proposed method, co-simulations were carried out in FAST and MATLAB/Simulink on a 5000 kW distributed pitch-controlled wind turbine under IEC Kaimal spectrum wind conditions, with a mean wind speed of 18 m/s and Class B turbulence intensity. The results show that the LSTM prediction model achieves an R² of 0.998 on the test dataset, with an RMSE as low as 0.0051. Compared with the conventional pitch-based power control strategy, the proposed approach maintains the same average power output while significantly reducing fatigue loads, thereby contributing to a longer service life for the wind turbine.
28 pages, 1429 KB  
Article
Engineering Systems with Standards and Digital Models: Specifying Stakeholder Needs and Capabilities—MGOS
by Kevin MacG. Adams, Irfan Ibrahim and Steven L. Krahn
Systems 2026, 14(5), 458; https://doi.org/10.3390/systems14050458 - 23 Apr 2026
Abstract
This paper proposes a formal method and associated techniques for completing the ISO/IEC/IEEE Standard 15288 technical process 6.4.2—Stakeholder Needs and Requirements definition within the 15288-SysML Grid framework. The paper is a companion work to Engineering Systems with Standards and Digital Models: Development of a 15288-SysML Grid, which describes an engineering design method that supports the tenets of the Industry 4.0 paradigm. The formal method presented here is grounded in established constructs from systems science, specifically the systems principles of hierarchy, emergence, requisite parsimony, minimum critical specification, and requisite saliency. The application of accepted principles ensures that stakeholders are able to objectively specify measurable criteria that can satisfy stakeholder needs and capabilities. The method (1) uses international standards for systems (e.g., ISO/IEC/IEEE 15288); (2) adopts the four fundamental aspects of system design supported by model-based systems engineering (MBSE); (3) invokes the international standard for the systems modeling language (SysML); and (4) adopts a hierarchical requirements tree that specifies Mission, Goals, Objectives, and Sub-objectives (MGOS) to provide the stakeholder-analysis process a means for articulating system-level engineering requirements. Utilization of the MGOS framework is intended to have a positive impact on the system design process by ensuring reproducibility, replicability, transparency, and generalization.
(This article belongs to the Special Issue Model-Based Systems Engineering (MBSE) for Complex Systems)
15 pages, 1302 KB  
Proceeding Paper
Quantum-Resistant Encryption for IoT Communication in Critical Engineering Infrastructure
by Wai Yie Leong
Eng. Proc. 2026, 134(1), 76; https://doi.org/10.3390/engproc2026134076 - 22 Apr 2026
Abstract
The growing interconnection of critical engineering infrastructure through IoT introduces unprecedented exposure to cyber threats. Emerging quantum computing capabilities pose a transformative risk to classical cryptographic primitives such as Rivest–Shamir–Adleman and Elliptic-Curve Cryptography, which underpin secure communication and device authentication in industrial control systems, power grids, transportation networks, and healthcare infrastructure. This paper investigates quantum-resistant encryption, often termed post-quantum cryptography (PQC), as a sustainable security paradigm for IoT communication within critical systems. By analyzing lattice-based, code-based, multivariate, and hash-based schemes, the study evaluates trade-offs between computational cost, memory footprint, and latency constraints intrinsic to resource-limited IoT nodes. A hybrid architectural framework integrating the National Institute of Standards and Technology-standardized algorithms (e.g., Cryptographic Suite for Algebraic Lattices—Kyber, Dilithium) with lightweight symmetric primitives (e.g., Ascon, GIFT block cipher in Combined Feedback mode) is proposed for secure data transmission across heterogeneous IoT layers. Experimental simulations benchmark key-exchange throughput, ciphertext expansion, and resilience against quantum-adversarial models, demonstrating up to 65% reduction in handshake latency compared to baseline lattice implementations under constrained conditions. The paper concludes with policy and engineering recommendations for the adoption of quantum-resistant IoT protocols in energy, transportation, and industrial automation sectors, highlighting alignment with global PQC migration roadmaps and IEC 62443 cybersecurity standards.

30 pages, 1277 KB  
Review
Global Regulatory Mandates as Drivers for Advanced Chemical Analysis in Food Safety
by Lin Guo, Xiaoxiao Dong, Heng Zhou, Zilong Liu and Xingchuang Xiong
Foods 2026, 15(8), 1454; https://doi.org/10.3390/foods15081454 - 21 Apr 2026
Abstract
The globalization of the food supply chain presents complex challenges for safety assurance within a highly fragmented regulatory landscape. This review synthesizes the frameworks of eight influential jurisdictions—including the European Union (EU), the United States, China, and Codex Alimentarius—to evaluate how legal mandates function as regulatory drivers that guide the evolution of analytical chemistry. By examining legislation on Maximum Residue Limits (MRLs), positive list systems, and method validation guidelines (e.g., SANTE), we demonstrate that strict preventive controls have established chromatography coupled with tandem mass spectrometry (LC/GC-MS/MS) as the universal standard for multi-residue screening. We show that global regulatory fragmentation is not merely an administrative artifact, but is rooted in divergent toxicological philosophies and localized dietary exposure models. This regulatory heterogeneity requires analytical laboratories to adopt a posture of “defensive technological redundancy,” forcing them to continuously optimize targeted methods against the strictest global default limits (e.g., 0.01 mg/kg). We establish that this continuous methodological escalation for ultra-trace quantification has reached practical and operational limits. Consequently, we conclude that the future of food safety testing must transition from static target-list compliance toward adaptable, non-targeted chemical profiling using High-Resolution Mass Spectrometry (HRMS), enabling laboratories to proactively address emerging contaminants, food fraud, and the complexities of modern food matrices.
(This article belongs to the Section Food Analytical Methods)

27 pages, 5602 KB  
Article
Low-Power Direct Hardware Implementation of Logic Controllers Using Standard Languages
by Adam Milik, Wojciech Kierat and Tomasz Rudnicki
Energies 2026, 19(8), 2001; https://doi.org/10.3390/en19082001 - 21 Apr 2026
Abstract
The paper shows methodologies for implementing a high-performance, low-power logic control system designed with standard languages such as LD and SFC (according to the IEC 61131-3 standard) directly in hardware using FPGA devices. The essential idea is to convert the sequential sentences of a language into parallel computations and then map them to a dedicated hardware structure. A flexible graph-based method of language mapping is shown, which enables extracting control and data flow from language sentences. The direct hardware mapping technique enables building not only high-performance structures but also low-power implementations. All implementations retain a very short response time of several clock cycles (from 3 to 7). The proposed low-power mapping strategies enable power savings of up to 10 times while retaining processing performance. The obtained results are compared with a standard implementation using a benchmark program set. The paper concludes with a comparison of the performance and energy consumption of the proposed implementation strategies.
(This article belongs to the Topic VLSI-Based Sequential Devices in Cyber-Physical Systems)

31 pages, 2201 KB  
Article
Anomaly Detection for Substations Based on IEC 61850-NFA Model
by Deniz Berfin Tastan and Musa Balta
Appl. Sci. 2026, 16(8), 4000; https://doi.org/10.3390/app16084000 - 20 Apr 2026
Abstract
The increasing digitalization of energy transmission and distribution infrastructures has made industrial control systems (ICS), and especially IEC 61850-based communication structures, critical. IEC 61850 performs protection and control functions in substations in real time via GOOSE and MMS protocols. The fast and low-latency operation of these protocols is essential; however, their open structure leaves systems vulnerable to cyberattacks. Traditional signature-based solutions are insufficient for detecting such anomalies, and models capable of learning both time and state relationships are needed. This study develops a time-aware probabilistic NFA model to detect anomalous behavior in IEC 61850 traffic. The model analyzes GOOSE and MMS message sequences with both state transitions and time differences (Δt). Thus, not only the message sequence but also the timing variations between events are learned. The probability of each transition is dynamically updated, and deviations from normal behavior are marked as “anomalies”. The dataset used in this study was created based on normal and attack scenarios conducted in the Sakarya University Critical Infrastructure National Testbed Center Energy Laboratory (Center Energy). The experimental results obtained in the study show that the model detects time-based, structural, and behavioral anomalies with high accuracy. With a dual-model configuration, results of 91.7% accuracy, 88.9% precision, 100% recall, and a 94.1% F1-score were achieved; particularly in time-based attack scenarios, the model performance reached an accuracy level of up to 93%.
(This article belongs to the Section Computing and Artificial Intelligence)
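A toy version of the time-aware transition model this abstract describes — learning (state, event) transitions together with observed Δt bounds, then flagging unseen transitions or out-of-band timings — might look like this. The states, events, and tolerance are hypothetical; the paper's probabilistic NFA is considerably more elaborate.

```python
from collections import defaultdict

class TimedTransitionModel:
    """Toy time-aware transition model: learn (state, event) -> next-state
    counts plus observed dt bounds; flag unseen transitions or timings
    outside the learned band. Far simpler than the paper's probabilistic NFA."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))
        self.dt_bounds = {}

    def learn(self, state, event, nxt, dt):
        self.counts[(state, event)][nxt] += 1
        lo, hi = self.dt_bounds.get((state, event, nxt), (dt, dt))
        self.dt_bounds[(state, event, nxt)] = (min(lo, dt), max(hi, dt))

    def check(self, state, event, nxt, dt, tol=0.5):
        seen = self.counts.get((state, event), {})
        if nxt not in seen:
            return "anomaly"  # structurally unseen transition
        lo, hi = self.dt_bounds[(state, event, nxt)]
        if not lo * (1 - tol) <= dt <= hi * (1 + tol):
            return "anomaly"  # timing deviates from the learned band
        return "normal"

m = TimedTransitionModel()
# Hypothetical normal GOOSE heartbeat: S0 --goose--> S0 roughly every second.
for dt in (0.9, 1.0, 1.1):
    m.learn("S0", "goose", "S0", dt)

m.check("S0", "goose", "S0", 1.05)     # "normal"
m.check("S0", "goose", "S0", 0.01)     # "anomaly": burst far faster than learned
m.check("S0", "mms_write", "S1", 1.0)  # "anomaly": never-seen transition
```

The two rejection paths mirror the abstract's distinction between structural anomalies (an unseen transition) and time-based anomalies (a learned transition arriving at an abnormal Δt).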

17 pages, 913 KB  
Article
An Empirical Study of Knowledge Graph-Enhanced RAG for Information Security Compliance
by Dimitar Jovanovski, Marija Stojcheva, Mila Dodevska, Petre Lameski, Igor Mishkovski and Dejan Gjorgjevikj
Information 2026, 17(4), 389; https://doi.org/10.3390/info17040389 - 20 Apr 2026
Abstract
Information security compliance has become critical for organizations worldwide, with the ISO/IEC 27000 family serving as the most widely adopted framework for establishing information security management systems. Despite their global acceptance, these standards present significant interpretation challenges due to their formal language, abstract structure, and extensive cross-referencing across 97 documents. Traditional retrieval-augmented generation (RAG) systems, which rely on independent text chunking and dense vector retrieval, prove inadequate for such highly interconnected regulatory materials, often fragmenting contextual relationships and reducing accuracy. This study introduces a privacy-preserving RAG framework that integrates LightRAG, a knowledge graph-based retrieval system, with locally hosted open-source language models. Unlike chunk-based RAG systems that treat document segments independently, the proposed system constructs a semantic knowledge graph that explicitly models relationships between clauses through typed edges representing cross-references, semantic similarity, and hierarchical dependencies. To enable rigorous evaluation, we developed a curated benchmark dataset of 222 multiple-choice questions with authoritative ground-truth answers, systematically constructed from official ISO standards, certification preparation materials, and academic sources. Evaluation on this benchmark shows that knowledge graph-based retrieval achieves higher accuracy than chunk-based RAG and non-retrieval LLM baselines within the evaluated setup. The analysis indicates that embedding model quality is strongly associated with system performance, that hybrid retrieval modes combining local and global graph traversal tend to yield better accuracy, and that mid-sized open-source models paired with strong retrievers can approach the performance of larger proprietary systems. The best configuration achieves 90.54% accuracy, demonstrating the effectiveness of graph-structured retrieval for multiple-choice regulatory questions. Full article
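The typed-edge clause graph described in this abstract can be illustrated with a minimal sketch. The clause identifiers, texts, and traversal depth below are invented for the example; the actual LightRAG-based system builds its graph automatically via LLM-driven extraction and embedding similarity rather than hand-coded edges.

```python
# Hypothetical miniature of a clause graph with typed edges, as the
# abstract describes: cross-references, semantic similarity, and
# hierarchical dependencies between standard clauses.
edges = [
    ("ISO27001:A.5",   "ISO27001:A.5.1", "hierarchical"),
    ("ISO27001:A.5.1", "ISO27002:5.1",   "cross_reference"),
    ("ISO27002:5.1",   "ISO27002:5.2",   "semantic_similarity"),
]

def neighbours(clause, edge_types):
    """Local retrieval step: follow typed edges out of a single clause."""
    return [dst for src, dst, t in edges if src == clause and t in edge_types]

def hybrid_retrieve(seed, depth=2):
    """Toy hybrid traversal: breadth-first expansion over all edge types,
    loosely approximating the combination of local neighbourhood lookups
    with wider graph context."""
    all_types = {"hierarchical", "cross_reference", "semantic_similarity"}
    seen, frontier = {seed}, [seed]
    for _ in range(depth):
        frontier = [n for c in frontier
                    for n in neighbours(c, all_types)
                    if n not in seen]
        seen.update(frontier)
    return seen - {seed}

print(sorted(hybrid_retrieve("ISO27001:A.5")))
# A two-hop expansion from the parent control reaches both the child
# clause and its cross-referenced guidance clause.
```

The point of the structure is that a question about one clause pulls in its cross-referenced and hierarchically related clauses as retrieval context, which independent text chunks cannot do.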
42 pages, 3983 KB  
Systematic Review
IEC 61850 GOOSE: A Systematic Literature Review on the State of the Art and Current Applications
by Arthur Kniphoff da Cruz, Ana Clara Hackenhaar Kellermann, Ingridy Caroliny da Silva, Jaine Mercia Fernandes de Oliveira, Marcia Elena Jochims Kniphoff da Cruz and Lorenz Däubler
Automation 2026, 7(2), 62; https://doi.org/10.3390/automation7020062 - 17 Apr 2026
Abstract
To develop secure, fast, and interoperable smart substations, it is vital to understand the current state and potential future directions of the technologies involved. This study presents the evolution and state of the art of the Generic Object Oriented Substation Event (GOOSE) communication protocol, defined by the International Electrotechnical Commission (IEC) 61850 standard. A Systematic Literature Review (SLR) was conducted following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol. It covered journal articles published from 2004 to 2025 and conference papers from 2020 to 2025, written in English within the field of Engineering. Only studies that focus primarily on GOOSE, cite it at least ten times, and are indexed in the Scopus, IEEE Xplore, and Web of Science databases were included. The quantitative analysis used the SciMAT software, complemented by a qualitative analysis. Due to the bibliometric and thematic nature of this review, potential biases were considered at the review level rather than through a formal study-level risk-of-bias tool. The final analysis comprised 82 journal articles and 84 conference papers. The results offer a comprehensive mapping of the evolution of GOOSE research, identify nine main challenges and limitations from the last 22 years, and highlight current research directions. The literature reveals methodological heterogeneity, a predominance of simulation-based approaches, and limited large-scale empirical validation. Full article
(This article belongs to the Special Issue Substation Automation, Protection and Control Based on IEC 61850)
30 pages, 712 KB  
Review
AI Risk Governance for Advancing Digital Sovereignty in Data-Driven Systems: An Integrated Multi-Layer Framework
by Segun Odion and Santosh Reddy Addula
Future Internet 2026, 18(4), 209; https://doi.org/10.3390/fi18040209 - 15 Apr 2026
Abstract
The integration of algorithmic systems into critical digital infrastructure is no longer peripheral to governance; it is governance. As AI-mediated decisions influence credit access, clinical diagnoses, criminal risk scores, and infrastructure routing, the question of who controls these algorithms, and whether that control is meaningful, has become a central concern for states and institutions at every level of development. Existing frameworks, including the NIST AI Risk Management Framework, ISO/IEC 42001, and the EU AI Act, have made real progress toward structured AI governance. However, none treats digital sovereignty as a first-order goal, nor do they provide integrated cross-layer guidance applicable across the diverse institutional landscape found worldwide. Synthesizing these frameworks, we develop the Integrated AI Risk Governance Framework (IARGF): a four-layer structure covering policy and regulations, institutional oversight, technical controls, and operational execution, organized around five risk categories: technical, ethical, security, systemic, and sovereignty-related. A comparative analysis with major existing frameworks highlights the IARGF's unique contributions, especially its explicit focus on sovereignty, its adaptability across different institutional capacities, and its recursive feedback mechanisms connecting all four governance layers. The framework is analyzed across three domains (healthcare AI, financial services, and critical infrastructure) to demonstrate its practical utility. The results confirm that governance effectiveness is a system property, not just a feature of individual layers; that digital sovereignty is both a governance goal and a distinct risk dimension with specific technical and institutional needs; and that context-aware, capacity-scaled governance is a design requirement, not a political compromise. 
The IARGF is presented as a conceptual governance model based on a systematic literature review rather than an empirically validated tool, and it remains to be tested in actual organizational settings. Its main contribution is the comprehensive theoretical integration of sovereignty, institutional capacity, and inter-layer governance dynamics, rather than proven performance advantages over existing models. Future research should aim to validate the framework through longitudinal case studies, expert panels, and retrospective failure analyses. Full article
(This article belongs to the Special Issue Security and Privacy in AI-Powered Systems)